Pressed by investigators in Congress, Facebook said Wednesday that it has found evidence that a pro-Kremlin Russian “troll farm” bought $100,000 worth of ads targeted at U.S. voters between 2015 and 2017. The finding was first reported by the Washington Post, and Facebook published its own statement Wednesday afternoon.
A few of the roughly 3,000 ads that Facebook traced to the Russian company mentioned presidential candidates Donald Trump or Hillary Clinton directly, according to the Post’s sources. The majority focused on stoking people’s emotions around divisive issues such as “gun rights and immigration fears, as well as gay rights and racial discrimination.”
Facebook wouldn’t disclose the ads in question, nor exactly how the scheme worked. But it said the tactics were consistent with those outlined in a white paper on information operations that the company published in April. That white paper described how trolls and foreign agents can use false accounts to amplify divisive messages and disinformation on Facebook’s platform.
One hundred thousand dollars in ad spending might not sound like a lot of money, but it’s a big deal for at least five reasons.
First, it confirms that Facebook was one of the pathways by which Russian operatives sought to influence the U.S. election.
Second, it raises the question of how those Russian operatives knew which U.S. voters to target, and whether the Trump campaign might have played any role.
Third, it casts a new light on Facebook’s “fake news” problem, which looks more sinister if some of the misinformation spread on the platform in the run-up to the U.S. election was fueled by Russian-funded ad dollars or troll networks.
Fourth, it suggests that Facebook may have a more widespread oversight problem in its ad sales. As the Post’s story notes, it’s illegal for foreign nationals or governments to buy ads or spend money aimed at influencing a U.S. election. It now seems clear they’ve been using Facebook to do just that.
Finally, while $100,000 amounts to a minuscule fraction of U.S. election spending, it could go a long way in amplifying posts among a targeted audience. Facebook said only about 25 percent of the ads were geographically targeted. But it’s worth remembering that the company has a history of not being forthcoming when it comes to the scale and mechanisms of misinformation on its platform. It’s possible that the activities the company has uncovered and disclosed so far represent only a small part of a larger problem.