Russian troll farm spent $100K on Facebook ads during the U.S. election
Hundreds of fake Facebook accounts, probably run from Russia, spent about $100,000 on ads aimed at stirring up divisive issues such as gun control and race relations during the 2016 U.S. presidential election, the social network said Wednesday.
Although the number of ads is relatively small, the disclosure provides a more detailed peek into what investigators believe was a targeted effort by Russians to influence U.S. politics during the campaign, this time through social media.
The 470 accounts appeared to come from a notorious “troll farm,” a St. Petersburg-based organization known for promoting pro-Russian government positions via fake accounts, according to two people familiar with the investigation. The people were granted anonymity because they weren’t authorized to publicly discuss details of the investigation.
In all, the accounts purchased some 3,000 ads between June 2015 and May 2017. While the ads didn’t specifically reference the election, a candidate or voting, they nevertheless allowed “divisive messages” to be amplified via the social media platform, the company’s chief security officer, Alex Stamos, said in a statement.
About a quarter of the ads were geographically targeted, Facebook said, though it did not identify the regions.
Facebook has turned over its findings to federal authorities investigating Russian interference in the U.S. presidential election. Robert Mueller, the special counsel, is charged with investigating Russian meddling in the U.S. election and any potential co-ordination with associates of President Donald Trump.
The techniques were consistent with those described in an analysis Facebook released in April of “information operations” on its platform, Wednesday’s statement said. In that paper, Facebook described a number of ways its platform had been manipulated, including fake news and “false amplifiers,” which it said engaged in “discouraging specific parties from participating in discussion, or amplifying sensationalistic voices over others.”
The report described fake news as one of several tools of disinformation, which could also involve “… more subtle methods, such as false flag operations, feeding inaccurate quotes or stories to innocent intermediaries, or knowingly amplifying biased or misleading information.”
At the time, Facebook said it couldn’t make a “definitive attribution” of who was sponsoring this activity, but that its findings “did not contradict” a U.S. intelligence report produced in January.
That document asserted that “Russian efforts to influence the 2016 U.S. presidential election represent the most recent expression of Moscow’s longstanding desire to undermine the U.S.-led liberal democratic order, but these activities demonstrated a significant escalation in directness, level of activity, and scope of effort compared to previous operations.”
Sen. Mark Warner, the Democratic vice chairman of the Senate Intelligence Committee, said he also wants to know more about the content of the ads pushed out by the Russian-based Internet Research Agency and whether they targeted specific voters or locations in the U.S.
He said in many cases the social media messaging “was more about voter depression and suppression without having to necessarily mention an individual candidate’s name.”
Rep. Adam Schiff, the top Democrat on the House Intelligence Committee, said Facebook’s disclosure confirmed what many lawmakers investigating Russian interference in the U.S. election had long suspected.
“One of the things that we’re interested obviously in finding out is whether there was any co-ordination in terms of the use of those paid social media trolls or the Russian use of bots,” he said.