Joe Osborne, a Facebook spokesperson, said in a statement that the company “had already been investigating these topics” at the time of Allen’s report, adding: “Since that time, we have stood up teams, developed new policies, and collaborated with industry peers to address these networks. We’ve taken aggressive enforcement actions against these kinds of foreign and domestic inauthentic groups and have shared the results publicly on a quarterly basis.”
In the process of fact-checking this story shortly before publication, MIT Technology Review found that five of the troll-farm pages mentioned in the report remained active.
The report found that troll farms were reaching the same demographic groups singled out by the Kremlin-backed Internet Research Agency (IRA) during the 2016 election, which had targeted Christians, Black Americans, and Native Americans. A 2018 BuzzFeed News investigation found that at least one member of the Russian IRA, indicted for alleged interference in the 2016 US election, had also visited Macedonia around the emergence of its first troll farms, though it didn’t find concrete evidence of a connection. (Facebook said its investigations hadn’t turned up a connection between the IRA and Macedonian troll farms either.)
“This is not normal. This is not healthy,” Allen wrote. “We have empowered inauthentic actors to accumulate huge followings for largely unknown purposes … The fact that actors with possible ties to the IRA have access to huge audience numbers in the same demographic groups targeted by the IRA poses an enormous risk to the US 2020 election.”
As long as troll farms were finding success with these tactics, any other bad actor could too, he continued: “If the Troll Farms are reaching 30M US users with content targeted to African Americans, we should not at all be surprised if we discover the IRA also currently has large audiences there.”
Allen wrote the report as the fourth and final installment of a year-and-a-half-long effort to understand troll farms. He left the company that same month, in part out of frustration that leadership had “effectively ignored” his research, according to the former Facebook employee who supplied the report. Allen declined to comment.
The report reveals the alarming state of affairs in which Facebook leadership left the platform for years, despite repeated public promises to aggressively tackle foreign-based election interference. MIT Technology Review is making the full report available, with employee names redacted, because it is in the public interest.
Its revelations include:
- As of October 2019, around 15,000 Facebook pages with a majority US audience were being run out of Kosovo and Macedonia, known bad actors during the 2016 election.
- Collectively, these troll-farm pages, which the report treats as a single page for comparison purposes, reached 140 million US users monthly and 360 million global users weekly. Walmart’s page reached the second-largest US audience at 100 million.
- The troll-farm pages also combined to form:
- the largest Christian American page on Facebook, 20 times larger than the next largest, reaching 75 million US users monthly, 95% of whom had never followed any of the pages.
- the largest African-American page on Facebook, three times larger than the next largest, reaching 30 million US users monthly, 85% of whom had never followed any of the pages.
- the second-largest Native American page on Facebook, reaching 400,000 users monthly, 90% of whom had never followed any of the pages.
- the fifth-largest women’s page on Facebook, reaching 60 million US users monthly, 90% of whom had never followed any of the pages.
- Troll farms primarily affect the US but also target the UK, Australia, India, and countries in Central and South America.
- Facebook has conducted multiple internal studies confirming that content more likely to receive user engagement (likes, comments, and shares) is more likely to be of a type known to be bad. Yet the company has continued to rank content in users’ newsfeeds according to what will receive the highest engagement.
- Facebook forbids pages from posting content merely copied and pasted from other parts of the platform but does not enforce the policy against known bad actors. This makes it easy for foreign actors who don’t speak the local language to post entirely copied content and still reach a massive audience. At one point, as many as 40% of page views on US pages went to those featuring primarily unoriginal content or material of limited originality.
- Troll farms previously made their way into Facebook’s Instant Articles and Ad Breaks partnership programs, which are designed to help news organizations and other publishers monetize their articles and videos. At one point, because of a lack of basic quality checks, as many as 60% of Instant Article reads were going to content that had been plagiarized from elsewhere. This made it easy for troll farms to mix in unnoticed, and even receive payments from Facebook.
How Facebook enables troll farms and grows their audiences
The report looks specifically at troll farms based in Kosovo and Macedonia, which are run by people who don’t necessarily understand American politics. Yet because of the way Facebook’s newsfeed reward systems are designed, they can still have a significant impact on political discourse.