Facebook says it has removed 32 pages and accounts from both Facebook and Instagram because they were determined to be engaging in “inauthentic” behavior that purportedly aimed to influence the midterm elections.
“This kind of behavior is not allowed on Facebook because we don’t want people or organizations creating networks of accounts to mislead others about who they are, or what they’re doing,” the company said in a statement.
Facebook is highly sensitive to that kind of activity after it was revealed in March that Cambridge Analytica, a political marketing firm, made unauthorized use of Facebook user data to try to sway the 2016 U.S. presidential election.
Discovered two weeks ago
Facebook said the issue first came to light in mid-July. Since then it has tried to isolate the source of the activity, but so far without success.
In its investigation, the social media giant identified eight pages and 17 profiles on Facebook, as well as seven Instagram accounts, that it says violate the company's ban on coordinated inauthentic behavior. All have been taken down.
Facebook also said it has turned over all of its findings to law enforcement agencies, Congress, other technology companies, and the Atlantic Council’s Digital Forensic Research Lab, a research organization that helps Facebook identify abuse.
The Facebook investigation also determined that the group or groups behind the “inauthentic” content apparently had little impact. There were more than 9,500 organic posts by the accounts, but very few seemed to draw much notice since the accounts had few followers.
Four pages drew the most followers – “Aztlan Warriors,” “Black Elevation,” “Mindful Being,” and “Resisters.” The rest reportedly had between zero and 10 followers each.
Facebook said those behind the pages paid about $11,000 for roughly 150 ads on Facebook and Instagram, starting in April 2017. The last ad ran last month.
“It’s clear that whoever set up these accounts went to much greater lengths to obscure their true identities than the Russian-based Internet Research Agency (IRA) has in the past,” Facebook said. “We believe this could be partly due to changes we’ve made over the last year to make this kind of abuse much harder.”
The company said it would continue to work with law enforcement and other tech firms to better understand the threats it faces.