Facebook Grapples With a Maturing Adversary in Election Meddling

By Kevin Roose, New York Times

They covered their tracks, using software to camouflage their internet traffic. They created Facebook pages for anti-Trump culture warriors, Hispanic activists and fans of alternative medicine. And they organized protests in coordination with real-world political groups.

The people behind an influence campaign ahead of this year’s elections, which Facebook disclosed on Tuesday, copied enough of the tactics used by Russians in the 2016 races to raise suspicion that Russia was at it again. But the new efforts also revealed signs of a maturing adversary, adapting and evolving to better disguise itself, while also better imitating real activists.

The coordinated activity — a collection of memes, photos and posts on issues like feminist empowerment, indigenous rights and the Immigration and Customs Enforcement agency — shows the scale of the challenge facing Facebook as it tries to weed out impersonators. As the forces behind the accounts become harder to detect, the company is left to separate the ordinary rants and raves of legitimate users from coordinated, possibly state-backed attempts to sway public opinion.

“This is not a one-time event limited to the 2016 election,” said Michael Posner, the director of the Center for Business and Human Rights at New York University’s Stern School of Business, which recently published a report about Russian influence campaigns. “It’s a daily drumbeat. These are entities trying to disrupt our democratic process by pushing various forms of disinformation into the system.”

Facebook said it had shut down 32 accounts that were collectively followed by 290,000 users. Many of the accounts had few followers, and the overall scope of the operation was smaller than the 2016 Russian campaign. The announcement from the company reinforced concerns in Washington that the federal government and social media companies would not be able to keep foreign actors out of this year’s election.

Facebook said it found enough evidence to shut down the accounts, pull the pages and remove the events they organized. But it did not immediately say who was behind the campaign, or where it was based, even as some lawmakers immediately pointed the finger at Russia.

In some ways, there were similarities with the earlier Russian efforts, including those by the Internet Research Agency, the troll farm that meddled in the 2016 election. Posts on the pages, for example, were often written in broken English.

But there were significant differences, too. According to Facebook, the recently revealed operation used virtual private networks to obscure its location. It also used internet-based telephone services, and routed Facebook ad purchases through third parties.

The content evolved, too, to better match the current debates in the United States. According to a report about past meddling released by the Senate Intelligence Committee on Wednesday, the IRA’s most successful Facebook posts in 2016 focused on conventional conservative themes: posts supporting hard-line immigration policies tended to do well, as did posts denouncing gun control and Hillary Clinton.

The new operation focused on issues like feminist empowerment, anti-racist activism and liberal outrage over the actions of ICE, according to information released by Facebook on Tuesday, and additional data it shared with the Atlantic Council’s Digital Forensic Research Lab. Specific groups were targeted with narrow messages that would appeal to them, such as a wellness-focused page that posted a graphic about the dangers of genetically modified food.

“It shows a level of sophistication in retail politics,” said Graham Brookie, the director of the Digital Forensic Research Lab, which published a report about the material this week.

The recent operation also appears to have been more focused on creating offline chaos by setting up and promoting Facebook events, such as a counterprotest to a planned white nationalist rally. One of the suspicious pages co-hosted the counterprotest alongside real activist groups, and the event attracted interest from more than 3,000 users.

“They’re better integrated into groups and events,” said Jonathan Albright, the director of the Digital Forensics Initiative at the Tow Center for Digital Journalism. The new group’s goal, Albright said, appeared to be “pushing real-life events and connecting the online and offline together in some type of activism that results in conflict.”

One of the largest of the suspicious pages disclosed by Facebook was “Resisters,” which appeared to be run by feminist activists. The page’s description stated its mission: “Online and offline feminist activism against fascism. Widespread public education and direct action, amplifying the voices from targeted communities.”

The Resisters page, created in March 2017, accumulated thousands of followers with a mix of posts about gender equality and other progressive ideals. Its most popular post was about an anti-rape device invented in South Africa, with the caption, “What do you think about this?”

Many of the campaign’s posts were similar in tone to ones found in 2016, when the IRA used issues like the Black Lives Matter movement to inflame racial tension. A page called “Aztlan Warriors” was set up in March 2017, a day before the Resisters page. Its description read “We empower our gente,” using the Spanish word for “people.” With a mixture of memes and photos, the page highlighted the mistreatment of Native Americans and indigenous Hispanic people by white people. A typical post, made on June 28, showed a protester carrying a sign that read “In 1492 Native Americans Discovered Columbus Lost at Sea.”

Another page created by the group, “Ancestral Wisdom,” posted an image of a dark-skinned girl with the caption, “Black girls don’t have to be mixed to be beautiful.”

Many of these posts were stolen from other websites and accounts. But they showed that influence campaigns have become increasingly adept at mimicking the tenor of social media activists of all stripes, and turning online outrage into real-world action.

None of this is rocket science, exactly. Any Facebook user could tell you that outrage performs better than calm equanimity on social media, and that heated discussions of issues like women’s rights and the persecution of indigenous people are pervasive on the American left. And experts have suggested that trolls may be trying to build a large and hyper-engaged audience, with the intent of weaponizing the accounts for more strategic purposes later on.

The people behind the Facebook campaign may also have tested more advanced techniques. Several of the pages that were taken down contained links to a Facebook Messenger account, suggesting that the influence operators may have used messages to connect with individual followers.

A Facebook spokesman declined to comment on whether pages connected with the campaign used Facebook Messenger, citing the company’s continuing investigation.

A side effect of the disinformation campaigns is that they make social media as a whole seem inherently untrustworthy, and give fodder to those who want to cast doubt on the legitimacy of authentic movements. Already, some partisans have adopted the tactic of sowing doubt about internet-based movements by painting their opponents as Russian trolls or agents of a foreign-influence campaign.

This type of suspicion appears likely to grow, as influence campaigns get harder and harder to distinguish from authentic activity.

Facebook’s rules do not forbid people from suggesting, as the new influence campaign did on its network of pages, that electromagnetic radiation from cellphones causes life-threatening illness, or that ICE should be abolished. The campaign’s sin, in Facebook’s eyes, was conducting “inauthentic activity” using fake accounts. Artificially stoked outrage is permissible, as long as you use your real name.

In a call with reporters on Tuesday, Facebook’s chief security officer, Alex Stamos — who is leaving the company this month — sounded a determined note, saying that Facebook would do everything in its power to sniff out fake accounts and coordinated influence campaigns.

“We should not assume that increased complexity in attributing actors means that we’re not going to be able to stop this activity,” Stamos said.

Posner, of NYU, said the disclosure of the new information operation should prompt all social media companies to take the threat of information warfare more seriously. He said the companies should build teams of Russian-speaking analysts who are familiar with the methods employed by these groups, and focus specifically on threats originating from IRA-type campaigns.

“They’re looking for ways to divide our society,” he said. “They’re not limiting themselves to siding with one political candidate or another. It’s a much deeper and more pernicious kind of activity.”

Copyright 2024 New York Times News Service. All rights reserved.