
China-linked ‘Spamouflage’ network mimics Americans online to sway U.S. political debate

A woman walks with her ballot to a vacant voting booth at City Hall in San Francisco on March 5.
(Eric Risberg / Associated Press)

When he first emerged on social media, the user known as Harlan claimed to be a New Yorker and an Army veteran who supported Donald Trump for president. Harlan said he was 29, and his profile picture showed a smiling, handsome young man.

A few months later, Harlan underwent a transformation. Now, he claimed to be 31 and from Florida.

New research into Chinese disinformation networks targeting American voters shows Harlan’s claims were as fictitious as his profile picture, which analysts think was created using artificial intelligence.


As voters prepare to cast their ballots this fall, China has been making its own plans, cultivating networks of fake social media users designed to mimic Americans. Whoever or wherever he really is, Harlan is a small part of a larger effort by U.S. adversaries to use social media to influence and upend America’s political debate.


The account was traced back to Spamouflage, a Chinese disinformation group, by analysts at Graphika, a New York-based firm that tracks online networks. Known to online researchers for several years, Spamouflage earned its moniker through its habit of spreading large amounts of seemingly unrelated content alongside disinformation.

“One of the world’s largest covert online influence operations — an operation run by Chinese state actors — has become more aggressive in its efforts to infiltrate and to sway U.S. political conversations ahead of the election,” Jack Stubbs, Graphika’s chief intelligence officer, told the Associated Press.


Intelligence and national security officials have said that Russia, China and Iran have all mounted online influence operations targeting U.S. voters ahead of the November election. Russia remains the top threat, intelligence officials say, even as Iran has become more aggressive in recent months, covertly supporting U.S. protests against the war in Gaza and attempting to hack into the email systems of the two presidential candidates.


China, however, has taken a more cautious, nuanced approach. Beijing sees little advantage in supporting one presidential candidate over the other, intelligence analysts say. Instead, China’s disinformation efforts focus on campaign issues particularly important to Beijing — such as American policy toward Taiwan — while seeking to undermine confidence in elections, voting and the U.S. in general.

Officials have said it’s a longer-term effort that will continue well past Election Day as China and other authoritarian nations try to use the internet to erode support for democracy.


Chinese Embassy spokesperson Liu Pengyu rejected Graphika’s findings as full of “prejudice and malicious speculation” and said “China has no intention and will not interfere” in the election.

Compared with armed conflict or economic sanctions, online influence operations can be a low-cost, low-risk means of flexing geopolitical power. Given the increasing reliance on digital communications, the use of online disinformation and fake information networks is only likely to increase, said Max Lesser, senior analyst for emerging threats at the Foundation for Defense of Democracies, a national security think tank in Washington.


“We’re going to see a widening of the playing field when it comes to influence operations, where it’s not just Russia, China and Iran but you also see smaller actors getting involved,” Lesser said.

That list could include not only nations but also criminal organizations, domestic extremist groups and terrorist organizations, Lesser said.

When analysts first noticed Spamouflage five years ago, the network tended to post generically pro-China, anti-American content. In recent years, the tone sharpened as Spamouflage expanded and began focusing on divisive political topics like gun control, crime, race relations and support for Israel during its war in Gaza. The network also began creating large numbers of fake accounts designed to mimic American users.

Spamouflage accounts don’t post much original content, instead using platforms like X or TikTok to recycle and repost content from far-right and far-left users. Some of the accounts seem designed to appeal to Republicans, while others cater to Democrats.



While Harlan’s accounts succeeded in getting traction — one video mocking President Biden was seen 1.5 million times — many of the accounts created by the Spamouflage campaign did not. It’s a reminder that online influence operations are often a numbers game: the more accounts, the more content, the better the chance that one specific post goes viral.

Many of the accounts newly linked to Spamouflage took pains to pose as Americans, sometimes in obvious ways. “I am an American,” one of the accounts proclaimed. Some of the accounts gave themselves away by using stilted English or strange word choices. Some were clumsier than others: “Broken English, brilliant brain, I love Trump,” read the biographical section of one account.

Harlan’s profile picture, which Graphika researchers believe was created using AI, was identical to one used in an earlier account linked to Spamouflage. Messages sent to the person operating Harlan’s accounts were not returned.

Several of the accounts linked to Spamouflage remain active on TikTok and X.

Klepper writes for the Associated Press.
