The Strange Afterlife of Wagner’s Yevgeny Prigozhin
While the posts on X are only a tiny snapshot of social media activity, they highlight how Russian-linked propaganda has changed since the Internet Research Agency interfered in US politics in 2016, experts say. The Russian misinformation and disinformation industry has evolved into a rich ecosystem of state-backed media, massive Telegram channels, and more conventional social media posts. Millions of people follow so-called military bloggers and war journalists on Telegram—some of these channels are linked to the Russian state, while others are aligned with Prigozhin and the Wagner Group. But all of them can muddy the waters or repeat the same set of talking points.
“Confusion in the information space is one of the aims of the Kremlin information operations—to make everything equally unbelievable so people’s trust in all kinds of sources is undermined,” says Eto Buziashvili, a disinformation and influence operations researcher with a focus on Russia at the Atlantic Council’s Digital Forensic Research Lab. Since the start of its full-scale war in Ukraine in February 2022, Russia has blocked and censored social media websites, banned independent news media, and pushed reams of disinformation.
Kyle Walter, head of research at misinformation and disinformation research company Logically, reviewed the posts shared by Antibot4Navalny and says they show “signs of being inauthentic.” The X accounts were largely created earlier this year, have low volumes of original posts, and mostly retweet or reply to other accounts, and some of them also follow each other, Walter says. The themes the accounts posted about around the plane crash also match what Logically has seen from monitoring Telegram channels linked to the Wagner Group, he says. Walter adds, however, that directly linking the accounts to the Internet Research Agency is more difficult.
The Antibot4Navalny researcher says that based on their previous research, they believe that the pro-Prigozhin trolls operate in similar ways. They “primarily serve” the interests of Putin, but they also push pro-Prigozhin narratives when it doesn’t “hurt” the Russian president, the researcher says. The approach “still worked in the plane-crash episode: Cover Putin as strongly as possible, but also, it is a nice opportunity to praise Prigozhin,” they say. The researcher says they are reporting the accounts to X.
In addition to the posts around the plane crash, the Antibot4Navalny group shared previous research and analysis with WIRED. In one instance, the group reported more than 7,000 suspected accounts to X. We checked dozens of these accounts and found that all of them had been removed from the Elon Musk-owned platform. Antibot4Navalny says the “troll” accounts are often active in groups, pushing the “same set of talking points” and mostly replying to tweets about news related to Russia and Ukraine or to pro-Ukrainian channels. X did not immediately respond to WIRED’s request for comment.
On July 14, the Antibot4Navalny researcher says, some of the accounts they have tracked replied to posts discussing comments from Putin, who said that the Wagner Group “does not exist” and that there is no legal basis for the group. The accounts, the researcher says, responded with messages claiming that Wagner operated legally, referencing Concord, the catering company owned by Prigozhin. The Antibot4Navalny researcher claims that these points did not appear in any Kremlin-controlled media and that mentions of the company “served interests of the troll factory/its owner—rather than interests of the Kremlin.”