Sept. 8, 2022, 10:37 a.m.
Audience & Social

Doxxed, threatened, and arrested: Russia’s war on Wikipedia editors

Wikipedia, which has more than 1,800 Russian-speaking volunteer editors, has long been a thorn in the Russian government’s side.

One Friday in March, not long after Russia invaded Ukraine, Mikhail, a Russia-based Wikipedia editor, opened the Telegram app to discover that he had been doxxed. His personal information, including his name and social media accounts, had been posted in a channel run by a group of Russian online vigilantes targeting Wikipedians writing about the war. Below the text was an image with a single word: “Retribution.” The post had been viewed more than 110,000 times.

Soon after, Mikhail, who was granted anonymity for this story for his safety, started receiving threats on social media.

“I began to act more cautiously,” he said. “I closed off the social networks from outsiders and became even more careful to monitor [Wikipedia] edits that could be linked to my name and brought up as [legal] violations.”

That month, at least four other Wikipedia editors were doxxed by the group and accused of smearing Russia’s war effort. The group called itself Mrakoborec, a reference to the Aurors, or wizarding police, in Harry Potter. Among those doxxed was Mark Bernstein, an editor based in Belarus, Russia’s ally in the war in Ukraine. After Bernstein’s name appeared in the Mrakoborec channel on March 10, he was arrested and detained in Minsk’s notorious Okrestina detention center. In June, he was sentenced to three years of restricted freedom for “organizing and preparing activities that disrupt social order.”

“Before this event, nothing like this ever happened in the Russian [Wikipedia] community,” Sergey Leschina, another Russian Wikipedia editor, said. Like many other editors, Leschina left Russia after the war started. He now lives in Lithuania, and says many editors are hesitant to work on pages about the war. “I think almost everybody who edits from Russia or Belarus does it with different accounts,” he said.

Bernstein’s arrest and the threats to individual Wikipedia editors are part of a broader campaign to stifle the platform as the Russian government pushes a pro-war propaganda drive, including banning Western social media platforms and cracking down on independent reporting. Although local media has linked the Mrakoborec group to the Russian government, observers say that the connection is difficult to prove, as groups such as these are created to provide the government with plausible deniability.

The organization that runs Wikipedia has also found itself targeted by Russia’s propaganda drive. In March, Russia passed a law that criminalized publishing any information about the military that the state considers false. Under the new law, a Russian court fined the Wikimedia Foundation, which owns Wikipedia, 5 million rubles (about US$88,000) for failing to remove what the court deemed disinformation about the war in Ukraine. The foundation launched an appeal in June.

“I am quite sad to see how the idea of the free flow of information, which has always been at the core of Wikipedia, is being suppressed in my home country with all its might, giving way to government censorship,” said Mikhail, who continues to work on Wikipedia from an anonymous account.

Wikipedia, which has more than 1,800 Russian-speaking volunteer editors, has long been a thorn in the Russian government’s side. The country blocked the site in 2015, over a page about drugs, but quickly reversed its decision. “It is fortunate that Wikipedia is too serious a player to be so easily blocked, as Facebook, Instagram, and all kinds of opposition sites have already been blocked,” Mikhail said.

Since February, Wikipedia editors have regularly updated content on the Ukraine war in Russian, including articles such as “2022 Invasion of Ukraine,” which is viewed more than 40,000 times daily on average. Anton Protsiuk, programs coordinator at Wikimedia Ukraine, said that the Russian authorities have been fairly unsuccessful in promoting their point of view on Wikipedia, and that Russian-language Wikipedia has been largely neutral in terms of describing the war.

In March, Russian communications agency Roskomnadzor threatened to block the site over the article describing the invasion of Ukraine, leading to mass downloads of offline copies of the encyclopedia. The agency ordered Wikipedia to remove several articles on the war; Wikipedia did not comply. Meanwhile, local officials trashed the encyclopedia as a “weapon of informational war.” In July, Roskomnadzor ordered Russian search engines to flag Wikipedia on search result pages as being in violation of Russian law.

Russian media has also accused Wikipedia of hosting child pornography, over what the unofficial local Wikipedia group describes as pictures of anime girls.

“This is an old smear tactic,” Victoria Doronina, a member of the Wikimedia Foundation’s board of trustees and a molecular biologist originally from Belarus, said.

As an organization based in San Francisco that adheres to U.S. law, the Wikimedia Foundation has been able to fend off Russian authorities’ requests to remove content. But, as it has with Google, Twitter, and other Western tech companies, Russia has demanded that the Wikimedia Foundation open an official representative office that would be subject to Russian law, including official requests for censorship. While some companies have taken steps toward complying with the local hiring law, sometimes referred to as a “hostage-taking law,” other online platforms have been reluctant because it raises the risk that authorities could arrest local staff or demand user data.

“The Russian government would like to have all the information so it can punish Wikipedia editors and maybe even the readers,” said Doronina.

Going after the editors may be another way that the Russian state is trying to exert control over Wikipedia. In 2021, an investigation by Russian media outlet Daily Storm linked the Mrakoborec Telegram group with Russia’s notorious troll factories connected to the Internet Research Agency, the online influence organization accused of interfering in the 2016 U.S. presidential election. Proving a direct connection with the government, however, is tricky, said Roman Osadchuk, a research associate at the Digital Forensic Research Lab of the Washington-based think tank Atlantic Council.

But, regardless of who is behind the group, “the whole reason for [Mrakoborec]’s existence is to silence voices that are in contrast with the Russian state’s position,” said Osadchuk. He said Mrakoborec has tried to stifle such narratives by unfairly flagging news about Russian atrocities in Ukraine that appears on social media platforms. The tactic is less effective on Wikipedia, where the platform’s community plays a greater role in editing articles. As a result, Osadchuk said, Mrakoborec is turning to doxxing.

Telegram shut down the original Mrakoborec group in March, but the group has reappeared on the platform under a different account.

Mrakoborec claimed in a Telegram post that it has collected information on more than 1,000 Wikipedia editors who have participated in editing articles on the war in Ukraine. Doronina of the Wikimedia Foundation’s board of trustees said that Russia’s state agencies could target one of those editors any day. “You cannot predict who is going to attack. It’s not a bear, it’s a pack of bears,” she said.

Not all Russian Wikipedia editors are threatened by the crackdown. The local Wikimedia organization includes members with diametrically opposed views on the war, with some supporting the country’s president, Vladimir Putin. This has made discussing the war like “walking in a minefield,” said Konstantin, another Russian editor, who asked that his name be changed for his safety. “Any careless word spoken publicly can lead to real reprisals,” he said.

Konstantin said that many editors in Russia are now participating in Wikipedia less, with some having quit the project altogether. As a result, he is concerned that the neutrality of Wikipedia’s coverage of the war in Ukraine may suffer, with articles on the topic left to Russian-speaking editors who either live beyond the reach of Russian and Belarusian authorities or support Russia’s war effort.

The editors who remain are still trying to do what they can, said Konstantin, but he characterized their efforts as resembling those of the orchestra playing on the sinking Titanic.

“Our aim is to create an encyclopedia which is as neutral and comprehensive as possible, and as accessible to all as possible. And, in this sense, Russian Wikipedians do not differ from, say, Polish, Croatian, American, English — or Ukrainian [Wikipedians],” he said. “Despite the fact that our countries have entered into such a tragic interaction, this allows us to still do something together on this platform.”

Masha Borak is a journalist covering the intersection of technology with politics, business, and society. This piece was originally published by Rest of World, a nonprofit newsroom covering global technology, and is being republished with permission.
