March 9, 2022, 9:48 a.m.
Audience & Social

Russia is having less success at spreading social media disinformation (for now)

But that could change if people tire of defending against an onslaught of misinformation.

This article was originally published in Scientific American. It is being republished with permission.

Days after Russia invaded Ukraine, multiple social media platforms — including Facebook, Twitter and YouTube — announced they had dismantled coordinated networks of accounts spreading disinformation. These networks, which consisted of fabricated accounts disguised with fake names and AI-generated profile images, or of hacked accounts, were sharing suspiciously similar anti-Ukraine talking points, suggesting they were being controlled by centralized sources linked to Russia and Belarus.

Russia’s Internet Research Agency used similar disinformation campaigns to amplify propaganda about the U.S. election in 2016. But their extent was unclear until after the election — and at the time, they were conducted with little pushback from social media platforms. “There was a sense that the platforms just didn’t know what to do,” says Laura Edelson, a misinformation researcher and Ph.D. candidate in computer science at New York University.

Since then, she says, platforms and governments have become more adept at combating this type of information warfare — and more willing to deplatform bad actors that deliberately spread disinformation. Edelson spoke to Scientific American’s Sophie Bushwick about how an information war is being waged as the conflict continues.

Sophie Bushwick: How do social media platforms combat accounts that spread disinformation?

Laura Edelson: These kinds of disinformation campaigns — where they are specifically misleading users about the source of the content — are really easy for platforms to take action against, because Facebook has a real-name policy: misleading users about who you are is a violation of Facebook’s platform rules. But there are [other] things that shouldn’t be difficult to take down — things that historically Facebook has really struggled with — and that is actors like RT, a Russian state-backed media outlet. Facebook has historically struggled with what to do about it.

That’s what was so impressive about seeing that [Facebook and other platforms] really did start to take some action against RT in the past week, because this has been going on for such a long time. And also, frankly, [social media platforms] have had cover from governments, where governments in Europe have banned Russian state media. And that has given cover to Facebook, YouTube and other major platforms to do the same thing. In general, banning anyone — but especially banning media — is not a step anyone should take lightly. But RT and Sputnik [another Russia state-backed media outlet] are not regular media: they have such a long track record of polluting the information space.

Bushwick: What else can be done to fight harmful false information?

Edelson: One of the things that the U.S. did really well going into this conflict — and why, at least from a misinformation [control] perspective, the first week went very well — is that the U.S. government was really aggressive about releasing information about what it knew about the realities on the ground in Russia and Ukraine. That was really helpful for creating a space where it was difficult for the Russians to put out misinformation about those same things. Because the U.S. government was very forthcoming, it didn’t leave a lot of room; there wasn’t an information vacuum that the Russians could step in and fill.

And then the Ukrainian government has been tremendously savvy in telling the story of the Ukrainian resistance. There are definitely times when it has stepped over the line into propaganda. But in general, it has made sure that the world sees the Ukrainian resistance and the fight that the Ukrainian people are willing to put up. That [helps] people see what is going on and understand that the people who are there fighting are real people who, not that long ago, were not fighters. They were civilians, and now they are defending their country.

I think both of those things are going to be difficult to maintain over time. But if they are not maintained, then the window for Russian misinformation will open. A challenge we are all going to have to deal with is that this war is not going to be over in the next few days, but the news cycle cannot maintain this level of focus on these events. It’s shocking to say, but in three weeks’ time, you will have hours go by without thinking about it. And that is when people’s guards are going to go down. If someone is trying to spread some kind of [disinformation] — maybe the Russians make up some fake Ukrainian atrocity or something — that’s when the world is going to be susceptible to that kind of thing. And that’s when we’re going to have to remember all this stuff of “Who was telling you the story? Do we trust them? How verifiable is this account?” This is going to be part of how conflict is waged going forward.

But this is something that is new for all actors, and everyone is going to have to get used to keeping up their ground game in the information war, not just in the kinetic war.

Bushwick: Some people have also pointed out an apparent reduction in other forms of misinformation, such as vaccine-related conspiracy theories, since Russia’s internet infrastructure and payment networks were limited by sanctions. What is going on with that?

Edelson: I haven’t seen a large-scale analysis published about this. That said, there have been quite a few anecdotal reports that misinformation in other sectors has decreased markedly in the past week. We can’t say for certain that this is because of lack of internet access in Russia. The conclusion is not that all of this stuff that had been taken down was sourced from Russia. The conclusion that’s reasonable to draw from these anecdotal reports is that Russian internet infrastructure was a vital part of the tool kit of people who spread misinformation. There are a lot of pieces of this economy that are run out of Russia — bot networks, for example, networks of people who buy and sell stolen credit card information, a lot of the economy around buying stolen [social media] accounts — because Russia has historically tolerated a lot of cybercrime. Either it turns a blind eye, or a lot of these groups actually directly work for, or are contractors to, the Russian state.

Bushwick: How can we avoid falling for or spreading misinformation?

Edelson: The bottom line is that people shouldn’t have to do this. This is kind of like saying, “My car doesn’t have any seatbelts. What can I do to protect myself in a crash?” The answer is: your car should have seatbelts, and that shouldn’t be your job. But unfortunately, it is.

With that small caveat, you have to remember that the most successful misinformation appeals to emotions rather than reason. If misinformation can tap into that emotive pathway, you’re never going to question it, because it feels good, and if it feels good, it’s adjacent to being true. So the first thing I recommend is: if something makes you feel emotional — particularly if it makes you feel angry — before you share or interact with it, really ask yourself, “Who is promoting this, and do I trust them?”

Bushwick: What is the most important thing platforms need to do to install metaphorical seatbelts?

Edelson: I think the single biggest thing that platforms should be doing, especially in these moments of crisis, is [recognizing that they] should not promote content solely based on engagement. Because you have to remember that misinformation is really engaging. It is engaging for some of the reasons I talked about: highly emotive appeal, things that circumvent reason and go straight to the gut. That’s a really effective tactic for deception. So I think this is when platforms need to step up the importance of content quality versus how engaging content is. That is the number one thing they could do, and almost everything else pales in comparison.

Sophie Bushwick is an associate editor covering technology at Scientific American.

Photo by Loco Steve of a mural painted in South London in support of Ukraine, used under a Creative Commons license.
