April 4, 2022, 2:55 p.m.
Mobile & Apps

WhatsApp seems ready to restrict how easily messages spread in a bid to reduce misinformation

A new beta version would add significant friction to forwarding messages more than once — the latest in a line of structural changes meant to reduce how often misinformation goes viral.

It would be difficult to design a social platform more optimized for disinformation than WhatsApp. Each of its best features — private! huge scale! 1-to-1 and 1-to-many! free! — comes with a corresponding downside. Being end-to-end encrypted is a win for security, but it makes it impossible to see how misinfo is spreading through the system — either for outside observers or for Facebook-owned¹ WhatsApp itself. For fake-news peddlers, WhatsApp’s huge scale makes it appealing, being free makes it accessible, and being both 1-to-1 (chats) and 1-to-many (groups) makes it potent. WhatsApp got very good at removing friction from messaging — so good that it removed friction for bad actors, too.

That combination of features has led it to be blamed, at least in part, for a variety of incitement-related sins: lynch mobs in India, lockdown panics in Australia, and people burned alive in Mexico.

WhatsApp has taken a variety of steps to tamp down these problems, usually after a particular outrage has gotten the world’s attention. Perhaps the most emblematic country is Brazil, where WhatsApp usage is near universal (80% of adults use it, more than any other platform) and where its recent history is troubled. It was a huge platform for misinformation in Brazil’s 2018 presidential election, won by the fringe candidate Jair Bolsonaro. Polling showed Bolsonaro supporters were nearly twice as likely as his opponent’s supporters to use WhatsApp for news. And the messages surging through the platform had a decided slant:

False rumors, manipulated photos, decontextualized videos and audio hoaxes have become campaign ammunition, going viral on the platform with no way to monitor their origin or full reach.

Many of the fakes portray [Bolsonaro’s opponent Fernando] Haddad as a communist whose Workers Party would turn Brazil into another Cuba, convert children to homosexuality, and rig voting machines.

Here’s what one academic study found:

From a sample of more than 100,000 political images that circulated in those 347 [public WhatsApp] groups, we selected the 50 most widely shared. They were reviewed by Agência Lupa, which is Brazil’s leading fact-checking platform.

Eight of those 50 photos and images were considered completely false; 16 were real pictures but used out of their original context or related to distorted data; four were unsubstantiated claims, not based on a trustworthy public source. This means that 56 percent of the most-shared images were misleading.

Only 8 percent of the 50 most widely shared images were considered fully truthful.

The vast majority of false information shared on WhatsApp in Brazil during the presidential election favoured the far-right winner, Jair Bolsonaro, a Guardian analysis of data suggests…

In a sample of 11,957 viral messages shared across 296 group chats on the instant-messaging platform in the campaign period, approximately 42% of rightwing items contained information found to be false by factcheckers. Less than 3% of the leftwing messages analysed in the study contained externally verified falsehoods.

The figures suggest the spread of fake news was highly asymmetrical, accounting for much of the content being spread by and to Bolsonaro supporters on WhatsApp.

Nearly half of the right-wing posts flagged by fact-checkers “mentioned a fictional plot to fraudulently manipulate the electronic ballot system, echoing conspiracy theories promoted by Bolsonaro’s team and casting suspicion on the democratic process.” (Sound familiar, Americans?)

How did WhatsApp respond? By limiting the amount of message forwarding that Brazilian WhatsApp users can do. Initially, users could forward a message to up to 256 groups at once; that number was cut to 20 in 2018 and 5 in 2019, a move first tested in the wake of the Indian mob violence. In 2020, the limit dropped to 1, but only for messages that had already been forwarded five or more times.

All these moves limited an individual’s ability to spam a single message out to many communities — and they seem to have worked in reducing misinformation. By 2020, WhatsApp could announce, based on internal data, that the spread of “highly forwarded” messages was down 70%.

(Of course, because they’re encrypted, there’s no way to know whether those “highly forwarded” messages were misinformation or not. Some JPEGs of perfectly truthful kittens were no doubt constrained in the process, too. But, as WhatsApp PR put it: “Is all forwarding bad? Certainly not… However, we’ve seen a significant increase in the amount of forwarding which users have told us can feel overwhelming and can contribute to the spread of misinformation. We believe it’s important to slow the spread of these messages down to keep WhatsApp a place for personal conversation.”)

And now, the restrictions are apparently growing tighter still. The WhatsApp-covering news site WABetaInfo — there are niche markets everywhere! — noticed the change over the weekend in the beta version released for iOS, after previously spotting it in an Android beta.

As you can see in this screenshot, it is no longer possible to forward forwarded messages to more than one group chat at a time and this is an additional way to limit spam and misinformation. New rules for forwarding messages only apply to already forwarded messages.

That would mean the restrictions previously placed on “highly forwarded” messages would now apply to any previously forwarded messages.
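
If it helps to see the mechanics spelled out, here’s a rough sketch of the rule in code — purely illustrative, since WhatsApp doesn’t publish its client logic; the function name, the data model, and the five-chat baseline are assumptions drawn from the limits described above.

```python
# A minimal sketch, not WhatsApp's actual code: the names and thresholds
# below are assumptions based on the behavior described in this piece.

FREQUENTLY_FORWARDED_AT = 5   # forwards before a message is labeled "frequently forwarded"
BASE_FORWARD_LIMIT = 5        # chats one forwarding action can target (the limit since 2019)

def max_chats_per_forward(forward_count: int, beta_rules: bool = False) -> int:
    """How many chats a single forwarding action can target, given how many
    times the message has already been forwarded."""
    if beta_rules:
        # Behavior reported in the WABetaInfo betas: any message that has
        # already been forwarded can only go to one chat at a time.
        return 1 if forward_count >= 1 else BASE_FORWARD_LIMIT
    # Rules in place since 2020: the one-chat limit only kicks in once a
    # message has been forwarded five or more times.
    return 1 if forward_count >= FREQUENTLY_FORWARDED_AT else BASE_FORWARD_LIMIT

# Never-forwarded, forwarded-once, and "frequently forwarded" messages,
# under the current rules and the beta rules:
for count in (0, 1, 5):
    print(count, max_chats_per_forward(count), max_chats_per_forward(count, beta_rules=True))
```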

This is, of course, only a beta, meaning the feature could change (or vanish) before reaching the full user base. (I reached out to WhatsApp to ask for any further info, and a spokesperson said it had nothing to share about beta features.) But I’d expect this restriction to have an effect similar to previous ones, reducing forwards and thus misinformation (as well as videos of cute cats).

Note that this change doesn’t eliminate someone’s ability to forward a message to lots of different groups. It just makes it harder, forcing you to forward to each one individually rather than to all of them with a single tap. It adds friction.

Most of the internet’s glories have been about reducing friction. You could go to your local library to find out how tall the Empire State Building is, but it’s so much easier to check Wikipedia. (1,250 feet to the roof, 1,454 if you count the antenna.) You could find out who your middle school crush ended up marrying, but Facebook makes it simple. You could express your thoughts on politics to a potential mass audience, but that typically required buying a printing press or getting interviewed on TV. Blogs and then social media made it nearly friction-free.

But it’s remarkable how many of the attempts to deal with misinformation (and to improve civic information more broadly) are about intentionally increasing friction. Want to tell your Facebook friends about how Dr. Fauci is a crypto-nazi bent on murdering all your kids? Facebook might now pop up a warning that lets you know that what you believe is incorrect. It might still let you post it, but it will add the friction of not showing it to as many of your friends as usual.² Want to quote-tweet an article you haven’t read yet? Twitter might gently ask if you’d like to click the link first — though it won’t stop you if you don’t.

Of course there will be arguments about which frictions are the right ones to add and which might be too onerous or too restrictive of particular opinions or actions. And not all frictions work as well as you’d hope, which means lots of testing and data. But restrictions like WhatsApp’s have the benefit of being content-neutral while also aligning with the app’s DNA as a more personal, less broadcast-y platform for chat. Amid all the debates over deplatforming — which tend to involve individual users, often with their own fan bases — it’s good to also make room for more structural changes within a platform that can approach the problem from a different angle.

  1. Still not ready to say Meta. ↩︎
  2. At least that’s how it’s supposed to work. ↩︎
Joshua Benton is the senior writer and former director of Nieman Lab. You can reach him via email (joshua_benton@harvard.edu) or Twitter DM (@jbenton).