March 27, 2014, 11:39 a.m.

A researcher at Stanford has some new insight into how content — specifically, visual content — becomes massively popular on Facebook, a phenomenon he calls cascade sharing.

Justin Cheng wanted to see if it was possible to predict what content would be shared over and over again. Working with researchers at Facebook, he got access to data showing “which people (nodes) reshared each photograph and at what time.”

Cheng and his colleagues used a portion of their data to train a machine learning algorithm to search for the features of cascades that make them predictable.

These features include the type of image (whether it’s a close-up or an outdoor shot, whether it has a caption, and so on); the number of followers the original poster has; the shape of the cascade that forms, whether a simple star graph or a more complex structure; and finally how quickly the cascade takes place, its speed.
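To make those feature groups concrete, here is a minimal sketch of what a feature vector for a single cascade might look like. It is not taken from the paper; the field names and values are illustrative stand-ins for the content, poster, structure, and speed signals described above.

```python
# Hypothetical feature vector for one photo cascade, loosely following the
# four feature groups mentioned above. Names and values are illustrative only.
cascade_features = {
    # Content features
    "has_caption": 1,              # 1 if the photo has a caption
    "is_outdoor_scene": 0,         # crude content label (e.g. from an image classifier)
    # Original-poster features
    "poster_follower_count": 12_400,
    # Structural features of the reshare graph observed so far
    "num_reshares_so_far": 5,
    "is_star_graph": 1,            # 1 if every reshare came directly from the original post
    "max_depth": 1,                # longest reshare chain observed
    # Temporal (speed) features
    "minutes_to_fifth_reshare": 38.0,
}
```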

Having trained the algorithm, they tested whether it could make predictions about other cascades. Starting with images that had been reshared only five times, the question was whether those images would eventually be reshared more than 10 times.

It turns out that this is quite predictable. “For this task, random guessing would obtain a performance of 0.5, while our method achieves surprisingly strong performance: classification accuracy of 0.795,” they say.
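Framed as a balanced binary classification problem (cascades that stall at or below 10 reshares versus those that grow past 10, so chance performance is 0.5), the setup could be sketched as below. This uses randomly generated stand-in data rather than Facebook’s, and a plain logistic regression rather than whatever model the paper actually used; it only illustrates the shape of the task.

```python
# Sketch of the prediction task described above: given features of a cascade
# observed at 5 reshares, predict whether it will exceed 10 reshares.
# Uses synthetic stand-in data; the real study used Facebook photo cascades.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 2000

# Stand-in features: poster reach, early cascade shape, and speed.
X = np.column_stack([
    rng.normal(8, 2, n),       # log follower count of the original poster
    rng.integers(0, 2, n),     # 1 if the early cascade is a simple star graph
    rng.exponential(60, n),    # minutes until the 5th reshare (speed)
])

# Stand-in labels: faster, better-connected cascades are more likely to double.
score = 0.4 * X[:, 0] - 0.02 * X[:, 2] + rng.normal(0, 1, n)
y = (score > np.median(score)).astype(int)   # median split keeps classes balanced

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))  # chance would be ~0.5
```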

There’s a lot more work to be done in this area of research, but some of Cheng’s findings — for example, content that is shared rapidly is likely to become viral — could be useful in a publishing context.
