March 27, 2014, 11:39 a.m.

A researcher at Stanford has some new insight into how content — specifically, visual content — becomes massively popular on Facebook, a phenomenon he calls cascade sharing.

Justin Cheng wanted to see if it was possible to predict which content would be shared over and over again. With the help of researchers at Facebook, he got access to data showing “which people (nodes) reshared each photograph and at what time.”

Cheng and his colleagues used a portion of that data to train a machine learning algorithm to search for the features that make cascades predictable.

These features include the type of image (whether it is a close-up, was taken outdoors, carries a caption, and so on); the number of followers the original poster has; the shape of the cascade that forms (a simple star graph or a more complex structure); and how quickly the cascade unfolds, its speed.
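To make that concrete, here is a minimal sketch of how a feature vector for one cascade might be assembled. The field names and the use of reshare depth as a stand-in for star-versus-deep structure are illustrative assumptions, not the study's actual feature set.

```python
from dataclasses import dataclass

@dataclass
class CascadeSnapshot:
    """Hypothetical record of a cascade observed after its first few reshares."""
    is_closeup: bool        # content: the photo is a close-up
    is_outdoors: bool       # content: the photo was taken outdoors
    has_caption: bool       # content: the photo carries a caption
    poster_followers: int   # audience of the original poster
    reshare_depth: int      # structure: 1 = simple star graph, larger = deeper tree
    minutes_to_k: float     # speed: time for the first k reshares to occur

def feature_vector(c: CascadeSnapshot) -> list[float]:
    """Flatten a snapshot into the numeric features a classifier consumes."""
    return [
        float(c.is_closeup),
        float(c.is_outdoors),
        float(c.has_caption),
        float(c.poster_followers),
        float(c.reshare_depth),
        c.minutes_to_k,
    ]
```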

Having trained the algorithm, they used it to see whether it could make predictions about other cascades. They started with images that had been shared only five times, so the question was whether those images would eventually be shared more than 10 times.

It turns out that this is surprisingly predictable. “For this task, random guessing would obtain a performance of 0.5, while our method achieves surprisingly strong performance: classification accuracy of 0.795,” they say.
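As a rough illustration of that setup (not the paper's actual model or data), the task can be framed as balanced binary classification: observe each cascade at five shares, label it by whether it eventually doubles, and train an off-the-shelf classifier. Because the classes are balanced, random guessing lands at 0.5 accuracy, the baseline against which the 0.795 figure is reported. The sketch below uses synthetic stand-in data and scikit-learn's logistic regression.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: each row is a cascade observed at 5 shares, each column one of
# the content / follower / structure / speed features described above.
n = 2000
X = rng.normal(size=(n, 6))

# Stand-in labels: True if the cascade eventually reached 10 or more shares.
# The study balances the classes, so roughly half the cascades double.
y = (X @ rng.normal(size=6) + rng.normal(scale=0.5, size=n)) > 0

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# With balanced classes, random guessing scores about 0.5;
# the study reports a classification accuracy of 0.795.
print("test accuracy:", round(model.score(X_test, y_test), 3))
```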

There’s a lot more work to be done in this area of research, but some of Cheng’s findings — for example, content that is shared rapidly is likely to become viral — could be useful in a publishing context.
