Oct. 5, 2017, 12:31 p.m. | Audience & Social | LINK: newsroom.fb.com | Posted by: Ricardo Bilton

Facebook’s outsized role in spreading fake news during last year’s presidential election is, at this point, undeniable. Facebook has taken some responsibility, and on Thursday it introduced a new feature meant to give users more context about articles while they’re reading them and before they share them with others.

The new feature is small but, in Facebook’s view, significant: Facebook users will soon start seeing a small information button on news articles that appear in the News Feed. When users click the button, they’ll see a panel with information from the source site’s Wikipedia page, content related to the article in question, and details about where and how the article is being shared.

Facebook says that the goal is to give people tools to make more informed decisions about which stories to read, share, and trust. A news source without a Wikipedia page could signal that it shouldn’t be trusted as much as a site that has one (though the nature of Wikipedia means that anyone can create and edit entries, making it a somewhat dubious source of authority; some legitimate small news sites may also not have Wikipedia pages at all). The related content feature will give users other takes on, or more context about, an article they’re reading.

While it’s not clear how popular the new context feature will be with users, it will probably have some kind of impact, if only because it puts all of these new tools a click away. The project is a product of feedback from users and from the organizations involved in the Facebook Journalism Project, whose work we’ve covered previously.

Mark Zuckerberg initially denied claims that Facebook had a significant role in spreading fake news during the election, but he’s reversed his stance of late (sort of). At the same time as it’s introducing new tools meant to curtail the spread of fake news in the News Feed, Facebook is also cooperating with a Congressional investigation into how Russians may have used the platform’s ad tools to target specific voters, including those in key swing states.

Facebook is also realizing that it can’t always throw more tech at these problems: the company said this week that it plans to add 1,000 employees to the team that reviews ads purchased on its platform.
