Nieman Foundation at Harvard
Oct. 5, 2017, 12:31 p.m.
Audience & Social
LINK: newsroom.fb.com   |   Posted by: Ricardo Bilton

Facebook’s outsized role in spreading fake news during last year’s presidential election is, at this point, undeniable. Facebook has taken some responsibility, and on Thursday introduced a new feature meant to give users more context about articles while they’re reading and before they share them with others.

The new feature is small but, in Facebook’s view, significant: Facebook users will soon start seeing a small information button on news articles that appear in the News Feed. When users click the button, they’ll see a panel with information from the source site’s Wikipedia page, content related to the article in question, and details about where and how the article is being shared.

Facebook says that the goal is to give people tools to make more informed decisions about which stories to read, share, and trust. A news source without a Wikipedia page could signal that it shouldn’t be trusted as much as a site that has one (though the nature of Wikipedia means that anyone can create and edit entries, making it a somewhat dubious source of authority; some legitimate small news sites may also not have Wikipedia pages at all). The related content feature will give users other takes or more context on an article they’re reading.

While it’s not clear how popular the new context feature will be with users, it will probably have some kind of impact, if only because the feature puts all of these new tools a click away. The project is a product of feedback from users and from the organizations involved in the Facebook Journalism Project, whose work we’ve covered previously.

Mark Zuckerberg initially denied claims that Facebook had a significant role in spreading fake news during the election, but he’s reversed his stance of late (sort of). At the same time as it’s introducing new tools meant to curtail the spread of fake news in the News Feed, Facebook is also cooperating with a Congressional investigation into how Russians may have used the platform’s ad tools to target specific voters and those in key swing states.

And Facebook is also realizing that it can’t always throw more tech at these problems. Facebook said this week that it plans to add 1,000 employees to the team that reviews the ads purchased on its platform.
