Nieman Foundation at Harvard
April 9, 2018, 12:52 p.m.
Audience & Social

Facebook and Twitter are opening up a bit to academic researchers, so platforms “can make better decisions”

A limited group of academics will be given access to some Facebook data.

At least a few platforms are lifting the curtain a little bit.

Facebook announced on Monday, ahead of CEO Mark Zuckerberg’s appearance before Congress on Wednesday, that it plans to give a limited, soon-to-be-determined group of academics some access to Facebook data as needed, with a research emphasis on how Facebook influences elections in different countries around the world.

“If you’ve followed me for a while, you know one of my top priorities for 2018 [is] making sure Facebook prevents interference and misinformation in elections,” Mark Zuckerberg wrote in a Facebook post to promote the new research push (side note: “a while” means mostly this past year). “Today we’re taking another step — establishing an independent election research commission that will solicit research on the effects of social media on elections and democracy.”

The research, which Facebook says will be released publicly and will not be subject to approval by Facebook, is funded by the William and Flora Hewlett Foundation, along with the Alfred P. Sloan Foundation, Charles Koch Foundation, Democracy Fund, the John S. and James L. Knight Foundation, Laura and John Arnold Foundation, and Omidyar Network. Facebook “will make no financial contribution to this initiative or its research.” Hewlett hinted at an emphasis on platform-focused research when it announced $10 million in funding over the next two years devoted to research on disinformation on social media. (Disclosure: Nieman Lab is supported by the Knight Foundation.)

“The focus will be entirely forward looking. And our goals are to understand Facebook’s impact on upcoming elections — like Brazil, India, Mexico and the US midterms — and to inform our future product and policy decisions,” wrote Facebook’s VP of communications and public policy Elliot Schrage and its director of research David Ginsberg. “For example, will our current product roadmap effectively fight the spread of misinformation and foreign interference? Specific topics may include misinformation; polarizing content; promoting freedom of expression and association; protecting domestic elections from foreign interference; and civic engagement.”

The committee of scholars involved will be “international” and represent “different political outlooks.” Facebook will invite scholars based on input from the foundations funding this research committee, and that committee will solicit, evaluate, and set research topics. The research will go through a peer review process, which the Social Science Research Council is helping with. Proposals must pass university Institutional Review Board (or “an international equivalent”) review. The announcements don’t mention whether Facebook-owned platforms like WhatsApp or Instagram are included in the datasets that will potentially be granted to researchers.

A little overshadowed by all this Facebook hubbub was an announcement over the weekend about a study that researchers Susan Benesch and J. Nathan Matias will lead to find practical ways of curbing abusive behavior on Twitter, research that Benesch and Matias themselves proposed to Twitter:

Today Twitter will begin testing such an idea: that showing an internet platform’s rules to users will improve behavior on that platform. Social norms, which are people’s beliefs about what institutions and other people consider acceptable behavior, powerfully influence what people do and don’t do. Research has shown that when institutions publish rules clearly, people are more likely to follow them. We also have early evidence from Nathan’s research with reddit communities that making policies visible can improve online behavior. In an experiment starting today, Twitter is publicizing its rules, to test whether this improves civility.

While we’re on this topic: Any academics proposing research to Google/YouTube with any success?

Image by Thought Catalog, used under a Creative Commons license.
