Are you sure that’s true? Truth Goggles tackles fishy claims at the moment of consumption

Dan Schultz’s BS-detection software really works, but there are a lot of technology issues — and people issues — getting in the way of a mainstream product.

[Photo: “X-ray Gogs”]

True or false? No googling.

“The total unemployment rate for Hispanic or Latino workers has increased from 10% to 10.3%” between January 2009 and March 2012.

Now, what if I told you President Obama uttered those words? Do you trust the statistic more or less? What if Mitt Romney said it?

The claim is neither true nor false, really; truth is three-dimensional. For the answer, click here to activate Truth Goggles.

Click the text Truth Goggles highlights and you’ll see that PolitiFact rated the claim (it was Romney’s) as “mostly false.”1 It is true that the unemployment rate for Hispanics and Latinos rose during that period, but the numbers actually fell if February 2009 — Obama’s first full month in office — is used as the baseline.

Imagine if every factual claim were highlighted in news articles — true, false, or otherwise. The gap between consumption and correction of bad information effectively would be reduced to zero. That’s the goal of Truth Goggles, a tool created by MIT master’s graduate Dan Schultz. (Go ahead and drag this Truth Goggles link to your bookmarks bar and try it around the web.) Truth Goggles draws on PolitiFact’s database of about 5,500 fact-checked claims and flags any matches in the article you’re reading.

Schultz has open-sourced the code and posted it to GitHub, about eight months after we first covered the idea. The front end is written in JavaScript with the jQuery library, and the back end is PHP (mixed with some Python he’s still working on). Truth Goggles communicates with PolitiFact via a private API, so Schultz’s code won’t do you much good without a database to check against.
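
As a rough sketch of that flow (illustrative only, not Schultz’s actual code: the endpoint URL, response shape, and class name are invented, the real PolitiFact API is private, and jQuery is assumed to be loaded), a bookmarklet along these lines would fetch fact-checked claims and wrap any exact matches on the page in a clickable highlight:

    // Illustrative sketch only -- not Schultz's code. The endpoint, response
    // shape, and class name are hypothetical; the real PolitiFact API is private.
    (function () {
      var CLAIMS_URL = 'https://example.com/claims.json'; // placeholder endpoint

      $.getJSON(CLAIMS_URL).done(function (claims) {
        // Assumed response: [{ text: '...claim wording...', url: 'link to fact-check' }, ...]
        claims.forEach(function (claim) {
          $('p').each(function () {
            var $p = $(this);
            // Exact string match only -- paraphrased claims go unnoticed.
            if ($p.html().indexOf(claim.text) !== -1) {
              $p.html($p.html().replace(
                claim.text,
                '<mark class="truth-goggles" data-url="' + claim.url + '">' +
                  claim.text + '</mark>'
              ));
            }
          });
        });
      });

      // Clicking a highlight opens the underlying fact-check.
      $(document).on('click', 'mark.truth-goggles', function () {
        window.open($(this).data('url'));
      });
    })();

The hard parts are everything this sketch glosses over: matching claims that aren’t word-for-word identical, having enough checked claims to match, and deciding how the highlight should look.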

Schultz is now working as a Knight-Mozilla OpenNews fellow at The Boston Globe, where he will try to continue developing the project part-time. Bill Adair, the editor of PolitiFact, said his operation is considering adopting the source code and building a PolitiFact-branded version of Truth Goggles.

Schultz created Truth Goggles as his thesis project at the MIT Media Lab. He identified three major technology problems that need to be solved or improved for it to be a fully functional, user-friendly product, and he recently shared them with me.

1. Paraphrase detection

You’re unlikely to see Truth Goggles work on the vast majority of news articles. Truth Goggles matches only exact instances of fact-checked phrases. Taking the example from the top, a reporter could have written: Romney said the unemployment rate for Hispanics has increased from 10 percent to 10.3 percent since President Obama took office. That sentence would be invisible to Truth Goggles.

Figuring this out is the Holy Grail of automated fact-checkers, Schultz said. Natural language processing is advancing in its quest for code to understand language the way we do, but truly reliable NLP is a long way off. And if the software gets close but still messes up, highlighting the wrong claim would just confuse the user.
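
To make the gap concrete, here is an illustrative comparison (mine, not part of Truth Goggles) between the fact-checked wording and the paraphrase above. An exact substring test fails outright, and even a crude word-overlap score can only say the two sentences are similar, not that they assert the same claim, which is exactly the judgment reliable NLP would have to make.

    // Illustrative only: why exact matching misses paraphrases, and why a naive
    // similarity score is no substitute for real natural language processing.
    var checked = 'The total unemployment rate for Hispanic or Latino workers ' +
                  'has increased from 10% to 10.3%';
    var paraphrase = 'Romney said the unemployment rate for Hispanics has increased ' +
                     'from 10 percent to 10.3 percent since President Obama took office.';

    console.log(paraphrase.indexOf(checked) !== -1); // false -- no highlight

    // A crude token-overlap score (Jaccard index) over lowercased words.
    function tokens(s) {
      return new Set(s.toLowerCase().match(/[a-z0-9.%]+/g) || []);
    }
    function jaccard(a, b) {
      var setA = tokens(a), setB = tokens(b), shared = 0;
      setA.forEach(function (t) { if (setB.has(t)) shared++; });
      return shared / (setA.size + setB.size - shared);
    }

    // The sentences overlap heavily, but how high a score means "same claim"?
    // Set the threshold wrong and you highlight sentences that were never checked.
    console.log(jaccard(checked, paraphrase).toFixed(2));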

2. Scale

Truth Goggles is limited to those claims which PolitiFact has checked — an impressive corpus of journalism, sure, but a wimpy number compared to all of the things politicians have ever claimed. You could add FactCheck.org’s database to the mix. And Snopes, if it ever released an API. Say that gets the number up to 15,000. “That’s not nearly enough to create a system that will be actually relevant on a regular basis,” Schultz said. “Let’s say everything was perfect…you’d still rarely see a highlight.”

This is a problem with fact-checking, not fact-checking software. It can take days to verify a claim that leaves a politician’s lips in seconds. By the time PolitiFact publishes a judgment, that particular claim may be old news. Or it might not have made the news at all. Or maybe it never made the transition from words in a video to words in text. I googled several dozen claims in search of news articles that included them — I wanted to blockquote a real article for the lead of this story, instead of a hypothetical. It was all but impossible. Virtually every result was a fact-check of the claim, or people linking to a fact-check of the claim, or a transcript of whatever the claim appeared in — rather than the false claim itself. So Truth Goggles will not work on most articles, because journalists aren’t writing stories about every claim. (And that’s a good thing, right?)

3. User interface

Setting aside the back-end wizardry, the front-end design of Truth Goggles proved to be a massive project of its own. For Truth Goggles to work, the software has to interrupt a user’s reading without driving him or her crazy.

Schultz conducted a user study in which he presented three interfaces: “Goggles Mode,” which blurs all of the text following the first highlighted claim; “Safe Mode,” which blocks out claims until a user clicks each one to reveal it; and “Highlight Mode,” which highlights claims in yellow while leaving the other text alone. Seventy percent of participants selected “Highlight Mode” when given the choice. (Schultz stresses his user study was not very scientific, since people probably wanted to play with all of the options.)
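
As a rough illustration of the idea (the class names and styles here are mine, not from Schultz’s code), the three modes amount to little more than swapping the stylesheet applied to the highlighted claims:

    // Hypothetical mode styles for the highlighted claims; not the actual interface code.
    var MODE_STYLES = {
      // "Goggles Mode": blur everything after the first highlighted claim.
      goggles:   '.tg-after-first-claim { filter: blur(4px); }',
      // "Safe Mode": black out each claim until the reader clicks to reveal it.
      safe:      'mark.truth-goggles { background: #000; color: #000; cursor: pointer; }' +
                 ' mark.truth-goggles.revealed { background: #ff6; color: inherit; }',
      // "Highlight Mode": a plain yellow highlight, leaving the rest of the text alone.
      highlight: 'mark.truth-goggles { background: #ff6; }'
    };

    function setMode(mode) {
      $('#tg-mode-style').remove(); // drop the previous mode's stylesheet
      $('<style id="tg-mode-style">').text(MODE_STYLES[mode]).appendTo('head');
    }

    // Safe Mode: reveal a claim once it has been clicked.
    $(document).on('click', 'mark.truth-goggles', function () {
      $(this).addClass('revealed');
    });

    setMode('highlight'); // the option most participants preferred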

Then there is the matter of color. Truth Goggles always highlights text in neutral yellow. Red and green are automatic cues — False! True! — which can defeat the purpose of the software. Red and green feel final, opposites on the color wheel. That framing treats truth as a binary, not a continuum. (In fact, PolitiFact uses six flavors of “true” and “false.”)

If I’m an Obama supporter and I see that Romney claim highlighted in red, I only become more deeply entrenched. I might be less inclined to click on the claim to learn more, or I might not click at all.

“I didn’t want it to be possible for people to become less thoughtful,” Schultz told me. “You’re in a spot where you don’t have to take any more action as to why it’s false.”

Plus, PolitiFact can make mistakes. It does update its judgments from time to time. “If you highlight something red as false, and you made a mistake, that is much more damaging than highlighting something as yellow and saying, ‘This has been fact-checked,’” Schultz said.

Indeed, the people problems might prove more daunting than the technology problems. “This is the great challenge in political journalism that, to use a different eyewear metaphor, people see things through their own partisan prisms,” Adair said.

“Even if you are a nonpartisan fact-checker, you’re going to anger one or both sides, and that’s the nature of this disruptive form of journalism. And at a time when people are going into echo chambers for their information, it can be a challenge. The one thing I would say to that is I don’t think what we’re doing is telling people what to think. We’re just trying to tell them information to consider.”

That was the biggest lesson Schultz said he learned: “Trying to tell people what to think is a losing battle,” he wrote in a blog post. The winning battle is telling people when to think.

Photo by photobunny/Earl used under a Creative Commons license.

Notes
  1. Truth Goggles says “barely true,” the name PolitiFact used for this ruling before renaming it “Mostly False.” Schultz hasn’t updated the code yet.
                                   
  • http://www.hisnameistimmy.com Tim in SF

    This app relies on Politifact, but that organization has been shown to be composed of right-wing hacks and dupes, the latter of which are burdened under the crushing weight of false equivalency.

    John Cole said it best when talking about Politifact’s “lie of the year”: 
    http://www.balloon-juice.com/2011/12/04/the-year-of-lying-decadently/

    Something that relies on Politifact should not have “Truth” in its name or description.

  • Rearl

    You would be smarter to personally cross-check statements yourself. Relying on any single entity is a mistake today. That goes for Truth Goggles or Snopes or any of these other resources.

  • http://www.slifty.com Daniel Schultz

    You are absolutely correct — the point of Truth Goggles is to simply remind you when you should go beyond your gut and think more carefully even when you are tempted to just accept or reject a claim based on the person who said it or the way it was presented to you.

    It’s using one source for now; it will use more eventually.  Ultimately, though, the goal of the tool is to remind people to think for themselves.

  • http://www.slifty.com Daniel Schultz

    It relies far more on your brain than it relies on PolitiFact.

    The problem is that it is so easy for people to let their own biases limit the functionality of that resource!  For instance, I suspect that you are a bit more extreme about your beliefs concerning PolitiFact than you ought to be.

    Regardless, though, interfaces that remind you when claims should be viewed with scrutiny *should* help you ultimately reach more informed conclusions on your own. You are absolutely right, though, that a single source powering such an interface is dangerous. Really this tool needs many sources with many perspectives to reach its full potential.

  • Tasheka11213

    It’s a flat-out lie!!! According to the Bureau of Labor Statistics, in February 2009 the Hispanic unemployment rate was 11.1 percent, which means it actually went down under President Obama to 10.3 percent. Between January 2008 and January 2009 it jumped from 6.5 percent to 10.0 percent under President BUSH.