What do readers want from the news?
It’s a hard question to answer, and not only because we don’t often know what we like until we find ourselves liking it. To figure it out, news outlets have traffic patterns on the one hand, and, if they choose, user surveys on the other; each is effective and unsatisfying in its own way. But what about the middle ground — an analytic approach to creative user feedback?
Meet All Our Ideas, the “suggestion box for the digital age”: a platform designed to crowdsource concepts and opinions rather than facts alone. The platform was designed by a team at Princeton under the leadership of sociology professor Matt Salganik — initially, to create a web-native platform for sociological research. (The platform is funded in part by Google’s Research Awards program.) But its potential uses extend far beyond sociology — and, for that matter, far beyond academia. “The idea is to provide a direct idea-sharing platform where people can be heard in their own voices,” Salganik told me; for news outlets trying to figure out the best ways to harness the wisdom and creativity and affection of their users, a platform that mingles commenting and crowdsourcing could be a welcome combination.
The platform’s user interface is deceptively simple: at each iteration, it asks you to choose between two options, as you would at the optometrist’s office: “Is A better…or B?” (In fact, Salganik told me, All Our Ideas’ structure was inspired by the kitten-cuteness-comparison site Kittenwar, which aims to find images of the “winningest” kittens (and — oof — the “losingest”) through a similar A/B selection framework.) But the platform also gives you the option — and here’s the “crowdsourcing” part — of adding your own idea into the mix. Not as a narrative addition — the open-ended “Additional Comments” box of traditional surveys — but as a contribution that will be added into the survey’s marketplace and voted up or down by the other survey-takers. (The open-ended responses are limited in length — to, natch, 140 characters — thus preventing modern-day Montaignes from gumming up the works.) You can vote on as many pairings — or as few — as you wish, and contribute as many — or as few — ideas as you wish.
That contribution aspect is a small, but significant, shift. (Think of All Our Ideas, in fact, like Google Moderator — with a cleaner interface and, more significantly, hidden results that prevent users from being influenced by others’ feedback.) Because, it should be said: in general, from the user perspective, traditional surveys suck. With their pre-populated, multiple-choice framework, with those “Additional Comments” boxes (whose contents, one assumes, won’t be counted as “data” proper and so likely won’t be counted at all), they tend to preclude creativity on the part of the people taking them. They fall victim to a paradox: the larger the population of survey-takers — and thus, ostensibly, the more rigorous the data they can provide — the less incentive individual users have to take them. Or to take them seriously.
But All Our Ideas, with its invitation to creativity implicit in its “Add your own idea” button, adjusts that dynamic. The point is to inspire participation — meaningful participation — by a simple interface with practically no barriers to entry. The whole thing was designed, Salganik says, “to be very light and easy.”
Here’s what it looks like (you can also test it out for yourself using All Our Ideas’ sample interface — a survey issued by Princeton’s student government asking undergrads what improvements it should make to campus life):
The ease-of-use translates to the survey-issuers, as well: All Our Ideas is available for sites to use via an API and, for the less tech-savvy or more time-pressed, an embeddable widget. (Which is also to say: it’s free.) Surveyors can tailor the platform to the particular survey they want to run, seeding it with initial ideas and deciding whether the survey run will be entirely algorithmic or human-moderated. For the latter option, each surveyor designates a moderator, charged with approving user-generated ideas before they become part of a survey’s idea marketplace; for both options, users themselves can flag ideas as inappropriate.
So far, the platform has been adopted by organizations like Catholic Relief Services in Baltimore, which used it to survey more than 4,000 employees — based out of 150 offices worldwide and speaking several different languages — about what makes an ideal relief worker. Columbia Law School’s student government used it to find the best idea for improving campus life (that survey got 15,000 votes, Salganik told me, with 200 new ideas uploaded in the first 48 hours). And the Princeton student government survey got more than 2,000 students to contribute 40,000 votes and 100 new ideas in the space of a few weeks.
All Our Ideas, Salganik says, “deals with a fundamental problem that exists in the social sciences in terms of how we aggregate information.” Traditionally, academics can gather feedback either using pre-populated surveys, which are good at quantifying huge swaths of information, but also limited in the scope of the data they can gather…or, on the other hand, using focus groups and interviews, which are great for gathering open, unstructured information — information that’s “unfiltered by any pre-existing biases that you might have,” Salganik points out — but that are also difficult to analyze. Not to mention inefficient and, often, expensive.
And from the surveyors’ perspective, as well, surveys can be a blunt instrument: their general inability to quantify narrative feedback has forced survey-writers to rely on pre-determined questions. Which is to say, on pre-determined answers. “I’ve actually designed some surveys before, and had the suspicion that I’d left something out,” Salganik says. It’s a guessing game — educated guessing, yes, but guessing all the same. “You only get out what you put in,” he points out. And you don’t know what you don’t know.
But “one of the patterns we see consistently is that ideas that are uploaded by users sometimes score better than the best ideas that started it off,” Salganik says. “Because no matter how hard you try, there are just ideas out there that you don’t know.” But other people do.
That utility easily translates to news organizations, which might use All Our Ideas to crowdsource thoughts on anything from news articles to opinion pieces to particular areas of editorial focus. “Let’s say you’re a newspaper,” Salganik says. “You could have one of these [surveys] set up for each neighborhood in a city. You could have twenty of them.”
The platform could also be used to conduct internal surveys — particularly useful at larger organizations, where the lower-level reporters, editors, and producers who man the trenches of daily journalism might have the most meaningful ideas about organizational priorities…but where those workers’ voices might also have the least chance of being heard. News outlets both mammoth and slightly less so have been trying to rectify that asymmetry; an org-wide survey, where every contribution exists on equal footing with every other, could bring structure to the ideal of an idea marketplace that is — yes — truly democratic.
But perhaps the most significant use of the platform could be broad-scale and systemic: surveying users about, yes, what they want. (See, for example, ProPublica’s employment of an editorially focused reader survey a couple months ago.) Pose one basic question — broad (“What kinds of stories are you most interested in reading?”) or narrow (“Whom should we bring on as our next columnist?”) — and see what results. That’s a way of giving more agency to users than traditional surveys have; it’s also a way of letting them know that you value their opinions in the first place.