Prediction: AI companies grapple with what it means to be creators of news
By Maggie Harrison Dupré
Excerpt: “The media and tech industries, frequently to the dismay of both, are deeply and inextricably intertwined.”
Prediction ID: 4d6167676965-25
 

As generative AI creeps further into news delivery, discovery, and consumption, both the media and the tech industries will increasingly find themselves in a strange new digital landscape where tech giants aren’t just aggregators of journalism, but creators of it.

The media and tech industries, frequently to the dismay of both, are deeply and inextricably intertwined. For centuries, technology has offered pathways for media companies to create and distribute their product; media companies, meanwhile, have provided value for new distribution platforms. Today, it’s search giants like Google and social media behemoths like Meta that run the distribution show, printing money off ad dollars and farming data-rich engagement as digital media ventures rely on them for readership.

But that landscape is changing. Investments continue to flood the AI bubble, and tech companies throughout Silicon Valley are racing to infuse generative AI into every corner of their businesses. This trend is perhaps nowhere more palpable than in the online search market, where AI is altering existing products like Google Search — the monopolistic entryway to the web, now frequently outfitted with an “AI Overview,” an AI-generated paraphrasing of search results affixed above the results themselves — and spawning entirely new search products, like OpenAI’s “ChatGPT search” version of its flagship chatbot and the startup Perplexity’s similar “Answer Engine.”

Powering large-scale AI tools is a practice of setting boatloads of cash on fire, so how tech companies will effectively monetize AI search products remains unclear (advertising and subscription models are being floated, but the economics are dicey). But with executive enthusiasm for AI showing little sign of slowing down, Google, OpenAI, and others are due to find themselves grappling with an entirely different set of challenges: the editorial responsibilities that come with purveying, instead of merely curating, news and information.

Journalism isn’t easy. Reporting is difficult, time-consuming work that demands attention to detail. Get those details wrong, and the consequences vary: a mistake like a wrong date could result in a simple — if never fun — correction, while a misleading statement or outright falsehood could change the entire context of a piece of news, exposing publishers and journalists, as well as reporting subjects, to reputational and legal damage. Ensuring proper attribution is also critical; readers need to know where information is coming from, and writers need to avoid plagiarizing others’ work.

Of course, right now, AI tools aren’t actually reporting — they’re swallowing the work of human journalists and regurgitating it into something just scrambled enough to evade copyright law. Doing so, though, marks a step for the tech industry beyond offering algorithm-organized links to news; editorial choices are inherent to even short news and search summaries, and accurate information and attributions are just as important.

As it stands, generative AI tools aren’t reliable agents of any of these basic journalistic standards. Google’s AI Overview was caught spitting out unhinged medical recommendations and mangling the details of historical events, with the company’s CEO later admitting that there’s no way to ensure accuracy in the product’s current state; a German journalist was falsely accused by Microsoft’s OpenAI-powered Copilot tool of being a child molester; Forbes is suing Perplexity for copyright theft; and a new review by Columbia’s Tow Center for Digital Journalism found that ChatGPT search, in addition to sourcing and citing blatantly plagiarized articles, frequently misattributes direct quotes from publishers in its AI summaries — even quotes from publishers that have entered into “strategic content partnerships” with OpenAI. If a human journalist were consistently making these same errors, they’d probably be out of a job.

Will the tech get better? Maybe! Will ongoing lawsuits filed by media institutions and AI-defamed individuals against AI companies change the industry’s legal and regulatory landscape? We’ll see! Regardless, tech companies’ role in the information ecosystem is shifting — and with it, their relationship with publishers and the media economy at large. The bottom line: tech companies are no longer just operators of the information superhighway.

Maggie Harrison Dupré is a senior staff writer for Futurism.
