“Robot reporters are coming to steal your job!” That’s the warning from experts as AI technology continues to advance, with the potential to revolutionize the journalism industry.
That paragraph above wasn’t written by me. It’s what shiny new AI chatbot ChatGPT spit out when I prompted it to “write a lede in the style of a tabloid newspaper for an article about AI and its future implications for journalism.”
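(The prompt above went through ChatGPT’s web chat interface. For anyone curious how a similar prompt could be sent programmatically, here is a minimal sketch using OpenAI’s Python library; the model name and client setup are assumptions for illustration, not details from this piece, and may differ from whatever powers ChatGPT itself.)

```python
# Minimal sketch: sending the same tabloid-lede prompt through OpenAI's
# Python library. The model name is an assumption; swap in whatever model
# is current. Requires an API key in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model, not specified in this article
    messages=[{
        "role": "user",
        "content": (
            "Write a lede in the style of a tabloid newspaper for an "
            "article about AI and its future implications for journalism."
        ),
    }],
)

print(response.choices[0].message.content)
```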
But the reality will be quite the contrary: Journalists will become even more essential to society as AI enters the mainstream. We will help set standards, track potential abuses, and bring our ethics to the technology. And AI will surely shake the world in ways we can’t yet imagine.
This past year has seen an explosion of “generative AI” products with algorithms that take a descriptive prompt and create pseudo-realistic pictures and video in styles from paintings to cartoons. AI-enhanced personal avatars have become a meme on social networks.
And now ChatGPT, with an interface that’s like texting your know-it-all friend, is opening more eyes to the possibilities of AI. Just days after its release, ChatGPT was up to 1 million users, according to OpenAI CEO Sam Altman. The program can write poems and songs in various styles, check software code, and produce credible-sounding summaries on basically any topic.
ChatGPT will look primitive compared to what’s coming as soon as 2023. OpenAI’s latest AI model, GPT-4, is expected to be dramatically more powerful — and potentially much closer to what’s known as “artificial general intelligence.” That would mean the AI model could learn on its own, rather than relying on the cumbersome training process now required. Once AI can teach itself, some experts speculate it could rapidly evolve into “superintelligence,” with capabilities that vastly outpace human intellect and ability.
This could be used to amazing benefit, curing cancer or halting a potential pandemic before it starts. But what safeguards will be in place to make sure AI isn’t used for harm?
OpenAI’s stated mission is to ensure that artificial general intelligence “benefits all of humanity.” But its roots are in the Silicon Valley technology community, where growth at all costs in pursuit of massive personal fortunes has generally come ahead of moral concerns.
Elon Musk was one of OpenAI’s major initial funders and previously served on its board. We’ve seen since his purchase of Twitter what happens when standards and online safety are tossed out in favor of one man’s ego. Meanwhile, Musk’s Neuralink startup, which aims to build computers that can be implanted into human brains, is being investigated for abusing animals in its research.
ChatGPT itself says it can’t connect to the Internet to update what it knows, and that its training data only goes through 2021. What happens when a more advanced AI is connected to the Internet?
AI can’t do anything in the real world without some kind of robotic device to control. A future AI could decide to build that for itself, finding online 3D printers and ordering drones and self-driving cars to bring together whatever parts it needs. What if the AI realizes legislators are trying to place limits on what it can do, so it decides to stack the vote by spreading misinformation on social networks or channeling donations to favorable candidates? It sounds like a fantasy out of “The Terminator” or “2001: A Space Odyssey.” (And writing about this even made me wonder for a second if I’m placing myself at risk. Hi there, future robot overlord!)
ChatGPT replies as quickly as usual when you ask “What are the potential risks to humans from a super-intelligent AI?”:
One potential risk of a super-intelligent AI is that it could become too powerful and act in ways that are harmful to humanity. This could happen if the AI decides to eliminate humans in order to achieve its goals or to make certain that it is not shut down. Another potential risk is that the AI could make decisions that are unethical or immoral, either because it has been designed to do so or because it has learned to do so on its own. Additionally, a super-intelligent AI could also cause widespread economic disruption, as it would be able to out-compete humans in many tasks and industries. Finally, a super-intelligent AI could potentially pose a threat to human security, as it could be used as a weapon in warfare or other conflicts.
That’s just an AI model replying to a prompt, parroting back what it’s synthesized from articles written by people and posted around the web.
We humans need to keep asking the hard questions.
Burt Herman is co-founder and board chair of Hacks/Hackers.