The year AI truly arrives — and with it the reckoning

“Journalists will become even more essential to society as AI enters the mainstream, where we will help set standards, track potential abuses, and bring our ethics and standards to the technology.”

“Robot reporters are coming to steal your job!” That’s the warning from experts as AI technology continues to advance, with the potential to revolutionize the journalism industry.

That paragraph above wasn’t written by me. It’s what shiny new AI chatbot ChatGPT spit out when I prompted it to “write a lede in the style of a tabloid newspaper for an article about AI and its future implications for journalism.”

But the reality will be quite the contrary: Journalists will become even more essential to society as AI enters the mainstream, where we will help set standards, track potential abuses, and bring our ethics and standards to the technology. And AI will surely shake the world in ways we can’t yet imagine.

This past year has seen an explosion of “generative AI” products with algorithms that take a descriptive prompt and create pseudo-realistic pictures and video in styles from paintings to cartoons. AI-enhanced personal avatars have become a meme on social networks.

And now ChatGPT, with an interface that’s like texting your know-it-all friend, is opening more eyes to the possibilities of AI. Just days after its release, ChatGPT was up to 1 million users, according to OpenAI CEO Sam Altman. The program can write poems and songs in various styles, check software code, and produce credible-sounding summaries on basically any topic.

ChatGPT will look primitive compared to what’s coming as soon as 2023. OpenAI’s latest AI model, GPT-4, is expected to be dramatically more powerful — and potentially much closer to what’s known as “artificial general intelligence.” That would mean the AI model could learn on its own, rather than the cumbersome training process now required. Once AI can teach itself, some experts speculate it could rapidly evolve into “superintelligence” — where the capabilities of the technology vastly outpace all human capabilities and intellect.

This could be used to amazing benefit, curing cancer or halting a potential pandemic before it starts. But what safeguards will be in place to make sure AI isn’t used for harm?

OpenAI’s stated mission is to ensure that artificial general intelligence “benefits all of humanity.” But its roots are in the Silicon Valley technology community, where growth at all costs to build massive personal fortunes has generally come ahead of moral concerns.

Elon Musk was one of the major initial funders of OpenAI and previously sat on its board. We’ve seen since his purchase of Twitter what happens when standards and online safety are tossed out in favor of one man’s ego. Meanwhile, Musk’s Neuralink startup, which aims to build computers that can be implanted into human brains, is being investigated for abusing animals in its research.

ChatGPT itself says it can’t connect to the Internet to update what it knows, and that its training data only goes through 2021. What happens when a more advanced AI is connected to the Internet?

AI can’t do anything in the real world without some kind of robotic device it can control. A future AI could decide to build that for itself, finding online 3D printers and ordering drones and self-driving cars to bring together whatever parts it needs. What if the AI realizes legislators are trying to place limits on what it can do, so it decides to tilt the vote by spreading misinformation on social networks or channeling donations to favorable candidates? It sounds like a fantasy out of “The Terminator” or “2001: A Space Odyssey.” (And writing about this even made me wonder for a second if I’m placing myself at risk. Hi there, future robot overlord!)

ChatGPT replies as quickly as usual when you ask “What are the potential risks to humans from a super-intelligent AI?”:

One potential risk of a super-intelligent AI is that it could become too powerful and act in ways that are harmful to humanity. This could happen if the AI decides to eliminate humans in order to achieve its goals or to make certain that it is not shut down. Another potential risk is that the AI could make decisions that are unethical or immoral, either because it has been designed to do so or because it has learned to do so on its own. Additionally, a super-intelligent AI could also cause widespread economic disruption, as it would be able to out-compete humans in many tasks and industries. Finally, a super-intelligent AI could potentially pose a threat to human security, as it could be used as a weapon in warfare or other conflicts.

That’s just an AI model replying to a prompt, parroting back what it’s synthesized from articles written by people and posted around the web.

We humans need to keep asking the hard questions.

Burt Herman is co-founder and board chair of Hacks/Hackers.
