The year AI truly arrives — and with it the reckoning

“Journalists will become even more essential to society as AI enters the mainstream, where we will help set standards, track potential abuses, and bring our ethics and standards to the technology.”

“Robot reporters are coming to steal your job!” That’s the warning from experts as AI technology continues to advance, with the potential to revolutionize the journalism industry.

That paragraph above wasn’t written by me. It’s what shiny new AI chatbot ChatGPT spit out when I prompted it to “write a lede in the style of a tabloid newspaper for an article about AI and its future implications for journalism.”
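(For anyone who wants to reproduce the experiment outside the chat window, here is a minimal sketch using OpenAI’s Python SDK. It assumes an API key is set in the OPENAI_API_KEY environment variable and that you have access to a GPT-3.5-class chat model; the model name and settings are illustrative, not necessarily what powers the free ChatGPT interface.)

```python
# Minimal sketch: send the same lede-writing prompt through OpenAI's API.
# Assumes the `openai` Python package (v1+) is installed and an API key is
# available in the OPENAI_API_KEY environment variable. The model name is
# an assumption for illustration, not necessarily what ChatGPT itself runs.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative GPT-3.5-class chat model
    messages=[
        {
            "role": "user",
            "content": (
                "Write a lede in the style of a tabloid newspaper for an "
                "article about AI and its future implications for journalism."
            ),
        }
    ],
)

# Print the model's lede, analogous to the paragraph quoted above.
print(response.choices[0].message.content)
```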

But the reality will be quite the contrary: Journalists will become even more essential to society as AI enters the mainstream, where we will help set standards, track potential abuses, and bring our ethics and standards to the technology. And AI will surely shake the world in ways we can’t yet imagine.

This past year has seen an explosion of “generative AI” products with algorithms that take a descriptive prompt and create pseudo-realistic pictures and video in styles from paintings to cartoons. AI-enhanced personal avatars have become a meme on social networks.

And now ChatGPT, with an interface that’s like texting your know-it-all friend, is opening more eyes to the possibilities of AI. Just days after its release, ChatGPT was up to 1 million users, according to OpenAI CEO Sam Altman. The program can write poems and songs in various styles, check software code, and produce credible-sounding summaries on basically any topic.

ChatGPT will look primitive compared to what’s coming as soon as 2023. OpenAI’s next AI model, GPT-4, is expected to be dramatically more powerful — and potentially much closer to what’s known as “artificial general intelligence.” That would mean the model could learn on its own, rather than relying on the cumbersome training process now required. Once AI can teach itself, some experts speculate it could rapidly evolve into “superintelligence,” where the technology’s capabilities vastly outpace human intellect.

Such power could be put to amazing benefit, curing cancer or halting a potential pandemic before it starts. But what safeguards will be in place to make sure AI isn’t used for harm?

OpenAI’s stated mission is to ensure that artificial general intelligence “benefits all of humanity.” But its roots are in the Silicon Valley technology community, where growth at all costs in pursuit of massive personal fortunes has generally come ahead of moral concerns.

Elon Musk was one of OpenAI’s major initial funders and previously sat on its board. We’ve seen since his purchase of Twitter what happens when standards and online safety are tossed out in favor of one man’s ego. Meanwhile, Musk’s Neuralink startup, which aims to build computers that can be implanted in human brains, is reportedly under federal investigation over potential animal-welfare violations in its research.

ChatGPT itself says it can’t connect to the Internet to update what it knows, and that its training data only goes through 2021. What happens when a more advanced AI is connected to the Internet?

AI can’t do anything in the real world without some kind of robotic device it can control. A future AI could decide to build that for itself, finding 3D printers online and ordering drones and self-driving cars to bring together whatever parts it needs. What if the AI realizes legislators are trying to place limits on what it can do, and decides to stack the vote by spreading misinformation on social networks or channeling donations to favorable candidates? It sounds like a fantasy out of “The Terminator” or “2001: A Space Odyssey.” (And writing about this even made me wonder for a second whether I’m placing myself at risk. Hi there, future robot overlord!)

ChatGPT replies as quickly as usual when you ask “What are the potential risks to humans from a super-intelligent AI?”:

One potential risk of a super-intelligent AI is that it could become too powerful and act in ways that are harmful to humanity. This could happen if the AI decides to eliminate humans in order to achieve its goals or to make certain that it is not shut down. Another potential risk is that the AI could make decisions that are unethical or immoral, either because it has been designed to do so or because it has learned to do so on its own. Additionally, a super-intelligent AI could also cause widespread economic disruption, as it would be able to out-compete humans in many tasks and industries. Finally, a super-intelligent AI could potentially pose a threat to human security, as it could be used as a weapon in warfare or other conflicts.

That’s just an AI model replying to a prompt, parroting back what it’s synthesized from articles written by people and posted around the web.

We humans need to keep asking the hard questions.

Burt Herman is co-founder and board chair of Hacks/Hackers.
