The year AI truly arrives — and with it the reckoning

“Journalists will become even more essential to society as AI enters the mainstream, where we will help set standards, track potential abuses, and bring our ethics and standards to the technology.”

“Robot reporters are coming to steal your job!” That’s the warning from experts as AI technology continues to advance, with the potential to revolutionize the journalism industry.

That paragraph above wasn’t written by me. It’s what the shiny new AI chatbot ChatGPT spat out when I prompted it to “write a lede in the style of a tabloid newspaper for an article about AI and its future implications for journalism.”
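
For readers curious what that kind of prompt looks like outside the chat window, here is a minimal sketch of sending the same request through OpenAI’s API with the official Python client. The model name and parameters are illustrative assumptions — the lede above came from the ChatGPT web interface, not from code.

```python
# Minimal sketch: asking an OpenAI model for a tabloid-style lede.
# Assumes the `openai` Python package (v1+) is installed and an
# OPENAI_API_KEY is set in the environment. The model choice is a
# hypothetical stand-in for whatever powers ChatGPT.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model name, not a claim about ChatGPT
    messages=[{
        "role": "user",
        "content": (
            "Write a lede in the style of a tabloid newspaper for an "
            "article about AI and its future implications for journalism."
        ),
    }],
)

# The generated lede comes back as the first choice's message content.
print(response.choices[0].message.content)
```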

But the reality will be quite the contrary: Journalists will become even more essential to society as AI enters the mainstream, where we will help set standards, track potential abuses, and bring our ethics and standards to the technology. And AI will surely shake the world in ways we can’t yet imagine.

This past year has seen an explosion of “generative AI” products with algorithms that take a descriptive prompt and create pseudo-realistic pictures and video in styles from paintings to cartoons. AI-enhanced personal avatars have become a meme on social networks.

And now ChatGPT, with an interface that’s like texting your know-it-all friend, is opening more eyes to the possibilities of AI. Just days after its release, ChatGPT was up to 1 million users, according to OpenAI CEO Sam Altman. The program can write poems and songs in various styles, check software code, and produce credible-sounding summaries on basically any topic.

ChatGPT will look primitive compared to what’s coming as soon as 2023. OpenAI’s latest AI model, GPT-4, is expected to be dramatically more powerful — and potentially much closer to what’s known as “artificial general intelligence.” That would mean the AI model could learn on its own, rather than relying on the cumbersome training process now required. Once AI can teach itself, some experts speculate it could rapidly evolve into “superintelligence” — where the technology vastly outpaces all human capabilities and intellect.

This could be used to amazing benefit, curing cancer or halting a potential pandemic before it starts. But what safeguards will be in place to make sure AI isn’t used for harm?

OpenAI’s stated mission is to ensure that artificial general intelligence “benefits all of humanity.” But its roots are in the Silicon Valley technology community, where growth at all costs in pursuit of massive personal fortunes has generally come ahead of moral concerns.

Elon Musk was one of OpenAI’s major initial funders and previously sat on its board. We’ve seen since his purchase of Twitter what happens when standards and online safety are tossed out in favor of one man’s ego. Meanwhile, Musk’s Neuralink startup, which aims to build computers that can be implanted into human brains, is being investigated for abusing animals in its research.

ChatGPT itself says it can’t connect to the Internet to update what it knows, and that its training data only goes through 2021. What happens when a more advanced AI is connected to the Internet?

AI can’t do anything in the real world without some kind of robotic device to control. A future AI could decide to build that for itself, finding online 3D printers and ordering drones and self-driving cars to bring together whatever parts it needs. What if the AI realizes legislators are trying to place limits on what it can do, and decides to rig the vote by spreading misinformation on social networks or channeling donations to favorable candidates? It sounds like a fantasy out of “The Terminator” or “2001: A Space Odyssey.” (And writing about this even made me wonder for a second if I’m placing myself at risk. Hi there, future robot overlord!)

ChatGPT replies as quickly as usual when you ask “What are the potential risks to humans from a super-intelligent AI?”:

One potential risk of a super-intelligent AI is that it could become too powerful and act in ways that are harmful to humanity. This could happen if the AI decides to eliminate humans in order to achieve its goals or to make certain that it is not shut down. Another potential risk is that the AI could make decisions that are unethical or immoral, either because it has been designed to do so or because it has learned to do so on its own. Additionally, a super-intelligent AI could also cause widespread economic disruption, as it would be able to out-compete humans in many tasks and industries. Finally, a super-intelligent AI could potentially pose a threat to human security, as it could be used as a weapon in warfare or other conflicts.

That’s just an AI model replying to a prompt, parroting back what it’s synthesized from articles written by people and posted around the web.

We humans need to keep asking the hard questions.

Burt Herman is co-founder and board chair of Hacks/Hackers.
