The year AI truly arrives — and with it the reckoning

“Journalists will become even more essential to society as AI enters the mainstream, where we will help set standards, track potential abuses, and bring our ethics and standards to the technology.”

“Robot reporters are coming to steal your job!” That’s the warning from experts as AI technology continues to advance, with the potential to revolutionize the journalism industry.

That paragraph above wasn’t written by me. It’s what shiny new AI chatbot ChatGPT spit out when I prompted it to “write a lede in the style of a tabloid newspaper for an article about AI and its future implications for journalism.”

But the reality will be quite the contrary: Journalists will become even more essential to society as AI enters the mainstream, where we will help set standards, track potential abuses, and bring our ethics and standards to the technology. And AI will surely shake the world in ways we can’t yet imagine.

This past year has seen an explosion of “generative AI” products with algorithms that take a descriptive prompt and create pseudo-realistic pictures and video in styles from paintings to cartoons. AI-enhanced personal avatars have become a meme on social networks.

And now ChatGPT, with an interface that’s like texting your know-it-all friend, is opening more eyes to the possibilities of AI. Just days after its release, ChatGPT was up to 1 million users, according to OpenAI CEO Sam Altman. The program can write poems and songs in various styles, check software code, and produce credible-sounding summaries on basically any topic.
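
For readers who want to tinker beyond the chat window, here is a minimal sketch of sending the same kind of prompt to one of OpenAI’s hosted models from Python. It is an illustrative assumption only: it presumes the openai package and an API key, the model name is a placeholder, and the column above used the ChatGPT web interface itself rather than an API.

    # Minimal sketch: send the article's prompt to an OpenAI-hosted chat model.
    # Assumes the `openai` Python package (v1 or later) and an OPENAI_API_KEY
    # environment variable; the model name below is a placeholder.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder; use whatever chat model you have access to
        messages=[{
            "role": "user",
            "content": "Write a lede in the style of a tabloid newspaper for an "
                       "article about AI and its future implications for journalism.",
        }],
    )

    print(response.choices[0].message.content)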

ChatGPT will look primitive compared to what’s coming as soon as 2023. OpenAI’s next AI model, GPT-4, is expected to be dramatically more powerful — and potentially much closer to what’s known as “artificial general intelligence.” That would mean the model could learn on its own, rather than relying on the cumbersome training process now required. Once AI can teach itself, some experts speculate it could rapidly evolve into “superintelligence,” where the technology’s capabilities vastly outpace human intellect.

This could be used to amazing benefit, curing cancer or halting a potential pandemic before it starts. But what safeguards will be in place to make sure AI isn’t used for harm?

OpenAI’s stated mission is to ensure that artificial general intelligence “benefits all of humanity.” But its roots are in the Silicon Valley technology community, where growth at all costs in pursuit of massive personal fortunes has generally come ahead of moral concerns.

Elon Musk was one of OpenAI’s major initial funders and previously served on its board. We’ve seen since his purchase of Twitter what happens when standards and online safety are tossed out in favor of one man’s ego. Meanwhile, Musk’s Neuralink startup, which aims to build computers that can be implanted into human brains, is being investigated for abusing animals in its research.

ChatGPT itself says it can’t connect to the Internet to update what it knows, and that its training data only goes through 2021. What happens when a more advanced AI is connected to the Internet?

AI can’t do anything in the real world without some kind of robotic device it can control. A future AI could decide to build that for itself, finding online 3D printers and ordering drones and self-driving cars to bring together whatever parts it needs. What if the AI realizes legislators are trying to place limits on what it can do, so it decides to sway the vote by spreading misinformation on social networks or channeling donations to favorable candidates? It sounds like a fantasy out of “The Terminator” or “2001: A Space Odyssey.” (And writing about this even made me wonder for a second if I’m placing myself at risk. Hi there, future robot overlord!)

ChatGPT replies as quickly as usual when you ask “What are the potential risks to humans from a super-intelligent AI?”:

One potential risk of a super-intelligent AI is that it could become too powerful and act in ways that are harmful to humanity. This could happen if the AI decides to eliminate humans in order to achieve its goals or to make certain that it is not shut down. Another potential risk is that the AI could make decisions that are unethical or immoral, either because it has been designed to do so or because it has learned to do so on its own. Additionally, a super-intelligent AI could also cause widespread economic disruption, as it would be able to out-compete humans in many tasks and industries. Finally, a super-intelligent AI could potentially pose a threat to human security, as it could be used as a weapon in warfare or other conflicts.

That’s just an AI model replying to a prompt, parroting back what it’s synthesized from articles written by people and posted around the web.

We humans need to keep asking the hard questions.

Burt Herman is co-founder and board chair of Hacks/Hackers.
