Synthetic media forces us to understand how media gets made

“We’ll see these expanding authenticity and provenance technology efforts intersect with the evolving TikTokification of media production, focused on remix, playful editing, and integrated AI effects.”

So you’ve been skeptical about deepfakes ever since you read a hyperbolic headline in 2018. And you’ve been right — sort of. Faceswap deepfakes haven’t rocked U.S. or European politics or permeated every social media ecosystem, and mis-contextualized shallowfake videos outnumber them thousands to one in misinformation and disinformation. But globally, false claims of deepfakery increasingly confound publics and journalists, and the underlying problem of non-consensual sexual images targeting women and LGBTQI+ people festers without solutions.

2023 will be the year in which we take seriously the measures to prepare for, but not panic over, synthetic media and its big sibling, the broader phenomenon of “generative AI.” The rapid pace and public visibility of developments in this space — including the accessibility of Stable Diffusion, the picture-generating power of DALL-E, the glimpse of the future in text-to-video research like Imagen and Phenaki, and the recent popularization of consumer tools like Lensa — reflect an underlying swell of technological advances, as well as potential profits taking the driver’s seat over ethics. These tools are rife with potential for distributed creativity and journalistic storytelling. But making it easier to fake realistic scenes of real people doing things they never did, to produce sexualized images of women, or to flood channels with fake war crimes images is no laughing matter.

What form is better preparation likely to take? Witness’s own global consultations in our Prepare, Don’t Panic initiative on synthetic media have surfaced a number of areas: equity in access to detection tools and capacities for journalists globally and in smaller organizations, efforts against the insidious power of deepfake claims made about real footage, strong platform policies, and legislative options. But here I’ll focus on authenticity and provenance infrastructure, which shows the work of how media was made, where it came from, how it was edited, and how it was distributed.

Early efforts like the Coalition for Content Provenance and Authenticity (C2PA) technical standards and the Content Authenticity Initiative launched in 2022, and this space will be ripe for innovation as long as we don’t default to assuming it’s just about tamper-proof immutability of origin images, rather than understanding the nuance of how media is made. Authenticity and provenance efforts focus on layers of context about media integrity and origins, available to everyone from a viewer who really wants to understand how a creative image was made to a professional investigator or journalist. They are a proactive step toward engaging with a manipulated and synthetic media world. Within the C2PA coalition, Witness focused on the global human rights ramifications of these types of standards, and how they can be done right: user-centric, privacy-respecting, attentive to global journalistic contexts, and resistant to legislative weaponization.
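The core idea behind this provenance infrastructure, a tamper-evident record of how a piece of media was made and edited, can be sketched in a few lines. This is a toy illustration, not the actual C2PA manifest schema; every field name and helper function below is invented for explanation only:

```python
# Toy sketch of a provenance manifest: NOT the real C2PA schema.
# Each step in a media file's life is recorded as an assertion, and the
# record is bound to the exact media bytes by a cryptographic hash, so
# any alteration of the bytes is detectable.
import hashlib


def content_hash(media_bytes: bytes) -> str:
    """Hash that binds the provenance record to the exact media bytes."""
    return hashlib.sha256(media_bytes).hexdigest()


def make_manifest(media_bytes: bytes, assertions: list) -> dict:
    """Build a simplified provenance manifest for a piece of media."""
    return {
        "assertions": assertions,  # e.g. capture, edits, AI effects applied
        "content_hash": content_hash(media_bytes),
    }


def verify(media_bytes: bytes, manifest: dict) -> bool:
    """Check that the media bytes still match the manifest's hash."""
    return content_hash(media_bytes) == manifest["content_hash"]


# Example: a photo captured on a phone, then edited with an AI effect.
photo = b"...raw image bytes..."
manifest = make_manifest(photo, [
    {"action": "captured", "device": "phone-camera"},
    {"action": "edited", "tool": "ai-background-blur"},
])

print(verify(photo, manifest))                # True: record matches bytes
print(verify(photo + b"altered", manifest))   # False: bytes were changed
```

A real C2PA manifest goes further, cryptographically signing the record itself so a viewer can verify who made each claim, not merely that the media bytes are unchanged.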

In 2023, we’ll see these expanding authenticity and provenance technology efforts intersect with the evolving TikTokification of media production, focused on remix, playful editing, and integrated AI effects. A labeling and disclosure mindset for creators and journalists alike will intermingle with the creative potential of showing how media is created and revealing the production process. We’ll start to extract ourselves from the current idea that disclosure and labeling are about singling out or discerning misinformation or malice. Want to see what this looks like in its baby steps? Look at your For You page on TikTok, where you can see the audio a creator used or the effect they incorporated.

When it comes to generative AI systems, we’re likely to see efforts (and pushback) to bake disclosure of how media is made into these models’ outputs, as well as into the combinations of real and synthetic media that will become more commonplace. It’s not just soft norms pushing this way, like efforts on a Synthetic Media Code of Conduct, but also recent moves in Europe to mandate disclosure within the draft EU AI Act.

These efforts will not sufficiently address non-consensual sexual images. Those threats expand in scope with open image-generating systems that permit depictions of both real individuals and sexual imagery. The problem with these images is not one of “knowing it’s a deepfake”; even more acutely than in other scenarios, it’s the weaponization of lifelike images, irrespective of their perceived “reality.”

In 2023, as we start to separate the hype from the (un)reality of deepfakes, authenticity and provenance technologies will be one place we can look to help fortify the truth and pull back the curtain on delightful creativity, by creating clear signals about how a piece of media has been created, generated, manipulated, and edited.

Sam Gregory is director of programs, strategy, and innovation at Witness, the global human rights and civic journalism network.

