Synthetic media forces us to understand how media gets made

“We’ll see these expanding authenticity and provenance technology efforts intersect with the evolving TikTokification of media production, focused on remix, playful editing, and integrated AI effects.”

So you’ve been skeptical about deepfakes ever since you read a hyperbolic headline in 2018. And you’ve been right — sort of. Faceswap deepfakes haven’t rocked U.S. or European politics or permeated every social media ecosystem, and mis-contextualized shallowfake videos outnumber them thousands to one in misinformation and disinformation. But globally, false claims of deepfakery increasingly confound publics and journalists, and the foundational problem of non-consensual sexual images targeting women and LGBTQI+ people festers without solutions.

2023 will be the year in which we take seriously the measures to prepare for, but not panic over, synthetic media and its big sibling, the broader phenomenon of “generative AI.” The rapid pace and public visibility of developments in this space — including the accessibility of Stability AI’s Stable Diffusion, the picture-generating variety of DALL-E, the glimpse of the future offered by text-to-video research like Imagen Video and Phenaki, as well as the recent popularization of consumer tools like Lensa — reflect an underlying swell of technological advances, as well as potential profits taking the driver’s seat over ethics. These tools are rife with potential for distributed creativity and journalistic storytelling. But making it easier to fake realistic scenes of real people doing things they never did, to produce sexualized images of women, or to unleash nonsensical floods of fake war crimes images is not to be laughed at.

What form is better preparation likely to take? Witness’s own global consultations in our Prepare, Don’t Panic initiative on synthetic media have surfaced a number of areas: equity in access to detection tools and capacities for journalists globally and in smaller organizations, efforts to counter the insidious power of deepfake claims made about real footage, and strong platform policies and legislative options. But here I’ll focus on authenticity and provenance infrastructure, which shows the work behind media: how it was made, where it came from, how it was edited, and how it was distributed.

Early efforts like the Coalition for Content Provenance and Authenticity (C2PA) technical standards and the Content Authenticity Initiative launched in 2022, and this space will be ripe for innovation as long as we don’t default to assuming it’s just about the tamper-proof immutability of origin images, and instead understand the nuance of how media is made. Authenticity and provenance efforts focus on layers of context about media integrity and origins, available to everyone from a viewer who really wants to understand how a creative image was made to a professional investigator or journalist. They are a proactive step to engage with a manipulated and synthetic media world. Within the C2PA coalition, Witness focused on the global human rights ramifications of these types of standards, and on how they can be done right: in a way that is user-centric, respects privacy, accounts for global journalistic contexts, and avoids legislative weaponization.
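
To make those “layers of context” concrete, here is a minimal conceptual sketch in Python of how a provenance record might accumulate signed assertions about an asset’s history. It is not the C2PA specification or any real library’s API; the manifest structure, the add_assertion and sign helpers, and the HMAC demo key are all illustrative assumptions (a production system would use standards-based manifests and public-key signatures).

```python
# Conceptual sketch only: not the C2PA wire format, just the shape of the idea.
import hashlib
import hmac
import json

SIGNING_KEY = b"demo-key-held-by-the-capture-or-editing-tool"  # illustrative key, not real PKI


def asset_hash(asset_bytes: bytes) -> str:
    """Fingerprint of the media bytes at this point in its history."""
    return hashlib.sha256(asset_bytes).hexdigest()


def add_assertion(manifest: dict, action: str, asset_bytes: bytes, **details) -> None:
    """Append one layer of context: what was done, and to which version of the asset."""
    manifest["assertions"].append(
        {"action": action, "asset_sha256": asset_hash(asset_bytes), **details}
    )


def sign(manifest: dict) -> str:
    """Tamper-evident seal over the assertions (a real system would use public-key signatures)."""
    payload = json.dumps(manifest["assertions"], sort_keys=True).encode()
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()


# Walk a hypothetical image from capture through an edit, recording each step.
original = b"...raw image bytes..."
edited = original + b" + color correction"

manifest = {"assertions": [], "signature": None}
add_assertion(manifest, "captured", original, device="phone-camera")
add_assertion(manifest, "edited", edited, tool="photo-editor", operation="color-correction")
manifest["signature"] = sign(manifest)

# A viewer, journalist, or investigator can later recompute the seal to check for tampering.
assert hmac.compare_digest(manifest["signature"], sign(manifest))
print(json.dumps(manifest, indent=2))
```

The point is the shape of the information, not the mechanism: each layer records what happened and to which version of the asset, and the seal makes later tampering detectable.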

In 2023, we’ll see these expanding authenticity and provenance technology efforts intersect with the evolving TikTokification of media production, focused on remix, playful editing, and integrated AI effects. A labeling and disclosure mindset for creators and journalists alike will intermingle with the creative potential of showing how media is created and revealing the production process. We’ll start to extract ourselves from the current idea that disclosure and labeling are about singling out or discerning misinformation or malice. Want to see what this looks like in its baby steps? Open your For You page on TikTok, where each video shows the audio a creator used or the effect they incorporated.

When it comes to generative AI systems, we’re likely to see efforts (and pushback) to bake disclosure of how media is made into these models’ outputs, as well as into the combinations of real and synthetic media that will become more commonplace. It’s not just soft norms pushing in this direction, like efforts toward a Synthetic Media Code of Conduct, but also recent moves in Europe to mandate disclosure within the draft EU AI Act.
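
What might disclosure baked into outputs look like to a downstream reader or newsroom tool? A hedged sketch, continuing the illustrative (non-standard) assertion format above: scan a manifest’s assertions for AI-generation steps and turn them into a plain-language label.

```python
# Assertion names such as "generated"/"ai-edited" and the "model" field are
# illustrative assumptions, not drawn from any published standard.
def disclosure_label(assertions: list[dict]) -> str:
    """Turn provenance assertions into a plain-language disclosure for readers."""
    ai_steps = [a for a in assertions if a.get("action") in {"generated", "ai-edited"}]
    if not ai_steps:
        return "No AI generation recorded in this media's provenance."
    models = ", ".join(a.get("model", "unknown model") for a in ai_steps)
    return f"Includes AI-generated or AI-edited content ({models})."


# Example: a history that mixes a real capture with a synthetic edit.
example_assertions = [
    {"action": "captured", "device": "phone-camera"},
    {"action": "ai-edited", "model": "image-model-v1", "operation": "sky-replacement"},
]
print(disclosure_label(example_assertions))
# -> Includes AI-generated or AI-edited content (image-model-v1).
```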

These efforts will not sufficiently address non-consensual sexual images. Those threats expand in scope with open image-generating systems that permit depictions of both real individuals and sexual imagery. The problem with these images is not one of “knowing it’s a deepfake”; even more acutely than in other scenarios, it’s the weaponization of lifelike images, irrespective of their perceived “reality.”

In 2023, as we start to separate the hype from the (un)reality of deepfakes, authenticity and provenance technologies will be one place we can look to help fortify the truth and pull back the curtain on delightful creativity, by creating clear signals about how a piece of media has been created, generated, manipulated, and edited.

Sam Gregory is director of programs, strategy, and innovation at Witness, the global human rights and civic journalism network.
