AI Won’t Kill Journalism — But It Might Starve It
Artificial intelligence won't murder journalism with a flash of ones and zeros — but it may quietly starve it to death.
The technology's impact on newsrooms, workflows, and truth-telling is deeply paradoxical. As MSNBC's former president Rashida Jones observed at a recent speaking event, AI holds enormous promise for streamlining newsroom tasks and creating efficiencies, but its application comes with heavy questions about displacement and authenticity. “You can now have a product, input it into a model, and the AI will create the entire campaign — ads, social media, the works,” she said. “That used to take a whole team. Now it's two people and a program.”
That promise of speed and scale is seductive. But at what cost? Washington Post editor Marc Fisher, speaking on another occasion, offered a quiet rebuke to such over-optimization. His vision of journalism is intensely human — rooted in messiness, empathy, and trust. Fisher recalled knocking on a widow's door to inform her of her husband's death, and how, instead of slamming the door, she invited him in for hours. “People are desperate to be heard,” he said. “Even in the worst moments of their lives, they want to tell their story.” AI can write prose and draft reports; what it cannot do is sit with grief.
The industry's fundamental dilemma is not whether AI can be useful in journalism — it already is. Automated transcription, basic copyediting, and even some forms of data-driven storytelling are already streamlined by AI tools. The issue is structural. AI is accelerating the collapse of journalism's economic foundations even as it mines journalism's content to power its own future. It is, in effect, building castles with bricks stolen from crumbling homes.
The numbers from the Brookings Institution paint a devastating picture. While traffic to major news sites rose 43% over the past decade, revenue plummeted by 56% (Brookings, 2023). Their research documents that two and a half newspapers close every week across America. Thousands of journalists face layoffs annually, with over 2,500 newsroom positions eliminated in 2023 alone, according to their media employment tracker. Meanwhile, AI systems trained on this journalism — sometimes even paywalled material — repackage the work with zero attribution or compensation to publishers.
This extraction economy is nothing new. As Rashida Jones put it, “There are so many more places for people to get content now. It's all fragmented.” And in that fragmentation lies the crisis: as search engines and platforms offer instant answers with no attribution or compensation, journalism loses its audience, revenue, and, eventually, its purpose.
Yet AI fundamentally depends on journalism. Without fact-checked, reported content, the large language models underpinning tools like ChatGPT, Bing, or Gemini would rapidly degrade. The Brookings Institution's analysis of AI training data shows that news articles comprise nearly 48% of the most influential datasets that train these systems (Brookings Technology Policy Division, 2024). Remove that layer of human reporting, and AI risks hallucinating itself into oblivion.
So what's next? The future will hinge on regulation and reinvention. Governments must step in to ensure compensation, enforce intellectual property rights, and support bargaining codes similar to those in Australia and Canada. Newsrooms, meanwhile, must adopt pricing strategies and content licensing models that reflect the real value of their work throughout the AI value chain.