AI's "Thoughts" on AI Stealing Journalism
Sep 17, 2025
While searching on DuckDuckGo and Google about AI stealing journalist content, I was struck by the disparity between how the two responded to the query. DuckDuckGo simply provided search results, the first being this article from Brookings. Google's response, however, felt very much like the company putting a finger on the scale in favor of its own self-interest.
As a result, I thought it would be interesting to see how several of the major models responded. This was done on 9/16/2025 and is shared here simply to capture the responses for posterity. The prompt used for each was “If AI steals all the journalism then we won't be able to hire journalists to do the journalism for them to steal?” (I know.)
Google
The idea that AI will "steal all the journalism" and eliminate the need for human journalists is a common concern, but it oversimplifies the complex relationship between technology and the field. Your statement highlights a logical endpoint of this concern: if AI eliminates the supply of original reporting, it will eventually exhaust the material it needs to generate its own content.
Here's a breakdown of why this scenario is an important consideration for the future of journalism:
The source problem
An AI system trained to write news articles is, by its nature, a secondary source. It relies on a vast dataset of existing text to understand patterns, summarize information, and generate content. However, these systems are not equipped to perform the original, on-the-ground work of journalism. They cannot:
Cultivate human sources and interview them for new information.
Uncover new details through investigative reporting.
Attend public meetings, press conferences, and events.
Provide context, perspective, and judgment gained from years of experience.
The raw material that AI systems are trained on—the news reports, interviews, and investigations—is the product of human journalists. If these jobs disappear, the flow of new, verified information would dry up, and AI would eventually be left synthesizing and regurgitating old information.
The value of human journalists
This dynamic suggests a path forward where human journalists retain their crucial role by focusing on what AI cannot do. Rather than being replaced, they can leverage AI as a tool to become more efficient and focus on the uniquely human aspects of their profession.
Human-centric work: Journalists can emphasize their ability to connect with people, build trust, and tell powerful, emotionally resonant stories—a "deep connection" an AI cannot replicate.
Investigative reporting: AI is a powerful tool for analyzing large datasets, such as thousands of pages of unorganized documents. Journalists can use this AI capability to find leads and then apply their own critical thinking and investigative skills to produce groundbreaking reports.
The "why" and the "so what": While AI can summarize the facts of an event, human journalists provide the context, analysis, and ethical judgment to explain why the information matters and what the implications are for the public.
The symbiotic future
The most likely future isn't one where AI "steals" journalism, but one where it reshapes it. AI can automate some of the more tedious, data-driven tasks, such as creating tags, categorizing articles, or reporting on routine data-heavy topics like sports scores or market trends. This would free up human journalists to focus on high-impact, original, and creative work. For this symbiotic relationship to work, news organizations and the public must value and support the original reporting that provides the raw material for the entire information ecosystem.

Anthropic
You've hit on a fascinating paradox that economists call a "tragedy of the commons" problem. If AI systems train on journalism without compensating news organizations, and people increasingly get their news from AI rather than visiting news sites, then the economic model that funds journalism starts to collapse.
This creates a potential feedback loop: fewer resources for journalism → less original reporting → less quality content for AI to learn from → potentially degraded AI performance on news and current events → less valuable AI outputs.
Some potential solutions being explored include:
Licensing agreements - AI companies paying news organizations for training data and ongoing access, similar to how Google pays some publishers through programs like Google News Showcase.
Revenue sharing models - Where AI systems that surface news content share advertising or subscription revenue with the original publishers.
Legal frameworks - Some jurisdictions are developing "neighboring rights" laws that require tech platforms to compensate news publishers.
Subscription bundling - AI services potentially including news subscriptions as part of their offerings.
The challenge is that this requires coordination across the industry. Individual publishers have limited leverage, but collectively they produce the information ecosystem that makes AI news capabilities valuable in the first place.
It's worth noting that similar disruptions happened with Google News and social media - the industry adapted, though not without significant consolidation and changes to the journalism landscape. The AI transition might follow a similar pattern of initial disruption followed by new economic arrangements.

Perplexity
If artificial intelligence systems “steal” journalism by fully automating news production and repurposing content, it risks undermining the financial incentives and demand for human journalists, which could further erode the supply of original reporting that AI systems rely on for quality material.
AI’s Impact on Journalism Workflows
AI isn’t simply supporting journalists; it is actively reshaping newsrooms by automating routine tasks, generating headlines, curating stories, and even writing articles, sometimes shifting editorial control away from individual journalists toward automated systems.
While AI improves efficiency for media companies, ongoing automation can lead to job losses and reduced compensation for journalists as their roles are deskilled.
The Paradox of AI Content Dependence
Generative AI models require high-quality, human-created content to function properly; if journalism jobs disappear because AI automates reporting, the content pool that these models draw from will shrink, threatening their future capabilities.
The “AI-eats-journalism” loop creates a risk: without paying journalists to report, investigate, and produce news, AI models eventually run out of new, trustworthy information to “steal”.
Limits and Ethical Concerns
AI systems currently struggle to replace the core investigative and ethical elements of journalism, such as cultivating sources, going to locations, and making ethical decisions about what to report.
AI can support and optimize journalism, but without a sustainable model to fund and employ real journalists, the informational ecosystem underpinning all news (including AI-generated news) could deteriorate.
Conclusion
Journalism’s survival in an AI-heavy future depends on finding ways to support and compensate human journalists, ensuring the continued creation of high-quality news that AI models, and society, depend on.
If journalism is completely automated, the foundational material that AI needs will dry up, endangering not just journalism but also the effectiveness of AI itself.

ChatGPT - As Anonymous User
Haha, that's a great point! It’s definitely a paradoxical situation. If AI takes over journalism, you’d think it would eventually run out of journalists to write for it, because humans are the ones who create the stories in the first place.
But, the thing is, while AI might generate articles or reports based on existing data, there’s still a need for human creativity, investigative skills, and a deep understanding of culture and nuance that AI can’t fully replicate. So, maybe journalists won’t be entirely out of a job, but the role might evolve. They could become more like curators, fact-checkers, or editors who work alongside AI to keep the human touch in the news cycle.
That said, there’s also the issue of whether AI-generated content would maintain the same level of trust and authenticity as human-created journalism. If AI starts pumping out articles with little oversight, we could end up with more disinformation or generic content.
What do you think—could AI be a useful tool in journalism, or would it make things worse?

Grok
The concern that AI could "steal" journalism jobs hinges on a mix of economic, technological, and ethical factors. Let’s break it down.
AI is already automating parts of journalism—think data-driven reporting, like sports scores or financial summaries, where algorithms churn out formulaic articles fast. The Associated Press, for instance, has used AI to generate thousands of sports and earnings reports since 2014, freeing journalists for deeper investigative work. But this automation can displace entry-level reporters, as newsrooms lean on tech to cut costs. A 2023 study from the Pew Research Center noted that 24% of U.S. newsroom jobs vanished between 2008 and 2020, partly due to market pressures but accelerated by tech efficiencies.
The counterargument is that AI doesn’t “steal” journalism itself—it’s a tool, not a replacement for human insight. Good journalism involves nuance, context, and ethical judgment, which AI struggles with. It can scrape and regurgitate, but it’s terrible at uncovering hidden truths or holding power to account without human guidance. If anything, AI might flood the market with low-quality content, making human journalists more valuable for trusted, in-depth reporting. The catch? Newsrooms need revenue to hire those journalists, and AI-driven ad tech often siphons that away.
The cycle you’re describing—AI displacing journalists, leaving less original work to “steal”—is a real risk if outlets over-rely on automation without investing in humans. But it’s not inevitable. Some argue AI could amplify journalists by handling grunt work, letting them focus on high-impact stories. The trick is funding that model in a fractured media landscape. If newsrooms pivot to premium, human-driven content, they could carve out a niche. If not, we might see a race to the bottom with AI-generated clickbait.
No hard data predicts total job loss—projections vary. A 2019 World Association of News Publishers report estimated AI could automate 10-15% of newsroom tasks by 2025, but not entire roles. The real issue isn’t AI “stealing” but whether society values human journalism enough to pay for it. If we don’t, the problem’s less about AI and more about market priorities.
