For Media and Publishing

How Media Organisations Can Use AI Without Losing Editorial Judgement

AI tools like ChatGPT and Claude now generate content faster than reporters can write it. This pressure is real: optimised AI drafts often outperform human-written pieces in engagement metrics. The risk is that editors choose the AI version not because it is better journalism, but because the numbers say it will get more clicks. Your readers cannot tell which stories came from reporting and which came from a language model trained on internet text.

These are suggestions. Your situation will differ. Use what is useful.


Keep Reporting Separate From AI Drafting

The jobs AI does well (summarising press releases, structuring data, filling in public information) are exactly the jobs that used to train junior reporters. When you use Claude to draft a company earnings story from SEC filings, you save time but you lose the moment a reporter would have asked a follow-up question. Use AI for assembly of known facts, not for the work that develops journalistic instinct. Protect the time your team spends on reporting that AI cannot do: interviews, observation, investigation.

Recognise When AI Creates Editorial Homogenisation

When your newsroom and five other outlets all use the same AI model for the same news, you all publish the same story at the same time with the same framing. This kills the competitive advantage of having better reporters. Perplexity and Google Gemini will tell your competitors the same thing they tell you. Editors who rely on these tools for story ideas and angles are following the crowd, not leading it. Your readers chose your outlet for your voice and your judgement, not for someone else's facts read through a language model.

Maintain The Distinction Between Reporting And Generation

Readers trust journalism because they know a human made judgement calls about what matters and what is true. When you publish a story that came mostly from AI, you have an obligation to be clear about that. The AP's automated earnings summaries and sports recaps, produced with Automated Insights, work well because readers expect those formats and understand their limitations. But readers who see a byline expect human reporting. The moment a reader discovers that a story they thought was reported was mostly generated, you have damaged trust. That damage spreads, because your audience will now question which of your stories are real.

Build Editorial Standards For When To Use AI

You need a rule book, not a principle. Tell your team exactly which tasks AI is allowed to do. Use Claude to draft the first section of a council meeting story from your notes, but not to interview sources. Use ChatGPT to generate alternative headlines and then have an editor choose. Use Google Gemini to find public data on a topic, but require a reporter to verify and contextualise it. These rules should be different for different story types and beats. The rule should be specific enough that a reporter can apply it without asking an editor every time.

Measure What Matters Beyond Engagement

Your analytics probably show that AI-optimised headlines and structures get more clicks. But engagement metrics miss what journalism is actually for. You need to also measure: how many sources a story used, whether it changed anything in the community, how many corrections you published for it, and whether readers still cite it months later as important. A story that got millions of clicks but turned out to be wrong is worse than a story that got fewer clicks and was right. When you measure only engagement, AI will make you worse at journalism because it optimises for what makes people click, not what makes them informed.

Key principles

  1. AI is best at assembling known facts, not at the reporting that develops journalistic judgement and uncovers what others missed.
  2. When multiple outlets use the same AI tool for the same news, editorial advantage disappears and readers get the same story everywhere.
  3. Your readers trust your byline because they know a human made decisions about what is true and what matters, so you must protect that distinction.
  4. Engagement metrics optimise for clicks, not for accuracy or impact, so measure reader trust and lasting influence alongside traffic.
  5. Use AI for the work that saves time without damaging training, not for the work that used to teach your reporters how to think like journalists.
