While deepfakes and other troubling misuses of Artificial Intelligence are making headlines, there are brilliant, smaller-scale AI tools and use cases at our collective fingertips that have the potential to be utterly transformative. Here’s a summary of some of the most important and easy-to-use examples.

If we reiterate one point this smartoctober, it’s that AI doesn’t and shouldn’t mean handing over creative control of your newsroom to a farm of robots. There has understandably been much trepidation about the promise of AI and what this might mean for publishers.

There is no doubt that AI can enhance the way news is reported and delivered, as well as improve workflow efficiencies for businesses across the board.

[Logo: Reuters Institute for the Study of Journalism]


It’s smartoctober! Every year we spend a month deep diving into a subject or skill set that we deem to be absolutely essential to newsrooms around the world. This year it’s time to consider AI.


Right now, there are tools and features available to editors and journalists which, rather than pitching, researching, writing and publishing Nobel-winning features from scratch, will help refine your output. They are best viewed as you’d consider automatic headline suggestions in your CMS: a useful, time-saving device of considerable value for creating a headline, but of little use for anything else.

So, rather than think about what’s next for AI and where all this is going, this week we’re considering what’s available now.

Three easy ways to start incorporating AI into your newsroom, without disrupting your workflow

  1. Automated user needs analysis


We’re excited about this one. Currently in beta, the user needs analysis tool allows you to paste in the text of any article and then our algorithmic wizards in the code will analyse it to reveal which user need(s) it covers.


When you’re starting off with user needs, it can be difficult to ascertain which user needs your articles address. This tool takes the labour out of that process and allows you to see quickly where your articles sit in the matrix of four core ‘drivers’.

But it’s not just illustrative. It’s actionable too. While it reveals which driver or user needs are addressed, there are also tips on where articles fall short, meaning that you can ensure that what you think you’re producing is what’s actually being delivered.

Automated user needs analysis in journalism is pivotal for several reasons, Goran Milovanovic, senior data scientist at smartocto, explains. "Firstly, it significantly enhances efficiency and responsiveness by swiftly providing feedback on user needs in particular articles. This agility ensures that news organisations can adapt to rapidly changing audience preferences. Secondly, it enables targeted content delivery, enhancing user engagement and loyalty. Automated analysis also empowers newsrooms with data-driven insights, guiding editorial decisions to produce higher-quality content. Lastly, this approach fosters a continuous feedback loop, facilitating ongoing improvement in content relevance and accountability.”

Rutger Verhoeven, cofounder and CMO at smartocto, thinks this tool could be crucial for optimising your content strategy. "Because this technology gives you instant feedback about which journalistic angle your article was written from, you immediately learn where optimisations can be made. If you wanted to write a contextual Educate me piece, but the system returns that it has mainly become a factual representation, you can specifically search for places where the story can be sharper, better and more attractive to your audience.”

[Screenshot: the user needs tool]

Copy the text from any article, paste into the first box and click ‘analyse’. That’s it. You’ll then see the balance of the article across the four drivers (know, understand, feel, do). If one reading here is higher, this indicates it’s well situated within a user need, but if there’s no clear ‘winner’, it may indicate that the motivation and angle of the piece is unclear. From here, you’ll be able to see where there are gaps in coverage - or angles that could be addressed in follow ups.


The article you’re analysing might return no tips, like in our example above. This doesn’t mean the tool is broken; at present it serves tips in two situations. If an article falls emphatically within one user need (usually over 70%), it recognises that follow-ups from different user needs perspectives might be useful. Likewise, if the four axes show an even-ish split, the system concludes that the article was not focused enough in its planning and/or execution, meaning that the audience might feel confused after reading it.
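To make the thresholds concrete, here is a minimal, illustrative sketch of that decision logic. The 70% cutoff comes from the description above; the function name, the 15% “even split” margin and the exact wording are our own assumptions, not smartocto’s actual implementation.

```python
# Illustrative sketch only: mirrors the tip logic described above,
# not smartocto's real code. The 0.15 "even split" margin is an assumption.

def user_needs_tip(scores: dict) -> str:
    """scores: share of each driver (know, understand, feel, do), summing to ~1.0."""
    top_driver = max(scores, key=scores.get)
    if scores[top_driver] > 0.70:
        # Clearly situated in one user need: suggest follow-ups from other angles.
        return f"Strongly '{top_driver}': consider follow-ups from other user needs."
    if max(scores.values()) - min(scores.values()) < 0.15:
        # No clear winner: the piece may lack a focused angle.
        return "Even split: the article's angle may be unclear to readers."
    return "Mixed profile: no tip."

print(user_needs_tip({"know": 0.10, "understand": 0.10, "feel": 0.75, "do": 0.05}))
# A score of 75% 'feel' triggers the follow-up suggestion.
```

An evenly balanced input, such as 25% on each driver, would instead return the “even split” warning.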

  • Follow ups are vitally important to the health of your newsroom ecosystem. To read more on that, see how the article pasted in above could have benefitted from this in our analysis, recently published on the WAN-IFRA blog.

-> You can also test headlines with this tool. Try it now. Type in 'Man Bites Dog'.

The tool is currently in the development phase, and its accuracy may not be consistent. This separate AI prototype will be available until 31st December 2023. For further information, see userneeds.smartocto.com.

[Screenshot: ‘man bites dog’ analysed at userneeds.smartocto.com]

Even with a short, three-word headline, the tool does its thing. The result is 100% Emotional. The analysis is smart enough to know that this headline subverts the normal convention:

“The article 'man bites dog' is a classic example of an Emotional user need article. It aims to divert the audience and elicit emotions by presenting a surprising and unusual event.”

And, because it’s so completely in the ‘emotional’ field, the tip served here aims to prolong the life of the article by addressing a different user needs approach in follow up stories.

  • We’ve got a whole section about user needs for news on our website, which you can access here.

  2. Headline testing

A simple example of what AI can do for you, now. A/B testing is something which newsrooms do as a matter of course - but there is room for refinement.

What AI can do

Given good inputs and prompts from editors, ChatGPT can suggest alternative headlines.

What it can’t…

Perform the whole process

How it works

Using a good prompt, ChatGPT can make suggestions for alternative headlines, which editors can select (or not) as an option for A/B testing. These alternatives are not automatically applied, and aren’t necessarily particularly good, but they can help redirect thinking and provide a fresh perspective to the process. The quality of the editorial input is absolutely vital, though.

A basic prompt might look like this:

  • "Create three alternative headlines for this article: [article headline]."

But that’s not really sufficient - and it certainly won’t provide a decent headline. If answers are only as good as the questions asked, the same is true for output from prompts. This is better:

[Screenshot: a detailed ChatGPT instruction and the suggested headlines it returned]
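As a sketch of what a richer prompt might look like when assembled programmatically, here is a hypothetical helper. The function name, the editorial fields (summary, audience) and the wording are illustrative assumptions, not smartocto’s or OpenAI’s actual prompt; the resulting string would be sent to a chat model of your choice.

```python
# Hypothetical sketch: building a context-rich headline prompt for a chat model.
# Field names and wording are illustrative, not a documented prompt template.

def build_headline_prompt(headline: str, summary: str, audience: str, n: int = 3) -> str:
    """Combine the original headline with the context an editor would supply."""
    return (
        "You are an experienced news sub-editor.\n"
        f"Original headline: {headline}\n"
        f"Article summary: {summary}\n"
        f"Target audience: {audience}\n"
        f"Suggest {n} alternative headlines under 70 characters, "
        "factually faithful to the summary, suitable for A/B testing."
    )

prompt = build_headline_prompt(
    "Man Bites Dog",
    "A local man bit a dog during a dispute in the park.",
    "general news readers",
)
print(prompt)
```

The point is the structure, not the specific fields: the more editorial context the prompt carries, the less generic the suggested headlines become, which is exactly the “answers are only as good as the questions asked” principle above.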

We’ve shared the workflow for ChatGPT, but smartocto is also currently training a model with loads of historical data and domain knowledge, which will act as an alternative suggestion engine for its Tentacles feature. If you want to join the beta testing team, click the ‘ask for a demo’ button. Check the sneak preview below.


-> We worked with a Dutch publisher on this subject, and you can read the resultant client case here

-> See also how AI can help with topic and story discovery in this article from Nieman Lab

[Sneak preview: headline testing with Tentacles]

  3. Automated translation

There was once a sketch on a US late night comedy show in which the anchor dryly intoned that “in international news, no Americans were killed today”. It’s a wry, pithy comment on how we deal with information and news outside our own locales (often, we don’t), but of course there’s enormous potential here - even when it’s stuck behind a language barrier.

It’s now possible to take audio and video and, through tools like HeyGen, automatically translate content into other languages. This has of course been a possibility for a while, so here’s why this is new and exciting:

  • The tool matches individuals’ speech patterns and cadences, so the translation actually sounds like them
  • It also manages to match mouth movement to speech, reducing the clunky effect of overdub
  • Because the AI voice actually sounds like that of the original speaker, there’s much less disconnect, and a greater feeling of authority and authenticity - things that we know help build trust

-> See also this article from INMA about AI-aided video content for news

So, in summary…

"Right now in newsrooms we see a lot of ‘A’, but unfortunately very little ‘I’ in AI", says Rutger Verhoeven. "In the months and years to come this will change and, when it does, it will undoubtedly revolutionise the media industry. In the meantime look out for those use cases which will provide the examples and inspiration needed to make a successful transition to an AI-rich workflow.”