A publication of the Centre for Advancing Journalism, University of Melbourne


For the latest news, let’s hear from our robot reporter on the spot

Are journalists even human? The answer might surprise you. Artificial intelligence is transforming news, with the technology stepping out of the shadows and into visible reporting and investigating roles. Petra Stock reports.


Fast-evolving artificial intelligence tools have features useful to journalism, said Dr Lucinda McKnight, an expert in writing for the digital age. But the Deakin University researcher cautions the technology requires human oversight and comes with bias and inaccuracy risks. Image: Unsplash/Michael Winkler

Words by Petra Stock

When an earthquake hits California, robot reporter Quakebot reviews the US Geological Survey notice, drafts a news article and alerts a human editor at the Los Angeles Times who decides whether to publish the story. The whole process takes about three minutes.
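For readers curious about the mechanics, a toy version of that pipeline might look like the Python sketch below. The feed address is the US Geological Survey’s real public GeoJSON endpoint, but the templating and the hand-off to an editor are simplified assumptions for illustration, not the Times’ actual code.

```python
# A minimal sketch of a Quakebot-style pipeline (illustrative, not the LA Times' code).
# The feed URL is USGS's public GeoJSON summary endpoint; everything else is assumed.
import requests

USGS_FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/2.5_day.geojson"

def draft_quake_stories(min_magnitude: float = 4.0) -> list[str]:
    """Fetch recent quakes and draft a templated paragraph for each significant one."""
    events = requests.get(USGS_FEED, timeout=10).json()["features"]
    drafts = []
    for event in events:
        props = event["properties"]
        if props["mag"] and props["mag"] >= min_magnitude:
            drafts.append(
                f"A magnitude {props['mag']:.1f} earthquake struck near "
                f"{props['place']}, according to the US Geological Survey."
            )
    return drafts

if __name__ == "__main__":
    for draft in draft_quake_stories():
        print(draft)  # in practice, each draft would go to a human editor, not to print
```

The crucial design choice is the last step: the software drafts, but a person decides whether to publish.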

At The Washington Post, robot Heliograf’s local reports on the 2020 US elections were read out by an artificial intelligence voice assistant and inserted directly into The Post’s political podcasts.

Meanwhile, a real estate reporting robot working for MittMedia has increased the Swedish publisher’s property coverage – from two articles a month to nearly 2000 – driving up page views and subscribers.

Artificial intelligence, or ‘AI’, is transforming journalism. Globally and in Australia, algorithms – either directly programmed or trained on data using machine learning – are performing a range of newsroom tasks. They are rapidly transcribing interviews, writing news reports, investigating, adding metadata to articles and photos, and curating news to keep readers engaged. The most recent shift has seen AI tools move from behind the scenes into more visible reporting and investigating roles.

While the evidence suggests many in the audience don’t mind whether their reporters are human or robot, experts caution that as algorithms take on more of the journalistic grunt work, newsrooms must deepen their understanding of the technology, including its strategic opportunities and inherent risks.

“If the content is useful, trustworthy, entertaining, relevant, then [the audience] don’t care whether it came from an 18-year-old intern, a 60-year-old Oxford graduate, or a piece of software,” said Professor Charlie Beckett, director of journalism and society think tank Polis.

Two in five – 40 per cent – of news leaders across 52 countries said robo-journalism, where AI automatically writes stories, will be an important industry trend in 2022, according to a Reuters Institute survey.

Interest in robo-reporting has been spurred by the rapid development of natural language models, a subset of AI capable of understanding text and writing fluently. AI writing tools have gained prominence since the 2020 release of GPT-3 – short for Generative Pre-trained Transformer 3 – a technology developed by research lab OpenAI. It is now one among dozens of AI tools that can produce human-quality text from a few prompts in a matter of seconds.
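GPT-3 itself sits behind a commercial API, but the same prompt-in, text-out pattern can be tried locally with its openly released predecessor, GPT-2, via the Hugging Face transformers library – a minimal sketch:

```python
# A minimal prompt-to-text sketch using GPT-2, an open predecessor of GPT-3,
# via the Hugging Face transformers library (not OpenAI's GPT-3 API itself).
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# The prompt is an invented example; the model continues the text from here.
prompt = "A magnitude 4.2 earthquake struck near Ridgecrest, California, on"
result = generator(prompt, max_new_tokens=40)

print(result[0]["generated_text"])  # the prompt plus the model's continuation
```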

These tools have features useful to journalism, said Dr Lucinda McKnight, an expert in writing for the digital age. But the Deakin University researcher cautions the technology requires human oversight and comes with bias and inaccuracy risks.

“They have extraordinary capabilities that human writers don’t have,” she said.

“You can write in multiple languages at once, for example. You can have keywords and search-engine-optimised terms built into your writing, relying on AI data.”

The challenge for journalists and news organisations is to understand how these tools were trained, and to use them critically and carefully. For example, a model like GPT-3 was trained by “gobbling up basically, the whole internet”, she said.

“So all of the human biases, and you know, assumptions and things that are made out there, get potentially taken up by this AI writer entity.”

McKnight said checking for bias and inaccuracy were just some of the reasons why human oversight was needed. She added that AI writers “don’t have a moral compass… they don’t have creativity and imagination like humans do. They’re not going to have emotional intelligence, empathy, those human traits.”

As large language models improve, their potential for bias and toxicity increases, a 2022 AI status report by Stanford University found. The level of toxicity – rude, disrespectful or unreasonable content – depends heavily on the underlying data used to train AI models.

In an extreme and widely criticised example, a machine learning expert created ‘the worst AI ever’ – a racist, toxic bot – by training the system on a 4chan board for so-called ‘politically incorrect’ discussions.

Dr Silvia Montaña-Niño researches the influence of data and algorithms on journalism at the Centre of Excellence for Automated Decision-Making and Society. She said AI was already being used for news production tasks ranging from automating sports, finance or weather reports, through to examining big data sets for investigations like the Pandora Papers.

Another example of AI being used in investigations was an algorithm developed by Peruvian news publisher Ojo Público. The tool, called FUNES, was designed to “detect the risk of corruption in public contracts”, she said. FUNES sifts through thousands of public contract documents, election donations and supplier records, and flags possible signs of corruption. In this way the algorithm surfaces potential story leads, which journalists then investigate and report on.
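FUNES’s internals aren’t spelled out here, but the general pattern – score each contract against a set of red-flag indicators and surface the highest scorers as leads – can be illustrated with a toy sketch. The indicators and weights below are invented for illustration; they are not Ojo Público’s actual criteria.

```python
# A toy red-flag scorer in the spirit of FUNES (invented indicators and weights,
# not Ojo Publico's actual algorithm): each rule adds to a contract's risk score,
# and high scorers become leads for journalists to investigate.
from dataclasses import dataclass

@dataclass
class Contract:
    supplier: str
    value: float
    num_bidders: int
    supplier_donated_to_awarding_official: bool

def risk_score(c: Contract) -> float:
    score = 0.0
    if c.num_bidders == 1:
        score += 0.4   # sole-bidder awards are a classic warning sign
    if c.supplier_donated_to_awarding_official:
        score += 0.5   # possible conflict of interest
    if c.value > 1_000_000:
        score += 0.1   # large contracts merit closer scrutiny
    return score

contracts = [
    Contract("Acme SA", 2_500_000, 1, True),
    Contract("Beta SAC", 80_000, 5, False),
]

# Flag likely leads; the algorithm only points, humans investigate.
leads = [c.supplier for c in contracts if risk_score(c) >= 0.5]
print(leads)
```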

Montaña-Niño said that in Australia, while the use of AI to distribute and recommend stories was already common, newsrooms had been slower to adopt news-production tools like robo-writers. One likely reason, she said, was the smaller size of the media market and the abundance of human journalists, which makes it less economical to replace their work with tailored algorithms.

The ABC has been one of the more innovative organisations, automating emergency alerts and using chatbots, she said.

Michael Collett works as a conversation designer at the ABC and was one of the creators of an ABC news chatbot on Facebook Messenger in 2016. More recently, Collett has been working on automating emergency information for smart speakers, so that users can ask Alexa if there are any emergencies in their local area.

Collett was careful to point out that the ABC’s automated emergency alerts are deliberately programmed. They don’t rely on the branch of AI called machine learning where computers are trained on large quantities of data.

He said this means that “if there ever was an issue with our content in the ABC emergency voice app, we can explain exactly how it arrived at that, and we can make corrections”.
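The distinction Collett draws – explicit rules rather than learned behaviour – is what makes such a system auditable. A hypothetical rule of that kind might be as simple as the following sketch (illustrative only, not the ABC’s code; the warning levels are Australia’s standard tiers):

```python
# A hypothetical deliberately-programmed alert rule (illustrative, not the ABC's
# code): every branch is explicit, so any output can be traced back to a rule
# and corrected directly, unlike behaviour learned from training data.
def emergency_alert(warning_level: str, area: str) -> str | None:
    templates = {
        "advice": f"Advice: stay informed about conditions in {area}.",
        "watch_and_act": f"Watch and Act: conditions in {area} are changing.",
        "emergency": f"Emergency Warning for {area}: act now to stay safe.",
    }
    # An unknown level produces no alert rather than a guessed one.
    return templates.get(warning_level)

print(emergency_alert("emergency", "Gippsland"))
```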

Collett said the design was informed by his journalism background, making him more aware of the risks and editorial responsibilities with automation.

With the rise of AI, Montaña-Niño anticipates that the biggest transformation for newsrooms and journalists is going to be understanding the technology and the data it draws from. For example, how have algorithms been designed or trained, who owns the technology and what is its design intent?

Beckett said AI’s “package of technologies” also poses interesting strategic questions for journalism and media organisations.

“If these technologies are doing some of the journalistic labour, much of it quite mundane, and boring and repetitive, and functional and formulaic, then part of the challenge is to make sure that that works properly, but also it challenges them to do something else.”

He said that ‘something else’ could involve freeing up time for journalists for “more creative, or more human, or more analytical” work, or using the technology’s capabilities to trawl through data as a tool for investigations.

But the nature of journalism means news organisations won’t necessarily have the time to reflect and take that holistic, structural view, Beckett said. Like previous technological developments – radio and television broadcasting, digital and social media – newsrooms are having to adapt and incorporate the technology while still delivering the daily news.

“It’s a bit like being on a motorbike and you’re having to keep going, but you’re rebuilding the engine at the same time,” he said.

“With each new technological change, you can’t stop, get off and build a new bike … because you’re The New York Times, or you are some local newspaper in Canberra.”

Image by Craiyon, an AI model that can draw images from text prompts

