• Posted on 15 Aug 2025
  • 5 mins read

In our 2023 report, most newsrooms we spoke to expressed a mixture of enthusiasm over the potential for genAI to augment and automate some of the more mundane and time-consuming newsroom tasks, and trepidation over the disruption it would create both inside and outside the newsroom. There had been some early experimentation, but newsrooms were proceeding cautiously, with news integrity and audience trust top of mind.

Eighteen months on, concern has moderated somewhat, but implementation in Australia remains constrained compared with overseas newsrooms. This finding reflects global survey data, which show relatively low uptake amongst Australian journalists. A recent Thomson Reuters Foundation survey of journalists in the Global South found that eight in ten were using AI in their work, with 49% using it daily. By contrast, Medianet’s 2025 Australian Media Landscape Report found that 63% of Australian journalists had not used genAI in their work during the previous year.

This is not to say that experimentation isn’t happening. It is, but it has not yet translated into widespread application in newsroom workflows. Our interviews provided some insight into why this might be the case.

One of the main constraints is a perceived lack of utility or value in AI, particularly in consumer-level products. Ensuring the integrity of the news requires robust journalistic and editorial processes, including human oversight of all uses of AI. The burden this places on newsroom resources limits the viability of many AI applications that may prove productive in other parts of the media industry. Indeed, broader experimentation is occurring outside the newsroom in many of our participant organisations, for example in factual content and entertainment. The upshot is that newsrooms have focused in the first instance on low-risk, back-end applications of AI, such as summarisation and transcription. While Australian newsrooms see the potential for using AI in some audience-facing areas, particularly personalised delivery and reformatting, experimentation has shown that for many uses the technology is simply not yet good enough.

Larger, well-resourced organisations have been able to invest in the development of custom tools that are more reliable in newsroom applications. But even for them, resources are constrained, and money spent on AI must be taken from other parts of the business. As ABC standards editor Matt Brown told us, “We don't have $100 million spare ... to run around just trying all this stuff out. It takes heaps of time to do it properly and to have some faith in the integrity of the process.” For smaller organisations, particularly in regional markets, resource limitations hit even harder. There is clearly an opportunity for efficiency gains in automating back-end processes, but it depends on off-the-shelf products designed with newsroom needs in mind.

Across the board, there is little desire to use AI to produce news content. While newsrooms are looking at ways to automate time-consuming or low-value tasks, none of them see AI as capable of performing the tasks of a journalist: hitting the phones or the street, talking to people, finding the story amongst the noise. Sophia Phan, senior audience editor at Nine, said, “I feel like, in terms of it generating content, and especially content that we would use, we're so far away from that, just because we are the experts in that field.” There will certainly be disruption, and management will be looking at ways to cut costs, but all our participants recognised that their long-term business proposition is hitched to their ability to connect with their audiences, who expect news to be authentic and accurate.

Yet, looking at some of the AI applications being pursued overseas, particularly in content personalisation and repurposing, you could be forgiven for wondering whether Australian newsrooms are perhaps being too cautious. Chatbots like the Washington Post’s, which provide responses generated from the publication’s news archive, are likely a way off for Australian organisations. But there are other opportunities that have been slow to catch on here. Two examples are text summaries of online news articles and audio versions produced with synthetic voice. The latter have started to appear on some Australian news websites, but they have been in use at prominent European and US publications for some time. These uses increase accessibility and reach, with the potential to extend audiences into languages other than English. While commercial radio in Australia is using synthetic voice for service information like weather updates, there is also potential to expand this to audio updates during breaking news events, where recording each segment in person may not provide the rapid coverage that’s needed.

No doubt many of these applications will filter into the Australian market over time, and, given the risks to news integrity that genAI presents both inside and outside the newsroom, there is good reason to proceed cautiously. The need to preserve news quality is paramount, not only for newsrooms but for the integrity of the information ecosystem and the democratic processes that rely on it.

 

References

2025 Australian Media Landscape Report | Medianet
https://engage.medianet.com.au/2025-media-landscape-report

Thomson Reuters Foundation report on AI and journalism in the Global South | Thomson Reuters Foundation
https://www.trust.org/resource/ai-revolution-journalists-global-south/


Author

Michael Davis

Research Fellow, Faculty of Law
