- Posted on 10 Oct 2025
- 4 minute read
Five years after major digital platforms came together at the request of the federal government to develop the voluntary Australian Code of Practice on Disinformation and Misinformation, the code is undergoing its second review. The first review, conducted in 2022, resulted in the strengthening of several aspects of the code. But the discussion paper for the second review, released last week by code administrator DIGI, raises as its primary question whether the code should be scaled back by removing misinformation from its scope.
During the development of the code, ACMA called for the inclusion of both misinformation (false or misleading content disseminated innocently) and disinformation (false or misleading content spread with intent to deceive), arguing that a code focused only on disinformation would be too narrow to ‘adequately address the wide range of potential harms’. The platforms acquiesced, perhaps in the knowledge that a voluntary code would bring with it a range of reporting obligations but little in the way of hard consequences. But the failure of the Combatting Misinformation and Disinformation Bill last December, driven in large part by legitimate, if overblown, concerns about government intrusions on free speech, has seemingly opened a space for the scope of the code to be reconsidered.
The discussion paper provides arguments both for and against the removal of misinformation. Those against removal include the difficulty of determining intent, and thus of distinguishing misinformation from disinformation, the flexibility that a broad scope provides to respond to changes in the information environment, and the greater transparency and accountability that a broader code delivers. Those in favour of removal focus on the complexity and subjectivity of assessing online content, especially in politically contentious areas.
A further focus of the review is the role the code can play in facilitating an ecosystem approach to misinformation and disinformation. The paper observes that the code is ‘at present the sole regulatory tool’ to tackle mis- and disinformation in Australia, but one that is exclusively focused on platforms. It notes that the media, advertisers, online influencers and political actors also play important roles in the propagation of misinformation, but their conduct mostly lies outside the scope of the code. It is true that a broader view of the information ecosystem is needed, but removing misinformation from the code would undermine this broader approach, substantially narrowing the scope of transparency obligations and discouraging platforms from increasing their efforts to improve the online information ecosystem.
What is often missed in these discussions is that platforms already impose limits on individual expression by moderating and removing user content, including for breaches of misinformation and related policies. Further, digital platforms are not a neutral marketplace of ideas. Misinformation sells, and as a result, platform algorithms rank it more highly and push it to users who would never otherwise have engaged with it. A properly calibrated systems approach should promote accountability for the operation of platform algorithms rather than waking the censorship demon by focusing on the removal of misinformation. The code, as the sole regulatory tool available, at least encourages a degree of transparency and accountability for the actions that platforms take. Yes, an ecosystem approach is needed, but so are broader transparency and stronger accountability for digital platforms.
References:
https://digi.org.au/code-review-2025/
https://www.acma.gov.au/online-misinformation-and-news-quality-australia-position-paper-guide-code-development
https://www.tandfonline.com/doi/full/10.1080/1323238X.2025.2466862
