- Posted on 8 May 2025
- 5-minute read
If Donald Trump was the third force disrupting this election, then the rise of influencers and podcasters, together with the proliferation of fake accounts and bots, is surely the fourth.
It’s our first election since the release of generative AI, a new technology that has election officials in Canberra and around the world on edge. AI can generate endless fake articles and comments designed to trigger outrage or reinforce existing biases. It can analyse voter data to micro-target vulnerable communities and create deepfakes that even the most skilled digital forensic teams struggle to debunk.
It sounds terrifying, and rightly so.
The existence of a pro-Russian influence operation, ‘Pravda’, in this election is a concern. Pravda Australia presents itself as a news site, but it’s not. Throughout the campaign, the website pumped out over 150 stories a day, many drawn from well-known Russian propaganda sites.
Its aim is not to connect with readers but to influence the chatbots. The assumption is that the more the chatbots read, the more informed their answers are. But this only holds if the information they are ingesting is free from disinformation and propaganda. This is how bad actors infect our information and media environments to sow division.
Disinformation designed to undermine confidence in our electoral processes was also swirling around. The Australian Electoral Commission (AEC) identified and debunked seven prominent pieces of disinformation, including claims that voter preferences are controlled by the AEC (not the voter) and an old video of an AEC polling official rubbing out votes marked in pencil.
Deepfakes also made an entrance this election, albeit an underwhelming one. One dodgy Peter Dutton deepfake circulated on Chinese social media.
Not great, but a long way from what’s unfolding across the Atlantic.
Yet complacency in this moment of intense disruption would be unwise. Disinformation flourishes in societies that are already, to a large extent, polarised, and this is where the trendlines for Australia are not exactly comforting.
According to the latest McKinnon Poll, 55% of Australians feel the country has become more divided.
The arrival of social media influencers in this campaign has been one of the more interesting developments. New voices, including Hannah Ferguson, the 26-year-old founder of Cheek Media, and Abbie Chatfield with her “It’s a Lot” podcast, are setting their own mould for how a new generation engages with political debate. They are opinionated and transparent about how they frame the issues, which for Ferguson is "incredibly left-wing, progressive and feminist".
Ferguson is part of a broader trend.
An ideology gap is opening up between young men and young women in countries across the world, including the UK, US, South Korea and Germany. Young women are becoming more liberal. Young men more conservative. According to Tapestry Research, they occupy different online worlds, breeding misunderstanding and resentment.
Our social media influencers have a less acknowledged co-publisher: the powerful and far less transparent social media algorithms that reward outrage and deepen echo chambers. The risk for Australia is that these engines of polarisation create precisely the conditions in which disinformation flourishes.
Compounding this opaque algorithmic environment is the presence of bots within it. Bots can be AI-generated, driving fake accounts that impersonate real people and like, share and comment on content. Fake accounts and bots have long lingered in the digital world, but, again, the arrival of generative AI has given them industrial-scale capabilities.
According to disinformation detection company Cyabra, roughly one in five accounts recently analysed on X discussing the Australian election was fake and used AI-generated images. It’s likely that politicians, staffers and voters engaged with bots and fake accounts believing them to be real people.
Overall, Australia is proving resilient to disinformation. But, fuelled by social media, we are becoming more polarised. And that’s a risky place to be.
The fourth force has arrived.
Author

Lisa Main
Director, Main Bureau – a communications intelligence agency.