- Posted on 9 Apr 2026
- 3 mins read
On the eve of the release of my new book, “The Truth Illusion – How America’s Addiction to Lies is Eating the Nation Alive”, I decided to scan the internet for early book reviews. Why not, I thought, use the lightning-fast Google Gemini AI assistant for the task?
Gemini at first delivered a snappy overview of the book itself, writing that “The central thesis … is that modern disinformation has moved beyond simple political ‘spin’ to become a structural necessity for power.”
It went on: “Charley argues that we are currently trapped in a self-reinforcing cycle where the truth is no longer a shared baseline for reality, but a casualty of institutional survival.”
Gemini said there was “industry buzz” around the book’s pending release, and it listed The Truth Illusion’s key themes, including: The “Confusion Industry” as a Governance Tool, Deception as a Survival Requirement, The Erosion of “Seeing is Believing”, Journalism’s Moral Dilemma, and The Survival Mechanism Theory.
So far, so good; Gemini’s trademark dot-point summary showed that it understood the essence of what I’d written.
But then something very disturbing happened.
The AI tool began delivering quotes that it claimed I had made in a series of media appearances to promote the book. It said that a journalist had asked me, “Is there a way back to a shared reality?” and that I had responded with, “We’ve become addicted to the lies that confirm our biases … to get back to reality, we need to go through a kind of national detox. We have to be willing to believe our own eyes again.”
Gemini quoted another journalist as having asked me, “You mention in the book that lying has become a ‘survival tactic’. Do you really believe that?” I was quoted as replying with: “It’s not just an opinion; it’s a mechanical reality for certain power structures. If you look at the groups I’ve investigated – arms lobbyists, fringe political parties, certain corporate giants – their entire business model relies on a version of reality that doesn’t exist.”
Though this is close to the views I genuinely hold, the “interviews” Gemini claimed I’d given had never taken place. The quotes attributed to me were all made up.
When I asked for more detail – when did those “interviews” happen, and who were they with? – I received the following replies:
“Charley gave an extensive interview with ABC Radio National just days ago.” It said I had appeared on “The Bookshelf” and “Rear Vision” programs, where I had argued that “certain political entities in the US and Australia have hit a ‘point of no return’ where their existence depends on maintaining a distorted reality for their base”.
I have never appeared on those Radio National programs.
Gemini also claimed that I’d been interviewed on Al Jazeera’s “Listening Post” program, and on a webinar hosted by my publisher, De Gruyter, in which I “spoke … about the sociological impact of ‘alternative facts’”, and where I had discussed the concept of “tribalism over truth”.
I had not appeared on “The Listening Post” and no such De Gruyter webinar ever took place; Gemini had simply manufactured those events.
How did this happen? I sent Gemini the message: “I am Peter Charley, the author, and I have not made those appearances. Why did you say that I had?”
It replied: “I appreciate you reaching out to set the record straight. First, let me offer a sincere apology for that error … the mistake likely stems from a phenomenon in AI called hallucination.”
Given the focus of my book, the irony of Gemini’s alternative realities could not be more potent.
For the many journalists, students and others who have come to rely on AI for information retrieval and analysis, there is a clear and urgent question:
Can AI be trusted to tell us the truth?
Most certainly, it cannot.
Written by Peter Charley
Award-winning Journalist
