• Posted on 17 Jul 2025
  • 4 minute read

In another win for X Corp, the Administrative Review Tribunal has torn up eSafety’s takedown notice over a post by Canadian activist Chris Elston. The post misgendered Australian adult Teddy Cook, linked to a Daily Mail story featuring Mr Cook and suggested that trans people belong in psychiatric wards. Offensive? Certainly. But was it ‘cyber-abuse material targeted at an Australian adult’ under the Online Safety Act 2021 (Cth)?

Section 7 sets a two-step test:

  1. Intention element – Would an ordinary reasonable person think the poster was likely intending to cause serious harm? 
  2. Offence element – Would that same ordinary reasonable person, in the adult’s shoes, find the post menacing, harassing or offensive?

The intention element carries particular weight. Deputy President Damien O’Donovan wrote that without it, section 7 becomes ‘a broadly available censorship tool based on emotional responses’.

When addressing the issue of intention, O’Donovan first had to ask: what evidence can the ‘ordinary reasonable person’ consider when making a judgement? eSafety wanted the Tribunal to keep the lens tight: the tweet, the Daily Mail link and whatever context a casual scroller would see. O’Donovan disagreed. He reasoned that because the Act creates a complaints-based investigative regime with coercive powers, the ordinary reasonable person can review all lawfully obtained evidence: sworn statements, analytics, prior conduct, expert affidavits—everything.

After addressing issues around the definitions of ‘likely’ and ‘serious harm’, O’Donovan concluded that, on the evidence, the hypothetical ordinary reasonable person would not think it more likely than not that Elston’s purpose was to cause Cook serious harm. He observed that there was no direct tagging of Mr Cook, that Elston had never heard of Cook before the media story, and that the post was not targeted: Elston routinely misgenders trans people, and this tweet simply echoed his general activism.

The intention element failed, and without it, the post could not be classified as ‘cyber-abuse material targeted at an Australian adult’.

What can we take away from this decision? Proving intent is notoriously difficult, and ‘serious harm’ is a high bar to clear.

Targeting, such as a named @-mention or hashtag aimed at the complainant, will weigh heavily on intention. Its absence here mattered. Elston was plainly unfussed about any hurt feelings, yet indifference is not the same as intent to inflict serious harm.

Parliament also proceeded on the assumption that adults wear a thicker skin. Only when content crosses ‘well beyond reasonable commentary or expression of opinion and into the realm of intentional, serious harm’ does the takedown power bite.

An obvious question is the suitability of the ordinary reasonable person test in this provision. The Tribunal reasoned that the Act’s investigatory powers suggest any gathered information should be available to the hypothetical decision maker. Yet this makes for a very well-informed ‘ordinary reasonable person’. It is striking that eSafety argued against including the very contextual material it can gather under those powers. While a narrow approach may occasionally suit the facts, as X’s counsel noted, ‘See you in Melbourne tonight, darling’ means something very different when the poster is a known abuser and the recipient is in hiding.

For now, context is king. The ‘ordinary reasonable person’, with all the knowledge arising from a regulatory investigation, remains the gatekeeper between deeply offensive speech and cyber-abuse material under Australian law.

P.S. If you’re interested in the ongoing history between X and eSafety, you can read more here, here and here.

 

LINK 
Online Safety Bill 2021 explanatory memorandum: https://parlinfo.aph.gov.au/parlInfo/search/display/display.w3p;query=Id%3A%22legislation%2Fems%2Fr6680_ems_3499aa77-c5e0-451e-9b1f-01339b8ad871%22 

 
