Enhancing online expectations
Late last month, Communications Minister Michelle Rowland announced the introduction of the Online Safety (Basic Online Safety Expectations) Amendment Determination 2024. This Determination amends the 2022 Determination (BOSE Determination), which sets out the expectations of online service providers under the Online Safety Act 2021 (OSA).
The BOSE Determination sets out the ‘core expectations’ of online service providers, as well as ‘additional expectations’ and ‘examples of reasonable steps’ providers can take to meet those expectations. The amendments introduce several new additional expectations and examples of reasonable steps.
The first additional expectation introduced is that the ‘provider of the service will take reasonable steps to ensure that the best interests of the child are a primary consideration in the design and operation of any service that is likely to be accessed by children.’ This has already received some criticism for attempting to baby-proof the internet, and the idea is likely to be debated further, as it has also been suggested as a guiding principle in the current review of the OSA.
The amendments also introduce additional expectations for user controls, requiring providers to offer features that allow users to manage interactions, content and privacy settings. They also emphasise the need for proactive measures to detect and mitigate unlawful or harmful content, including material generated by AI systems or amplified by recommender algorithms. Another additional expectation will require providers to respond to a request from the eSafety Commissioner for a report on the number of active end users in Australia, broken down into adults and children – a seemingly difficult task to perform accurately without age verification mechanisms.
Several new examples of reasonable steps that online service providers can take to meet expectations are also introduced. These include publishing regular transparency reports detailing the measures being taken to protect Australian users, implementing processes to detect hate speech that breaches a service’s terms of use, assessing the impact of business decisions on end-user safety, and having systems in place to detect unlawful or harmful content and to action complaints.
With the recent eSafety versus X drama (read about it here, here and here), the continued debate on how Australia handles misinformation and disinformation, and the current review of the OSA, the amendments come at a tumultuous time as Australia navigates its approach to online harm.
Kieran Lindsay, CMT Research Officer