How AI 'guardrails' can help clean up a messy market

5 September 2024

A new safety standard and proposed guardrails for high-risk AI are a good start toward clearing up confusion around the latest technology, writes Nicholas Davis.

Image: an abstract representation of AI. Picture: Joshua Sortino / Unsplash / The Conversation

Australia’s federal government has launched a proposed set of mandatory guardrails for high-risk AI alongside a voluntary safety standard for organisations using AI.

Each of these documents offers ten mutually reinforcing guardrails that set clear expectations for organisations across the AI supply chain. They are relevant for all organisations using AI, including internal systems aimed at boosting employee efficiency and externally facing systems such as chatbots.

Most of the guardrails relate to things like accountability, transparency, record-keeping and making sure humans are overseeing AI systems in a meaningful way. They are aligned with emerging international standards such as the ISO standard for AI management and the European Union’s AI Act.

The proposals for mandatory requirements for high-risk AI – which are open to public submissions for the next month – recognise that AI systems are special in ways that limit the ability of existing laws to effectively prevent or mitigate a wide range of harms to Australians. While defining precisely what constitutes a high-risk setting is a core part of the consultation, the proposed principle-based approach would likely capture any systems that have a legal effect. Examples might include AI recruitment systems, systems that may limit human rights (including some facial recognition systems), and any systems that can cause physical harm, such as autonomous vehicles.

Well-designed guardrails will improve technology and make us all better off. On this front, the government should accelerate law reform efforts to clarify existing rules and improve both transparency and accountability in the market. At the same time, we don’t need to – nor should we – wait for the government to act.

The AI market is a mess

As it stands, the market for AI products and services is a mess. The central problem is that people don’t know how AI systems work, when they’re using them, and whether the output helps or hurts them.

Take, for example, a company that recently asked my advice on a generative AI service projected to cost hundreds of thousands of dollars each year. It was worried about falling behind competitors and having difficulty choosing between vendors.

Yet, in the first 15 minutes of discussion, the company revealed it had no reliable information around the potential benefit for the business, and no knowledge of existing generative AI use by its teams.

It’s important we get this right. If you believe even a fraction of the hype, AI represents a huge opportunity for Australia. Estimates referenced by the federal government suggest the economic boost from AI and automation could be up to A$600 billion every year by 2030. This would lift our GDP to 25% above 2023 levels.
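The two figures in this estimate can be sanity-checked against each other. Assuming the 25% lift refers to the A$600 billion annual boost measured against 2023 GDP (an assumption, since the original estimate's baseline isn't stated here), the implied 2023 GDP is:

```python
# Sanity check on the article's figures.
# Assumption: boost / (2023 GDP) = 0.25, i.e. the 25% lift is the
# A$600bn annual boost relative to the 2023 baseline.
boost = 600e9   # A$600 billion per year by 2030
lift = 0.25     # GDP 25% above 2023 levels

implied_2023_gdp = boost / lift
print(f"Implied 2023 GDP: A${implied_2023_gdp / 1e12:.1f} trillion")
# → Implied 2023 GDP: A$2.4 trillion
```

That implied baseline of roughly A$2.4 trillion is in the same ballpark as Australia's actual 2023 GDP, so the two headline numbers are internally consistent.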

But all of this is at risk. The evidence is in the alarmingly high failure rates of AI projects (above 80% by some estimates), an array of reckless rollouts, low levels of citizen trust and the prospect of thousands of Robodebt-esque crises across both industry and government.

The information asymmetry problem

A lack of skills and experience among decision-makers is undoubtedly part of the problem. But the rapid pace of innovation in AI is supercharging another challenge: information asymmetry.

Information asymmetry is a simple, Nobel prize-winning economic concept with serious implications for everyone. And it’s a particularly pernicious challenge when it comes to AI.

When buyers and sellers have uneven knowledge about a product or service, it doesn’t just mean one party gains at the other’s expense. It can lead to poor-quality goods dominating the market, and even the market failing entirely.

AI creates information asymmetries in spades. AI models are technical and complex, they are often embedded and hidden inside other systems, and they are increasingly being used to make important choices.

Balancing out these asymmetries should deeply concern all of us. Boards, executives and shareholders want AI investments to pay off. Consumers want systems that work in their interests. And we all want to enjoy the benefits of economic expansion while avoiding the very real harms AI systems can inflict if they fail, or if they are used maliciously or deployed inappropriately.

In the short term, at least, companies selling AI gain a real benefit from restricting information so they can do deals with naïve counterparties. Solving this problem will require more than upskilling. It means using a range of tools and incentives to gather and share accurate, timely and important information about AI systems.

What businesses can do today

Now is the time to act. Businesses across Australia can pick up the Voluntary AI Safety Standard (or the International Organization for Standardization's version) and start gathering and documenting the information they need to make better decisions about AI today.

This will help in two ways. First, it will help businesses take a structured approach to understanding and governing their own use of AI systems, to ask useful questions of (and demand answers from) their technology partners, and to signal to the market that their AI use is trustworthy.

Second, as more and more businesses adopt the standard, Australian and international vendors and deployers will feel market pressure to ensure their products and services are fit for purpose. In turn, it will become cheaper and easier for all of us to know whether the AI system we’re buying, relying on or being judged by actually serves our needs.

Clearing a path

Australian consumers and businesses both want AI to be safe and responsible. But we urgently need to close the huge gap that exists between aspiration and practice.

The National AI Centre’s Responsible AI index shows that while 78% of organisations believed they were developing and deploying AI systems responsibly, only 29% of organisations were applying actual practices towards this end.

Safe and responsible AI is where good governance meets good business practice and human-centred technology. In the bigger picture, it’s also about ensuring that innovation thrives in a well-functioning market. On both these fronts, standards can help us clear a path through the clutter.

Nicholas Davis, Industry Professor of Emerging Technology and Co-Director, Human Technology Institute, University of Technology Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.
