- Posted on 13 Dec 2021
- 45-minute read
New technology can improve our lives – but there are also profound risks and threats.
This phenomenon is exemplified by the rise of facial recognition technology. For example, using this technology to unlock your smartphone is relatively low risk, but its use in policing could cause significant harm to marginalised groups.
Research has shown this technology tends to be far less accurate in identifying people with dark skin, women, and people with a physical disability.
The risk of overuse is also significant: it could see us slide into a society with the infrastructure for mass surveillance – a profound challenge to our right to privacy.
In this session, Aaina Agarwal, Dr Niels Wouters, Amanda Robinson and Duncan Anderson joined Edward Santow to discuss whether the potential benefits of facial recognition technology could outweigh the risks.
If you are interested in hearing about future events, please contact events.socialjustice@uts.edu.au.
"The risks of having the infrastructure of a surveillance state in place are simply too great to justify the use or the potential use [of FRT] even in very limited circumstances." – Aaina Agarwal
"Discussions around the ethics of facial recognition are too often led by ethicists in their ivory tower. What we really need is that close connection with members of the public." – Dr Niels Wouters
"The growing concerns about implications of new and emerging technologies on societies are real, and we need to address those." – Amanda Robinson
"Building community confidence in the use of technology by police is certainly... an important part of the relationship police have with community." – Duncan Anderson
"The idea that new technology is double-edged – bringing opportunities and risks – is exemplified perhaps most by the rise of facial recognition technology." – Ed Santow
Speakers
Aaina Agarwal is a business and human rights lawyer and media voice focused on the impact of disruptive technologies. She works as Counsel at BNH.AI, and is the Producer & Host of the podcast Indivisible.
Dr Niels Wouters is a senior design researcher at Paper Giant. He is the co-creator of Biometric Mirror – an online tool that demonstrates the use of facial recognition in psychometric analysis.
Amanda Robinson is Co-founder & Director of Humanitech at Australian Red Cross, a think + do tank that seeks to ensure technology serves humanity by putting people and society at the centre.
Duncan Anderson is the Executive Director, Strategic Priorities and Identity within the NSW Police Force. He co-chairs the NSW Identity Security Council which works to promote security, privacy and accessibility of identity products and services.
Edward Santow is Industry Professor – Responsible Technology at UTS. He was Australia’s Human Rights Commissioner from 2016–2021, where he led the most influential project worldwide on the human rights and social implications of AI.