Reflections on Human Rights and Technology
How much of our private data are we willing to give up for a little bit of life convenience?
That was a key takeaway question I am still trying to answer after attending the International Conference on Human Rights and Technology in July 2018. As an active UTS SOUL Award volunteer, I was awarded a scholarship to attend the conference, organised by the Australian Human Rights Commission. Here are some of my reflections on what I heard.
The purpose of the conference was to gain greater insight into the role of technology in empowering our societies. It was attended by 400 people from diverse sections of the community, industry and government, and keynote speakers included Dr Alan Finkel, Australia’s Chief Scientist, and Dr Mary-Anne Williams, Director, Magic Lab at UTS. It was stimulating to listen to their wisdom and global views on the protection of human rights in the digital age.
As a student of civil engineering, I used to believe the Internet of Things (IoT) and Artificial Intelligence (AI) would not have a huge impact on my lifestyle. That has now changed. For example, in my studies, online educational platforms offer convenient access to information: I can research any topic and have the information within a few clicks. I have also realised that technology is playing an increasingly significant role in my field. For instance, sensors placed within concrete structures can collect stress-strain data, which are then analysed to understand the life expectancy of the structure and whether repairs are needed. The recent rise in AI-based tools and software helps engineers and designers create virtual models and understand the risks and costs of alternative structures. In brief, applications of AI are increasingly used for design optimisation, damage detection and solving complex civil engineering problems.
Panellists at the conference were keen to explore how societal behaviour influences technology, and to stress that AI is not neutral. An example was the analytic software program COMPAS (Correctional Offender Management Profiling for Alternative Sanctions), designed to determine the 'right' candidate for parole. The software was fed dynamic data – such as an inmate's age at their first offence and reports from previous parole interviews – to make recommendations for parole decisions. However, subsequent analysis showed its recommendations were strongly biased by race. As a result, parole officers decided not to rely solely on this AI-based software for their decisions. In this case, the software had reflected existing social biases.
Another important concept covered was the power of responsible innovation. While technology often gives us back more than it takes, it is up to its creators to put it to responsible use. Consider the application of drones in our society: they can be built and controlled to deliver blood or first-aid supplies quickly to an accident scene and save a life. Conversely, they can also be deployed as weapons to harm people.
Government needs to play an important role here, with frequent checks and robust regulations that protect privacy and people, involve people in the decision-making processes that affect their rights, and ensure fair and equal legal rights for individuals across the whole community. A recent example is Germany's 2018 network enforcement law, which requires social media companies to ensure the content on their platforms complies with local criminal law; firms incur heavy penalties for non-compliance.
A powerful message from the conference was that technological innovation should be democratic and inclusive of the whole community, including people with disability and people who are marginalised. Research by RMIT University has highlighted that Australians with disability face more barriers in embracing digital technology than people without a disability. Multiple examples were given of developers around the world creating technologies that support the independence of people with disability. An intelligent home assistant, for instance, can help with many day-to-day activities through speech and content recognition, and professionals in the construction industry are improving standards and design methodologies to make residential houses wheelchair accessible, with wide access paths from the driveway to every room. These developments are core to a resilient society.
Since the conference, I have been trying to shape my academic learning around sustainable futures, while still searching for an answer to that question: how much private data are we willing to give up for a little bit of life convenience?