Stronger Controls Around Artificial Intelligence Needed
New technologies that use artificial intelligence should be assessed for their social impact on citizens before they are deployed, according to The Australia Institute’s Centre for Responsible Technology.
In its submission to the Australian Human Rights Commission (AHRC) discussion paper on human rights and technology, the Centre argues that a formal regulatory regime, rather than voluntary ethical codes, is required to protect the public interest in this time of rapid change. Endorsing the direction of the AHRC discussion paper, the Centre calls for:
- A process to ensure that all artificial intelligence does not discriminate on the grounds of gender, race or class;
- A requirement to conduct Technology Impact Statements before new AI is launched;
- A requirement that all automated decisions be authorised by a named human who would be responsible for the decision.
The Centre also endorses Human Rights Commissioner Ed Santow’s proposal for a moratorium on the development of facial recognition technology until these safeguards are put in place.
“With proper scrutiny it would have been clear that the Robodebt program was targeting vulnerable communities, with no clear avenue for redress,” said director of the Centre for Responsible Technology, Peter Lewis.
“The recent Robodebt debacle could have been avoided if these anchoring principles were in place.
“Decision-makers in government should not have been allowed to hide behind an automated program for decisions that impact on people’s lives.
“AI should be treated like any other product that has the potential to cause harm, like a car or pharmaceuticals, and be scrutinised to ensure it is safe and lawful before it is unleashed on the public.
“The AHRC discussion paper is an opportunity for Australians to rethink our relationship with technology and artificial intelligence, and to address the ‘accountability gap’ where technology escapes the regulation and scrutiny that exists in the ‘real’ world,” said Lewis.