Welfare automation prioritises ‘efficiency over human rights’
17 June 2019 at 5:05 pm
The automation of welfare services in Australia risks pushing vulnerable families deeper into poverty, the United Nations has heard.
The Human Rights Law Centre (HRLC) made a submission to the UN’s independent expert on poverty, expressing fears that Australia’s automated approach to welfare was being used to “manage the poor” and entrench inequality.
The submission highlighted Centrelink’s automated debt recovery system, “robo-debt”, and the welfare program ParentsNext, both of which have come under fire for imposing excessive penalties.
HRLC lawyer Monique Hurley said computers making decisions about people’s livelihoods could mean the difference between a child having food and going hungry.
“Single mothers with pre-school aged children have been left stranded and have had to turn to charities for food vouchers,” Hurley said.
“The robo-debt debacle has seen the government bully people into paying debts they do not owe, in an attempt to prioritise efficiency over human rights.”
Robo-debt matches data from other government agencies against the income people have reported to Centrelink and flags discrepancies. The process has sent automated debt recovery notices to thousands of welfare recipients who do not owe money.
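The submission does not set out the matching algorithm, but the flaw widely reported at the heart of robo-debt was income averaging: a person’s annual income on file with the Australian Taxation Office was divided evenly across 26 fortnights and compared with their fortnightly Centrelink reports. The Python sketch below, with hypothetical names and figures, shows how that assumption flags people who reported every dollar correctly:

```python
# A minimal sketch of income averaging, the mechanism widely reported to
# underlie false robo-debt notices. Names and figures are hypothetical.

FORTNIGHTS_PER_YEAR = 26

def flags_discrepancy(annual_income_on_file: float,
                      fortnightly_reports: list[float]) -> bool:
    """Average annual income across the year and compare it with each
    fortnightly report; any fortnight reported below the average is
    treated as under-reporting."""
    averaged = annual_income_on_file / FORTNIGHTS_PER_YEAR
    return any(reported < averaged for reported in fortnightly_reports)

# A casual worker earns $13,000 over 10 fortnights, reports it accurately,
# and earns nothing for the remaining 16 fortnights of the year.
reports = [1300.0] * 10 + [0.0] * 16

# Averaging assumes $500 was earned every fortnight, so the 16 honest
# zero-income reports are flagged and a debt notice is generated.
print(flags_discrepancy(13000.0, reports))  # True, despite no debt existing
```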
HRLC said the number of notices issued has jumped from about 20,000 a year to 20,000 a week.
The centre also said the highly automated Targeted Compliance Framework of ParentsNext had a “disproportionate impact” on Aboriginal and Torres Strait Islander parents.
“This is a government program that threatens to leave a struggling mother without money just because she hasn’t completed a task or reported it, and computers are taking human compassion out of the equation. A program that leaves even one child hungry or cold has no place in Australia,” Hurley said.
“The program is aggravating the inequality that Aboriginal and Torres Strait Islander parents already experience by leaving them more exposed to the risk of financial sanctions.”
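The submission does not describe the framework’s internals; the sketch below is purely illustrative of the kind of fully automated sanctioning Hurley describes, in which an unreported task alone triggers a payment suspension with no human judgement in between. All names are hypothetical:

```python
# Hypothetical illustration only: an automated compliance check in which a
# single unreported task suspends payment, with no human review of why.

from dataclasses import dataclass

@dataclass
class Participant:
    name: str
    task_reported: bool          # did the parent report the required activity?
    payment_suspended: bool = False

def automated_compliance_check(p: Participant) -> Participant:
    # No caseworker in the loop: the software cannot weigh a sick child,
    # a missed bus, or a reporting outage before cutting off payment.
    if not p.task_reported:
        p.payment_suspended = True
    return p

parent = automated_compliance_check(Participant("single parent", task_reported=False))
print(parent.payment_suspended)  # True: payment stops until the task is reported
```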
HRLC’s submission calls for a human rights-based approach to the development, adoption and evaluation of digital technologies in welfare systems.
It said governments must work to remove discrimination and inequality from these automated systems, which requires involving affected groups in the design process.
Hurley said governments often overlooked the fact that relying on machines to make decisions could exacerbate existing inequality.
“The stakes are high when it comes to social security. We should not lose sight of the need for human empathy in decision making, and computers simply cannot do that,” she said.