New government tech systems, like those designed to detect crime or spot fraud, risk entrenching inequality without increased scrutiny, warns a new report.
Drawing on 61 examples of projects cancelled across Australia, Canada, New Zealand, the United States and Europe, the research urges policymakers to roll out more safeguards for automated decision systems (ADS) – computer systems or algorithms designed to help or replace human decision-making.
The report comes after high-profile UK controversies involving automated systems, such as the Post Office Horizon scandal and concerns raised about exam grading algorithms.
Automating Public Services: Learning from Cancelled Systems has been published by the Data Justice Lab and Carnegie UK, supported by Western University’s Faculty of Information and Media Studies.
The new research finds that governments and agencies are turning to these technologies to cut costs. But the report highlights that such projects have provoked concerns about privacy, security, transparency, and accountability.
Automated decision system initiatives have often been cancelled following community opposition, legal action or investigative reporting.
About half the cancelled systems analysed in the research were associated with justice and policing, but the report also found healthcare and immigration ADS projects that were put on hold.
Lead researcher Dr Joanna Redden, Assistant Professor at Western University and Co-Director of the Data Justice Lab, said: “These findings demonstrate that there are competing understandings, values and politics about if, where and how automated decision-making systems should be used. Further, that the automation of public services is a public issue requiring more widespread debate about the kinds of datafied societies we want to live in.”
The report urges decision-makers to examine why ADS projects have failed in the past. In addition, it makes the case for new resources for regulators and a public registry of these sorts of initiatives. The report’s recommendations are intended to enhance the innovation capacity of the public sector by recognising the necessity of strong governance and institutional review.
Sarah Davidson, chief executive of Carnegie UK, said: “Collective wellbeing requires that everyone have a voice in the decisions which affect them. This hard-hitting study identifies many, sometimes shocking, occasions where automated decision systems have been commissioned without sufficient democratic scrutiny.
“While we want to see innovation in public services, new initiatives must be developed in partnership with the people they’ll impact. We cannot allow new digital initiatives to work against the very people they’re supposed to be supporting or protecting.”