AI silos threaten justice

Ad hoc use of complex algorithms in the justice system needs urgent oversight, says the Law Society of England and Wales in a new report following a year-long investigation.


The Law Society Technology and Law Policy Commission has published its report on algorithms in criminal justice, alongside an interactive map that gives the public, for the first time, the beginnings of an overview of where algorithms are being used to assist decision-making in the justice system across England and Wales.

Silos

Law Society president Christina Blacklaws said, “Police, prisons and border forces are innovating in silos to help them manage and use the vast quantities of data they hold about people, places and events.”

Ms Blacklaws explained, “Complex algorithms are crunching data to help officials make judgement calls about all sorts of things – from where to send a bobby on the beat to who is at risk of being a victim or perpetrator of domestic violence; who to pick out of a crowd, let out on parole or which visa application to scrutinise.”

However, she said, “While there are obvious efficiency wins, there is a worrying lack of oversight or framework to mitigate some hefty risks – of unlawful deployment, of discrimination or bias that may be unwittingly built in by an operator.”

Ms Blacklaws warned, “These dangers are exacerbated by the absence of transparency, centralised coordination or systematic knowledge-sharing between public bodies. Although some forces are open about their use of algorithms, this is by no means uniform.”

Key recommendations

The report makes recommendations in seven areas:

Oversight: a legal framework for the use of complex algorithms in the justice system. The lawful basis for the use of any algorithmic system must be clear and explicitly declared.

Transparency: a national register of algorithmic systems used by public bodies.

Equality: the public sector equality duty should be applied to the use of algorithms in the justice system.

Human rights: public bodies must be able to explain what human rights are affected by any complex algorithm they use.

Human judgement: there must always be human management of complex algorithmic systems.

Accountability: public bodies must be able to explain how specific algorithms reach specific decisions.

Ownership: public bodies should own software rather than renting it from tech companies, and should manage all political design decisions.

Ms Blacklaws concluded, “Within the right framework algorithmic systems – whether facial recognition technology, predictive policing or individual risk assessment tools – can deliver a range of benefits in the justice system, from efficiency and efficacy to accountability and consistency. We need to build a consensus rooted in the rule of law, which preserves human rights and equality, to deliver a trusted and reliable justice system now and for the future.”

An interactive map showing which police forces are using algorithms, and for what, can be found here. The Law Society Technology and Law Policy Commission report can be downloaded here.

