In January, Chief Judge of Sabah and Sarawak Tan Sri Abang Iskandar Abang Hashim was reported to have said that courts in Sabah and Sarawak had rolled out Artificial Intelligence (AI) through a pilot project. With the AI now being tested and used in courts, some are concerned that there was “no proper consultation on the technology’s use”, and that it is “not contemplated in the country’s penal code”.
“Our Criminal Procedure Code does not provide for use of AI in the courts … I think it’s unconstitutional,” said lawyer Hamid Ismail.
Hamid said he was “uneasy” that the technology was being used before lawyers, judges, and the public fully understood it. The artificial intelligence tool, being tested as part of the two states’ pilot, was used when a man he defended was sentenced.
The piloted AI software was developed by Sarawak Information Systems, a state government firm. According to authorities, AI-based systems make sentencing “more consistent and can clear case backlogs quickly and cheaply”, and the technology “helps all parties in legal proceedings to avoid lengthy, expensive and stressful litigation”.
However, critics have warned that AI risks bias against minorities and marginalised groups, saying the technology lacks a judge’s ability to weigh up individual circumstances or adapt to changing social mores. For example, there are reported cases of AI favouring men over women or penalising members of ethnic minorities.
This isn’t a new finding, as newly developed AI can often show bias quickly. One example: when a low-resolution picture of Barack Obama was uploaded into an AI face depixeliser, the output was a white man.
“In sentencing, judges don’t just look at the facts of the case—they also consider mitigating factors, and use their discretion. But AI cannot use discretion,” Hamid noted.
To try to keep its AI software from producing biased sentences, Sarawak Information Systems said it had removed the “race” variable from the algorithm. The algorithm was also trained on only five years of data, from 2014 to 2019, “which seems somewhat limited in comparison with the extensive databases used in global efforts”. There is no information on whether the dataset has since been expanded.
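Removing a protected attribute from the training data is a common first step in bias mitigation, though a limited one. As a rough sketch only (the pilot’s actual pipeline is not public, and all column names, values, and the model choice here are assumptions), dropping such a variable before training might look like this in Python:

```python
# Hypothetical illustration only -- NOT Sarawak Information Systems' code.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Invented historical case records standing in for the 2014-19 dataset.
cases = pd.DataFrame({
    "offence_code":      [1, 2, 1, 3, 2, 1],
    "prior_convictions": [0, 2, 1, 0, 3, 1],
    "age":               [25, 40, 33, 19, 51, 28],
    "race":              ["A", "B", "A", "C", "B", "A"],  # sensitive attribute
    "sentence_months":   [12, 36, 18, 6, 48, 14],         # target to predict
})

# Drop the sensitive attribute so the model never sees it directly,
# and separate the target from the remaining features.
features = cases.drop(columns=["race", "sentence_months"])
target = cases["sentence_months"]

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(features, target)

# Caveat: excluding "race" does not guarantee fairness; features that
# correlate with it can still act as proxies in the learned model.
print(model.predict(features.head(1)))
```

Note that dropping the column does not by itself make the model fair: other features correlated with race can act as proxies for it, which is one reason critics remain wary.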
An analysis by Khazanah Research Institute showed that court judges followed the AI sentencing recommendation in a third of the cases, all of which involved rape or drug possession under the terms of the two states’ pilot. In the other two-thirds, some judges reduced the suggested sentences, while others toughened them on the basis that they would not serve as a “strong enough deterrent”.
“Many decisions might properly be handed over to the machines. (But) a judge should not outsource discretion to an opaque algorithm,” said Simon Chesterman, a senior director at AI Singapore.