Technology

Eradicating racial bias in the US judicial system with the help of AI

Starting July 1 of this year, the judicial system of San Francisco (USA) plans, initially as an experiment, to introduce the use of artificial intelligence to help prosecutors rule out racial bias when deciding whether to charge a suspect, The Verge reports.

Artificial intelligence in the judicial system

According to the San Francisco district attorney's office, the system will analyze police reports and automatically remove not only data that could be used to infer the suspect's race (eye, hair, and skin color), but also information about the suspect's environment (relatives, friends, neighbors, and so on) that might indirectly indicate it. In addition, the AI will strip the names and details of witnesses and police officers from the reports to exclude factors that could affect prosecutors' impartiality.

“If you look at the prison population in the United States, you will see that there are more men and women of color than white people,” says San Francisco District Attorney George Gascon.

Seeing just a suspect's surname, for example Hernandez, investigators may immediately conclude that the person is of Latin American origin, and this in turn can affect the course and conclusions of the investigation.

It is noted that in 2017, at the request of the district attorney, criminal case statistics in San Francisco were examined. It turned out that between 2008 and 2014, 41% of those arrested were African Americans, even though they made up only 6% of the city's population. Analysts concluded at the time that court decisions showed “significant racial and ethnic disparities.”

Gascon explained that prosecutors will first have to examine the police reports processed by the system and make a charging decision based on them. Only then will they be able to view the full report, with all names and details, to determine whether there are extenuating circumstances.

Currently, the San Francisco judicial system relies on a much more limited manual filtering procedure to keep prosecutors from seeing this information. As a rule, however, only the first pages of a report, which contain general information about the suspect, are removed. Personal details remain in the body of the report, so this approach has little practical value.

“We needed to bring machine learning technology into this work,” Gascon commented.

According to him, this will be the first time artificial intelligence is used for this purpose in the United States; Gascon noted that he does not know of a single law enforcement agency that has previously applied AI in this way. He added that the technology will first be tested in his office and then, if it performs well, will be offered free of charge to prosecutors across the country.

The system was developed by analysts and engineers from the Computational Policy Lab at Stanford University. According to Alex Chohlas-Wood, one of its authors, it is a small web application that uses several machine learning algorithms to automatically edit police reports, flagging certain words and replacing them with neutral placeholders such as “place”, “officer 1”, “suspect”, and so on.
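The article does not describe how the Stanford tool is built internally, but the general technique of replacing names and identifying details with neutral placeholders can be sketched with an off-the-shelf named-entity recognizer. The snippet below is a minimal illustration only, not the actual system: it assumes spaCy's small English model is installed, and the placeholder labels and appearance-descriptor keywords are illustrative choices, not ones taken from the source.

```python
# Minimal sketch of report redaction with a generic NER model (spaCy).
# Placeholder labels and descriptor keywords are illustrative assumptions.
import re
import spacy

nlp = spacy.load("en_core_web_sm")  # small general-purpose English model

# Map entity types to neutral placeholders (illustrative choices).
PLACEHOLDERS = {
    "PERSON": "[PERSON]",
    "GPE": "[PLACE]",
    "LOC": "[PLACE]",
    "FAC": "[PLACE]",
    "NORP": "[GROUP]",  # nationalities, religious or political groups
}

# Phrases describing physical appearance (illustrative, not exhaustive).
DESCRIPTOR_PATTERN = re.compile(
    r"\b(black|white|brown|blond|blonde|dark|light)\s+(hair|eyes|skin)\b",
    re.IGNORECASE,
)

def redact(report: str) -> str:
    """Return a copy of the report with names, places, and
    appearance descriptors replaced by neutral placeholders."""
    doc = nlp(report)
    redacted = report
    # Replace entities from the end of the string so that the character
    # offsets of earlier entities remain valid during splicing.
    for ent in reversed(doc.ents):
        placeholder = PLACEHOLDERS.get(ent.label_)
        if placeholder:
            redacted = redacted[:ent.start_char] + placeholder + redacted[ent.end_char:]
    # Mask explicit appearance descriptors such as "dark skin".
    return DESCRIPTOR_PATTERN.sub("[DESCRIPTOR]", redacted)

print(redact("Officer Smith stopped Mr. Hernandez near Mission Street; "
             "the suspect had dark hair and brown eyes."))
```

A production system would of course need more than this: the article notes that indirect cues (relatives, neighborhoods, witness and officer names) must also be removed, which is why several machine learning components are combined rather than a single keyword filter.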
