Technology

Amazon's artificial intelligence accused of racial bias

Almost every major technology company works on face recognition. One of them is Amazon: its technology is called Rekognition and, according to researchers from the Massachusetts Institute of Technology, it falls well short of its counterparts. During an accuracy assessment it became clear that the system often cannot correctly determine the gender of black people. This is the second complaint about flaws in Amazon's face recognition.

Researchers tested the Rekognition artificial intelligence throughout 2018. Its error rate in determining the gender of black people reached 31%, which the researchers consider a dismal result. For comparison, Microsoft's face recognition makes such errors in only 1.5% of cases.
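For readers wondering what an error rate "in determining gender" means in practice: audits of this kind typically run a classifier over a labeled test set and report the share of wrong answers separately for each demographic group. Below is a minimal, hypothetical Python sketch of such a disaggregated evaluation; the sample data and the dummy classifier are illustrative placeholders, not the MIT team's actual methodology or Amazon's API.

from collections import defaultdict

def error_rates_by_group(samples, predict_gender):
    """Compute the gender-classification error rate for each demographic group.

    samples: iterable of (image, true_gender, group) tuples.
    predict_gender: callable returning the classifier's gender guess for an image.
    """
    errors, totals = defaultdict(int), defaultdict(int)
    for image, true_gender, group in samples:
        totals[group] += 1
        if predict_gender(image) != true_gender:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

if __name__ == "__main__":
    # Toy data standing in for a labeled benchmark; real audits use large photo datasets.
    test_set = [
        ("img1", "female", "darker-skinned"),
        ("img2", "female", "darker-skinned"),
        ("img3", "male", "lighter-skinned"),
        ("img4", "male", "lighter-skinned"),
    ]
    # Dummy classifier that always answers "male", just to show how errors split by group.
    rates = error_rates_by_group(test_set, lambda image: "male")
    print(rates)  # {'darker-skinned': 1.0, 'lighter-skinned': 0.0}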

Amazon claims the researchers' conclusions are unfounded. The company assured that not a single error was found in internal tests of the updated version of Rekognition. It also noted that the MIT researchers did not specify the accuracy level at which Amazon's artificial intelligence would be considered "correct."

According to Matt Wood, who oversees deep learning at Amazon Web Services, the company independently tested the system on a million facial images from the Megaface database. During that testing the artificial intelligence did not make a single mistake. At the same time, a company representative said they are ready to consider the results of third-party tests in order to improve their product.

This is not the first time that Amazon's AI has been at the center of a scandal. In the summer of 2018, it mistook 28 members of Congress for criminals, 38% of whom were black.

Other companies have dealt with similar scandals. Fortunately, they learn from these situations and keep improving the technology. In June, Microsoft expanded the amount of data used in face recognition; in particular, the system began to pay more attention to gender, age, and skin tone. This reduced gender classification errors by as much as 20 times.

If you have any thoughts about this news, feel free to share them in the comments! We also recommend joining our Telegram chat, where you will always find someone to discuss science and technology with.