Amazon's artificial intelligence accused of racial bias

Nearly every major technology company is developing its own facial recognition system. One of them is Amazon: its technology is called Rekognition and, according to researchers from MIT, it falls short of its competitors. An evaluation of its accuracy found that the system often fails to correctly determine the gender of dark-skinned people. This is already the second complaint about the questionable performance of Amazon's facial recognition.

Facial recognition system

Researchers tested the Rekognition artificial intelligence throughout 2018. Its error rate in determining the gender of dark-skinned people reached 31%, which the researchers consider an appalling result. For comparison, Microsoft's face recognition makes this mistake in only 1.5% of cases.
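To make the figures above concrete, here is a minimal sketch of how a per-group error rate like the 31% and 1.5% numbers could be computed from labeled test results. The data, field names, and groupings below are purely illustrative assumptions, not the MIT researchers' actual benchmark or methodology.

```python
# Illustrative only: computing gender-classification error rates per group.
from collections import defaultdict

# Each record: (true_gender, predicted_gender, group) -- hypothetical data.
results = [
    ("female", "male",   "darker-skinned"),
    ("female", "female", "darker-skinned"),
    ("male",   "male",   "lighter-skinned"),
    ("female", "female", "lighter-skinned"),
    # ... more labeled predictions ...
]

errors = defaultdict(int)
totals = defaultdict(int)

for true_gender, predicted_gender, group in results:
    totals[group] += 1
    if predicted_gender != true_gender:
        errors[group] += 1

for group, total in totals.items():
    rate = 100.0 * errors[group] / total
    print(f"{group}: {rate:.1f}% misclassified")
```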

Amazon considers the researchers' claims unfounded. The company assured that nothing of the sort was observed in internal tests of the updated version of Rekognition. It also noted that the MIT researchers did not specify the accuracy level at which Amazon's artificial intelligence would be considered to work "correctly".

According to Matt Wood, who leads deep learning at Amazon Web Services, the company independently tested the system on a million facial images from the Megaface database. In the course of that testing the artificial intelligence did not make a single mistake. He added that the company is ready to consider the results of third-party tests in order to improve its product.

This is not the first time Amazon's artificial intelligence has been at the center of a scandal. In the summer of 2018 it mistook 28 members of Congress for criminals, and 38% of those misidentified were Black.

Other companies have faced similar scandals. Fortunately, they learn from these situations and keep improving the technology. In June, Microsoft expanded the dataset used for face recognition; in particular, the system began to pay more attention to gender, age, and skin tone. This reduced the number of errors in classifying men and women by up to 20 times.

If you have any thoughts about this news, feel free to share them in the comments! We also recommend joining our Telegram chat, where you will always find someone to talk to about science and technology.
