“Face API”: Flawed facial recognition system is set to learn skin tones.

Microsoft has announced major improvements to its flawed facial recognition software, Face API. The software was heavily criticized in a research paper at the beginning of 2018 for an error rate of up to 20.8 percent. Experts accused the system of racist tendencies after it failed to correctly identify the gender of people with darker skin tones.

Industry-wide challenge

Face API has so far failed particularly often with women.

“With the new improvements, we are able to reduce error rates by a factor of 20 for men and women with darker skin tones. For all women, the error rate can be reduced ninefold,” Microsoft said in a recent blog post. At the same time, the company notes that its flawed facial recognition software hints at a much larger problem.

“The higher error rates among women with darker skin highlight an industry-wide challenge: artificial intelligence technologies are only as good as the data used to train them. If a facial recognition system is to work well across all people, the training dataset must represent a broad range of skin tones as well as factors such as hairstyle, jewelry and glasses,” Microsoft explains.

Logical bias

The Face API team has now made three important changes to the software. It expanded and revised training and benchmark datasets, launched new data collection experiments to further improve the training data by specifically focusing on skin tone, gender and age, and improved the classifier to achieve more precise results.
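To illustrate the kind of evaluation these changes imply, the following is a minimal sketch (not Microsoft's actual code or the Face API itself) of how gender-classification error rates could be broken down per skin-tone subgroup on a labelled benchmark set. The function name, the dictionary keys and the data layout are assumptions made purely for illustration.

```python
from collections import defaultdict

def error_rates_by_subgroup(samples):
    """Compute gender-classification error rates per (skin_tone, gender) subgroup.

    `samples` is assumed to be a list of dicts with keys 'skin_tone',
    'gender' (ground truth) and 'predicted_gender' -- a hypothetical
    layout, not the format of any real benchmark dataset.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for s in samples:
        group = (s["skin_tone"], s["gender"])
        totals[group] += 1
        if s["predicted_gender"] != s["gender"]:
            errors[group] += 1
    # Error rate per subgroup; large disparities between groups point to
    # a training or benchmark dataset that under-represents some of them.
    return {group: errors[group] / totals[group] for group in totals}

if __name__ == "__main__":
    # Tiny illustrative sample: a biased system would show a markedly
    # higher error rate for ('dark', 'female') than for ('light', 'male').
    demo = [
        {"skin_tone": "dark", "gender": "female", "predicted_gender": "male"},
        {"skin_tone": "dark", "gender": "female", "predicted_gender": "female"},
        {"skin_tone": "light", "gender": "male", "predicted_gender": "male"},
    ]
    for group, rate in error_rates_by_subgroup(demo).items():
        print(group, f"{rate:.0%}")
```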

The challenge now is to train AI systems so that they do not carry social prejudices over into their own decision-making.

“These systems will necessarily reproduce social bias,” concludes Hanna Wallach from Microsoft Research Lab New York.

Source: Pte
