Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto.
Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities. Some Amazon investors have also asked the company to stop out of fear that it makes Amazon vulnerable to lawsuits.
The researchers said that in their tests, Amazon’s technology labeled darker-skinned women as men 31 percent of the time. Lighter-skinned women were misidentified 7 percent of the time. Darker-skinned men had a 1 percent error rate, while lighter-skinned men had none.
“labeled darker-skinned women as men 31 percent of the time”
There are a couple of factors involved here, and neither has anything to do with “racism”.
The AI uses human inputs, at least at the start, so it’s subject to human perception. Across all races, men tend to be darker than their same-race women. The AI is picking up on this, misidentifying some dark-skinned women as men.
And, more darkly, the darker races of women may not be, how to put this nicely, as….feminine…as the lighter-skinned races of women. The AI is also picking this up; the less feminine dark-skinned women are being misidentified as men because they are mannish looking.
Reality has a racialist bias, but that doesn’t make it false. Shitlibs are just gonna have to deal with the fact that race is more real than their feelings.