Researchers say Amazon face-detection technology shows bias
28 Jan 2019

AP — Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto.

Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service over concerns about discrimination against minorities. Some Amazon investors have also asked the company to stop, fearing the service leaves Amazon vulnerable to lawsuits.

The researchers said that in their tests, Amazon's technology labeled darker-skinned women as men 31 percent of the time. Lighter-skinned women were misidentified 7 percent of the time, darker-skinned men 1 percent of the time, and lighter-skinned men not at all.

Link to AP.
