Amazon facial-identification software used by police falls short on tests for accuracy and bias, new research finds
By Drew Harwell January 25 at 11:01 AM
Facial-recognition software developed by Amazon and marketed to local and federal law enforcement as a powerful crime-fighting tool struggles to pass basic tests of accuracy, such as correctly identifying a person's gender, new research released Thursday says.
Researchers with the M.I.T. Media Lab also said Amazon's Rekognition system performed more accurately when assessing lighter-skinned faces, raising concerns about how biased results could tarnish the artificial-intelligence technology's use by police and in public venues, including airports and schools.
Amazon's system performed flawlessly in predicting the gender of lighter-skinned men, the researchers said, but misidentified the gender of darker-skinned women in roughly 30 percent of their tests. Rival facial-recognition systems from Microsoft and other companies performed better but were also error-prone, they said.
The problem, AI researchers and engineers say, is that the vast sets of images the systems are trained on skew heavily toward white men. The research shows, however, that some systems have rapidly grown more accurate over the past year following greater scrutiny and corporate investment in improving the results.
....
Drew Harwell is a national technology reporter for The Washington Post specializing in artificial intelligence. He previously covered national business and the Trump companies.
https://twitter.com/drewharwell