AMAZON FACIAL RECOGNITION SOFTWARE USED BY LAW ENFORCEMENT HAS RACIAL BIAS, PER REPORT
Facial recognition software from Amazon used by some law enforcement agencies has shown inaccuracies, particularly with women of color; the technology sometimes mistakes dark-skinned women for men. The system performed well when identifying men, but identifying women proved a glaring problem for the software. An MIT report says Amazon’s Rekognition misidentified…