Amazon bans police from using its facial recognition technology for one year
Amazon has said it will not provide its facial recognition technology to police for one year. However, the company has declined to say whether the move applies to federal law enforcement agencies. Amazon has come under particular scrutiny after its Rekognition face-scanning technology showed bias against people of color.
The announcement came two days after IBM said it was leaving the facial recognition market altogether.
In a blog post, Amazon said, “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”
Amazon did not give a concrete explanation for the decision beyond calling for federal regulation of the technology. Despite the controversy, the company says it will continue to provide its facial recognition software to rights organizations that help find missing and exploited children and fight human trafficking.
Many companies provide facial recognition technology to US police, but Amazon is by far the largest supplier. Amazon has also provided its facial recognition technology, Rekognition, to federal agencies such as Immigration and Customs Enforcement (ICE).
Last year, Amazon’s cloud chief Andy Jassy said the company was open to offering Rekognition to any government department that needs it.
Studies by Joy Buolamwini, a researcher at the MIT Media Lab, and Timnit Gebru, then at Microsoft Research, exposed the racial bias of modern facial recognition technology. Buolamwini and Gebru co-authored a 2018 paper which found that facial recognition systems from major tech companies, including IBM and Microsoft, made more errors when identifying darker-skinned individuals than when identifying lighter-skinned individuals.
Amazon has faced constant criticism over the years from activists and civil rights organizations for selling access to Rekognition to police departments. In a follow-up study in 2019, Buolamwini and co-author Deborah Raji analyzed Amazon’s Rekognition software and found that the system had major problems identifying the gender of darker-skinned individuals, making mistakes such as misidentifying darker-skinned women as men. By contrast, the research reported that the system worked with a zero error rate when analyzing images of lighter-skinned people.
Despite these issues, Amazon defended itself and sought to undermine the findings. In response, Buolamwini posted a lengthy and detailed rebuttal on Medium, in which she wrote, “Amazon’s approach thus far has been one of denial, deflection, and delay. We cannot rely on Amazon to police itself or provide unregulated and unproven technology to government agencies or the police.”
After activists and researchers like Buolamwini highlighted the pitfalls of police use of facial recognition technology like Rekognition, government departments began discontinuing contracts with Amazon. Google and IBM have both decided to stay out of the facial recognition market.