Amazon defended the law enforcement use cases for its facial recognition technology Friday in an effort to pour cold water on fears raised by civil rights activists.
Matt Wood, a leader on the Amazon Web Services machine learning team, published a blog post in response to criticism from the American Civil Liberties Union and other advocacy groups, which have been demanding the company stop selling its Rekognition software to police. In the post, Wood cautions that we “should not throw away the oven because the temperature could be set wrong and burn the pizza.”
Wood expressed skepticism about an experiment the ACLU conducted using Rekognition to compare headshots of members of Congress against a database of 25,000 mugshots. The test configured Rekognition to report matches at an 80 percent confidence threshold, according to Amazon. The ACLU says Rekognition incorrectly matched 28 members of Congress to people pictured in arrest photos.
“The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus,” the ACLU said in a blog post.
In his response, Wood says Amazon recreated the ACLU experiment, comparing photos of members of Congress to a database of 850,000 faces at a 99 percent confidence threshold. Amazon says it saw a 0 percent misidentification rate, “despite the fact that we are comparing against a larger corpus of faces.”
“The default confidence threshold for Rekognition is 80%, which is good for a broad set of general use cases (such as identifying objects, or celebrities on social media), but it’s not the right one for public safety use cases,” Wood wrote. “The 80% confidence threshold used by the ACLU is far too low to ensure the accurate identification of individuals; we would expect to see false positives at this level of confidence.”
Amazon recommends a 99 percent confidence rating for use cases like law enforcement, where accurate facial recognition is critical. An Amazon spokesperson told GeekWire that “Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgment (and not to make fully autonomous decisions), where it can help find lost children, restrict human trafficking, or prevent crimes.”
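The threshold dispute comes down to how match results are filtered: Rekognition returns candidate matches with similarity scores, and the caller decides the cutoff. The sketch below illustrates the effect with invented scores and a hypothetical filter function; only the general shape (a 0–100 similarity score per candidate, as in Rekognition's CompareFaces response) reflects the actual service.

```python
# Hypothetical sketch: how a confidence threshold filters face-match
# candidates. The names and similarity scores below are invented for
# illustration; the dict shape loosely mirrors an AWS Rekognition
# CompareFaces response, where each match carries a 0-100 Similarity score.

def filter_matches(candidates, threshold):
    """Keep only candidates whose similarity meets or exceeds the threshold."""
    return [c for c in candidates if c["Similarity"] >= threshold]

# Invented example scores: one strong match and two marginal ones.
candidates = [
    {"Name": "person_a", "Similarity": 99.4},
    {"Name": "person_b", "Similarity": 84.1},
    {"Name": "person_c", "Similarity": 81.7},
]

# At the 80 percent default, all three pass, marginal matches included.
print(len(filter_matches(candidates, 80)))   # 3
# At the 99 percent threshold Amazon recommends for public safety,
# only the strong match survives.
print(len(filter_matches(candidates, 99)))   # 1
```

This is why the same underlying scores can yield 28 "matches" at one setting and zero at another: the threshold, not the model alone, determines what counts as an identification.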
The ACLU began sounding the alarm about police use of Rekognition in May, claiming that the technology can amplify racial biases. In June, leaders of various civil and immigrant rights groups delivered 150,000 signatures to Amazon’s Seattle headquarters from people demanding that Amazon stop selling Rekognition to police.