Suspend use of biased facial recognition technology
10/18/2019, 6 a.m.
It’s no secret that Amazon has been promoting DIY (Do It Yourself) surveillance products to consumers, such as its very own smart doorbell, Ring. But what Amazon shoppers and most everyday Americans are just starting to find out is that the real target customers for these surveillance tools are police departments and other law enforcement agencies — something that should have every person of color worried.
The little we know about Amazon’s surveillance technology is not good. Amazon’s Rekognition, the cloud-based software platform that was launched in 2016 and has been sold to and used by a number of U.S. government agencies, including U.S. Immigration and Customs Enforcement, or ICE, has been demonstrated to have racial and gender bias. Hacks into the system also have raised serious privacy concerns.
States’ attorneys general, as the lead law enforcement officials within the 50 states, must suspend any partnerships with Amazon until we have more information, particularly with regard to how these technologies could impact communities of color.
Amazon has become a lightning rod for criticism. Dozens of social advocacy groups, including Human Rights Watch, Color of Change and Data for Black Lives, have sounded the alarm over the consequences of allowing Amazon’s surveillance tools to wrongfully target groups that advocate for justice for people of color and others. Last year, Amazon Rekognition falsely matched 28 members of Congress with criminal mugshots, with people of color disproportionately among the false matches.
Without identifying a specific company, NAACP President and CEO Derrick Johnson recently called facial recognition technology “a scary proposition.” Mr. Johnson was responding to questions from veteran journalist Dr. Barbara Reynolds during Richard Prince’s recent Journalism Roundtable. Dr. Reynolds expressed concern over technology “so flawed” that it doesn’t even properly represent the faces of black people. Mr. Johnson noted that the NAACP has had ongoing discussions on the topic with the committees of Homeland Security in both the U.S. House of Representatives and the U.S. Senate.
“We’re in the middle of conversations with cyber experts so that we can have a very clear policy approach dealing with not only facial recognition, but all of the technology and how it can be used in our community against us,” he said.
Meanwhile, researchers from Google, Facebook and Microsoft have all urged Amazon to stop selling Rekognition software to law enforcement, citing study after study showing that the company’s surveillance tools simply cannot be trusted.
And what has Amazon done to address these serious allegations of racial profiling by its surveillance tools? Quietly tell law enforcement officials not to use the word “surveillance” when talking about Amazon products in public.
It is overwhelmingly clear that Amazon’s facial recognition technology is not only deeply flawed, but has the grave potential to magnify our worst racial biases if we continue to allow it to dictate policing.
Amazon’s Neighbors application similarly has put black people and other people of color at unfair risk of being targeted by law enforcement officials. Earlier this year, one review found that neighborhood watch groups using Ring footage disproportionately accused people of color of suspicious activity under the guise of law and order. Moreover, these videos frequently are accompanied by racist and verbally abusive comments, demonstrating the threats these technologies pose.
Giving incredibly invasive tools like Neighbors or Rekognition a greater role in our justice systems poses a threat to anyone who wants to walk the streets without the fear of being tracked and falsely targeted.
We must protect our communities of color before Amazon’s dangerous surveillance technologies become fully entrenched in our criminal justice system.
The writer is an award-winning journalist who is president and CEO of Trice Edney Communications in Washington.