Artificial intelligence systems offer a wealth of promising applications in public safety and security, but one automated tool is being denounced as a threat to civil liberties and a conduit for excessive police surveillance.
A coalition of 34 organizations led by the American Civil Liberties Union sent a letter this week to Amazon CEO Jeff Bezos demanding the company stop offering its facial recognition tool, “Rekognition,” to law enforcement and government agencies. Rekognition uses deep learning to cross-reference images or video against a user-provided database (such as a federal identification or mugshot archive), and is available through Amazon Web Services.
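The cross-referencing step described here — comparing a numeric "embedding" of a probe face against a database of known faces — can be sketched in plain Python. This is a conceptual illustration only: the three-dimensional vectors, names, and threshold below are invented for the example, and a real system like Rekognition derives much higher-dimensional embeddings from a deep network.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors, 1.0 meaning identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_face(probe, database, threshold=0.9):
    """Return (name, score) pairs from `database` whose embedding is
    similar enough to the probe embedding, best match first."""
    scored = [(name, cosine_similarity(probe, emb))
              for name, emb in database.items()]
    return sorted([(n, s) for n, s in scored if s >= threshold],
                  key=lambda pair: -pair[1])

# Toy 3-dimensional "embeddings"; real systems use hundreds of dimensions.
db = {"suspect_a": [0.9, 0.1, 0.1], "suspect_b": [0.1, 0.9, 0.2]}
matches = match_face([0.88, 0.12, 0.09], db)
```

The threshold is the operationally sensitive knob: set it too low and innocent people surface as candidate matches, which is exactly the failure mode critics raise later in this article.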
Since its commercial release in 2016, the Orlando Police Department, the Washington County Sheriff’s Office in Oregon, and other law enforcement entities have commissioned Rekognition as a way to identify and track criminal suspects, sometimes in real time. It’s also been used to locate missing children in crowded spaces and to identify terrorist suspects overseas, and Amazon has stressed its potential for crime prevention applications as the technology evolves.
As the tech giant leverages case studies to attract new law enforcement customers, privacy advocates are beginning to push back, citing concerns that the technology is “primed for abuse” and “readily available to violate rights and target communities of color,” according to the letter.
Though the technology in and of itself is not necessarily discriminatory, the data it relies on can be. Because minorities are investigated and incarcerated at disproportionate rates, facial recognition is more likely to be used on those populations. Compounding the problem, facial recognition tools generally perform worse on faces with darker skin. A study from the MIT Media Lab found error rates of up to 35 percent for darker-skinned women. A 2012 study of three commercial algorithms found that all three consistently performed 5 to 10 percent worse on African-Americans than on Caucasians.
If these algorithms are inaccurate, innocent people could be investigated and arrested for crimes they did not commit. And inaccuracy isn’t the only cause for concern, privacy proponents argue. While police departments routinely use facial recognition to confirm the identities of detained suspects, Amazon’s interest in the market suggests an expansion of use that some fear could result in mass surveillance. Currently, more than 130 million American residents are stored in facial recognition databases accessible to police and other criminal investigators, according to The New York Times.
Integrating Rekognition with officer-worn body cameras, traffic cameras, or other real-time public sensors might open the door for cities and government agencies to develop automated systems capable of identifying and tracking any individual, even people not suspected of specific crimes. The lack of legislative regulation surrounding this technology further muddies the waters.
In December 2017, the Bureau of Justice Assistance published a “Face Recognition Policy Development Template” for use in criminal intelligence and investigative activities, aimed at helping law enforcement establish policies that comply with existing laws and reduce privacy risks. However, this document is not law, and it’s generally up to individual police departments how they choose to use facial recognition technology. The Seattle Police Department, for example, has privacy-conscious policies barring the use of such tools via body cameras.
“Something causes more harm than good if it erodes public trust and confidence,” said Sean Whitcomb, a sergeant and spokesman for the Seattle Police Department (SPD), in a Trajectory feature on predictive policing published in late 2017.
Photo Credit: Amazon