Amazon on Wednesday banned police use of its face-recognition technology for a year, making it the latest tech giant to step back from law-enforcement use of systems that have faced criticism for incorrectly identifying people with darker skin.
The Seattle-based company did not say why it took action now. Ongoing protests following the death of George Floyd have focused attention on racial injustice in the U.S. and how police use technology to track people. Floyd died May 25 after a white Minneapolis police officer pressed his knee into the handcuffed black man’s neck for several minutes even after Floyd stopped moving and pleading for air.
Law enforcement agencies use facial recognition to identify suspects, but critics say it can be misused. A number of U.S. cities have banned its use by police and other government agencies, led by San Francisco last year.
On Tuesday, IBM said it would get out of the facial recognition business, noting concerns about how the technology can be used for mass surveillance and racial profiling.
Civil rights groups and Amazon’s own employees have pushed the company to stop selling its technology, called Rekognition, to government agencies, saying that it could be used to invade people’s privacy and target minorities.
In a blog post Wednesday, Amazon said that it hoped Congress would put in place stronger regulations for facial recognition.
“Amazon’s decision is an important symbolic step, but this doesn’t really change the face recognition landscape in the United States since it’s not a major player,” said Clare Garvie, a researcher at Georgetown University’s Center on Privacy and Technology. Her public records research found only two U.S. agencies using or testing Rekognition. The Washington County Sheriff’s Office in Oregon has been the most public about using it. The Orlando police department tested it, but chose not to implement it, she said.
Studies led by MIT researcher Joy Buolamwini found racial and gender disparities in facial recognition software. Those findings spurred Microsoft and IBM to improve their systems, but irked Amazon, which last year publicly attacked her research methods. A group of artificial intelligence scholars, including a winner of computer science’s top prize, last year launched a spirited defence of her work and called on Amazon to stop selling its facial recognition software to police.
A study last year by a U.S. agency affirmed the concerns about the technology’s flaws. The National Institute of Standards and Technology tested leading facial recognition systems — though not from Amazon, which didn’t submit its algorithms — and found that they often performed unevenly based on a person’s race, gender or age.
Buolamwini on Wednesday called Amazon’s decision a “welcomed though unexpected announcement.”
“Microsoft also needs to take a stand,” she wrote in an emailed statement. “More importantly our lawmakers need to step up” to rein in harmful deployments of the technologies.
Microsoft has been vocal about the need to regulate facial recognition to prevent human rights abuses but hasn’t said it wouldn’t sell it to law enforcement. The company didn’t respond to a request for comment Wednesday.
Amazon began attracting attention from the American Civil Liberties Union and privacy advocates after it introduced Rekognition in 2016 and began pitching it to law enforcement. But experts like Garvie say many U.S. agencies rely on facial recognition technology built by companies that are not as well known, such as Tokyo-based NEC, Chicago-based Motorola Solutions or the European companies Idemia, Gemalto and Cognitec.
Amazon isn’t abandoning facial recognition altogether. The company said organizations, such as those that use Rekognition to help find children who are missing or sexually exploited, will still have access to the technology.
This week’s announcements by Amazon and IBM follow a push by Democratic lawmakers to pass a sweeping police reform package in Congress that could include restrictions on the use of facial recognition, especially in police body cameras. Though such real-time surveillance is not commonly used in the U.S., the possibility of cameras that could monitor crowds and identify people on the spot has attracted bipartisan concern.
The tech industry has fought against outright bans of facial recognition, but some companies have called for federal laws that could set guidelines for responsible use of the technology.
“It is becoming clear that the absence of consistent national rules will delay getting this valuable technology into the hands of law enforcement, slowing down investigations and making communities less safe,” said Daniel Castro, vice-president of the industry-backed Information Technology and Innovation Foundation, which has advocated for facial recognition providers.
Angel Diaz, an attorney at New York University’s Brennan Center for Justice, said he welcomed Amazon’s moratorium but said it “should have come sooner given numerous studies showing that the technology is racially biased.”
“We agree that Congress needs to act, but local communities should also be empowered to voice their concerns and decide if and how they want this technology deployed at all,” he said.
____
O’Brien reported from Providence, Rhode Island.