
Facial recognition tools misidentify people of colour more often: U.S. study

WATCH: (From Nov. 4, 2019) Vancouver International to start using facial recognition

Many facial recognition systems misidentify people of colour more often than white people, according to a U.S. government study released on Thursday that is likely to deepen growing skepticism of a technology widely used by law enforcement agencies.

The study from the National Institute of Standards and Technology found that, when conducting a particular type of database search known as “one-to-one” matching, many facial recognition algorithms falsely matched African-American and Asian faces 10 to 100 times more often than Caucasian faces.

The study also found that African-American females are more likely to be misidentified in “one-to-many” matching, which can be used for identification of a person of interest in a criminal investigation.

Facial-recognition databases are used by police to help identify possible criminal suspects. They typically work by searching vast troves of known images, such as mug shots, and algorithmically comparing them with other images, such as those taken from a store’s surveillance cameras, that capture an unidentified person believed to be committing a crime.
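The two search modes described above can be sketched in code. This is a simplified illustration only, not any vendor’s actual algorithm: real systems compare learned face embeddings, while here random vectors and a made-up similarity threshold stand in for them. All function names and parameters below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def one_to_one_match(probe, claimed, threshold=0.6):
    """Verification ('one-to-one'): does the probe face match one claimed identity?
    The 0.6 threshold is an arbitrary placeholder, not a real system's value."""
    return cosine_similarity(probe, claimed) >= threshold

def one_to_many_search(probe, gallery, top_k=3):
    """Identification ('one-to-many'): rank a gallery (e.g. mug shots) by similarity
    to the probe image, as in a person-of-interest search."""
    scores = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:top_k]

# Toy "gallery" of known embeddings; random vectors stand in for real faces.
gallery = {f"person_{i}": rng.normal(size=128) for i in range(5)}

# A probe image of person_2, with noise simulating a grainy surveillance capture.
probe = gallery["person_2"] + rng.normal(scale=0.1, size=128)

print(one_to_one_match(probe, gallery["person_2"]))  # verification decision
print(one_to_many_search(probe, gallery))            # ranked candidate list
```

The misidentification rates NIST measured correspond, in this sketch, to the threshold and ranking steps: a false positive occurs when someone else’s embedding clears the threshold or tops the ranked list.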


The NIST study was based on a review of 189 software algorithms from 99 developers — a majority of the facial recognition technology industry — and found a wide range in accuracy across developers.


The American Civil Liberties Union, a prominent civil rights organization, said on Thursday that the study illustrates why law enforcement agencies like the FBI should not use facial recognition tools.

“One false match can lead to missed flights, lengthy interrogations, watchlist placements, tense police encounters, false arrests or worse,” ACLU policy analyst Jay Stanley said in a statement.

NIST, a nonregulatory agency that is part of the U.S. Department of Commerce, did not test tools used by powerful technology companies like Facebook Inc, Amazon.com Inc, Apple Inc, and Alphabet Inc because they did not submit their algorithms for review.


Facial recognition technology has come under increased scrutiny in recent years amid fears that it may lack accuracy, lead to false positives and perpetuate racial bias.

