Four out of 11 major Canadian law enforcement agencies say they use or are exploring using facial recognition software, the controversial technology that has been found to be flawed and has prompted privacy advocates to call for it to be banned.
And Ontario’s information and privacy commissioner is expressing concern over the prospect of police using a new type of facial recognition software that collects billions of images of people from social media and other sites.
Global News asked the law enforcement agencies whether they have tested or used facial recognition technology following a recent New York Times report about a facial recognition tool by a company called Clearview AI that’s being used by hundreds of law enforcement agencies in North America, including in Canada. No specific Canadian law enforcement agency was identified in the report.
Of the four agencies that said they used or were exploring using facial recognition technology, the Toronto police, Calgary police and Edmonton police said they did not use Clearview AI, while the Ontario Provincial Police said it would not identify the company it has used.
The app by Clearview AI cross-references uploaded images of people against a database of three billion photos the company says it has scraped from Facebook and millions of other websites. Privacy rules in most democracies generally prohibit governments from collecting this type of personal information, particularly when it comes to law enforcement agencies amassing material that is unrelated to any active investigation.
The Clearview AI database is more expansive than any other known collection created by the U.S. government or law enforcement agencies — and has alarmed many advocates who consider it to be a massive invasion of privacy.
A recent study from the National Institute of Standards and Technology in the U.S. found that many facial recognition systems misidentify people of colour more often than white people.
Brian Beamish, Ontario’s information and privacy commissioner, said in an email that privacy laws allow police to collect personal information only “for legitimate, limited and specific law enforcement purposes.”
“We would be very concerned if police services in Ontario were using or planning to use a service that automatically collects images of people’s faces from across the internet,” he wrote.
Clearview AI’s website includes a testimonial from an unnamed person the company claims is a “Detective Constable in the Sex Crimes Unit” from “Canadian Law Enforcement.” The testimonial praised the technology as “hands-down the best thing that has happened to victim identification in the last 10 years.”
Clearview’s claims about the success of its software were called into question by a BuzzFeed News story on Thursday. The New York Police Department told the news outlet that Clearview AI did not lead to the arrest of a terrorism suspect, as the company claimed in a mass email to U.S. law enforcement agencies last year.
The company has not responded to requests for comment for this story.
The Calgary and Toronto police services said they use facial recognition technology, and the Edmonton police said they are exploring using it, but all three said they do not use Clearview AI. The software they use cross-references images of suspects against internal mugshot databases.
A Toronto Star report from last year found that the Toronto Police Service has been using facial recognition software since 2018 at a cost of $451,718 plus annual fees.
Beamish told Global News that his office is aware that the Toronto Police Service is using facial recognition technology to match images of suspects from crime scenes against the TPS’s mugshot database.
“This is not real-time use of facial recognition technology and is not the type of technology used by Clearview AI,” he continued. “Based on the information made available to us, we did not identify any compliance issues with the TPS program. If the TPS planned to change or expand the scope of this program, then we would expect them to consult with our office.”
The Ontario Provincial Police said it does use facial recognition “for various types of investigations,” but declined to specify further or to confirm whether it uses Clearview AI.
Beamish said that his office was not consulted by the OPP about its use of facial recognition technology, but that Ontario’s privacy laws do not require organizations to contact his office when implementing new programs.
“We do strongly encourage any organization, including law enforcement agencies, to contact the IPC if they are considering using any new technologies that could pose a risk to Ontarians’ privacy,” Beamish said.
The RCMP would not say whether it uses facial recognition technology.
“The RCMP does not comment on specific investigative tools or techniques,” a spokesperson said. “However, we continue to monitor new and evolving technology.”
In 2016, the RCMP told Motherboard that it was seeking a system that would allow it to “implement facial recognition as an option” as part of its fingerprint identification system, but that, at the time, there were “no immediate plans to use facial recognition features.”
In response to Global News this week, an RCMP spokesperson said: “While recognizing this as a potential future requirement, there are currently no plans to integrate facial recognition into the Automated Fingerprint Identification System.”
The Canada Border Services Agency said it has not used any products from Clearview AI and that it “does not use facial authentication technology as a method to identify travellers at its land ports of entry.” Rather, it said it uses “facial matching” technology through its electronic kiosks located at select airports.
“The use of facial matching technology consists of a one-to-one comparison between the photo of the traveller taken at the kiosk to the photograph stored on the chip in the traveller’s ePassport,” the spokesperson said.
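The distinction the CBSA draws is between one-to-one verification (does this live photo match this one document?) and the one-to-many identification that systems like Clearview AI perform (who in a large database does this photo match?). The sketch below illustrates that difference only; the embedding vectors, similarity measure and threshold are hypothetical stand-ins, not any agency's actual system.

```python
import math

# Illustrative only: real systems derive face embeddings from a neural
# network. Here a "face" is just a short hypothetical vector.

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(live_embedding, passport_embedding, threshold=0.9):
    """One-to-one ("facial matching"): compare the kiosk photo against
    the single photo stored on this traveller's passport chip."""
    return cosine_similarity(live_embedding, passport_embedding) >= threshold

def identify(probe_embedding, database, threshold=0.9):
    """One-to-many (facial recognition): search an entire database of
    known faces for the closest match above the threshold."""
    best_id, best_score = None, threshold
    for person_id, embedding in database.items():
        score = cosine_similarity(probe_embedding, embedding)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id  # None if nobody in the database clears the threshold
```

The privacy stakes differ accordingly: verification needs only the one photo the traveller already carries, while identification requires amassing a database of faces in advance, which is the practice regulators quoted in this story flag as a concern.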
Vito Pilieci, a spokesperson for the Office of the Privacy Commissioner of Canada, which is responsible for enforcing the Privacy Act governing federal government institutions, said the commissioner is aware of the media report about Clearview AI.
“We have not received any Privacy Impact Assessments (PIAs) that mention the use of Clearview AI,” Pilieci wrote. “That said, we will be following up with the RCMP.”
He said the commissioner’s office has had some discussions about the use of facial recognition technologies.
“During our previous consultations with the RCMP on its use of video surveillance in specific initiatives (body worn cameras, video surveillance of Parliament Hill), the RCMP committed to consulting with us in advance before implementing facial recognition technology to analyze those images,” he said.
“We are not in a position to provide further comment on whether or not the RCMP is using Clearview technology, but in general, we would advise that any use of invasive technologies such as facial recognition would need to be fully justified by a pressing and legitimate necessity, and any such program would need to be designed to ensure that it would be proportionate.”
A Calgary police spokesperson said the service uses facial recognition software from NEC, not Clearview AI.
In 2014, the Calgary Police Service became the first in Canada to adopt such software, contracting with NEC Corporation of America for its NeoFace Reveal facial recognition tool. Like the system used by the Toronto police, it compares photos of suspects or security footage against the service’s mugshot database.
The Edmonton Police Service said it has not used Clearview AI, but that it is currently involved in a “project assessing and engaging in a facial recognition solution, although it hasn’t been implemented yet.” The spokesperson said the police service is also working with NEC.
“The intention will be to use the technology in response to existing criminal investigations, using a database of pictures previously obtained for a lawful purpose (past mug shots). The EPS is also assessing all the privacy impacts and implications of this technology,” the spokesperson wrote.
The Montreal police would not say whether it uses facial recognition.
“We cannot confirm or deny that the SPVM is using any facial recognition technology,” a spokesperson said.
“It’s important to underline the fact that the SPVM always respects the laws in force while conducting its operations and investigations. If a police department uses such a technology, it does so to fulfill its mission, which is to protect the citizens and goods, to prevent and repress crime and to arrest perpetrators.”
Last August, a Montreal city councillor called on the city to follow San Francisco in banning the use of facial recognition software by government and police. The Montreal police at the time would also not say whether it uses the technology.
The Vancouver police, Halifax police and Winnipeg police told Global News they do not use facial recognition technology. A Regina police spokesperson said, “I am not aware of any testing of this technology, or its use by our police service.”
Privacy expert Ann Cavoukian, executive director of the Global Privacy and Security by Design Centre, said the New York Times findings regarding Clearview were “gut-wrenching” and show that facial recognition technology use by government and law enforcement should be banned altogether, as has been done in places like San Francisco.
“There’s so little transparency, no accountability, no idea of the accuracy rates, and complete loss of privacy,” Cavoukian said in an interview.
“It really hinders certain groups of people much more so than others. Where’s the fairness, where’s the equity, the ethical use of this data?” she said. “We cannot allow this to continue.”
Cavoukian said she hopes her calls for a federal ban on the use of facial recognition by government and law enforcement are echoed by others in Canada.
In addition to San Francisco, other U.S. cities, including Oakland, Calif., and Somerville, Mass., have passed laws to ban the use of facial recognition technology by government agencies.
The European Commission is exploring banning the use of facial recognition in public areas for up to five years.