The chief of London police is clearing the air about the force’s use of the controversial facial recognition technology Clearview AI.
The Clearview AI technology first raised privacy concerns when it was revealed the software scraped more than three billion photos from social media websites such as Facebook and Twitter to provide a database for law enforcement agencies.
Last month, London police told Global News the force had not used or tested Clearview AI before backtracking on that statement in early March.
Chief Steve Williams said the London Police Service had never subscribed to Clearview AI or made any purchases as a corporation.
“Initial checks revealed that we were not using Clearview. That was wrong,” Williams said, adding that after police had published a statement denying the force’s use of the software, a follow-up investigation revealed otherwise.
“We did that deeper dive and determined that, yes, in fact, some of our officers did access the Clearview website and accessed some of the information on it.”
Williams said the new findings have prompted a review into all past use of the software by London police.
“All the cases to date… indicate that this software was marketed to police services and officers through various conferences, courses and other means,” Williams said.
“These officers accessed the software as a result of those efforts and went onto the website and tested it and played around with it, for lack of a better term, using their own images. That was really just exploratory in nature.”
Williams added that as of Monday, police had found only one instance in which the software was accessed for the purpose of an investigation.
“That investigation went nowhere… but we’re still looking into it,” he said.
Williams said the police force is developing a policy governing any new technology used by officers in future investigations. He said the upcoming policy will be robust and will include privacy impact assessments for new technologies.
Privacy expert and former Ontario privacy commissioner Ann Cavoukian said it is “appalling” that police forces have used Clearview AI. She also raised concerns over the lack of a policy to accommodate the new technology used by London police officers.
“You would expect them to go through the proper protocol in accessing this information,” Cavoukian said.
She added that facial recognition technology is riddled with problems and has the potential to falsely identify lawful citizens as suspects in an investigation. Cavoukian also said the essence of the software is one that undermines a core principle of Canadian society.
“Privacy forms the foundation of our freedom. You can’t have free democratic societies without a solid foundation of privacy, and your facial image is your most sensitive information,” Cavoukian said.
“Your facial image is the most sensitive biometric out there. The most sensitive personal information is your face; it’s your identity. Once you’re a victim of identity theft, your life is just a nightmare until you can clear that.”
Both the federal and Ontario privacy commissioners have told Global News that privacy regulators across “all provinces and territories” are now working together to develop a framework on biometric technology use for organizations, including law enforcement.
The federal privacy commissioner also told Global News the RCMP’s use of Clearview AI was being investigated under the Privacy Act, a law governing the information the government can collect or store about an individual.
— With files from Global News’ Matthew Trevithick and David Lao