London police Clearview AI review reveals 7 officers accessed the facial recognition technology

The front of London Police Headquarters, Sept. 6, 2017. Matthew Trevithick/980 CFPL

London police have shared the findings from a review of the organization’s use of the controversial facial recognition technology Clearview AI.

At the London Police Services Board (LPSB) meeting on Thursday, London police Chief Stephen Williams revealed that seven officers accessed the software, with one of those officers using it in an investigation.

“Some of the members were made aware of the Clearview technology at a training seminar in November 2019, and it all surfaced at other training courses and other seminars,” Williams said.

“It was marketed to our officers as a trial, and they were provided free online access with a login code to try the software out.”

He told the board that several officers logged on to the software on multiple occasions to test the application using photos of themselves or of public figures.

The Clearview AI technology first raised privacy concerns when it was revealed the software scraped more than three billion photos from social media websites such as Facebook and Twitter to provide a database for law enforcement agencies.

In February, London police told Global News the force had not used or tested Clearview AI before backtracking on that statement in early March.

In March, Williams told Global News that although London police never subscribed to the company's service or made any purchases from it, a further investigation revealed that some officers had accessed the Clearview website.

During Thursday’s meeting, Williams said one officer used the technology as part of an investigation to try to identify a man who was allegedly spying on a child in a department store change room.

He said attempts to identify the suspect from security footage were unsuccessful, and the investigation has not progressed.

“I do maintain the officer was well-intentioned in that case, but I completely appreciate there are privacy implications, and we needed to hit pause,” Williams told board members.

London police have now developed a service procedure to guide officers before using or implementing new technology, which Williams said will be used for assessing all new technology.

“The reality is this technology is not coming, it’s already here,” he said.

“We also recognize that although we have extraordinary powers as police officers, we need to keep these powers in check.”

Williams said that AI will continue to intersect with policing in the future, but there are now measures in place to help police do their due diligence beforehand.

Reacting to these findings, Susan Toth, vice-chair of the LPSB, said she did not believe officers who used the technology did so maliciously, but she was concerned about the way the company presented the product to officers.

“I think it was quite problematic that it was rolled out without any systemic approach to actual leadership and governance boards.”
