
How a B.C. filmmaker’s honeymoon photos helped train controversial facial recognition tech

Video: B.C. filmmaker highlights dangers of facial recognition technology
A new documentary by a B.C. filmmaker shows just how easily your private pictures can become part of a very public database when you post them online. John Hua reports. – Jun 11, 2021

When Brett Gaylor posted his honeymoon photos on a popular picture sharing website 15 years ago, he never imagined they’d be used to train facial recognition software illegally used in Canada by the RCMP.

They were. And when the shocked documentary filmmaker based in Victoria, B.C., found out, he set about making a film about how it happened and what it means for Canadians’ privacy rights.

Video: Privacy investigation finds U.S. tech firm violated Canadian rules with facial-recognition tool

“I wanted to sort of trace the supply chain of facial recognition. Because facial recognition is a type of artificial intelligence, so it needs millions of images to be able to learn to recognize faces.”


“So they used these photos they found online as a way to make these artificial intelligence systems better.”

Through his research, Gaylor learned the photos, which he’d uploaded to the website Flickr, had been downloaded and incorporated into massive datasets, including a popular one called MegaFace, created by the University of Washington.

Video: Privacy watchdogs call on provincial governments to beef up their laws

There, along with the photos of millions of other people, they were used to train artificial intelligence on facial recognition for companies including U.S.-based Clearview AI, along with the Chinese government, he said.


“So in some small way this photo of me on my honeymoon became involved in this system that was used all over the world, ostensibly to be able to recognize criminals, but in ways that really violate a lot of our civil liberties,” he said.


On Wednesday, Canada’s privacy commissioner Daniel Therrien said the RCMP had broken the law by using information from Clearview AI without ensuring compliance with the Privacy Act.

“The use of [facial recognition technology] by the RCMP to search through massive repositories of Canadians who are innocent of any suspicion of crime presents a serious violation of privacy,” Therrien said in his report.

“A government institution cannot collect personal information from a third party agent if that third party agent collected the information unlawfully.”

In response to the privacy investigation, the company announced last year that it would stop offering its facial-recognition services in Canada.


Gaylor has now produced an interactive short film called Discriminator about facial recognition and what he learned as he unraveled the thread of how his photos had contributed to the development of the technology.

The film allows viewers to see how the technology works by activating the webcams on their devices, to help the audience see AI and facial recognition as something tangible, rather than theoretical.

The film also raises questions about privacy and the need to regulate these new technologies as they develop, Gaylor said.

“You can’t give consent to something you don’t know is going to happen. It’s impossible. Instead what we need to do is create legal frameworks that will make certain uses off-limits,” Gaylor said.

With files from Kylie Stanton and John Hua
