When Brett Gaylor posted his honeymoon photos on a popular picture-sharing website 15 years ago, he never imagined they’d be used to train facial recognition software that was unlawfully used in Canada by the RCMP.
They were. And when the shocked documentary filmmaker based in Victoria, B.C., found out, he set about making a film about how it happened and what it means for Canadians’ privacy rights.
“I wanted to sort of trace the supply chain of facial recognition. Because facial recognition is a type of artificial intelligence, so it needs millions of images to be able to learn to recognize faces.”
“So they used these photos they found online as a way to make these artificial intelligence systems better.”
Through his research, Gaylor learned the photos he’d uploaded to the website Flickr had been downloaded and incorporated into massive datasets, including a popular one called MegaFace, created by the University of Washington.
There, along with photos of millions of other people, they were used to train facial recognition AI for companies including U.S.-based Clearview AI, as well as the Chinese government, he said.
“So in some small way this photo of me on my honeymoon became involved in this system that was used all over the world, ostensibly to be able to recognize criminals, but in ways that really violate a lot of our civil liberties,” he said.
On Wednesday, Canada’s privacy commissioner, Daniel Therrien, said the RCMP had broken the law by using information from Clearview AI without ensuring compliance with the Privacy Act.
“A government institution cannot collect personal information from a third party agent if that third party agent collected the information unlawfully.”
Clearview AI announced last year that it would stop offering its facial-recognition services in Canada in response to the privacy investigation.
Gaylor has now produced an interactive short film called Discriminator about facial recognition and what he learned as he unraveled the thread of how his photos had contributed to the development of the technology.
The film lets viewers see how the technology works by activating the webcams on their devices, helping the audience experience AI and facial recognition as something tangible rather than theoretical.
The film also raises questions about privacy and the need to regulate these new technologies as they develop, Gaylor said.
“You can’t give consent to something you don’t know is going to happen. It’s impossible. Instead what we need to do is create legal frameworks that will make certain uses off-limits,” Gaylor said.
With files from Kylie Stanton and John Hua