The use of facial recognition technology as a security tool on Parliament Hill would pose substantial legal, privacy and human rights risks, and might even be unlawful, says a study prepared for the parliamentary security unit.
It warns the technology could be used to surveil, track, identify or misidentify a person, and might lead to decisions that result in them being stopped, questioned, detained or arbitrarily prevented from entering the parliamentary precinct.
The independent report was completed in April by the Leadership Lab at Toronto Metropolitan University at the request of the Parliamentary Protective Service, which funded the research.
Information was gathered through interviews with protective service members as well as lawyers, scholars and people with expertise in facial recognition.
The findings come amid heightened concern about the safety of politicians and those who participate in the public arena following a spate of verbal abuse and threats directed at members of Parliament and journalists, particularly women and people of colour.
A man unleashed a profane verbal assault on Deputy Prime Minister Chrystia Freeland in Alberta last Friday, drawing widespread condemnation.
Public Safety Minister Marco Mendicino stressed the importance Monday of working closely with the RCMP, other police forces and the sergeant-at-arms of the House of Commons “to ensure that all ministers and all MPs and their staff have the protection if they need it.”
“We’ll keep all options on the table.”
The threats and intimidation are increasing, disproportionately affecting women, racialized Canadians and Indigenous people, “and that represents a threat not only to them, and their teams, their families, but a threat to our democracy,” Mendicino added.
“So it is important that we have a good, robust debate; that is one of the trademarks of a healthy democracy.”
Women, Gender Equality and Youth Minister Marci Ien, a former journalist, said intimidation was the main thing that worried her family when she decided to enter politics.
“As a Black journalist, the level of threats that I got — on my life and the life of my children — to run for office was not a small decision to make. This is real, this is real. What happened to the deputy prime minister was reprehensible, but not surprising.”
In response to questions, the Parliamentary Protective Service said it does not use, nor does it intend to introduce, facial recognition technology, but added that it needs to learn more about “emerging and ever-evolving threats and technologies” to ensure physical security within the parliamentary precinct.
The technology allows an image of a person’s face to be matched against a database of photos with the aim of identifying the individual.
The report says it could be used restrictively, for instance to compare a scan of an MP’s face with a banked image of their likeness before allowing them onto Parliament Hill. At the other end of the spectrum, the technology could be used to compare an image of a member of the public strolling on the Hill grounds against a large database of photos to try to identify them.
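The two uses the report contrasts, a one-to-one check of an MP's scan against a banked image versus a one-to-many search of a large photo database, can be sketched roughly as follows. This is an illustrative assumption, not anything drawn from the report: the numeric "embeddings," names and similarity threshold below are made up, and a real system would derive such vectors from face images with a trained model.

```python
# Illustrative sketch of the two matching modes a facial recognition
# system can operate in. Embeddings here are toy vectors; real systems
# compute them from face images with a trained neural network.
import numpy as np


def cosine_similarity(a, b):
    """Similarity score in [-1, 1]; higher means the faces look more alike."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def verify(probe, enrolled, threshold=0.8):
    """1:1 verification: does a fresh scan match one banked image?"""
    return cosine_similarity(probe, enrolled) >= threshold


def identify(probe, database, threshold=0.8):
    """1:N identification: search a whole database for the best match.

    Returns the best-matching name, or None if nothing clears the threshold.
    """
    best_name, best_score = None, threshold
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name
```

The key difference is scale: verification compares against a single consented record, while identification sweeps an entire database, which is where the report's concerns about tracking and misidentification concentrate.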
“The technology can be used to uniquely identify individuals who visit Parliament or categorize them based on their identity and, after identifying people, can be used to track their location patterns, political leanings, personal preferences, and activities,” the report says.
Dozens of security cameras currently record activity on Parliament Hill, with policies governing how long the images are kept. Signs posted on the Hill advise visitors of the cameras.
The report says going further and using facial recognition tools in the parliamentary context “raises numerous risks” regarding free expression and freedom of assembly and association.
“Some of Canada’s most vulnerable populations visit Parliament to participate in rallies, protests, and to make their voices heard on essential political issues, which are activities that the (protective service) plays a key role in facilitating and protecting.”
Use of the technology could “give rise to chilling effects” likely to dissuade many groups from organizing and visiting Parliament on critical issues, particularly communities such as Black and Indigenous people who have historically been subject to increased state surveillance, the report adds.
“There are currently no clear legal limits nor required safeguards regarding the collection and processing of biometric information such as facial images through automated means, a major gap in Canada’s privacy and human rights legal framework.”
In addition, the report warns, if facial recognition technology were used for the protection of MPs, senators and Parliament Hill generally, there could be “scope or function creep,” expanding the tool’s use to scenarios that pose privacy risks.