
CBSA plans to use facial recognition app to track deportations: documents

WATCH: Air Canada launches facial recognition pilot in hopes of shortening airport wait times – Feb 22, 2023

The Canada Border Services Agency plans to implement an app that uses facial recognition technology to keep track of people who have been ordered to be deported from the country.

The mobile reporting app would use biometrics to confirm a person’s identity and record their location data when they use it to check in. Documents obtained through access-to-information requests indicate the CBSA has been proposing such an app since at least 2021.

A spokesperson confirmed that an app called ReportIn will be launched this fall.

Experts are flagging numerous concerns, questioning the validity of user consent and the potential secrecy around how the technology makes its decisions.

Each year, about 2,000 people who have been ordered to leave the country fail to show up, meaning the CBSA “must spend considerable resources investigating, locating and in some cases detaining these clients,” says a 2021 document.


The agency pitched a smartphone app as an “ideal solution.”

WATCH: Canada’s growing backlog of persons ordered deported for crime or security concerns

Getting regular updates through the app on a person’s “residential address, employment, family status, among other things, will allow the CBSA to have relevant information that can be used to contact and monitor the client for any early indicators of non-compliance,” it said.

“Additionally, given the automation, it is more likely that the client will feel engaged and will recognize the level of visibility the CBSA has on their case.”

Plus, the document noted: “If a client fails to appear for removal, the information gathered through the app will provide good investigative leads for locating the client.”

An algorithmic impact assessment for the project — not yet posted on the federal government’s website — said biometric voice technology the CBSA tried using was being phased out due to “failing technology,” and it developed the ReportIn app to replace it.


It said a person’s “facial biometrics and location, provided by sensors and/or the GPS in the mobile device/smartphone” are recorded through the ReportIn app and then sent to the CBSA’s back-end system.


Once people submit photos, a “facial comparison algorithm” will generate a similarity score to a reference photo.

If the system doesn’t confirm a facial match, it triggers a process for officers to investigate the case.

“The individuals’ location is also collected every time they report and if the individual fails to comply with their conditions,” it said. The document noted individuals will not be “constantly tracked.”

The app uses technology from Amazon Web Services. That’s a choice that grabbed the attention of Brenda McPhail, the director of executive education in McMaster University’s public policy in digital society program.

WATCH: Trudeau refuses to say if border agency should stop cancelling arrest warrants

She said while many facial recognition companies submit their algorithms for testing to the U.S. National Institute of Standards and Technology, Amazon has never voluntarily done so.


An Amazon Web Services spokesperson said its Amazon Rekognition technology is “tested extensively — including by third parties like Credo AI, a company that specializes in Responsible AI, and iBeta Quality Assurance.”

The spokesperson added that Amazon Rekognition is a “large-scale cloud-based system and therefore not downloadable as described in the NIST participation guidance.”

“That is why our Rekognition Face Liveness was instead submitted for testing against industry standards to iBeta Lab,” which is accredited by the institute as an independent test lab, the spokesperson said.

The CBSA document says the algorithm used will be a trade secret. In a situation that could have life-changing consequences, McPhail asked whether it’s “appropriate to use a tool that is protected by trade secrets or proprietary secrets and that denies people the right to understand how decisions about them are truly being made.”

Kristen Thomasen, an associate professor and chair in law, robotics and society at the University of Windsor, said the reference to trade secrets is a signal there could be legal impediments blocking information about the system.

There has been concern for years that people subject to errors in such systems are legally barred from getting more information about them because of intellectual property protections, she explained.

WATCH: Peel, York police using facial recognition technology

CBSA spokesperson Maria Ladouceur said the agency “developed this smartphone app to allow foreign nationals and permanent residents subject to immigration enforcement conditions to report without coming in-person to a CBSA office.”


She said the agency “worked in close consultation” with the Office of the Privacy Commissioner on the app. “Enrolment in ReportIn will be voluntary, and users will need to consent to both using the app, and the use of their likeness to verify their identity.”

Petra Molnar, the associate director of York University’s refugee law lab, said there is a power imbalance between the agency implementing the app and the people on the receiving end.

“Can a person really, truly consent in this situation where there is a vast power differential?”

If an individual doesn’t consent to participate, they can report in-person as an alternative, Ladouceur said.

WATCH: Canadian Tire’s use of facial recognition technology subject of privacy commissioner’s report

Thomasen also cautioned there is a risk of errors with facial recognition technology, and that risk is higher for racialized individuals and people with darker skin.


Molnar said it’s “very troubling that there is basically no discussion of … human rights impacts in the documents.”

The CBSA spokesperson said Credo AI reviewed the software for bias against demographic groups, and found a 99.9 per cent facial match rate across six different demographic groups, adding the app “will be continuously tested after launch to assess accuracy and performance.”

The final decision will be made by a human, with officers overseeing all submissions, but the experts noted that humans tend to trust judgments made by technology.

Thomasen said there is a “fairly widely recognized … psychological tendency for people to defer to the expertise of the computer system,” where computer systems are perceived to be less biased or more accurate.
