British case challenges facial recognition in the United Kingdom, home of the security camera

WATCH: A British man has launched a lawsuit against UK police to stop them from combining security video and facial recognition technology to build an extensive database to identify and catch criminals. Redmond Shannon reports – Aug 6, 2019

A British man’s court case against his local police force is highlighting the concerns raised by numerous groups in the U.K. over the use of automated facial recognition (AFR) technology by police.

Cardiff man Ed Bridges says his image was captured twice by AFR cameras used by South Wales Police: once while out shopping, and again while taking part in a peaceful protest.

The case, which was heard by a judge in May 2019, is expected to receive a ruling in the fall — one that could have major consequences for how AFR is used in England and Wales.

The United Kingdom has long been a pioneer in the use of regular security cameras, partly because of the IRA bombing campaigns between the 1970s and 1990s.

It was once estimated that the average Londoner is captured on CCTV around 300 times a day.

Bridges’ concerns with AFR centre on what he says is an unlawful violation of his privacy, and on the lack of legislation governing how police are allowed to deploy the technology.

Police forces use AFR by scanning people’s faces and matching them against a database of images of criminal suspects, or what South Wales Police more vaguely calls “persons of interest”.

What constitutes a person of interest is currently up to each police force that uses the designation.

The human rights campaign group Liberty is supporting Bridges’ court case.

Hannah Couchman, the group’s policy and campaigns officer, says it wants the police use of AFR banned.

“There is no law around the use of facial recognition technology. So that means that the police can essentially do exactly as they like,” said Couchman.

“Under data protection principles, [Ed Bridges’] data is being used in a certain way that there is no legal basis for. You can’t find a piece of statute that talks about the police’s ability to do this.”

WATCH: (April 26, 2018) Chinese cities use water spray, facial recognition technology to stop jaywalkers

Police insist that images are not kept for longer than 24 hours and that the use of AFR does lead to arrests of suspects.

An often-used argument for the use of surveillance technology is that if you’ve done nothing wrong, you shouldn’t worry about your picture being taken.

But Couchman says an AFR scan is distinctly different from simply taking someone’s picture.

“Facial recognition is very different from standard photographs or standard CCTV, precisely because it takes biometric data,” said Couchman.

“It makes a map of your face and converts it to a numerical code that’s unique to you, and as I say it can be used to track and monitor you.”
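The matching process Couchman describes can be sketched in simplified form: a captured face is reduced to a numerical vector (an "embedding"), which is then compared against the vectors of everyone on a watch list. The sketch below is illustrative only; the names, the tiny 4-dimensional vectors, and the similarity threshold are invented for the example, whereas real systems derive embeddings of 128 or more dimensions from a neural network.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical watch-list embeddings; real systems compute these from face images.
watchlist = {
    "suspect_a": np.array([0.9, 0.1, 0.3, 0.5]),
    "suspect_b": np.array([0.2, 0.8, 0.6, 0.1]),
}

# Embedding computed from a face captured by the camera.
probe = np.array([0.88, 0.12, 0.31, 0.49])

# The threshold is the tuning knob: lowering it catches more true matches
# but produces more false stops of people the police are not looking for.
THRESHOLD = 0.99

for name, embedding in watchlist.items():
    score = cosine_similarity(probe, embedding)
    if score >= THRESHOLD:
        print(f"possible match: {name} (score {score:.3f})")
```

The threshold choice is where the misidentification concerns discussed later in the article arise: any fixed cutoff trades missed suspects against false alerts on innocent passers-by.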

Welshman Ed Bridges took legal action against the use of facial recognition technology by South Wales Police. Credit: Liberty Human Rights

Calls for moratorium

Liberty is not alone in its concern with how AFR is used.

David Davis, a former cabinet minister and MP for the ruling Conservative Party, said cameras have the potential to erode civil liberties.

“You must think through what this leads to,” said the former Brexit Secretary.

“You’re going to use this facial recognition for what purpose? To arrest people, to follow them, to keep data on them, to intrude on their privacy.”

The U.K.’s House of Commons Science and Technology Committee has called for “a moratorium on the use of auto facial recognition technologies” until a proper regulatory framework is put in place.

In June 2019, the U.K.’s trade body for lawyers, The Law Society, published a report saying the ways AFR was being used by police “lack a clear and explicit lawful basis”.

London’s Policing Ethics Panel has also voiced concerns, saying police “should proceed with caution and ensure that robust internal governance arrangements are in place”.

In a May 2019 report, the panel said it identified two main areas of concern: “injustices associated with mis-identification, and potential incursions on civil liberty.”

WATCH: (Nov. 2, 2017) Giant London billboard targets ads by tracking you

The issue of misidentification and the poor accuracy of AFR systems is another concern that human rights activists have raised.

Research by the University of Essex this year, obtained by U.K. broadcaster Sky News, found that only 19 per cent of matches made by the AFR system used by London’s Metropolitan Police were correct.

London’s Metropolitan Police trial the use of facial recognition technology in January 2019. Credit: Liberty Human Rights

Unlike South Wales Police, whose watch list contains “persons of interest”, the Metropolitan Police says its trial compared captured images with “a list of offenders wanted by the police and courts for various offences.”

Some systems can be much more accurate than 19 per cent, but there are also concerns that AFR performs best at identifying white male faces.

“The fact [is] that this technology is most likely to misidentify you if you are a person of colour or a woman,” said Couchman.

“So that means that you’re more likely to be subject to what we call a false stop, where the police aren’t actually looking for you at all.”

In July, the London-based civil liberties group Big Brother Watch launched its own court proceedings against the Metropolitan Police’s use of AFR, calling the technology “Orwellian”.

The Metropolitan Police carried out 10 trials of AFR, ending in July 2019. It says it is now considering whether to continue using the technology.
