Councillors grill police over use of live facial recognition in Camden

The Metropolitan Police have defended the use of live facial recognition software

Police have claimed that live facial recognition (LFR) is highly accurate in catching criminals – as councillors in Camden pressed officers about the technology’s flaws.

On Monday, Metropolitan Police representatives told the Town Hall’s culture and environment scrutiny committee that the business community “welcomes” LFR on the borough’s streets.

The system, they said, is 89 per cent accurate, with independent scientific reviews finding an “exceptionally low” level of incorrect matches – otherwise known as false positives.

The Met’s defence included a diversity study into the uses of LFR, which also found “no statistically significant” difference in performance across different racial or gender groups.

But Cllr Awale Olad, scrutiny committee chair, asked why the technology had been attacked for “targeting people with darker skin, especially women”.

The Met representatives said their system only looks for faces, rather than genders, using a biometric template.

They argued that those cases of misidentification were in commercial settings, such as shopkeepers using facial recognition to collect customer data.

“Our system doesn’t do this,” they said. “When it sees a face, it biometrically templates that face and compares that template against a watchlist of individuals.”
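In engineering terms, that pipeline reduces each detected face to a fixed-length numeric vector, or template, and flags a match only when a template scores highly enough against someone on the watchlist. The sketch below is a hypothetical illustration of that matching step, not the Met’s system: random unit vectors stand in for a real face-embedding model, and the 512-dimension template and 0.6 threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def make_template(dim: int = 512) -> np.ndarray:
    """Stand-in for the biometric template a real LFR model would produce.

    A random unit vector is used purely for illustration; a deployed
    system would compute this from the face image with a trained model.
    """
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

# Hypothetical watchlist: identity -> stored template.
watchlist = {f"person_{i}": make_template() for i in range(5)}

def best_match(probe: np.ndarray, threshold: float = 0.6):
    """Compare a probe template against every watchlist entry.

    Uses cosine similarity (a plain dot product, since templates are
    unit vectors) and reports a match only above the chosen threshold.
    """
    scores = {name: float(probe @ tmpl) for name, tmpl in watchlist.items()}
    name, score = max(scores.items(), key=lambda kv: kv[1])
    return (name, score) if score >= threshold else (None, score)

# Simulate re-capturing person_3 with a little sensor noise.
probe = watchlist["person_3"] + 0.01 * rng.normal(size=512)
probe /= np.linalg.norm(probe)
print(best_match(probe))  # expected: ('person_3', score close to 1.0)
```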

Facial recognition systems use different sensitivity levels or ‘thresholds’ to determine the balance between false positives and false negatives.
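The simulation below is a hypothetical illustration of that trade-off, with comparison scores drawn from invented distributions rather than any real deployment: raising the threshold cuts false positives, but more genuine matches are missed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented score distributions for illustration only: genuine (same-person)
# comparisons tend to score higher than impostor (different-person) ones.
genuine = rng.normal(loc=0.75, scale=0.08, size=100_000)
impostor = rng.normal(loc=0.30, scale=0.10, size=100_000)

for threshold in (0.45, 0.55, 0.65):
    miss_rate = np.mean(genuine < threshold)           # false negatives
    false_alert_rate = np.mean(impostor >= threshold)  # false positives
    print(f"threshold={threshold:.2f}  "
          f"missed matches={miss_rate:.4f}  false alerts={false_alert_rate:.6f}")
```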

The Met representatives admitted that “one in 6,000 people” will be wrongly identified, but they argued this performance is “better than independent science suggested”.

Cllr Olad countered that “around 60 civil liberties groups” in the country object to the police using the technology because of its flaws and the risks posed to minority groups.

“That says a lot,” he argued.

The Met’s spokesperson said its own testing had produced results that confirmed the accuracy of its LFR systems.

“What we were required to do under our public sector equalities duty is know how our algorithm performs.

“That’s why we went out and did the testing, at which point we had around 15,000 people on our watchlist in London.

“The scientists actually had to load this system with 178,000 people in order to generate enough false positives.”
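The arithmetic behind that point is simple: at a false-positive rate of roughly one in 6,000, a test population must be large before enough errors occur to measure the rate reliably. A back-of-the-envelope check using the figures cited in the meeting, treated here as given rather than independently verified:

```python
# Figures as cited in the meeting: a roughly 1-in-6,000 false-positive
# rate, a 15,000-person watchlist, and a 178,000-person test load.
rate = 1 / 6_000

for population in (15_000, 178_000):
    expected_errors = population * rate
    print(f"{population:>7,} people -> ~{expected_errors:.1f} expected false positives")
```

On those numbers, 15,000 people would be expected to yield only two or three false positives, while 178,000 yields around 30, enough to estimate the error rate with some statistical confidence.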

Cllr Rishi Madlani (Labour) said he had “mixed” feelings about the technology’s use in the borough.

“That being said, the public perception of this is in a really bad place, probably compounded by the confusion around AI.

“How do we help you, if we’re confident that the data is now accurate, bring the public with us on this?” he asked.

The police argued that education was key, given that the technology was complex.

The use of LFR has steadily increased in recent months, with a stream of reports from police claiming the technology has led to arrests.

But there have been cases of people being wrongly identified by facial recognition systems, and questions remain over incidents of racial discrimination in some LFR checks.

The privacy and civil liberties group Big Brother Watch on Monday denounced the “mass surveillance tech” as ineffective in fighting crime.

After a person was stabbed during a live facial recognition deployment, the group criticised police for standing by with cameras as the attack took place “just metres away”, while the suspect fled the scene.

Cllr Olad raised the point that some boroughs, along with the European Parliament, have already called for a ban on LFR.

The Met invited the committee to see how the technology was being used in practice, something other councils had found “a really useful exercise”.
