Four out of five people identified by the Metropolitan Police's facial recognition technology as possible suspects are innocent, according to an independent report.

Researchers found that the controversial system is 81% inaccurate – meaning that, in the vast majority of cases, it flagged people to police who were not on a wanted list.

The force maintains its technology only makes a mistake in one in 1,000 cases – but it uses a different measurement to arrive at this conclusion.

The report, exclusively revealed by Sky News and The Guardian, raises "significant concerns" about Scotland Yard's use of the technology, and calls for the facial recognition programme to be halted.

Citing a range of technical, operational and legal issues, the report concludes that it is "highly possible" the Met's usage of the system would be found unlawful if challenged in court.


Image: An independent report calls for Scotland Yard to halt its use of facial recognition technology

The Met has been monitoring crowds with live facial recognition (LFR) since August 2016, when it used the technology at Notting Hill Carnival.

Since then, it has conducted 10 trials at locations including Leicester Square, Westfield Stratford, and Whitehall during the 2017 Remembrance Sunday commemorations.


The first independent evaluation of the scheme was commissioned by Scotland Yard and conducted by academics from the University of Essex.

Professor Pete Fussey and Dr Daragh Murray evaluated the technology's accuracy at six of the 10 police trials. They found that, of 42 matches, only eight were verified as correct – an error rate of 81%. Four of the 42 were people who were lost in the crowd before officers could reach them, so those matches could never be verified either way.

The Met prefers to measure accuracy by comparing successful and unsuccessful matches with the total number of faces processed by the facial recognition system. According to this metric, the error rate was just 0.1%.

Policing minister delivers incorrect figures to parliament

Duncan Ball, the Met's deputy assistant commissioner, said: "We are extremely disappointed with the negative and unbalanced tone of this report… We have a legal basis for this pilot period and have taken legal advice throughout.

"We believe the public would absolutely expect us to try innovative methods of crime fighting in order to make London safer."

Professor Fussey and Dr Murray claimed the Met's use of facial recognition during these trials lacked "an explicit legal basis" and failed to take into account how this technology infringed fundamental human rights.

"Our report conducted a detailed, academic, legal analysis of the documentation the Met Police used as a basis for the face recognition trials," Professor Fussey told Sky News. "There are some shortcomings and if [the Met] was taken to court there is a good chance that would be successfully challenged."

The co-authors also found "significant" operational problems – with obtaining the consent of those affected proving a particular issue.

When live facial recognition is used in public places, everyone who comes within range of the cameras is considered to be under overt surveillance.

The Met did make an effort to notify passers-by about its trials by putting out signs and tweeting about each deployment.

But the researchers observed "significant shortcomings" in this process – and said this created difficulty in gaining meaningful consent.

A recent BBC documentary captured an incident where a man was fined after refusing to take part in a facial recognition trial.

Professor Fussey and Dr Murray wrote: "Treating LFR camera avoidance as suspicious behaviour undermines the premise of informed consent.

"The arrest of LFR camera-avoiding individuals for more minor offences than those used to justify the test deployments raise clear issues regarding the extension of police powers and of 'surveillance creep'."

Their report also criticised the Met's use of "watch lists" – the registers of "wanted" people that facial recognition is supposed to help locate.

According to the report, the data used to create watch lists was not current, so people were stopped even though their case had already been addressed. In other cases, there were no clear reasons why people were put on watch lists, leaving "significant ambiguity" about the intended purpose of facial recognition.
