Privacy campaigners criticise Met Police launch of facial recognition technology

Big Brother Watch labelled the Met’s announcement a “stain” on the government’s human rights record.

Facial Recognition Technology

The decision to allow the Metropolitan Police to begin operational use of live facial recognition has been called a “stain” on the Government’s human rights record by privacy campaigners.

On Friday, the Met announced it will begin deploying the technology – following a number of trials – to help fight serious and violent crime, and to help find missing children and vulnerable people.

However, Silkie Carlo, executive director of UK civil liberties group Big Brother Watch, called for the decision to be reconsidered.

“This decision represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK,” she said.

“It flies in the face of the independent review showing the Met’s use of facial recognition was likely unlawful, risked harming public rights and was 81% inaccurate. This is a breath-taking assault on our rights and we will challenge it, including by urgently considering next steps in our ongoing legal claim against the Met and the Home Secretary.

“This move instantly stains the new Government’s human rights record and we urge an immediate reconsideration.”

Police in London will start using the facial recognition cameras “within a month”, the Met said.

Suspects wanted by police or the courts will be on “watchlists”, and if they are spotted by the cameras they will be approached by officers.

Trials of the cameras saw them used in locations including the Westfield shopping centre in Stratford and London’s West End.

Scotland Yard says the public will be aware of the surveillance, with officers handing out leaflets and the cameras placed in open locations.

Met Police assistant commissioner Nick Ephgrave said: “Every day our police officers are briefed about suspects they should look out for; live facial recognition improves the effectiveness of this tactic.

“Similarly if it can help locate missing children or vulnerable adults swiftly, and keep them from harm and exploitation, then we have a duty to deploy the technology to do this.”

A camera used during trials at Scotland Yard for the new facial recognition system (Stefan Rousseau/PA)

In response, the Information Commissioner’s Office (ICO) said the technology has “potentially significant privacy implications”.

Last year, it urged the Government to create a legal code of practice to ensure the technology was deployed safely.

It said in a statement: “We reiterate our call for Government to introduce a statutory and binding code of practice for LFR as a matter of priority.

“The code will ensure consistency in how police forces use this technology and to improve clarity and foreseeability in its use for the public and police officers alike. We believe it’s important for government to work with regulators, law enforcement, technology providers and communities to support the code.

“Facial recognition remains a high priority for the ICO and the public. We have several ongoing investigations. We will be publishing more about its use by the private sector later this year.”

But Mr Ephgrave said the Met is “in the business of policing by consent” and believes the force is striking an effective balance between the right to privacy and crime prevention.

He said: “Everything we do in policing is a balance between common law powers to investigate and prevent crime, and Article 8 rights to privacy.

“It’s not just in respect of live facial recognition, it’s in respect of covert operations, stop and search – there’s any number of examples where we have to balance individuals’ right to privacy against our duty to prevent and deter crime.”

The force claims the technology has a very low failure rate, with the system generating a false alert only once in every 1,000 cases.

However, by a different measure, research published last year by the University of Essex found the technology produced only eight correct matches out of 42 across the six trials it evaluated.

Campaign group Liberty called the decision a “dangerous, oppressive and completely unjustified move”.

The group’s advocacy director Clare Collier said: “Facial recognition technology gives the State unprecedented power to track and monitor any one of us, destroying our privacy and our free expression.

“Rolling out an oppressive mass surveillance tool that has been rejected by democracies and embraced by oppressive regimes is a dangerous and sinister step, pushing us towards a surveillance state in which our freedom to live our lives free from State interference no longer exists.”
