Crime: Met Police to deploy facial recognition cameras
 
Sat, 25 Jan 2020

The Metropolitan Police has announced it will use live facial recognition cameras operationally for the first time on London streets.

The cameras will be in use for five to six hours at a time, with bespoke lists of suspects wanted for serious and violent crimes drawn up each time.

The cameras identified 70% of suspects in trials, but an independent review found much lower accuracy, police sources said.

Privacy campaigners said it was a "serious threat to civil liberties".

Following earlier pilots in London and deployments by South Wales Police, the cameras are due to be put into action within a month.

Police say they will inform local communities and consult them in advance.

Cameras will be clearly signposted, covering a "small, targeted area", and police officers will hand out leaflets about the facial recognition scanning, the Met said.

Assistant Commissioner Nick Ephgrave said the Met has "a duty" to use new technologies to keep people safe, adding that research showed the public supported the move.

"We all want to live and work in a city which is safe: the public rightly expect us to use widely available technology to stop criminals," he said.

"Equally I have to be sure that we have the right safeguards and transparency in place to ensure that we protect people's privacy and human rights. I believe our careful and considered deployment of live facial recognition strikes that balance."

Giving an update on how the system works, Mr Ephgrave said it could also be used to find missing children or vulnerable adults.

Trials of the cameras have already taken place on 10 occasions in locations such as Stratford's Westfield shopping centre and the West End of London.

The Met said it tested the system during these trials using police staff whose images were stored in the database. The results suggested that 70% of wanted suspects would be identified walking past the cameras, while only one in 1,000 people generated a false alert.

But an independent review of six of these deployments, using different methodology, found that only eight out of 42 matches were "verifiably correct".
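
The gap between the two sets of results is largely a matter of base rates: when almost everyone walking past a camera is not on the watchlist, even a one-in-1,000 false alert rate can produce more wrong matches than right ones. The short Python sketch below works through that arithmetic; it uses the trial rates reported above, but the crowd size and number of watchlisted people passing by are assumptions chosen purely for illustration.

    # Base-rate sketch: why a 70% detection rate with a 1-in-1,000
    # false-alert rate can still yield mostly incorrect matches.
    # crowd_size and on_watchlist are assumed figures for illustration,
    # not numbers from the Met's trials.

    crowd_size = 10_000          # people scanned by the camera (assumed)
    on_watchlist = 10            # of whom this many are wanted suspects (assumed)
    detection_rate = 0.70        # Met trial figure: 70% of suspects flagged
    false_alert_rate = 1 / 1000  # Met trial figure: one false alert per 1,000 people

    true_alerts = on_watchlist * detection_rate
    false_alerts = (crowd_size - on_watchlist) * false_alert_rate

    precision = true_alerts / (true_alerts + false_alerts)
    print(f"Expected true alerts:  {true_alerts:.1f}")              # 7.0
    print(f"Expected false alerts: {false_alerts:.1f}")             # 10.0
    print(f"Share of alerts that are correct: {precision:.0%}")     # 41%

Under these assumptions, only about four alerts in ten would be correct, the same base-rate effect behind the review's finding that just eight of 42 matches could be verified.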

Campaigners have warned that accuracy may be worse for black and minority ethnic people, because the software is trained on predominantly white faces.

The Met said that the technology was "tried-and-tested" in the private sector, but previous uses of facial recognition have been controversial.

Big Brother Watch, a privacy campaign group, said the decision represented "an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK".

Silkie Carlo, the group's director, said: "It flies in the face of the independent review showing the Met's use of facial recognition was likely unlawful, risked harming public rights and was 81% inaccurate."

The 81% figure follows from the review's finding above: 34 of the 42 matches it examined, roughly 81%, could not be verified as correct.

Last year, the Met admitted it supplied images for a database carrying out facial recognition scans on a privately owned estate in King's Cross, after initially denying involvement.

The Information Commissioner launched an investigation into the use of facial recognition by the estate's developer, Argent, saying that the technology is a "potential threat to privacy that should concern us all". The investigation continues.

The ICO, which is the UK's data protection watchdog, said a broader inquiry into how police use live facial recognition technology found there was public support for its use, although it needed to be "appropriately governed, targeted and intelligence-led".

The Met Police has given assurances that it is taking steps to reduce intrusion, but the government should introduce a legally binding code of practice, an ICO spokeswoman said.

"This is an important new technology with potentially significant privacy implications for UK citizens," she said.

 
