Minority reported: Met Police’s facial recognition cameras scanned 13,000 people leading to ONE arrest – with AI surveillance picking out WRONG person seven out of eight times

  • Facial recognition devices identified the wrong person seven out of eight times
  • Cameras failed to spot any suspects out of 4,600 faces in London last February
  • Campaigners say cameras are ‘dangerously inaccurate and waste public money’

Facial recognition cameras scanned 13,000 people’s faces but only helped make one arrest, new figures show, as it’s revealed the AI surveillance kit picked out the wrong person seven out of eight times during one deployment. 

Met Police used facial recognition technology three times in London last year – but campaigners say it is ‘dangerously inaccurate and a waste of public money’.

During a search for a suspect in Oxford Circus, cameras scanned 8,600 faces against a list of 7,200 suspects.

Eight people were singled out by the technology, but seven of them were totally innocent, while a 35-year-old woman was arrested in connection with a serious assault on an emergency worker.

Last February the cameras scanned 4,600 faces against a list of nearly 6,000 people who were wanted in Stratford, East London. Not a single suspect was identified. 
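The headline figures above can be sanity-checked with a few lines of arithmetic. This is purely illustrative; the numbers come from the article, and the variable names are made up for the sketch:

```python
# Reported figures from the Oxford Circus and Stratford deployments
# (all numbers from the article; names are illustrative only).
oxford_scanned, oxford_flagged, oxford_correct = 8_600, 8, 1
stratford_scanned, stratford_flagged = 4_600, 0

# Precision: of the people the system flagged, how many were real matches?
precision = oxford_correct / oxford_flagged   # 1/8 = 0.125
false_match_share = 1 - precision             # 7/8 = 0.875

print(f"Oxford Circus precision: {precision:.1%}")            # 12.5%
print(f"Flagged in error: {false_match_share:.1%}")           # 87.5%
print(f"Flag rate: {oxford_flagged / oxford_scanned:.3%}")    # 0.093%
```

In other words, the system flagged fewer than one in a thousand passers-by, but seven out of every eight of those flags were wrong, which is the 'seven out of eight times' claim in the headline.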

Facial recognition cameras have been branded ‘dangerously inaccurate’ after a deployment at Oxford Circus last February scanned more than 8,000 faces, wrongly identified seven ‘suspects’ and resulted in just one arrest

The technology was used again in Oxford Circus in February, but the operation was stopped due to a technical fault after scanning an unknown number of faces.

Facial recognition software works by measuring the structure of a face, including the distance between features such as the eyes, nose, mouth and jaw. 

Police are alerted if a match is found, and officers then decide whether to speak to the potential suspect.

Silkie Carlo, director of Big Brother Watch, said: ‘The police’s own data shows facial recognition surveillance is dangerously inaccurate and a waste of public money, let alone a serious assault on our civil liberties. 

‘The Government should ban police and retailers alike using this Orwellian tech.’

Scotland Yard said officers only deploy the technology to specific policing operations. 

Met Police has not disclosed how many facial recognition cameras it has, or what they cost.

It is also unclear whether the technology has led to any convictions; Scotland Yard pointed only to the 35-year-old woman arrested at Oxford Circus 12 months ago.

A police operation at Stratford last February failed to identify a single suspect – despite scanning around 4,600 faces 

Specialist cameras identify potential suspects by measuring the structure of a face, including the distance between features such as the eyes, nose, mouth and jaw.

When operational, the system measures the faces of passers-by and compares them to a database of suspects used by Met Police.

It then alerts officers in the area to any potential matches.

Ultimately, it’s up to officers to decide whether or not to approach a potential suspect.
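The pipeline described above — reduce each face to a set of measurements, compare it against a watch list, and alert officers only on a close match — can be sketched in miniature. This is not the Met's actual system; the embeddings, names and threshold below are all invented for illustration:

```python
import numpy as np

# Minimal sketch of live facial recognition matching: each face is reduced
# to a numeric "embedding" summarising its geometry (eye spacing, jaw shape
# and so on), and a passer-by is flagged when their embedding is close
# enough to one on the watch list. All data here is synthetic.

rng = np.random.default_rng(0)

# Watch list: name -> 128-dimensional embedding (made-up reference data)
watch_list = {f"suspect_{i}": rng.normal(size=128) for i in range(3)}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_face(embedding: np.ndarray, threshold: float = 0.6):
    """Return the best watch-list match above the threshold, else None."""
    best_name, best_score = None, threshold
    for name, ref in watch_list.items():
        score = cosine_similarity(embedding, ref)
        if score > best_score:
            best_name, best_score = name, score
    return best_name  # a match is only an alert; officers decide what to do

# A passer-by who resembles suspect_1: their embedding plus a little noise
passer_by = watch_list["suspect_1"] + rng.normal(scale=0.1, size=128)
print(check_face(passer_by))  # suspect_1
```

The false positives described in the article correspond to strangers whose measurements happen to land above the threshold for someone on the list; lowering the threshold catches more suspects but flags more innocent people, which is the trade-off at the heart of the accuracy row.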

A Met Police spokesman told MailOnline today: ‘The Met will use any technology lawfully available to tackle crime.

‘We know from the trials that live facial recognition technology could assist our officers to locate criminals who are wanted for serious and violent offences, such as knife and gun crime, and the sexual exploitation of children.

‘The Met will publicise details of deployments online in advance, both on our corporate website, and typically through local borough communications channels.’

In October it was revealed there were around 1,000 AI scanners monitoring social distancing among pedestrians and cyclists, having originally been installed to monitor traffic in the capital.

Last month it was revealed London has the highest number of cameras outside of China, with an eye-watering 627,707 CCTV monitors set up across the UK’s capital. 

Met Police Commissioner Dame Cressida Dick has previously defended the tech.

Speaking last February, she said: ‘It is for critics to justify to the victims of those crimes why police should not use tech lawfully and proportionally to catch criminals who caused the victims real harm.

Met Police only deploy the technology in specific policing operations, but there are major privacy concerns among campaigners. London is currently home to the highest number of cameras outside of China, with an eye-watering 627,707 CCTV monitors set up across the UK’s capital

‘It is not for me and the police to decide where the boundary lies between security and privacy, though I do think it is right for us to contribute to the debate.’ 

Susan Hall, who leads the Conservatives on the Greater London Assembly, told The Times: ‘Facial recognition is a promising tool which can get dangerous individuals off our streets. 

‘However, by only uploading a fraction of its wanted list, the Met is severely limiting the number of criminals this technology can identify.’ 

Facial recognition technology has been installed in London by private companies.

In 2019 the developer behind a 67-acre site in King’s Cross admitted it had installed the technology, which can track tens of thousands of people every day. 

Canary Wharf planned to install facial recognition across its 97-acre estate; MailOnline has approached it to see whether those plans went ahead.

Argent, the property developer for the King’s Cross estate, said: ‘These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public.’ 

FACIAL RECOGNITION CAMERAS INSTALLED BY DEVELOPERS

Developers behind the 67-acre King’s Cross site admitted to installing facial recognition technology, which can track tens of thousands of people every day. 

Speaking in 2019, Argent, the property developer for the King’s Cross estate, said: ‘These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public.’ 

A spokeswoman declined to say what those systems were, how long the facial recognition had been in operation, or what the legal basis was for its use, as is required under European data protection law. 

It is also not known exactly who will have access to the cameras and what is stored on the system, including whether police will be able to make use of the information. 

Canary Wharf was said to be considering the plans at the time – MailOnline has approached the banking site to see if the plans went ahead.

Sources close to Canary Wharf told the Financial Times that if the technology were adopted it would not operate continuously on pedestrians and office workers, but would be limited to specific purposes or threats. 

According to the publication, Canary Wharf currently operates at least 1,750 CCTV cameras, as well as an automatic licence plate recognition system to track vehicles.
