Facial recognition wrongly identifies public as potential criminals 96% of time in London

Yet another report that facial recognition for policing purposes doesn't work…

Link to the original article

Archive of all clips:
clips.quintarelli.it
(Evernote notebook).

Facial recognition wrongly identifies public as potential criminals 96% of time, figures reveal

14-year-old black schoolboy among those wrongly fingerprinted after being misidentified

Facial recognition technology has misidentified members of the public as potential criminals in 96 per cent of scans so far in London, new figures reveal.

The Metropolitan Police said the controversial software could help it hunt down wanted offenders and reduce violence, but critics have accused it of wasting public money and violating human rights.

The trials have so far cost more than £222,000 in London and are subject to a legal challenge and a separate probe by the Information Commissioner. 

Eight trials carried out in London between 2016 and 2018 resulted in a 96 per cent rate of “false positives” – where the software wrongly alerts police that a person passing through the scanning area matches a photo on the database.
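To put that figure in context: the 96 per cent is the share of the system's alerts that turned out to be wrong, not the share of all passers-by misidentified. Here is a minimal sketch of the arithmetic in Python, with hypothetical per-deployment counts (only the aggregate 96 per cent rate comes from the published figures; the individual numbers below are illustrative):

```python
# Hypothetical alert counts per deployment: (alerts raised, alerts confirmed correct).
# Only the aggregate 96% rate comes from the Big Brother Watch figures;
# these per-trial numbers are purely illustrative.
trials = [
    (28, 1),  # hypothetical deployment A
    (22, 1),  # hypothetical deployment B
]

total_alerts = sum(alerts for alerts, _ in trials)
true_matches = sum(correct for _, correct in trials)
false_positives = total_alerts - true_matches

print(f"False positive rate: {false_positives / total_alerts:.0%}")  # -> 96%
```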

Two deployments outside the Westfield shopping centre in Stratford last year saw a 100 per cent failure rate, and monitors said a 14-year-old black schoolboy was fingerprinted after being misidentified.

Police allegedly stopped people for covering their faces or wearing hoods, and one man was fined for a public order offence after refusing to be scanned in Romford. 

Scotland Yard called the trials “overt” but The Independent found shoppers unaware facial recognition was being used, and campaigners accused police of rolling out the technology “by stealth”.

The figures did not cover two of the Metropolitan Police’s 2019 facial recognition trials, which will be included in its own review.

The force said eight arrests were made as a direct result of its most recent deployment, which used a flagging system for wanted violent criminals.

“All alerts against the watch list are deleted after 30 days and faces in the video stream that do not generate an alert are deleted immediately,” a spokesperson added.
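Taken at face value, the stated retention rules are simple. A hypothetical sketch of that logic in Python follows; this illustrates the policy as described by the spokesperson, not the Met's actual implementation, and the function and variable names are assumptions:

```python
from datetime import datetime, timedelta

ALERT_RETENTION = timedelta(days=30)  # per the spokesperson: alerts kept 30 days

def process_face(face_id, watch_list, alert_log, now=None):
    """Illustrative sketch of the stated retention policy, not the Met's code."""
    now = now or datetime.utcnow()
    if face_id in watch_list:
        # A match against the watch list is logged as an alert...
        alert_log.append({"face": face_id, "seen": now})
    # ...while faces that generate no alert are never stored: the frame data
    # simply falls out of scope here and is discarded immediately.

    # Alerts older than the 30-day window are purged.
    alert_log[:] = [a for a in alert_log if now - a["seen"] < ALERT_RETENTION]
```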

Big Brother Watch, which obtained the data through a freedom of information request, called for police to drop the technology.

Director Silkie Carlo said: “This is a turning point for civil liberties in the UK. If police push ahead with facial recognition surveillance, members of the public could be tracked across Britain’s colossal CCTV networks. 

“For a nation that opposed ID cards and rejected the national DNA database, the notion of live facial recognition turning citizens into walking ID cards is chilling.”

[Embedded media: Facial recognition trial in London’s West End]

Ms Carlo warned that British police were setting “a dangerous example to countries around the world”, adding: “It would be disastrous for policing and the future of civil liberties and we urge police to drop it for good”.

Big Brother Watch has raised £10,000 for a legal challenge that argues facial recognition “breaches fundamental human rights protecting privacy and freedom of expression”.

The Metropolitan Police said it was “aware that the accuracy of the live facial recognition technology has been subject to some debate and challenge”. 

“When using the technology, further checks and balances are always carried out before police action is taken,” a spokesperson added. “The final decision to engage with an individual flagged by the technology is always made by a human. 

“We understand that the public will rightly expect the use of this technology to be rigorously scrutinised and used lawfully. The technology itself is developing all the time and work is being done to ensure the technology is as accurate as possible.”

The Metropolitan Police said a “full independent evaluation” of the trials was ongoing and the conclusions would be made public in due course.

A parliamentary debate was told that South Wales Police and Leicestershire Police have also used live facial recognition, while other forces use software to compare new images of suspects to their databases.

Darren Jones, the Labour MP for Bristol North West, said the technology “can be used without individuals really knowing it is happening” by police, councils and private firms.

“We have known for many years that the way the police have been processing the facial images of innocent citizens is unlawful,” he told Westminster Hall on Wednesday.

“The system is not fit for purpose … The fact that in 2019, going into 2020, we do not know what we are doing to fix it or how it will be fixed is wholly unsatisfactory.”

[Image: An unmarked police van carrying facial recognition cameras and software on deployment in London’s West End on 17 December 2018 (Lizzie Dearden)]

Scottish National Party MP Stuart McDonald said the “hugely problematic” technology must only be used proportionately and in a targeted way.

“There are huge concerns about the impact of such technology on privacy and freedoms such as the freedom of assembly,” he added. 

“Studies have shown that such technology can disproportionately misidentify women and black and minority ethnic people, and as a consequence people from those groups are more likely to be wrongly stopped and questioned.”

Louise Haigh, Labour’s shadow policing minister, said that police may use facial recognition with good intentions but “mess it up”.

“I do not get a strong impression that individual police forces are learning from each other,” she added. “In the case of the Met … it has been trialled for three years in a row. When does a trial become a permanent fixture?”

Nick Hurd, the policing minister, said the government supported pilots and believed facial recognition held “real opportunities”.

He said “extremely legitimate privacy concerns” were being examined by parliamentary committees and watchdogs, while the Home Office was “working with police to improve their procedures”.

“We are not a surveillance state and have no intention of becoming one,” Mr Hurd added. “That means that we must use new technologies in ways that are sensitive to their impact on privacy, and ensure that their use is proportionate.”
