Met Police to deploy facial recognition cameras

Let's not let data interfere with decisions.
Saviour technology.
A bit like ultrasonic mosquito repellents.

Link to the original article

Archive of all clips:
clips.quintarelli.it
(Evernote notebook).


Image (PA Media): Police have already run trials of facial recognition cameras in London

The Metropolitan Police has announced it will use live facial recognition cameras operationally for the first time on London streets.

The cameras will be in use for five to six hours at a time, with bespoke lists of suspects wanted for serious and violent crimes drawn up each time.

Police say the cameras identified 70% of suspects, but an independent review found much lower accuracy. Privacy campaigners said it was a “serious threat to civil liberties”.

Following earlier pilots in London and deployments by South Wales Police, the cameras are due to be put into action within a month. Police say they will warn local communities and consult with them in advance. Cameras will be clearly signposted, covering a “small, targeted area”, and police officers will hand out leaflets about the facial recognition scanning, the Met said.

Assistant Commissioner Nick Ephgrave said the Met has “a duty” to use new technologies to keep people safe, adding that research showed the public supported the move.

“We all want to live and work in a city which is safe: the public rightly expect us to use widely available technology to stop criminals,” he said.

“Equally I have to be sure that we have the right safeguards and transparency in place to ensure that we protect people’s privacy and human rights. I believe our careful and considered deployment of live facial recognition strikes that balance.”

Accuracy concerns

Mr Ephgrave said the system could also be used to find missing children or vulnerable adults.

Trials of the cameras have already taken place on 10 occasions, in locations such as Stratford’s Westfield shopping centre and the West End of London. The Met said it tested the system during these trials using police staff whose images were stored in the database. The results suggested that 70% of wanted suspects would be identified walking past the cameras, while only one in 1,000 people generated a false alert.

But an independent review of six of these deployments, using a different methodology, found that only eight out of 42 matches were “verifiably correct”.

Campaigners have warned that accuracy may be worse for black and minority ethnic people, because the software is trained on predominantly white faces.

‘The Met is convinced it has public support’

Over the past four years, as the Met has trialled facial recognition, opposition to its use has intensified, led in the UK by the campaign groups Liberty and Big Brother Watch. They exploited a nervousness on the part of senior police officers about speaking out in favour of the technology at a time when the government was preoccupied with other matters. But testing and work on the system has continued and now, after the election of a Boris Johnson-led administration whose party promised in its manifesto to “empower the police to safely use new technologies”, Scotland Yard has made its move. The force also believes a recent High Court judgment, which said South Wales Police did not breach the rights of a man whose face had been scanned by a camera, gives it some legal cover.

The case is heading for the Court of Appeal. But the Met is pressing on, convinced that the public at large will support its efforts to use facial recognition to track down serious offenders, even if civil liberties campaigners do not.

The Met said that the technology was “tried-and-tested” in the private sector, but previous uses of facial recognition have been controversial.

Big Brother Watch, a privacy campaign group, said the decision represented “an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK”.

Silkie Carlo, the group’s director, said: “It flies in the face of the independent review showing the Met’s use of facial recognition was likely unlawful, risked harming public rights and was 81% inaccurate.”

Last year, the Met admitted it supplied images for a database carrying out facial recognition scans on a privately owned estate in King’s Cross, after initially denying involvement.

The Information Commissioner launched an investigation into the use of facial recognition by the estate’s developer, Argent, saying that the technology is a “potential threat to privacy that should concern us all”. The investigation continues.

The ICO, which is the UK’s data protection watchdog, said a broader inquiry into how police use live facial recognition technology found there was public support for its use, although it needed to be “appropriately governed, targeted and intelligence-led”.

The Met Police has given assurances that it is taking steps to reduce intrusion, but the government should introduce a legally binding code of practice, an ICO spokeswoman said.

“This is an important new technology with potentially significant privacy implications for UK citizens,” she said.

In South Wales, one man is challenging police in the Court of Appeal over their use of the technology, which has been trialled at public events since 2017. Ed Bridges is appealing against a ruling that South Wales Police did not breach his human rights when it scanned his face on the street in Cardiff city centre and later at a peaceful protest at an arms fair in the city.
