
UK launched passport photo checker it knew would fail with dark skin

Technology

9 October 2019

By Adam Vaughan

Face recognition often struggles to recognise people with certain skin tones (Image: Iza Habur/Getty Images)
The UK government went ahead with a face-detection system for its passport photo checking service, despite knowing the technology failed to work well for people in some ethnic minorities.
Face recognition technology has a record of failing to recognise people with certain skin tones. For example, Google had to apologise in 2015 when its photos app labelled a black couple as gorillas.
Now, documents released by the Home Office this week show it was aware of problems with its website’s passport photo checking service, but decided to use it regardless.

“User research was carried out with a wide range of ethnic groups and did identify that people with very light or very dark skin found it difficult to provide an acceptable passport photograph,” the department wrote in a document released in response to a freedom of information (FOI) request. “However, the overall performance was judged sufficient to deploy.”
Since the service went live in June 2016, some users have had problems with the photo checker. Joshua Bada, a black sports coach, was recently told by the system that his photo didn’t meet requirements after it mistook his lips for an open mouth.


Cat Hallam, a black technology officer at Keele University, UK, found the service wrongly suggested her eyes were closed and her mouth was open. “What is very disheartening about all of this is they were aware of it,” Hallam told New Scientist.
“A person’s race should not be a barrier to using technology for essential public services,” says a spokesperson for the UK’s Equality and Human Rights Commission. “We are disappointed that the government is proceeding with the implementation of this technology despite evidence that it is more difficult for some people to use it based on the colour of their skin.”
Face detection software is normally trained on thousands of images. One way bias can enter the system is if the training data isn’t large enough or diverse enough to represent the group it will be used on.
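This kind of disparity is typically surfaced by comparing error rates across demographic groups. The sketch below is purely illustrative, using synthetic numbers (not real Home Office data) to show how a per-group audit of a photo checker's false-rejection rate would expose the problem described in the article:

```python
# Minimal sketch of a per-group bias audit for a photo checker.
# All data below is synthetic and illustrative; no real system is modelled.

def rejection_rate(results):
    """Fraction of valid photos wrongly rejected (False = rejected)."""
    return sum(1 for accepted in results if not accepted) / len(results)

# Hypothetical checker outcomes (True = photo accepted) for photos that
# should all pass, grouped by the subject's skin tone.
outcomes = {
    "light":  [True] * 95 + [False] * 5,    # 5% wrongly rejected
    "medium": [True] * 97 + [False] * 3,    # 3% wrongly rejected
    "dark":   [True] * 78 + [False] * 22,   # 22% wrongly rejected
}

rates = {group: rejection_rate(r) for group, r in outcomes.items()}
for group, rate in rates.items():
    print(f"{group}: {rate:.0%} false rejection")

# One simple disparity metric: ratio of worst to best group rate.
disparity = max(rates.values()) / min(rates.values())
print(f"disparity ratio: {disparity:.1f}x")
```

An audit like this only reveals a disparity that already exists; fixing it means retraining the detector on data that adequately represents every group it will be used on.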
Sam Smith of campaign group MedConfidential, which submitted the FOI request, says: “Clearly, they deployed what they had anyway – they’ve taken the standard as ‘if no one else’s image analysis works for black people, then ours doesn’t have to either’.”
The Home Office says users can override the photo checker and proceed with their passport application, but observers say that misses the point.
“Even with the user being able to override the selection, it is still creating a – largely racialised – disparity in experience between users,” says Os Keyes at the University of Washington, Seattle.


Users may be reluctant to use the override function given that the website warns people they may have a problem with their application if the photo doesn’t meet the rules, says Hallam.
The government said that to mitigate the issue it would “continue to conduct user research and usability testing with appropriate participants to ensure that users from different ethnicities can follow the photo guidance and provide a photo that passes the photo checks”.
A Home Office spokesperson told New Scientist: “We are determined to make the experience of uploading a digital photograph as simple as possible, and will continue working to improve this process for all of our customers.”
The government promise of “we’ll fix it later” is “a depressingly common response to people pointing out biases in technology”, says Keyes.
Samir Jeraj at the Race Equality Foundation says: “It’s outrageous. It clearly shows it wasn’t a priority for them that it would work for people with black skin.” Jeraj called on the government to be clearer and more robust about what improvements it will make, and by when. In the meantime, he adds, it would not cost the passport office anything to put a note on its website acknowledging the issue.
Noel Sharkey, a computer scientist at the University of Sheffield, UK, says the case shows the need for new regulation. “It is well known that current face recognition systems are highly inaccurate for people with darker shades of skin. That is bad enough, but then using a technology that the passport office knows to be biased is unacceptable and normalises injustice. We urgently need new regulation to stop any inequalities.”
