New York bans use of facial recognition in schools statewide | VentureBeat

Facial recognition should be a technology for individual use only. –

Link to the original article

Archive of all clips:
clips.quintarelli.it
(Evernote notebook).


People walk past a poster simulating facial recognition software at the Security China 2018 exhibition on public safety and security in Beijing, China October 24, 2018. Image Credit: Reuters / Thomas Peter

The New York legislature today passed a moratorium on the use of facial recognition and other forms of biometric identification in schools until 2022. The bill, which has yet to be signed by Governor Andrew Cuomo, comes in response to the planned launch of facial recognition by the Lockport City School District and appears to be the first in the nation to explicitly regulate or ban use of the technology in schools.
In January, Lockport became one of the few U.S. school districts to adopt facial recognition in all of its K-12 buildings, which serve about 5,000 students. Proponents argued the $1.4 million Aegis system, made by Canada-based SN Technologies, kept students safe by enforcing watchlists and sending alerts when it detected someone dangerous (or otherwise unwanted). It could also detect 10 types of guns and alert select district personnel and law enforcement. But critics said it might be used to surveil students and build a database of sensitive information the school district might struggle to keep secure.
While the Lockport schools’ privacy policy stated that the watchlist wouldn’t include students and the database would only cover non-students deemed a threat, including sex offenders or those banned by court order, district superintendent Michelle Bradley ultimately oversaw which individuals were added to the system. It was reported earlier this month that school board president John Linderman couldn’t guarantee student photos would never be included for disciplinary reasons.
“Once we allow [facial recognition tech in schools], it’s going to open a floodgate for millions and millions of dollars of state dollars being spent on technology that is really questionable in terms of its reliability, in terms of its accuracy, and in terms of its value, versus the risks that come with using this technology, in terms of privacy, in terms of false positives, and so many other things,” New York Assemblymember Monica Wallace, who proposed the bill passed today, said earlier this year. Senator Brian Kavanagh (D-NY) was a cosponsor.
The bill will also require the New York State Education Department to study the issue of biometric identification in schools and craft regulations.
The New York Civil Liberties Union, which filed a lawsuit against Lockport Schools a month ago on behalf of parents opposing the system, issued a statement on Wednesday praising the moratorium. “This is especially important as schools across the state begin to acknowledge the experiences of Black and Brown students being policed in schools and funneled into the school-to-prison pipeline,” said education policy center deputy director Stefanie Coyle. “Facial recognition is notoriously inaccurate, especially when it comes to identifying women and people of color. For children, whose appearances change rapidly as they grow, biometric technologies’ accuracy is even more questionable. False positives, where the wrong student is identified, can result in traumatic interactions with law enforcement, loss of class time, disciplinary action, and potentially a criminal record.”
Bradley said the district didn’t believe there was “any valid basis” on which it should be prevented from deploying and using the Aegis system. “Contrary to the constant misrepresentations by opponents of the Aegis, Aegis does not in any way record or retain biometric information relating to students or any other individuals on district grounds,” she said. “The legislative effort would result in over $1 million of taxpayer money being committed to an approved system that cannot be used to protect the district community from sex offenders and others who present a threat.”
Beyond Lockport Schools, a number of efforts to use facial recognition systems within schools have met with resistance from parents, students, alumni, community members, and lawmakers alike. At the college level, a media firestorm erupted after a University of Colorado professor was revealed to have secretly photographed thousands of students, employees, and visitors on public sidewalks for a military anti-terrorism project. University of California San Diego researchers admitted to studying footage of students’ facial expressions to predict engagement levels. And last year, the University of California Los Angeles proposed using facial recognition software for security surveillance as part of a larger campus security policy.
Fight for the Future, the Boston-based nonprofit that promotes causes related to copyright legislation, online privacy, and internet censorship, announced in January that it would team up with advocacy group Students for Sensible Drug Policy in an effort to ban facial recognition on university campuses in the U.S. To kick-start the grassroots movement, the organizations launched a website and an organizing toolkit for student groups.
Many experts consider facial recognition to be problematic. The Association for Computing Machinery (ACM) and American Civil Liberties Union (ACLU) continue to call for moratoriums on all forms of the technology. San Francisco and Oakland, along with Boston and five other Massachusetts communities, have banned police use of facial recognition technology. After the first wave of recent Black Lives Matter protests in the U.S., companies including Amazon, IBM, and Microsoft halted or ended the sale of facial recognition products. Benchmarks of major vendors’ systems by the Gender Shades project and the National Institute of Standards and Technology (NIST) have found that facial recognition technology exhibits racial and gender bias and performs poorly on people who don’t conform to a single gender identity. And facial recognition programs can be wildly inaccurate, misclassifying people upwards of 96% of the time.
