Some students in upstate New York returned from the holidays to schools equipped with controversial facial recognition software that district officials claim will enhance security. Parents and privacy advocates worry about the myriad potential abuses.
“It’s like Facebook. It doesn’t matter what the managers of the system promise, what matters is the technical capacity of the system because of its retroactive ability to go back and do whatever it wants,” said Lockport parent and Democracy Center founder Jim Shultz. “It’s like a time machine.” Shultz’s daughter is a junior at Lockport High School, and he said the students have no idea of the seriousness of what’s going on.
Lockport City School District officials contend that the security measures resemble those used in airports or casinos or commonly found on phones and some social media platforms. In a message to parents on the district’s website, Superintendent Michelle Bradley said the primary purpose of the system is to “detect and classify threats.” Recognizable threats include 10 types of guns and individual faces flagged as potential security risks such as sex offenders, suspended staff members, and anyone with a court order to stay off school property.
Critics quickly pointed out that the system only works to the extent that “suspicious individuals” are in its database. Weapons must also be fully visible to trigger a response, limiting the system’s potential to alert authorities to an imminent threat.
Only a handful of schools nationwide employ biometric technology, but the large-scale rollout to every school in the Lockport district gained national attention due to the pitched battle over its implementation.
Controversy erupted in early 2018 after the district purchased a $1.4 million software package, but its cost eventually ballooned to $2.7 million, which included the specialized cameras required to run the sophisticated system. The district used its portion of $2 billion in statewide taxpayer technology bond funds for the purchase.
School officials originally planned to tag select students as potential violent threats, claiming they could mitigate or even prevent a tragedy like the shooting at Marjory Stoneman Douglas High School in Parkland, Fla., where a gunman killed 17 people on Feb. 14, 2018. But the New York State Education Department intervened, first temporarily freezing implementation and ultimately ordering the district to exclude students from the database. The Lockport Board of Education voted last Wednesday to follow the state’s guidance, but Board President John Linderman made no guarantees it would remain that way.
“I think you always go back and you review it,” Linderman told the Lockport Union-Sun & Journal. “Can I say we’ll never add student photos to the system? I can’t say that. At this time, there is no intention to do that. We’re going to respect the request of the state.”
Concerned parents said they don’t know enough about the software’s capacity to track student and staff movements and monitor daily habits — when they come and go, whom they greet, even what they wear. All of that information could be exploited for illicit purposes.
“How we deploy artificial intelligence in our schools is something that we need to do very carefully, very thoughtfully, with transparency and with a real understanding that once you open up that Pandora’s box, everything’s on the loose,” Shultz said. “You just can’t put it back.”
For now, the system remains live and tracking. A possible legislative remedy sits in the New York State Senate Education Committee. If passed, it would enact a moratorium on facial recognition technology in schools until July 2022 to study the issue more thoroughly.