The Delicate Ethics of Using Facial Recognition in Schools

On a steamy evening in May, 9,000 people filled Stingaree Stadium at Texas City High School for graduation night. A rainstorm delayed ceremonies by a half-hour, but the school district’s facial recognition system didn’t miss a beat. Cameras positioned along the fenceline allowed algorithms to check every face that walked in the gate.

As the stadium filled with families, security staff in the press box received a notification that the system had spotted someone on their watchlist. It was a boy who had been expelled from the district and sent to a county disciplinary school, whose pupils are barred by district rules from visiting other campuses.

Less than 30 seconds after the boy sat down, a sheriff’s deputy asked for his name. When he replied, he was escorted from the stadium, and missed his sister’s graduation. “Mama was upset, but that’s the rules,” says Mike Matranga, executive director of security at Texas City Independent School District, on the shore of Galveston Bay south of Houston.

Matranga proudly relates the incident to show how facial recognition can make schools safer. It also shows how the nation’s schoolchildren have been thrust into a debate over the value—and the risks—of AI-enhanced surveillance.
WIRED identified eight public school systems, from rural areas to giant urban districts, that have moved to install facial recognition systems in the past year. There likely are many more. The technology watched over thousands of students returning to school in recent weeks, continually checking faces against watchlists compiled by school officials and law enforcement.

Administrators say facial recognition systems are important tools to respond to or even prevent major incidents such as shootings. But the systems are also being used to enforce school rules or simply as a convenient way to monitor students.

This spring, staff at Putnam City Schools in Oklahoma needed to check whether a student reported as having run away from home was at school. Rather than ask teachers, Cory Boggs, who directs IT for the district, tapped facial recognition cameras to quickly spot the student. “It’s a very, very efficient way of monitoring a group of people,” he says. Putnam City and Texas City both bought surveillance software called Better Tomorrow from AnyVision, an Israeli startup that media reports in its home country say supplies Israeli army checkpoints in the West Bank.

Not everyone likes the idea of facial recognition in schools. Last year, parents in Lockport, New York, protested plans by school officials to install a $1.4 million facial recognition system, saying it was inappropriate to use such potentially intrusive technology on children. “The moment they turn those cameras on, every student, including my daughter, is being surveilled by a system that can track their whereabouts and their associations,” says Jim Shultz, the parent of a Lockport junior. The district says it doesn’t intend to watch students; rather, officials say they want to keep out unwelcome visitors, including suspended students and local sex offenders.
The parent protests, reported first by the Lockport Journal, caught the attention of the New York Civil Liberties Union, which raised concerns about the accuracy of facial recognition algorithms on darker skin tones. The NYCLU noted that the district planned to include suspended students, who are disproportionately black, on its watchlist. Similar worries have helped motivate cities including San Francisco and Oakland to ban their public agencies from using facial recognition. In June, the state Education Department ordered Lockport to halt testing of the system.