Facial Recognition in Our Schools: Finding the Right Balance Between Privacy and Safety

Gun violence in our schools is shattering records. According to The Washington Post, there were 46 shootings at K-12 schools in 2022, up from 42 in 2021. Thirty-four students and adults were killed in these shootings, and beyond those victims, 43,450 children witnessed or experienced a shooting.

But school violence isn’t just gun violence, even though that’s what captures the media’s attention. It also includes bullying, fighting, gang violence and sexual violence. According to the U.S. Department of Education, 77% of public schools recorded one or more incidents of crime in the 2019-2020 school year, amounting to 1.4 million incidents, or a rate of 29 incidents per 1,000 students enrolled.

Given these alarming statistics, it’s certainly time to revisit your school’s safety measures and the technologies used to safeguard students, faculty and staff.

Early Warning Technology

A growing number of schools are turning to facial recognition as an early warning technology, with programs focused on watchlist alerting: automatically alerting security personnel whenever a person on a watchlist enters school grounds.

Facial recognition systems effectively automate the manual process of memorizing the faces of potential security threats. With real-time facial recognition, school resource officers or other school staff can simply respond to system alerts (raised when a person of interest is recognized) to resolve the situation.
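To make that flow concrete, here is a minimal, illustrative sketch of what an alert handed to security staff might contain and how it might be surfaced. The names and fields are hypothetical placeholders, not Oosto's actual API or data model.

```python
# Illustrative sketch only: a hypothetical alert payload and notification hook,
# not Oosto's actual API or data model.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class WatchlistAlert:
    person_id: str        # which watchlist entry was recognized
    camera_id: str        # which camera produced the match
    seen_at: datetime     # when the match occurred

def notify_security(alert: WatchlistAlert) -> None:
    """Stand-in for a push notification or SMS to the school resource officer."""
    print(f"[ALERT] Watchlist entry '{alert.person_id}' recognized on "
          f"camera '{alert.camera_id}' at {alert.seen_at:%H:%M:%S}")

# Example: the recognition engine would emit something like this on a match.
notify_security(WatchlistAlert("barred_ex_student_042", "main_entrance_cam", datetime.now()))
```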

School resource officers can’t cover every entrance, and it’s practically impossible to reliably memorize hundreds or thousands of faces. A district may have hundreds of cameras, but if no one is watching those cameras at a given moment, they are almost worthless.

Facial recognition solutions, like those from Oosto, automate the detection and alerting process, replacing an old and unreliable way of identifying bad actors with real-time alerts.

Addressing Concerns

In many communities, there has been predictable backlash against the use of facial recognition on a number of ethical and privacy fronts. Let me address those concerns one at a time.

Concern 1: I don’t want my school turned into a prison. I don’t want my child surveilled 24×7.

The Facts: Most schools are already using video surveillance in some form. According to the National Center for Education Statistics, almost 80% of elementary schools have security cameras as part of their security system. This number rises to about 94% for high schools. So students are already being surveilled. Layering in facial recognition is all about providing real-time alerts to security staff so they can respond to threats quickly. Facial recognition systems are primarily being used to identify bad actors such as violent ex-students, registered sex offenders, non-custodial parents, or anyone who may have made threats against the students or staff.

“We have almost 700 cameras throughout the district, but if someone is not watching that camera at that time – the camera is almost worthless.” – Kip Robins, Director of Technology, Santa Fe ISD (2019)

NOTE: Oosto has embedded additional features to protect privacy. Upon detection of a face, we offer the option to blur, during video playback, the faces of bystanders who are not enrolled in the watchlist. Oosto also empowers customers to choose whether to record the faces of individuals not enrolled in the watchlist at all. With this feature enabled, the faces of non-enrolled individuals are not displayed in the gallery and are not saved on school servers. Our education clients can thus avoid unnecessary privacy risks and only record and process detections of individuals on the school’s watchlist.
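As an illustration of how playback blurring can work, here is a minimal sketch that blurs every detected face that did not match the watchlist. It assumes the recognition step has already produced bounding boxes and match flags; the function name and box format are assumptions, not Oosto's implementation.

```python
# Minimal sketch, assuming face boxes and watchlist-match flags are already
# available from the recognition step. Box format (x, y, w, h) is an assumption.
import cv2
import numpy as np

def blur_non_enrolled(frame: np.ndarray,
                      detections: list[tuple[tuple[int, int, int, int], bool]]) -> np.ndarray:
    """Return a playback frame with every non-watchlisted face blurred."""
    out = frame.copy()
    for (x, y, w, h), is_watchlisted in detections:
        if not is_watchlisted:
            region = out[y:y + h, x:x + w]
            out[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 0)
    return out

# Example: two detected faces, only the second matched the watchlist,
# so only the first is blurred before playback.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
detections = [((100, 80, 60, 60), False), ((400, 120, 60, 60), True)]
redacted = blur_non_enrolled(frame, detections)
```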

Concern 2: I’m concerned that the school is retaining the facial biometrics of my child. What if that data is breached and falls into the wrong hands?

The Facts: The biometric information captured for facial recognition is fleeting. When our technology detects a face (or faces) within a video frame, it instantly converts those faces to numerical vectors for comparison to the vectors of watchlisted people. If there is no match, none of those vectors are retained by the school district. Facial recognition companies generally do not store pictures of innocent students, staff or faculty within their systems.
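A simplified sketch of that match-then-discard flow is below. Real systems derive the vectors from a face-embedding model; here random vectors stand in, and the similarity threshold is an assumed value, not a vendor setting.

```python
# Simplified sketch of comparing a detected face's vector against the
# watchlist and discarding it on a non-match. Threshold is an assumption.
import numpy as np

MATCH_THRESHOLD = 0.8  # assumed cosine-similarity cutoff

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(face_vector, watchlist):
    """Return the matching watchlist ID, or None if no entry is close enough."""
    best_id, best_score = None, 0.0
    for entry_id, enrolled_vector in watchlist.items():
        score = cosine_similarity(face_vector, enrolled_vector)
        if score > best_score:
            best_id, best_score = entry_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None

# Example: a face that matches no one on the watchlist is simply dropped,
# so nothing about a non-enrolled person is retained.
watchlist = {"person_of_interest_1": np.random.rand(128)}
detected_vector = np.random.rand(128)   # stand-in for a real face embedding
if check_against_watchlist(detected_vector, watchlist) is None:
    del detected_vector                 # no match: the vector is not stored
```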

Importantly, none of this biometric information is transmitted to the facial recognition vendor (e.g., Oosto) over the Internet; it stays local and is managed at the district level. Even in the event of a data breach, the data would be worthless to hackers, since these numerical vectors are encrypted and effectively unreadable.
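To illustrate why breached vectors would be unreadable, here is one plausible way to encrypt an embedding at rest using the Python `cryptography` package's Fernet recipe; the actual cipher and key management a given deployment uses are not specified here.

```python
# One plausible way to keep stored vectors unreadable at rest; the real
# deployment's key management and cipher choice may differ.
import numpy as np
from cryptography.fernet import Fernet

key = Fernet.generate_key()     # in practice, held in the district's key store
fernet = Fernet(key)

vector = np.random.rand(128).astype(np.float32)   # an enrolled face embedding
ciphertext = fernet.encrypt(vector.tobytes())     # what actually sits on disk

# Without the key the ciphertext is meaningless; with it, the vector is recoverable.
restored = np.frombuffer(fernet.decrypt(ciphertext), dtype=np.float32)
assert np.array_equal(vector, restored)
```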

Concern 3: I’m concerned about the demographic bias that is baked into the facial recognition algorithms.

The Facts: While there is evidence that some older versions of facial recognition technology have struggled to perform consistently across various demographic factors, the oft-repeated claim that it is inherently less accurate in matching photos of minority subjects simply does not reflect the current state of the algorithms. According to the National Institute of Standards and Technology (NIST), which tests over 650 algorithms for accuracy, there are now over 100 algorithms that can match a photo out of a lineup of over 12 million photos, over 99% of the time, regardless of race or gender. Oosto’s accuracy in real-world implementations is also north of 99%.

Concern 4: I’m concerned about who gets to decide which people are placed on a school’s watchlist.

The Facts: Many schools have lists of known security threats. These often include ex-students, parents, and faculty prone to violence, as well as known sex offenders who live in the immediate vicinity. The school district determines who is added to and removed from the watchlist; these decisions are entirely within the district’s jurisdiction, and none are made by the facial recognition vendor. We encourage school districts to be transparent about the rationale and criteria used to add someone to their watchlist. Leading facial recognition companies like Oosto do not provide pre-populated watchlists to schools – they rely on the districts to create their own custom watchlists based on their specific security threats.
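One way a district could keep its watchlist transparent and auditable is to record, for every entry, who approved it and why. The sketch below is purely illustrative; the field names are hypothetical, not a vendor schema.

```python
# Purely illustrative: a district-managed watchlist where each entry carries
# its rationale and approver. Field names are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class WatchlistEntry:
    person_name: str
    reason: str           # e.g., "court order in effect", "prior threats against staff"
    approved_by: str      # district official who authorized the entry
    added_on: datetime = field(default_factory=datetime.now)

watchlist: dict[str, WatchlistEntry] = {}

def add_entry(entry_id: str, entry: WatchlistEntry) -> None:
    watchlist[entry_id] = entry      # additions are made only by district staff

def remove_entry(entry_id: str) -> None:
    watchlist.pop(entry_id, None)    # removal is equally under district control

add_entry("case-2023-017",
          WatchlistEntry("Jane Doe", "prior threats against staff", "District Safety Director"))
```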

The best way to address these concerns is to educate students, parents, staff, faculty, PTAs and your local community. When the community understands how the technology works, how the misconceptions of yesteryear no longer apply (e.g., demographic bias), and how biometric data is protected, they’re generally supportive.

It’s equally important to explain the drawbacks of leveraging video surveillance without facial recognition – this means relying on staff to memorize faces, watch video footage around the clock, and spread their limited resources thinly. These limitations unfortunately mean that a bad actor could slip through your defenses.

Given the shocking rise in school violence, that is not a gamble schools can afford to take.

About the Author

Powered by Vision AI, Oosto provides actionable intelligence to keep your customers, employees, and visitors safe.
