Schools, and many other organizations, are concerned about the possibility of violence in their buildings, justifiably so in many areas. So administrators are looking for new technologies that would allow them to spot trouble before it happens, including AI-enhanced video and audio surveillance.

A new report from the non-profit investigative journalism organization ProPublica1 says that many companies are only too happy to sell them systems which, they claim, can detect an incident in the making.

By deploying surveillance technology in public spaces like hallways and cafeterias, device makers and school officials hope to anticipate and prevent everything from mass shootings to underage smoking. Sound Intelligence also markets add-on packages to recognize the sounds of gunshots, car alarms and broken glass, while Hauppauge, New York-based Soter Technologies develops sensors that determine if students are vaping in the school bathroom. The Lockport school district in upstate New York is planning a facial-recognition system to identify intruders on campus.

These systems rely on algorithms to sort through the sounds they capture and alert administrators when something matches particular patterns stored in a database. However, at least one system analyzed by ProPublica produced many false positives, while also recording and storing the audio collected by its microphones.

Yet ProPublica’s analysis, as well as the experiences of some U.S. schools and hospitals that have used Sound Intelligence’s aggression detector, suggest that it can be less than reliable. At the heart of the device is what the company calls a machine learning algorithm. Our research found that it tends to equate aggression with rough, strained noises in a relatively high pitch, like D’Anna’s [a student who worked with the reporters] coughing. A 1994 YouTube clip of abrasive-sounding comedian Gilbert Gottfried (“Is it hot in here or am I crazy?”) set off the detector, which analyzes sound but doesn’t take words or meaning into account. Although a Louroe spokesman said the detector doesn’t intrude on student privacy because it only captures sound patterns deemed aggressive, its microphones allow administrators to record, replay and store those snippets of conversation indefinitely.
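
ProPublica didn't publish the vendor's code, but the behavior they describe, a detector keying on loudness and pitch rather than on words or meaning, is easy to sketch. Here is a minimal, hypothetical version in Python; the features, templates, and threshold are all invented for illustration and are not Sound Intelligence's actual algorithm. It shows how a loud, high-pitched cough and a shout can look identical to a system that never considers what was said.

```python
# Minimal sketch of a pattern-matching "aggression detector".
# NOT the vendor's algorithm: features, templates, and threshold are hypothetical.
import numpy as np

SAMPLE_RATE = 16_000          # samples per second (assumed)
FRAME = SAMPLE_RATE // 2      # analyze half-second frames

# Hypothetical "aggression" templates: (RMS loudness, dominant frequency in Hz)
TEMPLATES = np.array([[0.6, 800.0], [0.7, 1200.0]])
THRESHOLD = 0.25              # how close a frame must be to a template to alert


def features(frame: np.ndarray) -> np.ndarray:
    """Loudness (RMS) and dominant frequency of one audio frame."""
    rms = np.sqrt(np.mean(frame ** 2))
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    return np.array([rms, freqs[np.argmax(spectrum)]])


def is_aggressive(frame: np.ndarray) -> bool:
    """Alert if the frame's features sit close to any stored template."""
    rms, dominant = features(frame)
    point = np.array([rms, dominant / 2000.0])       # put both features on a similar scale
    templates = TEMPLATES / np.array([1.0, 2000.0])
    distances = np.linalg.norm(templates - point, axis=1)
    return bool(distances.min() < THRESHOLD)


# A loud ~900 Hz tone (about the pitch of a strained shout, or a sharp cough)
t = np.arange(FRAME) / SAMPLE_RATE
shout_like = 0.6 * np.sin(2 * np.pi * 900 * t)
quiet_hum = 0.05 * np.sin(2 * np.pi * 120 * t)

print(is_aggressive(shout_like))  # True: alert, though no words were ever analyzed
print(is_aggressive(quiet_hum))   # False
```

Note that nothing in this loop distinguishes an angry voice from a cough or an abrasive comedian; it only measures how loud and how high-pitched a half-second of sound is, which is exactly the weakness ProPublica's testing exposed.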

As you might expect, surveillance technologies like this can often have side effects, especially when used with young people in schools.

Dr. Nancy Rappaport, an associate professor of psychiatry at Harvard Medical School who studies school safety, said the audio surveillance could have the unintended consequence of increasing student distrust and alienation. She added that schools are opting for inexpensive technological fixes over solutions that get to the root of the problem, such as more counseling for troubled kids. One Louroe microphone with aggression software sells for about $1,000.

Covering a school with microphones to spy on the kids is far cheaper than hiring qualified counselors who will actually interact with them.

And then there is the problem inherent in any kind of artificial intelligence: the underlying code that analyzes the data collected is written by human beings, and far too often it incorporates their biases.

Researchers have also found that implementing algorithms in the real world can go astray because of incomplete or biased training data or incorrect framing of the problem. For example, an algorithm used to predict criminal recidivism made errors that disproportionately punished black defendants.2
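
You can see that mechanism in a small simulation. Nothing below comes from the actual recidivism tool ProPublica investigated; the data, labels, and threshold are made up purely to show how biased training labels turn into false positives that fall unevenly across groups.

```python
# Toy illustration of biased training data skewing an algorithm's errors.
# NOT the COMPAS model: every number here is simulated for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 50_000

# Two groups with identical underlying behavior: a person's chance of
# reoffending depends only on an individual risk factor, not on group.
group = rng.integers(0, 2, size=n)
risk_factor = rng.normal(0.0, 1.0, size=n)
p_reoffend = 1.0 / (1.0 + np.exp(-(risk_factor - 1.0)))   # base rate around 27%
true_reoffend = rng.random(n) < p_reoffend

# Biased historical labels: group 1 is policed more heavily, so its
# reoffenses show up in the training data far more often than group 0's.
record_rate = np.where(group == 1, 0.9, 0.5)
recorded = true_reoffend & (rng.random(n) < record_rate)

# Train on the biased labels (group is a feature, standing in for any
# correlated attribute such as zip code).
X = np.column_stack([risk_factor, group]).astype(float)
model = LogisticRegression().fit(X, recorded)
flagged_high_risk = model.predict_proba(X)[:, 1] > 0.3

# False positive rate: flagged as high risk despite never reoffending.
for g in (0, 1):
    innocent = (group == g) & ~true_reoffend
    print(f"group {g}: false positive rate = {flagged_high_risk[innocent].mean():.2f}")
# Same true behavior in both groups, but group 1 is flagged far more often.
```

The model never does anything obviously wrong; it faithfully learns the patterns in its training data. The problem is that the training data itself reflects who was watched, not who did what.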

There is much more detail in the full report, including an explanation of how they tested the devices they purchased, along with audio and video of the students they enlisted to help. The whole thing is worth your time, especially if you teach in a school district that might be considering surveillance systems of any kind.

But I also wonder what students might think about this issue. About the idea of school administrators collecting and storing the sounds of their daily life. This report might make a wonderful jumping off point for discussion and further investigation in their community.


The image is one of the Sound Intelligence microphones purchased by ProPublica. It is intended to be installed on the ceiling, and the fact that it resembles a common smoke detector is probably not a coincidence.

1. For their wide-ranging investigations and reporting, ProPublica is well worth your financial support.

2. For much more about the problems with allowing algorithms to make decisions concerning people, I highly recommend the book “Weapons of Math Destruction” by Cathy O’Neil.