
School Is Hearing You

Schools, and many other businesses, are concerned with the possibility of violence in their buildings, justifiably in many areas. So, administrators are looking for new technologies that would allow them to spot trouble before it happens. Technologies that include AI-enhanced video and audio surveillance.

A new report from the non-profit investigative journalism organization ProPublica [1] says that many companies are only too happy to sell them systems which, they claim, can detect an incident in the making.

By deploying surveillance technology in public spaces like hallways and cafeterias, device makers and school officials hope to anticipate and prevent everything from mass shootings to underage smoking. Sound Intelligence also markets add-on packages to recognize the sounds of gunshots, car alarms and broken glass, while Hauppauge, New York-based Soter Technologies develops sensors that determine if students are vaping in the school bathroom. The Lockport school district in upstate New York is planning a facial-recognition system to identify intruders on campus.

These systems rely on algorithms to sort through the sounds they collect and alert administrators when something matches patterns stored in a database. However, at least one system analyzed by ProPublica produced many false positives, while also recording and storing the audio collected by its microphones.

Yet ProPublica’s analysis, as well as the experiences of some U.S. schools and hospitals that have used Sound Intelligence’s aggression detector, suggest that it can be less than reliable. At the heart of the device is what the company calls a machine learning algorithm. Our research found that it tends to equate aggression with rough, strained noises in a relatively high pitch, like D’Anna’s [a student who worked with the reporters] coughing. A 1994 YouTube clip of abrasive-sounding comedian Gilbert Gottfried (“Is it hot in here or am I crazy?”) set off the detector, which analyzes sound but doesn’t take words or meaning into account. Although a Louroe spokesman said the detector doesn’t intrude on student privacy because it only captures sound patterns deemed aggressive, its microphones allow administrators to record, replay and store those snippets of conversation indefinitely.
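None of the vendors publish their code, but a minimal sketch of the general idea, threshold matching on crude acoustic features, helps show why false positives come so easily. Everything below (the feature choices, the thresholds, the synthetic clips) is my own invention for illustration, not Sound Intelligence’s algorithm.

```python
# Toy "aggression detector": flag any clip whose pitch, loudness and roughness
# all cross fixed thresholds. The features and numbers are illustrative guesses.
import numpy as np

SAMPLE_RATE = 16000  # samples per second

# Hypothetical stored "aggression" pattern: loud, rough, relatively high-pitched
PATTERN = {"min_pitch_hz": 300, "min_loudness": 0.2, "min_roughness": 0.1}

def estimate_pitch(clip):
    """Very crude pitch estimate from zero crossings."""
    crossings = np.count_nonzero(np.diff(np.signbit(clip).astype(int)))
    return crossings * SAMPLE_RATE / (2 * len(clip))

def is_aggressive(clip):
    loudness = np.sqrt(np.mean(clip ** 2))      # RMS energy
    roughness = np.std(np.abs(np.diff(clip)))   # how jagged the waveform is
    pitch = estimate_pitch(clip)
    return (pitch >= PATTERN["min_pitch_hz"]
            and loudness >= PATTERN["min_loudness"]
            and roughness >= PATTERN["min_roughness"])

# Two synthetic one-second clips: a harsh sustained shout and a harmless cough.
t = np.linspace(0, 1, SAMPLE_RATE)
shout = 0.8 * np.sign(np.sin(2 * np.pi * 400 * t))
cough = 0.8 * np.sign(np.sin(2 * np.pi * 500 * t)) * np.exp(-3 * t)

for name, clip in [("shout", shout), ("cough", cough)]:
    print(f"{name}: {'ALERT' if is_aggressive(clip) else 'ok'}")
```

Nothing in that loop looks at words or meaning, only at the shape of the waveform, so it has no way to tell a threat from a cough, a slammed locker, or a 1994 Gilbert Gottfried clip.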

As you might expect, surveillance technologies like this often have side effects, especially when used with young people in schools.

Dr. Nancy Rappaport, an associate professor of psychiatry at Harvard Medical School who studies school safety, said the audio surveillance could have the unintended consequence of increasing student distrust and alienation. She added that schools are opting for inexpensive technological fixes over solutions that get to the root of the problem, such as more counseling for troubled kids. One Louroe microphone with aggression software sells for about $1,000.

Covering a school with microphones to spy on the kids is far cheaper than hiring qualified counselors who will actually interact with them.

Anyway, then there is the problem inherent in any kind of artificial intelligence: the underlying code that analyzes the data collected is written by human beings, and far too often incorporates their biases.

Researchers have also found that implementing algorithms in the real world can go astray because of incomplete or biased training data or incorrect framing of the problem. For example, an algorithm used to predict criminal recidivism made errors that disproportionately punished black defendants. [2]
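The recidivism model’s internals aren’t public, but a toy simulation (entirely my own numbers and assumptions, not COMPAS or ProPublica’s data) shows the mechanism: when one group’s past offenses are recorded more often than another’s, a risk rule built on those records will flag innocent people from that group at a higher rate, even though the underlying behavior is identical.

```python
# Toy simulation of bias in the data, not in the behavior: both groups offend
# and reoffend at the same true rates, but group B's offenses are recorded as
# arrests more often. All rates and thresholds here are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
group = rng.integers(0, 2, n)            # 0 = group A, 1 = group B
will_reoffend = rng.random(n) < 0.30     # identical true base rate in both groups

# Same true prior offending, different recording rates.
prior_offenses = rng.poisson(2.0, n)
record_prob = np.where(group == 1, 0.9, 0.45)
prior_arrests = rng.binomial(prior_offenses, record_prob)

# Naive "risk model" built on the recorded history: 2+ prior arrests = high risk.
flagged = prior_arrests >= 2

for g, name in enumerate(["group A", "group B"]):
    wrongly_flagged = flagged[(group == g) & ~will_reoffend].mean()
    print(f"{name}: {wrongly_flagged:.0%} of people who won't reoffend flagged as high risk")
```

With these made-up numbers, a much larger share of the non-reoffenders in group B end up labeled high risk, purely because of how their histories were recorded.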

There is much more detail in the full report, including an explanation of how they tested the devices they purchased, along with audio and video of the students they enlisted to help. The whole thing is worth your time, especially if you teach in a school district that might be considering surveillance systems of any kind.

But I also wonder what students might think about this issue. About the idea of school administrators collecting and storing the sounds of their daily life. This report might make a wonderful jumping off point for discussion and further investigation in their community.


The image is one of the Sound Intelligence microphones purchased by ProPublica. It is intended to be installed on the ceiling, and the fact that it resembles a common smoke detector is probably not a coincidence.

1. For their wide-ranging investigations and reporting, ProPublica is well worth your financial support.

2. For much more about the problems with allowing algorithms to make decisions concerning people, I highly recommend the book “Weapons of Math Destruction” by Cathy O’Neil.

Alternatives to Fear

Following up on that last post: in addition to the study summary, the Consumer Reports article ends, after all the scary stuff, with nine ways to protect yourself online.

Most of them make a lot of sense, and not just on Facebook.

Think before you type. Even if you delete an account (which takes Facebook about a month), some info can remain in Facebook’s computers for up to 90 days.

Regularly check your exposure. Each month, check out how your page looks to others. Review individual privacy settings if necessary.

Protect basic information. Set the audience for profile items, such as your town or employer. And remember: Sharing info with “friends of friends” could expose it to tens of thousands.

Know what you can’t protect. Your name and profile picture are public. To protect your identity, don’t use a photo, or use one that doesn’t show your face.

So, why aren’t we teaching that stuff in school? Helping kids understand how to build a responsible and safe online presence.

As for the uproar elsewhere in the article over “cyberbullying” on Facebook, isn’t one child bullying another a concern regardless of where it takes place? Bullying occurs on playgrounds, in locker rooms, and in malls. We don’t ban playgrounds, close locker rooms, or impose age limits on malls.

The problem is with the people involved, not the location, and that is how the problem should be addressed. This is less about Facebook and more about the need for adults to pay closer attention and communicate with the kids in their lives.

The Magic Switch Doesn’t Work

With the standard cautions about accepting any one report/study/poll as conclusive proof of anything, this is still something we as educators need to pay attention to.

A study from the Internet Safety Technical Task Force, a project created by the Attorneys General from 49 states, found that the solicitation of children on the web has been greatly exaggerated.

While even one child being solicited is bad, the researchers found that the few who were often had other issues affecting their online conduct.

The task force, led by the Berkman Center for Internet and Society at Harvard University, looked at scientific data on online sexual predators and found that children and teenagers were unlikely to be propositioned by adults online. In the cases that do exist, the report said, teenagers are typically willing participants and are already at risk because of poor home environments, substance abuse or other problems.

The report also says that online bullying is a far bigger problem and that social networks are not the “horribly bad neighborhoods” they have been portrayed in the media.

As you might expect, the task force was charged with identifying “technologies that might play a role in enhancing safety for children online.”

Also to be expected, they found that the technologies now in place “do not appear to offer substantial help in protecting minors from sexual solicitation.”

Ok, whether or not you accept the findings of this particular study and other recent research that arrives at similar conclusions, I would hope that one thing is becoming clear.

Technology will not be the solution to protect kids on the web. There is no magic switch to flip that will make it happen.

Instead it’s going to take a lot of hard work on the part of the adults in the lives of children to help them cope with their real problems and understand how to live and work safely online.

Hopscotch is Next

A few years back many schools decided to throw dodgeball off the playground due to concerns about student safety. Some have done the same with tag and other games.

Now, in England, “[t]he sack race and three-legged race have been banned from a school sports day because the children might fall over and hurt themselves”.

Correct me if I’m wrong, but isn’t the risk of falling over and getting hurt part of being a kid and growing up?

Safety is the Smallest Part of the Equation

I’ve ranted before about the law passed by our state legislature requiring all teachers to include lessons about internet safety in their curriculums.

But, while the web is a fast-changing place, things move pretty slowly here in the real world: the bill was enacted two years ago.

Virginia public schools will soon launch Internet safety lessons across all grade levels, responding to a state mandate that is the first of its kind in the nation. Even though today’s students have known no life without the Internet, only a couple of states have laws that recommend schools teach online safety.

In Virginia, local school systems have been rewriting policies, running pilot programs and putting final touches on lesson plans to be offered from kindergarten through 12th grade starting in September.

At least the state got it right in one big way. The message must come from the teachers and be integrated with their other instruction, instead of something like a single inoculation-type assembly.

Before they work with the kids, however, the teachers themselves need to understand internet safety, and we still have far too many adults who actually believe the email from the IRS about their lost tax refund.

It would be best if we could do that instruction face-to-face, including meaningful discussions about how to best present this to the kids.

But since our overly large school district has more people involved in instruction than many systems have kids, we fall back on an automated approach. Our teachers will be learning about internet safety by watching a series of online videos.

As you might expect, the material presented in those videos is rather negative. Not at the Dateline/Fox Alert level, but still pretty bad.

However, instead of teaching “internet safety” by warning kids about talking to strangers in chat rooms or posting nasty stuff on MySpace, I wish we could take a more positive approach to integrating this into the classroom.

Internet safety should be part of information literacy, the process of helping students understand that there is good and bad material on the web and how to tell the difference.

We need to teach them how to be constructive web publishers as part of their work in learning science, social studies and the rest of the subjects we expect them to know, not just how to be “safe”.

Because this really isn’t about “safety” anymore. Knowing how to responsibly and effectively add content to the web is fast becoming a life skill.

Many students will be doing just that as part of their future jobs, in addition to the recreational publishing activities they’re involved with now (whether we like it or not).

So the bottom line to teaching internet safety is that scaring kids (and adults, for that matter) into responsible use of the web only covers part of what they’ll need going forward.

A very small part.
