wasting bandwidth since 1999

Category: data issues


Talking to Alexa: Skill of the Future?

When some shiny new technology starts getting lots of hype, you can bet that articles on using it in school will quickly follow. That’s certainly true of those rapidly multiplying voice-activated artificially-intelligent devices.

The title of a recent article in Education Week asks if interacting with these devices is a “skill of the future”, or just a bad idea.

Mining Data (Non-Intrusively)

Related to the previous post, one way I follow news around the right to be forgotten, and the larger topic of data privacy, is using Google news alerts. It's not a great research system; it's more like mining for gold: most days the alerts deliver a whole lot of dirt, mostly from obscure sites.

Then there are the little nuggets that occasionally show up.

Like this press release from a UK start-up that promises to “give people back ownership of their personal data online”.

Ok, tell me more.

Do You Really Have a Right to be “Forgotten”?


One “tech” story that seemed to be missing from most of those decade-in-review pieces is that of the “right to be forgotten”. Although this issue is really more about privacy and use of personal data, it’s more relevant to tech than stuff like Amazon extorting local communities in their “search” for a new headquarters.

The concept first jumped on the radar for most people in 2014 when the European Court of Justice ruled that EU citizens had a right to ask search engines to remove “inadequate, irrelevant or no longer relevant” information from their results.1 I first began following this issue around that time and, as you might expect, nothing about this decision turned out to be that simple.

School Is Hearing You

Schools, and many other businesses, are concerned with the possibility of violence in their buildings, justifiably in many areas. So, administrators are looking for new technologies that would allow them to spot trouble before it happens. Technologies that include AI-enhanced video and audio surveillance.

A new report from the non-profit investigative journalism organization ProPublica1 says that many companies are only too happy to sell them systems which, they claim, can detect an incident in the making.

By deploying surveillance technology in public spaces like hallways and cafeterias, device makers and school officials hope to anticipate and prevent everything from mass shootings to underage smoking. Sound Intelligence also markets add-on packages to recognize the sounds of gunshots, car alarms and broken glass, while Hauppauge, New York-based Soter Technologies develops sensors that determine if students are vaping in the school bathroom. The Lockport school district in upstate New York is planning a facial-recognition system to identify intruders on campus.

These systems rely on algorithms to sort through the sounds they capture and alert administrators when something matches particular patterns stored in a database. However, at least one system analyzed by ProPublica produced many false positives, while also recording and storing the audio collected by its microphones.

Yet ProPublica’s analysis, as well as the experiences of some U.S. schools and hospitals that have used Sound Intelligence’s aggression detector, suggest that it can be less than reliable. At the heart of the device is what the company calls a machine learning algorithm. Our research found that it tends to equate aggression with rough, strained noises in a relatively high pitch, like D’Anna’s [a student who worked with the reporters] coughing. A 1994 YouTube clip of abrasive-sounding comedian Gilbert Gottfried (“Is it hot in here or am I crazy?”) set off the detector, which analyzes sound but doesn’t take words or meaning into account. Although a Louroe spokesman said the detector doesn’t intrude on student privacy because it only captures sound patterns deemed aggressive, its microphones allow administrators to record, replay and store those snippets of conversation indefinitely.
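That failure mode follows directly from the design ProPublica describes: a detector that scores only acoustic features, ignoring words and meaning, will flag anything that happens to match the profile. Here's a purely illustrative toy sketch of that idea; the vendor's actual algorithm is proprietary, and every function, threshold, and feature value below is invented for the sake of the example:

```python
# Toy sketch of a feature-threshold "aggression" detector.
# NOT the vendor's real algorithm; all numbers are invented.
# It scores a sound on pitch and "roughness" alone, with no
# access to words or meaning, so anything rough and high-pitched
# triggers it, whether it's a fight, a cough, or a comedy clip.

def toy_aggression_score(pitch_hz, roughness):
    """Hypothetical score combining high pitch and strained timbre."""
    # Contribution rises for pitches above ~200 Hz (made-up cutoff)
    pitch_term = max(0.0, (pitch_hz - 200.0) / 300.0)
    # roughness is assumed to be a 0..1 measure of strained timbre
    return min(1.0, 0.6 * pitch_term + 0.4 * roughness)

THRESHOLD = 0.5  # arbitrary alert cutoff for illustration

# Invented (pitch_hz, roughness) feature values:
samples = {
    "calm classroom speech": (120.0, 0.1),
    "student coughing":      (350.0, 0.8),  # rough and high-pitched
    "abrasive comedian":     (400.0, 0.9),  # matches the same profile
}

for label, (pitch, rough) in samples.items():
    flagged = toy_aggression_score(pitch, rough) >= THRESHOLD
    print(f"{label}: {'ALERT' if flagged else 'ok'}")
```

In this sketch the cough and the comedy clip both trip the alert while calm speech does not, which is exactly the pattern ProPublica's testing found: the detector responds to the sound's shape, not to what anyone is actually saying.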

As you might expect, surveillance technologies like this can often have side effects, especially when used with young people in schools.

Dr. Nancy Rappaport, an associate professor of psychiatry at Harvard Medical School who studies school safety, said the audio surveillance could have the unintended consequence of increasing student distrust and alienation. She added that schools are opting for inexpensive technological fixes over solutions that get to the root of the problem, such as more counseling for troubled kids. One Louroe microphone with aggression software sells for about $1,000.

Covering a school with microphones to spy on the kids is far cheaper than hiring qualified counselors who will actually interact with them.

Anyway, then there is the problem inherent with any kind of artificial intelligence: the underlying code that analyzes the data collected is written by human beings, and far too often incorporates their biases.

Researchers have also found that implementing algorithms in the real world can go astray because of incomplete or biased training data or incorrect framing of the problem. For example, an algorithm used to predict criminal recidivism made errors that disproportionately punished black defendants.2

There is much more detail in the full report, including an explanation of how they tested the devices they purchased, along with audio and video of the students they enlisted to help. The whole thing is worth your time, especially if you teach in a school district that might be considering surveillance systems of any kind.

But I also wonder what students might think about this issue. About the idea of school administrators collecting and storing the sounds of their daily life. This report might make a wonderful jumping off point for discussion and further investigation in their community.


The image is one of the Sound Intelligence microphones purchased by ProPublica. It is intended to be installed on the ceiling, and the fact that it resembles a common smoke detector is probably not a coincidence.

1. For their wide-ranging investigations and reporting, ProPublica is well worth your financial support.

2. For much more about the problems with allowing algorithms to make decisions concerning people, I highly recommend the book “Weapons of Math Destruction” by Cathy O’Neil.

No, Google Is Not Free

One of the 800-pound gorillas at ISTE, of course, is Google. They are “gold” sponsors (meaning they kicked in more money than the silver and bronze level companies) and have a huge presence on the vendor floor, both in their own booths and in the booths of dozens (hundreds?) of other companies that connect to them in some way. Plus many, many sessions and posters deal with their various education-related products.

And one term commonly associated with all of this Googley goodness is free. Educators love free, and they don’t pay to use GSuite, Classroom, Expeditions, Maps, Earth, ChromeOS, Photos, storage, and, of course, Search.

Except these services really are not free.

Instead of sending Google money, education users and their students, like the rest of us, are contributing labor and data to the company. In Google’s own words:

The Google Privacy Policy describes fully how Google services generally use information, including for G Suite for Education users. To summarize, we use the information we collect from all of our services to provide, maintain, protect and improve them, to develop new ones, and to protect Google and our users. We also use this information to offer users tailored content, such as more relevant search results. We may combine personal information from one service with information, including personal information, from other Google services. [emphasis mine]

Another way to look at our relationship with Google comes from an essay in Slate about another free product that’s been in the news lately, Facebook.

There are at least two alternative ways of viewing our relationship to Facebook… The first is to view ourselves as customers of Facebook, paying with our time, attention, and data instead of with money. This implies greater responsibility on both sides.

The second is to view ourselves as part of Facebook’s labor force. Just as bees labor unwittingly on beekeepers’ behalf, our posts and status updates continually enrich Facebook.

Swap Google for Facebook in those statements. We help their marketing department by putting their name and products in front of students, often for many hours a day. We also provide the labor to help develop and test products that will make them a lot of money.

And do not assume students are protected by working in a “closed” Google Education environment. Unless your network never connects to the outside world, there are many ways for Google (and others) to connect your “anonymous” students to advertisers, now and in the future.

Anyway, even with all that, I’m not trying to convince you to quit using Google’s products, either personally or in the classroom. I use some of them myself (although not as much as I used to). I even present conference sessions and workshops encouraging teachers to use Google Earth and Google’s other geo-related resources for their instruction.

However, everyone needs to understand that the cost of “free” admission to Google (or any other services that don’t charge at the door) is your data. Data that is stored, analyzed, connected with other data, and occasionally sold, stolen, or otherwise distributed to third parties. With your permission, since you agreed to the terms of service you probably didn’t read when registering for that first Gmail address.

So, by all means, continue using Google and other free services. But, in the wise words of Sgt. Phil Esterhaus, let’s be careful out there.


If you’ve never seen the classic cop show Hill Street Blues, you’ve missed some good TV. At least for the first three or so seasons. I think it’s available on Hulu and maybe other streaming services.

