wasting bandwidth since 1999


Alexa at ISTE

picture of Alexa device

The ISTE mega conference ended its run for this year yesterday and, although I wasn’t able to go, I know from experience one of the first questions attendees will get when they return to work: “What’s new?” or “What was hot?”

One attendee that actually gets paid to hype the new and hot at ISTE is EdSurge, a cheerleader for the edtech industry. They seem to think that Alexa, Amazon’s voice-activated AI system, is about to take off.

Four-and-a-half years after Amazon first released Alexa, its voice-activated virtual assistant, the technology is finding its footing in education.

Across the massive, brightly colored expo hall at ISTE 2019, the annual education technology conference where companies display and demo their latest gadgets, upgrades and software, several vendors showcased the new skills they have developed for Alexa-enabled devices.

I’m not sure three examples, one of which sounds more like a promotion for the ACT college testing program, could be considered any kind of trend towards creating a “footing” in education.

They also spoke with an educator who is very enthusiastic about the potential for Alexa as an instructional tool, despite the fact that Amazon itself has warned against using their devices in the classroom. She’s using something called Amazon Blueprints, “a website that allows users to build custom apps on the voice-enabled devices”, to create applications for her students to use.

Of course, the EdSurge writer doesn’t address the potential data privacy issues involved with having a device that records everything it hears in a classroom, other than to link to two other articles.

I also wonder whether having students interact with those “custom apps” is a good use of their time. At this point, Alexa’s “thinking” is a rather simplistic, yes-or-no, this-or-that kind of interaction, and the classroom applications I’ve read about (here and elsewhere) certainly reflect that.

As I’ve written about before, a better use for this kind of AI in the classroom would be to enable kids to investigate how Alexa works and to program it themselves.
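To give a sense of what that investigation might look like: here’s a minimal sketch in Python of keyword-based “intent matching”, the kind of thing students could build and then deliberately break. This is a crude caricature, not how Alexa actually works (real assistants use machine-learned language models), and the intent names and keywords here are made up for illustration:

```python
# A toy "intent matcher": a highly simplified model of one step a voice
# assistant performs -- mapping a spoken phrase to a canned intent.
# The intents and keywords below are invented for classroom illustration.

INTENTS = {
    "weather": {"weather", "rain", "temperature", "forecast"},
    "time": {"time", "clock", "hour"},
    "music": {"play", "song", "music"},
}

def match_intent(utterance: str) -> str:
    """Return the intent whose keywords overlap most with the utterance,
    or "unknown" when nothing matches."""
    words = set(utterance.lower().split())
    best, best_score = "unknown", 0
    for intent, keywords in INTENTS.items():
        score = len(words & keywords)
        if score > best_score:
            best, best_score = intent, score
    return best

if __name__ == "__main__":
    print(match_intent("what is the weather forecast today"))  # weather
    print(match_intent("tell me a joke"))                      # unknown
```

The interesting classroom conversation starts when the toy fails: “play it again” matches music, but “what song is this” might not, which leads directly to the question of why the real systems need something far more sophisticated, and what data they collect to get there.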

Update: If you’re interested in digging deeper into the privacy and data issues surrounding Alexa and other AI devices in the classroom, the article “Voice Assistants in the Classroom: Useful Tool or Privacy Problem?” by Susan Bearden from last November has a lot of good information.


Hey, Alexa. Push the clock ahead a few hours so I can get outta here. :-)

More About Alexa and Its AI Siblings

Following up on my previous rant about Alexa in the classroom, here are two good, related articles from Wired on the subject of artificial intelligence that are worth your time to read.

In one, the writer highlights sections of reports to regulators from both Alphabet (Google’s parent) and Microsoft that warn of possible “risk factors” in future products.

From Alphabet:

New products and services, including those that incorporate or utilize artificial intelligence and machine learning, can raise new or exacerbate existing ethical, technological, legal, and other challenges, which may negatively affect our brands and demand for our products and services and adversely affect our revenues and operating results.

Microsoft was more specific:

AI algorithms may be flawed. Datasets may be insufficient or contain biased information. Inappropriate or controversial data practices by Microsoft or others could impair the acceptance of AI solutions. These deficiencies could undermine the decisions, predictions, or analysis AI applications produce, subjecting us to competitive harm, legal liability, and brand or reputational harm.

On the other hand, Amazon, in a report to stockholders, is more worried about governments regulating its products than about Alexa activating Skynet sometime in the future.

The other post is a long excerpt from a book being published this month called “Talk to Me: How Voice Computing Will Transform the Way We Live, Work, and Think”.

It covers some pieces of recent history in the development of artificially intelligent products and the difficulty of programming a machine to understand the many ways that humans communicate.

I’m undecided about reading the whole book, but this part of it is worth 15 minutes.


The image is the user interface of HAL, the malfunctioning artificial intelligence from the 1968 film “2001: A Space Odyssey”. It also links to an interesting New York Times story of how the sound of HAL was created.

Hey, Alexa! What Are You Doing In The Classroom?

It’s very hard to escape all the hype around those voice-activated, quasi-AI powered personalities: Amazon’s Alexa, Apple’s Siri, Google’s Assistant.1

And, of course, some people bring up the idea of using them in the classroom.

A couple of weeks ago, I sat in on an ISTE webinar2 by a professor of education who was going to explain how we could use Alexa, what he classified as an “internet of things technology”, for teaching and learning.

Notice his thesis was centered on how to use Alexa with students. Not why.

Ok, I can certainly see how there might be a case for hands-free, artificially intelligent devices with certain students, such as those with visual or motor impairments. Maybe even to support students with reading disabilities.

But are these tools that can really help most students learn?

Currently Alexa and her competitors can only answer specific questions, when they aren’t monitoring the room for requests to place an Amazon order. Sometimes getting to those answers takes several attempts (as unintentionally demonstrated in some of the examples) as the human tailors the question format to fit the algorithms inside the box.

(I wonder how students with far less patience than the presenter would react to Alexa’s confusion.)

He also demonstrated some “add-ons” that would allow a teacher to “program” Alexa with what, to my ear, amounted to little more than audible flashcards and quiz-type activities.

So far, pretty basic stuff. But, when it comes to this supposedly new category of edtech, I have more than a few questions that go beyond how well the algorithm can retrieve facts and quiz kids.

Do we really want to be training our kids how to speak with Alexa (or Siri, or Google)? If we’re going to spend time helping them learn how to frame good questions, wouldn’t it be better if they were working on topics that matter? Topics that might have multiple or open-ended answers?

Instead of two-way artificial conversations with Alexa, how about if the kids learn the art of participating in a meaningful discussion with their peers? Or with other people outside of the classroom?

But if you really want to bring an AI voice into the classroom, why not use it as a great starting point for students to investigate that so-called “intelligence” behind the box?

Let’s do some research into how Siri works. Why does Google Assistant respond the way it does? Who created the algorithms that sit in the background, and why?

What might Amazon be doing with all the verbal data that Alexa is collecting? What could the company (and others?) learn from just listening to us?

The professor didn’t include any of that in his presentation, or anything related to the legal and ethical issues of putting an always-listening, network-connected device from a third party in a setting with children.

Some people in the chat room brought up COPPA, FERPA, and other privacy issues, but the speaker only addressed questions regarding this complex topic in the final few minutes of the session. As you might expect, he didn’t have any actual answers to these concerns.

Anyway, the bottom line to all this is that we need to consider suggestions of using Alexa, or any other always-listening device, in the classroom with a great deal of skepticism. The same goes for any other artificially intelligent device, software, or web service used by students.

At this point, there are far too many unanswered questions, including what’s in the algorithms and how the data collected is being used.


I have one of those HomePods by Apple in my house. I agree with the Wirecutter review: it’s a great speaker, especially for music, but Siri is definitely behind Alexa and Google Assistant in its (her?) artificial intelligence. On the other hand, I have more trust in Apple to keep my secrets. :-)

1. I excluded Samsung’s Bixby from that list because I know absolutely no one who has actually used it, despite its release two years ago.

2. You can see the webinar here but you’ll need to have a paid ISTE membership. His slide deck is available to everyone, however.

The Surveillance Classroom

During the 2016 holiday season, Amazon’s Alexa devices were huge sellers. Google was second in the category with Home. Apple just started shipping their Siri-enabled HomePod and will probably sell a bunch of them.

So now tens of millions of homes have always-listening internet-connected microphones listening to every sound, and more are coming. This despite the many cautions from privacy experts about allowing large corporations to have access to a new continuous stream of auditory data. 

But who cares if the artificially intelligent software powering these devices is buggy? Does it matter that Amazon, Google, and Apple are vague about how they are using that information and who has access to it? Let’s bring these boxes into the classroom!

Michael Horn, co-author of Disrupting Class, the hot education-change book from a decade ago, says Alexa and her friends are “the next technology that could disrupt the classroom”.

It’s not entirely clear why Horn believes a “voice-activated” classroom would improve student learning, other than that the superintendent he interviewed is concerned that kids “will come in and will be used to voice-activated environments and technology-based learning programs”.

That’s nothing new. For a few decades (at least) we have been throwing technology into the classroom based on the premise that kids have the stuff at home. That approach hasn’t been especially successful, and Alexa is not likely to change that.

But these days, a major reason for using many, if not most, new classroom technologies is collecting and analyzing data.

These devices could also send teachers real-time data to help them know where and how they should intervene with individual students. Eastwood imagines that over time these technologies would also know the different students based on their reading levels, numeracy, background knowledge, and other areas, such that it could provide access to the appropriate OER content to support that specific child in continuing her learning.

Maybe I’m wrong, but I think it’s better to have a teacher or other adult listening to kids.

Anyway, Horn presents a lot of questions about the use of Alexa and her peers in the classroom, but his last one is probably the most salient: “What is the best use of big data and artificial intelligence in education?” Before ending, he also very briefly touches on the security of that data – “And there are bound to be privacy concerns.” As I said, briefly.

But the bottom line to all this is whether we want Amazon, Google, or Apple surveillance devices collecting data on everything that happens in the classroom. Horn seems to think the technology could be disruptive. It sounds creepy and rather invasive to me.


The image is from an article about a contest Amazon is running for developers, with cash prizes for the best Alexa apps that are “educational, fun, engaging or all of the above for kids under the age of 13”.


© 2021 Assorted Stuff
