It’s very hard to escape all the hype around those voice-activated, quasi-AI powered personalities: Amazon’s Alexa, Apple’s Siri, Google’s Assistant.1
And, of course, some people bring up the idea of using them in the classroom.
A couple of weeks ago, I sat in on an ISTE webinar2 by a professor of education who was going to explain how we could use Alexa, what he classified as an “internet of things technology”, for teaching and learning.
Notice his thesis was centered on how to use Alexa with students. Not why.
Ok, I can certainly see how there might be a case for hands-free, artificially intelligent devices with certain students, such as those with visual or motor impairments. Maybe even to support students with reading disabilities.
But are these tools that can really help most students learn?
Currently, Alexa and her competitors can only answer specific questions, when they aren’t monitoring the room for requests to place an Amazon order. Sometimes getting to those answers takes several attempts (as unintentionally demonstrated in some of his examples) as the human tailors the question format to fit the algorithms inside the box.
(I wonder how students with far less patience than the presenter would react to Alexa’s confusion.)
He also demonstrated some “add-ons” that would allow a teacher to “program” Alexa with what, to my ear, amounted to little more than audible flashcards and quiz-type activities.
So far, pretty basic stuff. But, when it comes to this supposedly new category of edtech, I have more than a few questions that go beyond how well the algorithm can retrieve facts and quiz kids.
Do we really want to be training our kids how to speak with Alexa (or Siri, or Google)? If we’re going to spend time helping them learn how to frame good questions, wouldn’t it be better if they were working on topics that matter? Topics that might have multiple or open-ended answers?
Instead of two-way artificial conversations with Alexa, how about if the kids learn the art of participating in a meaningful discussion with their peers? Or with other people outside of the classroom?
But if you really want to bring an AI voice into the classroom, why not use it as a great starting point for students to investigate that so-called “intelligence” behind the box?
Let’s do some research into how Siri works. Why does Google Assistant respond the way it does? Who created the algorithms that sit in the background, and why?
What might Amazon be doing with all the verbal data that Alexa is collecting? What could the company (and others?) learn from just listening to us?
The professor didn’t include any of that in his presentation, or anything related to the legal and ethical issues of putting an always-listening, network-connected device from a third party in a setting with children.
Some people in the chat room brought up COPPA, FERPA, and other privacy issues, but the speaker only addressed questions regarding this complex topic in the final few minutes of the session. As you might expect, he didn’t have any actual answers to these concerns.
Anyway, the bottom line to all this is that we need to consider suggestions of using Alexa, or any other always-listening device, in the classroom with a great deal of skepticism. The same goes for any other artificially intelligent device, software, or web service used by students.
At this point, there are far too many unanswered questions, including what’s in the algorithms and how the data collected is being used.
I have one of those HomePods by Apple in my house. I agree with the Wirecutter review: it’s a great speaker, especially for music, but Siri is definitely behind Alexa and Google Assistant in its (her?) artificial intelligence. On the other hand, I have more trust in Apple to keep my secrets. :-)
1. I excluded Samsung’s Bixby from that list because I know absolutely no one who has actually used it, despite its having been released two years ago.