What is “Ed” Tech?

Roomba hack: Spirograph!

For one teacher, edtech included a Roomba.

It turns out the disc-shaped vacuum cleaner, which uses sensors to autonomously zip around homes, is also a great tool to teach students about robotics and empathy.

Yung’s students learned all about how the Roomba moves, behaves, and works. Then they set off to dream up and draw their own robots that could help people in the real world, like a robot that gives you a blanket when you sleep.

Ok, I can see building a lesson around understanding how a robotic vacuum works.

But does that make the device educational technology?

Not necessarily.

Let’s face it, “edtech” is a very broad term, and a wide variety of hardware, software, and services have been tossed into that basket. So maybe we need to be a little more specific.

I don’t expect this to catch on, but I see at least two subcategories: teaching technology and learning technology.

That robot is learning technology only if kids are the ones using it. They should be playing with it, experimenting with it, programming it. Maybe even taking the device apart and changing its function, like the picture above.

Technology under the control and direction of adults is teaching technology. Stuff like Google Classroom, FlipGrid, interactive whiteboards, most of the hot new stuff in your Twitter feed.

And that’s not a bad thing. It’s only that we need to make a distinction between technology that is used by teachers as part of their instruction and tech that is used by students as part of their learning.

They are not necessarily the same. Certainly not of equal value.

Just something to think about.


Image: Roomba hack: spirograph! by squidish on Flickr and used under a Creative Commons license.

The Surveillance Classroom

During the 2016 holiday season, Amazon’s Alexa devices were huge sellers. Google was second in the category with Home. Apple just started shipping their Siri-enabled HomePod and they will probably sell a bunch of them.

So now tens of millions of homes have always-listening, internet-connected microphones picking up every sound, and more are coming. This despite the many cautions from privacy experts about allowing large corporations access to a new, continuous stream of auditory data.

But who cares if the artificially intelligent software powering these devices is buggy? Does it matter that Amazon, Google, and Apple are vague about how they are using that information and who has access to it? Let’s bring these boxes into the classroom!

Michael Horn, co-author of Disrupting Class, the hot education-change book from a decade ago, says Alexa and her friends are “the next technology that could disrupt the classroom”.

It’s not entirely clear why Horn believes a “voice-activated” classroom would improve student learning. Other than that the superintendent he interviewed is concerned that kids “will come in and will be used to voice-activated environments and technology-based learning programs”.

That’s nothing new. For a few decades (at least) we have been throwing technology into the classroom based on the premise that kids have the stuff at home. That approach hasn’t been especially successful, and Alexa is not likely to change that.

But these days, a major reason for using many, if not most, new classroom technologies is collecting and analyzing data.

These devices could also send teachers real-time data to help them know where and how they should intervene with individual students. Eastwood imagines that over time these technologies would also know the different students based on their reading levels, numeracy, background knowledge, and other areas, such that it could provide access to the appropriate OER content to support that specific child in continuing her learning.

Maybe I’m wrong, but I think it’s better to have a teacher or other adult listening to kids.

Anyway, Horn presents a lot of questions about the use of Alexa and her peers in the classroom but his last one is probably the most salient: “What is the best use of big data and artificial intelligence in education?” Before ending, he also very briefly touches on the security of that data – “And there are bound to be privacy concerns.” As I said, briefly.

But the bottom line to all this is whether we want Amazon, Google, or Apple surveillance devices collecting data on everything that happens in the classroom. Horn seems to think the technology could be disruptive. It sounds creepy and rather invasive to me.


The image is from an article about a contest Amazon is running for developers, with cash prizes for the best Alexa apps that are “educational, fun, engaging or all of the above for kids under the age of 13”.

Escaping Bloom’s Basement

Bloom's Taxonomy

A software developer whose company produces free writing tools makes a rather interesting observation: Edtech Is Trapped in Ben Bloom’s Basement.

The current wave of education technology has been fraught with pedagogically unsound replications of the worst aspects of teaching and learning. Rather than build new opportunities for students to move beyond the most basic building blocks of knowledge, much of Silicon Valley has been content to recreate education’s problematic status quo inside the four corners of a Chromebook, and then have the gall to call that innovation.

Bloom, of course, refers to the Taxonomy of Educational Objectives represented in the pyramid that should be familiar to every educator. But it’s also true that most edtech is stuck on the lower rungs of the scales specifically designed to assess the quality of technology use (lookin’ at you, SAMR).

Anyway, how do we get the use of instructional technology out of Bloom’s basement?

Climbing up Ben Bloom’s learning hierarchy won’t be easy, but it is necessary if we want to build education technology capable of helping learners move beyond basic remembering and understanding. There are two ways to do this: better tech or less tech.

Better tech entails leveraging cutting edge research in areas like machine learning to provide students with targeted feedback that scaffolds their learning experiences as they move up the pyramid. Less tech entails building technology that knows how to get out of the way and allow for more meaningful interactions to take place in the classroom. Today’s education technologists are exploring both approaches.

At this point he heads off into a promotion of products from his company, software he puts under the “better tech” category. And this is where he loses me.

Because I would argue that “less tech”, using the basic tools in creative ways, is the better path. Especially since that “better tech” he praises sounds a lot like programmed learning systems that are more about automating that “problematic status quo” he criticizes at the beginning of the post.

Even better than “less tech” would be technology that is controlled by students and used by them to explore, create and communicate. That, however, would require changes to the education system that go way beyond selecting software and devices.


The image is from the Vanderbilt University Center for Teaching and is used under a Creative Commons license.

This is, of course, a “modern” revised version of the concept. The original pyramid, one of “three hierarchical models used to classify educational learning objectives into levels of complexity and specificity” developed by a team of educators chaired by Benjamin Bloom, had synthesis and evaluation as the top two segments.

Tech Addiction Does Not Have a Tech Solution

The New York Times says the tech backlash is here.

Once uncritically hailed for their innovation and economic success, Silicon Valley companies are under fire from all sides, facing calls to take more responsibility for their role in everything from election meddling and hate speech to physical health and internet addiction.

The backlash against big tech has been growing for months. Facebook and Twitter are under scrutiny for their roles in enabling Russian meddling in the 2016 presidential election and for facilitating abusive behavior. Google was hit with a record antitrust fine in Europe for improperly exploiting its market power.

Evidently, the breaking point came this week when two of Apple’s large institutional investors began pressuring the company to study and find solutions to the addictive nature of their technology, especially among children. Their statement expresses a belief that the “long-term health of its youngest customers and the health of society, our economy and the company itself are inextricably linked”.

Ok, I understand the addictive properties of gadgets like smartphones and of social media sites like Facebook and Snapchat. But is this another case of blaming technology for human problems? Of demanding technological solutions instead of doing the difficult job of working collectively to change the culture?

I strongly disagree with media outlets that post clickbait headlines like “Your smartphone is making you stupid, antisocial and unhealthy” and follow them with unsupported statements like this:

They have impaired our ability to remember. They make it more difficult to daydream and think creatively. They make us more vulnerable to anxiety. They make parents ignore their children. And they are addictive, if not in the contested clinical sense then for all intents and purposes.

No. They – those evil smartphones – just sit there doing nothing until someone picks them up. They do not impede daydreaming and creativity and, in my opinion, can actually improve the ability to recall information. On their own, smartphones don’t make people more anxious. And they certainly do not “make” parents ignore their children.

Yes, companies like Apple¹ should provide tools to help mitigate the “addictive nature” of their products. And Facebook should use some of that highly touted “artificial intelligence” to do a better job of screening out anti-social messaging. All of them certainly need to do a better job of educating parents and teachers on how children interact with their products.

But at this point in history, it’s not possible to compel people to use those tools and that knowledge. I wonder if it’s even possible to teach people how to be more socially responsible on social media when some of the worst examples come from our political, business, and entertainment “leaders”.

So, the tech “backlash” is here and this debate will continue. With too much of the blame likely directed at the technology and the companies that create it. And not nearly enough of the responsibility accepted by those of us who use it.


¹ And Google, which provides the operating system for far more smartphones than Apple and is often ignored in these debates. Of course, much of Android is copied from iOS so there’s that. :-)

By All Means, Question The Screens

As promised, Jay Mathews is back with a followup to his promotion of a new anti-edtech book, written by two high school social studies teachers here in Fairfax County. If anything, his installment this week features even more cliches and overly broad generalizations.

Mathews begins by citing the long discredited myth of the “digital native”, and follows that by completely misrepresenting (and likely misunderstanding) the work of danah boyd. All in one paragraph.

The rest of the column is a messy collection of anecdotes and unsupported claims from the book.

Citing much research, they concluded, “the new digital world is a toxic environment for the developing minds of young people. Rather than making digital natives superlearners, it has stunted their mental growth.”

I would love to see the academic studies they found using phrases like “toxic environment” and “superlearners”.

But what about the broad range of research that arrives at very different conclusions? While the negative side too often gets the headlines, the research on how technology impacts learning is hardly conclusive. And this blanket statement alone makes me think the authors are not going for any kind of balance in this book.

Then there’s this reasoning:

Social studies teachers, they reported, are being encouraged to move “to DBQs, or document-based questions, which are simply research papers where the teacher has done all the research for the students.” Clement and Miles stick with real research papers, after students learn about different types of evidence and plan investigative strategies. Yet their students often become frustrated when devices don’t lead them to a useful source right away.

Let’s completely ignore the issue of whether writing “real” research papers is even a valid assignment anymore.

Back before the evil internet, very few teachers just dumped their kids in the library and ignored their frustration with finding appropriate material. The process of searching for, validating, and using evidence was a key part of the learning. It still is. If anything, these skills are even more important for students now. And banning the use of “screens” for research borders on educational malpractice.

The only idea by Mathews and the authors in this mess that makes any sense is that parents (and students and teachers) should ask questions about the use of technology in schools. But “May I opt my child out of screen-based instructional activities?” is not one of them.

Instead, go deeper and challenge educators to respond to queries like “How can the use of technology change and improve the learning experience for my child?” or “How will your instructional practice change to help my child make the best use of the technology available?”

Bottom line, I certainly support questioning the use of “screens” in the classroom. However, recommending, as the authors (and probably Mathews) do, that “teachers reject most of ed-tech” is completely unrealistic and extremely short-sighted.

It’s a matter of how, not whether.