What is “Ed” Tech?

For one teacher, edtech included a Roomba.

It turns out the disc-shaped vacuum cleaner, which uses sensors to autonomously zip around homes, is also a great tool for teaching students about robotics and empathy.

Yung’s students learned all about how the Roomba moves, behaves and works. Then they set off to dream up and draw their own robots that could help people in the real world, like a robot that gives you a blanket when you sleep.

Ok, I can see building a lesson around understanding how a robotic vacuum works.

But does that make the device educational technology?

Not necessarily.

Let’s face it, “edtech” is a very broad term, and a wide variety of hardware, software and services have been tossed into that basket. So maybe we need to be a little more specific.

I don’t expect this to catch on, but I see at least two subcategories: teaching technology and learning technology.

That robot is learning technology only if kids are the ones using it. They should be playing with it, experimenting with it, programming it. Maybe even taking the device apart and changing its function, like the picture above.

Technology under the control and direction of adults is teaching technology: Google Classroom, FlipGrid, interactive whiteboards, most of the hot new stuff in your Twitter feed.

And that’s not a bad thing. It’s just that we need to make a distinction between technology used by teachers as part of their instruction and tech used by students as part of their learning.

They are not necessarily the same. Certainly not of equal value.

Just something to think about.


Image: Roomba hack: spirograph! by squidish on Flickr and used under a Creative Commons license.

Hey, Alexa. Explain Your Algorithms.

Lately I seem to be reading a lot about artificial intelligence. Between all the self-driving car projects and the many, many predictions about robots coming for our jobs (and our children), the topic is rather hard to avoid. It’s interesting but also somewhat scary, since we’re talking about creating machines that attempt to replicate, and even improve upon, the human decision-making process.

One of the better assessments of why we need to be cautious about allowing artificially-intelligent systems to take over from human judgement comes from MIT’s Technology Review, whose Senior Editor for AI says “no one really knows how the most advanced algorithms do what they do”.

If a person makes a decision, it’s possible (theoretically) to learn how they arrived at that choice by simply asking them. Of course it’s not as easy with children, but most adults are able to offer some kind of logical process explaining their actions. Even if that process is flawed and they arrive at the wrong conclusion.

It’s not so easy with machines.

There’s already an argument that being able to interrogate an AI system about how it reached its conclusions is a fundamental legal right. Starting in the summer of 2018, the European Union may require that companies be able to give users an explanation for decisions that automated systems reach. This might be impossible, even for systems that seem relatively simple on the surface, such as the apps and websites that use deep learning to serve ads or recommend songs. The computers that run those services have programmed themselves, and they have done it in ways we cannot understand. Even the engineers who build these apps cannot fully explain their behavior.

We’ve never before built machines that operate in ways their creators don’t understand. How well can we expect to communicate—and get along with—intelligent machines that could be unpredictable and inscrutable?

However, far below the level of television-ready robots and whatever Elon Musk is up to now, AI is a topic that educators should probably be watching.

More and more edtech companies are developing (and your school administrators are buying) “personalized” learning systems that include complex algorithms. These applications may fall short of being intelligent but will still collect huge amounts of data from students and then make “decisions” about the course of their educational life. 

It’s unlikely the salespeople can offer any clear explanation of how the systems work. Even the engineers who wrote the software may not have a good understanding of the whole package. Or know if there are errors in the code that could result in incorrect results.

And it’s not like you can ask the computer to explain itself.


The image is the logo from Steven Spielberg’s 2001 film A.I. Artificial Intelligence, a somewhat mediocre example of the genre. And the movie would have been far different if Stanley Kubrick had lived to direct it.

Don’t Call This Personal Learning

Once upon a time, like around three years ago, one of the hottest concepts in education reform was a collection of “micro schools” called AltSchool. The brainchild of a former Google executive, AltSchool mixed concepts from Montessori and other progressive educators with data-driven technology to produce a startup that attracted hundreds of millions in venture capital.

Today, not so much. AltSchool is now “pivoting” from operating a few expensive boutique private schools to marketing software for “personalizing” learning in regular public schools.

So, what magic is AltSchool selling in their algorithms?

The software allows educators to build curriculum, tailor assignments, set goals, offer feedback, and track a host of metrics from learners. Students using the platform can create customized “playlists” of assignments and monitor their progress, with the hope that their learning will become self-directed.

In other words, this one-time darling of Silicon Valley is marketing a more advanced, maybe even more intelligent, version of the teaching machines we were promised in the ’50s. And the programmed learning software schools overpaid for in the ’90s.

Call this personalized learning if you like. Maybe individualized instruction. But this approach, where every part of the learning process is mapped out by adults, is in no way personal.

I’m repeating myself but it’s worth restating…

Personalized learning is something done TO students. Personal learning is something students do for themselves.

One can be automated, the other is uniquely individual.

Why Are You Going to That?

Last week I ran into a former colleague in the supermarket and, during the brief impromptu catch-up, I mentioned that I would be spending the weekend in Philadelphia at EduCon. After reminding me that I was no longer working, she asked, “Why are you going to an education conference?”

I suppose it’s a valid question. I didn’t really have much of an answer at that point. That kind of encounter isn’t really designed for long-winded explanations. But blog posts are.

Ok, it’s quite true that I’m no longer employed by a school district, or being paid by any other organization. But that doesn’t mean I’m no longer an educator. At least I still think of myself in that way and I’m having a great time finding other ways to help people learn outside of the formal system. So, I was at EduCon to continue growing as an educator.

I was also in Philly to continue my personal learning. We talk a lot about “lifelong learning”, a concept we constantly try to sell to our students. Spending several days interacting with other educators at Science Leadership Academy is me putting that concept into action. Plus the city itself is a wonderful place to explore and learn from.

Finally, I return every year on a usually cold and windy January weekend for the community. EduCon is a unique event that attracts a relatively small, dynamic, diverse group of educators deeply interested in improving both their practice and American education in general. It’s refreshing to reconnect with that community for a few days of face-to-face conversations.

All of which means I already have the 2019 dates (January 25-27) locked on my calendar. Maybe you want to plan to join me?


Picture is of one packed EduCon session being streamed to the world.

There’s Nothing “Super” About It

Cathy Davidson has a new book in which she proposes “how to revolutionize the university to prepare students for a world in flux”. Nothing like setting high goals for yourself, is there?

In an excerpt from that book, she tackles the slightly less daunting issue of whether technology in the classroom benefits or hurts students.

She makes many great points in a very short space, but anyone who makes decisions about using technology in a K12 classroom should be required to demonstrate an understanding of this paragraph.

Here’s the connection between educational technophobia or technophilia: Both presume that technology in and of itself has superpowers that can either tank or replace human learning. Technology can automate many things. What it cannot automate is how humans learn something new and challenging. Neither can it, in its own right, rob us of our ability to learn when we want and need to learn. It can distract us, entice us away from the business of learning–but so can just about anything else when we’re bored.

Exactly. Technology is not a superhero. Or a super-villain. Good outcomes or bad (or something in-between) depends on how you use it.

Instead of either banning devices or automating information retrieval–whether from a screen or a lecturer droning on from the podium–the best pedagogical research we have reinforces the idea that learning in the classroom is most effective when it proceeds pretty much the way it does when we try to master something new outside of school: learning incrementally, being challenged, trying again. I even studied for my driver’s test that way–and certainly that’s what I do if I’m aspiring to something really difficult.

Incremental and challenging certainly doesn’t describe the test-driven process that passes for “learning” in most American schools. With or without technology.