Hey, Alexa. Explain Your Algorithms.


Lately I seem to be reading a lot about artificial intelligence. Between all the self-driving car projects and the many, many predictions about robots coming for our jobs (and our children), the topic is rather hard to avoid. It’s interesting but also somewhat scary, since we’re talking about creating machines that attempt to replicate, and even improve upon, the human decision-making process.

One of the better assessments of why we need to be cautious about allowing artificially intelligent systems to take over from human judgement comes from MIT’s Technology Review, whose Senior Editor for AI says “no one really knows how the most advanced algorithms do what they do”.

If a person makes a decision, it’s possible (theoretically, at least) to learn how they arrived at that choice by simply asking them. Of course it’s not as easy with children, but most adults are able to offer some kind of logical process explaining their actions, even if that process is flawed and leads them to the wrong conclusion.

It’s not so easy with machines.

There’s already an argument that being able to interrogate an AI system about how it reached its conclusions is a fundamental legal right. Starting in the summer of 2018, the European Union may require that companies be able to give users an explanation for decisions that automated systems reach. This might be impossible, even for systems that seem relatively simple on the surface, such as the apps and websites that use deep learning to serve ads or recommend songs. The computers that run those services have programmed themselves, and they have done it in ways we cannot understand. Even the engineers who build these apps cannot fully explain their behavior.

We’ve never before built machines that operate in ways their creators don’t understand. How well can we expect to communicate—and get along with—intelligent machines that could be unpredictable and inscrutable?
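To make the opacity concrete, here’s a tiny sketch of my own (nothing to do with the article’s examples, using scikit-learn purely as a stand-in): a small neural network that learns to classify quite well, while everything it “knows” sits in weight matrices that offer no human-readable rationale.

```python
# A minimal illustration of the black-box problem: the model's
# "knowledge" is nothing but arrays of numbers.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))             # 200 made-up "users", 5 features each
y = (X[:, 0] + X[:, 3] > 0).astype(int)   # hidden rule the model must infer

model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, y)

print(model.score(X, y))                  # high accuracy on this toy data
for layer, weights in enumerate(model.coefs_):
    print(f"layer {layer}: weight matrix of shape {weights.shape}")
# Nothing in those matrices says *why* any individual decision was made.
```

You can inspect every number in a model like this and still have no story to tell a user about their particular decision.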

However, far below the level of television-ready robots and whatever Elon Musk is up to now, AI is a topic that educators should probably be watching.

More and more edtech companies are developing (and your school administrators are buying) “personalized” learning systems built around complex algorithms. These applications may fall short of being truly intelligent, but they still collect huge amounts of data from students and then make “decisions” about the course of their educational lives.

It’s unlikely the salespeople can offer any clear explanation of how the systems work. Even the engineers who wrote the software may not have a good understanding of the whole package, or know whether there are errors in the code that could produce incorrect results.

And it’s not like you can ask the computer to explain itself.

The image is the logo from Steven Spielberg’s 2001 film A.I. Artificial Intelligence, a somewhat mediocre example of the genre. The movie would have been far different if Stanley Kubrick had lived to direct it.

Don’t Call This Personal Learning

Once upon a time, like around three years ago, one of the hottest concepts in education reform was a collection of “micro schools” called AltSchool. The brainchild of a former Google executive, AltSchool mixed concepts from Montessori and other progressive educators with data-driven technology to produce a startup that attracted hundreds of millions in venture capital.

Today, not so much. AltSchool is now “pivoting” from operating a few expensive boutique private schools to marketing software for “personalizing” learning in regular public schools.

So, what magic is AltSchool selling in their algorithms?

The software allows educators to build curriculum, tailor assignments, set goals, offer feedback, and track a host of metrics from learners. Students using the platform can create customized “playlists” of assignments and monitor their progress, with the hope that their learning will become self-directed.
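AltSchool hasn’t published its internals, so this is purely my own hypothetical sketch, but stripped of the marketing, a “playlist” is probably not much more exotic than a list of assignments with some progress bookkeeping:

```python
# Hypothetical sketch of a "playlist" of assignments. The names and
# fields are mine, not AltSchool's; it just shows how mundane the idea is.
from dataclasses import dataclass, field

@dataclass
class Assignment:
    title: str
    goal: str                      # learning goal set by the educator
    completed: bool = False
    score: float | None = None     # one of the tracked metrics

@dataclass
class Playlist:
    student: str
    assignments: list[Assignment] = field(default_factory=list)

    def progress(self) -> float:
        """Fraction of assignments marked complete."""
        if not self.assignments:
            return 0.0
        return sum(a.completed for a in self.assignments) / len(self.assignments)

playlist = Playlist("student-42", [
    Assignment("Fractions review", "number sense"),
    Assignment("Essay draft", "persuasive writing"),
])
playlist.assignments[0].completed = True
print(playlist.progress())  # 0.5
```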

In other words, this one-time darling of Silicon Valley is marketing a more advanced, maybe even more intelligent, version of the teaching machines we were promised in the ’50s. And of the programmed learning software schools overpaid for in the ’90s.

Call this personalized learning if you like. Maybe individualized instruction. But this approach, where every part of the learning process is mapped out by adults, is in no way personal.

I’m repeating myself but it’s worth restating…

Personalized learning is something done TO students. Personal learning is something students do for themselves.

One can be automated, the other is uniquely individual.

Why Are You Going to That?

Last week I ran into a former colleague in the supermarket and, during the brief impromptu catch-up, I mentioned that I would be spending the weekend in Philadelphia at EduCon. After reminding me that I was no longer working, she asked, “Why are you going to an education conference?”

I suppose it’s a valid question. I didn’t really have much of an answer at that point. That kind of encounter isn’t really designed for long-winded explanations. But blog posts are.

Ok, it’s quite true that I’m no longer employed by a school district, or being paid by any other organization. But that doesn’t mean I’m no longer an educator. At least I still think of myself in that way and I’m having a great time finding other ways to help people learn outside of the formal system. So, I was at EduCon to continue growing as an educator.

I was also in Philly to continue my personal learning. We talk a lot about “lifelong learning”, a concept we constantly try to sell to our students. Spending several days interacting with other educators at Science Leadership Academy is me putting that concept into action. Plus the city itself is a wonderful place to explore and learn from.

Finally, I return every year on a usually cold and windy January weekend for the community. EduCon is a unique event that attracts a relatively small, dynamic, diverse group of educators deeply interested in improving both their practice and American education in general. It’s refreshing to reconnect with that community for a few days of face-to-face conversations.

All of which means I already have the 2019 dates (January 25-27) locked on my calendar. Maybe you want to plan to join me?

Picture is of one packed EduCon session being streamed to the world.

There’s Nothing “Super” About It

Cathy Davidson has a new book in which she proposes “how to revolutionize the university to prepare students for a world in flux”. Nothing like setting high goals for yourself, is there?

In an excerpt from that book, she tackles the slightly less daunting issue of whether technology in the classroom benefits or hurts students.

She makes many great points in a very short space, but anyone who makes decisions about using technology in a K-12 classroom should be required to demonstrate an understanding of this paragraph.

Here’s the connection between educational technophobia and technophilia: Both presume that technology in and of itself has superpowers that can either tank or replace human learning. Technology can automate many things. What it cannot automate is how humans learn something new and challenging. Neither can it, in its own right, rob us of our ability to learn when we want and need to learn. It can distract us, entice us away from the business of learning–but so can just about anything else when we’re bored.

Exactly. Technology is not a superhero. Or a super-villain. Whether the outcomes are good, bad, or something in between depends on how you use it.

Instead of either banning devices or automating information retrieval–whether from a screen or a lecturer droning on from the podium–the best pedagogical research we have reinforces the idea that learning in the classroom is most effective when it proceeds pretty much the way it does when we try to master something new outside of school: learning incrementally, being challenged, trying again. I even studied for my driver’s test that way–and certainly that’s what I do if I’m aspiring to something really difficult.

Incremental and challenging certainly doesn’t describe the test-driven process that passes for “learning” in most American schools. With or without technology.

EdTech Déjà Vu

If last week had been a “normal” end of June, I would have spent five or six days attending the ISTE Conference, held this year in San Antonio. Instead, I had to skip the event and join the #notatiste crowd.

But I wonder just how much I missed by not being there in person.

I certainly regret not having the rare opportunity to see friends and colleagues face-to-face. Beyond that, it wasn’t hard to keep up with the big ideas being tossed around in the convention center, thanks to the active stream of tweets, posts, podcasts, and video.

Anyway, one concept that seemed to be all over the place was “personalization”. Presenters discussed how to personalize instruction. Vendors offered thousands of products to help the process. Visionaries talked of how artificial intelligence (AI) would personalize education.

But to me all of that seemed very, very familiar. Haven’t we tried this before?

Twenty years ago, when I was transitioning out of the classroom and into edtech training, I worked with several elementary schools whose principals had bought into a system called SuccessMaker. It was an expensive “programmed learning” system contained on dozens of CDs that was supposed to improve student learning in reading and math.

The software presented students with activities wrapped in animated characters and, based on the child’s responses, moved them through the lessons. The developer recommended that students spend 20 minutes a day on the system. I clearly remember a trainer from the company promising teachers they would see tremendous improvement in test scores very quickly. And that students would be highly motivated to learn because they liked using technology.
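As I remember it, the “adaptivity” amounted to simple branching rules. Here’s a hypothetical sketch of that 1990s-style logic (not SuccessMaker’s actual code, just the general shape of the idea):

```python
# Hypothetical sketch of "programmed learning" branching: advance on
# mastery, step back on struggle, otherwise repeat the lesson.
LESSONS = ["count to 20", "add 1-digit", "add 2-digit", "subtract 1-digit"]

def next_lesson(current: int, correct: int, attempts: int) -> int:
    accuracy = correct / attempts if attempts else 0.0
    if accuracy >= 0.8:
        return min(current + 1, len(LESSONS) - 1)   # advance
    if accuracy < 0.5 and current > 0:
        return current - 1                          # step back
    return current                                  # try again

# A student who aces lesson 0 moves on; one who struggles on lesson 1 steps back.
print(LESSONS[next_lesson(0, correct=9, attempts=10)])  # add 1-digit
print(LESSONS[next_lesson(1, correct=3, attempts=10)])  # count to 20
```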

So, how is that system different from those that were being promoted at ISTE? I’m not sure much has really changed in those twenty years.

Those new “personalized” learning systems on the ISTE vendor floor likely use much more sophisticated algorithms to tailor lessons for students. They certainly collect far more data, sending it to the cloud to be processed along with information on tens of thousands of other students, as opposed to relying on basic information provided by the teacher and storing individual student records on a single, non-networked machine.
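No vendor will show you their model, but the step up in “sophistication” is roughly the move from hand-written branching rules to statistics over pooled data: recommend whatever worked for the most similar students. Again, a hypothetical sketch of mine, not any vendor’s actual algorithm:

```python
# Hypothetical sketch of the "modern" version: pick the next activity
# that helped the student most similar to you, using pooled cloud data.
import numpy as np

# Rows: students the cloud already knows; columns: score on each activity.
history = np.array([
    [0.9, 0.2, 0.8, 0.1],
    [0.8, 0.3, 0.9, 0.2],
    [0.1, 0.9, 0.2, 0.8],
])

def recommend(student_scores: np.ndarray, tried: set[int]) -> int:
    """Return the untried activity favored by the most similar student."""
    # Cosine similarity between this student and every known student.
    sims = history @ student_scores / (
        np.linalg.norm(history, axis=1) * np.linalg.norm(student_scores)
    )
    neighbor = history[np.argmax(sims)]                 # closest student
    untried = [i for i in range(len(neighbor)) if i not in tried]
    return max(untried, key=lambda i: neighbor[i])

print(recommend(np.array([0.85, 0.25, 0.0, 0.0]), tried={0, 1}))  # 2
```

Same promise, bigger dataset. And just as little of it is personal.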

Plus the marketing hasn’t changed much. Developers still promise miracle jumps in test scores. They still emphasize high student engagement because “technology”.

And, as with those systems from two decades ago, none of the learning is really personal.