Hey, Alexa. Explain Your Algorithms.


Lately I seem to be reading a lot about artificial intelligence. Between all the self-driving car projects and the many, many predictions about robots coming for our jobs (and our children), the topic is rather hard to avoid. It's also interesting, if somewhat scary, since we're talking about creating machines that attempt to replicate, and even improve upon, the human decision-making process.

One of the better assessments of why we need to be cautious about allowing artificially intelligent systems to take over from human judgement comes from MIT's Technology Review, whose Senior Editor for AI says "no one really knows how the most advanced algorithms do what they do".

If a person makes a decision, it's possible (theoretically) to learn how they arrived at that choice by simply asking them. Of course it's not as easy with children, but most adults are able to offer some kind of logical process explaining their actions, even if that process is flawed and leads them to the wrong conclusion.

It’s not so easy with machines.

There’s already an argument that being able to interrogate an AI system about how it reached its conclusions is a fundamental legal right. Starting in the summer of 2018, the European Union may require that companies be able to give users an explanation for decisions that automated systems reach. This might be impossible, even for systems that seem relatively simple on the surface, such as the apps and websites that use deep learning to serve ads or recommend songs. The computers that run those services have programmed themselves, and they have done it in ways we cannot understand. Even the engineers who build these apps cannot fully explain their behavior.

We’ve never before built machines that operate in ways their creators don’t understand. How well can we expect to communicate—and get along with—intelligent machines that could be unpredictable and inscrutable?

However, far below the level of television-ready robots and whatever Elon Musk is up to now, AI is a topic that educators should probably be watching.

More and more edtech companies are developing (and your school administrators are buying) "personalized" learning systems built around complex algorithms. These applications may fall short of being truly intelligent, but they still collect huge amounts of data from students and then make "decisions" about the course of their educational life.

It's unlikely the salespeople can offer any clear explanation of how the systems work. Even the engineers who wrote the software may not have a good understanding of the whole package, or know whether there are errors in the code that could produce incorrect results.

And it’s not like you can ask the computer to explain itself.
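To make that worry concrete, here is a deliberately oversimplified sketch of the kind of scoring logic such a system might use. Every field name, weight, and threshold below is invented for illustration; real vendors don't publish their logic, which is exactly the point. Even this toy version hands back a placement without handing back a reason.

```python
# A hypothetical sketch of how a "personalized" system might route students.
# All metrics, weights, and cutoffs are made up; real products keep theirs
# hidden (or learn them automatically from data).

def next_track(student):
    """Blend a student's metrics into one opaque score, then place them.

    The returned track carries no explanation: a parent asking
    "why remediation?" gets a number, not a reason.
    """
    score = (0.4 * student["quiz_avg"]
             + 0.3 * student["time_on_task"]
             + 0.3 * student["clicks_per_page"])
    if score >= 0.7:
        return "enrichment"
    if score >= 0.4:
        return "core"
    return "remediation"

# A strong quiz average can still land a student in the middle track if the
# other (less obviously relevant) metrics drag the blended score down.
track = next_track({"quiz_avg": 0.9, "time_on_task": 0.8, "clicks_per_page": 0.2})
```

Notice that even with the code in front of you, there's no obvious answer to why clicks per page should count as much as time on task. Now imagine the weights were set by a machine-learning process instead of a programmer.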


The image is the logo from Steven Spielberg's 2001 film A.I. Artificial Intelligence, a somewhat mediocre example of the genre, and a movie that would have been far different if Stanley Kubrick had lived to direct it.

Don’t Call This Personal Learning

Once upon a time, like around three years ago, one of the hottest concepts in education reform was a collection of "micro schools" called AltSchool. The brainchild of a former Google executive, AltSchool mixed concepts from Montessori and other progressive educators with data-driven technology to produce a startup that attracted hundreds of millions in venture capital.

Today, not so much. AltSchool is now “pivoting” from operating a few expensive boutique private schools to marketing software for “personalizing” learning in regular public schools.

So, what magic is AltSchool selling in their algorithms?

The software allows educators to build curriculum, tailor assignments, set goals, offer feedback, and track a host of metrics from learners. Students using the platform can create customized "playlists" of assignments and monitor their progress, with the hope that their learning will become self-directed.

In other words, this one-time darling of Silicon Valley is marketing a more advanced, maybe even more intelligent, version of the teaching machines we were promised in the 1950s. And the programmed learning software schools overpaid for in the 1990s.

Call this personalized learning if you like. Maybe individualized instruction. But this approach, where every part of the learning process is mapped out by adults, is in no way personal.

I’m repeating myself but it’s worth restating…

Personalized learning is something done TO students. Personal learning is something students do for themselves.

One can be automated, the other is uniquely individual.

Personalizing Students

The US Secretary of Education believes “personalized” learning is the future of schools.

At least she does based on the observational “snapshots” she’s collected in the past couple of years.

What I have observed and also read from others who are more deeply immersed in this, is that students that are in that setting are able to really pursue their learning and take charge and control of their own learning and to proceed at a rate that works for them.

I am optimistic that the places where this customized, personalized approach has been tested and shown to be successful for students, that there is going to be a broader embrace of it.

Those snapshots are more than enough evidence to create policy, right?

The vision of "personalized" learning on which DeVos is heaping praise comes largely from high-profile experiments funded by Silicon Valley billionaires, like the Summit program, largely backed by Mark Zuckerberg's foundation, and AltSchool, created by a former Google executive. All of these programs depend heavily on software to customize the educational program for each student, collecting a lot of data on each child along the way.

That educational program, however, is far from personal.

In most of these high tech schools, the curriculum and how it is presented is still determined by adults. Students may get to choose from a short menu of activities at each stage of the lessons, but they have little to no choice in the topics they will study. Their data is used to "improve" the algorithms, but their thoughts, ideas, and opinions are largely ignored.

Just like most “normal” schools.

But in the past few months, these techie education "experts" have been finding that personalizing the learning process is not as simple as they thought. AltSchool has closed many of its boutique schools, and some of its parents and educators are having second thoughts. Last year, sales of the Summit system to public districts were much slower than the company forecast.

Of course, “personalized” learning is the hot buzz term for hundreds of edtech companies at the moment and that’s not likely to change anytime soon. But, as responsible educators, we need to question the meaning of the word and how software vendors are applying it. Not to mention how their systems and algorithms are using data collected from students.

Because "personalized" (or "individualized") is not the same as personal learning. The kind where students work with teachers and their peers to explore their interests and skills, as well as gain the basic knowledge they will need as adults.

And where students are a fundamental part of planning their own learning process, not simply responding to software and curriculum designed by hired experts.

EdTech Déjà Vu

If last week had been a "normal" end of June, I would have spent five or six days attending the ISTE Conference, held this year in San Antonio. Instead, I had to skip the event and join the #notatiste crowd.

But I wonder just how much I missed by not being there in person.

I certainly regret not having the rare opportunity to see friends and colleagues face-to-face. Beyond that, it wasn’t hard to keep up with the big ideas being tossed around in the convention center, thanks to the active stream of tweets, posts, podcasts, and video.

Anyway, one concept that seemed to be all over the place was "personalization". Presenters discussed how to personalize instruction. Vendors offered thousands of products to help the process. Visionaries talked of how artificial intelligence (AI) would personalize education.

But to me all of that seemed very, very familiar. Haven’t we tried this before?

Twenty years ago, when I was transitioning out of the classroom and into edtech training, I worked with several elementary schools whose principals had bought into a system called SuccessMaker. It was an expensive “programmed learning” system contained on dozens of CDs that was supposed to improve student learning in reading and math.

The software presented the students with activities wrapped in animated characters and, based on the child's response, moved them through the lessons. The developer recommended that students spend 20 minutes a day on the system. I remember clearly a trainer from the company promising teachers they would see tremendous improvement in test scores very quickly. And that students would be highly motivated to learn simply because they liked using technology.
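The branching logic behind that kind of programmed learning can be sketched in a few lines. This is my guess at the general shape, not SuccessMaker's actual code; the rules below are invented for illustration:

```python
# A toy model of '90s "programmed learning" branching: answer correctly and
# the student advances a level, answer incorrectly and they drop back one.
# Purely illustrative -- the real product's rules were never published.

def advance(level, correct):
    """Return the student's next lesson level (never below level 0)."""
    return level + 1 if correct else max(0, level - 1)

# Replay one student's session: right, right, wrong, right.
level = 3
for correct in [True, True, False, True]:
    level = advance(level, correct)
# The student ends two levels above where they started.
```

Strip away the CDs and the animated characters, and the "personalization" amounts to moving a counter up and down. The question is whether today's cloud-based systems are doing anything fundamentally different.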

So, how is that system different from those that were being promoted at ISTE? I’m not sure much has really changed in those twenty years.

Those new "personalized" learning systems on the ISTE vendor floor likely use much more sophisticated algorithms to tailor lessons for students. They certainly collect far more data, sending it to the cloud to be processed along with information on tens of thousands of students, as opposed to relying on the basic information provided by the teacher and storing individual student records on a single, non-networked machine.

Plus the marketing hasn’t changed much. Developers still promise miracle jumps in test scores. They still emphasize high student engagement because “technology”.

And, as with those systems from two decades ago, none of the learning is really personal.

Spotlight Mismatch

Description of a Spotlight report on Personalized Learning from an EdWeek newsletter:

See how schools are using algorithm-driven playlists to customize lessons for students, consider red flags to look for when purchasing products, and learn how personalization can make learning more social.

I’m almost curious enough to give them my personal information, just so I can understand how “algorithm-driven playlists” and customized lessons can make learning “more social”. Seems like a big mismatch to me.

The next item in the newsletter describes their Spotlight report on Maker Education:

Learn how schools are embracing student-driven learning, ensuring equity in maker education, and providing students with opportunities to develop real-world skills.

Is it possible to have “algorithm-driven playlists” and “student-driven learning” in the same classroom? Or do these reports describe two completely different groups of students? And if that’s the case, how do we decide which students get “personalized” and who gets the “opportunities to develop real-world skills”?

Lots of questions. Not many good answers.