
Hey, Alexa. Explain Your Algorithms.

[Image: cover logo from the film AI: Artificial Intelligence]

Lately I seem to be reading a lot about artificial intelligence. Between all the self-driving car projects and the many, many predictions about robots coming for our jobs (and our children), the topic is rather hard to avoid. It’s interesting but also somewhat scary, since we’re talking about creating machines that attempt to replicate, and even improve upon, the human decision-making process.

One of the better assessments of why we need to be cautious about allowing artificially-intelligent systems to take over from human judgement comes from MIT’s Technology Review, whose Senior Editor for AI says “no one really knows how the most advanced algorithms do what they do”.

If a person makes a decision, it’s possible (theoretically) to learn how they arrived at that choice by simply asking them. Of course it’s not as easy with children, but most adults can offer some kind of logical process explaining their actions, even if that process is flawed and they arrive at the wrong conclusion.

It’s not so easy with machines.

There’s already an argument that being able to interrogate an AI system about how it reached its conclusions is a fundamental legal right. Starting in the summer of 2018, the European Union may require that companies be able to give users an explanation for decisions that automated systems reach. This might be impossible, even for systems that seem relatively simple on the surface, such as the apps and websites that use deep learning to serve ads or recommend songs. The computers that run those services have programmed themselves, and they have done it in ways we cannot understand. Even the engineers who build these apps cannot fully explain their behavior.

We’ve never before built machines that operate in ways their creators don’t understand. How well can we expect to communicate—and get along with—intelligent machines that could be unpredictable and inscrutable?
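To make that concrete, here’s a minimal sketch, in Python with made-up data, of the kind of self-trained model being described. This is my own toy illustration, not any vendor’s actual system. Even at this tiny scale, the model’s entire “reasoning” ends up as a list of numeric weights with no explanation attached.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: 200 listeners, 5 arbitrary features, and whether each "liked" a song
    X = rng.normal(size=(200, 5))
    y = (X @ np.array([1.5, -2.0, 0.3, 0.0, 0.8]) + rng.normal(size=200) > 0).astype(float)

    # Train a tiny logistic-regression "recommender" by gradient descent
    w = np.zeros(5)
    for _ in range(1000):
        p = 1 / (1 + np.exp(-(X @ w)))        # predicted probability of a "like"
        w -= 0.1 * (X.T @ (p - y)) / len(y)   # nudge the weights to reduce error

    print(w)  # the model's entire "reasoning": five opaque numbers

A production deep-learning recommender works on the same principle but with millions of learned numbers, which is why even its builders can’t point to the reason behind any particular recommendation.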

However, far below the level of television-ready robots and whatever Elon Musk is up to now, AI is a topic that educators should probably be watching.

More and more edtech companies are developing (and your school administrators are buying) “personalized” learning systems built around complex algorithms. These applications may fall short of being intelligent, but they still collect huge amounts of data from students and then make “decisions” about the course of their educational lives.
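To ground what one of those “decisions” looks like, here is a deliberately crude sketch of the simplest possible adaptive rule. It’s my own toy example, not any vendor’s product: pick the next unit from a running score per skill.

    # A crude "personalized learning" rule: re-teach the weakest skill
    # below a mastery threshold, otherwise move on to new material
    scores = {"fractions": 0.45, "decimals": 0.80, "percentages": 0.62}

    def next_unit(scores, mastery=0.7):
        weak = {skill: s for skill, s in scores.items() if s < mastery}
        return min(weak, key=weak.get) if weak else "new material"

    print(next_unit(scores))  # -> fractions

Commercial systems bury rules like this under statistical models trained on mountains of student data, which is exactly where the explanations start to break down.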

It’s unlikely the salespeople can offer any clear explanation of how the systems work. Even the engineers who wrote the software may not have a good understanding of the whole package, or know whether there are errors in the code that could produce incorrect results.

And it’s not like you can ask the computer to explain itself.


The image is the logo from Steven Spielberg’s 2001 film AI: Artificial Intelligence, a somewhat mediocre example of the genre, and one that would have been far different had Stanley Kubrick lived to direct it.

Personalizing Students

[Image: student working with a teaching machine]

The US Secretary of Education believes “personalized” learning is the future of schools.

At least she does based on the observational “snapshots” she’s collected in the past couple of years.

What I have observed and also read from others who are more deeply immersed in this, is that students that are in that setting are able to really pursue their learning and take charge and control of their own learning and to proceed at a rate that works for them.

I am optimistic that the places where this customized, personalized approach has been tested and shown to be successful for students, that there is going to be a broader embrace of it.

Those snapshots are more than enough evidence to create policy, right?

The vision of “personalized” learning on which DeVos is heaping praise comes largely from high-profile experiments funded by Silicon Valley billionaires, like the Summit program, largely backed by Mark Zuckerberg’s foundation, and AltSchool, created by a former Google executive. All of these programs depend heavily on software to customize the curriculum for each student, collecting a lot of data on each child along the way.

That educational program, however, is far from personal.

In most of these high-tech schools, the curriculum and how it is presented are still determined by adults. Students may get to choose from a short menu of activities at each stage of the lessons, but they have little to no choice in the topics they will study. Their data is used to “improve” the algorithms, but their thoughts, ideas, and opinions are largely ignored.

Just like most “normal” schools.

But in the past few months, these techie education “experts” have been finding that personalizing the learning process is not as simple as they thought. AltSchool has closed many of its boutique schools, and some of its parents and educators are having second thoughts. Last year, sales of the Summit system to public districts were much slower than the company forecast.

Of course, “personalized” learning is the hot buzz term for hundreds of edtech companies at the moment and that’s not likely to change anytime soon. But, as responsible educators, we need to question the meaning of the word and how software vendors are applying it. Not to mention how their systems and algorithms are using data collected from students.

Because “personalized” (or “individualized”) is not the same as personal learning. The kind where students work with teachers and their peers to explore their interests and skills, as well as acquire the basic knowledge they will need as adults.

And where they are a fundamental part of planning their own learning process, not simply responding to software and curriculum designed by hired experts.

Beware the Algorithm

Cathy O’Neil is the author of a great book with the wonderfully provocative title Weapons of Math Destruction. But it’s not hyperbole. In it she clearly and compellingly explains how algorithms, fueled by big data, are being used every day to make decisions about important points in our lives. That includes whether or not we qualify for a loan, get a job, or go to prison.

Or create the “value add” rating a teacher receives, the one that mixes student test scores with other data points to determine if they keep their job.

The companies that sell those evaluation systems want us to believe that their mathematics is completely objective, a neutral judge of people. O’Neil, however, presents a variety of examples, starting with the teacher evaluation programs sold by Pearson and others, to explain why that’s just not true. She shows how the data used to build the algorithms usually comes with biases based on how and why it was collected, and how additional bias is built into the systems by the people who ultimately decide what the resulting numbers mean.
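To see how much human judgment hides inside a score like that, here’s a toy value-added-style calculation, my own illustration rather than Pearson’s actual model: predict each student’s score from last year’s, then credit the teacher with the average leftover.

    import statistics

    # Hypothetical students: (last year's score, this year's score)
    students = [(72, 78), (85, 84), (60, 70), (90, 88), (75, 80)]

    def predicted(last_year):
        # One of many possible "growth" assumptions: everyone gains 3 points
        return last_year + 3

    # The teacher's "value added": the average of actual minus predicted scores
    residuals = [actual - predicted(last) for last, actual in students]
    print(f"value-added rating: {statistics.mean(residuals):+.1f}")

Swap in a different predicted() function, or a different set of students, and the same teacher gets a different rating. Every one of those choices is made by a person, not by the math.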

While I highly recommend reading the book, you can get the TL;DR version of O’Neil’s warning in her TED talk from last spring, recently posted to the TED site and embedded below.

Teaching by Algorithm

A BBC video starts by asking “Could computer algorithms upgrade education?”. It just gets worse from there.

It’s a profile of AltSchool, a small chain of private schools based in San Francisco and funded by tech billionaires like Mark Zuckerberg. The video also asks if this is the school of the future… and I certainly hope not.

I love the part where the CEO is giving the camera a tour of the company offices and notes that “almost everyone down there on the floor is a programmer”, and then, over there in the back, you have the educators. Plus the marketing and design people.

It’s pretty clear from that tour and the whole profile that the philosophy behind AltSchool is very much driven by coding and data. They are using all this data (collected largely from rich, white kids, based on the school in the video) to train their algorithms, with the goal of automating the teaching process. That makes the video’s suggestion, that the diminishing influence of teachers could lead to a decline in good people entering the profession, seem even more likely.

Or am I being paranoid?

Certainly teaching in a school where everything is recorded and deposited into a computer is pretty creepy. But is “hyper-personalized” instruction, driven by massive amounts of data and delivered by screen, really the future of learning? Or is it just the future for kids whose districts have the money to buy into this kind of marketing?

Watch the video. The New Yorker and Wired Magazine offer more details in their stories about this concept.

3-2-1 For 10-2-16

Three readings worth your time this week.

Growing up I was a big fan of Isaac Asimov. Although he is primarily known for science fiction, Asimov also wrote books, short stories, and essays on almost any topic you can name. Like this piece from Technology Review, unpublished until 2014, in which he explores the sources of creativity. Although written in 1959, it’s still very relevant and a good example of Asimov’s thought process. (about 7 minutes)

This week I saw a lot of chatter around the “fact” that NASA had updated the signs of the zodiac and inserted a new astrological sign. It all sounded like just another of the many absurdities that swim around the web, and Phil Plait, who writes as the Bad Astronomer, explains just how stupid the whole deal is. Starting with the real fact that NASA had nothing to do with this, not to mention that astrology isn’t “worth wrapping a fish in”. (about 4 minutes)

Speaking of space, Elon Musk, the billionaire CEO of Tesla Motors, this week outlined his vision for not only traveling to Mars but establishing large working colonies on that planet. His current plans call for launching the first mission to Mars in 2024, less than eight years from now. It’s all very ambitious, and very much lacking in details. But Musk’s plans are certainly worth watching. (about 18 minutes)

Two audio tracks for your commute.

Did you ever wonder how Facebook determines which news stories and ads will be placed in your timeline? On a segment of the Note To Self podcast, the host and a reporter for ProPublica discuss those invisible algorithms and the impact they might have on your perception of the world. They also introduce a ProPublica project that asks users to contribute their data in an attempt to learn more about the Facebook “black box”. (18:00)

At last Monday’s presidential debate, both candidates tossed around a variety of economic terms, most of which you may have heard before. But what do they mean? The Planet Money team does an entertaining job of explaining the Terms of the Debate. (20:25)

One video to watch when you have a few minutes.

Have you ever seen an assembly line for airplanes? Boeing builds 42 of its workhorse 737 model every month at its massive facility in Renton, Washington. This short video is an interesting look at how the assembly of each aircraft is completed in just nine days. (2:28)
