Inappropriate Optimism

As we approach the end of another calendar year, the inevitable (and lazy) flood of year-end recaps and forecasts for some undefined future is beginning to trickle in.

In that latter category, one writer is very optimistic about the “next wave” of educational technology, ending the column in which he tries to make that case with this:

At this point, America’s education system finally has all the key building blocks in place: The infrastructure is solid, almost every student has a device and wireless internet access, schools and educators (at all levels) are now much more comfortable working with technology and data, and thousands of entrepreneurs are working—not just with early adopters, but increasingly with early mainstream schools and educators—to bring edtech and personalized learning to the masses.

This is why I’m optimistic about the next decade of educational technology and innovation. I can’t wait to see how the next chapter unfolds!

Ok. Except that he has all kinds of bad assumptions jammed into just that one paragraph.

Start with the statement that the “infrastructure is solid” in schools. It’s true that the vast majority of US classrooms are connected to the internet. But the number with adequate bandwidth is much, much lower, especially in high-poverty rural and urban areas.

Even worse is his claim that “almost every student has a device”. I suppose if you average everything out, it might be close to 1:1. But even if you can claim a 1:1 ratio in your school or district, that doesn’t mean every student has the same quality of device1. Or can accomplish the same quality of work with the equipment and software available. That’s true even in the very rich, overly-large school district that used to employ me.

Finally, there’s the line about educators being “much more comfortable” using technology and data. I’m pretty sure most teachers are “comfortable” with the tools they use: the digital grade book, attendance systems, Word. Most are not at all comfortable with tools for meaningful learning, especially when it’s students using that technology in creative ways.

However, all of that really doesn’t matter. When it comes to being optimistic about educational technology, this particular column is not at all about student learning or even teacher productivity.

The writer is a “general partner” at a venture capital firm, one that specializes in “disruptive education” startups. His optimism is all about squeezing as much profit as possible from the education technology companies in which his firm has invested. Profit which will ultimately come from schools and districts at the expense of other priorities.

After all, there’s a bull market in all that “personalization” and data collection.


1. A Chromebook is NOT a computer. Don’t tell me otherwise because I’ve used both and Chromebooks do not compute. But that’s a rant for another day.

Don’t Blame the Lecture

A few days ago, the New York Times published the latest high-profile story advocating a ban on laptops in classrooms, mostly at the college level. These stories all point to a “growing body of evidence” claiming to show that students learn less and get poorer grades when they use devices during lectures.

I was going to add my own thoughts on the matter, including more than a few questions about the methodology and assumptions behind these studies. But marketing guru Seth Godin, who occasionally chimes in on education issues (and often makes a lot of sense), has already written a high-profile response that has popped into my Twitter feed many times in the past few days.

While it’s not a thorough challenge to this simplistic nonsense, Godin is at least exactly right that the professor who authored the Times op-ed has missed the real issue.

The times they are a-changin’

Why give lectures at all?

Why offer a handmade, real-time oration for a small audience of students—students who are expected to slow down their clock speed, listen attentively and take good notes at the very same rate as all the other students?

I know why we used to do it. We used to do it because a lecture is a thoughtful exposition, a reasoned, researched argument that delivers a lot of information in a fairly condensed period of time. And before technology, the best way to deliver that exposition was to do it live.

But now?

Godin’s recommendation to replace the live lecture – basically adopting the “flipped” classroom approach and having students watch a recording of the presentation outside of class – is a crappy alternative.

But he does ask the right question: Why lectures? Why do we continue with the “watch presentation, take notes, answer test questions” approach to learning? Especially since it is becoming clear that this is not an effective learning process.

Which leads to the other half of this question: if we’re not going to lecture at students, what do we do with all that “precious classroom time”?

And it is precious. It’s a curated group of thirty or a hundred students, coordinated in real-time and in real-space, inhabiting a very expensive room, simultaneously.

The K-12 experience is thirteen years built on compliance and obedience, a systemic effort to train kids to become cogs in the industrial machine. And it has worked. One component of this regime is the top-down nature of the classroom. We don’t want to train kids to ask difficult questions, so we lecture at them.

Although teachers in K-12 don’t deliver as many lectures as college instructors1, most classrooms are still structured around direct instruction, with material organized by the adults and presented to students, who are then expected to extract the required information and recall it on some kind of test at some later time.

In the end, however, my biggest objection to all these “laptops are making kids stupid” stories is that the researchers – and the writers reporting on their work – always start by blindly blaming the technology and the students.

And assuming our current educational structure is above reproach and needs no alteration.


1 However, the lecture format is still a fundamental part of many high school courses, especially Advanced Placement, which is essentially a college course adapted for slightly younger students.

Picture from Flickr, used under a Creative Commons license, by brett jordan.

By All Means, Question The Screens

As promised, Jay Mathews is back with a follow-up to his promotion of a new anti-edtech book, written by two high school social studies teachers here in Fairfax County. If that’s possible, this week’s installment features even more cliches and overly broad generalizations.

Mathews begins by citing the long-discredited myth of the “digital native”, and follows that by completely misrepresenting (and likely misunderstanding) the work of danah boyd. All in one paragraph.

The rest of the column is a messy collection of anecdotes and unsupported claims from the book.

Citing much research, they concluded, “the new digital world is a toxic environment for the developing minds of young people. Rather than making digital natives superlearners, it has stunted their mental growth.”

I would love to see the academic studies they found using phrases like “toxic environment” and “superlearners”.

But what about the broad range of research that arrives at very different conclusions? While the negative side too often gets the headlines, the research on how technology impacts learning is hardly conclusive. And this blanket statement alone makes me think the authors are not going for any kind of balance in this book.

Then there’s this reasoning:

Social studies teachers, they reported, are being encouraged to move “to DBQs, or document-based questions, which are simply research papers where the teacher has done all the research for the students.” Clement and Miles stick with real research papers, after students learn about different types of evidence and plan investigative strategies. Yet their students often become frustrated when devices don’t lead them to a useful source right away.

Let’s completely ignore the issue of whether writing “real” research papers is even a valid assignment anymore.

Back before the evil internet, very few teachers just dumped their kids in the library and ignored their frustration with finding appropriate material. The process of searching for, validating, and using evidence was a key part of the learning. It still is. If anything, these skills are even more important for students now. And banning the use of “screens” for research borders on educational malpractice.

The only idea from Mathews and the authors in this mess that makes any sense is that parents (and students and teachers) should ask questions about the use of technology in schools. But “May I opt my child out of screen-based instructional activities?” is not one of them.

Instead, go deeper and challenge educators to respond to queries like “How can the use of technology change and improve the learning experience for my child?” or “How will your instructional practice change to help my child make best use of the technology available?”.

Bottom line, I certainly support questioning the use of “screens” in the classroom. However, recommending, as the authors (and probably Mathews) do, that “teachers reject most of ed-tech” is completely unrealistic and extremely short-sighted.

It’s a matter of how, not whether.

Blame the Screens, Not Us

In his regular weekly column for the Washington Post1, Jay Mathews wants us to know about two local teachers who have written a book containing “discoveries that threaten the foundations of the high-tech classroom”.

Wow! But a statement like that is what you might expect from something with the title “Screen Schooled: Two Veteran Teachers Expose How Technology Overuse Is Making Our Kids Dumber.”

I haven’t read more than the excerpts provided by Mathews and the Amazon sampler for the book, but I have a few observations anyway.

Let’s start with the authors’ “three core principles for good teaching”:

(1) deliver instruction in the simplest possible manner; (2) focus instruction on what students are able to do; and (3) foster face-to-face human interaction and opportunities for community building.

That opening phrase, “deliver instruction”, is certainly at the core of the common view of classroom pedagogy. Someone designated as “teacher” delivers a package of carefully curated information to a group known as “students”. Unstated but assumed, of course, is that students will at some point display how much of that information they have retained.

In the third principle, the idea of “community building” is wonderful. That should be one of the primary goals of classrooms and schools. However, real communities are built by their members, not framed by someone else. Leaders emerge from within the community; they are not assigned to that role.

Then there are the opening lines to the first chapter of the book itself.

Something is not right with today’s kids. You know it, and I know it.

That is followed by the far too common indictment of “screen time” and the “misuse” of social media and technology in general, complete with the fictitious example of “typical” teenager Brett as he gets up and goes to school. Just that part of the book includes an incredible number of cliches disparaging both students and their teachers. I’m completely torn as to whether I want to read more.

However, in the course of the article, the authors and Mathews do land on a few truths.

They are certainly correct that “these tools in and of themselves do not make for better teaching”. And I do agree with this observation:

Students need no help from schools developing their tablet, smartphone, or Twitter skills. They are doing this on their own.

But not the conclusion that follows.

What they need help with is critical thinking, problem solving, and community building.

Most kids do very well with developing those skills. Just not for the material you are trying to get them to understand.

So, did you consider that maybe the problem isn’t with your students and their use of technology but instead with this structure we’ve designed for them called “school”?

Is it possible the curriculum we expect them to learn is a major part of the problem? Large parts of that material are irrelevant and do little to foster the problem-solving and community-building skills mentioned several times. Not to mention the way it is “delivered”.

Plus, the kids are very well aware of why the teachers want them to absorb the information in the first place: it’s on the SOLs (Virginia’s standards of learning), it will probably show up on a test sometime in the future, and they must pass the test to “succeed” (and keep the school’s and district’s numbers high).

In the end, I do not disagree with these teachers that there is something wrong with how we use technology in school.

The problem, however, is that, for the most part, we are trying to replicate the standard school experience through screens. We want to maintain the same curriculum, pedagogy, and academic framework with some computing devices slapped on the side.

Instead of taking full advantage of the available power from devices and networks to reimagine the entire learning experience.

By the way, Mathews closes the column with this:

Next week, I will get into what they say can be done to turn back the acidic distractions of the tech revolution in our schools, and save just the stuff that works.

You have been warned.


1. The title in the paper, “Teachers demonstrate the power of fewer screens and more human interaction”, is completely wrong; the online title, “Hitting the return key on education”, makes no sense.

There’s Nothing “Super” About It

Cathy Davidson has a new book in which she proposes “how to revolutionize the university to prepare students for a world in flux”. Nothing like setting high goals for yourself, is there?

In an excerpt from that book, she tackles the slightly less daunting issue of whether technology in the classroom benefits or hurts students.

She makes many great points in a very short space, and anyone who makes decisions about using technology in a K-12 classroom should be required to demonstrate an understanding of this paragraph.

Here’s the connection between educational technophobia or technophilia: Both presume that technology in and of itself has superpowers that can either tank or replace human learning. Technology can automate many things. What it cannot automate is how humans learn something new and challenging. Neither can it, in its own right, rob us of our ability to learn when we want and need to learn. It can distract us, entice us away from the business of learning–but so can just about anything else when we’re bored.

Exactly. Technology is not a superhero. Or a super-villain. Good outcomes or bad (or something in between) depend on how you use it.

Instead of either banning devices or automating information retrieval–whether from a screen or a lecturer droning on from the podium–the best pedagogical research we have reinforces the idea that learning in the classroom is most effective when it proceeds pretty much the way it does when we try to master something new outside of school: learning incrementally, being challenged, trying again. I even studied for my driver’s test that way–and certainly that’s what I do if I’m aspiring to something really difficult.

Incremental and challenging certainly doesn’t describe the test-driven process that passes for “learning” in most American schools. With or without technology.