Again, Tech is Not the Problem

A college professor writing in the New Yorker makes the case for banning laptops in the classroom.

Or at least he tries – and largely fails.

…the temptation for distraction was high. I know that I have a hard time staying on task when the option to check out at any momentary lull is available; I assumed that this must be true for my students, as well.

I wonder if he asked his students about the situation in addition to assuming their experience was just like his. And why is their temptation for distraction so high?

He goes on to cite a study which concluded that “disconnected students performed better on a post-lecture quiz”, and in the next paragraph acknowledges that pop quizzes, the assessment method used by the researchers, “are not the best measure of learning, which is an iterative and reflective process.”

Then, after discussing research that tried to incorporate more precision into the investigative process, he at least approaches a part of the problem that does not assign sole blame to the technology.

These examples can be seen as the progeny of an ill-conceived union of twenty-first-century tools (computers, tablets, smartphones) with nineteenth-century modalities (lectures).

But that recognition doesn’t last long.

Common to all of these contexts is the human-machine interaction. Our “digital assistants” are platforms for play and socializing; it makes sense, then, that we would approach those devices as game and chat machines, rather than as learning portals.

It really doesn’t make sense. You’re the teacher. If you want your students to approach their devices as learning portals, then structure your instructional practice to fit that idea. Don’t assume they graduated high school with that understanding.

Anyway, he ends the piece with this grudging conclusion.

We’re not all that far along in understanding how learning, teaching, and technology interact in the classroom. Institutions should certainly enable faculty to experiment with new technology, but should also approach all potential classroom intruders with a healthy dose of skepticism, and resist the impulse to always implement the new, trendy thing out of our fear of being left behind.

In other words, we need more research about how we can keep our “nineteenth-century modality” for delivering information to students, followed closely, of course, by our time-honored assessment system, and “resist the impulse” to allow “new, trendy” things like laptops and wifi to be used.

Again, did any of these professors bother to talk to their students about how they learn best? Did any of them consider that maybe their approach to teaching was the part of the problem that needed fixing?

This essay reflects the university-level experience through the lens of a small group of professors. However, we have many K12 teachers who express similar feelings (and fears) about “twenty-first century tools” intruding on their traditional instructional methods.

Blame The Technology, As Always

Speaking of doing things the way we always have,1 a report from a teachers union in Northern Ireland calls for “urgent action over the impact of modern technology on children’s ability to learn at school”.

Nothing new here. You’ve probably read many stories like this, ones where educators, parents, politicians, and others express concerns over changes they see in kids, changes brought about by (or, more accurately, blamed on) technology. Stories like these have been told for years, decades, even centuries.

And this, from an elementary teacher quoted in the BBC story, strikes me as the fundamental error in that call for “urgent action”.

There’s a complete lack of motivation among many of my pupils – these gadgets are really destroying their ability to learn.

So, the technology is at fault.

Ok, I have to ask: is it possible the lack of motivation in your students has less to do with the “gadgets” and more to do with what and how you’re teaching? Could it be you’re blaming the technology when you should be considering other factors?

It’s not just in the UK. Many education “experts” here in the US also assume that the rest of the world can fundamentally shift around us, with kids having access to powerful communication tools and networks (and, yes, complex games), but the curriculum and instructional practice of school can stay exactly the same.


  1. A possibly tenuous connection to the previous rant.

Technology and the Law

You really don’t need to watch carefully to recognize that technology is advancing at a rate that is far outpacing the ability of governments and the legal system to keep up.

However, while a few stories of Congress critters struggling to cope with social media and judges being stumped by digital recording systems might be funny in the moment, the legal stagnation when it comes to rapid digital changes has serious implications for the future of American society.

As The Guardian, the UK news organization that first published Edward Snowden’s revelations of how the NSA is monitoring everyone’s communications, accurately notes, “Technology law will soon be reshaped by people who don’t use email”.

There’s been much discussion – and derision – of the US supreme court’s recent forays into cellphones and the internet, but as more and more of these cases bubble up to the high chamber, including surveillance reform, we won’t be laughing for long: the future of technology and privacy law will undoubtedly be written over the next few years by nine individuals who haven’t “really ‘gotten to’ email” and find Facebook and Twitter “a challenge”.

And we certainly can’t count on Congress to address the issues.

This lack of basic understanding is alarming, because the supreme court is really the only branch of power poised to confront one of the great challenges of our time: catching up our laws to the pace of innovation, defending our privacy against the sprint of surveillance. The NSA is “training more cyberwarriors” as fast as it can, but our elected representatives move at a snail’s pace when it comes to the internet. The US Congress has proven itself unable to pass even the most uncontroversial proposals, let alone comprehensive NSA reforms: the legislative branch can’t even get its act together long enough to pass an update to our primary email privacy law, which was written in 1986 – before the World Wide Web had been invented.

As the most recent edition of the Decode DC podcast illustrates, our legislators can’t even manage the flood of email and other messages they receive.

But while our “leaders” are bogged down with re-fighting political battles of the past, legislatures in other countries are looking forward.

By contrast, consider Finland. There, lawmakers are experimenting with a bold new way of reforming a law: crowdsourcing — meaning turning the legislative process over to the people.

Or consider Brazil, where there is now an experimental computer lab smack in the middle of the Parliament’s committee rooms. There, official staff hackers throw together apps and games and data visualizations to help Brazilians — and the members of Parliament — understand the legislative process.

So, any hope our politicians can get their act together and bring our laws into the 21st century? Probably not, especially when a third of their adult constituents (and too many of the politicians themselves) don’t even accept basic scientific concepts and believe in ghosts, UFOs and astrology.

Formatting Your Digital Legacy

From the BBC:

A dozen previously unknown works created by Andy Warhol have been recovered from 30-year-old Amiga disks.1

I find this story interesting for a couple of reasons.

First, that Warhol, one of the most iconic and controversial artists of my early years of media awareness, would experiment with the then-new process of computer art. I know he was probably paid a boatload of money to help promote Commodore’s new machine and already had the kind of personality open to this sort of new thing, but still it’s fun to consider an acclaimed cultural figure working in the Amiga’s 4,096 colors.

However, beyond the artistic aspect of the discovery is this: “A painstaking three-year project was required to recover the images which were saved in an obscure data format.” [emphasis mine]

Many of us consider work in digital form to be more durable than that stored on old fashioned, very fragile paper. Able to be infinitely duplicated so that copies will remain somewhere for centuries. Out there, in that cloud thing. Or the Google.

But what happens when the first drafts of the ground-breaking, historically significant, Pulitzer-prize winning novel written by one of our students is stored in a format no longer supported? Scholars are still finding the early paper-based revisions by important writers of several hundred years ago, in a language not very different from that used today. Warhol’s Amiga-based masterpieces were made less than 30 years ago in a now obscure format.

Not too long ago someone brought me a 3-1/2” floppy disk2 holding some far less significant but personally important files written in ClarisWorks, asking if I could open them. Given enough time (far less than three years) and a compelling need, I could probably make it happen. But ClarisWorks is just one of many once-popular file formats abandoned during the relatively short lifetime of digital media.

Anyway, something to consider when you and your students decide where and how to store the important work you’re doing, the stuff that will be unearthed by archaeologists in the future and used to write the history of the early 21st century.

Now, I need to start looking for a way to preserve all of these crappy, but culturally significant, rants for posterity.


  1. For those too young to remember the Amiga, it was considered cutting edge personal computing in the ’80s.

  2. Look it up, kiddies!

Is The Next Big Thing Already in Your Classroom?

I recently attended a presentation by a teacher who explained how she is using Google Glass in her classroom. One of her ideas was to let a student wear the device as a way of getting a better sense of how her students see her.

It’s a great concept, but I wondered if we really need a $1500 device to do that. Most teachers already have the tools necessary to get a class-eye view of their work sitting in the pockets and backpacks of their kids. But there’s a larger question that needs to be addressed when discussing Glass as the next big thing in education.

Do we really need to look for the next big thing?

Instead, shouldn’t we try to make better use of the last big thing we bought, made a big deal of for a while, and then put in the closet when the next big thing was announced?

Think back a couple of years when interactive whiteboards (IWB) were all the educational rage. Our schools couldn’t install them fast enough. Classes were taught on how to make great use of them, educational theorists were convinced they would revolutionize instruction, and researchers produced conclusive data on just how motivational/engaging/effective they were.

Now, when I visit schools, I make a point of gathering a little data of my own on how those IWBs function in classrooms. Many, if not most, are used as little more than projection surfaces. The software, which is supposed to enhance the interactivity of the boards, is used as a slightly fancier version of the standard slide show software that was all the rage back in its day. Very few kids use them in any way.

In the meantime, we have lots of portable computers that are put into fixed labs and spend a frightening amount of time as a replacement for paper/pencil multiple choice tests.

There are stacks of clicker systems mostly in closets except when pulled out to use for a few minutes as a “fun” way to practice for standardized tests.

Alongside them in the storage room you’ll often find a bunch of wireless slates, which were supposed to add more interactivity to classrooms.

Every one of our classrooms has high-speed access to the world wide web, a resource with nearly unlimited potential to connect our students to each other and to the world, enabling them to publish work and ideas to a much wider audience. In most classrooms, however, the web is little more than a digital encyclopedia, the direct replacement for Word/Excel/PowerPoint (new tool, same assignments), and, of course, one more vehicle for delivering tests.

I suppose it’s possible that Google Glass is the next big thing for education. But while we’re waiting to determine that, not to mention waiting for the next, next big thing, we need to make better use of the previous editions of the next big thing we already have.

Selling EdTech

Larry Cuban, one of the best critics of the way we use technology for K12 instruction, has a great post about how companies market technology to schools, an $18 billion industry and growing, and why their products are usually out of touch with teacher and student needs.

The largest part of the problem is the big gap between the people who write the purchase orders, “school district IT professionals, district office administrators, and superintendents”, and the students and teachers who actually use the products.

That is where the money is. School officers are the ones who recommend to boards of education what to buy and how to deploy devices and software. From start-ups to established companies, high-tech representatives rarely involve teachers or students in their pitches to district officers or school boards. So the paradox is that the end-users (teachers and students) have little to do with purchasing decisions.

Cuban also notes that the companies rarely observe actual classrooms to see how their products might be used. Instead they depend on surveys, focus groups, and occasionally academic research (but only when it fits their approach).

So what can be done?

Cuban offers two great suggestions. First, companies need to talk to teachers and students and spend some money learning about what happens in real classrooms. And second, dial back the “over-the-top claims” promising to provide solutions appropriate to every school everywhere.

But that doesn’t address the other part of this situation, the people buying this stuff. The folks on our end making the purchasing decisions who don’t teach (and may never have taught), rarely if ever work with kids1, and, in the case of the IT department, are more concerned with password management and how the technology works on “their” network than whether it is instructionally sound or even useful.


  1. Other than their own children, whose experiences they often extrapolate to every student in the district.

Blame the Internet!

I don’t understand some of the writers employed by the Washington Post. Maybe they’ve been living inside the bubble of the infamous “beltway” too long. Or possibly they’re trying to write satire and the point never gets across.

Take, for example, a column from today’s paper that starts with the line “If I could, I would repeal the Internet.” The writer’s primary thesis, as best I can determine, seems to be that the “terrifying danger” posed by the threat of cyberwar far outweighs the “relatively modest” benefits of the web.

He then goes on to lay out a variety of doomsday possibilities (disruption of the power grid, decimation of the financial system, Chinese hackers, etc.) to be brought about by the Internet, evidently drawn from a report, a book, and conversations with cybersecurity experts (all of whom profit from worst case scenarios).

And then he ends the column with this conclusion.

All this qualifies our view of the Internet. Granted, it’s relentless. New uses spread rapidly. Already, 56 percent of U.S. adults own smartphones and 34 percent have tablets, says the Pew Internet & American Life Project. But the Internet’s social impact is shallow. Imagine life without it. Would the loss of e-mail, Facebook or Wikipedia inflict fundamental change? Now imagine life without some earlier breakthroughs: electricity, cars, antibiotics. Life would be radically different. The Internet’s virtues are overstated, its vices understated. It’s a mixed blessing — and the mix may be moving against us. [my emphasis]

Another shortsighted pundit placing total blame for a problem (or potential problem in this case) on the technology involved rather than on the other, more human factors of how it’s used. And ignoring the fact that digital networks are a relatively recent invention (especially the part where everyone can have access) and we are only at the beginning of their evolution and application.

A similarly reasoned, logical argument* could also have been made against those earlier, life-changing breakthroughs he lists, plus ships, trains, chemistry, the telephone, television and many more. Especially early in their lifetimes, when society was working its way through the disruptions they caused.

You can debate the benefits of having a ubiquitous, always on communications network available in every home and classroom (which it’s not, yet). Certainly we need to address many problems in the way the technology is used, with some people doing very silly and even stupid things with the power they have.

However, only someone who has not been paying attention over the past fifteen years or so could declare that the internet has not enabled many fundamental, positive changes for society.

I wonder if the editor responsible for this columnist thought he was kidding.

Update (later today): David Weinberger suggests we repeal the First Amendment and oxygen using the same reasoning as the Post writer, then provides a MadLibs version to do the same for anything you don’t like. You too can be a Washington columnist.


* That was an attempt at satire, in case it wasn’t obvious. :-)

Our Screwed Up Approach to Instructional Technology

Schools buying iPads is not really news. Unless it’s one of the largest districts in the country, Los Angeles Unified, spending a boatload of cash, $30 million, on them.

Now I love my iPad, and believe it has great potential as a personal learning device. However, this particular story has many, many elements that illustrate just how screwed up our approach to instructional technology really is.

The Board of Education voted 6 to 0 on Tuesday to approve the contract after hearing senior staff laud Apple’s product as both the best in quality and the least expensive option that met the district’s specifications.

How many teachers and students were involved in setting those specifications? The article doesn’t say but I’m going to go out on a limb and bet that few if any classroom teachers (and probably no kids) had any say in the matter.

More likely, the "specifications" came from the IT department and were based only on the opinions of technicians. Or possibly from a superintendent, based on anecdotes about their own children doing amazing stuff on iPads, extrapolated to every student in the county.

Again based only on speculation (and too many years of experience), I’m betting that $30 million includes no money for professional development, beyond maybe a short here’s-what-button-to-push orientation. Nothing to help teachers understand how to make the best use of these new tools in their instruction. And certainly no consideration of changing curriculum and classroom practice to fit the capabilities of these relatively new devices.

Then there’s the matter of how we pay for all those devices. In this case the money will come from a bond issue, not from continuing funds, which means the iPads purchased this summer will probably be long gone while taxpayers are still paying off the borrowed money.

Rather than building instructional technology into regular budgets, schools and districts seem to constantly fall into this kind of big burst, headline-making, "special occasion" spending. Why do they do it that way? Simple. Administrators, along with many teachers, parents, and other voting members of the community continue to view computers as a nice-to-have extra, something to play with after we finish all that regular school stuff.

But the problem is not just with the people who supported this vote. Those who spoke against the decision also reveal some pretty stupid approaches to making instructional technology decisions.

Hines [senior director of state government affairs for Microsoft] also noted that more businesses still use Microsoft platforms, and that students should be exposed to machines they will encounter in the workplace.

We don’t help kids at all by teaching them specific software, except for the few in specific vocational certification programs. Instead, how about helping kids understand how to use and be productive with any technology they might encounter? The flexibility to adapt to whatever new tools enter that workplace is a far more valuable skill than learning PowerPoint inside and out.

Finally, we arrive at the bottom line to all this.

Officials said they opposed a delay in part because new state and national tests will be taken on computers, and they don’t want Los Angeles students to lack the necessary experience with them.

As we’ve seen close up here in the overly-large school district (and the rest of Virginia), officials like administering standardized tests digitally because the results (aka "data") are available faster and are easier to manipulate. And learning how to generate good data is fast becoming more important than any other skill students might acquire during their time in K12.

Maybe even worth $30 million.

Blame the Technology. Or the Students.

From the New York Times

There is a widespread belief among teachers that students’ constant use of digital technology is hampering their attention spans and ability to persevere in the face of challenging tasks, according to two surveys of teachers being released on Thursday.

An English teacher quoted in the story complained, “I’m an entertainer. I have to do a song and dance to capture their attention,” and later asked, “What’s going to happen when they don’t have constant entertainment?”

However, is technology the problem? Or what it’s “doing to” kids?

Although I can sympathize to some degree, the English teacher’s statement and the opinions of a majority in the survey are a little disturbing. The whole foundation on which these studies are based* assumes that whatever is being done in the classroom is right and the kids are “wrong” in some way, due, of course, to their “constant use of digital technology”.

I wonder if anyone – researchers or subjects – seriously questioned whether what the students were asked to learn, the assignments they were given, and the instructional methods used might, just might, be a major factor in their “shorter attention spans”.

Is technology to blame?

Or is a large part of the problem that our education system is largely unwilling to take a reflective look at itself, to reevaluate what today’s students need to know and how to best help them learn it?


*Admittedly I haven’t read either report so it’s possible I’m completely wrong. Wouldn’t be the first time.

What Does “Tech Savvy Student” Mean?

Ok, how many of you “older” folks out there are tired of the whole “digital natives” vs. “digital immigrants” concept? As someone who didn’t grow up using computers but who is now very comfortable with networks, social media, mobile devices and the rest, I know I am.

It’s hard, however, to convince some of my colleagues that kids are not born with some magic innate technological talent. I try to tell them that, more than anything, their students simply have more time to spend playing with various electronic devices and absorbing all the little tricks that seem like genius to those who don’t. Invoke the 10,000 hour rule.

However, knowing how to trick out a smartphone or navigate the complexities of Facebook doesn’t mean those students have any clue how to use all that power to advance their learning. Or even know the basics of common programs you use, like PowerPoint.

A little research to back up this idea comes from a new study conducted by the Economic & Social Research Council (ESRC). The subjects were college students in the UK so maybe the findings don’t directly apply to the kids in US classrooms, but I doubt some of what they found is far off from those in our overly-large school district and other similar parts of the country.

  • 97.8 percent of students owned a mobile phone.
  • Just over three quarters (77.4 percent) owned a laptop, and 38.1 percent owned a desktop computer.
  • 70.1 percent felt their access to computers was sufficient to meet their computing needs.
  • The mobile phone was chosen by 83.2 percent as the device students would miss most if it were taken away.
  • A small minority of students didn’t use email or have access to mobile phones.

Students 20 years old or younger reported being more engaged in instant messaging, texting, social networks and downloading video media than students who were aged 25 years or more. Only 4.3 percent of those 20 or younger never used social networking sites, and for those 35 or older this rose to 78.5 percent.

In other words, they’re coming to your classroom understanding IM, texting, social networks, and video downloads, and carrying some powerful tools for reading, writing, and collaboration.

So, what are we doing to leverage those communications skills and the devices in their pockets to improve their learning?

Sorry, I forgot it’s May. Testing season. No time to worry about all that learning stuff.

The Legacy of Steve Jobs

This is exactly it.

One of the best things about Steve Jobs’ return to Apple — which will live on and benefit society long after Jobs’ death — is how he basically spent the last 14 years teaching thousands of Apple employees to have incredibly high standards and to build amazing products.

Perhaps more importantly, through Apple’s products, Steve also taught hundreds of millions of consumers to expect and demand amazing things.

For now, many of those Apple colleagues — especially the ones who worked most closely with Steve — still work there. But over time, more will leave to start their own companies or launch new projects. And some of those companies will make some really cool things completely outside the consumer electronics industry, reflecting both the work of their founders and also a little bit of Steve Jobs.

When do we apply Jobs’ kind of thinking, which has nothing remotely to do with scores on standardized tests, to public education? When do millions of consumers (aka students) begin demanding amazing things?

Soon, I hope.

And, by the way, that thermostat is very cool!

Not What It Says on the Door

According to the title of the office in which I work here in the overly-large school district, we do instructional technology integration.  Which is good, something I really want to do.

Except that much of what we actually do has nothing to do with either “instructional technology” or instruction.

The major project that involves the most people and time in our office is an “assessment resource tool”, basically a large database of questions that teachers use to create tests for their students.

But none of that is “instructional” technology.

We also have a parallel database that is being loaded with “curriculum and resources for planning and delivering instruction”, allowing teachers to search for and download activities and lessons created under the auspices of our curriculum specialists. It’s the kind of stuff that used to be packaged in large binders (or in more recent years on CD) and shipped to schools.

And that’s still not instructional technology.

We are also involved with the multi-year rollout of a new student information system, another big database which will include an online gradebook, attendance records, and other information teachers need.

Not instructional technology.

Online standardized tests? Data analysis? IWBs? Student response systems (aka clickers)?

No. Nope. Normally not. Hell no.

So what is my definition of instructional technology?

Simple. It’s anything that students use to develop and enhance their learning.  Or, in the words of our school board, students should “use technology to access, communicate, and apply knowledge and to foster creativity”.

Whatever the specific language, the key point here is that to be instructional, all this technology we’ve spent tens of millions on over the past decade – the hardware, software, networks – should be in the hands of, and to some degree controlled by, the kids to build their knowledge and experience.

Unfortunately, that’s not what’s happening around here, certainly not this time of year when every resource (including technology) for almost every student in almost every school is completely focused on getting the right bubbles in the right places.

And it’s not just the six to eight weeks around the SOLs (gotta practice, remember). All of that stuff above, especially that “assessment resource tool”, is sucking up all of the available computing devices in many schools during most of the year.

Which means students have fewer and fewer tools for all that creating, communicating, and collaborating we say we want the kids to be doing.

However, I’m not saying those resources that are the primary focus of our office aren’t important.* Teachers and schools certainly need good administrative tools to better manage the increasingly complicated learning process.

Just don’t call any of it “instructional” unless it’s in the hands of kids.


*I would argue that clickers and IWBs, however you classify them, are a waste of money… but that’s another post.

It’s Got Star Power!

Reading or watching a story in the popular media about how technology is being used in schools usually makes me cringe.  Take for example a recent article in the New York Times about schools embracing iPads.

The two digital pages include several extremely superficial examples of classroom use, along with over-the-top quotes like this one from a principal: “I think this could very well be the biggest thing to hit school technology since the overhead projector.”

And this observation, “It has brought individual technology into the classroom without changing the classroom atmosphere”, which is rather scary since a truly successful 1-1 program should change the classroom in some very significant ways.

Plus the incredible instance of the school that “converted an empty classroom into a lab with 36 iPads — named the iMaginarium — that has become the centerpiece of the school” because, as the principal put it, “of all the devices out there, the iPad has the most star power with kids.”

Don’t get me wrong, I think the iPad and other touch tablets have a lot of potential as learning tools and, if we are ever going to break out of the teacher-centered, lecture/demo, traditional classroom, students will need to have some kind of easy-to-use, always-connected, personal communication device.

However, until that potential is better realized, I wish reporters at the Times and elsewhere would pay closer attention to people like Larry Cuban who very correctly observes “There is very little evidence that kids learn more, faster or better by using these machines.”