Is The Next Big Thing Already in Your Classroom?

I recently attended a presentation by a teacher who explained how she is using Google Glass in her classroom, and one of her ideas was to let a student wear the device so she could see her teaching from that student’s perspective.

It’s a great concept, but I wondered if we really need a $1,500 device to do that. Most teachers already have the tools necessary to get a class-eye view of their work sitting in the pockets and backpacks of their kids. But there’s a larger question that needs to be addressed when discussing Glass being the next big thing in education.

Do we really need to look for the next big thing?

Instead, shouldn’t we try to make better use of the last big thing we bought, made a big deal of for a while, and then put in the closet when the next big thing was announced?

Think back a couple of years to when interactive whiteboards (IWBs) were all the educational rage. Our schools couldn’t install them fast enough. Classes were taught on how to make great use of them, educational theorists were convinced they would revolutionize instruction, and researchers produced conclusive data on just how motivational/engaging/effective they were.

Now, when I visit schools, I make a point to gather a little data of my own on how those IWBs function in classrooms. Many, if not most, are used as little more than projection surfaces. The software, which is supposed to enhance the interactivity of the boards, is used as a slightly fancier version of the standard slide show software that was all the rage back in its day. Very few kids use them in any way.

In the meantime, we have lots of portable computers that are put into fixed labs and spend a frightening amount of time as a replacement for paper/pencil multiple choice tests.

There are stacks of clicker systems sitting mostly in closets, except when they’re pulled out for a few minutes as a “fun” way to practice for standardized tests.

Alongside them in the storage room you’ll often find a bunch of wireless slates, which were supposed to add more interactivity to classrooms.

Every one of our classrooms has high-speed access to the web, a resource with nearly unlimited potential to connect our students to each other and to the world, enabling them to publish work and ideas to a much wider audience. In most classrooms, of course, the web is little more than a digital encyclopedia, the direct replacement for Word/Excel/PowerPoint (new tool, same assignments), and, of course, one more vehicle for delivering tests.

I suppose it’s possible that Google Glass is the next big thing for education. But while we’re waiting to determine that, not to mention waiting for the next, next big thing, we need to make better use of the previous editions of the next big thing we already have.

Selling EdTech

Larry Cuban, one of the best critics of the way we use technology for K12 instruction, has a great post about how companies market technology to schools, an $18 billion industry and growing, and why their products are usually out of touch with teacher and student needs.

The largest part of the problem is the big gap between the people who write the purchase orders, “school district IT professionals, district office administrators, and superintendents”, and the students and teachers who actually use the products.

That is where the money is. School officers are the ones who recommend to boards of education what to buy and how to deploy devices and software. From start-ups to established companies, high-tech representatives rarely involve teachers or students in their pitches to district officers or school boards. So the paradox is that the end-users (teachers and students) have little to do with purchasing decisions.

Cuban also notes that the companies rarely observe actual classrooms to see how their products might be used. Instead they depend on surveys, focus groups, and occasionally academic research (but only when it fits their approach).

So what can be done?

Cuban offers two great suggestions. First, companies need to talk to teachers and students and spend some money on learning about what happens in real classrooms. And second, dial back the “over-the-top claims” promising to provide solutions appropriate to every school everywhere.

But that doesn’t address the other part of this situation: the folks on our end making the purchasing decisions, who don’t teach (and may never have taught), rarely if ever work with kids1, and, in the case of the IT department, are more concerned with password management and how the technology works on “their” network than with whether it is instructionally sound or even useful.

  1. Other than their own children, whose experiences they often extrapolate to every student in the district.

Blame the Internet!

I don’t understand some of the writers employed by the Washington Post. Maybe they’ve been living inside the bubble of the infamous “beltway” too long. Or possibly they’re trying to write satire and the point never gets across.

Take, for example, a column from today’s paper that starts with the line “If I could, I would repeal the Internet.” The writer’s primary thesis, as best I can determine, seems to be that the “terrifying danger” posed by the threat of cyberwar far outweighs the “relatively modest” benefits of the web.

He then goes on to lay out a variety of doomsday possibilities (disruption of the power grid, decimation of the financial system, Chinese hackers, etc.) to be brought about by the Internet, evidently drawn from a report, a book, and conversations with cybersecurity experts (all of whom profit from worst case scenarios).

And then he ends the column with this conclusion.

All this qualifies our view of the Internet. Granted, it’s relentless. New uses spread rapidly. Already, 56 percent of U.S. adults own smartphones and 34 percent have tablets, says the Pew Internet & American Life Project. But the Internet’s social impact is shallow. Imagine life without it. Would the loss of e-mail, Facebook or Wikipedia inflict fundamental change? Now imagine life without some earlier breakthroughs: electricity, cars, antibiotics. Life would be radically different. The Internet’s virtues are overstated, its vices understated. It’s a mixed blessing — and the mix may be moving against us. [my emphasis]

Another shortsighted pundit placing total blame for a problem (or potential problem in this case) on the technology involved rather than on the other, more human factors of how it’s used. And ignoring the fact that digital networks are a relatively recent invention (especially the part where everyone can have access) and we are only at the beginning of their evolution and application.

A similarly reasoned, logical argument* could also have been made about those earlier, life-changing breakthroughs he lists, plus ships, trains, chemistry, the telephone, television and many more. Especially early in their lifetimes, when society was working its way through the disruptions they caused.

You can debate the benefits of having a ubiquitous, always-on communications network available in every home and classroom (which it’s not, yet). Certainly we need to address many problems in the way the technology is used, with some people doing very silly and even stupid things with the power they have.

However, only someone who has not been paying attention over the past fifteen years or so could declare that the internet has not enabled many fundamental, positive changes for society.

I wonder if the editor responsible for this columnist thought he was kidding.

Update (later today): David Weinberger suggests we repeal the First Amendment and oxygen using the same reasoning as the Post writer, then provides a MadLibs version to do the same for anything you don’t like. You too can be a Washington columnist.

* That was an attempt at satire, in case it wasn’t obvious. :-)

Our Screwed Up Approach to Instructional Technology

Schools buying iPads is not really news. Unless it’s one of the largest districts in the country, Los Angeles Unified, spending a boatload of cash, $30 million, on them.

Now I love my iPad, and believe it has great potential as a personal learning device. However, this particular story has many, many elements that illustrate just how screwed up our approach to instructional technology really is.

The Board of Education voted 6 to 0 on Tuesday to approve the contract after hearing senior staff laud Apple’s product as both the best in quality and the least expensive option that met the district’s specifications.

How many teachers and students were involved in setting those specifications? The article doesn’t say but I’m going to go out on a limb and bet that few if any classroom teachers (and probably no kids) had any say in the matter.

More likely, the "specifications" came from the IT department and were based only on the opinions of technicians. Or possibly from a superintendent, based on anecdotes about their own children who can do amazing stuff on iPads, extrapolated to every student in the district.

Again based only on speculation (and too many years of experience), I’m betting that $30 million includes no money for professional development, beyond maybe a short here’s-what-button-to-push orientation. Nothing to help teachers understand how to make the best use of these new tools in their instruction. And certainly no consideration of changing curriculum and classroom practice to fit the capabilities of these relatively new devices.

Then there’s the matter of how we pay for all those devices. In this case the money will come from a bond issue, not from continuing funds, which means the iPads purchased this summer will probably be long gone while taxpayers are still paying off the borrowed money.

Rather than building instructional technology into regular budgets, schools and districts seem to constantly fall into this kind of big burst, headline-making, "special occasion" spending. Why do they do it that way? Simple. Administrators, along with many teachers, parents, and other voting members of the community continue to view computers as a nice-to-have extra, something to play with after we finish all that regular school stuff.

But the problem is not just with the people who supported this vote. Those who spoke against the decision also reveal some pretty stupid approaches to making instructional technology decisions.

Hines [senior director of state government affairs for Microsoft] also noted that more businesses still use Microsoft platforms, and that students should be exposed to machines they will encounter in the workplace.

We don’t help kids at all by teaching them specific software, except for the few in specific vocational certification programs. Instead, how about helping kids understand how to use and be productive with any technology they might encounter? The flexibility to adapt to whatever new tools enter that workplace is a far more valuable skill than learning PowerPoint inside and out.

Finally, we arrive at the bottom line to all this.

Officials said they opposed a delay in part because new state and national tests will be taken on computers, and they don’t want Los Angeles students to lack the necessary experience with them.

As we’ve seen close up here in the overly-large school district (and the rest of Virginia), officials like administering standardized tests digitally because the results (aka "data") are available faster and are easier to manipulate. And learning how to generate good data is fast becoming more important than any other skill students might acquire during their time in K12.

Maybe even worth $30 million.

Blame the Technology. Or the Students.

From the New York Times

There is a widespread belief among teachers that students’ constant use of digital technology is hampering their attention spans and ability to persevere in the face of challenging tasks, according to two surveys of teachers being released on Thursday.

An English teacher quoted in the story complained, “I’m an entertainer. I have to do a song and dance to capture their attention,” and later asked, “What’s going to happen when they don’t have constant entertainment?”

However, is technology the problem? Or what it’s “doing to” kids?

Although I can sympathize to some degree, the English teacher’s statement and the opinions of a majority in the survey are a little disturbing. The whole foundation on which these studies are based* assumes that whatever is being done in the classroom is right and the kids are “wrong” in some way, due, of course, to their “constant use of digital technology”.

I wonder if anyone – researchers or subjects – seriously questioned whether what the students were asked to learn, the assignments they were given, and the instructional methods used might, just might, be a major factor in their “shorter attention spans”.

Is technology to blame?

Or is a large part of the problem that our education system is largely unwilling to take a reflective look at itself, to reevaluate what today’s students need to know and how to best help them learn it?

*Admittedly I haven’t read either report so it’s possible I’m completely wrong. Wouldn’t be the first time.

What Does “Tech Savvy Student” Mean?

Ok, how many of you “older” folks out there are tired of the whole “digital natives” vs. “digital immigrants” concept? As someone who didn’t grow up using computers but who is now very comfortable with networks, social media, mobile devices and the rest, I know I am.

It’s hard, however, to convince some of my colleagues that kids are not born with some magic, innate technological talent. I try to tell them that, more than anything, their students simply have more time to spend playing with various electronic devices and absorbing all the little tricks that seem like genius to those who don’t. Invoke the 10,000-hour rule.

However, knowing how to trick out a smartphone or understanding the complexities of Facebook doesn’t mean those students have any clue how to use all that power to advance their learning. Or even know the basics of common programs like PowerPoint.

A little research to back up this idea comes from a new study conducted by the Economic & Social Research Council (ESRC). The subjects were college students in the UK, so maybe the findings don’t directly apply to kids in US classrooms, but I doubt they’re far off from what we’d find in our overly-large school district and other similar parts of the country.

  • 97.8 percent of students owned a mobile phone.
  • Just over three quarters — 77.4 percent — owned a laptop, and 38.1 percent owned a desktop computer.
  • 70.1 percent felt their access to computers was sufficient to meet their computing needs.
  • The mobile phone was chosen by 83.2 percent as the device students would miss most if it were taken away.
  • A small minority of students don’t use email or have access to mobile phones.

Students 20 years old or younger reported being more engaged in instant messaging, texting, social networks and downloading video media than students who were aged 25 years or more. Only 4.3 percent of those 20 or younger never used social networking sites, and for those 35 or older this rose to 78.5 percent.

In other words, they’re coming to your classroom understanding IM, texting, social networks, and video downloads, and carrying some powerful tools for reading, writing, and collaboration.

So, what are we doing to leverage those communications skills and the devices in their pockets to improve their learning?

Sorry, I forgot it’s May. Testing season. No time to worry about all that learning stuff.

The Legacy of Steve Jobs

This is exactly it.

One of the best things about Steve Jobs’ return to Apple — which will live on and benefit society long after Jobs’ death — is how he basically spent the last 14 years teaching thousands of Apple employees to have incredibly high standards and to build amazing products.

Perhaps more importantly, through Apple’s products, Steve also taught hundreds of millions of consumers to expect and demand amazing things.

For now, many of those Apple colleagues — especially the ones who worked most closely with Steve — still work there. But over time, more will leave to start their own companies or launch new projects. And some of those companies will make some really cool things completely outside the consumer electronics industry, reflecting both the work of their founders and also a little bit of Steve Jobs.

When do we apply Jobs’ kind of thinking, which has nothing remotely to do with scores on standardized tests, to public education? When do millions of consumers (aka students) begin demanding amazing things?

Soon, I hope.

And, by the way, that thermostat is very cool!

Not What It Says on the Door

According to the title of the office in which I work here in the overly-large school district, we do instructional technology integration.  Which is good, something I really want to do.

Except that much of what we actually do has nothing to do with “instructional technology”, or even with instruction.

The major project that involves the most people and time in our office is an “assessment resource tool”, basically a large database of questions that teachers use to create tests for their students.

But none of that is “instructional” technology.

We also have a parallel database that is being loaded with “curriculum and resources for planning and delivering instruction”, allowing teachers to search for and download activities and lessons created under the auspices of our curriculum specialists. It’s the kind of stuff that used to be packaged in large binders (or in more recent years on CD) and shipped to schools.

And that’s still not instructional technology.

We are also involved with the multi-year rollout of a new student information system, another big database which will include an online gradebook, attendance records, and other information teachers need.

Not instructional technology.

Online standardized tests? Data analysis? IWBs? Student response systems (aka clickers)?

No. Nope. Normally not. Hell no.

So what is my definition of instructional technology?

Simple. It’s anything that students use to develop and enhance their learning.  Or, in the words of our school board, students should “use technology to access, communicate, and apply knowledge and to foster creativity”.

Whatever the specific language, the key point here is that to be instructional, all this technology we’ve spent tens of millions on over the past decade – the hardware, software, networks – should be in the hands of, and to some degree controlled by, the kids to build their knowledge and experience.

Unfortunately, that’s not what’s happening around here, certainly not this time of year when every resource (including technology) for almost every student in almost every school is completely focused on getting the right bubbles in the right places.

And it’s not just the six to eight weeks around the SOLs (gotta practice, remember). All of that stuff above, especially that “assessment resource tool”, is sucking up all of the available computing devices in many schools during most of the year.

Which means students have fewer and fewer tools for all that creating, communicating, and collaborating we say we want the kids to be doing.

However, I’m not saying those resources that are the primary focus of our office aren’t important.* Teachers and schools certainly need good administrative tools to better manage the increasingly complicated learning process.

Just don’t call any of it “instructional” unless it’s in the hands of kids.

*I would argue that clickers and IWBs, however you classify them, are a waste of money… but that’s another post.

It’s Got Star Power!

Reading or watching a story in the popular media about how technology is being used in schools usually makes me cringe.  Take for example a recent article in the New York Times about schools embracing iPads.

The two digital pages include several extremely superficial examples of classroom use, along with over-the-top quotes from a principal like “I think this could very well be the biggest thing to hit school technology since the overhead projector.”

And this observation, “It has brought individual technology into the classroom without changing the classroom atmosphere”, which is rather scary since a truly successful 1-1 program should change the classroom in some very significant ways.

Plus the incredible instance of the school that “converted an empty classroom into a lab with 36 iPads — named the iMaginarium” — which has become the centerpiece of the school because, as the principal put it, “of all the devices out there, the iPad has the most star power with kids.”

Don’t get me wrong, I think the iPad and other touch tablets have a lot of potential as learning tools, and, if we are ever going to break out of the teacher-centered, lecture/demo, traditional classroom, students will need some kind of easy-to-use, always-connected, personal communication device.

However, until that potential is better realized, I wish reporters at the Times and elsewhere would pay closer attention to people like Larry Cuban who very correctly observes “There is very little evidence that kids learn more, faster or better by using these machines.”

One More Thing…

Following up on the previous post about leadership: in that same interview, Steve Jobs also discussed Apple’s approach to business.

And he made this observation about the difference between producing computing devices for consumers, in which Apple has been very successful over the past decade, and the business market.

What I love about the consumer market that I always hated about the enterprise market is that we come up with a product, we try to tell everybody about it, and every person decides for themselves.  They vote yes or no.  And if enough of them say yes, we get to come to work tomorrow.  That’s how it works.  It’s really simple.

As for the enterprise market, it’s not so simple.  The people who use the products don’t decide for themselves.  And the people who make those decisions sometimes are confused.

Confused indeed!

Here in the overly-large school district, we are regularly reminded that we are not really a school system.

We work for an “enterprise” (and we are all “clients”).

And that last part of Jobs’ remarks may offer a clue as to why the use of instructional technology is not what it should be here in the overly-large enterprise.

Picture from Wikipedia and I’m only guessing that it’s legal to link to it and not get sued by Paramount. :-)

Why Are We Buying This Stuff?

In parts one and two of a multipart post, Larry Cuban looks at why school districts buy new technologies when there is little or no evidence they do anything to improve student learning, especially when most are having major budget problems.

From part one, he notes that consumer spending on electronics in the US is up despite the continuing recession.

At the same time, schools are purchasing more technology products while also laying off teachers, increasing class sizes, and cutting programs.

Economists can probably tell you why families are devoting scarce resources to new and better technology devices but why are schools doing the same thing?

The reasons public officials most often give for these purchases, past and present, is that the electronic devices will transform classroom practices, student learning, and prepare students for jobs in a competitive global economy. So, school boards need to back up these reasons with solid evidence for spending public dollars on new (and replacement) technologies that promise significant changes in teaching, learning, and administrative practice.

Where is that “solid evidence”?

The evidence for these electronic devices doing what is expected both in the U.S. and abroad is—as I read the research—at best, spotty—at worst, weak. Few careful and impartial observers of the U.S., Europe, and Asia, where governments have committed themselves to infusing technology into schools, can say with confidence that the use of new technologies has led to increases in student academic achievement (as measured on either U.S. or international tests), altered substantially how teachers teach, or prepared students to compete in an ever-changing labor market.

In part two, Cuban offers two reasons for this blind devotion to tech “solutions” that solve nothing: political and psychological.

This political explanation helps to make sense of why policymakers effortlessly skip over the lack of evidence to support major high tech expenditures. They figure that media photos of students happily clicking away on laptops–visible symbols–will trump the few research studies or critics who question purchases.

Turning from a political to a psychological explanation, districts buy technology because they suffer from “inattentional blindness”: They are too focused on a specific problem and lose sight of the big picture.

Or they suffer from some kind of blindness caused by salespeople promising tech-based “solutions” to whatever problem their schools might be facing without seeing if it fits in that big picture.

Of course, if the stuff looks good when photographed next to the superintendent, mayor, governor, and/or congressional candidate, so much the better.

Cuban, as always, makes some excellent points about our educational obsession with gimmicks.  Take the time to read both posts.

Everything You Need to Know About the Internet (abridged)

A British “professor of the public understanding of technology” writing in The Guardian offers nine things everyone needs to know about the internet.

For anyone who’s been connected and paying attention, which I suspect includes many people who read blogs like this one, this is a review of what you already know.

However, it’s still a great read, detailing some important concepts about the web that many people still don’t get.

Three of them stand out for me as an educator.


One of the things that most baffles (and troubles) people about the net is its capacity for disruption. One moment you’ve got a stable, profitable business – say, as the CEO of a music label; the next minute your industry is struggling for survival, and you’re paying a king’s ransom to intellectual property lawyers in a losing struggle to stem the tide. Or you’re a newspaper group, wondering how a solid revenue stream from classified ads could suddenly have vaporised; or a university librarian wondering why students use only Google nowadays. How can this stuff happen? And how does it happen so fast?

Or you’re an educational institution, and the traditional structure in which all information flows through one teacher no longer makes sense in this age of instant connection to many teachers.


For baby-boomers, a computer was a standalone PC running Microsoft software. Eventually, these devices were networked, first locally (via office networks) and then globally (via the internet). But as broadband connections to the net became commonplace, something strange happened: if you had a fast enough connection to the network, you became less concerned about the precise location of either your stored data or the processor that was performing computational tasks for you.

We in the education business also need to recognize that the network is also the classroom. And vice versa.


Since our current intellectual property regime was conceived in an era when copying was difficult and imperfect, it’s not surprising that it seems increasingly out of sync with the networked world. To make matters worse (or better, depending on your point of view), digital technology has provided internet users with software tools which make it trivially easy to copy, edit, remix and publish anything that is available in digital form – which means nearly everything, nowadays. As a result, millions of people have become “publishers” in the sense that their creations are globally published on platforms such as Blogger, Flickr and YouTube. So everywhere one looks, one finds things that infringe copyright in one way or another.

Which has incredible implications for teaching and learning.  We need to spend more time teaching kids how to be responsible producers as well as smart consumers.

In the end the writer notes that it “would be ridiculous to pretend that these nine ideas encapsulate everything that there is to be known about the net”.

Still, his excellent points are well worth the read.

Do We Really Need “Educational” Technology?

It’s spring so, as with most years, we’ve been getting a lot of questions about TSIP.

For those of you from outside Virginia, TSIP is the Technology Standards for Instructional Personnel, a legislative requirement for all teachers enacted about ten years ago.

Everyone with a teaching license must complete the TSIP requirements, either during their first year in the state or in order to renew their license.

Unfortunately, the requirements haven’t changed in a decade and generally conform to what a college advisor called the “inoculation theory of professional development”: you need it, you get it, you never have to bother with it again.

One shot and you’re done.

My thinking on the whole TSIP concept today just happens to coincide with a call from ISTE and other organizations for people to blog and/or tweet about the lack of any ed tech funding in the proposed federal budget.

However, I’m not sure that’s necessarily a bad thing.

I’m willing to bet that most of the half billion dollars allocated this year had very little impact on instruction anyway.

Most likely it was spent at the state or district level to buy expensive packages from educational conglomerates like Pearson (along with plenty of consultants, of course), promising “solutions” to whatever problem is at the top of your list.

But more to the point, I wonder if there’s really a need for “educational” technology anymore?

Does the artificial classification of hardware, software, web applications, and the rest as “instructional” (with the inevitable conclusion that the rest of the stuff is not) just get in the way of the basic idea that almost any technology could be used for learning?

And does the process also give some in our profession the cover necessary to ignore anything considered “non-instructional”?

You know, all that tech the kids play with when they’re not with us or when we’re not looking.

We say we want students to be able to communicate and collaborate, to develop critical thinking and problem solving skills, and to become creative and innovative in their work.

Do we really need special “edtech” to make that happen?

Or do we just need a better understanding of how people in the real world are using all kinds of technology to improve their skills in those areas, and of how to help our students learn to do the same?

Maybe, just like our tech standards that linger from the previous century, the whole concept of “educational technology” is outdated and obsolete.