Recently I’ve been thinking a lot about the use of personal portable communications devices* in a classroom setting, only a small part of which is driven by carrying around an iPad for the last six weeks or so.
Based mainly on my experience, I think Apple’s tablet will be a very compelling device for learning, once they push out a few major upgrades. More about that in another post.
Anyway, here in the overly-large school district we are also running experiments with the instructional use of the iPod Touch, as well as seriously discussing how and why students might use their own personal computing devices in the classroom.
Frequently, however, I get an impression that many of the people involved don’t understand the nature of these tools and how they are designed to be used.
They want very much to map them onto our classic computer lab style of technology use.
In a computer lab, even one consisting of laptops stored in a big metal box that is rolled from room to room, all the units are the same. Or essentially identical when booted.
All have the same software, identical desktops, files all in the same places, mapped to the same servers, and sometimes are even connected to a master unit that can take control of the whole lab.
And all the kids using them are expected to be doing exactly the same activities on each computer.
On the other hand, the powerful devices coming to school in the pockets of students (and many adults) are designed to be personal, with everything customized by the user to make the unit function best for them.
The “lab” created when each student boots their personal “computer” would result in almost none of the workstations looking – or working – alike.
So, as one teacher recently asked me, how are we supposed to get anything done if every computer in the room is different?
It’s a valid question, and I think the answer lies in a fundamental mistake we’ve made over the decades in the way we’ve taught kids and adults to use computers.
We taught Microsoft Word Fundamentals instead of learning the writing process regardless of the tool being used.
Our training focused on the mechanics of PowerPoint instead of on understanding the best ways to communicate ideas.
While this approach is possible, and relatively easy to implement, in a standardized lab, it falls apart completely in a bring-your-own-computer (BYOC) setting.
So we “get something done” by separating learning to use the technology from using that technology as a tool for learning more useful skills – like writing and communicating ideas.
After setting minimum requirements for the “computers”, we put the responsibility on the students themselves for knowing how to use the different applications.
Then we give them meaningful assignments and evaluate their work based on factors other than how well they use fonts and the number of slides in their shows.
I know, I know… far too simplistic.
However, if a 1:1 ratio of kids to computers in our schools, at least in high schools, is really what we want (and I hear many around here say it is), then we will need to make two major changes to our education process.
First, we must allow and encourage students to freely use their own computing devices in schools (and provide devices for those families who can’t afford them).
And then completely revise both the curriculum and our approach to teaching to fit the new circumstances.
Not simple at all, but do we have any other options?
*I know that’s a very clunky name but “smart phones” doesn’t nearly cover the capabilities of these devices and “computer” ties them too much to a stereotype.