A Resolution For The New School Year

It’s a new school year, so why not adopt a good resolution to get it started?

My nomination would be for everyone to drop the antiquated “digital native” myth.

In the past few weeks I’ve seen the phrase used in several back-to-school edtech articles as lazy shorthand that excuses the writers from understanding how kids actually relate to technology. Too many educators and parents also seem to have accepted the idea as scientific fact.

It’s just plain false.

A paper recently published in a European teacher preparation journal is just the latest research to make the case that growing up with PCs, texting, and PlayStations does not endow kids with “native” technology skills.

In particular, this and earlier studies knock down the concept of “multitasking”, the claim that these “digital natives” can accomplish two or more tasks at the same time. Plenty of research shows the human brain just doesn’t work that way.

For years I tried to get the educators I worked with to understand that students have two big advantages over adults when it comes to using new devices and software: they have more time to spend on them than you do, and they are highly motivated to learn.

Teachers can’t do anything about the time issue. It’s just the nature of being a grownup with grownup responsibilities. They can do something about the motivation part.

However, teachers also need to realize that their students are not highly proficient in all aspects of using technology.

They are whizzes with social interaction apps. They know all the popular Instagram filters and the best emojis to express their feelings. They are quick to find the YouTube channels from which they can learn new skills, and maybe even make some videos of their own.

Kids are lacking when it comes to using their devices and the web for learning more broadly about the world they will enter as adults. They have difficulty filtering through the internet stream to find valid information. They need help understanding how to best present themselves online.

Which is where you come in.

You don’t need to match the abilities of your students when it comes to social media and the rest. You do need to understand how students can apply the technology available (including their personal devices) to collaborate and communicate online.

Just stop calling them “digital natives”. And don’t call yourself a “digital immigrant” either.

Digitally Faulty

From the Washington Post’s Grade Point blog we have a list of “five critical skills every new college graduate should have”. It begins with

Every graduate needs to be “digitally aware.” Students entering college and the workforce now often are referred to as “digital natives” because they were raised on technology from a very young age.

And stop right there.

Anyone still using the “digital native-digital immigrant” trope is, at the very least, being intellectually lazy. As with many other concepts about people, especially kids, that phrase is a binary, either-or shortcut that excuses the writer from the responsibility of explaining the complexity of the subject, and their readers from having to understand it.

Being “raised on technology from a very young age” does not confer the expertise implied by calling them “natives”. Most kids who have easy access to digital devices and networks growing up (which excludes large numbers of children, even in the US) acquire a comfort level with the tools that connect them to their friends and personal interests. They are not computing geniuses – or “hackers” when a negative slant is needed.

For most students, their “native” digital skills don’t automatically translate into using the technology tools for learning skills needed to live in the broader world. They still need parents and teachers to guide them in those areas.

Continuing in the same brief section, the writer also leans on another, more recent, flawed assumption about the needs of graduates, from both college and high school.

It’s no longer good enough to know how to use a computer. Understanding the programming language behind the apps on your iPhone, or the basics of Artificial Intelligence are all now seen as basic foundational skills by many employers. Learning to program is much like learning a second language was in the 20th century: You might not become proficient enough to move overseas, but you could get by if you traveled to a particular country.

I’d love to see some statistics about the “many employers” who see programming as a “basic foundational skill”. Plenty of politicians, business types, and other education experts tell us that kids need to learn to code. The president is asking for $4 billion to provide computer programming classes for all students in K12, without a clear definition of why it’s that important.

And equating learning to program with learning to speak a second language is yet another lazy, not to mention very wrong, shortcut. Beyond both being classes offered in many high schools, the two require different skill sets and processes in the brain. But it’s probably not as bad as equating coding with being able to read and write in your native language.

Ok, so the writer goes on to present his four other “critical” skills for graduates, but that first one is bad enough. I really don’t want to waste time figuring out how one becomes a “learning animal”, or explaining why lacking the ability to “navigate through life without a syllabus” is a failure of their schools, not the graduate.

Still Fighting the Native Myth

Northwestern University is concerned about how its freshmen present themselves on the web and offers a course in the Communications Studies Department called “Managing Your Online Reputation”.

While the course developed by Ms. Hargittai and Mr. King uses cautionary tales, it also seeks to train students to build robust, productive online identities through which they can engage topics of interest, command audiences, and advance their careers. The course draws on social-science research about reputation and crisis management. The professors believe it to be one of a kind.

Is this class really necessary? These days most kids are “digital natives”, right? They start using communications devices as babies and quickly learn to work those social media sites that totally baffle their parents. Don’t they?

But Ms. Hargittai and Mr. King, among others, say that the familiar narrative about tech-smart young people is false. Their course grew out of years of research conducted by Ms. Hargittai on the online skills of millennials. The findings paint a picture not of an army of app-building, HTML-typing twenty-somethings, but of a stratified landscape in which some, mostly privileged, young people use their skills constructively, while others lack even basic Internet knowledge.

I’ve ranted on this topic a few times over the years, and the bottom line is that the “digital native” myth is one of the most detrimental edtech clichés of the past decade. It is far too often used as an excuse, by schools and parents, to avoid the kind of instruction being offered in this course.

So, why is it “one of a kind”? More importantly, why doesn’t this kind of instruction start in high school?

The Myth of Digital Natives

A recent edition of the PBS Idea Channel does a nice job of exploring the question “Do Digital Natives Exist?”

I’ve never liked the digital native/digital immigrant concept. As the presenter points out, it presumes that people are born with some innate knowledge of how to work external devices, which makes no more sense than a newborn understanding how to speak a language.

Everyone learns as they grow and develop through life, and kids just have more time, opportunity, and motivation to figure out stuff like Facebook and iPads than most adults do.

However, for too many educators, the whole native/immigrant myth actually gets in the way of learning to make better use of technology in their professional practice (and sometimes offers an easy excuse to skip it altogether).

Anyway, watch the video (the topic only covers the first 5-1/2 minutes of the program) and see what you think.

Better yet, show it to someone who thinks they’re an immigrant.

What Does “Tech Savvy Student” Mean?

Ok, how many of you “older” folks out there are tired of the whole “digital natives” vs. “digital immigrants” concept? As someone who didn’t grow up using computers but who is now very comfortable with networks, social media, mobile devices and the rest, I know I am.

It’s hard, however, to convince some of my colleagues that kids are not born with some magic innate technological talent. I try to tell them that, more than anything, their students simply have more time to spend playing with various electronic devices and absorbing all the little tricks that seem like genius to those who don’t. Invoke the 10,000-hour rule if you like.

However, knowing how to trick out a smartphone or navigate the complexities of Facebook doesn’t mean those students have any clue how to use all that power to advance their learning. Or even know the basics of common programs you use, like PowerPoint.

A little research to back up this idea comes from a new study conducted by the Economic & Social Research Council (ESRC). The subjects were college students in the UK, so the findings may not directly apply to kids in US classrooms, but I doubt they are far off from what we would see in our overly large school district and other similar parts of the country.

  • 97.8 percent of students owned a mobile phone.
  • Just over three quarters – 77.4 percent – owned a laptop, and 38.1 percent owned a desktop computer.
  • 70.1 percent felt their access to computers was sufficient to meet their computing needs.
  • The mobile phone was chosen by 83.2 percent as the device students would miss the most if it were taken away.
  • A small minority of students don’t use email or have access to mobile phones.

Students 20 years old or younger reported being more engaged in instant messaging, texting, social networks, and downloading video media than students aged 25 or older. Only 4.3 percent of those 20 or younger never used social networking sites; for those 35 or older, this rose to 78.5 percent.

In other words, they’re coming to your classroom understanding IM, texting, social networks, and video downloads, and carrying some powerful tools for reading, writing, and collaboration.

So, what are we doing to leverage those communications skills and the devices in their pockets to improve their learning?

Sorry, I forgot it’s May. Testing season. No time to worry about all that learning stuff.