Digitally Faulty

From the Washington Post’s Grade Point blog we have a list of “five critical skills every new college graduate should have”. It begins with:

Every graduate needs to be “digitally aware.” Students entering college and the workforce now often are referred to as “digital natives” because they were raised on technology from a very young age.

And stop right there.

Anyone still using the “digital native/digital immigrant” trope is, at the very least, being intellectually lazy. Like many other labels applied to people, especially kids, the phrase is a binary, either-or shortcut that excuses the writer from explaining the complexity of the subject, and readers from having to understand it.

Being “raised on technology from a very young age” does not confer the expertise implied by calling them “natives”. Among kids who grow up with easy access to digital devices and networks (a group that excludes large numbers of children, even in the US), most acquire a comfort level with the tools that connect them to their friends and personal interests. That does not make them computing geniuses – or “hackers”, when a negative slant is needed.

For most students, those “native” digital skills don’t automatically translate into an ability to use the same tools for learning, or for acquiring the skills needed to live in the broader world. They still need parents and teachers to guide them in those areas.

Continuing in the same brief section, the writer also leans on another, more recent, flawed assumption about the needs of graduates, both from college and from high school.

It’s no longer good enough to know how to use a computer. Understanding the programming language behind the apps on your iPhone, or the basics of Artificial Intelligence are all now seen as basic foundational skills by many employers. Learning to program is much like learning a second language was in the 20th century: You might not become proficient enough to move overseas, but you could get by if you traveled to a particular country.

I’d love to see some statistics about the “many employers” who see programming as a “basic foundational skill”. Plenty of politicians, business types, and other education experts tell us that kids need to learn to code. The president is asking for $4 billion to provide computer programming classes for all students in K-12, without a clear explanation of why it’s that important.

And equating learning to program with learning to speak a second language is yet another lazy, not to mention flat-out wrong, shortcut. Beyond the fact that both are offered as classes in many high schools, the two draw on different skill sets and different processes in the brain. But it’s probably not as bad as equating coding with being able to read and write in your native language.

Ok, so the writer goes on to present his four other “critical” skills for graduates, but that first one is bad enough. I really don’t want to waste time figuring out how one becomes a “learning animal”, or explaining why lacking the ability to “navigate through life without a syllabus” is a failure of the schools, not of the graduate.