The Appearance of Digital Literacy

The title of a recent Wired article claims that Digital Literacy is the “key to the future” – even if we have no idea what that phrase means.

Here in the overly-large school district we talk a lot about “digital” literacy (with its interchangeable companion “digital learning”), although few of those using the phrase can offer a coherent definition for it, and two people will often have very different interpretations.

This particular story is based on discussions among “representatives of the tech industry… and academia” that took place at GitHub, one of the geekier places in Silicon Valley and the web. As you might expect, learning to code is a central tenet in this community, but even that idea is vaguely defined.

But “learning to code” is an exceedingly broad concept, and one which without more specifics risks oversimplifying conversations about what digital literacy really means. And how digital literacy is defined is important. This isn’t just about filling Silicon Valley jobs. It’s about educators, policy makers, and parents understanding how to give the rising generations of digital natives the tools they need to define the future of technology for themselves.

Let’s ignore the lame and outdated “digital natives” reference, and assume that we really do want our students, during their time in K12, to develop programming skills to help them define their “future of technology”. Where does that fit in our current concept of “school”? Or, to channel the thoughts of many students, will this be on the test?

Coding is also one of those skills puréed into STEM/STEAM/Maker, equally ill-defined instructional concepts that in our schools are almost always welded on as before/after school, lunchtime, or pull-out enrichment activities, but rarely included in the “regular” curriculum. They are treated as events rather than as an environment.

If STEAM is so important – and more than one school reformer has declared it vital to our national economic future – why isn’t it part of the core curriculum, instead of a nice extra activity, great for photo ops, offered to a small segment of students, the ones we know will have no trouble passing the spring standardized tests?

As with the tendency to dump computers and other “high-tech” devices (tablets, “smart” boards, etc.) into classrooms with little or no change in instructional practice, adding STEM and/or coding activities also provides schools – and district administrators, school board members, and other politicians – with “the appearance of teaching digital literacy without providing the actual substance”.

And without having to decide what in the hell “digital literacy” really means.

Really Bad Vision

This is probably one of the most depressing ideas I've seen in a while. The Gates Foundation wants to spend up to $6 million to develop “literacy courseware”.

More specifically, it plans to use that small piece of Bill's pocket change “to entice publishers, developers, and entrepreneurs to propose the most innovative digital solutions for engaging, personalized software that helps students with reading and writing”.

Notice what's missing from that enticement list? No mention of educators.

The request for proposal says this is part of the Foundation's “vision” for education, something they call “personalized learning”.

My vision of their vision looks more like this:


Expanding the Concept of Literacy

In the title of a piece on the Time web site, the writer declares that “‘Digital Literacy’ Will Never Replace the Traditional Kind”, which is a classic straw man since you would be hard pressed to find even the most ardent supporter of instructional technology claiming that it should.

However, I’m not even sure there is such a thing as “digital literacy”. The “traditional kind” of literacy is defined at its most basic level as the ability to read and write. In other words, the ability to communicate with other people.

Increasingly, students (and adults, for that matter) need to know how to communicate using a variety of both analog and digital tools (including audio and video) to be considered literate, which expands the standard definition rather than replacing it.

Among all the straw, the writer actually tries to get at this point, although in a very condescending manner.

There is no doubt that the students of today, and the workers of tomorrow, will need to innovate, collaborate and evaluate, to name three of the “21st century skills” so dear to digital literacy enthusiasts.

Please name a few of those “digital literacy enthusiasts”. And explain why the skills usually lumped under the “21st century” label are necessarily digital, or in any way different from those required of successful adults in the 18th, 19th, or 20th centuries, distinctly non-digital periods of history.

Anyway, she continues…

But such skills can’t be separated from the knowledge that gives rise to them. To innovate, you have to know what came before. To collaborate, you have to contribute knowledge to the joint venture. And to evaluate, you have to compare new information against knowledge you’ve already mastered. Nor is there any reason that these skills must be learned or practiced in the context of technology. Critical thinking is crucial, but English students engage in it whenever they parse a line of poetry or analyze the motives of an unreliable narrator. Collaboration is key, but it can be effectively fostered in the glee club or on the athletic field. Whatever is specific to the technological tools we use right now – and these tools are bound to change in any case – is designed to be easy to learn and simple to use.

Very true. None of those activities require computers, networks, and communications tools.

Unless, of course, you want to involve students with people and information outside of the relatively limited walls of their school building. And expand their literacy skills beyond basic reading and writing.

We Need More Tech Skeptics

I’ve never liked the whole “digital native/digital immigrant” meme, and an administrator at the University of Kansas seems to agree that we need to look at how people understand “technology” in new ways.

She says that many of those digital natives we call students, in both K12 schools and colleges, are actually technologically illiterate, at least under what she says should be an updated definition of “tech literacy”.

The assumption that today’s students are computer-literate because they are “digital natives” is a pernicious one, Zvacek said. “Our students are task-specific tech savvy: they know how to do many things,” she said. “What we need is for them to be tech-skeptical.”

Zvacek was careful to make clear that by tech-skeptical, she did not mean tech-negative. The skepticism she advocates is not a knee-jerk aversion to new technology tools, but rather the critical capacity to glean the implications, and limitations, of technologies as they emerge and become woven into the students’ lives. In a campus environment, that means knowing why not to trust Google to turn up the best sources for a research paper in its top returns, or appreciating the implications of surrendering personal data — including the propensities of one’s bladder — to third parties on the Web.

I think I like the idea of teaching tech “skepticism” instead of “literacy”, for adults as well as kids.

It ties right into helping people develop their crap detector, a concept Neil Postman wrote about in the 1970s and that Howard Rheingold is discussing now.

Thanks to Shaun Johnson for the link.