Dumb Headline

In its new education blog, Grade Point, the Washington Post reports on a study under the headline “My smartphone is making me dumb”. Actually, that headline is probably making their readers dumber.

Researchers gave college students their first smartphones and asked them “whether they thought the devices would help them learn”. Of course, a large majority said yes.

But a year later, when they were asked the same questions in the past tense, the results were entirely different — the college students felt the phones had distracted them and hadn’t been helpful, after all.

So, of course, we blame the technology, instead of any number of other factors (start with this being their first smartphone) that don’t necessarily translate into provocative headlines.

Finally, tacked onto the end of the post, the writer did manage to arrive at the far more accurate conclusion of research like this.

Just providing access to mobile technology wasn’t enough, they concluded; educators would need to offer more structure or guidance if they wanted phones to enhance students’ academic experience.

Teachers must learn how to incorporate mobile devices into their practice before students can understand how to use them for their learning.

Not exactly link bait.

Observing the Future

Betteridge’s law of headlines states that “Any headline that ends in a question mark can be answered by the word no.”

Take, for example, a recent article on the Ars Technica website: “Is your smartphone making you dumb?”

Despite the provocative question, the authors of the study being referenced don’t actually arrive at that conclusion.

“the results are purely correlational,” emphasize Golonka and Wilson. There’s no way to tell whether an over-reliance on smartphones decreases analytical thinking or whether lower analytical thinking ability results in a heavier reliance on smartphones, they explain.

Of course, this is one instance in a long line of “research” and “analysis” provocatively asking if Google, the internet, social networking, or technology in general is impacting human intellectual development in some way. For good or bad. Maybe both.

Did societal observers have the same questions in the aftermath of the printing press, telegraph, telephone, radio, or any other major change in the way people communicated information? I suspect they did. Did humans get dumber? Smarter? Weirder?

I’m pretty sure the honest answer to the question of what the use of smartphones/instant search/social networking/&lt;insert your tech fear here&gt; is doing to our brains is “we don’t know”. All of these digital tools some say we are addicted to (another of those headline concerns) are very, very recent developments in human history. It takes more than a decade or two to sort through all the data.

Which is all the more reason to do our own research. Be introspective about ourselves and observant of others. Pay attention and we’ll watch the future of the human species develop.

I’m pretty optimistic about it.

The Leadership Squash

Thanks to @science_goddess for pointing me to this NPR piece on The Myth of the Superstar Superintendent in which they report on a study showing no correlation between student achievement and who is leading their school district.

However, I think their conclusion is far too simple. It’s foolish to say that the leaders of a school system don’t matter. As in any other field, it all depends on their leadership style.

“A good superintendent empowers leading visionary principals and teacher leaders at the school,” she [education writer and author Dana Goldstein] says. But what actually happens too often is that superintendents “squash interesting ideas, so you’d have principals afraid to try something new, afraid to try something innovative.”

Unfortunately, with the many layers of super-level leadership we have here in our overly large school district, there’s a lot of that squashing going on.

What Does Your “Research” Really Say?

An essay by an English teacher posted in the wonderful Post blog The Answer Sheet 1 offers Seven things teachers are sick of hearing from school reformers.

It’s all good, worth your time to read and pass along, and she probably could have added eight or ten more. But this is one that really stands out.

4. Don’t tell us “The research says…” unless you’re willing to talk about what it really says.

It’s not that we don’t care about research, but that most often when research is mentioned in a school context, it is used to end legitimate conversation rather than to begin it, as a cudgel to silence us rather than an opening to engage us constructively. Very often when confronted with a “research says” claim that I find dubious or irrelevant, I ask for a citation and get a blank or vaguely menacing stare, or some invented claim about the demands of the Common Core, or a single name, “Marzano,” as though he completed all instructional research.

Research on children and learning is difficult to do right, and the best you can say about almost all studies in this area is that they are incomplete. However, at the very least, those education “experts” pontificating on research should be required to read past the executive summary.

Oh, and I’m one more teacher who’s tired of “Marzano” being cited as the solution to everything.


  1. And there isn’t much wonderful in the Washington Post these days, on paper or the web.

Everybody’s Wild About Data

That’s especially true in the education business, which, if you look closely, probably produces the most unreliable data you could possibly get.

But that doesn’t stop politicians, media, and “experts” from latching onto polls, studies, and research and using them to sell their pet reform plans. Often without question and based only on a read of the executive summary.

For the rest of us who want to know a little more about the data before accepting the headlines, The Atlantic offers some advice on How to Read Education Data Without Jumping to Conclusions.

As readers and writers look for solutions to educational woes, here are some questions that can help lead to more informed decisions.

1. Does the study prove the right point? It’s remarkable how often far-reaching education policy is shaped by studies that don’t really prove the benefit of the policy being implemented. 

2. Could the finding be a fluke? Small studies are notoriously fluky, and should be read skeptically.

3. Does the study have enough scale and power? …the million-dollar question is whether the study was capable of detecting a difference in the first place.

4. Is it causation, or just correlation? Correlation … does not indicate causation. In fact, it often does not.

The fact that too many people don’t know the difference between those two concepts in number 4 is a direct indictment of the K12 math curriculum. It doesn’t say much for the statistics courses that many educators are required to take during their advanced degree programs.
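For readers who want to see the correlation-vs-causation problem concretely, here is a toy simulation (my own illustration, not from the Atlantic article or any study mentioned above). A hidden confounder, loosely labeled “study habits”, drives both smartphone use and test scores, producing a strong correlation between the two even though neither one causes the other:

```python
import random

# Toy model with invented numbers: a hidden confounder ("study habits")
# influences both smartphone use and test scores.
random.seed(42)

n = 1000
habits = [random.gauss(0, 1) for _ in range(n)]        # hidden confounder
phone_use = [-h + random.gauss(0, 1) for h in habits]  # weaker habits -> more phone time
scores = [h + random.gauss(0, 1) for h in habits]      # stronger habits -> higher scores

def correlation(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = correlation(phone_use, scores)
print(round(r, 2))  # strongly negative, yet phone use never touches scores
```

The simulation finds a clearly negative correlation between phone use and scores, even though, by construction, phone use has no effect on scores at all. A headline writer would call that “phones make kids dumb”; the data only say the two move together.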

Anyway, these are all good recommendations. I would only suggest adding one more: Who paid for this particular research?

Just because a particular organization (like the Gates Foundation) funds a study that ends up supporting their existing point of view (as has happened more than once), doesn’t mean the research is flawed.

Only that it should require even closer scrutiny before using it to make educational policy and spending millions of dollars to implement it.