Don’t Blame the Lecture

A few days ago, the New York Times published the latest high-profile story advocating a ban on laptops in classrooms, mostly at the college level. These stories all point to a “growing body of evidence” claiming to show that students learn less and get poorer grades when they use devices during lectures.

I was going to chime in with my thoughts on the matter, including more than a few questions about the methodology and assumptions behind these studies. But marketing guru Seth Godin, who occasionally weighs in on education issues (and often makes a lot of sense), has already written a high-profile response that has popped into my Twitter feed many times in the past few days.

While his post isn’t a thorough challenge to this simplistic nonsense, Godin is exactly right that the professor who authored the Times op-ed has missed the real issue.

The times they are a-changin’

Why give lectures at all?

Why offer a handmade, real-time oration for a small audience of students—students who are expected to slow down their clock speed, listen attentively and take good notes at the very same rate as all the other students?

I know why we used to do it. We used to do it because a lecture is a thoughtful exposition, a reasoned, researched argument that delivers a lot of information in a fairly condensed period of time. And before technology, the best way to deliver that exposition was to do it live.

But now?

Godin’s recommendation to replace the live lecture – basically adopting the “flipped” classroom approach and having students watch a recording of the presentation outside of class – is a crappy alternative.

But he does ask the right question: Why lectures? Why do we continue with the “watch presentation-take notes-answer test questions” approach to learning? Especially since it is becoming clear that this is not an effective learning process.

Which leads to the other half of this question: if we’re not going to lecture at students, what do we do with all that “precious classroom time”?

And it is precious. It’s a curated group of thirty or a hundred students, coordinated in real-time and in real-space, inhabiting a very expensive room, simultaneously.

The K-12 experience is thirteen years built on compliance and obedience, a systemic effort to train kids to become cogs in the industrial machine. And it has worked. One component of this regime is the top-down nature of the classroom. We don’t want to train kids to ask difficult questions, so we lecture at them.

Although teachers in K12 don’t deliver as many lectures as college instructors[1], most classrooms are still structured around direct instruction, with material organized by the adults and presented to students, who are then expected to extract the required information and recall it on some kind of test at some later time.

In the end, however, my biggest objection to all these “laptops are making kids stupid” stories is that the researchers – and the writers reporting on their work – always start by blindly blaming the technology and the students.

And assuming our current educational structure is above reproach and needs no alteration.


[1] However, the lecture format is still a fundamental part of many high school courses, especially Advanced Placement, which is essentially a college course adapted for slightly younger students.


No, We Aren’t Losing a Generation to Smartphones

Betteridge’s law of headlines says “Any headline that ends in a question mark can be answered by the word no.”

Which certainly applies to the very provocative title of a recent article in The Atlantic: Have Smartphones Destroyed a Generation?

The author tries very hard to convince the reader that teenagers (the group she calls iGen) who “spend more time than average on screen activities are more likely to be unhappy”, more prone to depression, and at greater risk for suicide.

There’s not a single exception. All screen activities are linked to less happiness, and all nonscreen activities are linked to more happiness.

But at the generational level, when teens spend more time on smartphones and less time on in-person social interactions, loneliness is more common.

So is depression. Once again, the effect of screen activities is unmistakable: The more time teens spend looking at screens, the more likely they are to report symptoms of depression.

After reading the article, I’m not at all convinced by definitive claims like these.

Certainly smartphones have “radically changed every aspect of teenagers’ lives”, at least the lives of those kids who are financially able to own them. On the other hand, I could also say the same about myself and many other adults.

However, is the technology entirely to blame for any psychological problem she sees in the teenagers she has been studying? Where are the adults, especially parents, in all this? And the fundamental question that always needs to be asked in any human research, is behavior A the direct cause of behavior B?

I don’t think the author is entirely sure either.

Of course, these analyses don’t unequivocally prove that screen time causes unhappiness; it’s possible that unhappy teens spend more time online. But recent research suggests that screen time, in particular social-media use, does indeed cause unhappiness.

What’s the connection between smartphones and the apparent psychological distress this generation is experiencing? For all their power to link kids day and night, social media also exacerbate the age-old teen concern about being left out.

Ok, I’m not an expert in this area. I haven’t read the research and don’t have any countervailing evidence to present.

But I always have many doubts when someone definitively attributes great harm, or benefits, to a particular technology. 

In this case, I’m extremely hesitant to accept the author’s conjecture that we are losing an entire generation (which I understood to encompass thirty years) to a technology that has only really been widespread in society for less than a decade.

What Doesn’t Work In Education Reporting


Rather than asking what works in education, NPR asked “education researcher John Hattie” about ideas that don’t work. His answers are not based on his own work but on a review of “more than 1,000 ‘meta-analyses’”, whatever those are.

Anyway, I wish the writer had started by asking Mr. Hattie what he means by “works”. Of course, I know the answer. Works means improving scores on standardized tests, even though his number two non-working solution is standardized testing.

High-performing schools, and countries, don’t necessarily give more standardized tests than low performers. They often give fewer.

The alternative: testing that emphasizes giving teachers immediate, actionable feedback to improve teaching.

Having said that, his final item, in which he says the US spends too much money on public education, is based on the fact that other countries spend much less while their students score higher than American kids on an international standardized test (the PISA).

He also says that smaller class sizes don’t work because countries like Japan and Korea have relatively large classes and are “high performing”, again on those standardized tests.

It’s too bad NPR just transcribed the executive summary of Hattie’s paper, instead of doing some research of their own and asking some informed questions.

The worst part about stories like this is the failure to recognize that there are major differences between American society and our approach to education and those of other countries. Start with the fact that, instead of national education goals, we have 51 educational policies, plus around 31,000 local school boards.

Then review the level of public support for public education in the US compared to Finland, Japan, and the rest. How many of their government leaders are working hard to demonize teachers and privatize schools?

Finally, look at the societal support systems for children in each country, especially rankings of child poverty rates (anywhere from 20% to 33% of all children, depending on the definition of “poverty”). I’ll bet “making America great again” has nothing to do with improving those dismal statistics.

And anyone who says poverty has nothing to do with learning has never tried to teach math to a class of hungry middle school students.

New Generation, Much of the Same

Researchers over-analyzed and stereotyped baby boomers, Gen X, and Millennials (aka Gen Y), so now it’s time to do the same to Gen Z. And Adobe is right on the job with a new report on Gen Z in the Classroom.

For the study, they interviewed around a thousand students ages 11-17, plus 400 of their teachers. And what did they discover…

Gen Z students are most likely to describe themselves as “creative” and “smart.”

Gen Z students have mixed emotions when it comes to their future after they finish school – their top emotions are “excited” and also “nervous.”

Both students and teachers feel that Gen Z is only somewhat prepared for their futures after school.

Many students feel uncertain about what they want to do, worried about finding a job and concerned that school has not properly prepared them for the “real world.”

All of which could have been said about any group of teenagers in the US for decades. At 16, didn’t most of us think we were smarter than our parents? Weren’t we excited and nervous about the future? And very uncertain about where we would be in ten years?

Being a technology company, a large part of Adobe’s focus in the survey was about the Gen Z group’s relationship with technology. But even then, most of the results are hardly surprising or particularly unique.

Both students and teachers agree that growing up in the age of technology is the defining characteristic of Gen Z – and technology provides more digital tools and outlets for creativity.

Computers & technology classes are the “sweet spot” – not only a favorite class, but also a top class to prepare students for the future and a top class for creativity.

Most say that increased access to digital tools and technology will make Gen Z more creative and better prepared for the future workforce. Still, some students and teachers think Gen Z’s reliance on technology is holding them back from thinking “outside the box.”

I always wonder when people use that phrase “outside the box”. Who gets to define “the box” and what’s inside or outside? In the case of kids, it’s the adults, of course.

Anyway, my favorite “findings” from the executive summary are in section Insight 3.

Both students and teachers alike agree that Gen Z learns best through doing/hands-on experience (e.g., lab work, creating content).

Both audiences wish that there was more of a focus on creativity in the classroom.

Teachers say that having more opportunities for this type of hands-on learning is the number one way they can better prepare Gen Z students for the workforce. Most feel that the technology is already in place, but the curriculum needs to catch up.

I’m not sure we needed more research to arrive at those conclusions. And I don’t believe they are unique to one generation. Millennials, Gen Xers, even us old Baby Boomers, all learned better through experiences rather than lectures, and most of us would have been better served if we could have had more of it during our time in school.

In the end, some variation of this report could have been written about any group of students from the past sixty years. The question is, why has American education not changed to better meet their needs in that time?

More Let’s Blame the Technology

Time Magazine, which I was surprised to discover still sells around 3 million paper copies each week in the US, recently featured a promotional article for a new book (“Glow Kids”) in which the writer is sharply critical of technology used for instruction: Screens In Schools Are a $60 Billion Hoax.

Ok, I’m not immune to that kind of clickbait. Let’s jump in and see what he has to say.

He starts by stating that any acceptance of tech in the classroom as “a necessary and beneficial evolution” is not only a lie, but worse.

Tech in the classroom not only leads to worse educational outcomes for kids, which I will explain shortly, it can also clinically hurt them. I’ve worked with over a thousand teens in the past 15 years and have observed that students who have been raised on a high-tech diet not only appear to struggle more with attention and focus, but also seem to suffer from an adolescent malaise that appears to be a direct byproduct of their digital immersion. Indeed, over two hundred peer-reviewed studies point to screen time correlating to increased ADHD, screen addiction, increased aggression, depression, anxiety and even psychosis.

Attention and focus to what? What is “adolescent malaise”? Is “screen addiction” a real psychological condition?


Ok, what evidence do you have to suggest that the correlation supposedly conclusively established in those studies is causation for any of those diverse afflictions? And justification for the hyperbolic headline?

Well, first he spends a few paragraphs on the fact that selling technology into schools has become big business, attracting evil people like Rupert Murdoch. And on the billion dollar Pearson/iPad project mess in Los Angeles, which was the result of poor planning and possible corruption, not the technology itself.

Yes, we know Finland’s school system is magical without lots of screens. And that there are some academics who “once believed” and now say that there is no “technological fix” for the problems of American education.

But where is the evidence that the technology itself is harmful to children? Or a “hoax”? Maybe you have to buy the book to learn the answers but this particular article does little to substantiate the provocative claims in the first few paragraphs.

He does cite some research: the high-profile paper by the Organisation for Economic Co-operation and Development (OECD) from last year that relied entirely on data from their testing program. A 2012 British meta study of research that largely predates the iPhone. Another analysis of 50 studies that concluded “reading for pleasure among young people has decreased in recent decades”. Plus Jane Healy, who has made a career of blaming screens for all kinds of societal issues.

Bottom line for this writer, the existence of computers is to blame for any and all problems observed in classrooms.

But I have a question: Is it possible that those student struggles with attention in school and lack of progress in “learning” come not from the screens per se but from the way those devices are used in the classroom?

Maybe handing a computer to every student so they can complete the same activities and fill-in-the-blank worksheets assigned long before the internet arrived is turning them off from learning.

Could it be that sticking students in front of computers to work their way through “personalized” lessons is not the magic solution promised by the vendors? Not to mention boring.

Is it possible that constantly using the screens for kids to take meaningless-to-them standardized tests is one cause of that “malaise”?

Although this particular piece of crap article is little more than the usual attempt to blame devices for human failings, I do completely agree with the basic premise of the headline.

For most of the past two decades, we have spent a huge amount of money to put computers, software, and networks into US classrooms, and a large chunk of that money has been wasted on devices that should never have been purchased in the first place (I’m looking at you, interactive whiteboards!). We have been throwing devices into classrooms with the expectation that their presence would lead directly to a significant increase in teacher proficiency and student learning. Even when many districts, like the one I used to work for, also put significant resources into providing training and technical support for teachers, the results were far less than transformational.

Why?

Well, the screens certainly made schools look different to the administrators and politicians dropping in for photo ops. But anyone who bothered to look past the shiny objects found little difference from classrooms of ten, twenty, fifty years ago.

The curriculum, pedagogy, instructional materials, assignments, the fundamental teacher-directed structure of American education – all of it changed very little. We did, of course, add an expensive, regressive standardized testing requirement, and in many places used it to justify the continued purchase of computers. But that testing only served to narrow the focus of instruction to the few tested subjects and made automating the process of preparing for the exams even more attractive.

In other words, we spent that $60 billion from the headline buying devices with the power to connect classrooms, teachers and kids to the world’s information, and to each other. Machines that offered students incredible potential for developing and demonstrating their creative skills. Then we worked overtime using them to maintain a traditional concept of learning that this writer, and the academics he cites, fondly remember from their time in school.

At one point later in the article, the writer cites a line from a 16-year-old report from the Alliance for Children: “School reform is a social challenge, not a technological problem.” That is the key point completely missed by this writer and many education reformers.

However, our administrators and political leaders find it much easier, not to mention cheaper, to buy the technology rather than address those social challenges, including poverty, nutrition, and a tremendous inequity of access to learning resources of all kinds.

You can call the $60 billion spent on technology for the classroom a “hoax” if you like, but don’t blame “screens” for the many educational and social maladies this writer and others observe. The fault lies in a lack of willingness by all of us to address and solve the real problems.