New Generation, Much of the Same

Researchers over-analyzed and stereotyped Baby Boomers, Gen X, and Millennials (aka Gen Y), so now it’s time to do the same to Gen Z. And Adobe is right on the job with a new report on Gen Z in the Classroom.

For the study, they interviewed around a thousand students ages 11-17, plus 400 of their teachers. And what did they discover…

Gen Z students are most likely to describe themselves as “creative” and “smart.”

Gen Z students have mixed emotions when it comes to their future after they finish school – their top emotions are “excited” and also “nervous.”

Both students and teachers feel that Gen Z is only somewhat prepared for their futures after school.

Many students feel uncertain about what they want to do, worried about finding a job, and concerned that school has not properly prepared them for the “real world.”

All of which could have been said about any group of teenagers in the US for decades. At 16, didn’t most of us think we were smarter than our parents? Were excited and nervous about the future? And were very uncertain about where we would be in ten years?

Being a technology company, a large part of Adobe’s focus in the survey was about the Gen Z group’s relationship with technology. But even then, most of the results are hardly surprising or particularly unique.

Both students and teachers agree that growing up in the age of technology is the defining characteristic of Gen Z – and technology provides more digital tools and outlets for creativity.

Computers & technology classes are the “sweet spot” – not only a favorite class, but also a top class to prepare students for the future and a top class for creativity.

Most say that increased access to digital tools and technology will make Gen Z more creative and better prepared for the future workforce. Still, some students and teachers think Gen Z’s reliance on technology is holding them back from thinking “outside the box.”

I always wonder when people use that phrase “outside the box”. Who gets to define “the box” and what’s inside or outside? In the case of kids, it’s the adults, of course.

Anyway, my favorite “findings” from the executive summary come from the section labeled Insight 3.

Students and teachers alike agree that Gen Z learns best through doing/hands-on experience (e.g., lab work, creating content).

Both audiences wish that there was more of a focus on creativity in the classroom.

Teachers say that having more opportunities for this type of hands-on learning is the number one way they can better prepare Gen Z students for the workforce. Most feel that the technology is already in place, but the curriculum needs to catch up.

I’m not sure we needed more research to arrive at those conclusions. And I don’t believe they are unique to one generation. Millennials, Gen Xers, even we old Baby Boomers, all learned better through experiences rather than lectures, and most of us would have been better served if we could have had more of it during our time in school.

In the end, some variation of this report could have been written about any group of students from the past sixty years. The question is, why has American education not changed to better meet their needs in that time?

More Let’s Blame the Technology

Time Magazine, which I was surprised to discover still sells around 3 million paper copies each week in the US, recently featured a promotional article for a new book (“Glow Kids”) in which the writer is sharply critical of technology used for instruction: Screens In Schools Are a $60 Billion Hoax.

Ok, I’m not immune to that kind of clickbait. Let’s jump in and see what he has to say.

He starts by stating that any acceptance of tech in the classroom as “a necessary and beneficial evolution” is not only a lie, but worse.

Tech in the classroom not only leads to worse educational outcomes for kids, which I will explain shortly, it can also clinically hurt them. I’ve worked with over a thousand teens in the past 15 years and have observed that students who have been raised on a high-tech diet not only appear to struggle more with attention and focus, but also seem to suffer from an adolescent malaise that appears to be a direct byproduct of their digital immersion. Indeed, over two hundred peer-reviewed studies point to screen time correlating to increased ADHD, screen addiction, increased aggression, depression, anxiety and even psychosis.

Attention and focus to what? What is “adolescent malaise”? Is “screen addiction” a real psychological condition?

Ok, what evidence do you have to suggest that the correlation supposedly conclusively established in those studies is causation for any of those diverse afflictions? And justification for the hyperbolic headline?

Well, first he spends a few paragraphs on the fact that selling technology into schools has become big business, attracting evil people like Rupert Murdoch. And on the billion-dollar Pearson/iPad project mess in Los Angeles, which was the result of poor planning and possible corruption, not the technology itself.

Yes, we know Finland’s school system is magical without lots of screens. And that there are some academics who “once believed” and now say that there is no “technological fix” for the problems of American education.

But where is the evidence that the technology itself is harmful to children? Or a “hoax”? Maybe you have to buy the book to learn the answers but this particular article does little to substantiate the provocative claims in the first few paragraphs.

He does cite some research: the high-profile paper by the Organisation for Economic Co-operation and Development (OECD) from last year that relied entirely on data from their testing program. A 2012 British meta-study of research that largely predates the iPhone. Another analysis of 50 studies that concluded “reading for pleasure among young people has decreased in recent decades”. Plus Jane Healy, who has made a career of blaming screens for all kinds of societal issues.

Bottom line for this writer: the existence of computers is to blame for any and all problems observed in classrooms.

But I have a question: Is it possible that those student struggles with attention in school and lack of progress in “learning” come not from the screens per se but from the way those devices are used in the classroom?

Maybe handing a computer to every student so they can complete the same activities and fill-in-the-blank worksheets assigned long before the internet arrived is turning them off from learning.

Could it be that sticking students in front of computers to work their way through “personalized” lessons is not the magic solution promised by the vendors? Not to mention boring.

Is it possible that constantly using the screens for kids to take meaningless-to-them standardized tests is one cause of that “malaise”?

Although this particular piece-of-crap article is little more than the usual attempt to blame devices for human failings, I do completely agree with the basic premise of the headline.

For most of the past two decades, we have spent a huge amount of money to put computers, software, and networks into US classrooms, and a large chunk of that money has been wasted on devices that should never have been purchased in the first place (I’m looking at you, interactive whiteboards!). We have been throwing devices into classrooms with the expectation that their presence would lead directly to a significant increase in teacher proficiency and student learning. Even when many districts, like the one I used to work for, also put significant resources into providing training and technical support for teachers, the results were far less than transformational.

Why?

Well, the screens certainly made schools look different to the administrators and politicians dropping in for photo ops. But anyone who bothered to look past the shiny objects found little difference from classrooms of ten, twenty, fifty years ago.

The curriculum, pedagogy, instructional materials, assignments, the fundamental teacher-directed structure of American education: all of it changed very little. We did, of course, add an expensive, regressive standardized testing requirement, and in many places, used it to justify the continued purchase of computers. But that testing only served to narrow the focus of instruction to the few tested subjects and made automating the process of preparing for the exams even more attractive.

In other words, we spent that $60 billion from the headline buying devices with the power to connect classrooms, teachers and kids to the world’s information, and to each other. Machines that offered students incredible potential for developing and demonstrating their creative skills. Then we worked overtime using them to maintain a traditional concept of learning that this writer, and the academics he cites, fondly remember from their time in school.

At one point later in the article, the writer cites a line from a 16-year-old report from the Alliance for Children: “School reform is a social challenge, not a technological problem.” That is the key point completely missed by this writer and many education reformers.

However, our administrators and political leaders find it much easier, not to mention cheaper, to buy the technology rather than address those social challenges, including poverty, nutrition, and a tremendous inequity of access to learning resources of all kinds.

You can call the $60 billion spent on technology for the classroom a “hoax” if you like, but don’t blame “screens” for the many educational and social maladies this writer and others observe. The fault lies in a lack of willingness by all of us to address and solve the real problems.

Science Proves Reading the Web Makes You a Bad Writer

Yet another piece of research that’s supposed to show that the internet is making us dumb. Or something like that.

A recent study in the International Journal of Business Administration looked at MBA students at the University of Florida to determine how reading habits shape writing ability. Scientists analyzed writing samples from student cover letters, which were believed to be the most telling form of a student’s best writing – no one wants to make a bad impression on a cover letter – to determine complexity and style.

The study found students who consume primarily digital content (such as Reddit and Buzzfeed) had the lowest writing complexity scores, while those who often read literature and academic journals had the highest levels of writing complexity.

I haven’t read the actual study (why bother when websites like this one will summarize everything I need to know into a short, clickbait headline?), but I wonder if the researchers are really claiming a direct connection between reading mostly digital material and lower writing skills.

Or could it be that the content in this situation is more important than the format? That people who read more complex content learn to create more complex writing?

But then I’ve never been much of an academic so I could be wrong.

More, But Not Necessarily Better

EdWeek, a tabloid that’s supposed to be about education but more often deals with business, posts about a report that claims spending on “edtech” has hit a worldwide total of $15 billion. According to a UK consulting firm, “growth has been strong”, with edtech purchases up by $4.5 billion over the past four years.

So, how do they define “educational” technology? And, more importantly, how is all this new tech being used to enhance student learning?

The press announcement doesn’t say much about the latter but did list the following equipment that was included in their research:

mobile computing — notebooks, netbooks, Chromebooks and tablets; classroom displays — interactive whiteboard, interactive flat panel, interactive projectors, standard projectors, attachment devices; complementary devices — visualisers, lectern panels/pen displays, voice amplification, voting systems and slates/tablets.

Note that most of that hardware is not for use by kids. And it’s likely that use of the mobile computing devices is almost completely teacher-directed as well.

It would be nice to know more about how teachers are using all this stuff but that’s probably beyond the scope of a market report like this. Besides, if you want the full details of this kind of research, you’ll have to pay for it. “Knowledge-based” consulting companies don’t work for free.

However, just reading the summary is more than a little depressing. Especially this part about interactive displays.

Futuresource found that, in countries where teachers continue to stand in front of the class for instruction, display devices are more prevalent. That accounts for the fast growth of interactive flat panels in China, and interactive whiteboards in Spain, Italy and Russia.

In Germany, so-called “visualizers” are popular. These devices take images of documents and project them onto a screen.

In 2015, over 2.5 million interactive displays were sold. The volume of interactive flat panels more than doubled, with new models and new vendors entering the market.

Those “countries where teachers continue to stand in front of the class for instruction” unfortunately still includes most of the US. Of all the devices covered in this report, interactive flat panels/whiteboards do more to lock in that model of instruction than anything else. They are a giant waste of money, funds that could be spent in far better ways, for tech as well as other learning purposes.

The power of technology available for instructional use has increased dramatically over the past decade or two, as the cost of those devices and networks has dropped equally dramatically. But as schools spend more and more on electronic stuff, we waste much of that power and its potential for changing the way kids learn.

A Little EdTech Wishful Thinking

In a post reprinted in the Washington Post’s Answer Sheet blog, Larry Ferlazzo offers Eight education predictions (and some wishful thinking) for 2016. Number seven deals with educational technology, and I think this may be some of his “wishful thinking”.

The recent OECD report pointing out the lack of success educational technology has had in improving student learning, and the growing recognition among researchers and educators that its use is not narrowing the “achievement” (or “opportunity”) gap, will result in the most serious effort yet to dramatically reassess how tech is being applied in our schools around the country. Top-down unilateral decisions will begin to give way to the “radical” idea of listening to teachers’ opinions and acting on them.

We’ve had years, if not a couple of decades, of research questioning the effectiveness of technology in schools. There’s no reason to think that the OECD report released in September (which most school & district administrators have probably not read completely anyway) will alter the current top-down decision-making process used in most districts, for pretty much everything. I certainly doubt it will give way to “listening to teachers’ opinions and acting on them”.

Major instructional technology purchasing decisions, such as the 1-1 computing program slowly lumbering forward in Fairfax County schools (aka the overly-large school district, and my former employer), are made by administrators and technical people. Teachers might be included in “focus groups” or other advisory committees, but it’s rare that anyone who actually works with students (or the students themselves) will have any substantial influence over what tech is approved and purchased for classroom use.

And that’s not about to change in 2016.