Are Screens Really “Bad” For Kids?

Kids are spending too much time with digital screens.

At least they are according to some high-profile studies, scary media stories about a tech backlash among “technologists” themselves, and many, many surveys of parents and teachers.

But what if they’re wrong?

In an interesting new book, “The New Childhood: Raising Kids to Thrive in a Connected World”, Jordan Shapiro, a professor of philosophy and senior fellow for the Joan Ganz Cooney Center at Sesame Workshop, argues that kids interacting with screens is simply part of growing up in a new age.

Shapiro draws on his understanding of history and centuries of philosophical thought to say that kids who spend hours engaged with devices are simply learning about and adapting to the world around them. It seems different from how things were when their parents came of age, but really it is not.

Grown-ups are disoriented because, at first glance, today’s screen media seem personal and private. When kids are watching YouTube videos or playing video games, it feels like the devices are pulling them away from the family and into a cocoon. But also, in a paradoxical twist, the screens function like portals that transport kids out of the house, beyond the perfect picket fence, and into a vast public dystopian virtual reality. Hence, parents are confused. They don’t know whether their kids are too detached or too exposed. All they know for sure is that traditional home life feels out of order; things aren’t neat and organized.

This anxiety is understandable. But remember that new technologies will always beget new routines. Your job as a parent is not to stop unfamiliar tools from disrupting your nostalgic image of the ideal childhood, nor to preserve the impeccable tidiness of the Victorian era’s home/work split. Instead, it’s to prepare your kids to live an ethical, meaningful, and fulfilled life in an ever-changing world.

Shapiro is not saying that parents (and teachers) should just hand devices to the kids and walk away. Instead, he offers some historical context on how families dealt with earlier technologies and makes the case that parents can still guide their children without heavy-handed restrictions.

He simply wants parents to take a closer look at what is going on when kids are interacting with those screens and guide them in their use of devices, video games, social media, and the rest of the digital world. “Just say no” doesn’t work here either.

This is just a small part of what Shapiro discusses in the book, and you may very well disagree with some of his conclusions. However, his thoughts on the matter are something every adult who interacts with children should read and consider.

My Head Hurts

Today I received an ad for a new book titled “How to Teach So Students Remember”. I get lots of similar promotions but there was something about this one that caught my eye. And made my head hurt.

The first line of the description of the publication makes this declaration:

Ensuring that the knowledge teachers impart is appropriately stored in the brain and easily retrieved when necessary is a vital component of instruction.

The copy goes on to promise that the author will provide you with “a proven, research-based, easy-to-follow framework for doing just that”.

There is so much wrong in the space of one small email that it’s hard to know where to start.

How about the apparent core idea that the goal of good teaching is to have students “remember” all that we “impart” to them? It reflects the traditional role of the teacher as someone who transfers information in carefully measured clumps from their tightly managed repository to the vessels sitting in the classroom.

And, in the same sentence, is the implication that success is derived from knowledge being “appropriately stored in the brain” and “easily retrieved when necessary”. I can only assume that the most important “necessary” time is the spring standardized tests.

Ok, all that snark is only based on a couple of paragraphs in an email. I haven’t read the actual book, although I did read through the first chapter posted on the web. And just that part certainly lives up to the promotion. Research-based pedagogy right out of a ’50s-era manual for running a traditional teacher-directed classroom.

I just can’t believe this is being peddled as a guide for modern teaching by one of the largest professional organizations for educators, the Association for Supervision and Curriculum Development (ASCD).


An image similar to the one at the top has been stuck in my head since the minute I read the ad copy. The picture, taken in 1943, shows a classroom in a UK Catholic school and is used under license from Wikimedia Commons.

Beware the Algorithm

Cathy O’Neil is the author of a great book with the wonderfully provocative title, Weapons of Math Destruction. But it’s not hyperbole. In it she clearly and compellingly explains how algorithms, fueled by big data, are being used every day to make decisions at important points in our lives. This includes whether or not we qualify for a loan, get a job, or go to prison.

Or create the “value add” rating a teacher receives, the one that mixes student test scores with other data points to determine whether they keep their jobs.

The companies that sell those evaluation systems want us to believe that their mathematics is completely objective and a neutral judge of people. O’Neil, however, presents a variety of examples, starting with the teacher evaluation programs sold by Pearson and others, to explain why that’s just not true. She explains how the data used to build the algorithms usually comes with biases based on how and why it was collected. And additional bias is built into the systems by the people who ultimately decide what the resulting numbers mean.

While I highly recommend reading the book, you can get the TL;DR version of O’Neil’s warning in her TED talk from last spring, recently posted to the TED site and embedded below.

Questioning the Math

Weapons of Math Destruction

Any book with a title like that is worth looking into, right?

In a recent interview, the author, Cathy O’Neil, says she started looking into how data and math were being applied to a variety of human processes following the financial crisis of 2008, in which she had a “front-row seat”. Her research found that the “very worst manifestation” of those applications was “kind of a weaponized mathematical algorithm”.

One of the first of those algorithms she investigated was the “value-add” model for assessing teachers being used in New York City and other districts. This is the process where student test scores are mixed with other data to determine which teachers are given raises, and which should be fired.

However, O’Neil, a data scientist and “former Wall Street quant”, someone who might actually understand the mathematics (and explain it to the rest of us), was denied access when she tried to examine the formula.

It’s opaque, and it’s unaccountable. You cannot appeal it because it is opaque. Not only is it opaque, but I actually filed a Freedom of Information Act request to get the source code. And I was told I couldn’t get the source code, and not only that, but I was told the reason why was that New York City had signed a contract with this place called VARC in Madison, Wisconsin, which was an agreement that they wouldn’t get access to the source code either. Nobody in the city, in other words, not the Department of Education, not the city of New York itself, could truly explain the scores of the teachers.

It was like an alien had come down to earth and said, “Here are some scores, we’re not gonna explain them to you, but you should trust them. And by the way, you can’t appeal them and you will not be given explanations for how to get better.”
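To make that opacity a little more concrete, here is a deliberately toy sketch in Python of what a value-added-style score could look like: a weighted mix of test-score growth and other data points, with weights the teacher never sees. Everything in it, the inputs, the weights, the function name, is invented for illustration; the actual formula O’Neil asked for is exactly what stayed hidden.

```python
# A purely hypothetical, toy "value-added"-style score.
# Real models are proprietary; these inputs and weights are invented
# only to show how several data points get mixed into one opaque number.

def toy_value_added_score(score_growth, attendance_rate, peer_comparison,
                          weights=(0.6, 0.1, 0.3)):
    """Combine student test-score growth with other data points into a
    single number. A teacher who sees only the result cannot tell which
    input, or which hidden weight, moved it."""
    w_growth, w_attendance, w_peer = weights
    return (w_growth * score_growth
            + w_attendance * attendance_rate
            + w_peer * peer_comparison)

# Two teachers with very different classroom realities can land on
# similar scores, and neither has grounds to appeal.
print(toy_value_added_score(score_growth=0.42, attendance_rate=0.95, peer_comparison=0.30))  # about 0.44
print(toy_value_added_score(score_growth=0.10, attendance_rate=0.90, peer_comparison=0.80))  # about 0.39
```

Even in this stripped-down version, a teacher handed only the final number has no way to know which input, or which weight, determined their fate, which is O’Neil’s point.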

O’Neil says similar secret formulas are used by financial institutions to determine who can borrow money, by courts to decide who goes to jail and how long they will stay there, by Google in presenting the results of your last search, and many more. Some of this activity is benign and some is extremely manipulative.

But which is which? And what do we do about it?

The very first answer is that people need to stop trusting mathematics and they need to stop trusting black box algorithms. They need to start thinking to themselves. You know: Who owns this algorithm? What is their goal and is it aligned with mine? If they’re trying to profit off of me, probably the answer is no.

I doubt many people actually “trust” math, certainly not after the classes they took in school. But I’m pretty sure most of us don’t understand or even know about those “black box algorithms” being used by companies and governments to analyze our information.

However, just like the data manipulation behind polling and studies that drive public policy, it’s clear we all need to start asking questions.

Anyway, I’ve just started reading the book. Based on just the introduction and the first chapter, I’ll probably have much more to rant about later.

School 2.0

As a kid I was never good at book reports in English class. I’d rather discuss the work with others who had read it than produce 500 words about it. Although my writing has improved over time, I’m still not much better at book reviews as an adult (certainly not with as much reach in this field as Bill Gates or Mark Zuckerberg).

So I’m probably breaking a few rules of the genre by saying right up front in this post that anyone who calls themselves an educator should read the new book by Chris Lehmann and Zac Chase, “Building School 2.0: How to Create the Schools We Need”. Better yet, give a copy to your principal, superintendent, school board member, and especially to anyone who fancies themselves an education reform “expert”.

The book is laid out as a collection of 95 essays, mirroring the 95 theses that Martin Luther nailed to the church door in 1517, setting off the Protestant Reformation. I’m pretty sure Chris and Zac don’t view their work with quite the same historical heft as Luther’s, but the format makes reading it fun and very accessible.

I have become very poor at reading books and other long texts in a traditional linear fashion (probably due to the nature of working on the web), and School 2.0 isn’t laid out in that format. The authors allow, almost encourage, the reader to jump around, hitting the “chapters” in the order of their choosing, which is especially nice in the ebook with its hyperlinked table of contents.

Even better, however, is that Chris and Zac didn’t intend for people to just read the book. Each of their essays includes a few leading questions and suggestions for extending the ideas through conversations with your colleagues, students, administrators, or parents. This book is more of a professional development tool than a solitary reading experience, a work intended to generate rich discussions around the questions of what school should be and how we get there. Maybe even to effect change.

Anyway, go get a copy, open it to any page and begin exploring. No matter where you land you’ll find wonderful ideas for creating the kind of schools our kids deserve.

To finish this possibly lame attempt at a book review, here are a couple of my favorite thoughts on the topic of instructional technology.

Technology is and must be a transformative element in our schools. Fundamentally, it changes the equation of why we come to school. Whereas previously, we came to school because the teacher was there, now we come to school because we are all there together. Technology can allow us to embrace a more finely honed sense of community in our schools.

Anything short of a vision of educational technology use that allows students and teachers to inquire more deeply, research more broadly, connect more intensely, share more widely, and create more powerfully, sells short the power of these tools – and more importantly, sells short the promise of learning and of school for our students.

Many more great discussion starters where those came from.