Ok, kiddies, let’s start with a little history lesson.
Not too long ago, many middle class households in this country (like the one in which I grew up) basically got their news twice a day. In the morning, a stack of paper containing a summary of the previous day’s important events (and lots of advertising) was delivered to the house and usually scanned over breakfast.
In the evening, many of those same people gathered with the family to watch a 30-minute summary of the news considered important by the big three television networks, presented by people named Walter, Tom, Peter, or possibly Barbara or Diane.
Today, of course, it’s a well-worn cliché that we live in a 24/7 world of information, with an unknown number of websites and at least three cable television channels, all spitting out an unbroken stream of raw data.
The problem is that very little of what comes through that stream rises to the level of valuable, or even useful.
Author Theodore Sturgeon said that 90% of everything is crap (aka Sturgeon’s Law). He was responding to critics who derided the low quality of science fiction, and he argued that the vast majority of books, movies, consumer goods, and more also fell into that category. He was writing in the ’50s, but more than half a century later, we should probably raise Sturgeon’s valuation of media and products closer to 99%.
Although the newspapers, network news programs, and weekly news magazines of the past century were not perfect, they did serve as an information filter and presented a relatively accurate picture of current events, even if it took most of those organizations a long time to catch up with major societal shifts like civil rights and the Vietnam war.
I’m certainly not advocating for returning to a time when a few news outlets determined what we should know and when. Those traditional institutional filters are rapidly falling apart, and as David Weinberger, Clay Shirky, Howard Rheingold, and other smart observers of the trend are saying, we need to develop our own network of filters to help us identify that rare 1% of the data flow that actually provides knowledge, insight, and value.
I simply think there’s a case to be made for that ’70s model of twice-a-day news consumption, especially during major news events like the recent hunt for those responsible for the bombings at the Boston Marathon. While many people watched one or more of the cable news channels or refreshed their browsers at media sites for hours on end, the river of material coming from television and the web was largely a waste and regularly hit 100% crap. It’s actually worse on a “normal” news day.
In addition to creating better filters for ourselves, we also need to do a much better job of helping our kids learn how to filter the flow and separate the small nuggets of useful information from the huge sludge pile of raw data that flows from today’s media (“crap detecting” in Rheingold’s language, by way of Hemingway and Postman). There is no better skill for us to teach our students during the time they spend in our classrooms.
I was talking with someone recently who said that she was tired of the unending stream – sometimes several hundred updates per day – of negativity on her CNN RSS feed. We discussed the idea that the pressures of recency and attention-gathering on online news organizations result in every single update being broadcast. Combine that with news organizations’ tendency to lean toward the sensational and negative (“if it bleeds, it leads”), and you can get depressed pretty easily by all of the bad things occurring in the world if you’re constantly bombarded by them. We talked about her maybe moving to a news magazine like TIME which, given its weekly publication cycle and space limitations, has the ability/time/need to decide what’s significant (and what isn’t) and devote more space to that rather than the latest huge-for-5-hours-and-then-gone-again news sensation.