Still Fighting the Native Myth

Northwestern University is concerned about how its freshmen present themselves on the web and offers a course in the Communications Studies Department called “Managing Your Online Reputation”.

While the course developed by Ms. Hargittai and Mr. King uses cautionary tales, it also seeks to train students to build robust, productive online identities through which they can engage topics of interest, command audiences, and advance their careers. The course draws on social-science research about reputation and crisis management. The professors believe it to be one of a kind.

Is this class really necessary? These days most kids are “digital natives”, right? They start using communications devices as babies and quickly learn to work those social media sites that totally baffle their parents. Don’t they?

But Ms. Hargittai and Mr. King, among others, say that the familiar narrative about tech-smart young people is false. Their course grew out of years of research conducted by Ms. Hargittai on the online skills of millennials. The findings paint a picture not of an army of app-building, HTML-typing twenty-somethings, but of a stratified landscape in which some, mostly privileged, young people use their skills constructively, while others lack even basic Internet knowledge.

I’ve ranted on this topic a few times over the years, and the bottom line is that the “digital native” myth is one of the most detrimental edtech clichés of the past decade. Far too often it is used by schools and parents as an excuse to avoid the kind of instruction being offered in this course.

So, why is it “one of a kind”? More importantly, why doesn’t this kind of instruction start in high school?

Blame the Technology… Again

I really hate when popular media report research findings with headlines like this: “Students’ use of laptops in class found to lower grades”. Too many people won’t get past that blanket statement, never questioning the kind of superficial research behind it.

For the study, published earlier this year in the journal Computers & Education, research subjects in two experiments were asked to attend a university-level lecture and then complete a multiple-choice quiz based on what they learned.

The results were pretty much what you might expect.

Students who took notes on laptops and were also asked to “complete a series of unrelated tasks on their computers when they felt they could spare some time”, such as searching for information, did worse on the quiz than those who didn’t do any of that stuff.

In a second part of the experiment, those who took paper and pencil notes while surrounded by other students working on computers did even worse.

Of course, the implicit assumptions here are that lectures are an important vehicle for learning, that a multiple-choice quiz is a valid assessment of that learning, and that use of the technology was the primary factor in the low scores.

I wonder how the results would have differed if the researchers had divided the subjects into two groups: those who were interested in the subject matter, and those who couldn’t care less and were only participating for the twenty bucks.

Ok, without any kind of research to back it, I’m going to hypothesize that the single biggest factor in student learning is some kind of connection to the material. With or without a laptop.

Setting a Path Early in Life

A recent post by one of our elementary principals has been stuck in my head for a couple of days, and I’m not entirely sure why. It’s about an activity in her school called “College Begins with Kindergarten” in which the kids learned about various “helper jobs” in the community (examples offered: doctors, nurses, teachers).

Now I certainly believe a basic understanding of those roles should be part of the school experience from the very beginning. But then students were asked to consider what they might study in college and to create their own future diplomas, complete with a statement of the subject in which they would major.

While there are two pieces to this assignment that I find troubling, the first is more of a question than a quibble. I wonder if the kids in this particular class were asked to consider more common but less stereotypical “helper jobs”, ones someone in their family might hold, such as plumber, auto mechanic, or store clerk, or even one unique to the DC area, lobbyist.

However, beyond the potential lack of inclusiveness, what bothers me more is that an activity like this seems to be telling kids at the very beginning of their formal schooling that college is the only acceptable path to follow at the end of it, more than a decade later. Are we starting the traditional college-is-the-only-way indoctrination too early, long before kids have any kind of clear understanding of their own talents and interests?

Since I’ve never taught elementary students, I’m sure someone can tell me why I’m wrong about this rant.

Is That The Only Option?

This past weekend, the New York Times posted a very long but interesting look at the growing debt being accumulated by recent college graduates, as well as those who drop out. It’s a good overview that touches on some of the major problems, including deceptive advertising (aka recruiting) and the somewhat shady for-profit college industry.

However, I was struck by a comment made by one member of the state House of Representatives in Ohio, who also happens to be a current college student with lots of loans: “students need to understand that attending college is not an entitlement”.

Maybe not. But if you look at it through the eyes of most high school students and their parents, we’ve made college attendance something of a societal inevitability.

First, you have politicians from the president on down framing increased college attendance and graduation as vital to rebuilding the nation’s economic structure. It’s a matter of world competition! The Obama administration has established a goal to make the United States “first among developed nations in college completion”. Even many of the legislators voting to cut support for both students and schools make the same argument.

Then there is the culture and structure of our K12 schools where, at least in this area, the message is drilled into the kids almost from the first day of Kindergarten that the only goal worth pursuing after graduation is college. Almost everyone gets funneled into a “college prep” schedule with no consideration for any other post-high school path, and certainly little for the interests and needs of the individual.

So, what’s the choice? Skip college and miss getting that “good job” (not to mention being considered a “failure” by the popular culture) or go and be saddled with a huge debt, even if you “settle” for a state school.

If we as a society really believe that a college degree will benefit both the country in the long run and almost every high school graduate, then we have an obligation to cover the fundamental costs. You cannot reconcile a societal norm of every kid going to college with slashing the support needed to make that happen.

Of course, as with everything else we think is important, that’s going to take money. Not to mention some major restructuring in the way that colleges and universities do business (and higher education is very much a business), starting with separating out the stuff that has little to do with getting a good education (high profile athletic programs that are little more than pro farm teams leap to mind first).

But if “college for all” is just political talk, if our “leaders” are not willing to make some hard budget choices to make it happen, then let’s stop feeding that illusion to our kids. Instead, provide students with multiple options and help them find other, less expensive and possibly more satisfying, avenues to follow after high school graduation.

Actually, that second option is an excellent one anyway.

Clicks Instead of Bricks

The New York Times this morning has an interesting overview of the open courseware movement that’s rapidly expanding at the college level around the world. It’s what one speaker at a recent conference on the topic termed investing in “clicks instead of bricks”.

MIT and Stanford have been pioneers in making many of their undergraduate courses available to anyone who wants to participate, without the cost but also without the credentials. But they’re no longer alone: according to another conference participant, more than 21,000 courses from universities on six continents are now available, with more being added every week.

Despite the courses being free to students, someone must see a business model here. Two companies have recently been spun off from Stanford’s open online courses, and one of them just raised $16 million in venture capital.

I’m not sure I completely buy the claim of a representative from one open university who says “you don’t need a teacher for learning”. Some people do, and it often depends on the complexity of the subject matter, as with medicine or engineering.

However, for many of the general and introductory courses that fill undergraduate schedules, a self-paced online version is an option that needs to be available to students.

And a similar approach could work for many, if not most, high school courses. Certainly not for all students, just like independent learning isn’t right for all those in college, or all adults for that matter. But maybe we need to offer kids the choice to opt out of live versions of required courses, especially those that are not part of their spheres of interest.

Anyway, with the rapid growth and acceptance of college-level open courses, a thought comes to mind.

I wonder when we will reach the point where the learning a person gains from participating in a class becomes more important than the credentials awarded for attending in person.