Digging Into The Numbers

How can a parent judge if one high school is better than another?  According to a New York Times article from this weekend, there is a “daunting” number of high-profile rankings of the best US high schools published this time of year to help them make that determination.

As a public service to aid “anxious consumers”, the writer sets out to analyze one of those lists in order to understand how anyone could “quantify something as complex and nuanced as a high-quality education”.

Sorry, “challenge” index fans. He chose the one from Newsweek.

The writer does a great job of picking apart their system, and it’s worth reading the whole thing.

But for this rant, let’s just cut to the bottom line. Where are the best schools?

Want the best high schools for your child? Move to Texas or Florida. Texas has 15 of the 100 best, placing second overall nationwide, while Florida has 10, the fourth most.

Read that again: 25% of the 100 best high schools in the country are in Texas and Florida.

This is no doubt due in good part to the reform efforts of George W. and Jeb Bush, who – like Newsweek – have made standardized test results a true measure of academic excellence.

That would be my guess.

At all costs, avoid Scarsdale, N.Y. It didn’t even make the top 1,000. Though its average SAT score of 1935 would rank it 21st among the 100 best, the school does not offer A.P. courses, and Newsweek counts A.P. data as 40 percent of the rating.

No AP courses??? I know someone who would consider that child abuse.

However, forget about the quality of Newsweek’s selection process. There’s another, far more important bottom line to consider in their decision to publish a Best American High School list (not to mention the Post’s multiple annual floggings of the “challenge”).

Given that magazines and newspapers are bleeding to death, this is the only plausible justification I can think of: Lists are cash cows.

End of story.

No Room For Creativity in School

A major story in a recent edition of Newsweek* declares that we have a creativity crisis in the US.

The authors base that screaming headline on the fact that scores on an assessment that seems to accurately predict kids’ “creative accomplishments as adults” have been declining since 1990, after rising steadily since the test was first administered in the 1950s.

The fall in creativity scores has been “most serious” in younger children, kindergarten through 6th grade.

Why is this happening?

One likely culprit is the number of hours kids now spend in front of the TV and playing videogames rather than engaging in creative activities. Another is the lack of creativity development in our schools. In effect, it’s left to the luck of the draw who becomes creative: there’s no concerted effort to nurture the creativity of all children. (emphasis mine)

The writers correctly spotlight as one of the educational causes the testing culture in most schools that squeezes out any thought of teaching art, music, dance, or other activities thought of as “creative”.

However, they also make the even more valid point that there’s very little creative about how students are taught in their “core” subjects.

Researchers say creativity should be taken out of the art room and put into homeroom. The argument that we can’t teach creativity because kids already have too much to learn is a false trade-off. Creativity isn’t about freedom from concrete facts. Rather, fact-finding and deep research are vital stages in the creative process. Scholars argue that current curriculum standards can still be met, if taught in a different way. (emphasis mine)

I would go even farther and say that, for the most part, “current curriculum standards” are crap and should be junked… but this is a good start.

The whole article is actually worth reading for some interesting information about current research into creativity and children.

However, for an even better perspective on the matter, go rewatch Ken Robinson’s classic 2006 TED talk on how schools are killing creativity in children or his return to TED from earlier this year (not quite as good) on the learning revolution.

Or read some of what Mitch Resnick has written on the subject of how children learn, especially Sowing the Seeds for a More Creative Society (pdf).

*I can’t tell you if it was the cover story since I haven’t seen a paper copy of the magazine in years.

A Very Weak Challenge Defense

In his Class Struggle column today, Jay Mathews is promoting Newsweek’s annual ranking of “best” high schools and also attempts to defend his “challenge” index that was used to compile the bogus list.

Many people prefer rating schools by average test scores, but I consider that a measure of the student family incomes, not the quality of the schools,…

So, instead of using one narrow, inadequate measure of school quality, use mine.

… I get many messages from principals, teachers and parents who like this way of assessing schools.

My index is popular so the results must be valid.

The list gets about 7 million page views a year.

And we all know popularity on the web equals quality information.

An extremely weak defense for this simplistic, misleading system.

Challenging Credibility

I guess I didn’t stay away long enough to avoid Newsweek’s annual cover story defining America’s “Best” High Schools.

That “best” ranking, of course, is based on the tenuous (and that’s being generous) assessment tool known as the “challenge” index, which assigns each school a number based solely on the ratio between numbers of AP/IB/Cambridge tests taken and the numbers of graduating seniors.
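To see just how simple that formula is, here’s a minimal sketch of the computation as described above. The function name and the example numbers are mine, not Newsweek’s; only the ratio itself comes from their published methodology.

```python
# The "challenge" index as described: the number of AP/IB/Cambridge tests
# administered divided by the number of graduating seniors. Note that
# nothing about test *scores* enters the calculation.
def challenge_index(tests_taken: int, graduating_seniors: int) -> float:
    """Ratio of college-level tests given to graduating seniors."""
    return tests_taken / graduating_seniors

# Hypothetical school: 300 seniors, 450 tests administered.
# The school's "rating" is 1.5 -- whether or not anyone passed a single test.
print(challenge_index(450, 300))  # 1.5
```

That single number is the entire ranking input: a school can raise its standing simply by administering more tests.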

No factoring in how well students actually did on those tests (or any other academic criteria). Ignore the quality of arts programs. Dropout rates are irrelevant. And forget completely about students in vocational or any other programs that don’t involve college prep.

Schools rise to the top of this pile if they get kids to take tests.  Lots and lots of tests.

Which results in totally meaningless scores that often produce headlines in local papers, sometimes for very strange (and somewhat amusing) reasons.  Such as this dichotomy in Houston:

Newsweek has come out with its latest ranking of the nation’s best high schools, and the Houston school district is crowing that a record number of HISD highs made it.

The usual suspects are there — DeBakey, Carnegie, Bellaire and Lamar — but joining the list this year are 11 others, including Waltrip, Chavez, Sharpstown, Milby and — WTF? Sharpstown?

The same Sharpstown that is on quite another list — HISD superintendent Terry Grier’s “Apollo 20” list of failing schools? (Lee HS, too!)

Newsweek says Sharpstown HS is among the best in the country while the superintendent says it’s one of the worst in his district.

Who’s right?

And I wonder how many other schools racked up enough tests given to make this farcical “best” list while still failing to educate the majority of their students.

A Very Silly Challenge

I’ve expended a lot of electrons around here ranting about the incredibly simplistic ranking that is Jay Mathews’ “challenge” index.

Now someone who writes for a real-live big city print newspaper does a great job detailing just how absurd this annual exercise really is.

But here are five reasons why Newsweek’s list isn’t worth the glossy paper it’s printed on.

First: Newsweek’s rankings are based entirely on one unreliable number.

Hey, that’s always been my big complaint.

Second: A well-intended incentive program artificially increases the number of AP tests given at TAG [Dallas’ School for the Talented and Gifted] and SEM [Dallas’ Science and Engineering Magnet].

Today, at TAG and other Dallas high schools, a passing score in math, English or science is worth $100 to a student and $150 to his or her teacher.

Certainly an incentive to get as many kids into AP classes as possible. Maybe they can luck into a few more bonus payments.

Third: Kids at TAG take a lot of AP tests — but they don’t do amazingly well on them.

And, of course, student learning has nothing to do with the quality of a high school in the alternate universe of the “challenge” index.

Fourth: Newsweek’s methodology is supposed to eliminate schools like TAG from the rankings. But TAG slips through because its SAT scores aren’t high enough.

Another totally artificial element of this computation.

Fifth: Ranking America’s high schools may be fun, but it’s a pointless exercise meant to sell magazines.

And THAT is the bottom line for Mathews’ employer, the Washington Post Company, owners of Newsweek and Kaplan, one of the largest test preparation companies in the US.

However, while I certainly agree with this writer’s conclusion that the “challenge” index is “awfully close to nonsense”, too many schools and other media outlets take this silly exercise far too seriously.

[Thanks to Alexander for the link.]