Challenging Credibility

I guess I didn’t stay away long enough to avoid Newsweek’s annual cover story defining America’s “Best” High Schools.

That “best” ranking, of course, is based on the tenuous (and that’s being generous) assessment tool known as the “challenge” index, which assigns each school a number based solely on the ratio of AP/IB/Cambridge tests taken to the number of graduating seniors.

No factoring in how well students actually did on those tests (or any other academic criteria). Ignore the quality of arts programs. Dropout rates are irrelevant. And forget completely about students in vocational or any other programs that don’t involve college prep.

Schools rise to the top of this pile if they get kids to take tests.  Lots and lots of tests.

Which results in totally meaningless scores that often produce headlines in local papers, sometimes for very strange (and somewhat amusing) reasons.  Such as this dichotomy in Houston:

Newsweek has come out with its latest ranking of the nation’s best high schools, and the Houston school district is crowing that a record number of HISD highs made it.

The usual suspects are there — DeBakey, Carnegie, Bellaire and Lamar — but joining the list this year are 11 others, including Waltrip, Chavez, Sharpstown, Milby and — WTF? Sharpstown?

The same Sharpstown that is on quite another list — HISD superintendent Terry Grier’s “Apollo 20” list of failing schools? (Lee HS, too!)

Newsweek says Sharpstown HS is among the best in the country while the superintendent says it’s one of the worst in his district.

Who’s right?

And I wonder how many other schools racked up enough tests given to make this farcical “best” list while still failing to educate the majority of their students.

Aiming For a Higher Level

In his Monday morning Post education column, Jay Mathews relates the story of a disagreement between a teacher and his principal over the issue of student cheating.

The teacher, an AP US History instructor in DC, explained during his evaluation conference the steps he took to discourage copying during tests, which included creating multiple versions of the exam and printing the pages in a smaller font.

His principal was not especially impressed.

“You are creating an expectation that students will cheat,” Martel [the teacher] recalls Cahall [the principal] saying. “By creating that expectation, they will rise to your expectation.”

When I asked Cahall about it, he did not deny that he said it. His intention, he said, was not to prohibit Martel’s methods but to urge him to consider another perspective.

“I am not opposed to multiple versions of a test or quiz; it is standard operating procedure for every type of testing program,” the principal said in an e-mail to me. “Instead, I would prefer that teachers use more rigorous assessments when possible, that require written responses and higher levels of thinking. In addition to being more challenging and requiring a sophisticated skill set, these types of assessments are also more difficult for students to copy.”

Mathews sides with the teacher in the dispute since “questioning a teacher’s approach to cheating may be going too far”.

Especially when dealing with an AP classroom, since, of course, that program is the golden salvation of high school education.

However, in this case the principal makes the better point.

We should be asking more of students than just copying back material they’ve been given or making rudimentary connections between the facts, stuff that’s easy to rip off without detection since it doesn’t ask for any value-add from the individual.

In the larger context, we should consider that if a test, or any other assignment, is easy to cheat on, it’s likely a poor or invalid assessment of student learning.

New Decade, Same Lame Challenge

On the front page of this morning’s Post, above the masthead, in space normally reserved for major, earth-shattering events, comes the news…

[image: the Post’s front-page headline]

The 2010 “challenge” index for DC-area schools has been unleashed on the unsuspecting, and largely statistically clueless, public!

The method for computing this highly-publicized ranking of high schools hasn’t changed.

Divide the number of Advanced Placement, International Baccalaureate or other college-level tests a school gave in 2009 by the number of graduating seniors. Tests taken by all students, not just seniors, are counted.

Also not changed is the glorification of the taking of tests, while factoring in nothing about how students actually score on them.

As with the 2009 release, the list includes something called the Equity and Excellence rate, defined as “the percentage of all seniors who have had at least one score on an AP, IB or Cambridge test that would qualify them for college credit”.

Which is also not an entirely accurate number, since colleges make their own decisions about what score on an AP test will earn credit, or whether the student will instead simply be allowed to skip a similar-level prerequisite course rather than receive credit.
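For what it’s worth, the arithmetic behind both numbers is about as minimal as it gets. Here’s a rough sketch using made-up figures (not from any real school) just to show how little is actually being measured:

```python
# Hypothetical figures for one school -- illustrative only, not real data.
tests_given = 850                     # AP/IB/Cambridge tests administered, all grade levels
graduating_seniors = 400              # size of the senior class
seniors_with_qualifying_score = 120   # seniors with at least one credit-worthy score

# The "challenge" index: tests given divided by graduating seniors.
# How anyone actually scored never enters the calculation.
challenge_index = tests_given / graduating_seniors  # 2.125

# The Equity and Excellence rate: percentage of seniors with at least
# one score that would (in theory) qualify for college credit.
equity_and_excellence = 100 * seniors_with_qualifying_score / graduating_seniors  # 30.0

print(f"Challenge index: {challenge_index:.3f}")
print(f"Equity and Excellence rate: {equity_and_excellence:.1f}%")
```

That single division is the entire basis of the ranking.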

So, what exactly is the purpose of assembling the “challenge” index in the first place?

The rating is not a measure of the overall quality of the school but illuminates the one quantifiable factor that seems to reveal best the level of a high school’s commitment to preparing average students for college. [my emphasis]

The ONE quantifiable factor. I’d love to see the study supporting that contention, not to mention one supporting the notion that college is the best goal for every student.

While the Post seems to avoid a “best” tag, it remains to be seen whether Newsweek (owned by the Post), when it likely publishes the national version of the index in May, will refrain from billing Mathews’ list as the “nation’s best high schools” as it has in the past.

Ok, I know it’s probably a hopeless cause to continue ranting about this incredibly shallow assessment of high school quality year after year.

Especially since both politicians and the press seem to be obsessed with reducing everything done in school to simple, headline-friendly numbers, something for which the “challenge” index is tailor-made.

However, it would be great if more people would take a critical look at this and other hyper-simple schemes for assessing the complex process of teaching and learning.


By the way, I thought you added the possessive to a name ending in ‘s’ by simply adding an apostrophe. Or am I wrong that the proper punctuation is supposed to be Mathews’ list not Mathews’s list? I’m sure I make plenty of grammatical errors around this place, but I have an excuse. There are no highly trained and paid copy editors around here.

Debating AP

The Room for Debate section* of the New York Times web site notes that Advanced Placement programs in US high schools have “grown enormously in the past decade” and asks a couple of good questions.

Does the growth in Advanced Placement courses serve students or schools well?

Are there downsides to pushing many more students into taking these rigorous courses?

They post responses on the topic from six education “experts”, including, amazingly enough, an actual teacher.

[image: the Times’ Room for Debate post]

Actually, it’s not much of a debate, but there are some good points made on both sides. And the whole thing is worth a few minutes to read.

However, the idea repeated here that needs the greatest emphasis is that there is nothing magic about AP.

Just putting kids into the classes and having them take the tests will not improve American education, no matter how many schools bow down before the “challenge” index.

From Patrick Welsh, an English teacher at a high school a few non-rush-hour minutes up the road from here:

In part, the explosion in advanced placement test takers has been fueled by Newsweek’s annual cover story on America’s 100 Best High Schools, a listing arrived at by dividing the number of tests given at a school – regardless of the test results – by the number of students in the senior class.

Given the pressure of those rankings, maybe school administrators can be forgiven for beating the bushes to find students to take A.P. exams even if those kids do not have the remotest chance of getting the kind of score that will give them college credit. (A.P. tests are graded on a 1 to 5 scale. The most selective colleges only give credit for scores of 5, while almost no college gives credit for a 2 or a 1.)

And from Saul Geiser, a research associate at the Center for Studies in Higher Education, University of California, Berkeley:

Yet mere enrollment in AP classes (unlike A.P. exam scores) is not a good indicator of how students will perform in college. In extensive studies at the University of California, we have found that while A.P. exam scores are strongly related to student success, the number of A.P. classes that students take in high school bears almost no relationship to college performance. The key is not simply taking A.P., but mastering the material.

That last sentence pretty much says it all.


* I suppose I should mention that the Times includes this little rantfest in the education section of the Room for Debate blogroll. Still don’t understand that decision. :-)

Still Challenging The Index

A little later than usual, Newsweek in its next edition will be presenting its annual list of America’s top high schools.

“Top”, of course, is determined by Jay Mathews’ “challenge” index, a ranking based solely on the number of AP tests students take, without regard for how they score on those tests.

In his Monday Post column Mathews announces the unleashing of Index ’09 and continues with a defense of his belief in AP-for-all as a universal tool for high school reform.

I’ve offered my opinions on both the “challenge” index and the place of AP classes many times in the past (and I’ve never once called it a “formula for failure”).

But for now let’s stick with Mathews’ interpretation of what I’ve included in this little rant fest.

One of my favorite bloggers, Fairfax County instructional technology specialist Tim Stahmer of assortedstuff.com*, frequently says too many unprepared students are being channeled into AP and urged to go to college.

My response is, what harm does that do? They work harder in high school, and if they graduate still determined not to go to college, they will discover that those AP skills are just what they need to get the best available jobs or trade school slots.

For starters, I’m not the only one who thinks that too many unprepared students are counseled into AP classes. A large majority of AP teachers in surveys, including a recent study from the Fordham Foundation, agree with that statement.

As to urging students to prepare for college, I’ve never said there’s anything wrong with that. My objection is in the way that the AP program is used in most high schools.

Excessive emphasis on AP classes (which the “challenge” index enables) often means that students, especially those with talents and interests that would not be served by a college education, wind up with fewer options.

Not to mention fostering the message that students not on a “college track” are somehow inferior.

Then there’s the idea that “college-prep” skills will serve all students regardless of whether they will (or want to) attend college. It’s a nice concept but that’s not the way it works.

The secondary math curriculum in most districts, which tracks everyone towards taking Calculus, is a good example of why.

Certainly everyone needs many of the same concepts taught in the college-track classes, but many, if not most, students would be better served by an emphasis on applying those concepts rather than on the conceptual foundations that dominate college-prep curriculums.

If I were running things (yeah, right), all students would take a course in practical statistics (as opposed to the heavily theoretical AP stats curriculum).

Anyway, a similar laser-like tracking toward what the universities believe is good preparation is at the core of the other AP programs.

Finally, I have a big problem with the way the “challenge” index is used by both the media and the schools themselves as some kind of definitive measure of high school quality.

I’ve seen too many web sites and press releases from schools a short drive from here trumpeting that they are in the “Newsweek (or Washington Post) Top 100 High Schools in the US” without mentioning the “award” is strictly for students taking tests and nothing else.

There are far too many factors that go into a quality education, and this kind of single-number approach detracts from that discussion.

On a larger scale, however, the concept of AP-for-all helps to lock in the traditional view of high school as college-lite when what we need is a total rethinking of the American education system.

* What, no link? :-) BTW, full disclosure: Mathews was nice enough to send me an early look at a draft of his column which made writing this rant and posting it the same morning much easier.

Update (6/9): For those interested, the Newsweek list of “America’s Top High Schools” is now online. And one of the rotating headline features on the main page.