wasting bandwidth since 1999

Coping With Lots of Crappy Information

A couple of weeks ago, the news offered the results of the latest National Assessment of Educational Progress (NAEP), the largest national reading and math exam. The statistics showed that test scores in those areas had shown little improvement over the past two years, despite the constant drumbeat of No Child Left Behind.

But according to the standardized testing programs in many states, which under NCLB must be blessed by the federal government, kids in those areas are doing just great.

Take Florida, where 30 percent of fourth graders were proficient in reading on this federal test in 2005. Yet on the Florida state test, 71 percent of fourth graders were proficient in reading in 2005. It’s a big difference: Are nearly three-quarters of your fourth graders proficient? Or less than a third? And it’s typical.

On the 2005 federal test, 33 percent of New York’s fourth graders were proficient in reading; on New York’s 2005 state test, 70 percent were. In Tennessee, 27 percent of fourth graders were proficient in reading on the federal test; 87.9 percent on the state test.

Some of us might look at this big discrepancy and think something’s wrong with at least one of the testing programs. But not the wise folks at the Department of Education.

Federal officials don’t see it that way. "To us, more information is better," said Tom Luce, an assistant secretary in the federal Department of Education. "People say, ‘Well, it’s confusing.’ But I think the American people can deal with two different pieces of information at once."

Mr. Luce said that when residents in states like New York, Tennessee and Florida see such big discrepancies, "they’re going to ask questions."

He added, "That’s why the NAEP test is there, to shed light."

And that’s why DoE assistant secretaries are there: to spread crap.

More information IS better, but only if you have valid data to start with. In this case, even the most optimistic analysis should tell you something’s wrong with at least one set of results. I doubt the average parent understands these two different pieces of information well enough to ask the right questions.

There are plenty of experts in the field of education who will come up with wonderful statistical explanations for the huge differences in test results. I’m certainly not one of them, so I’ll offer a much simpler one.

The state tests mandated by NCLB each spring are an immediate threat hanging over the schools. As a result, teachers teach to that test, not NAEP.

Of course, that still leaves the open question of whether any of these standardized tests measure genuine learning. You know. The stuff kids will actually need to succeed in the real world where multiple choice assessments don’t exist.

nclb, standardized tests, naep


  1. Chris C.

I think you’re right – teachers obviously feel more pressure to address whatever the state tests are going to cover. The NAEP and state tests don’t necessarily overlap in all areas.
Additionally, the NAEP can simply be harder or easier than the state tests. And the level of difficulty on state tests often changes for political reasons (got to get to 100% proficiency some way, after all). Those are the two biggest reasons for the discrepancies.

    One note – NAEP also assesses student reasoning through open ended items. It’s not quite all multiple choice.

  2. acceler8

You have to keep in mind what these tests are designed to do. Students will score differently on different tests because they are, well, different. Comparing a group of students’ performance on two different tests tells you something about the tests, not the students. The virtue of the NAEP is that you can compare different students on the same test at a national scale. This is something you cannot do with the state tests. You are right to say that the data do not support (and may even confuse) some conclusions, but that does not render the data completely useless.

© 2021 Assorted Stuff