Editorial: Qualifications cloud meaning of academic test results
Published Sep 14, 2013 at 05:00AM / Updated Nov 19, 2013 at 12:31AM

It’s become a refrain: Education testing results come out, often disappointing in some way. Educators explain they don’t really mean what they seem to mean, because this or that change was made since the last time this test was given. And you can’t compare across the nation because different demographic groups are taking the test, or it’s a different test altogether. And besides, the tests don’t really test what kids are learning anyway.

If that’s all true — and clearly at least some of it is — where does that leave us? How do we evaluate our students and our education system?

The two most recent examples are the ACT college readiness results announced in August and the Oregon Assessment of Knowledge and Skills (OAKS) that came out Thursday.

On the ACT, only 26 percent of students nationwide and 31 percent in Oregon met benchmarks in all four areas of English, reading, math and science. But the meaning of the results is clouded by a change in benchmarks and the increasing number of students taking the test, which traditionally was taken only by college-bound students.

For OAKS, scores statewide fell in at least one subject at each grade, with the most serious drop in high school writing, where only 60 percent met the benchmark, 7 percentage points lower than last year. But the meaning of the results is clouded by the fact that students were allowed only one retake this year, while multiple tries were permitted in earlier years. In addition, students previously got more practice with the writing test, which was also given to freshmen and sophomores, while now only the juniors take it.

There’s at least a glimmer of hope on the horizon in the new tests being developed for the Common Core State Standards, which most states, including Oregon, are implementing. Most states have at least tentatively signed on with one of the two major groups designing tests for those standards. A uniform test that matches the standards and doesn’t change from year to year would help clarify this muddy picture.