Saturday, May 27, 2006

Do proficiency levels give useful information?

Wisconsin has finally released its test scores from last year, described in an article in the Journal Sentinel. One odd headline, which I cannot find online: "Some schools above average." I would guess that about half were above average.

The tests were changed sufficiently this year that the Department of Public Instruction felt the need to reset the proficiency levels. Apparently they did this by taking all the test scores and setting the cutoff points between proficiency levels so that the same percentage of students fell into each category as last year. This seems like an efficient way to do it, but it sidesteps the question of whether students are doing better or worse.
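To make the procedure concrete, here is a minimal sketch of that kind of equipercentile resetting. The scores and category percentages are invented for illustration, not DPI's actual figures; I am assuming the familiar four-level scheme (roughly minimal, basic, proficient, advanced).

```python
import numpy as np

def reset_cutoffs(new_scores, cumulative_pcts):
    """Return cut scores on the new test such that each proficiency
    category captures the same share of students as last year.

    cumulative_pcts: percentage of students at or below each cutoff
    last year, e.g. [15, 40, 85] for four categories (hypothetical).
    """
    return [float(np.percentile(new_scores, p)) for p in cumulative_pcts]

# Invented data: 1,000 scores on the new test's scale.
rng = np.random.default_rng(0)
new_scores = rng.normal(500, 100, size=1000)

# Suppose last year 15% scored minimal, 25% basic, 45% proficient,
# and 15% advanced (again, hypothetical figures).
cutoffs = reset_cutoffs(new_scores, cumulative_pcts=[15, 40, 85])
print(cutoffs)  # three score thresholds separating the four categories
```

By construction, this year's percentages match last year's, whatever actually happened to student learning.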

I think proficiency levels are a pretty poor way of reporting test scores. First, it is not at all clear what they mean. Proficiency for what? With many jobs it is possible to define a needed level of competence, but it is not clear what the standard for students should be. The aim of most education is to prepare the student for more education. Thus the percentage of students at a given proficiency level may tell more about the people setting the levels than about the students. A recent comparison of NAEP and state proficiency standards, based on the percentage of students judged proficient on the two exams, found huge variations between states. Wisconsin rated near the bottom with a C-, compared to Massachusetts with an A. Similarly, the frequently repeated claim that students do worse in high school than in elementary school cannot be demonstrated using proficiency levels. It is just as likely that the explanation lies in who sits on the committee setting the cutoffs; at the high school level the members are likely to include teachers who specialize in the subject and are therefore more demanding.

A second problem is that rating schools by the percentage proficient creates bad incentives. Rather than working to improve every student's score, schools have an incentive to concentrate on the students who are close to the cutoff point. This may contribute to the common complaint of parents of able students that their children don't feel challenged.
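A toy calculation makes the incentive visible; all of the numbers here are invented. Nudging two "bubble" students over the cutoff moves the percent-proficient figure far more than a larger, evenly spread gain does:

```python
import numpy as np

CUTOFF = 500  # hypothetical proficiency cut score

scores = np.array([350, 475, 495, 510, 650])  # invented student scores

def pct_proficient(s):
    return 100 * np.mean(s >= CUTOFF)

# Strategy A: push only the two students just below the cutoff over it.
bubble = scores + np.array([0, 30, 10, 0, 0])

# Strategy B: raise every student's score by 20 points.
everyone = scores + 20

print(pct_proficient(scores))    # 40.0
print(pct_proficient(bubble))    # 80.0 -- mean gain of only 8 points
print(pct_proficient(everyone))  # 60.0 -- mean gain of 20 points
```

The school chasing the reported metric does better by ignoring both the weakest and the strongest students.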

There are a number of ways that test scores can help show how students are doing and encourage improvement. One is to compare individual students' scores from one year to the next to make sure each student is progressing. Another is to make comparisons between schools, from the local level to the international (international comparisons do seem to show that American students fall further behind the longer they are in school). Finally, test scores can be used to search for factors, such as particular curricula, that affect learning.
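As a sketch of the first approach, here is what a simple year-over-year gain comparison might look like. The scores and the expected-gain figure are invented, and this is a far cruder calculation than a real value-added model of the kind MPS uses.

```python
import numpy as np

# Invented matched scores for the same five students in two successive years.
year1 = np.array([420, 455, 500, 540, 610])
year2 = np.array([450, 470, 535, 560, 660])

gains = year2 - year1
EXPECTED_GAIN = 30  # hypothetical district-average yearly gain

for i, g in enumerate(gains):
    flag = "on track" if g >= EXPECTED_GAIN else "needs attention"
    print(f"student {i}: gained {g} points ({flag})")

# A school-level summary: average gain relative to the expected gain.
print("school value-added:", gains.mean() - EXPECTED_GAIN)
```

Unlike percent proficient, a measure like this credits growth for every student, wherever they start.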

Sunday, May 07, 2006

Schools facing more scrutiny

Two recent articles (here and here) reported that some MPS schools have been singled out for intervention based on both low overall test scores and low scores on the district's value-added measurements. These schools would each get a district-appointed "instructional facilitator."

This is the first time poor test-score results have had real repercussions. Until now, low enrollment, rather than low achievement, has led to schools closing or having their budgets cut, although presumably a reputation for poor achievement could hurt enrollment. Ideally, this move will help schools focus more strongly on figuring out how to improve student achievement.

Neither article discusses the approach the facilitators will take. This could be crucial. Too often, districts have pursued approaches that have not been shown to be effective.