Value-Added Indicators: Do They Make an Important Difference? Evidence From the Milwaukee Public Schools

WCER Working Paper No. 2003-5

Robert H. Meyer

April 2003, 15 pp.

ABSTRACT: This paper draws on mathematics achievement data from Milwaukee middle school students to illustrate the value-added approach to estimating school performance. Value-added indicators differ from traditional attainment indicators, such as average test scores, in that they are designed to capture the contribution of schools to growth in student achievement. The indicators are therefore based on longitudinal rather than cross-sectional student data. The empirical results of this study indicate that the (value-added) productivity of Milwaukee middle schools varied substantially across schools over the period 1999 to 2000. Some middle schools generated almost twice as much achievement growth as the average middle school in Milwaukee; others produced almost no growth in mathematics achievement from seventh to eighth grade. Notably, most of the middle schools that served students with low prior achievement did a better than average job of producing growth in mathematics achievement. The overall statistical reliability of the school performance estimates was very high (89.4%). Finally, value-added indicators of school performance (based on longitudinal data) differed substantially from indicators based on average mathematics attainment. The paper explains why attainment indicators such as average test scores are, in general, highly flawed as measures of school performance.
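To make the approach concrete, a typical value-added specification (a minimal sketch only; the paper's exact model, covariates, and notation may differ) regresses each student's eighth-grade score on his or her seventh-grade score, student characteristics, and a school effect:

    y_{i,8} = \alpha + \beta\, y_{i,7} + X_i'\gamma + \eta_{s(i)} + \varepsilon_i

Here y_{i,7} and y_{i,8} are student i's seventh- and eighth-grade mathematics scores, X_i denotes student characteristics, \eta_{s(i)} is the effect of the school s(i) attended by student i, and \varepsilon_i is a residual. The estimated school effects \hat{\eta}_s, rather than average attainment levels, serve as the value-added performance indicators.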


Keywords: Value-Added; School Performance; Achievement Growth; Educational Accountability