Data-Driven Reform Efforts Can Improve Achievement Significantly

March 15, 2012

Geoffrey Borman

Information is power.

And good information empowers educators to improve teaching and student learning.

This is what Geoffrey Borman and graduate students Deven Carlson and Michelle Robinson discovered when studying whether implementing a data-driven reform effort could bring about district-wide improvements in students’ mathematics and reading achievement. They fielded a randomized experiment in over 500 schools within 59 districts across seven states, in which approximately half of the participating districts were randomly selected to receive quarterly benchmark student assessments and extensive training on interpreting and using the data to guide reform.

They found that large-scale, data-driven reform efforts can lead to significant improvements in student achievement. Their findings will help guide the growing movement toward using data-driven reform to improve achievement outcomes.

Borman’s analyses tested the effects of the first-year components of the CDDRE treatment. The findings show significant improvements in math and some improvements in reading.

Deven Carlson and Michelle Robinson are both graduate research fellows in WCER’s Interdisciplinary Training Program in the Education Sciences. Carlson’s home department is political science; Robinson’s is sociology. The Training Program aims to prepare a new generation of education science scholars to provide solid evidence of “what works” in education.

The study examined an initiative fielded by the Center for Data-Driven Reform in Education (CDDRE) at Johns Hopkins University. The CDDRE program bridges two major approaches to low-achieving districts and schools: data-based district reforms and comprehensive school and classroom reforms. It aligns the efforts of state, district, and school-based educators with the goal of accelerating achievement in low-performing schools.

The research compared districts and schools in the CDDRE program to those that operated as usual, without benchmark assessments and associated services.

Results show that CDDRE helps school and district staff understand data on student performance, generate additional data to guide school improvement efforts, identify root causes of important problems, and select and implement evidence-based programs directed toward solving those problems.

Most of the districts and schools in the study were low performing, but in many other respects they were diverse, which strengthens the generalizability of the findings:

  • Geography: The schools and districts are spread across seven states that represent nearly every region of the country.
  • District type: The sample contained rural districts and large, urban districts.
  • Socioeconomic status: The proportion of students eligible for free or reduced-price lunch varied across districts.

The benchmark assessments monitored the progress of children in Grades 3 to 8 in mathematics and reading and guided data-driven reform efforts. The outcome measure was school-level performance on state-administered achievement tests.

The data-driven CDDRE reform model was found to have a positive effect on student mathematics achievement: Assignment to the treatment group was estimated to increase average achievement by approximately 0.06 student-level standard deviations, a statistically significant effect. In reading, the results were positive but did not reach a conventional level of statistical significance.
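To put the mathematics effect in concrete terms, a shift of 0.06 standard deviations moves an average student a couple of percentile points up the achievement distribution. The short Python sketch below illustrates that arithmetic under the simplifying assumption that achievement scores are normally distributed; it is an illustration, not the study’s own analysis.

```python
from statistics import NormalDist

# Illustrative only: translate an effect of 0.06 student-level standard
# deviations into a percentile shift for an average student.
effect_sd = 0.06  # treatment effect size reported in the study

# A student at the 50th percentile of the control distribution would,
# under the treatment, score at roughly this percentile instead:
new_percentile = NormalDist().cdf(effect_sd) * 100
print(round(new_percentile, 1))  # prints 52.4
```

In other words, under this normality assumption, the average treated student would outscore about 52 percent of untreated students rather than 50 percent.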

Taken together, the results indicate that district-level assignment to implement a data-driven reform initiative can cause increased achievement, particularly in mathematics.

Assessing the Magnitude of These Effects

Cluster-randomized designs, like this one, are becoming more common for evaluating the effects of educational interventions. Yet Borman’s is the first known evaluation in which school districts served as the unit of randomization.

Several educational interventions have been found to be effective in small-scale efficacy trials, only to produce no positive impacts when evaluated on a larger scale. Because nearly 60 school districts were randomized, Borman’s results are largely insulated from such concerns. The external validity of these results is further enhanced by the fact that the study design incorporates districts from seven states.
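A district-level cluster-randomized assignment like the one described above can be sketched in a few lines of Python. The district names, random seed, and even split below are hypothetical illustrations, not details from the study; the point is simply that entire districts, rather than individual schools or students, are assigned to a condition.

```python
import random

def assign_districts(district_ids, seed=0):
    """Randomly assign whole districts to treatment or control.

    Every school in a district inherits its district's condition,
    which is what makes this a cluster-randomized design.
    """
    rng = random.Random(seed)       # fixed seed for reproducibility
    shuffled = list(district_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2       # assign (roughly) half to treatment
    treatment = set(shuffled[:half])
    return {d: ("treatment" if d in treatment else "control")
            for d in district_ids}

# 59 districts, as in the study; the names themselves are made up.
districts = [f"district_{i}" for i in range(59)]
assignment = assign_districts(districts)
print(sum(v == "treatment" for v in assignment.values()))  # prints 29
```

Randomizing at the district level, as this sketch does, is what allows the study to attribute district-wide achievement differences to the reform rather than to school-by-school selection.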

Previous research has provided suggestive evidence that data-driven reform can produce improved student achievement. But these earlier studies were either somewhat underpowered or focused on the evaluation of a pilot program, Borman says. This study provides the best evidence to date that data-driven reform efforts, when implemented at scale, can result in substantive and statistically significant improvements in achievement outcomes.

For more see “A Multistate District-Level Cluster Randomized Trial of the Impact of Data-Driven Reform on Reading and Mathematics Achievement,” Educational Evaluation and Policy Analysis, September 2011, vol. 33, no. 3, 378-398. Available online at http://epa.sagepub.com/content/33/3/378.