The Challenges of Producing Evidence-Based Claims: An Exploratory Study of NSF's Math and Science Partnership Community
WCER Working Paper No. 2009-4
Matthew T. Hora, Susan B. Millar, Jessica Arrigoni, and Kerry Kretchmar
June 2009, 53 pp.
ABSTRACT: This paper describes an analysis of the National Science Foundation (NSF) Math and Science Partnership (MSP) program's January 2008 Learning Network Conference (LNC). The study focuses on the methodologies used by the MSP community to generate evidence and seeks to understand the topics of interest to the 320 LNC participants. The data set included the 47 abstracts accepted for presentation, 68 interviews conducted during the conference, observations of all 26 breakout sessions, and 98 “think pieces” written by conference attendees. The analytic procedures included a holistic scoring rubric for the abstracts and inductive analyses of the interview, observation, and think-piece data using a structured approach to grounded theory. Findings include enthusiasm for the conference theme; a respondent focus on realistic, field-tested ways of generating evidence rather than on theory or implementation reports; and a strong assumption that student learning outcomes are the type of outcome data of primary interest to the NSF. The study also identified factors that influence the MSP community's approach to evaluation. Overall, the study is framed by observed patterns in how principal investigators (PIs) and their teams responded to project evaluation requirements. Some PIs faced a dilemma over whether their dominant operational approach should be discovery, as in science, technology, engineering, and mathematics (STEM) research projects, or the delivery of pre-specified outcomes. Other PIs and project leaders were slow to begin their evaluations and were struck by the complexity of producing sound evaluation findings.
keywords: Program Evaluation; STEM Education; Math and Science Partnerships; Systemic Reform