Common Guidelines for Education Research and Development

A Joint Committee of representatives from the U.S. Department of Education (ED) and the National Science Foundation (NSF) began work to establish cross-agency guidelines for improving the quality, coherence, and pace of knowledge development in science, technology, engineering, and mathematics (STEM) education. The committee was formed to enhance the efficiency and effectiveness of both agencies' STEM education research and development programs, in response to recommendations from the Office of Science and Technology Policy (OSTP) and guidance from the Office of Management and Budget (OMB) (Zients, 2012). Although the committee's starting place was research in STEM, ED quickly recognized the broader applicability of the guidelines to other content areas in which it funds research and development.

Education research and development programs at NSF are distributed throughout its science and engineering directorates but are located primarily in its Directorate for Education and Human Resources (EHR). EHR's purview includes K-12 education, postsecondary education, and after-school and informal learning environments, as well as the study of science and engineering innovations that emerge from other directorates. ED's research, development, and evaluation programs are located primarily in the Institute of Education Sciences (IES) but are also represented in other offices across the Department.

The Joint Committee examined whether the agencies' expectations for the research studies they fund could be characterized in such a way as to provide cross-agency guidance for program officers, prospective grantees, and peer reviewers. A first task was to define the types of ED- and NSF-funded research that relate to the development and testing of interventions and strategies designed to increase learning. These types of research range from early knowledge-generating projects to studies of full-scale implementation of programs, policies, or practices. Importantly, the committee sought to create a common vocabulary to describe the critical features of these study types in order to improve communication within and across the agencies and in the broader education research community. Second, the Joint Committee specified how the types of research relate to one another and described the theoretical and empirical basis needed to justify each research type. The committee emphasizes the importance of proposed studies building on and referencing an evidence base and, in turn, contributing to the accumulation of empirical evidence and the development of theoretical models.

Throughout its work, the Joint Committee generally adhered to guiding principles for research that:

- poses significant questions that can be investigated empirically;
- links empirical research to relevant theory;
- uses research designs and methods that permit direct investigation of the question;
- is guided by a coherent and explicit chain of reasoning;
- replicates and generalizes …
