Strengthening the foundation of educational psychology by integrating construct validation into open science reform

Abstract

An increased focus on transparency and replication in science has stimulated reform in research practices and dissemination. As a result, the research culture is changing: the use of preregistration is on the rise, access to data and materials is increasing, and large-scale replication studies are more common. In this article, I discuss two problems that the methodological reform movement, given its progress thus far, is now ready to tackle, and explain why educational psychology is particularly well suited to contribute. The first problem is a lack of transparency and rigor in measurement development and use. The second problem follows from the first: replication research is difficult, and potentially futile, as long as the first problem persists. I describe how to expand transparent practices into measure use and how construct validation can be implemented to bolster the validity of replication studies.
