Addressing Common Student Technical Errors in Field Data Collection: An Analysis of a Citizen-Science Monitoring Project

The scientific value of citizen-science programs is limited when the data gathered are inconsistent, erroneous, or otherwise unusable. Long-term monitoring studies with clear, consistent procedures, such as Our Project In Hawai’i’s Intertidal (OPIHI), are therefore a good model for evaluating the quality of participant data. The purpose of this study was to examine the kinds of errors student researchers made during OPIHI data collection and the factors that increase or decrease the likelihood of those errors. Twenty-four distinct error types were grouped into four broad categories: missing data, sloppiness, methodological errors, and misidentification errors. “Sloppiness” was the most prevalent category. Error rates decreased with field-trip experience and student age. We suggest strategies to reduce data collection errors that are applicable to many types of citizen-science projects, including emphasizing neat data collection, explicitly addressing and discussing the problem of falsified data, stressing the importance of standard scientific vocabulary, and giving participants multiple opportunities to practice and build their data collection skills.
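
The abstract describes the analysis only at a high level; the sketch below is a minimal, hypothetical illustration (in Python) of how errors of this kind might be tallied by broad category and how an error rate per level of field-trip experience could be computed. The record format, the category keys, and all counts are assumptions made for illustration, not data or code from the study.

```python
from collections import Counter, defaultdict

# Illustrative sketch only -- not the authors' analysis code. It assumes a
# simple record format (error category, field trips completed) and made-up
# counts, just to show how errors could be tallied by category and how an
# error rate per level of field-trip experience could be computed.

CATEGORIES = ("missing data", "sloppiness", "methodological", "misidentification")

# Hypothetical error records: (category, number of field trips completed)
errors = [
    ("sloppiness", 1), ("sloppiness", 1), ("missing data", 1),
    ("methodological", 2), ("sloppiness", 2),
    ("misidentification", 3),
]

# Hypothetical number of data sheets collected at each experience level,
# used as the denominator for an error rate.
sheets_collected = {1: 12, 2: 12, 3: 12}

# Prevalence of each broad error category
by_category = Counter(category for category, _ in errors)
for category in CATEGORIES:
    print(f"{category}: {by_category.get(category, 0)} errors")

# Error rate (errors per data sheet) by field-trip experience
errors_by_experience = defaultdict(int)
for _, trips in errors:
    errors_by_experience[trips] += 1
for trips in sorted(sheets_collected):
    rate = errors_by_experience[trips] / sheets_collected[trips]
    print(f"{trips} trip(s) completed: {rate:.2f} errors per data sheet")
```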
