Utilization of a Think-Aloud Protocol to Cognitively Validate a Survey Instrument Identifying Social Capital Resources of Engineering Undergraduates

The use of verbal report (e.g., "think-aloud") techniques in developing a survey instrument can be critical to establishing the instrument's cognitive validity, which helps ensure that participants interpret and respond to survey items in the manner intended by the survey designer(s). The primary advantage of a verbal cognitive validation protocol is that it provides evidence, before the instrument is administered to a large sample, that participants interpret survey items in the way the researcher intended. Think-aloud protocols have been used to accomplish different goals in a variety of fields, including engineering education, where they are commonly employed in problem-solving research. However, the engineering education literature offers few resources on applying these protocols to large-scale survey development and refinement. In this paper, we present a protocol based on elements of think-alouds conducted both inside and outside the engineering education domain. By presenting results and examples from our own experience using this protocol, we aim to provide a cognitive validation model that may be useful to engineering education researchers designing their own survey instruments. Following the model outlined in this paper, participants in our study verbalized several issues of concern while interacting with our web-based survey. These issues ranged from minor grammatical errors to serious cognitive mismatches that caused participants to interpret and/or respond to items differently than we intended. Participants were asked for suggestions to correct these issues, and changes were made to the survey based on this feedback. The revised survey was retested in two additional iterations of think-aloud sessions with new participants to ensure the revisions successfully remedied the issues encountered by previous participants. Finally, the refined survey was pilot tested and subsequently reviewed by an expert in the field before being administered at seven institutions. This paper includes evidence and specific examples of how the cognitive validation model resulted in a refined survey instrument, as well as recommendations for other engineering education researchers wishing to employ similar techniques in designing and validating survey instruments.

Introduction and Motivation

Much of the extant literature on establishing the reliability and validity of survey instruments focuses predominantly on statistical methods, which rely on rigorous computation of coefficients such as Cronbach's alpha to verify that an instrument has achieved at least a minimum acceptable level of reliability and/or validity on measures such as construct validity or internal consistency (e.g., Eris and colleagues [14]). Such statistical methods can establish a case for whether or not an instrument consistently and appropriately measures participant responses to items across a variety of constructs, which includes (but is not limited to) ensuring that the items within the instrument have appropriate coverage of the relevant content, are scored or evaluated consistently, and/or are
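As a point of reference (this formulation is standard in the psychometrics literature rather than specific to the study above), Cronbach's alpha for a scale of $k$ items is

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)$$

where $\sigma^{2}_{Y_i}$ is the variance of respondents' scores on item $i$ and $\sigma^{2}_{X}$ is the variance of the total (summed) scores across respondents. A commonly cited rule of thumb treats $\alpha \geq 0.70$ as the minimum acceptable level of internal consistency, though the appropriate threshold depends on the instrument's purpose.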

References

[1] G. Willis, et al., The use of verbal report methods in the development and testing of survey questionnaires, 1991.

[2] J. C. Flanagan, The critical incident technique, Psychological Bulletin, 1954.

[3] S. Huttly, et al., Psychometric and cognitive validation of a social capital measurement tool in Peru and Vietnam, Social Science & Medicine, 2006.

[4] L. K. Woolsey, The Critical Incident Technique: An Innovative Qualitative Method of Research, 1986.

[5] F. P. McMartin, et al., Methods to improve the validity and sensitivity of a self/peer assessment instrument, IEEE Transactions on Education, 2000.

[6] N. Salkind, Tests & Measurement for People Who (Think They) Hate Tests & Measurement, 2005.

[7] S. A. Karabenick, et al., Cognitive Processing of Self-Report Items in Educational Research: Do They Think What We Mean?, 2007.

[8] E. R. Hoole, et al., What Are You Thinking? Postsecondary Student Think-Alouds of Scientific and Quantitative Reasoning Items, The Journal of General Education, 2006.

[9] N. Lin, Social Capital: A Theory of Social Structure and Action, 2001.

[10] J. R. Lewis, Sample Sizes for Usability Studies: Additional Considerations, Human Factors, 1994.

[11] L. Faulkner, Beyond the five-user assumption: Benefits of increased sample sizes in usability testing, Behavior Research Methods, Instruments, & Computers, 2003.

[12] R. N. Bolton, Pretesting Questionnaires: Content Analyses of Respondents' Concurrent Verbal Protocols, 1993.

[13] R. A. Virzi, Refining the Test Phase of Usability Evaluation: How Many Subjects Is Enough?, 1992.

[14] O. Eris, et al., Development of the Persistence in Engineering (PIE) Survey Instrument, 2005.

[15] J. M. Trenor, et al., The Relations of Ethnicity to Female Engineering Students' Educational Experiences and College and Career Plans in an Ethnically Diverse Learning Environment, 2008.

[16] J. Trenor, A phenomenological inquiry of the major choice processes of an overlooked demographic: First generation college students in engineering, 2009.

[17] J. Trenor, et al., Influences for selecting engineering: Insights on access to social capital from two case studies, Proceedings of the 38th Annual Frontiers in Education Conference, 2008.

[18] J. Trenor, et al., First generation college students in engineering: A qualitative investigation of barriers to academic plans, Proceedings of the 38th Annual Frontiers in Education Conference, 2008.

[19] T. S. Harding, et al., A case study on research in engineering education: designing, testing, and administering the PACES-2 survey on academic integrity, Proceedings of the 35th Annual Frontiers in Education Conference, 2005.

[20] J. M. Converse and S. Presser, Survey Questions: Handcrafting the Standardized Questionnaire, 1986.

[21] K. A. Ericsson and H. A. Simon, Protocol Analysis: Verbal Reports as Data, 1984.

[22] M. van der Gaag, Measurement of individual social capital, 2005.

[23] J. Payne, Thinking Aloud: Insights Into Information Processing, 1994.

[24] G. Willis, et al., Research Synthesis: The Practice of Cognitive Interviewing, 2007.