Using an Instrument Blueprint to Support the Rigorous Development of New Surveys and Assessments in Engineering Education

Many sound methods exist for creating the items or questions that make up educational surveys and assessments. These methods include the use of content experts, reviews of existing instruments, and lists of behaviors and descriptors commonly associated with the construct(s) we wish to assess. Unfortunately, item creation sometimes becomes overly dependent on a researcher’s personal attitudes about the construct(s) being tested, or on “borrowing” items from other instruments that may or may not be sound measures of the construct(s) of interest. These risks are particularly acute for new researchers in engineering education, who may have little experience with best practices in social science research. One way to support best practices in the development of new surveys and assessments is to use an instrument blueprint to guide both the creation of items and the collection of validity evidence. This paper outlines a process for instrument blueprint creation and content validation that supports best practices in educational assessment. Grounded in Messick’s unified theory of validity, the instrument blueprint incorporates multiple resources into item construction, including: (1) the views of content experts; (2) research from the relevant domain of interest; (3) reviews of existing instruments; and (4) the expertise of the research team. This paper uses the development of a new instrument to measure engineering innovativeness as an illustrative example of the blueprinting process. Our new instrument will assess 20 characteristics of innovative engineers identified through in-depth studies of expert engineering innovators in previous research. This work highlights how a systematic process for item construction can transform current methods of assessment in engineering education.
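
To make the content-validation step more concrete, the sketch below computes Lawshe’s content validity ratio (CVR), the quantitative index described in the Lawshe (1975) article in the reference list and one common way an expert panel’s item ratings can be turned into validity evidence. The example is illustrative only and is not drawn from the paper itself; the Python function name, rating labels, and hypothetical eight-person panel are assumptions made for the sketch.

# Illustrative sketch only (not the authors' method): Lawshe's content validity
# ratio (CVR) quantifies expert-panel agreement that an item is essential to the
# construct being measured. CVR = (n_e - N/2) / (N/2), where n_e is the number
# of panelists rating the item "essential" and N is the total panel size.
from typing import Sequence


def content_validity_ratio(ratings: Sequence[str]) -> float:
    """Compute the CVR for a single draft item from panelist ratings.

    Each rating is expected to be one of: "essential",
    "useful but not essential", or "not necessary".
    """
    n_total = len(ratings)
    if n_total == 0:
        raise ValueError("At least one panelist rating is required.")
    n_essential = sum(1 for rating in ratings if rating == "essential")
    return (n_essential - n_total / 2) / (n_total / 2)


if __name__ == "__main__":
    # Hypothetical panel of eight content experts rating one draft item.
    panel = ["essential"] * 6 + ["useful but not essential", "not necessary"]
    print(f"CVR = {content_validity_ratio(panel):.2f}")  # (6 - 4) / 4 = 0.50

CVR values range from -1 to +1; positive values indicate that more than half of the panel rated the item essential, and items falling below Lawshe’s critical value for the given panel size would typically be flagged for revision or removal from the blueprint.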

[1]  Jessica Menold et al., Characterizing engineering innovativeness through a modified Delphi Study, 2014 IEEE Frontiers in Education Conference (FIE) Proceedings, 2014.

[2]  D. Betsy McCoach et al., Development of an Instrument to Measure Perspectives of Engineering Education Among College Students, 2008.

[3]  Dan Budny et al., Assessment of the Impact of Freshman Engineering Courses, 1998.

[4]  H. Suen, Principles of test theories, 1990.

[5]  D. Campbell et al., Convergent and discriminant validation by the multitrait-multimethod matrix, Psychological Bulletin, 1959.

[6]  Paul Kline et al., A Handbook of Test Construction: Introduction to Psychometric Design, 1987.

[7]  M. Besterfield-Sacre et al., Assessing student learning in technology entrepreneurship, 2008 38th Annual Frontiers in Education Conference, 2008.

[8]  Lee J. Cronbach et al., Psychological tests and personnel decisions, 1958.

[9]  T. Cook et al., Quasi-experimentation: Design & analysis issues for field settings, 1979.

[10]  Jessica Menold et al., Collaborative Research: Identifying and Assessing Key Factors of Engineering Innovativeness, 2014.

[11]  Joanna S. Gorin et al., Improving Construct Validity with Cognitive Psychology Principles, 2001.

[12]  Steven M. Downing et al., Test Item Development: Validity Evidence From Quality Assurance Procedures, 1997.

[13]  Ş. Purzer et al., A Critical Review of Measures of Innovativeness, 2014.

[14]  S. Messick, Validity of Psychological Assessment: Validation of Inferences from Persons' Responses and Performances as Scientific Inquiry into Score Meaning, Research Report RR-94-45, 1994.

[15]  S. Haynes et al., Content validity in psychological assessment: A functional approach to concepts and methods, 1995.

[16]  Daniel M. Ferguson et al., How engineering innovators characterize engineering innovativeness: A qualitative study, 2013.

[17]  J. Loevinger, Objective Tests as Instruments of Psychological Theory, 1957.

[18]  Roger T. Lennon, Assumptions Underlying the Use of Content Validity, 1956.

[19]  松木 健一, Reconstruction of Educational Research and Teacher Education as Seen from Clinical Research: Focusing on the Teacher Education of the Faculty of Education and Regional Studies, Fukui University (Clinical Knowledge of Education), 2002.

[20]  C. H. Lawshe, A Quantitative Approach to Content Validity, 1975.