THE IMPACT OF COMPUTER SCREEN RESOLUTION SETTING ON STUDENT COMPUTER-BASED HANDS-ON TASK PERFORMANCE IN AN INTRODUCTORY INFORMATION SYSTEMS COURSE

Over the past several decades, considerable research has explored the possible influence of various computer-based testing formats on student performance. Indeed, a simple search of the ERIC database using the descriptor “computer assisted testing” for the years 1988 through 2008 returned over 1,800 citations. The thrust of many early studies was assessing the consistency of student performance on computer-based and paper-and-pencil versions of the same test. More recently, researchers have called for studies exploring the potential impact of computer-based test interface configurations on student performance. Thus, the purpose of this posttest-only experimental control group design study was to determine whether there was a significant difference in student hands-on task performance based on computer screen resolution setting. Study results indicate that student hands-on task performance is not significantly affected by computer screen resolution setting. The same results were found when gender was included in the analysis.
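To make the design concrete, the sketch below shows one plausible way such a posttest-only comparison could be analyzed: a two-way ANOVA testing hands-on task scores against screen resolution group and gender. This is not the authors' actual analysis; the data, column names (score, resolution, gender), and the specific resolution levels are hypothetical, chosen only to illustrate the general approach.

```python
# Hedged sketch of a two-way factorial ANOVA for a posttest-only design:
# score ~ resolution x gender. All values and labels are illustrative.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical posttest data: one hands-on task score per student,
# with the resolution setting each was randomly assigned to and gender.
data = pd.DataFrame({
    "score":      [82, 75, 90, 68, 88, 79, 85, 72, 91, 77, 84, 70],
    "resolution": ["800x600"] * 6 + ["1024x768"] * 6,
    "gender":     ["F", "M"] * 6,
})

# Fit a factorial model with main effects and the interaction term,
# then produce the Type II ANOVA table (F statistics and p-values).
model = ols("score ~ C(resolution) * C(gender)", data=data).fit()
anova_table = sm.stats.anova_lm(model, typ=2)
print(anova_table)
```

A nonsignificant F for the resolution main effect (and for the resolution-by-gender interaction) would correspond to the paper's reported finding that screen resolution setting did not significantly affect task performance, with or without gender in the model.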
