A Validation Study of Two Information Complexity Questionnaires of Visual Displays

With the prevalent use of visual interfaces and the increasing demand to display more information, information complexity in human–computer interfaces has become a major concern for designers. Complex interfaces may adversely affect the effectiveness, efficiency, and even the operational safety of a system. Previously, researchers at the Federal Aviation Administration developed two questionnaires to evaluate the information complexity of air traffic control displays. This study adapted the questionnaires for commercial visual interfaces and validated them with two types of tasks on three travel websites. The questionnaires measure the information complexity of a visual display based on perceptual, cognitive, and action complexity in terms of three complexity factors: quantity, variety, and relation. The results demonstrate that the questionnaire that uses multiple items to measure each complexity construct has good reliability, validity, and sensitivity. Information complexity is also found to be negatively correlated with usability and positively correlated with mental workload. The contributions of the study include validating the theoretical framework of the information complexity concept through the use of questionnaires and providing designers with a practical tool for measuring the information complexity of visual displays for iterative improvement. © 2011 Wiley Periodicals, Inc.
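
As a rough illustration of the kind of analysis described above, the sketch below assumes a hypothetical 3 × 3 item structure (perceptual, cognitive, and action complexity, each rated for quantity, variety, and relation on a 7-point scale) and shows how reliability (Cronbach's alpha) and the reported correlations with usability and workload could be checked. The data, scale range, and scoring here are illustrative assumptions, not the authors' actual instrument or results.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) rating matrix."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical ratings: 3 complexity dimensions x 3 factors = 9 items,
# each scored 1-7 by 30 respondents (structure assumed for illustration).
rng = np.random.default_rng(0)
ratings = rng.integers(1, 8, size=(30, 9)).astype(float)

alpha = cronbach_alpha(ratings)        # internal-consistency (reliability) check
complexity = ratings.mean(axis=1)      # simple composite complexity score

# Hypothetical companion measures for convergent-validity checks.
usability = rng.uniform(1, 7, size=30)     # e.g., satisfaction rating
workload = rng.uniform(0, 100, size=30)    # e.g., subjective workload score

r_usability = np.corrcoef(complexity, usability)[0, 1]  # study reports a negative correlation
r_workload = np.corrcoef(complexity, workload)[0, 1]    # study reports a positive correlation
print(f"alpha={alpha:.2f}, r_usability={r_usability:.2f}, r_workload={r_workload:.2f}")
```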
