Learning and satisficing: an analysis of sequence effects in health valuation.

OBJECTIVE To estimate the effect of task sequence on response precision and response behavior in health valuation studies.

METHODS Time trade-off (TTO) and paired comparison responses from six health valuation studies (four US, one Spanish, and one Dutch; 22,225 respondents in total) were examined to test whether task sequence influences response precision (e.g., rounding), response changes, and median response times. Each study used a computer-based instrument that randomized task sequence within a national sample of adults, aged 18 years or older, drawn from the general population.

RESULTS For both TTO and paired comparison tasks, median response times decreased with sequence (i.e., learning) but tended to flatten after the first three tasks. Sequence had no effect on the precision of paired comparison responses; however, the frequency of rounded TTO responses (to either 1-year or 5-year units) increased with sequence.

CONCLUSIONS Based on these results, randomizing or reducing the number of paired comparison tasks does not appear to influence response precision, although considerations of generalizability, practicality, and precaution remain. Overall, participants learned to respond efficiently within the first three tasks and did not resort to satisficing, but they may have rounded their TTO responses.
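
A minimal sketch of the kind of tabulation these analyses imply, assuming a hypothetical long-format dataset with one row per respondent-task pair. The column names (respondent_id, task_index, response_time_sec, tto_years) are illustrative inventions, not the studies' actual variables, and the pandas summary below is one plausible way to compute median response time and the share of rounded TTO responses by task position.

    import pandas as pd

    def sequence_effect_summary(df):
        """Median response time and share of rounded TTO responses by task position."""
        summary = df.groupby("task_index").agg(
            median_rt=("response_time_sec", "median"),
            n_responses=("tto_years", "size"),
        )
        # Flag TTO responses falling on 1-year and 5-year units, the two
        # rounding patterns examined in the abstract (a 5-year multiple
        # also counts as a 1-year multiple, so the shares overlap).
        flags = df.assign(
            rounded_1yr=df["tto_years"] % 1 == 0,
            rounded_5yr=df["tto_years"] % 5 == 0,
        )
        summary["share_rounded_1yr"] = flags.groupby("task_index")["rounded_1yr"].mean()
        summary["share_rounded_5yr"] = flags.groupby("task_index")["rounded_5yr"].mean()
        return summary

    # Toy example (invented numbers, for illustration only):
    toy = pd.DataFrame({
        "respondent_id":     [1, 1, 1, 2, 2, 2],
        "task_index":        [1, 2, 3, 1, 2, 3],
        "response_time_sec": [42.0, 31.5, 24.0, 55.0, 33.0, 28.0],
        "tto_years":         [7.5, 5.0, 5.0, 6.0, 10.0, 5.0],
    })
    print(sequence_effect_summary(toy))

In such a table, median_rt flattening after the third task and a rising share_rounded_1yr or share_rounded_5yr would correspond to the learning and rounding patterns reported above.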
