Designed Blindness: An Action Science Perspective on Program Theory Evaluation

This article aims to stimulate a dialogue between program theory evaluation and action science for the purposes of cross-fertilization and mutual enrichment. Both fields use implicit “theories of action” as a central construct in the study of social practice. However, an action science approach suggests a wider understanding of program theory, one that (1) specifies the links between individual reasoning and behavior and program implementation, and (2) accounts for how programs deal with dilemmas, conflict, and error. The paper begins with a systematic, though not exhaustive, comparison of program theory evaluation and action science. It then analyzes an exemplar of program theory evaluation from an action science perspective to illustrate a subtheory, “designed blindness,” and its impact on both program implementation and the evaluation itself, and offers a theory for overcoming designed blindness. Finally, the article argues that action science concepts and skills can make program theory evaluators more effective at confronting defensiveness and at facilitating learning among stakeholders when there is a gap between “espoused” program theory and “theory-in-use.”
