Evaluation methods for creativity support environments

Creativity refers to the human processes that underpin sublime forms of expression and fuel innovation. Creativity support environments (CSEs) span diverse areas, such as education, science, business, disaster response, design, art, performance, and everyday life. A CSE may consist of a single desktop application, or it may involve specialized hardware, networked topologies, and mobile devices. CSEs may also address temporal-spatial aspects of collaborative work. This workshop gathers a community of researchers who develop and evaluate CSEs. We will share approaches, engage in dialogue, and develop best practices. The outcome is not a single prescription, but an ontology of methodologies, consideration of how they map to creative activities, and an emerging consensus on the range of expectations for rigorous evaluation that will shape the field of CSE research. The workshop will also organize an open repository of CSE evaluation methods and test data.
