A study of commenting agents as design support

Sixteen subjects were observed using a simulated (Wizard-of-Oz) commenting agent in a design support system. Different commenting behaviors were tested, and their overall usefulness was evaluated. The interaction was logged and recorded on video, and the subjects rated the agent with respect to usefulness, understandability, system competence, disturbance and perceived stress. Perceived mental workload was measured using RTLX. The results show that a commenting tool is seen as disturbing but useful, that comments from an active tool risk being overlooked, and that comments pointing out ways of overcoming identified design problems are the easiest to understand.