Heuristic Evaluation Techniques for Collaborative Software

Heuristic evaluations are most often conducted to assess the usability of a software system’s interface. When evaluating collaborative software, however, it is critical to understand not only how well the interface design meets general usability standards, but also how well it supports the collaboration needs of its users. To address this, the traditional heuristic evaluation process was modified to assess each tool’s usability in supporting the collaborative behaviors defined in a previously developed Collaboration Evaluation Framework (Klein and Adelman, 2005). In this report we describe the key findings from our heuristic evaluation of the collaborative usability of Groove v3.0, InfoWorkSpace v2.5, and Lotus Sametime. For instance, failure to preserve common ground had a major impact on the effective use of all three tools. More generally, the tools must strike a balance between attracting users’ attention and distracting them from their tasks. Transmission flexibility was the greatest strength of all three tools.