tion may be voiced during a software design meeting. We can capture the spoken and visual software design meeting information by videotaping the meeting and any whiteboards used. By indexing these videos, we make it easy to retrieve the videotaped information without watching the entire video from start to finish.

Motivation: We want to allow software design meetings to continue as they are, with software designers discussing the design and drawing free-hand sketches of those designs on a whiteboard. Using our system, designers can sketch naturally, as we place few requirements on the sketcher. We recognize and interpret these diagrams using sketch recognition. Because the diagrams are interpreted, we can provide natural editing capabilities, allowing designers to edit their original strokes in an intuitive way. For instance, a designer can drag a drawn class from its center, moving all of the strokes used to draw the class, or stretch and skew the strokes used to create an attached arrow. The interpreted diagrams are used to automatically generate stub code with a software engineering tool.

Software design meetings are videotaped to capture visual and spoken design information unobtrusively. When drawn items are interpreted, we use these understood sketch events to index the videotape of the software design meeting.

We designed our application as a Metaglue agent because the Metaglue agent architecture provides support for multi-modal interactions through speech, gesture, and graphical user interfaces [2]. The Metaglue agent architecture also provides mechanisms for resource discovery and management, which allows us to use available video agents or screen-capture agents in a Metaglue-supported room.

We have selected UML-type diagrams because they are a de facto standard for depicting software applications.
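The video-indexing idea described above can be sketched in miniature: each time the recognizer interprets a symbol, the event is stored with the video timestamp at which it was drawn, so a later query for a symbol's label jumps straight to the relevant point in the recording. This is a minimal illustrative sketch, not the system's actual API; the `SketchEvent` and `VideoIndex` names and fields are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class SketchEvent:
    """A recognized sketch event and the video time at which it occurred."""
    timestamp: float   # seconds from the start of the meeting video (assumed unit)
    symbol: str        # interpreted symbol kind, e.g. "class" or "arrow"
    label: str         # text label recognized on the symbol, if any

class VideoIndex:
    """Maps interpreted sketch events to positions in the meeting video."""

    def __init__(self) -> None:
        self.events: list[SketchEvent] = []

    def record(self, event: SketchEvent) -> None:
        """Store an event, keeping the index sorted by video time."""
        self.events.append(event)
        self.events.sort(key=lambda e: e.timestamp)

    def find(self, label: str) -> list[float]:
        """Return video timestamps where a symbol with this label was drawn."""
        return [e.timestamp for e in self.events if e.label == label]

index = VideoIndex()
index.record(SketchEvent(timestamp=742.0, symbol="class", label="Customer"))
index.record(SketchEvent(timestamp=118.5, symbol="class", label="Order"))
index.record(SketchEvent(timestamp=760.2, symbol="arrow", label="Customer->Order"))

# Seek directly to the discussion of the Customer class:
index.find("Customer")  # [742.0]
```

The point of the design is that recognition events, rather than raw video frames, become the retrieval keys, so a viewer never has to scan the tape linearly.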
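The stub-code generation step can likewise be illustrated with a small sketch: once a class symbol and its labels have been interpreted, a skeleton class can be emitted mechanically. This is a hypothetical example, not the software engineering tool the system actually uses; the `generate_stub` function and its parameters are assumptions, and Java is chosen here only as a plausible target language.

```python
def generate_stub(class_name, attributes, methods):
    """Emit a Java class skeleton from an interpreted UML class symbol.

    attributes: list of (type, name) pairs read from the class's attribute box.
    methods:    list of (return type, name) pairs read from the operations box.
    """
    lines = [f"public class {class_name} {{"]
    for attr_type, attr_name in attributes:
        lines.append(f"    private {attr_type} {attr_name};")
    for ret_type, method_name in methods:
        lines.append(f"    public {ret_type} {method_name}() {{ }}")
    lines.append("}")
    return "\n".join(lines)

stub = generate_stub("Customer",
                     attributes=[("String", "name")],
                     methods=[("void", "placeOrder")])
print(stub)
```

Because the diagram is interpreted rather than stored as raw strokes, this kind of translation from sketch to skeleton code requires no additional input from the designer.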
Within UML [1] we focused on class diagrams, first because of their central role in describing program structure, and second because many of the symbols used in class diagrams are quite similar and hence offer an interesting challenge for sketch recognition. We added several symbols for agent design, since many of the applications created in the Intelligent Room [6] of the MIT AI Lab are
[1] Christian Heide Damm, et al. Tool support for cooperative object-oriented design: gesture based modelling on an electronic whiteboard. CHI, 2000.
[2] Krzysztof Z. Gajos, et al. An Agent-Based System for Capturing and Indexing Software Design Meetings. 2002.
[3] Michael H. Coen, et al. Meeting the Computational Needs of Intelligent Environments: The Metaglue System. 2000.
[4] Ivar Jacobson, et al. Unified Modeling Language. Definitions, 2020.
[5] Ivar Jacobson, et al. The Unified Modeling Language User Guide. J. Database Manag., 1998.
[6] James A. Landay, et al. Sketch-based user interfaces for collaborative object-oriented modeling. 1999.
[7] Ajay Kulkarni, et al. Building Agent-Based Intelligent Workspaces. International Conference on Internet Computing, 2002.
[8] Sean Jy-Shyang Chen, et al. An interactive system for recognizing hand drawn UML diagrams. CASCON, 2000.