ARDressCode: Augmented Dressing Room with Tag-based Motion Tracking and Real-Time Clothes Simulation

This paper introduces a new augmented reality concept for dressing rooms that lets a customer combine a tactile experience of the fabrics with easy simulated try-on. The dressing room has a camera and a projection surface in place of a mirror. The customer sticks a few visual tags onto their own clothes; the ARDressCode application then performs motion capture and renders an AR video stream on the AR “mirror”, with the selected piece of clothing mixed in and fitted to the customer’s body. Design issues, the technical implementation, and prospects for further development of the techniques are discussed.
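The core of the described pipeline is mapping a garment image onto the customer using the positions of a few detected visual tags. As a rough illustration of that fitting step (this is not the authors' implementation; the two-tag layout, coordinates, and function names below are hypothetical), a 2-D similarity transform can be estimated from a pair of reference tag positions and their detected positions in the camera frame, then applied to the garment's anchor points:

```python
import math

def estimate_similarity(src_a, src_b, dst_a, dst_b):
    """Estimate a 2-D similarity transform (uniform scale, rotation,
    translation) mapping the reference tag pair (src_a, src_b) onto the
    detected pair (dst_a, dst_b)."""
    sx, sy = src_b[0] - src_a[0], src_b[1] - src_a[1]
    dx, dy = dst_b[0] - dst_a[0], dst_b[1] - dst_a[1]
    scale = math.hypot(dx, dy) / math.hypot(sx, sy)
    rot = math.atan2(dy, dx) - math.atan2(sy, sx)
    c, s = scale * math.cos(rot), scale * math.sin(rot)
    # Translation chosen so that src_a maps exactly onto dst_a.
    tx = dst_a[0] - (c * src_a[0] - s * src_a[1])
    ty = dst_a[1] - (s * src_a[0] + c * src_a[1])
    return c, s, tx, ty

def apply_transform(t, p):
    """Warp point p from garment-image space into the camera frame."""
    c, s, tx, ty = t
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

# Reference positions of two shoulder tags in the garment image, and
# their detected positions in the camera frame (hypothetical values:
# pure translation by (50, 20), no rotation or scaling).
t = estimate_similarity((0, 0), (100, 0), (50, 20), (150, 20))
print(apply_transform(t, (50, 80)))  # → (100.0, 100.0)
```

With more than two tags, a least-squares fit (e.g. OpenCV's `estimateAffinePartial2D`) would be the natural generalization, and a non-rigid warp would be needed for real clothes simulation; the two-point form above only conveys the idea.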
