Projector-Camera Pair: A Universal I/O Device for Human-Machine Interaction

Video projectors and cameras are widely used in augmented-environment applications such as telepresence, smart rooms, and immersive spaces. In this paper we discuss the potential of camera-projector systems as sensor-actuator devices that are particularly suitable for robots operating in human-populated environments. We present a brief review of sensing techniques and possible applications involving a camera-projector pair. To illustrate the use of camera-projector systems in human-machine interaction, we describe a portable display surface application: the system tracks a rectangular piece of cardboard and augments it with projected images, transforming the cardboard into a display surface. The results of a system latency evaluation highlight an important challenge for camera-projector-based systems.
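The portable-display application requires warping the projected image onto the four tracked corners of the cardboard, which amounts to estimating a planar homography from the projector's image rectangle to the detected quadrilateral. The paper does not give its implementation; the sketch below is an illustrative reconstruction of the standard four-point direct linear transform (DLT) in plain NumPy, with all function names my own.

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography H such that dst ~ H @ src,
    from four point correspondences, via the DLT method.
    Each correspondence contributes two rows to the linear system A h = 0;
    the solution is the right singular vector of A with smallest
    singular value."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2,2] == 1

def warp_point(H, p):
    """Apply homography H to a 2D point in homogeneous coordinates."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[0] / q[2], q[1] / q[2]
```

In a camera-projector loop, `src` would be the corners of the projector framebuffer and `dst` the cardboard corners as seen by the camera (after projector-camera calibration); the resulting `H` pre-warps each frame so the image lands rectified on the moving surface. Since the board is re-detected every frame, the end-to-end latency discussed in the abstract directly limits how fast the board can move before the projection visibly lags.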
