Natural user interface for lighting control: Case study on desktop lighting using modular robots

Roombots (RB) are self-reconfigurable modular robots designed to explore physical structural change through robotic reconfiguration and adaptive locomotion in both structured grid environments and unstructured environments. The primary goal of RB is to create adaptive furniture. In this study, we propose a novel, user-friendly interface to control the position and intensity of a mobile desk light using RB modules. With the proposed method, the user interacts with the RB using only hand/arm gestures; the user's arm is tracked by a single Kinect with a bird's-eye view. We demonstrate the effectiveness of the proposed interface on a real hardware setup and discuss its contributions.
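To make the tracking-to-control pipeline concrete, the following is a minimal sketch (not the authors' implementation) of how a bird's-eye depth frame could be mapped to desk-light commands via background subtraction against the static desk. All names, frame dimensions, thresholds, and the position/intensity mapping are illustrative assumptions.

```python
# Hypothetical sketch: segment the user's arm in an overhead depth frame and
# map it to a light position (desk coordinates) and intensity. Only numpy is
# used; a real system would read frames from a Kinect driver instead.
import numpy as np

DEPTH_THRESHOLD_MM = 80      # assumed: min height above the desk to count as "arm"
TABLE_SIZE_M = (1.2, 0.8)    # assumed desk area covered by the camera

def segment_arm(depth_frame: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Foreground mask: pixels closer to the overhead camera than the
    static desk background by more than a threshold (i.e., the arm)."""
    return (background - depth_frame) > DEPTH_THRESHOLD_MM

def light_command(depth_frame: np.ndarray, background: np.ndarray):
    """Map the segmented arm to a target light position and intensity.

    Position: centroid of the arm mask, scaled to desk coordinates.
    Intensity: hand height above the desk (higher hand -> brighter light).
    """
    mask = segment_arm(depth_frame, background)
    if not mask.any():
        return None  # no arm in view; leave the light unchanged
    ys, xs = np.nonzero(mask)
    h, w = depth_frame.shape
    x_m = xs.mean() / w * TABLE_SIZE_M[0]
    y_m = ys.mean() / h * TABLE_SIZE_M[1]
    height_mm = (background - depth_frame)[mask].max()
    intensity = float(np.clip(height_mm / 500.0, 0.0, 1.0))  # assumed: 0.5 m -> full brightness
    return x_m, y_m, intensity

# Synthetic usage: flat desk background with a raised "arm" blob.
background = np.full((240, 320), 1000.0)   # desk surface 1 m below the camera
frame = background.copy()
frame[100:140, 150:200] -= 300.0           # hand 30 cm above the desk
print(light_command(frame, background))    # -> (x_m, y_m, intensity)
```

In a full system the returned position would be sent to the RB locomotion planner carrying the light, and the intensity to its lamp driver; the sketch only covers the perception-to-command mapping.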
