Projection-based Localization and Control Method of Robot Swarms for Swarm User Interfaces

Augmented reality systems in which computer graphics and multiple mobile robots are cooperatively controlled are useful for an intuitive understanding of complicated information, because users can touch and operate the information directly through the robots [10, 18]. Swarm user interfaces, which likewise rely on cooperatively controlled multiple mobile robots, are another promising approach [11]. These systems express various modalities through the assembly and distribution of robots; however, they cannot realize collaboration between computer graphics and robots. A system in which computer graphics and multiple mobile robots cooperate seamlessly can therefore be expected to expand the domain of swarm user interfaces.

Two problems remain in ensuring such seamless collaboration between computer graphics and multiple mobile robots. First, previous methods use external measurement systems for localization based on computer vision. However, the camera positions must be corrected and calibrated, and the spatial position of each robot must be computed from the camera images. Localization methods that do not use computer vision may instead depend on a laser, sonar, or visible light communication [13]. However, these approaches have limited accuracy because of the resolution of each sensor. Augmented Coliseum [10] addresses this issue with a display-based measurement control system (DMCS) [19]. This technology eliminates the need for position-measuring devices and can support multiple robots on a display. However, it requires a prior initialization that tracks each robot with a marker-pattern image, so robots cannot be added or removed during operation.
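To make the idea of projection-based localization concrete, the following is a minimal sketch of one common scheme: the projector displays a sequence of black/white frames so that each pixel blinks out its own coordinates as Gray-coded bits, and a robot-mounted photosensor decodes the sequence it observes into a position. The Gray encoding, the bit ordering, and both function names are illustrative assumptions, not the exact protocol of the system described here.

```python
# Hypothetical sketch of projection-based localization: each projector pixel
# blinks a bit sequence encoding its (x, y) position as Gray codes, and a
# robot's photosensor decodes the sequence it sees. The encoding details are
# assumptions for illustration, not the paper's exact protocol.

def gray_to_binary(bits):
    """Convert a Gray-code bit list (MSB first) to an integer."""
    value = bits[0]
    out = value
    for b in bits[1:]:
        value ^= b          # each binary bit is the XOR of the Gray-bit prefix
        out = (out << 1) | value
    return out

def decode_position(samples, n_bits):
    """Split one sampled bit stream into x and y Gray codes and decode both."""
    x_bits = samples[:n_bits]
    y_bits = samples[n_bits:2 * n_bits]
    return gray_to_binary(x_bits), gray_to_binary(y_bits)
```

Gray codes are a natural choice here because adjacent pixel coordinates differ in only one bit, so a sensor straddling a pixel boundary misreads its position by at most one unit.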

[1] Ramesh Raskar et al. Prakash: lighting aware motion capture using photosensing markers and multiplexed illuminators. ACM Trans. Graph., 2007.

[2] Yasushi Iwatani et al. Interactions with a line-follower: An interactive tabletop system with a markerless gesture interface for robot control. 2011 IEEE International Conference on Robotics and Biomimetics, 2011.

[3] Sho Kimura et al. PVLC projector: image projection with imperceptible pixel-level metadata. SIGGRAPH '08, 2008.

[4] Takeshi Naemura et al. Phygital field: an integrated field with a swarm of physical robots and digital images. SIGGRAPH ASIA Emerging Technologies, 2016.

[5] Peter Neumann et al. Communication in industrial automation—What is going on?, 2004.

[6] Paul A. Beardsley et al. RFIG lamps: interacting with a self-describing world via photosensing wireless tags and projectors. ACM Trans. Graph., 2004.

[7] Kenichi Mase et al. Improved Indoor Location Estimation Using Fluorescent Light Communication System with a Nine-Channel Receiver. IEICE Trans. Commun., 2010.

[8] Masahiko Inami et al. Augmented coliseum: an augmented game environment with small vehicles. First IEEE International Workshop on Horizontal Interactive Human-Computer Systems (TABLETOP '06), 2006.

[9] Masahiko Inami et al. A Display-Based Tracking System: Display-Based Computing for Measurement Systems. 17th International Conference on Artificial Reality and Telexistence (ICAT 2007), 2007.

[10] Antonio Carlos Sementille et al. Support on the Remote Interaction for Augmented Reality System, 2007.

[11] M. Sugimoto et al. A Display-Based Tracking System: Display-Based Computing for Measurement Systems, 2007.

[12] Michael Rubenstein et al. Massive uniform manipulation: Controlling large populations of simple robots with a common input signal. 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2013.

[13] Ramesh Raskar et al. Automatic projector calibration with embedded light sensors. UIST '04, 2004.

[14] Takeshi Naemura et al. Reconfigurable Pixel-level Visible Light Communication with Light Source Control, 2016.

[15] Pierre Dragicevic et al. Zooids: Building Blocks for Swarm User Interfaces. UIST, 2016.

[16] Masanori Sugimoto et al. VisiCon: a robot control interface for visualizing manipulation using a handheld projector. ACE '07, 2007.

[17] Dirk Timmermann et al. Survey on real-time communication via ethernet in industrial automation environments. Proceedings of the 2014 IEEE Emerging Technology and Factory Automation (ETFA), 2014.

[18] Ken Perlin et al. Physical objects as bidirectional user interface elements. IEEE Computer Graphics and Applications, 2004.

[19] Takeshi Naemura et al. Projection-based localization and navigation method for multiple mobile robots with pixel-level visible light communication. 2016 IEEE/SICE International Symposium on System Integration (SII), 2016.

[20] K. Hara et al. Navigation using one laser source for mobile robot with optical sensor array installed in pan and tilt mechanism. 2008 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, 2008.