Phygital Field: An Integrated Field with Physical Robots and Digital Images Using Projection-Based Localization and Control Method

Collaboration between computer graphics and multiple robots has attracted increasing attention in several fields. To enhance the seamless connection between them, a system should be able to accurately determine the positions and states of the robots and to control them easily and instantly. However, realizing a responsive control system for a large number of mobile robots without complicated setup, while avoiding system load problems, is not trivial. We propose a novel system, called "Phygital Field," for the localization and control of multiple mobile robots. Using pixel-level visible light communication technology, our system projects two types of information onto the same location: visible images for humans and data patterns for mobile robots. The system superimposes coded light onto a visual image and projects it onto the robots. The robots localize themselves by receiving and decoding the projected light and can follow a target using the coded velocity vector field. Localization and control information can be conveyed independently in each pixel and changed over time. The system requires only a projector to control the robot swarm; thus, it can be used on any projection surface. We experimentally assess the localization accuracy of our system for both stationary and moving robots. To further illustrate the utility of the proposed system, we demonstrate the control of multiple mobile robots in spatially and temporally varying vector fields. We also present prototype applications that provide users with novel content arising from the collaboration between computer graphics and a robot swarm.
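To illustrate the robot-side behavior described above, the sketch below shows how a robot might turn the decoded per-pixel payload (its position plus a velocity vector) into differential-drive wheel commands. This is a minimal sketch, not the authors' implementation: the callbacks read_pixel, get_heading, and set_wheels are hypothetical placeholders, since the actual PVLC decoding scheme and robot API are not specified in the abstract.

```python
# Minimal sketch (assumed, not the authors' implementation) of a robot
# following the projected velocity vector field. The three callbacks are
# hypothetical placeholders:
#   read_pixel()     -> (x, y, vx, vy) decoded from the PVLC data under the robot
#   get_heading()    -> current robot heading in radians
#   set_wheels(l, r) -> command left/right wheel speeds (arbitrary units)
import math
import time

def follow_vector_field(read_pixel, get_heading, set_wheels,
                        heading_gain=1.0, base_speed=0.1, dt=0.02):
    """Steer a differential-drive robot along the direction encoded in the
    pixel it currently sits on."""
    while True:
        x, y, vx, vy = read_pixel()              # localization + field vector
        target = math.atan2(vy, vx)              # direction the field points
        # Heading error wrapped to [-pi, pi).
        err = (target - get_heading() + math.pi) % (2 * math.pi) - math.pi
        # Proportional steering: drive forward while turning toward the field.
        set_wheels(base_speed - heading_gain * err,
                   base_speed + heading_gain * err)
        time.sleep(dt)                           # ~50 Hz control loop
```

Because each pixel can carry its own vector and the projected pattern can change over time, the same loop would follow spatially and temporally varying fields without any change on the robot side.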
