Human-robot interaction using Android and the PointBug algorithm

This paper describes the development of a Board of Education (BOE) robot controlled via an Android application. The robot can be driven both with the manual keys of a laptop and from the Android application, whose commands are transmitted to the robot's chassis over a Bluetooth communication module. The same channel also carries speech commands, passed to the robot by the same procedure. The robot additionally behaves as a maze-following bot, maneuvering through an obstacle-filled area without colliding with any obstacle; obstacle avoidance and maze following are implemented using infrared sensors. A line-following module, built with charge-transfer infrared sensors, is integrated with the maze-following module and combined with the PointBug path-finding algorithm in order to find the optimal path from source to target: each time the robot encounters an obstacle, it recomputes the minimum distance from its current position to the target. In addition, the robot can be controlled through the phone's accelerometer. Overall, the system traverses to the target location while avoiding all types of obstacles and moving on an optimized path, yielding significant savings in both time and energy. By combining Android, Bluetooth, speech control, and path-finding algorithms, the control system gains enough intelligence to make suitable decisions at each instant, so that the user can reach the target location without interference or assistance from additional aids.
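The abstract describes Bug-style behaviour: drive toward the target until an obstacle is sensed, skirt the obstacle boundary, and leave the boundary once the robot is closer to the target than it was at the hit point. A minimal sketch of that decision logic follows; the function names (`pointbug_mode`, `greedy_step`), the grid motion model, and the mode names are illustrative assumptions, not the paper's actual implementation.

```python
import math

def dist(p, q):
    """Euclidean distance between two 2-D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pointbug_mode(mode, d_current, d_hit, obstacle_ahead):
    """Decide the next mode of a Bug-style planner (sketch).

    'seek'  : head straight for the target.
    'follow': skirt an obstacle boundary after a hit; leave it only
              when the robot is closer to the target than at the hit
              point and the way ahead is clear.
    """
    if mode == "seek":
        return "follow" if obstacle_ahead else "seek"
    if d_current < d_hit and not obstacle_ahead:
        return "seek"
    return "follow"

def greedy_step(pos, target, free):
    """While in 'seek' mode, pick the passable 4-neighbour cell
    closest to the target; stay put if all neighbours are blocked."""
    neighbours = [(pos[0] + dx, pos[1] + dy)
                  for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
    passable = [c for c in neighbours if free(c)]
    return min(passable, key=lambda c: dist(c, target)) if passable else pos
```

For example, `greedy_step((0, 0), (5, 0), lambda c: True)` moves the robot one cell toward the target, to `(1, 0)`, and `pointbug_mode("follow", 3.0, 5.0, False)` returns `"seek"` because the robot is now nearer the target than at the hit point.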
