This paper proposes a cost-effective approach to mapping and navigating an area using only a single low-resolution camera on a “smart robot,” avoiding the cost and unreliability of radar and sonar systems. The implementation is divided into three main parts: object detection, autonomous movement, and mapping, the last performed by spiraling inward and applying the A* pathfinding algorithm. Object detection adapts the Horn–Schunck optical flow algorithm to track pixel brightness across successive frames, producing outward-pointing flow vectors. These vectors are then localized to object boundaries using Sobel edge detection. Autonomous movement is achieved by finding the focus of expansion of these vectors and computing times to collision, which are used to maneuver. The algorithms are programmed in MATLAB and Java and deployed on a LEGO Mindstorms NXT 2.0 robot for real-time testing with a low-resolution video camera. Numerous trials across diverse situations confirm that the robot can autonomously navigate and map a room using optical input alone.
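The time-to-collision step described above can be illustrated with a minimal sketch. Assuming pure forward translation, each optical-flow vector points radially away from the focus of expansion (FOE), and the time to collision (in frames) at a pixel is its distance from the FOE divided by its flow magnitude; the class and method names below are hypothetical, not taken from the paper's implementation.

```java
// Hypothetical sketch: time-to-collision (TTC) from an expanding flow field.
// Under pure forward translation, the flow at pixel p is v(p) = (p - FOE) / tau,
// so tau = ||p - FOE|| / ||v(p)||, measured in frames.
public class TimeToCollision {

    // TTC in frames for a pixel at (x, y) with flow vector (u, v),
    // given an already-estimated focus of expansion (foeX, foeY).
    static double ttcFrames(double x, double y, double u, double v,
                            double foeX, double foeY) {
        double dist = Math.hypot(x - foeX, y - foeY);   // distance to FOE
        double speed = Math.hypot(u, v);                 // flow magnitude
        return dist / speed;
    }

    public static void main(String[] args) {
        // Synthetic expanding field about FOE = (0, 0) with true TTC = 30 frames:
        double tau = 30.0;
        double x = 12.0, y = -5.0;
        double u = x / tau, v = y / tau;
        System.out.println(ttcFrames(x, y, u, v, 0.0, 0.0)); // recovers 30.0
    }
}
```

In practice the FOE itself would first be estimated (e.g. as the least-squares intersection of the flow vectors), and the smallest TTC across the frame would drive the maneuvering decision.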