Gesture Driven Fuzzy Interface System for Car Racing Game

The recently developed Kinect sensor has opened a new horizon in Human-Computer Interfaces, and its native connection with Microsoft's Xbox 360 and Xbox One video game consoles enables completely hands-free control in the next generation of gaming. Games that require many degrees of freedom, and in particular the driving control of a car in racing games, are best suited to gesture control, as simple buttons do not scale to the increased number of assistive, comfort, and infotainment functions. In this chapter, we propose a Mamdani type-I fuzzy inference system (FIS) based data processing module that effectively captures the dependence of the actual steering angle on the distance between the two palm positions and the angle they form with respect to the sagittal plane. The FIS output variable controls the duration of a virtual “key-pressed” event, which mimics the user pressing the actual keys assigned to control the car's direction in the original game. The acceleration and braking (deceleration) of the vehicle are controlled using the relative displacement of the left and right feet. The proposed experimental setup, interfacing the Kinect with a desktop racing game, shows that the virtual driving environment can be readily applied to any game in this genre.
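
To make the mapping concrete, the sketch below is a minimal, hypothetical illustration of a Mamdani-type fuzzy inference step of this kind, written in Python with the scikit-fuzzy library. The universes, membership functions, and rule base (the `palm_angle`, `palm_distance`, and `press_duration` variables and their linguistic terms) are placeholder choices for illustration, not the parameters used in the chapter; the last lines only indicate how the defuzzified duration could drive a virtual key press.

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

# Input 1: angle of the line joining the two palms w.r.t. the sagittal plane (degrees).
palm_angle = ctrl.Antecedent(np.arange(0, 91, 1), 'palm_angle')
# Input 2: distance between the two palm positions (metres).
palm_distance = ctrl.Antecedent(np.arange(0.2, 1.01, 0.01), 'palm_distance')
# Output: duration of the virtual "key-pressed" event (seconds).
press_duration = ctrl.Consequent(np.arange(0.0, 1.01, 0.01), 'press_duration')

# Triangular membership functions (illustrative shapes, not the chapter's tuning).
palm_angle['small']  = fuzz.trimf(palm_angle.universe, [0, 0, 30])
palm_angle['medium'] = fuzz.trimf(palm_angle.universe, [15, 45, 75])
palm_angle['large']  = fuzz.trimf(palm_angle.universe, [60, 90, 90])

palm_distance['narrow'] = fuzz.trimf(palm_distance.universe, [0.2, 0.2, 0.5])
palm_distance['wide']   = fuzz.trimf(palm_distance.universe, [0.4, 1.0, 1.0])

press_duration['short']  = fuzz.trimf(press_duration.universe, [0.0, 0.0, 0.3])
press_duration['medium'] = fuzz.trimf(press_duration.universe, [0.2, 0.5, 0.8])
press_duration['long']   = fuzz.trimf(press_duration.universe, [0.6, 1.0, 1.0])

# Mamdani rule base: a larger tilt and a wider grip imply a longer key press,
# i.e. a sharper virtual steering input.
rules = [
    ctrl.Rule(palm_angle['small'], press_duration['short']),
    ctrl.Rule(palm_angle['medium'] & palm_distance['narrow'], press_duration['medium']),
    ctrl.Rule(palm_angle['medium'] & palm_distance['wide'], press_duration['medium']),
    ctrl.Rule(palm_angle['large'] & palm_distance['narrow'], press_duration['medium']),
    ctrl.Rule(palm_angle['large'] & palm_distance['wide'], press_duration['long']),
]

steering = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))

def steering_press_duration(angle_deg: float, distance_m: float) -> float:
    """Defuzzified (centroid) key-press duration for one skeleton frame."""
    steering.input['palm_angle'] = angle_deg
    steering.input['palm_distance'] = distance_m
    steering.compute()
    return float(steering.output['press_duration'])

if __name__ == '__main__':
    duration = steering_press_duration(angle_deg=55.0, distance_m=0.7)
    print(f'Hold the steering key for {duration:.2f} s')
    # The computed duration could then time a synthetic key event, for example:
    # import time, pyautogui
    # pyautogui.keyDown('left'); time.sleep(duration); pyautogui.keyUp('left')
```

In the same spirit, the accelerator and brake described in the abstract would be mapped from the relative displacement of the left and right foot joints to two further key events; the fuzzy stage above only concerns the steering channel.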
