An Inertial Measurement System for Hand and Finger Tracking

The primary human-computer interfaces (HCIs) today are the keyboard and mouse. These interfaces do not facilitate a fluid flow of thought and intent from the operator to the computer. A computer mouse provides only two degrees of freedom (2DOF). Touch interfaces also provide 2DOF, but at multiple points, which makes them far more expressive. The hand alone has six degrees of freedom (6DOF); combined with the motion of the fingers, it can represent a vast array of distinct gestures. Hand gestures must be captured before they can be used as an HCI. Fortunately, advances in device manufacturing now make it possible to build a complete Inertial Measurement Unit (IMU) the size of a fingernail. This thesis documents the design and development of a glove outfitted with six IMUs that track the positions of the hand and fingers. The glove employs a controller board to capture the IMU data and interface with the host computer. Python software on the host computer captures data from the glove, and MATLAB performs the IMU calculations on the incoming data. The calculated data drives a 3D visualization of the glove rendered in Panda3D. Future work with the glove would include improved IMU algorithms and the development of gesture pattern recognition.
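
To make the host-side data path concrete, the sketch below shows how capture and orientation estimation might look in Python. It is illustrative only: the serial port name, baud rate, sample rate, IMU placement, and the comma-separated packet format (imu_id,ax,ay,az,gx,gy,gz) are assumptions rather than the thesis's actual protocol, and a simple complementary filter stands in for the MATLAB IMU calculations described in the abstract.

```python
# Minimal sketch, assuming a hypothetical serial frame from the controller board.
import math
import serial  # pyserial, assumed available on the host

PORT = "/dev/ttyUSB0"   # hypothetical port for the glove's controller board
BAUD = 115200           # assumed baud rate
DT = 0.01               # assumed 100 Hz sample period
ALPHA = 0.98            # complementary-filter blend factor

def parse_line(line: bytes):
    """Assumed frame: imu_id,ax,ay,az,gx,gy,gz (accel in g, gyro in deg/s)."""
    fields = line.decode("ascii", errors="ignore").strip().split(",")
    imu_id = int(fields[0])
    ax, ay, az, gx, gy, gz = map(float, fields[1:7])
    return imu_id, (ax, ay, az), (gx, gy, gz)

def complementary_update(pitch, roll, accel, gyro):
    """Blend integrated gyro rates with accelerometer-derived tilt angles."""
    ax, ay, az = accel
    gx, gy, _ = gyro
    accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    accel_roll = math.degrees(math.atan2(ay, az))
    pitch = ALPHA * (pitch + gy * DT) + (1 - ALPHA) * accel_pitch
    roll = ALPHA * (roll + gx * DT) + (1 - ALPHA) * accel_roll
    return pitch, roll

def main():
    # One (pitch, roll) estimate per IMU; assumed placement is one IMU per
    # finger plus one on the back of the hand.
    angles = {i: (0.0, 0.0) for i in range(6)}
    with serial.Serial(PORT, BAUD, timeout=1) as link:
        while True:
            line = link.readline()
            if not line:
                continue
            try:
                imu_id, accel, gyro = parse_line(line)
            except (ValueError, IndexError):
                continue  # skip malformed frames
            angles[imu_id] = complementary_update(*angles[imu_id], accel, gyro)
            print(imu_id, angles[imu_id])  # would feed the Panda3D visualization

if __name__ == "__main__":
    main()
```

A complementary filter appears here only because it is compact; the thesis's own IMU algorithms would take its place, and the printed angles would instead drive the Panda3D hand model.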
