Inverse Kinematics Solver for Android Faces with Elastic Skin

The ability of androids to display facial expressions is a key factor in achieving more natural human-robot interaction. However, controlling the facial expressions of robots with elastic facial skin is difficult because the skin deformation is hard to model. We propose a method that solves the inverse kinematics of android faces so that the android's facial expression can be controlled through target feature points. In our method, an artificial neural network models the forward kinematics, and the inverse kinematics is solved by minimizing a weighted squared error function. We implement an inverse kinematics solver based on this method and evaluate it on an actual android.
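The sketch below illustrates, under stated assumptions, the kind of pipeline the abstract describes: a neural network approximates the forward kinematics (actuator commands to facial feature-point positions), and the inverse kinematics is obtained by minimizing a weighted squared error between predicted and target feature points. The network architecture, the numbers of actuators and feature points, the use of a gradient-based optimizer, and all identifiers are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions: 10 actuator commands, 2-D positions of 17 facial feature points.
N_ACTUATORS = 10
N_FEATURES = 17 * 2

# Forward-kinematics model: maps actuator commands to feature-point positions.
# The specific architecture is an assumption; the abstract only states that an
# artificial neural network models the forward kinematics.
forward_model = nn.Sequential(
    nn.Linear(N_ACTUATORS, 64),
    nn.Tanh(),
    nn.Linear(64, N_FEATURES),
)

def solve_ik(target, weights, n_iters=500, lr=0.05):
    """Find actuator commands whose predicted feature points approach `target`
    by minimizing a weighted squared error over the feature points."""
    u = torch.zeros(N_ACTUATORS, requires_grad=True)    # actuator commands to optimize
    opt = torch.optim.Adam([u], lr=lr)
    for _ in range(n_iters):
        opt.zero_grad()
        pred = forward_model(torch.sigmoid(u))           # keep commands in [0, 1]
        loss = (weights * (pred - target) ** 2).sum()    # weighted squared error
        loss.backward()
        opt.step()
    return torch.sigmoid(u).detach()

# Usage: compute commands that drive the face toward a target expression.
target = torch.rand(N_FEATURES)     # placeholder target feature points
weights = torch.ones(N_FEATURES)    # per-feature-point weights
commands = solve_ik(target, weights)
```

In practice the forward model would first be trained on measured pairs of actuator commands and observed feature-point positions; the optimization-based inverse step shown here is one common way to invert such a learned model.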
