Parametric Motion Blending through Wavelet Analysis

Abstract

This paper shows how multiresolution blending can be combined with time-warping to generate realistic parametric motion from pre-stored motion data. The goal is to allow the animator to define the desired motion through its natural parameters, such as speed, while realism is obtained by drawing on pre-stored captured animations. An analysis has been carried out to investigate the relationship between walking speed and the blending factor, removing the burden of trial and error from the animator. As a result, realistic walking motion at a speed specified by the user can be generated, provided that the desired speed lies between the minimum and maximum speeds of the available motion data. Analysis to generalise these results to other motions is in progress. Generating the desired motion for avatars of different scales is also discussed.

1. Introduction

Realistic human motion animation is still a challenging task, although human motion appears to us (as humans) to be very easy and natural behaviour. Real human motion has many unique characteristics that distinguish it from synthetic motion. The absence of these characteristics, even the subtlest ones, results in an unnatural or robot-like appearance. Such unnaturalness is easily noticed by human observers, although its source is usually difficult to identify.

With the development of virtual reality, the demand for virtual humans has increased in a wide variety of fields and applications, from games and entertainment to simulation and scientific visualisation. As a result, the need for realistic human motion animation is growing rapidly. The most realistic animation is that which preserves the unique characteristics of human movement. In that sense, computer animation driven by motion-captured data can produce more natural-looking and realistic results. Because the motion is captured from real people, the generated animation is more realistic and physically correct, and it becomes more so as motion capture systems and techniques grow more advanced and accurate.

The problem appears when the captured animation needs to be modified. Even if the required modification is very small, the whole capture procedure most likely has to be repeated to obtain the desired motion. The same applies when the captured animation is to be mapped onto another human model with different properties, which is known as the retargeting problem. To benefit from the advantages of motion-captured data in human animation, analysis and editing systems have to be available. Such systems should provide an easy and reliable way to edit and/or modify the captured data (within some limits) to produce the desired motion, for example by changing the motion parameters (speed, step frequency/length, etc.), mode, or emotional state (tired, happy, angry, etc.).

The goal of this research is to provide a natural and easy way for the animator to define the desired motion using natural human motion parameters. The desired motion is generated with multiresolution blending and time-warping techniques applied to existing pre-stored animation data. The result is a parametric motion blending scheme that could serve as a framework for parametrising motion-captured data.
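As a rough illustration of this parametric idea, the following sketch (Python/NumPy, with hypothetical function and variable names) blends two captured walk cycles that have been resampled onto a common phase axis, using a blending factor derived from the desired speed. The linear mapping from speed to blending factor is only an assumption standing in for the empirical relationship analysed later in the paper, and the blend is applied to the raw joint-angle signals rather than to their individual wavelet bands for brevity.

    import numpy as np

    def blend_walk_cycles(slow_cycle, fast_cycle, slow_speed, fast_speed,
                          desired_speed, num_frames=60):
        """Blend two time-warped walk cycles to approximate a desired speed.

        slow_cycle, fast_cycle : arrays of shape (frames, joints) holding the
            joint angles of one captured walk cycle each (hypothetical layout).
        slow_speed, fast_speed : walking speeds of the captured cycles.
        desired_speed          : target speed, assumed to lie between them.
        """
        if not (slow_speed <= desired_speed <= fast_speed):
            raise ValueError("desired speed must lie within the captured range")

        # Placeholder speed-to-blending-factor mapping: simple linear
        # interpolation. This is an assumption, not the relationship
        # derived empirically in the paper.
        w = (desired_speed - slow_speed) / (fast_speed - slow_speed)

        # Time-warp both cycles onto a common normalised phase axis so that
        # corresponding frames describe the same point in the gait cycle.
        phase = np.linspace(0.0, 1.0, num_frames)

        def resample(cycle):
            src = np.linspace(0.0, 1.0, cycle.shape[0])
            return np.stack([np.interp(phase, src, cycle[:, j])
                             for j in range(cycle.shape[1])], axis=1)

        slow_w, fast_w = resample(slow_cycle), resample(fast_cycle)

        # Blend the aligned cycles frame by frame.
        return (1.0 - w) * slow_w + w * fast_w

In the approach developed in this paper, the weight w would instead come from the analysed speed-to-blending-factor relationship, and the blending would be carried out across the levels of a wavelet (multiresolution) decomposition of the two signals rather than on the raw joint angles.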
In the next section, an overview of previous work on editing and modifying human motion animation is given. Sections 3 and 4 show the use of wavelets as a powerful signal-processing tool in motion editing and in motion synthesis through multiresolution blending, respectively. The proposed analysis for parametric multiresolution motion blending is presented in Section 5. Then, a brief discussion of