Real-time motion-based frame estimation in video lossy transmission

In movie transmission, video frames are subject to loss due to noise and/or network congestion. Lost frames can cause the audio and video streams to fall out of synchronization; if uncorrected, this cumulative drift can degrade the motion picture's quality beyond viewers' tolerance. We first study and classify the effects of audio-video desynchronization. We then develop and evaluate motion-based techniques that estimate the lost frames from the frames already received, without retransmissions or error-control information, and inject the estimated frames at their appropriate locations in the movie stream. The objective is to restore synchronization to within viewers' tolerance while producing estimates that closely match the original frames at a suitable computational cost.
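The general idea the abstract describes, reconstructing a lost frame from its received neighbors using motion, can be illustrated with classic block-matching motion estimation followed by motion-compensated interpolation. The sketch below is not the paper's actual algorithm; the block size, search range, and SAD matching criterion are assumptions chosen for illustration.

```python
import numpy as np

def block_matching(prev, nxt, block=8, search=4):
    """Estimate one motion vector per block from frame `prev` to frame `nxt`
    by exhaustive search, minimizing the sum of absolute differences (SAD)."""
    h, w = prev.shape
    mvs = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            y0, x0 = by * block, bx * block
            ref = prev[y0:y0 + block, x0:x0 + block].astype(int)
            best, best_mv = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y1, x1 = y0 + dy, x0 + dx
                    # Skip candidate positions that fall outside the frame.
                    if y1 < 0 or x1 < 0 or y1 + block > h or x1 + block > w:
                        continue
                    cand = nxt[y1:y1 + block, x1:x1 + block].astype(int)
                    sad = np.abs(ref - cand).sum()
                    if best is None or sad < best:
                        best, best_mv = sad, (dy, dx)
            mvs[by, bx] = best_mv
    return mvs

def interpolate_frame(prev, nxt, block=8, search=4):
    """Synthesize the missing frame between `prev` and `nxt`: halve each
    estimated motion vector and fetch each block from `prev` shifted
    halfway along its trajectory (a common simplification that anchors
    the halved vector at the missing frame's block grid)."""
    mvs = block_matching(prev, nxt, block, search)
    h, w = prev.shape
    est = np.empty_like(prev)
    for by in range(h // block):
        for bx in range(w // block):
            y0, x0 = by * block, bx * block
            dy, dx = mvs[by, bx] // 2
            # Read from prev at the half-motion offset, clamped to the frame.
            ys = int(np.clip(y0 - dy, 0, h - block))
            xs = int(np.clip(x0 - dx, 0, w - block))
            est[y0:y0 + block, x0:x0 + block] = prev[ys:ys + block, xs:xs + block]
    return est
```

In this simplified form the estimated frame is imperfect around uncovered regions, which is one reason the tradeoff between estimation quality and computation cost that the abstract mentions matters in practice.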
