A degree-of-edit ranking for consumer generated video retrieval

We introduce degree-of-edit (DoE) ranking, which uses "how much a consumer generated video (CGV) is edited" as a ranking measure for CGV retrieval, and propose a method to estimate it. In the proposed method, the DoE score of a CGV is estimated from low-level features such as the number of shot boundaries and the time ratio of music. We evaluate the rank correlation between DoE rankings determined by human subjects and by our method. To demonstrate its performance in a practical scenario, a user test is performed on over 22,000 CGVs in the context of CGV search. The results show that our method significantly improves conventional CGV ranking in terms of the availability of interesting and high-quality CGVs.
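The abstract does not specify how the low-level features are combined into a DoE score. A minimal sketch, assuming a simple weighted combination of the two features named above (a hypothetical scoring function, not the authors' actual model):

```python
from dataclasses import dataclass

@dataclass
class CGVFeatures:
    """Low-level features of one consumer generated video (CGV)."""
    num_shot_boundaries: int   # detected cuts/transitions
    music_time_ratio: float    # fraction of duration containing music, in [0, 1]
    duration_sec: float        # total video length in seconds

def doe_score(f: CGVFeatures, w_cuts: float = 0.5, w_music: float = 0.5) -> float:
    """Hypothetical degree-of-edit score: weighted sum of normalized features.

    The weights and the normalization are illustrative assumptions; the paper
    would fit such parameters against subject-provided rankings.
    """
    # Normalize shot boundaries by duration (cuts per minute),
    # then squash into [0, 1) so both terms share a scale.
    cuts_per_min = f.num_shot_boundaries / (f.duration_sec / 60.0)
    cut_term = cuts_per_min / (cuts_per_min + 1.0)
    return w_cuts * cut_term + w_music * f.music_time_ratio

def rank_by_doe(videos: list[CGVFeatures]) -> list[CGVFeatures]:
    """Order CGVs from most-edited to least-edited."""
    return sorted(videos, key=doe_score, reverse=True)
```

A heavily edited clip (many cuts, background music throughout) would then rank above raw camera footage, which is the behavior the DoE measure is intended to capture.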
