This report describes metrics for evaluating the effectiveness of segment-based retrieval, built on existing binary information retrieval metrics. These metrics are described in the context of a video segment hyperlinking task. The evaluation approach re-uses existing measures from the standard Cranfield evaluation paradigm. Our adaptation can in principle be applied with any effectiveness measure that uses binary relevance, and to other segment-based retrieval tasks. In our video hyperlinking setting, we use precision at a cut-off rank n and mean average precision.
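As a minimal sketch of the two named measures, the Python snippet below computes precision at a cut-off rank n and mean average precision over ranked lists of video segments. The overlap-based rule for deciding whether a retrieved segment counts as relevant is an illustrative assumption for this sketch, not the mapping defined in the report; all function names are likewise hypothetical.

```python
# Sketch of binary-relevance measures for segment-based retrieval:
# precision at cut-off rank n and mean average precision (MAP).
# Assumption: a retrieved segment is judged relevant if it overlaps any
# ground-truth relevant segment; the report's own mapping may differ.

from typing import List, Tuple

Segment = Tuple[float, float]  # (start_time, end_time) in seconds


def is_relevant(retrieved: Segment, relevant_segments: List[Segment]) -> bool:
    """Binary relevance via temporal overlap with any relevant segment."""
    start, end = retrieved
    return any(start < r_end and end > r_start for r_start, r_end in relevant_segments)


def precision_at_n(ranking: List[Segment], relevant_segments: List[Segment], n: int) -> float:
    """Fraction of the top-n retrieved segments judged relevant."""
    hits = sum(is_relevant(seg, relevant_segments) for seg in ranking[:n])
    return hits / n if n else 0.0


def average_precision(ranking: List[Segment], relevant_segments: List[Segment]) -> float:
    """Average of precision values at the ranks of relevant retrieved segments,
    normalised by the number of ground-truth relevant segments."""
    hits, precision_sum = 0, 0.0
    for rank, seg in enumerate(ranking, start=1):
        if is_relevant(seg, relevant_segments):
            hits += 1
            precision_sum += hits / rank
    return precision_sum / len(relevant_segments) if relevant_segments else 0.0


def mean_average_precision(runs: List[Tuple[List[Segment], List[Segment]]]) -> float:
    """MAP over a set of queries: mean of per-query average precision."""
    aps = [average_precision(ranking, rel) for ranking, rel in runs]
    return sum(aps) / len(aps) if aps else 0.0


if __name__ == "__main__":
    ranking = [(10.0, 30.0), (95.0, 120.0), (40.0, 55.0)]   # retrieved segments
    relevant = [(20.0, 35.0), (100.0, 110.0)]               # relevant segments
    print(precision_at_n(ranking, relevant, n=3))            # 0.666...
    print(mean_average_precision([(ranking, relevant)]))     # 1.0
```

How multiple retrieved segments that overlap the same relevant segment are counted is exactly the kind of adaptation question the report addresses; the sketch above leaves that choice at its simplest form.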