This paper reports the use of a document distance-based approach to automatically expand the set of available relevance judgements when they are scarce and consist only of positive judgements. This may happen, for example, when the only available judgements are extracted from the reference list of a published review paper. We compare the results on two document sets: OHSUMED, based on medical research publications, and TREC-8, based on news feeds. We show that evaluations based on these expanded relevance judgements are more reliable than those using only the initially available judgements, especially when the number of available judgements is very limited.
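The abstract does not specify the exact expansion procedure, but the general idea of distance-based judgement expansion can be sketched as follows: documents closest to a known-relevant document are marked as relevant too. This is an illustrative sketch only, not the authors' method; the TF-IDF weighting, cosine distance, and the `k` cutoff are all assumptions.

```python
# Hedged sketch (not the paper's exact algorithm): expand a small set of
# positive relevance judgements by marking, for each judged-relevant
# document, its k nearest neighbours (by TF-IDF cosine similarity) as
# relevant as well. All parameter choices here are illustrative.
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build simple TF-IDF vectors for a list of tokenised documents."""
    n = len(docs)
    df = Counter()                      # document frequency per term
    for doc in docs:
        df.update(set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(u[t] * v[t] for t in set(u) & set(v))
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def expand_judgements(docs, relevant_ids, k=2):
    """Add the k most similar unjudged documents per known-relevant one."""
    vecs = tfidf_vectors(docs)
    expanded = set(relevant_ids)
    for rid in relevant_ids:
        candidates = [i for i in range(len(docs)) if i not in expanded]
        candidates.sort(key=lambda i: cosine(vecs[rid], vecs[i]),
                        reverse=True)
        expanded.update(candidates[:k])
    return expanded
```

In practice the cutoff `k` (or an equivalent similarity threshold) controls the trade-off the abstract alludes to: expanding too aggressively introduces noisy judgements, while expanding too little leaves the evaluation as unreliable as before.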
- Title of host publication: Proceedings of the ACM SIGIR Workshop on Gathering Efficient Assessments of Relevance (GEAR)
- Place of Publication: United States
- Publisher: Cornell University Library
- Number of pages: 4
- Publication status: Published - 2014
- Event: ACM SIGIR 2014 Workshop on Gathering Efficient Assessments of Relevance, Gold Coast, Australia, 11 Jul 2014
- Information Retrieval
- Relevance Judgements Expansion
Mollá, D., Amini, I., & Martinez, D. (2014). Document distance for the automated expansion of relevance judgements for information retrieval evaluation. In Proceedings of the ACM SIGIR Workshop on Gathering Efficient Assessments of Relevance (GEAR) (pp. 1-4). United States: Cornell University Library.