Abstract
This paper reports the use of a document distance-based approach to automatically expand the number of available relevance judgements when these are limited and consist only of positive judgements. This may happen, for example, when the only available judgements are extracted from the list of references of a published review paper. We compare the results on two document sets: OHSUMED, based on medical research publications, and TREC-8, based on news feeds. We show that evaluations based on these expanded relevance judgements are more reliable than those using only the initially available judgements, especially when the number of available judgements is very limited.
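The record does not describe the paper's method in detail. As a rough illustration of the general idea of distance-based expansion of relevance judgements, the sketch below marks unjudged documents as pseudo-relevant when they are sufficiently close to a known relevant document. The use of TF-IDF vectors, cosine similarity, and a fixed threshold are illustrative assumptions, not the authors' actual choices.

```python
# Minimal sketch (assumptions noted above): expand a small set of known
# relevant documents by treating unjudged documents that are close to them
# as additional pseudo-relevant judgements.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def expand_judgements(judged_relevant, unjudged, threshold=0.4):
    """Return the unjudged documents whose maximum cosine similarity to any
    known relevant document is at least `threshold` (threshold is arbitrary)."""
    vectorizer = TfidfVectorizer(stop_words="english")
    # Fit on all documents so both sets share the same vocabulary and weights.
    matrix = vectorizer.fit_transform(judged_relevant + unjudged)
    rel_vecs = matrix[: len(judged_relevant)]
    unj_vecs = matrix[len(judged_relevant):]
    sims = cosine_similarity(unj_vecs, rel_vecs)  # shape: (n_unjudged, n_relevant)
    return [doc for doc, row in zip(unjudged, sims) if row.max() >= threshold]


# Toy usage with made-up documents:
relevant = ["randomised trial of aspirin for cardiovascular prevention"]
pool = [
    "aspirin reduces cardiovascular events in a clinical trial",
    "stock markets rallied on friday",
]
print(expand_judgements(relevant, pool))
```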
Original language | English |
---|---|
Title of host publication | Proceedings of the ACM SIGIR Workshop on Gathering Efficient Assessments of Relevance (GEAR) |
Place of Publication | United States |
Publisher | Cornell University Library |
Pages | 1-4 |
Number of pages | 4 |
Publication status | Published - 2014 |
Event | ACM SIGIR 2014 Workshop on Gathering Efficient Assessments of Relevance, Gold Coast, Australia |
Duration | 11 Jul 2014 → 11 Jul 2014 |
Workshop
Workshop | ACM SIGIR 2014 Workshop on Gathering Efficient Assessments of Relevance |
---|---|
City | Gold Coast, Australia |
Period | 11/07/14 → 11/07/14 |
Keywords
- Information Retrieval
- Evaluation
- Relevance Judgements Expansion