Document distance for the automated expansion of relevance judgements for information retrieval evaluation

Diego Mollá, Iman Amini, David Martinez

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution

Abstract

This paper reports the use of a document distance-based approach to automatically expand the number of available relevance judgements when these are limited and reduced to only positive judgements. This may happen, for example, when the only available judgements are extracted from a list of references in a published review paper. We compare the results on two document sets: OHSUMED, based on medical research publications, and TREC-8, based on news feeds. We show that evaluations based on these expanded relevance judgements are more reliable than those using only the initially available judgements, especially when the number of available judgements is very limited.
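The core idea can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: it assumes TF-IDF vectors with cosine similarity as the document distance, and scores each unjudged document by its maximum similarity to the known-relevant set; the paper's distance measure, scoring, and cut-off may differ.

```python
import math
from collections import Counter


def tfidf_vectors(token_lists):
    """Build simple TF-IDF vectors (term -> weight dicts) for tokenised docs."""
    n = len(token_lists)
    df = Counter()
    for tokens in token_lists:
        df.update(set(tokens))
    vectors = []
    for tokens in token_lists:
        tf = Counter(tokens)
        vectors.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return vectors


def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0


def expand_judgements(docs, relevant_ids, k=1):
    """Return the k unjudged documents closest to the known-relevant set.

    Each unjudged document is scored by its maximum cosine similarity to
    any document already judged relevant; the top-k are proposed as
    additional (expanded) positive relevance judgements.
    """
    vectors = tfidf_vectors([d.lower().split() for d in docs])
    scores = {}
    for i, vec in enumerate(vectors):
        if i in relevant_ids:
            continue
        scores[i] = max(cosine(vec, vectors[j]) for j in relevant_ids)
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

For example, given one judged-relevant medical abstract, the nearest unjudged document by this distance would be added to the expanded judgement set, while unrelated documents (e.g. news about sport or finance) score near zero and are left out.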
Original language: English
Title of host publication: Proceedings of the ACM SIGIR Workshop on Gathering Efficient Assessments of Relevance (GEAR)
Place of Publication: United States
Publisher: Cornell University Library
Pages: 1-4
Number of pages: 4
Publication status: Published - 2014
Event: ACM SIGIR 2014 Workshop on Gathering Efficient Assessments of Relevance - Gold Coast, Australia
Duration: 11 Jul 2014 - 11 Jul 2014

Workshop

Workshop: ACM SIGIR 2014 Workshop on Gathering Efficient Assessments of Relevance
City: Gold Coast, Australia
Period: 11/07/14 - 11/07/14

Keywords

  • Information Retrieval
  • Evaluation
  • Relevance Judgements Expansion


Cite this

Mollá, D., Amini, I., & Martinez, D. (2014). Document distance for the automated expansion of relevance judgements for information retrieval evaluation. In Proceedings of the ACM SIGIR Workshop on Gathering Efficient Assessments of Relevance (GEAR) (pp. 1-4). United States: Cornell University Library.