CLEI Electronic Journal (Apr 2011)

A Semantic Framework for Evaluating Topical Search Methods

  • Rocío L. Cecchini,
  • Carlos M. Lorenzetti,
  • Ana G. Maguitman,
  • Filippo Menczer

DOI
https://doi.org/10.19153/cleiej.14.1.2
Journal volume & issue
Vol. 14, no. 1

Abstract


The absence of reliable and efficient techniques to evaluate information retrieval systems has become a bottleneck in the development of novel retrieval methods. In traditional approaches, users or hired evaluators provide manual assessments of relevance. However, these approaches are neither efficient nor reliable, since they do not scale with the complexity and heterogeneity of available digital information. Automatic approaches, on the other hand, can be efficient but disregard semantic data, which is usually important for assessing the actual performance of the evaluated methods. This article proposes to use topic ontologies, and semantic similarity data derived from these ontologies, to implement an automatic semantic evaluation framework for information retrieval systems. The use of semantic similarity data makes it possible to capture the notion of partial relevance, generalizing traditional evaluation metrics and giving rise to novel performance measures such as semantic precision and semantic harmonic mean. The validity of the approach is supported by user studies, and the application of the proposed framework is illustrated with the evaluation of topical retrieval systems. The evaluated systems include a baseline, a supervised version of the Bo1 query refinement method, and two multi-objective evolutionary algorithms for context-based retrieval. Finally, we discuss the advantages of applying evaluation metrics that account for semantic similarity data and partial relevance over existing metrics based on the notion of total relevance.
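
To make the idea of partial relevance concrete, the sketch below shows one way a "semantic precision" and a "semantic harmonic mean" could be computed once an ontology-derived similarity function is available. The similarity function `sigma`, the topic labels, and the equal weighting of retrieved items are illustrative assumptions for this sketch, not the exact definitions used in the article.

```python
from typing import Callable, Iterable


def semantic_precision(retrieved_topics: Iterable[str],
                       target_topic: str,
                       sigma: Callable[[str, str], float]) -> float:
    """Average ontology-derived similarity between retrieved items and the
    target topic. With a binary sigma (1 if topics match, 0 otherwise) this
    reduces to classical precision; graded sigma values capture partial
    relevance. (Illustrative reading, not the article's exact formulation.)"""
    topics = list(retrieved_topics)
    if not topics:
        return 0.0
    return sum(sigma(target_topic, t) for t in topics) / len(topics)


def semantic_harmonic_mean(sem_precision: float, sem_recall: float) -> float:
    """Harmonic mean of semantic precision and semantic recall, analogous to
    the F1 score but built on graded relevance values."""
    if sem_precision + sem_recall == 0.0:
        return 0.0
    return 2 * sem_precision * sem_recall / (sem_precision + sem_recall)


if __name__ == "__main__":
    # Toy similarity: identical topics are fully relevant, sibling topics
    # partially relevant, everything else irrelevant (assumed values).
    siblings = {("sports/tennis", "sports/golf"), ("sports/golf", "sports/tennis")}

    def sigma(a: str, b: str) -> float:
        if a == b:
            return 1.0
        return 0.5 if (a, b) in siblings else 0.0

    retrieved = ["sports/tennis", "sports/golf", "arts/music"]
    sp = semantic_precision(retrieved, "sports/tennis", sigma)
    print(f"semantic precision = {sp:.2f}")   # (1.0 + 0.5 + 0.0) / 3 = 0.50
    print(f"semantic harmonic mean = {semantic_harmonic_mean(sp, 0.75):.2f}")
```

Under a binary similarity function the two measures collapse to standard precision and F1, which is the sense in which the semantic variants generalize the traditional metrics.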