
Introduction to the special issue on evaluating interactive information retrieval systems

Borlund, P. and Ruthven, I. (2008) Introduction to the special issue on evaluating interactive information retrieval systems. Information Processing and Management, 44 (1). pp. 1-3. ISSN 0306-4573

Full text: strathprints003178.pdf (PDF, 64kB)

Abstract

Evaluation has always been a strong element of Information Retrieval (IR) research, with much of our focus on how we evaluate IR algorithms. As a research field we have benefited greatly from initiatives such as Cranfield, TREC, CLEF and INEX, which have added to our knowledge of how to create test collections, of the reliability of system-based evaluation criteria, and of how to interpret the results of algorithmic evaluations. In contrast, evaluations whose main focus is the user experience of searching with IR systems have not yet reached the same level of maturity. Such evaluations are complex to create and assess because of the larger number of variables that must be incorporated within a study, the lack of standard tools (for example, test collections) and the difficulty of selecting appropriate evaluation criteria.
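To make the contrast concrete, the sketch below illustrates the kind of system-based evaluation that test collections such as Cranfield and TREC support: scoring one ranked result list against a fixed set of relevance judgments ("qrels"). This is a minimal Python illustration, not anything from the paper itself; the document identifiers, judgments and ranking are invented for the example.

```python
# A minimal sketch of Cranfield/TREC-style system evaluation: scoring one
# ranked result list for a single query against relevance judgments.
# All data below is hypothetical, for illustration only.

def precision_at_k(ranked_docs, relevant, k):
    """Fraction of the top-k retrieved documents that are judged relevant."""
    top_k = ranked_docs[:k]
    return sum(1 for d in top_k if d in relevant) / k

def average_precision(ranked_docs, relevant):
    """Mean of the precision values at each rank where a relevant doc appears."""
    hits, precision_sum = 0, 0.0
    for rank, doc in enumerate(ranked_docs, start=1):
        if doc in relevant:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / len(relevant) if relevant else 0.0

# Hypothetical qrels and one system's ranked output for a query.
qrels = {"d3", "d7", "d9"}                  # judged-relevant documents
run = ["d3", "d1", "d7", "d4", "d9", "d2"]  # system ranking, best first

print(f"P@5 = {precision_at_k(run, qrels, 5):.2f}")  # 0.60
print(f"AP  = {average_precision(run, qrels):.2f}")  # 0.76
```

Because the judgments are fixed in advance, the same metrics can be recomputed for any system over the same collection, which is what makes algorithmic evaluation so repeatable; user-centred studies have no comparable standard apparatus, which is the gap the abstract describes.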