Reflections on Mira: interactive evaluation in information retrieval

Dunlop, M.D. (2000) Reflections on Mira: interactive evaluation in information retrieval. Journal of the American Society for Information Science and Technology, 51(14), pp. 1269-1274. ISSN 1532-2882

PDF: strathprints002577.pdf (95 kB)

Abstract

Evaluation in information retrieval (IR) has focussed largely on noninteractive evaluation of text retrieval systems. This is increasingly at odds with how people use modern IR systems: in highly interactive settings to access linked, multimedia information. Furthermore, this approach ignores potential improvements through better interface design. In 1996 the Commission of the European Union Information Technologies Programme funded a three-year working group, Mira, to discuss and advance research in the area of evaluation frameworks for interactive and multimedia IR applications. Led by Keith van Rijsbergen, Steve Draper and myself from Glasgow University, this working group brought together many of the leading researchers in the evaluation domain from both the IR and human-computer interaction (HCI) communities. This paper presents my personal view of the main lines of discussion that took place throughout Mira: importing and adapting evaluation techniques from HCI, evaluating at different levels as appropriate, evaluating against different types of relevance, and the new challenges that drive the need for rethinking the old evaluation approaches. The paper concludes that we need to consider more varied forms of evaluation to complement engine evaluation.