Evaluating implicit feedback models using searcher simulations

White, R.W. and Ruthven, I. and Jose, J.M. and van Rijsbergen, C.J. (2005) Evaluating implicit feedback models using searcher simulations. ACM Transactions on Information Systems, 23 (3). pp. 325-361.

Full text not available in this repository.

Abstract

In this article we describe an evaluation of relevance feedback (RF) algorithms using searcher simulations. Since these algorithms select additional terms for query modification based on inferences made from searcher interaction, not on relevance information searchers explicitly provide (as in traditional RF), we refer to them as implicit feedback models. We introduce six different models that base their decisions on the interactions of searchers and use different approaches to rank query modification terms. The aim of this article is to determine which of these models should be used to assist searchers in the systems we develop. To evaluate these models we used searcher simulations that afforded us more control over the experimental conditions than experiments with human subjects and allowed complex interaction to be modeled without the need for costly human experimentation. The simulation-based evaluation methodology measures how well the models learn the distribution of terms across relevant documents (i.e., learn what information is relevant) and how well they improve search effectiveness (i.e., create effective search queries). Our findings show that an implicit feedback model based on Jeffrey's rule of conditioning outperformed other models under investigation.
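As a rough illustration of the general form of Jeffrey's rule of conditioning mentioned above, P'(A) = Σ_i P(A | E_i) · P'(E_i), the sketch below applies it to reweighting a candidate query modification term given uncertain evidence about the relevance of viewed content. The partition events, probabilities, and function names are hypothetical placeholders for illustration only, not the paper's actual implicit feedback model.

```python
# Minimal sketch of Jeffrey's rule of conditioning:
#   P'(A) = sum_i P(A | E_i) * P'(E_i)
# where {E_i} partitions the evidence and P'(E_i) are revised evidence
# probabilities. All names and numbers below are illustrative assumptions.

def jeffrey_update(conditionals, revised_evidence_probs):
    """Return P'(A) from P(A | E_i) and revised P'(E_i).

    conditionals: dict mapping evidence label -> P(A | E_i)
    revised_evidence_probs: dict mapping evidence label -> P'(E_i), summing to 1
    """
    assert abs(sum(revised_evidence_probs.values()) - 1.0) < 1e-9
    return sum(conditionals[e] * revised_evidence_probs[e]
               for e in revised_evidence_probs)

# Hypothetical example: probability that a candidate term is relevant,
# conditioned on whether a viewed document summary is relevant, with the
# evidence probabilities revised after observing searcher interaction.
p_term_given_evidence = {"summary_relevant": 0.8, "summary_not_relevant": 0.1}
revised_evidence = {"summary_relevant": 0.6, "summary_not_relevant": 0.4}

print(jeffrey_update(p_term_given_evidence, revised_evidence))
# 0.8 * 0.6 + 0.1 * 0.4 = 0.52
```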