

Report on the First International Workshop on the Evaluation on Collaborative Information Seeking and Retrieval (ECol'2015)

Soulier, Laure and Tamine, Lynda and Sakai, Tetsuya and Azzopardi, Leif and Pickens, Jeremy (2016) Report on the First International Workshop on the Evaluation on Collaborative Information Seeking and Retrieval (ECol'2015). ACM SIGIR Forum, 50 (1). pp. 42-48. ISSN 0163-5840



The workshop on the evaluation of collaborative information retrieval and seeking (ECol) was held in conjunction with the 24th ACM International Conference on Information and Knowledge Management (CIKM) in Melbourne, Australia. The workshop featured three main elements. First, a keynote by Chirag Shah on the main dimensions, challenges, and opportunities in collaborative information retrieval and seeking. Second, an oral presentation session in which four papers were presented. Third, a discussion based on three seed research questions: (1) In what ways is collaborative search evaluation more challenging than individual interactive information retrieval (IIIR) evaluation? (2) Would it be possible and/or useful to standardise experimental designs and data for collaborative search evaluation? and (3) For evaluating collaborative search, can we leverage ideas from other tasks such as diversified search, subtopic mining, and/or e-discovery? The discussion was intense and raised many points and issues, leading to the proposition that a new evaluation track focused on collaborative information retrieval/seeking tasks would be worthwhile.