Abstract
Information Retrieval (IR) research has traditionally focused on serving the best results for a single query, so-called ad hoc retrieval. However, users typically search iteratively, refining and reformulating their queries during a session. A key challenge in the study of this interaction is the creation of suitable evaluation resources to assess the effectiveness of IR systems over sessions. This paper describes the TREC Session Track, which ran from 2010 through 2014 and focused on forming test collections that included various forms of implicit feedback. We describe the test collections; provide a brief analysis of the differences between datasets over the years; and present evaluation results demonstrating that the use of user session data significantly improved effectiveness.
| Original language | English |
| --- | --- |
| Title of host publication | Not Known |
| Pages | 685-688 |
| DOIs | |
| Publication status | Published - 18 Jul 2016 |
| Event | Special Interest Group on Information Retrieval (SIGIR) - Pisa, Italy. Duration: 17 Jul 2016 → 21 Jul 2016 |
Conference

| Conference | Special Interest Group on Information Retrieval (SIGIR) |
| --- | --- |
| Country/Territory | Italy |
| City | Pisa |
| Period | 17/07/16 → 21/07/16 |