Accurately Interpreting Clickthrough Data as Implicit Feedback. Joachims, Granka, Pan, Hembrooke, Gay. Paper Presentation: Vinay Goel, 10/27/05.

Transcript
Page 1: Accurately Interpreting Clickthrough Data as Implicit Feedback Joachims, Granka, Pan, Hembrooke, Gay Paper Presentation: Vinay Goel 10/27/05.

Accurately Interpreting Clickthrough Data as Implicit Feedback

Joachims, Granka, Pan, Hembrooke, Gay

Paper Presentation: Vinay Goel

10/27/05

Page 2:

Introduction

Adapt a retrieval system to users and/or collections

Manual adaptation is time-consuming or even impractical

Explore and evaluate implicit feedback

Use clickthrough data in WWW search

Page 3:

User Study

Record and evaluate user actions

Provide insight into the decision process

Record users’ eye movements: eye tracking

Page 4:

Questions used

Page 5:

Two Phases of the study

Phase I: 34 participants; start the search with a Google query, then search for answers

Phase II: Investigate how users react to manipulations of search results. Same instructions as Phase I. Each subject assigned to one of three experimental conditions: Normal, Swapped, Reversed
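The three conditions can be sketched as ranking manipulations. This is an illustrative sketch, not the study's code; it assumes "Swapped" exchanges the top two results and "Reversed" inverts the entire ranking, and the function name is made up for this example.

```python
def apply_condition(results, condition):
    """Return a copy of the ranked result list under one experimental condition.

    results: list of result identifiers in rank order.
    condition: "normal", "swapped", or "reversed".
    """
    if condition == "normal":
        # Present the original ranking unchanged.
        return list(results)
    if condition == "swapped":
        # Exchange the top two results; everything else keeps its rank.
        out = list(results)
        if len(out) >= 2:
            out[0], out[1] = out[1], out[0]
        return out
    if condition == "reversed":
        # Invert the ranking: the last result is shown first.
        return list(reversed(results))
    raise ValueError(f"unknown condition: {condition}")
```

For example, `apply_condition(["r1", "r2", "r3"], "swapped")` yields `["r2", "r1", "r3"]`.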

Page 6:

Explicit Relevance Judgments

Collected explicit relevance judgments for all queries and results pages

Inter-judge agreements

Page 7:

Analysis of user behavior

Which links do users view and click?

Do users scan links from top to bottom?

Which links do users evaluate before clicking?

Page 8:

Which links do users view and click?

The 1st and 2nd links are viewed with almost equal frequency, but the 1st link receives substantially more clicks

Once the user has started scrolling, rank appears to become less of an influence

Page 9:

Do users scan links from top to bottom?

Big gap before viewing the 3rd-ranked abstract

Users scan the viewable results thoroughly before scrolling

Page 10:

Which links do users evaluate before clicking?

Abstracts closer above the clicked link are more likely to be viewed

The abstract right below a clicked link is viewed roughly 50% of the time

Page 11:

Analysis of Implicit Feedback

Does relevance influence user decisions?

Are clicks absolute relevance judgments?

Page 12:

Does relevance influence user decisions? Yes

Use the “reversed” condition: it controllably decreases the quality of the retrieval function and the relevance of highly ranked abstracts

Users react in two ways:

They view lower-ranked links more frequently and scan significantly more abstracts

Subjects are much less likely to click on the first link, and more likely to click on a lower-ranked link

Page 13:

Are clicks absolute relevance judgments? Interpretation is problematic

Trust Bias: the abstract ranked first receives more clicks than the second. Two possible explanations:

The first link is genuinely more relevant (not influenced by the order of presentation), or

Users prefer the first link due to some level of trust in the search engine (influenced by the order of presentation)

Page 14:

Trust Bias

The hypothesis that users are not influenced by presentation order can be rejected

Users have substantial trust in the search engine’s ability to estimate relevance

Page 15:

Quality Bias

The quality of the ranking influences the user’s clicking behavior

If the relevance of the retrieved results decreases, users click on abstracts that are on average less relevant

Confirmed by the “reversed” condition

Page 16:

Are clicks relative relevance judgments?

An accurate interpretation of clicks needs to take into consideration both the user’s trust in the quality of the search engine and the quality of the retrieval function itself

Both are difficult to measure explicitly

Instead, interpret clicks as pairwise preference statements
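One way to turn clicks into pairwise preference statements is the rule that a clicked result is preferred over every skipped (unclicked) result ranked above it, since the user saw those abstracts and passed over them. The sketch below illustrates that idea; function and variable names are my own, not from the paper.

```python
def preferences_from_clicks(ranking, clicked):
    """Extract pairwise preferences from clicks on a ranked result list.

    ranking: list of doc ids in rank order (best first).
    clicked: set of doc ids the user clicked.
    Returns (preferred, over) pairs meaning "preferred is more relevant than over".
    """
    prefs = []
    for i, doc in enumerate(ranking):
        if doc in clicked:
            # The user scanned past everything ranked above the click;
            # each skipped abstract above is judged less relevant.
            for above in ranking[:i]:
                if above not in clicked:
                    prefs.append((doc, above))
    return prefs
```

For example, clicking only the 3rd result in `["d1", "d2", "d3"]` yields the preferences `[("d3", "d1"), ("d3", "d2")]`.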

Page 17:

Strategy 1

Takes trust and quality bias into consideration

Substantially and significantly better than random

Close in accuracy to inter-judge agreement
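The accuracy of a preference-generation strategy can be scored against the explicit relevance judgments: a generated preference (a, b) counts as correct when the judges rated a strictly more relevant than b. This is an illustrative sketch of that scoring idea, with made-up names, not the paper's evaluation code.

```python
def preference_accuracy(prefs, relevance):
    """Fraction of click-derived preferences that agree with explicit judgments.

    prefs: list of (preferred, over) doc-id pairs from a strategy.
    relevance: dict mapping doc id -> explicit relevance judgment (higher = better).
    """
    if not prefs:
        return 0.0
    # A preference agrees when the explicitly judged relevance of the
    # "preferred" document is strictly higher than that of the other.
    correct = sum(1 for a, b in prefs if relevance[a] > relevance[b])
    return correct / len(prefs)
```

Comparing this number against the inter-judge agreement rate (rather than against 100%) gives the fairer baseline the slide refers to, since even explicit judges disagree with each other.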

Page 18:

Strategy 2

Slightly more accurate than Strategy 1, but not a significant difference in Phase II

Page 19:

Strategy 3

Accuracy worse than Strategy 1; ranking quality has an effect on the accuracy

Page 20:

Strategy 4

No significant differences compared to Strategy 1

Page 21:

Strategy 5

Highly accurate in the “normal” condition, but misleading

The strategy gives good results even if the user behaves randomly, so the aligned preferences are probably less valuable for learning

Less accurate than Strategy 1 in the “reversed” condition

Page 22:

Conclusion

Users’ clicking decisions are influenced by trust bias and quality bias

Strategies for generating relative relevance feedback signals

Implicit relevance signals are less consistent with explicit judgments than the explicit judgments are with each other

Encouraging results