CpSc 881: Information Retrieval


2

Copyright Notice

Most slides in this presentation are adapted from the textbook slides and various other sources. The copyright belongs to the original authors. Thanks!


3

General Information

Class Time: 11:00 AM ~ 12:15 PM, TTH

Location: 319 Tillman Hall

Instructor: Dr. Feng Luo

Office: 310 McAdams Hall

Phone: 864-656-4793

Email: [email protected]

Office Hours: 1:30 PM ~ 2:30 PM, TTH

Web site: http://www.cs.clemson.edu/~luofeng/course/2015spring/881/ir.html


4

Prerequisite

Know Java

Familiarity with basic computer science principles and skills.

Familiarity with basic mathematics, such as probability theory and basic linear algebra.


5

Text Book

Textbook:

Christopher D. Manning, Prabhakar Raghavan, Hinrich Schütze. “Introduction to Information Retrieval”. Cambridge University Press. ISBN 978-0-521-86571-5.

http://nlp.stanford.edu/IR-book/information-retrieval-book.html


6

Grading

Grading:
Mid-term exam: 25%
Final exam: 25%
Term project: 50%
Based on the final integrated score, mapped to A, B, C.


7

Resources

http://nlp.stanford.edu/IR-book/information-retrieval-book.html

http://lucene.apache.org/

http://nutch.apache.org/about.html

http://lucene.apache.org/solr/

http://tika.apache.org/

http://oodt.apache.org/


8

Definition of information retrieval

Information retrieval (IR) is finding material (usually documents) of an unstructured nature (usually text) that satisfies an information need from within large collections (usually stored on computers).


11

Boolean retrieval

The Boolean model is arguably the simplest model to base an information retrieval system on.

Queries are Boolean expressions, e.g., CAESAR AND BRUTUS

The search engine returns all documents that satisfy the Boolean expression.

Does Google use the Boolean model?


12

Unstructured data in 1650

Which plays of Shakespeare contain the words BRUTUS AND CAESAR, but not CALPURNIA?

One could grep all of Shakespeare’s plays for BRUTUS and CAESAR, then strip out lines containing CALPURNIA

Why is grep not the solution?
Slow (for large collections)
grep is line-oriented, IR is document-oriented
“NOT CALPURNIA” is non-trivial
Other operations (e.g., find the word ROMANS near COUNTRYMAN) not feasible


13

Term-document incidence matrix

Entry is 1 if term occurs. Example: CALPURNIA occurs in Julius Caesar.

Entry is 0 if term doesn’t occur. Example: CALPURNIA doesn’t occur in The Tempest.

              Anthony and  Julius   The       Hamlet   Othello  Macbeth  . . .
              Cleopatra    Caesar   Tempest
ANTHONY            1          1        0         0        0        1
BRUTUS             1          1        0         1        0        0
CAESAR             1          1        0         1        1        1
CALPURNIA          0          1        0         0        0        0
CLEOPATRA          1          0        0         0        0        0
MERCY              1          0        1         1        1        1
WORSER             1          0        1         1        1        0
. . .


14

Incidence vectors

So we have a 0/1 vector for each term.

To answer the query BRUTUS AND CAESAR AND NOT CALPURNIA:
Take the vectors for BRUTUS, CAESAR and CALPURNIA
Complement the vector of CALPURNIA
Do a (bitwise) AND on the three vectors

110100 AND 110111 AND 101111 = 100100
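As a side illustration (not from the slides), the same bitwise computation can be written with java.util.BitSet; the three vectors below are the BRUTUS, CAESAR and CALPURNIA rows of the incidence matrix shown earlier.

    import java.util.BitSet;

    public class IncidenceQuery {
        // Build a BitSet from a 0/1 string such as "110100" (leftmost bit = first document).
        static BitSet row(String bits) {
            BitSet b = new BitSet(bits.length());
            for (int i = 0; i < bits.length(); i++)
                if (bits.charAt(i) == '1') b.set(i);
            return b;
        }

        public static void main(String[] args) {
            BitSet brutus    = row("110100");
            BitSet caesar    = row("110111");
            BitSet calpurnia = row("010000");

            BitSet result = (BitSet) brutus.clone();
            result.and(caesar);        // BRUTUS AND CAESAR
            result.andNot(calpurnia);  // ... AND NOT CALPURNIA

            // prints {0, 3}: the 1st and 4th documents, i.e. Anthony and Cleopatra and Hamlet
            System.out.println(result);
        }
    }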


15

0/1 vector for BRUTUS

(Term-document incidence matrix as on the previous slide; the row for BRUTUS is 1 1 0 1 0 0.)

result: 1 0 0 1 0 0


16

Answers to query

Anthony and Cleopatra, Act III, Scene ii

Agrippa [Aside to Domitius Enobarbus]: Why, Enobarbus,

When Antony found Julius Caesar dead,
He cried almost to roaring; and he wept
When at Philippi he found Brutus slain.

Hamlet, Act III, Scene ii

Lord Polonius: I did enact Julius Caesar: I was killed i’ the Capitol; Brutus killed me.


17

Bigger collections

Consider N = 10^6 documents, each with about 1000 tokens

⇒ total of 10^9 tokens
On average 6 bytes per token, including spaces and punctuation
⇒ size of the document collection is about 6 · 10^9 bytes = 6 GB
Assume there are M = 500,000 distinct terms in the collection
(Notice that we are making a term/token distinction.)


18

Can’t build the incidence matrix

The matrix would have M × N = 500,000 × 10^6 = half a trillion 0s and 1s.

But the matrix has no more than one billion 1s.

Matrix is extremely sparse.

What is a better representation? We only record the 1s.


19

Inverted Index

For each term t, we store a list of all documents that contain t.

(Figure: the dictionary of terms, each pointing to its postings list.)
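A minimal sketch of that structure (my illustration, not the slide’s figure): a sorted dictionary mapping each term to its sorted postings list, from which the document frequency can also be read off.

    import java.util.TreeMap;
    import java.util.TreeSet;

    public class InvertedIndex {
        // dictionary: term -> postings (sorted, duplicate-free docIDs)
        final TreeMap<String, TreeSet<Integer>> postings = new TreeMap<>();

        void add(String term, int docId) {
            postings.computeIfAbsent(term, t -> new TreeSet<>()).add(docId);
        }

        // document frequency = length of the term's postings list
        int df(String term) {
            TreeSet<Integer> p = postings.get(term);
            return p == null ? 0 : p.size();
        }
    }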


20

Inverted index construction

❶ Collect the documents to be indexed:

❷ Tokenize the text, turning each document into a list of tokens:

❸ Do linguistic preprocessing, producing a list of normalized tokens, which are the indexing terms:

❹ Index the documents that each term occurs in by creating an inverted index, consisting of a dictionary and postings.
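A minimal sketch of these four steps in Java, assuming a toy tokenizer (split on non-letters) and a toy normalizer (lowercasing) in place of real linguistic preprocessing; the two example documents are shortened versions of the book’s Julius Caesar snippets.

    import java.util.*;

    public class IndexBuilder {
        // docs[i] is the text of the document with docID i (step ❶: the collected documents).
        static Map<String, List<Integer>> build(String[] docs) {
            Map<String, SortedSet<Integer>> tmp = new TreeMap<>();   // sorted dictionary
            for (int docId = 0; docId < docs.length; docId++) {
                // step ❷: tokenize by splitting on anything that is not a letter
                for (String token : docs[docId].split("[^\\p{L}]+")) {
                    if (token.isEmpty()) continue;
                    // step ❸: normalize (here just lowercasing)
                    String term = token.toLowerCase();
                    // step ❹: record that the term occurs in this document
                    tmp.computeIfAbsent(term, t -> new TreeSet<>()).add(docId);
                }
            }
            // flatten the sorted sets into sorted postings lists
            Map<String, List<Integer>> index = new LinkedHashMap<>();
            tmp.forEach((term, ids) -> index.put(term, new ArrayList<>(ids)));
            return index;
        }

        public static void main(String[] args) {
            String[] docs = { "I did enact Julius Caesar", "So let it be with Caesar" };
            System.out.println(build(docs));   // e.g. caesar=[0, 1], julius=[0], ...
        }
    }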


21

Tokenizing and preprocessing


22

Generate posting


23

Sort postings


24

Create postings lists, determine document frequency


25

Split the result into dictionary and postings file

(Figure: the dictionary and the postings file.)


26

Simple conjunctive query (two terms)

Consider the query: BRUTUS AND CALPURNIA

To find all matching documents using inverted index:

❶ Locate BRUTUS in the dictionary

❷ Retrieve its postings list from the postings file

❸ Locate CALPURNIA in the dictionary

❹ Retrieve its postings list from the postings file

❺ Intersect the two postings lists

❻ Return intersection to user
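For example, with postings lists like those in the textbook’s running example, BRUTUS → 1 → 2 → 4 → 11 → 31 → 45 → 173 → 174 and CALPURNIA → 2 → 31 → 54 → 101, the intersection returned in step ❻ is 2 → 31.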


27

Intersecting two posting lists


28

Intersecting two posting lists

This is linear in the length of the postings lists.

Note: This only works if postings lists are sorted.
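A sketch of that linear merge in Java (my rendering of the algorithm in the figure, not code taken from the slides), assuming each postings list is a sorted array of docIDs.

    import java.util.ArrayList;
    import java.util.List;

    public class Intersect {
        // Two-pointer merge of two sorted postings lists; O(len(p1) + len(p2)).
        static List<Integer> intersect(int[] p1, int[] p2) {
            List<Integer> answer = new ArrayList<>();
            int i = 0, j = 0;
            while (i < p1.length && j < p2.length) {
                if (p1[i] == p2[j]) {        // docID occurs in both lists
                    answer.add(p1[i]);
                    i++; j++;
                } else if (p1[i] < p2[j]) {  // advance the pointer at the smaller docID
                    i++;
                } else {
                    j++;
                }
            }
            return answer;
        }

        public static void main(String[] args) {
            int[] brutus    = {1, 2, 4, 11, 31, 45, 173, 174};
            int[] calpurnia = {2, 31, 54, 101};
            System.out.println(intersect(brutus, calpurnia));  // [2, 31]
        }
    }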


29

Query processing: Exercise

Compute hit list for ((paris AND NOT france) OR lear)


30

Query optimization

Consider a query that is an AND of n terms, n > 2

For each of the n terms, get its postings list, then AND them together

Example query: BRUTUS AND CALPURNIA AND CAESAR

What is the best order for processing this query?


31

Query optimization

Example query: BRUTUS AND CALPURNIA AND CAESAR

Simple and effective optimization: Process in order of increasing frequency

Start with the shortest postings list, then keep cutting further

In this example, first CAESAR, then CALPURNIA, then BRUTUS
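A sketch of this ordering, assuming the index maps each term to its sorted postings list: the terms are sorted by document frequency and the shortest lists are intersected first, so the intermediate result keeps shrinking.

    import java.util.*;

    public class ConjunctiveQuery {
        // AND together the postings of several terms, rarest term first.
        static List<Integer> and(Map<String, List<Integer>> index, String... terms) {
            if (terms.length == 0) return Collections.emptyList();
            List<String> order = new ArrayList<>(Arrays.asList(terms));
            order.sort(Comparator.comparingInt(
                    (String t) -> index.getOrDefault(t, Collections.emptyList()).size()));

            List<Integer> result = new ArrayList<>(index.getOrDefault(order.get(0), Collections.emptyList()));
            for (int k = 1; k < order.size() && !result.isEmpty(); k++) {
                result = intersect(result, index.getOrDefault(order.get(k), Collections.emptyList()));
            }
            return result;
        }

        // The same two-pointer merge as before, on lists.
        static List<Integer> intersect(List<Integer> a, List<Integer> b) {
            List<Integer> out = new ArrayList<>();
            int i = 0, j = 0;
            while (i < a.size() && j < b.size()) {
                int cmp = Integer.compare(a.get(i), b.get(j));
                if (cmp == 0) { out.add(a.get(i)); i++; j++; }
                else if (cmp < 0) i++;
                else j++;
            }
            return out;
        }
    }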


32

Optimized intersection algorithm for conjunctive queries


33

More general optimization

Example query: (MADDING OR CROWD) AND (IGNOBLE OR STRIFE)

Get frequencies for all terms

Estimate the size of each OR by the sum of its frequencies (conservative)

Process in increasing order of OR sizes
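A sketch of that estimate; the document-frequency numbers below are made-up, and the estimate for each OR-group is simply the sum of its terms’ frequencies (an upper bound, hence “conservative”).

    import java.util.*;

    public class OrSizeEstimate {
        // Upper-bound estimate for |t1 OR t2 OR ...|: the sum of document frequencies.
        static int estimate(Map<String, Integer> df, List<String> orGroup) {
            int sum = 0;
            for (String t : orGroup) sum += df.getOrDefault(t, 0);
            return sum;
        }

        // Order the OR-groups of a conjunctive query by increasing estimated size.
        static List<List<String>> order(Map<String, Integer> df, List<List<String>> groups) {
            List<List<String>> sorted = new ArrayList<>(groups);
            sorted.sort(Comparator.comparingInt((List<String> g) -> estimate(df, g)));
            return sorted;
        }

        public static void main(String[] args) {
            // hypothetical document frequencies, for illustration only
            Map<String, Integer> df = Map.of("madding", 10, "crowd", 400, "ignoble", 5, "strife", 80);
            List<List<String>> query = List.of(List.of("madding", "crowd"), List.of("ignoble", "strife"));
            System.out.println(order(df, query));  // [[ignoble, strife], [madding, crowd]]
        }
    }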


34

Recall basic intersection algorithm

Linear in the length of the postings lists.

Can we do better?


35

Skip pointers

Skip pointers allow us to skip postings that will not figure in the search results.

This makes intersecting postings lists more efficient.

Some postings lists contain several million entries – so efficiency can be an issue even if basic intersection is linear.

Where do we put skip pointers?

How do we make sure intersection results are correct?


36

Basic idea


37

Skip lists: Larger example


38

Intersection with skip pointers
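A sketch of how such an intersection might be coded (my rendering of the idea, not the slide’s figure), assuming each postings list is a sorted array plus a parallel skip[] array that gives the target index of the skip pointer at each position, or -1 where there is none.

    import java.util.ArrayList;
    import java.util.List;

    public class SkipIntersect {
        // p: sorted postings (docIDs); skip[i]: index that the skip pointer at position i
        // jumps to, or -1 if position i carries no skip pointer.
        static List<Integer> intersect(int[] p1, int[] skip1, int[] p2, int[] skip2) {
            List<Integer> answer = new ArrayList<>();
            int i = 0, j = 0;
            while (i < p1.length && j < p2.length) {
                if (p1[i] == p2[j]) {
                    answer.add(p1[i]);
                    i++; j++;
                } else if (p1[i] < p2[j]) {
                    if (skip1[i] != -1 && p1[skip1[i]] <= p2[j]) {
                        // follow skip pointers while they do not overshoot p2[j]
                        do { i = skip1[i]; } while (skip1[i] != -1 && p1[skip1[i]] <= p2[j]);
                    } else {
                        i++;   // no usable skip: advance one posting at a time
                    }
                } else {
                    if (skip2[j] != -1 && p2[skip2[j]] <= p1[i]) {
                        do { j = skip2[j]; } while (skip2[j] != -1 && p2[skip2[j]] <= p1[i]);
                    } else {
                        j++;
                    }
                }
            }
            return answer;
        }

        public static void main(String[] args) {
            int[] p1 = {2, 4, 8, 16, 19, 23, 28, 43};
            int[] s1 = {3, -1, -1, 6, -1, -1, -1, -1};     // skip pointers 2 -> 16 and 16 -> 28
            int[] p2 = {1, 2, 3, 5, 8, 41, 51, 60};
            int[] s2 = {-1, -1, -1, -1, -1, -1, -1, -1};   // no skips on this list
            System.out.println(intersect(p1, s1, p2, s2)); // [2, 8]
        }
    }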


39

Where do we place skips?

Tradeoff: number of items skipped vs. how frequently a skip can be taken

More skips: Each skip pointer skips only a few items, but we can frequently use it.

Fewer skips: Each skip pointer skips many items, but we can not use it very often.

Simple heuristic: for postings list of length P, use evenly-spaced skip pointers.

This ignores the distribution of query terms.
Easy if the index is static; harder in a dynamic environment because of updates.

How much do skip pointers help? They used to help a lot. With today’s fast CPUs, they don’t help that much anymore.
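A sketch of the even-spacing heuristic; the sqrt(P) spacing used below is the textbook’s usual suggestion rather than something stated on this slide.

    public class SkipPlacement {
        // For a postings list of length P, place about sqrt(P) evenly spaced skip pointers:
        // every step-th position points step entries ahead.
        static int[] placeSkips(int[] postings) {
            int P = postings.length;
            int step = Math.max(2, (int) Math.round(Math.sqrt(P)));
            int[] skip = new int[P];
            java.util.Arrays.fill(skip, -1);       // -1 = no skip pointer at this position
            for (int i = 0; i + step < P; i += step) {
                skip[i] = i + step;                // pointer from postings[i] to postings[i + step]
            }
            return skip;
        }
    }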


40

Boolean queries

The Boolean retrieval model can answer any query that is a Boolean expression.

Boolean queries are queries that use AND, OR and NOT to join query terms.
Views each document as a set of terms.
Is precise: document matches condition or not.

Primary commercial retrieval tool for 3 decades

Many professional searchers (e.g., lawyers) still like Boolean queries.

You know exactly what you are getting.

Many search systems you use are also Boolean: Spotlight, email, intranet search, etc.


41

Commercially successful Boolean retrieval: Westlaw

Largest commercial legal search service in terms of the number of paying subscribers

Over half a million subscribers performing millions of searches a day over tens of terabytes of text data

The service was started in 1975.

In 2005, Boolean search (called “Terms and Connectors” by Westlaw) was still the default, and used by a large percentage of users . . .

. . . although ranked retrieval has been available since 1992.


42

Westlaw: Example queries

Information need: Information on the legal theories involved in preventing the disclosure of trade secrets by employees formerly employed by a competing company

Query: “trade secret” /s disclos! /s prevent /s employe!

Information need: Requirements for disabled people to be able to access a workplace

Query: disab! /p access! /s work-site work-place (employment /3 place)

Information need: Cases about a host’s responsibility for drunk guests

Query: host! /p (responsib! liab!) /p (intoxicat! drunk!) /p guest


43

Westlaw: Comments

Proximity operators: /3 = within 3 words, /s = within a sentence, /p = within a paragraph

Space is disjunction, not conjunction! (This was the default in search pre-Google.)

Long, precise queries: incrementally developed, not like web search

Why professional searchers often like Boolean search: precision, transparency, control

When are Boolean queries the best way of searching? Depends on: information need, searcher, document collection, . . .


44

Does Google use the Boolean model?

On Google, the default interpretation of a query [w1 w2 . . . wn] is w1 AND w2 AND . . . AND wn

Cases where you get hits that do not contain one of the wi:

anchor text
page contains variant of wi (morphology, spelling correction, synonym)
long queries (n large)
Boolean expression generates very few hits

Simple Boolean vs. ranking of result set:
Simple Boolean retrieval returns matching documents in no particular order.
Google (and most well-designed Boolean engines) rank the result set – they rank good hits (according to some estimator of relevance) higher than bad hits.