Probabilistic inference in human semantic memory. Mark Steyvers, Tomas L. Griffiths, and Simon Dennis. TRENDS in Cognitive Sciences vol. 10, no. 7, 2006.

Page 1: Probabilistic inference in human semantic memory (Mark Steyvers, Tomas L. Griffiths, and Simon Dennis)

Probabilistic inference in human semantic memory

Mark Steyvers, Tomas L. Griffiths, and Simon Dennis

Soft Computing Laboratory, Keun-Hyun Oh

TRENDS in Cognitive Sciences vol. 10, no. 7, 2006

Page 2

Overview

• Relational models of memory

• The Retrieving Effectively from Memory (REM) model

• High-dimensional semantic spaces

• Probabilistic topic models

• Modeling semantic memory at the sentence level

• Conclusion


Page 3

Relational Models of Memory

• Much of the knowledge stored in memory is encoded using language
  – Most of the stimuli used in memory experiments are linguistic
  – How do the statistics of language influence human memory?

• Rational analysis
  – A framework for developing computational models of cognition
  – Working assumption: human cognition approximates an optimal response to the computational problems posed by the environment

• The role of probabilistic inference in memory
  – A history factor: the occurrence pattern of the item over time (recent or frequent)
  – A context factor: the association between items (similarity)

Page 4

The Retrieving Effectively from Memory (REM) Model

• Makes stronger assumptions about the encoding process and the representation of information

• Emphasizes the role of probabilistic inference in explaining human memory

• Application: the recognition memory task
  – Discriminating between old items and new items
  – Assumption: words are represented as vectors of features (noisy and incomplete)
  – Decision via a calculation of the likelihood ratio
    • Balancing the evidence for an 'old' decision against a 'new' decision
    • Based on the degree of match between the memory probe and the contents of memory

Page 5

Example of Comparing a Test Item to Memory Traces

• 5 features per item

• Green box: matches

• Red box: mismatches

• Empty positions: missing features

• The 3rd memory trace provides the most evidence that the test item is old

• The data D: the contents of each memory trace

Page 6

Posterior Odds

• Posterior: after the evidence

• Odds: the ratio of the probabilities of two mutually exclusive hypotheses

• The prior odds are set at 1 (the number of old items equals the number of new items)

• An 'old' decision is made when the posterior odds exceed some criterion (usually set at 1)

• An example distribution of log posterior odds [figure]
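The calculation summarized on this and the previous slide can be written out as a short sketch. This is a minimal illustration of REM's likelihood-ratio logic, not the authors' implementation: it assumes geometrically distributed feature values with base rate g and a storage (copy) accuracy c, following the standard REM formulation, and the probe, traces, and parameter values below are all made up.

```python
# Minimal sketch of REM's likelihood-ratio / posterior-odds calculation.
# Assumes geometric feature values with base rate g and copy accuracy c;
# 0 denotes a missing feature. All values are illustrative.

def trace_likelihood_ratio(probe, trace, g=0.4, c=0.7):
    """Likelihood ratio for one trace: evidence that it stored the probe."""
    ratio = 1.0
    for p, t in zip(probe, trace):
        if t == 0:                # missing feature: no evidence either way
            continue
        if p == t:                # matching feature with value v = t
            chance = g * (1 - g) ** (t - 1)   # probability of a match by luck
            ratio *= (c + (1 - c) * chance) / chance
        else:                     # mismatching feature
            ratio *= (1 - c)
    return ratio

def posterior_odds(probe, traces, g=0.4, c=0.7):
    """Average the per-trace ratios; respond 'old' if the odds exceed 1."""
    ratios = [trace_likelihood_ratio(probe, t, g, c) for t in traces]
    return sum(ratios) / len(ratios)

probe = [1, 2, 1, 3, 2]           # 5 features, as in the slide's example
traces = [
    [2, 2, 0, 1, 1],              # partial match
    [3, 1, 2, 0, 3],              # poor match
    [1, 2, 0, 3, 2],              # best match: most evidence the item is old
]
odds = posterior_odds(probe, traces)
decision = "old" if odds > 1 else "new"
```

Note how rarer (larger) feature values produce larger ratio boosts when they match, which is the mechanism behind the mirror effect discussed on the next slide.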

Page 7

Mirror Effects

• Increase in hit rate
  – Hit rate: correctly recognizing an old item as old

• Decrease in false-alarm rate
  – False-alarm rate: falsely recognizing a new item as old

• Example: low-frequency words
  – More rare features are stored for low-frequency words relative to high-frequency words
  – Rare features improve the diagnosticity of feature matches for low-frequency target items
  – Rare features also lower the probability of chance matches by low-frequency distractor items

Page 8

The Hyperspace Analogue to Language (HAL)

• Represents each word by a vector in which each element is a weighted co-occurrence value of that word with some other word
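A HAL-style vector can be sketched as weighted co-occurrence counting in a sliding window. The window size, the linear distance weighting, and the toy corpus below are illustrative assumptions, not HAL's exact parameters (HAL also keeps separate preceding and following counts, which this sketch collapses into one symmetric count).

```python
# Sketch of HAL-style word vectors: each word's vector holds weighted
# co-occurrence counts with other words, where nearer neighbours in a
# sliding window receive higher weight. Parameters are illustrative.
from collections import defaultdict

def hal_vectors(tokens, window=5):
    vecs = defaultdict(lambda: defaultdict(float))
    for i in range(len(tokens)):
        # look ahead up to `window` words; weight drops linearly with distance
        for d in range(1, window + 1):
            if i + d >= len(tokens):
                break
            weight = window - d + 1
            vecs[tokens[i]][tokens[i + d]] += weight
            vecs[tokens[i + d]][tokens[i]] += weight   # symmetric count
    return vecs

corpus = "the cat sat on the mat the dog sat on the rug".split()
v = hal_vectors(corpus, window=3)
```

With this weighting, "cat" ends up more strongly associated with its immediate neighbour "sat" than with the more distant "on".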

Page 9

Latent Semantic Analysis (LSA)

• Uses the co-occurrence information between words and passages

• Applies matrix decomposition techniques
  – to reduce the dimensionality of the original matrix to a much smaller size
  – while preserving as much as possible of the covariation structure of words and documents
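The dimensionality-reduction step can be illustrated with a truncated singular value decomposition, the technique LSA uses. The toy word-by-document count matrix and the choice of k = 2 dimensions below are made up for illustration.

```python
# Minimal LSA sketch: truncate the SVD of a word-by-document count matrix
# to k dimensions, preserving as much covariation as possible.
import numpy as np

# rows = words, columns = documents (raw counts); toy data
X = np.array([
    [2, 0, 1, 0],
    [1, 0, 2, 0],
    [0, 3, 0, 1],
    [0, 1, 0, 2],
], dtype=float)

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
word_vecs = U[:, :k] * s[:k]       # k-dimensional word representations

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# words 0 and 1 share documents; words 0 and 2 never co-occur
sim_related = cosine(word_vecs[0], word_vecs[1])
sim_unrelated = cosine(word_vecs[0], word_vecs[2])
```

Words that occur in the same documents end up close in the reduced space even though each pair of rows in the raw matrix is distinct.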

Page 10

The Word Association Space (WAS)

• Input: a set of word association norms
  – Formed by asking subjects to produce the first word that comes to mind in response to a given cue

Page 11

Probabilistic Topic Models

• Focus on prediction as a central problem of memory

• Emphasize the role of context in guiding predictions

• Capture the latent structure of words using topics
  – Documents are mixtures of topics
  – A topic is a probability distribution over words

• The topic model is a generative model for documents

Page 12

Distributions for Topics

• Each topic gives different weight to thematically related words

• P(z): the distribution over topics z in a particular document

• P(w|z): the probability distribution over words w given topic z

• P(z_i = j): the probability that the j-th topic was sampled for the i-th word

• P(w_i | z_i = j): the probability of word w_i under topic j

• T: the number of topics

• Together these give the word distribution: P(w_i) = Σ_{j=1..T} P(w_i | z_i = j) P(z_i = j)
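The generative process these distributions define can be sketched directly: for each word slot in a document, sample a topic from P(z), then a word from P(w|z). The two topics, their word distributions, and the P(z) values below are invented for illustration.

```python
# Sketch of the topic model as a generative model for documents:
# sample topic z ~ P(z), then word w ~ P(w|z), for each word slot.
# The topics and probabilities are made up for illustration.
import random

topics = {
    0: {"money": 0.5, "bank": 0.4, "loan": 0.1},      # finance topic
    1: {"river": 0.5, "bank": 0.3, "stream": 0.2},    # river topic
}
p_z = {0: 0.7, 1: 0.3}   # P(z) for one particular document

def sample(dist, rng):
    """Draw one item from a {item: probability} distribution."""
    r, acc = rng.random(), 0.0
    for item, p in dist.items():
        acc += p
        if r < acc:
            return item
    return item            # guard against rounding at the boundary

def generate_document(n_words, rng):
    doc = []
    for _ in range(n_words):
        z = sample(p_z, rng)                 # choose a topic for this slot
        doc.append(sample(topics[z], rng))   # choose a word from that topic
    return doc

rng = random.Random(0)
doc = generate_document(10, rng)
```

Note that "bank" can be generated by either topic, which is how the model handles the ambiguous words discussed on the later slides.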

Page 13

Using Bayes' Rule

• The problem is to predict the conditional probability of a word w2 (the response word) given a cue word w1

• Applying Bayes' rule: P(z | w1) ∝ P(w1 | z) P(z), so that P(w2 | w1) = Σ_z P(w2 | z) P(z | w1)
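This prediction can be sketched in a few lines: infer the topic posterior for the cue by Bayes' rule, then average the topic word distributions under that posterior. The topic distributions below are invented for illustration.

```python
# Sketch of word-association prediction in a topic model:
#   P(w2 | w1) = sum_z P(w2 | z) * P(z | w1),
# with Bayes' rule giving P(z | w1) proportional to P(w1 | z) * P(z).
# The topics and priors are made up for illustration.

topics = {
    0: {"money": 0.5, "bank": 0.4, "loan": 0.1},      # finance topic
    1: {"river": 0.5, "bank": 0.3, "stream": 0.2},    # river topic
}
p_z = {0: 0.5, 1: 0.5}   # prior P(z)

def p_topic_given_word(w1):
    """P(z | w1) via Bayes' rule."""
    joint = {z: words.get(w1, 0.0) * p_z[z] for z, words in topics.items()}
    total = sum(joint.values())
    return {z: v / total for z, v in joint.items()}

def p_response_given_cue(w2, w1):
    """P(w2 | w1): topic word distributions weighted by the posterior."""
    post = p_topic_given_word(w1)
    return sum(topics[z].get(w2, 0.0) * post[z] for z in topics)

# the cue 'money' points to the finance topic, so 'loan' beats 'river'
p_loan = p_response_given_cue("loan", "money")
p_river = p_response_given_cue("river", "money")
```

The cue word disambiguates the context: an unambiguous cue concentrates the posterior on one topic, while an ambiguous cue like "bank" spreads it across both.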

Page 14

The Same Word with Two Different Meanings

Page 15

The Same Word with Two Different Meanings

• The topic model makes predictions about which words are likely to appear next in a document or conversation, based on the previous words

• It provides a rational model of how context should influence memory
  – Earlier models focused on the role of history
  – Here the effects of context are appropriately modulated by the statistics of the environment

Page 16

The Syntagmatic Paradigmatic (SP) Model

• Models semantic memory at the sentence level

• Specifies a simple probabilistic model of knowledge and allows representational content to emerge in response to the structure of the environment

• Syntagmatic associations
  – Capture structural knowledge between words that follow each other
  – Stored as structural traces that correspond to individual sentences

• Paradigmatic associations
  – Between words that fill the same slots across sentences
  – Combined to form relational (or propositional) traces that correspond to those same sentences

Page 17

Sampling Structural and Relational Exemplars

• The two sentences are mapped to each other in a one-to-one fashion

• Change: Sampras-Kuerten / Agassi-Roddick

• Match: defeated

• Delete: Pete is aligned to the deletion symbol (an empty word)

• Priority: match > change > insertion or deletion
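The one-to-one mapping described above behaves like standard string alignment. Below is a minimal dynamic-programming sketch that prefers matches over changes over insertions/deletions; the scoring values (2/1/0) are an illustrative assumption, not the SP model's actual sampling procedure.

```python
# Sketch of one-to-one sentence alignment with the slide's priority:
# match > change > insertion/deletion. Scores are illustrative.

def align(s1, s2, match=2, change=1, gap=0):
    n, m = len(s1), len(s2)
    # best[i][j] = best score aligning s1[:i] with s2[:j]
    best = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        for j in range(m + 1):
            if i == 0 or j == 0:
                best[i][j] = (i + j) * gap
                continue
            pair = match if s1[i-1] == s2[j-1] else change
            best[i][j] = max(best[i-1][j-1] + pair,   # match or change
                             best[i-1][j] + gap,      # delete from s1
                             best[i][j-1] + gap)      # insert from s2
    # trace back to recover the aligned pairs
    pairs, i, j = [], n, m
    while i > 0 and j > 0:
        pair = match if s1[i-1] == s2[j-1] else change
        if best[i][j] == best[i-1][j-1] + pair:
            pairs.append((s1[i-1], s2[j-1]))
            i, j = i - 1, j - 1
        elif best[i][j] == best[i-1][j] + gap:
            pairs.append((s1[i-1], "-"))              # deletion symbol
            i -= 1
        else:
            pairs.append(("-", s2[j-1]))              # insertion symbol
            j -= 1
    while i > 0:
        pairs.append((s1[i-1], "-")); i -= 1
    while j > 0:
        pairs.append(("-", s2[j-1])); j -= 1
    return best[n][m], list(reversed(pairs))

score, pairs = align("Pete Sampras defeated Agassi".split(),
                     "Kuerten defeated Roddick".split())
```

On the slide's example this reproduces the stated mapping: "defeated" matches, Sampras-Kuerten and Agassi-Roddick are changes, and "Pete" aligns to the deletion symbol.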

Page 18

The Architecture of the SP Model

• Traces 4 and 5 in sequential long-term memory contain the pattern {Roddick, Costa} at the '#' position

• In the probe "{Sampras} defeated {Agassi}", Agassi needs to align with the '#' symbol

• The resulting pattern is {Agassi, Roddick, Costa}

Page 19

Conclusion

• Probabilistic inference
  – A natural way to address problems of reasoning under uncertainty

• A search for richer representations of the structure of linguistic stimuli

• Contributes to the question of how linguistic stimuli might be represented in memory

• Rational analysis
  – A natural framework for understanding the tight coupling between behavior and environmental statistics
  – Linguistic stimuli form an excellent testing ground for rational models of memory