Transcript of "Human Capacity Cognitive Computing" by Patrick Ehlen, PhD, Chief Scientist at Loop AI Labs
Cognitive Computing Platforms
• IDC forecast: $12.5B market in 2019 (CAGR of 35%)
• By 2018, ½ of consumers will regularly interact with cognitive computing services
• How will it work?
• What is “cognitive computing,” anyhow?
Embed and recurse over long sequences
This is the rat that ate the malt that lay in the house that Jack built.
“Discrete Infinity”: infinite use of finite means
This is the farmer sowing the corn, that kept the cock that crowed in the morn, that waked the priest all shaven and shorn, that married the man all tattered and torn, that kissed the maiden all forlorn, that milked the cow with the crumpled horn, that tossed the dog, that worried the cat, that killed the rat, that ate the malt that lay in the house that Jack built.
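The nursery-rhyme example above is exactly what “discrete infinity” means: a finite grammar that, by embedding one relative clause inside another, yields unboundedly many sentences. A minimal Python sketch (the clause list and function name are illustrative, not from the talk):

```python
# "Discrete infinity": finite means, infinite use. Each recursive step
# embeds one more relative clause around the base noun phrase.
CLAUSES = [
    ("the malt", "lay in"),   # innermost clause
    ("the rat", "ate"),
    ("the cat", "killed"),
    ("the dog", "worried"),
]

def jack_built(depth):
    """Recursively embed `depth` relative clauses around the base noun phrase."""
    if depth == 0:
        return "the house that Jack built"
    noun, verb = CLAUSES[depth - 1]
    return f"{noun} that {verb} {jack_built(depth - 1)}"

print("This is " + jack_built(2) + ".")
# → This is the rat that ate the malt that lay in the house that Jack built.
```

Adding a clause to the list extends the sentence without changing the rule, which is the “infinite use of finite means.”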
3 Possible Sources:
• Interface
• Learning Algorithm
• Architecture
“The Human Capacity”
Deep Learning
Hinton (’81, ’86)
• Assemblies for different semantic roles (hack)
Hinton, G.E. (1981) Implementing semantic networks in parallel hardware.
Hinton, G.E. (1986) Learning distributed representations of concepts.
Frankland & Greene (2015)
• Assemblies for different semantic roles (brain)
Frankland, S.M. & Greene, J.D. (2015) An architecture for encoding sentence meanings in left mid-superior temporal cortex. PNAS 112:37
Recap
• (Even though we can all do it…) Language is Hard
• “Human Capacity” probably arises from special architecture
Bachelor #1 Bachelor #2
Katz, J.J. & Fodor, J.A. (1963) The structure of a semantic theory. Language 39:2
Bachelor #1 Bachelor #2
Semantic features: noun, human, animal, male, never married, young, unmated, seal, academic degree
“Units Representation”
Problems with Dictionary Approach:
• “necessary and sufficient” features: neither necessary nor sufficient
• Prototypical examples (E. Rosch): “robin” is more representative of bird than “finch” or “penguin”
• Things don’t categorize so easily
• Metaphorical & analogic nature of language (G. Lakoff & M. Johnson, D. Hofstadter)
Problems with Dictionary Approach:
• “Edge cases” (C. Fillmore):
• “Widow”: woman who murdered husband?
“Max went too far today and teapotted a policeman”
(H. Clark)
Distributional approach:
• Determine what words mean solely by their lexical context (surrounding words)
the quick brown fox jumped over the lazy dog
• Use dimensionality reduction to collapse into latent factors (or “microfeatures”)
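A minimal sketch of this pipeline, assuming a toy two-sentence corpus and a ±2-word window (both illustrative): count word co-occurrences, then use truncated SVD to collapse the context features into a few latent factors (“microfeatures”).

```python
import numpy as np

# Distributional approach: a word's meaning comes from its surrounding
# words. Corpus and window size below are illustrative assumptions.
corpus = [
    "the quick brown fox jumped over the lazy dog".split(),
    "the quick red fox jumped over the sleepy cat".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Count co-occurrences within a symmetric +/-2-word window.
counts = np.zeros((len(vocab), len(vocab)))
window = 2
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                counts[idx[w], idx[sent[j]]] += 1

# Truncated SVD: keep the top-k latent factors as dense word vectors.
U, S, Vt = np.linalg.svd(counts)
k = 3
vectors = U[:, :k] * S[:k]          # one k-dimensional vector per word

def similarity(a, b):
    """Cosine similarity between two words' latent-factor vectors."""
    va, vb = vectors[idx[a]], vectors[idx[b]]
    return va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb))

print(round(similarity("fox", "fox"), 2))  # a word is maximally similar to itself: 1.0
```

Words that share contexts end up with nearby vectors, without any hand-built feature dictionary.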
Local Representation:
Semantic features: noun, human, animal, male, never married, young, unmated, seal, academic degree
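The local “units” representation above assigns each discrete feature its own unit, so a word sense is a sparse binary vector. The feature inventory and sense vectors below are an illustrative reconstruction of the Katz & Fodor “bachelor” example, not the talk’s exact diagram:

```python
import numpy as np

# Local ("units") representation: one unit per hand-built feature.
# Feature list is illustrative, after Katz & Fodor's "bachelor".
features = ["noun", "human", "animal", "male", "never married",
            "young", "unmated", "academic degree"]
f_idx = {f: i for i, f in enumerate(features)}

def local_vector(active):
    """Binary vector with a 1 for each active feature unit."""
    v = np.zeros(len(features))
    for f in active:
        v[f_idx[f]] = 1.0
    return v

bachelor_man = local_vector(["noun", "human", "male", "never married"])
bachelor_seal = local_vector(["noun", "animal", "male", "young", "unmated"])

# Each unit is interpretable, but every new distinction needs a new
# hand-built unit. A distributed representation instead spreads meaning
# over many learned, non-interpretable dimensions:
rng = np.random.default_rng(0)
distributed = rng.normal(size=(2, 50))   # e.g. 50 learned microfeatures
```

The contrast is the hinge of the talk: dictionaries enumerate units by hand; deep learning learns the distributed version from data.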
Deep Learning
• Learn distributed representations using neural networks
• Learn from data as it comes in
• Learn from sequences (e.g., sentences)
Deep Learning
• Learn from lots of additional context features (not just other words)
• Visual features (CNNs)
• Parse structure (Recursive NNs)
• Higher-level abstractions from earlier sequences (RNNs)
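A minimal forward pass of a vanilla RNN, with illustrative random weights and toy dimensions (no training shown), shows how the hidden state carries abstractions from earlier steps of the sequence forward:

```python
import numpy as np

# Vanilla RNN forward pass: the hidden state at each step depends on
# the current input AND the whole prefix of the sequence before it.
rng = np.random.default_rng(0)
d_in, d_hid = 8, 16
W_xh = rng.normal(scale=0.1, size=(d_hid, d_in))   # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(d_hid, d_hid))  # hidden-to-hidden recurrence
b_h = np.zeros(d_hid)

def rnn_forward(xs):
    """Fold a sequence of input vectors into one hidden state."""
    h = np.zeros(d_hid)
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)  # state mixes input and history
    return h

sequence = rng.normal(size=(5, d_in))  # e.g. five word embeddings
h_final = rnn_forward(sequence)
```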
“Max went too far today and teapotted a policeman”
(H. Clark)
• For Human Capacity Cognitive Computing: HUGE potential “context feature” input space
• Very sparse
Deep Learning
• Large, sparse input fully-connected to many layers
• Complex memory assemblies
• RNN
• LSTM or GRU to retain relevant context features from further upstream
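A single GRU step, sketched with illustrative random weights, shows the mechanism the bullet points at: the update gate decides how much context from further upstream the state retains versus overwrites.

```python
import numpy as np

# One GRU step (after Cho et al. 2014). Weights are random, for
# illustration only; dimensions are toy-sized.
rng = np.random.default_rng(0)
d = 4
Wz, Uz = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wr, Ur = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wh, Uh = rng.normal(size=(d, d)), rng.normal(size=(d, d))

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x, h_prev):
    z = sigmoid(Wz @ x + Uz @ h_prev)      # update gate: keep vs. refresh state
    r = sigmoid(Wr @ x + Ur @ h_prev)      # reset gate: how much past to read
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))
    return (1 - z) * h_prev + z * h_tilde  # gated blend of old and new state

h = np.zeros(d)
for x in rng.normal(size=(6, d)):          # run six steps of a toy sequence
    h = gru_step(x, h)
```

When z is near 0 the old state passes through untouched, which is how relevant features survive across long sequences.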
Human Capacity Cognitive Computing Platform
• Handle context feature input from multiple modalities and project into single representation space
• Support architectures with specialized assemblies permitting recursion / embedding
• “Discrete Infinity”
• “Fluid” interpretation and understanding
Loop Cognitive Computing Platform
• GPU-based appliance
• Human Capacity understanding
• Learns from unstructured and structured data
• Produces a structured representation
• Understands concepts in the context of their domain