(How) Can Semantic Information Be Reduced to Information Theory? Implications for an Information-Theoretic Philosophy of Mind
Anthony Beavers, Ph.D., Director of Cognitive Science
Cognitive Science Lecture Series, Rensselaer Polytechnic Institute – April 4th, 2012
What brings happiness?
Photo courtesy of the Consortium for Cognitive Science Instruction (CCSI): http://www.mind.ilstu.edu/curriculum/searle_chinese_room/searle_chinese_room.php
Presentation Outline
• A Brief Introduction to Information-Theoretic Teleodynamics
• The Hypothesis and the Method
• Dynamic Associative Networks (DANs): Circuits Not Software
  • A DAN Example of Object Identification Sensitive to Context
  • A DAN Example of Natural Language Processing without a Parser
• Toward an Information-Theoretic Philosophy of Mind
  • The Chinese Room Argument
  • The Symbol Grounding Problem
  • The China Room Argument
• Tentative Conclusions and Next Steps
• Past Work Related to this Presentation and Acknowledgements
Thermodynamics, Morphodynamics and Teleodynamics
Terrence W. Deacon, "Emergence: The Hole at the Wheel's Hub." In The Re-Emergence of Emergence, edited by Philip Clayton & Paul Davies (Oxford University Press, 2006), 111–150.
• Thermodynamics (first order emergence)
• Morphodynamics (second order emergence)
• Teleodynamics (third order emergence)

These are NOT mere philosophical distinctions! Deacon is an anthropologist at Berkeley with specializations in biological anthropology and neuroscience.
Thermodynamic Systems
T. Deacon, “Emergence: The Hole at the Wheel’s Hub.”
Thermodynamics (first order emergence)
• No top-down effects
• Fully reductive – can be fully explained solely in reference to parts
• Causally transparent
• Still admits of property asymmetry between parts and wholes
• Example: Liquidity
Morphodynamic Systems
T. Deacon, “Emergence: The Hole at the Wheel’s Hub.”
Morphodynamics (second order emergence)
• Emerges from thermodynamic systems
• Exhibits "simple recurrence" that enables self-organization
• Results from the instability of thermodynamic possibilities and environmental feedback that amplifies a stable pattern – past action restrains the space of future possibilities
• Examples: Bénard cells, snowflakes
Teleodynamic Systems
T. Deacon, “Emergence: The Hole at the Wheel’s Hub.”
Teleodynamics (third order emergence)
• Emerges from morphodynamic systems
• Exhibits goal-oriented activity thanks to memory and information
• Results from the sampling of influences that select for "self-similarity maintenance"
• Uses a circular architecture of causal self-reference analogous to "attractors"
• Examples: Memory, evolution
Teleodynamic Systems
T. Deacon, “Emergence: The Hole at the Wheel’s Hub.”
Teleodynamics (third order emergence)
"Third-order emergent dynamics are … intrinsically organized around specific absences. This physical disposition to develop toward some target state of order merely by persisting and replicating better than neighboring alternatives is what justifies calling this class of physical processes teleodynamic, even if it is not directly and literally a 'pull' from the future." (143)
A Working Definition of ITT (Not Deacon's)

Information-Theoretic Teleodynamics is an approach to intelligence in both natural and artificial systems that uses the quantity and distribution of information to drive goal-oriented behavior without recourse to semantic content. (Beavers & Harrison 2012)

Related to Deacon, as we will see, the way in which network activation pushes its way through a trajectory that simultaneously opens before it is particularly interesting, since the trajectory dynamically modifies itself in step with changing network activation and thereby continually sets up possibilities that the network WILL actualize given further stimulus. These possibilities allow us to conceive of the system as teleodynamic and not merely teleonomic.
A Working Hypothesis
It is possible to reduce semantic information to the quantification of information flow, provided that other conditions respecting the internal structure of an agent and its relations to its environment are met.
Image from http://explodingdog.com/feb26/thisjob.html
<<< Not this!
Answers to one of Floridi's Open Problems in the Philosophy of Information: "Do information or algorithmic theories … provide the necessary conditions for any theory of semantic information?" (p. 31).
Luciano Floridi, The Philosophy of Information, Oxford, 2011.
"It is … quite difficult to think about the code entirely in abstracto without any kind of circuit."
Alan Turing, 1947 Lecture to the London Mathematical Society
The Method
Computational Philosophy (CP): The use of computational techniques to aid in the discovery of philosophical insights that might not be easily discovered otherwise.

Cognitive Modeling (CM): The use of computer models to illuminate epistemic and informational processes that govern behavior and decision making.

(CP and CM, taken together, hope to resituate the interrogative terrain of epistemology to make headway on issues of current concern in both philosophy and cognitive science.)
Dynamic Associative Networks: Circuits Not Software
• DANs vs ANNs
• Hebbian Learning with Circuits
1. Object identification with context sensitivity
2. Comparison of similarity and difference
3. Automatic document classification
4. Shape recognition on a grid
5. Association across simulated sense modalities
6. Primary sequential memory
7. Network branching across subnets
8. Eight-bit register control
9. Rudimentary natural S&R language processing
Information Entropy and Teleodynamics
Hopfield Attractor from Scholarpedia
Shannon Information Entropy (1948)

The more random a piece of information, the more it may decrease our ignorance. Hence, on a semantic reading, as entropy rises, so does the informativeness of a piece of information. In what follows, we use the concept and not the formula for information entropy.

Shannon, C. E. (1948) A Mathematical Theory of Communication. Bell System Technical Journal 27: 379–423, 623–656.
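Although the talk uses the concept rather than the formula, the underlying quantity is easy to exhibit. A minimal sketch (mine, not from the presentation) of Shannon's H = −Σ p·log₂(p), showing that a more uniform (more "random") source scores higher than a predictable one:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)): the average surprisal of a distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A predictable source (one outcome dominates) carries little surprise...
skewed = shannon_entropy([0.97, 0.01, 0.01, 0.01])
# ...while a uniform source over four outcomes yields the maximum, 2 bits.
uniform = shannon_entropy([0.25, 0.25, 0.25, 0.25])
print(skewed, uniform)
```

Four equiprobable outcomes give exactly 2.0 bits; the skewed distribution gives far less, matching the slide's point that randomness and informativeness rise together.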
The Inverse Relationship Principle (IRP)
Barwise, J., and Seligman, J. (1997) Information Flow: The Logic of Distributed Systems. Cambridge, UK: Cambridge University Press.
Though Shannon famously claimed that his attempts to quantify information in engineering had little to do with semantic content, his word was not the last on the subject, and the issue has been a matter of some debate. Barwise and Seligman noted semantic corollaries to Shannon in what has been identified as the Inverse Relationship Principle. IRP says that the rarer a piece of information is, the more informative it is, and vice versa. We can represent this with the simple formula 1/n or 1/n², where n represents the number of connections that feed into a given node in the DANs.
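The 1/n² weighting can be sketched in a few lines. This is an illustration of the principle only, not the authors' DAN implementation; the node names and fan-in counts below are invented for the example:

```python
# Sketch of IRP-style scoring (illustrative, not the actual DAN circuitry):
# a node's informativeness falls as its fan-in n (incoming connections) grows.

def irp_weight(n, squared=True):
    """1/n or 1/n**2: rarely connected nodes are the most informative."""
    return 1.0 / (n * n if squared else n)

# Hypothetical fan-in counts: a widely shared property ("warm-blooded")
# counts for far less than a nearly unique one ("oinks").
fan_in = {"warm-blooded": 20, "four-legged": 12, "oinks": 1}
for node, n in fan_in.items():
    print(node, irp_weight(n))  # oinks -> 1.0, warm-blooded -> 0.0025
```

The squared variant simply exaggerates the contrast between common and rare connections, which is why the sample results below are labeled "using 1/n²".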
Sample DAN Recurrent Wiring Schematic
[Figure: recurrent wiring schematic with node columns labeled Properties, Properties and Objects; edges carry fractional weights (.06, .11, .17, .25, .50, 1.50) and nodes carry activation totals (t = .11, t = 1.00).]
Property Set: small, furry, meows, barks, winged, flies, swims, finned, crawls, many-legged (more than four), two-legged, warm-blooded, live offspring, walks, four-legged, farm animal, oinks, moos, ridable, big, crows, woodland, hops, hoofed, curly, hisses, cold-blooded, scaled, quacks, coos, seafaring, spawns, large-mouthed, web-making, whiskered and feathered.
Object Set: dog, cat, bird, duck, fish, caterpillar, human, pig, cow, horse, rooster, rabbit, deer, retriever, poodle, mouse, snake, crocodile, lizard, turtle, dolphin, whale, ostrich, pigeon, shark, salmon, bass, house fly, cricket, spider, catfish and seal.
Sample Results using 1/n²
Cue: winged, 2-legged
Objects retrieved: bird, duck, human, rooster, ostrich, pigeon, house fly, cricket
Properties retrieved: small, winged, flies, swims, 2-legged, warm-blooded, farm animal, walks, big, crows, woodland, quacks, coos, feathered
Note that flies has yet to be pulled into the list, since ostriches and roosters do not fly. However, on the next recursion, rooster and ostrich are slightly demoted (but only slightly), letting the generic bird with the duck and pigeon count for more (and allowing the human, house fly and cricket to pass the threshold at a very low level). Typicality wins out on the recursion cycle over uniqueness, as flies is elevated over the threshold, further demoting the rooster and the ostrich. Many-legged, live offspring and cold-blooded are pulled into the property set, but they do not cross the threshold. Further recursion after the third cycle produces no further re-evaluation of nodes, unlike in other networks that can take several hundred iterations to settle.
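The recursion just described can be caricatured in a few lines of code. The miniature object–property table and the update rule here are my own simplification for illustration, not the DAN circuit itself:

```python
# Toy spreading-activation sketch of the recursion described above. The tiny
# object->property table and the 1/n**2 update are illustrative stand-ins.

OBJECTS = {
    "bird":    {"winged", "2-legged", "flies", "feathered"},
    "duck":    {"winged", "2-legged", "flies", "quacks"},
    "rooster": {"winged", "2-legged", "crows"},
    "ostrich": {"winged", "2-legged", "big"},
}

def fan_in(prop):
    # Number of object nodes feeding into this property node.
    return sum(prop in props for props in OBJECTS.values())

def recall(cues, cycles=3):
    """Cue properties -> score objects (IRP-weighted by 1/n**2) -> re-cue the
    winner's properties, and repeat; activation settles within a few cycles."""
    for _ in range(cycles):
        scores = {obj: sum(1.0 / fan_in(p) ** 2 for p in cues & props)
                  for obj, props in OBJECTS.items()}
        winner = max(scores, key=scores.get)
        cues = cues | OBJECTS[winner]  # typical properties get pulled in
    return scores, cues

scores, cues = recall({"winged", "2-legged"})
print(scores)
print("flies" in cues)  # 'flies' is pulled in via the typical birds
```

After the re-cue step, the flightless birds no longer dominate the activation pattern, which mirrors the slide's point that typicality wins out over uniqueness across recursion cycles.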
Natural Language Processing (NLP) without a Parser

Elementary natural languages require that four pieces of information be tracked: (1) the agent (2) at a particular location (3) doing a particular what (4) at a particular time. (Scheutz and Schermerhorn, 2008)

Scheutz and Schermerhorn go on to argue that language based on these four elements requires "representational devices [that], in turn, require a systematic encoding (i.e., representations with formal rules defining well-formed expressions) and mechanisms that can encode and decode information (i.e., parsers)."

Scheutz, M., and Schermerhorn, P. (2008) The limited utility of communication in simple organisms. In Bullock, S., Noble, J., Watson, R., & Bedau, M. (Eds.), Artificial Life XI: Proceedings of the Eleventh International Conference on the Simulation and Synthesis of Living Systems (pp. 521–528).

DAN-based NLP suggests otherwise … at least for primitive agents and, perhaps, ultimately, for more complex agents as well.
The Complete Vocabulary Set – 75 Tokens
Years (21): 1596, 1632, 1633, 1637, 1642, 1646, 1650, 1656, 1663, 1670, 1677, 1685, 1687, 1690, 1704, 1710, 1711, 1716, 1727, 1753, 1776
Places (7): Amsterdam, Edinburgh, England, France, Ireland, Leipzig, Sweden
Philosophers (7): Berkeley, Descartes, Hume, Leibniz, Locke, Newton, Spinoza
Works (7): Enquiry, Essay, LeMonde, Meditations, PrinCarPhil, Principia, TheoPolTre
Fact tokens (23): F1–F23
Other (10): born, died, excommunicated, facts, ordained, what, when, where, who, wrote
The Vocabulary as DAN Clustered – Slide 1
@ Who Descartes @ Who Spinoza @ Who Leibniz @ Who Newton @ Who Locke @ Who Berkeley @ Who Hume @
@ What LeMonde @ What Meditations @ What PrinCarPhil @ What TheoPolTre @ What Essay @ What priest @ What Enquiry @ What Principia @
@ When 1596 @ When 1650 @ When 1633 @ When 1637 @ When 1632 @ When 1677 @ When 1656 @ When 1663 @ When 1670 @ When 1646 @ When 1716 @ When 1642 @ When 1727 @ When 1704 @ When 1690 @ When 1685 @ When 1753 @ When 1710 @ When 1711 @ When 1776 @ When 1687 @
@ Where France @ Where Sweden @ Where Amsterdam @ Where Leipzig @ Where England @ Where Ireland @ Where Edinburgh @
The Vocabulary as DAN Clustered – Slide 2
@ Facts F1 @ Facts F2 @ Facts F3 @ Facts F4 @ Facts F5 @ Facts F6 @ Facts F7 @ Facts F8 @ Facts F9 @ Facts F10 @ Facts F11 @ Facts F12 @ Facts F13 @ Facts F14 @ Facts F15 @ Facts F16 @ Facts F17 @ Facts F18 @ Facts F19 @ Facts F20 @ Facts F21 @ Facts F22 @ Facts F23 @
@ F1 Descartes born France 1596 @
@ F2 Descartes died Sweden 1650 @
@ F3 Descartes wrote LeMonde 1633 @
@ F4 Descartes wrote Meditations 1637 @
@ F5 Spinoza born Amsterdam 1632 @
@ F6 Spinoza died 1677 @
@ F7 Spinoza excommunicated 1656 @
@ F8 Spinoza wrote PrinCarPhil 1663 @
@ F9 Spinoza wrote TheoPolTre 1670 @
@ F10 Leibniz born Leipzig 1646 @
@ F11 Leibniz died 1716 @
@ F12 Newton born England 1642 @
@ F13 Newton died 1727 @
@ F14 Locke born England 1632 @
@ F15 Locke died England 1704 @
@ F16 Locke wrote Essay 1690 @
@ F17 Berkeley born Ireland 1685 @
@ F18 Berkeley died England 1753 @
@ F19 Berkeley ordained priest 1710 @
@ F20 Hume born 1711 Edinburgh @
@ F21 Hume died Edinburgh 1776 @
@ F22 Hume wrote Enquiry @
@ F23 Newton wrote Principia 1687 @
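A rough sketch of how such fact clusters can answer questions without a parser: the query's tokens activate the facts that contain them, and the surviving facts return their tokens of the asked-for kind. The fact subset below is abridged from the slide; the mechanics are my illustration, not the DAN circuit:

```python
# Parserless Q&A over associative fact clusters (illustrative sketch; the
# facts are abridged from the slide, the mechanics are not the DAN's).

FACTS = {
    "F3": {"Descartes", "wrote", "LeMonde", "1633"},
    "F4": {"Descartes", "wrote", "Meditations", "1637"},
    "F7": {"Spinoza", "excommunicated", "1656"},
    "F9": {"Spinoza", "wrote", "TheoPolTre", "1670"},
}
WHO = {"Descartes", "Spinoza"}
WHEN = {tok for toks in FACTS.values() for tok in toks if tok.isdigit()}

def answer(category, *cue_words):
    """Facts containing every cue word stay active; return their tokens
    that fall in the queried category (who/when/...)."""
    active = [toks for toks in FACTS.values() if set(cue_words) <= toks]
    return {tok for toks in active for tok in toks & category}

print(answer(WHO, "wrote", "Meditations"))   # {'Descartes'}
print(answer(WHEN, "wrote", "Meditations"))  # {'1637'}
print(answer(WHO, "excommunicated"))         # {'Spinoza'}
```

No grammar, parse tree, or word order is involved: only set intersection over associated tokens, which is why queries like "Who wrote the Meditations?" and the bare follow-up "When?" can both be resolved against the same active facts.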
Some Queries
H: Who wrote the Meditations?
C: Descartes
H: When?
C: 1637
H: Where was Spinoza born?
C: Amsterdam
H: When?
C: 1632
H: Who was excommunicated?
C: Spinoza
Some Queries
H: Who was a priest?
C: Berkeley
H: Where was Berkeley born?
C: England and Ireland <- Error
H: Facts about where Berkeley was born?
C: F17 England Ireland
H: F17 {on the computer's request}
C: Ireland
H: {clear} Where did Berkeley die(d)?
C: England
Some Queries
H: {clear} What did Spinoza and Descartes wrote?
C: LeMonde, The Meditations, PrinCarPhil, TheoPolTre
H: Spinoza?
C: PrinCarPhil and TheoPolTre
H: When?
C: 1663 and 1670
H: 1670?
C: TheoPolTre
H: What did Descartes and Spinoza wrote and when?
C: LeMonde, 1633, Meditations, 1637, PrinCarPhil, 1663, TheoPolTre, 1670
Some Queries
H: Spinoza?
C: PrinCarPhil, 1663, TheoPolTre, 1670
H: What?
C: PrinCarPhil and TheoPolTre
H: When?
C: 1663 and 1670
H: TheoPolTre?
C: 1670
H: {clear} Who was born in England?
C: Locke, Newton, Berkeley
Some Queries
H: Facts about who was born in England?
C: F12, F14, Locke, Newton, Berkeley
H: F12?
C: Newton
H: F14?
C: Locke

Note that the mechanism can answer questions correctly, and in conditions of uncertainty, it can provide multiple answers. Furthermore, when the system recognizes the uncertainty (perhaps with outside influence), it can correct for it.
Limitations of This Particular NLP Model
• Does not yet use information entropy in its design (though I could and need to include it)
• Does not yet accommodate highly structured languages
• Does not take word sequence into account (though we do have a Primary Sequential Memory (PSM) model that does take sequences into account for numbers and may be extendable to informal languages)
• Our DAN-based PSM model uses c. 88 nodes to do without any training what Allen and Lange's 1996 connectionist model takes 280 to do.

Allen, C., and Lange, T. (1996) Primary Sequential Memory: An Activation-Based Connectionist Model. Neurocomputing 2-4: 227–243.

To be perfectly clear, this model is not offered as a model of human linguistic competence, though it may be befitting of an insect … and perhaps Nim Chimpsky, who learned 125 signs and some basic combinatorics for them. It is, nonetheless, a start toward an Information-Theoretic Philosophy of Mind just the same.
Toward an Information-Theoretic Philosophy of Mind

1: Revise Newell and Simon (1976) on the strong form of the Physical Symbol System Hypothesis.
From: A physical symbol system (i.e., a syntactic system that uses rules and tokens) has the necessary and sufficient means for intelligent action.
To: A properly wired circuit made out of the right kind of components has the necessary and sufficient means for intelligent action.

Newell, A., and Simon, H. (1976) Computer Science as Empirical Inquiry: Symbols and Search. Communications of the ACM 19.3: 113–126.
Toward an Information-Theoretic Philosophy of Mind

2: Revise Haugeland (1985) on the Formalist Motto that governs strong artificial intelligence.
From: If you take care of the syntax, the semantics will take care of itself.
To: If you take care of the circuit, the syntax (and hence the semantics) will take care of itself.

Haugeland, J. (1985) Artificial Intelligence: The Very Idea. Cambridge, MA: MIT Press.
3: Escape from Searle’s Chinese Room (1980).
4: Solve or eliminate Harnad’s Symbol Grounding Problem (1990).
The Chinese Room Argument
Image from http://berto-meister.blogspot.com/2011/12/chinese-room-thought-experiment.html
Searle, J. (1980) Minds, Brains and Programs. Behavioral and Brain Sciences 3: 417–457.
The Symbol Grounding Problem (SGP)
Harnad, S. (1990) The Symbol Grounding Problem. Physica D 42: 335–346.

"How can the semantic interpretation of a formal system be made intrinsic to the system, rather than just parasitic on the meanings in our heads? How can the meanings of the meaningless symbol tokens, manipulated solely on the basis of their (arbitrary) shapes, be grounded in anything but other meaningless symbols?" (335)

Requires that we assume that the Chinese Room is an apt characterization of human intelligence. Thus, if we can escape from the Chinese Room, SGP is eliminated.
Answers to #3 and #4
Intentionality is mere association mystified to look like an illusory phenomenon of 'aboutness' (Beavers 2004). If we wish to use the metaphor of a Turing Machine to characterize the human mind, we need to extend it accordingly: the brain is comparable to the read/write head (along with some of its own on-board memory) and the external world is the tape. (Internalist conceptions are remnants of a closet Cartesianism that refuses to disappear; they must, therefore, figure out how to get from inside the head out into the world. But we are always already in the world.)

Note that this is prefatory to understanding the evolution of intelligence as a species-phenomenon tied directly to the development of information technologies.
The China Room Argument
Image from the White House, Washington, D.C., USA (2011)
Tentative Conclusion and Next Steps

Fully working out the preceding should show how it is possible for the brain to be conceived of as a thermodynamic engine that fully obeys the laws of physics and yet gives rise to semantically-rich, goal-directive thought and action. The result: a mechanical and, yet, biologically-based, emergent materialism grounded in an information-theoretic approach to understanding circuitry.

Next Steps: Develop entropy-based NLP DANs to serve as the "brains" of individual agents that interact in an ABM to extend Patrick Grim's model of emergent communication as presented in his 2004 paper, Making Meaning Happen.

Grim, P., et al. (2004) Making Meaning Happen. The Journal of Experimental and Theoretical Artificial Intelligence 16.4: 209–243.
Be the stream of the universe! (whatever that’s supposed to mean)
Photo courtesy of the Consortium for Cognitive Science Instruction (CCSI): http://www.mind.ilstu.edu/curriculum/searle_chinese_room/searle_chinese_room.php
Past Work Related to this Presentation

Hybrid Networks: Transforming Networks for Social and Textual Analysis into Teleodynamic and Predictive Mechanisms (with C. Harrison). Institute for Advanced Topics in the Digital Humanities, Networks and Network Analysis for the Humanities, sponsored by the National Endowment for the Humanities, the National Science Foundation and the Institute for Pure and Applied Mathematics (IPAM), University of California, Los Angeles, October 22nd, 2011.

A Brief Introduction to Information-Theoretic Teleodynamics. Info-Metrics: Information Processing across the Sciences. Info-Metrics Institute Workshop on the Philosophy of Information, American University, Washington, D.C., October 3rd, 2011.

Information-Theoretic Teleodynamics in Natural and Artificial Systems (with C. Harrison). In A Computable Universe: Understanding Computation and Exploring Nature as Computation, H. Zenil (Ed.). World Scientific, forthcoming 2012.

Deriving Semantic Meaning by Measuring the Quantity of Information Flow. Fall Faculty Conference, The University of Evansville, Evansville, Indiana, August 17th, 2011.

Inside the Psychology of the Agent: Association, Attraction and Repulsion. Institute for Advanced Topics in the Digital Humanities, sponsored by the National Endowment for the Humanities and the Complex Systems Institute, University of North Carolina, Charlotte, June 10th, 2011.
Dynamic Associative Network Automatic Document Classification, D. Burrows, 2010–2011. Computer Science Senior Project for Noesis and the Indiana Philosophy Ontology Project. Funded by the National Endowment for the Humanities. Winner of the 2011 University of Evansville Senior Project in Computer Science Award.

Typicality Effects and Resilience in Evolving Dynamic Networks. In FS-10-03, AAAI Press, 2010.

More Fun with Jets and Sharks: Typicality Effects and the Search for the Perfect Attractors. North American Meeting of the International Association for Computing and Philosophy (IACAP), Simulations and Their Philosophical Implications, Carnegie Mellon University, July 24th–26th, 2010.

Dynamic Associative Networks and Automatic Document Classification, G. Wyant, 2009–2010. Computer Science Senior Project for Noesis and the Indiana Philosophy Ontology Project. Funded by the National Endowment for the Humanities. Winner of the 2010 University of Evansville Senior Project in Computer Science Award.

Mechanical vs. Symbolic Computation: Two Contrasting Strategies for Information Processing. Society for Machines and Mentality, Eastern Division Meeting of the American Philosophical Association, New York City, December 27th–30th, 2009.
Modeling and Visualizing Dynamic Associative Networks: Towards Developing a More Robust and Biologically-Plausible Cognitive Model, M. Zlatkovsky. Winner of the 2009 University of Evansville Senior Project in Computer Science Award. See companion website for details.

Intentionality and Association. The 3rd Annual Computing and Philosophy Conference, Oregon State University, August 2003.

Phenomenology and Artificial Intelligence. In CyberPhilosophy: The Intersection of Philosophy and Computing, edited by James H. Moor and Terrell Ward Bynum (Oxford, UK: Blackwell, 2002), 66–77. Also in Metaphilosophy 33.1/2 (2002): 70–82.
Acknowledgements
National Endowment for the Humanities
National Science Foundation

Also the Institute for Pure and Applied Mathematics, University of California, Los Angeles; the Cognitive Science Program and the Library Science Program, Indiana University; and the University of Evansville. For their critical comments as interlocutors in shaping my thoughts for this presentation, I wish to thank Colin Allen, Paul Bello, Aaron Bramson, Selmer Bringsjord, Luciano Floridi, Patrick Grim, Mirsad Hadzikadic, Christopher Harrison, Derek Jones, Matthias Scheutz, Talitha Washington, Guy Wyant, Hector Zenil and Michael Zlatkovsky.