
Color, Consciousness, MUI

Don Hoffman, Cognitive Science, UC Irvine 92697



Locke’s Question

Is it possible that “the idea that a violet produced in one man’s mind by his eyes were the same that a marigold produced in another man’s, and vice versa”?
Essay Concerning Human Understanding, 1690

Abstract

Locke’s question is still debated because of its implications for functionalist theories of conscious experience. Here we prove that the answer to Locke’s question is yes: Jack’s color experiences could differ from Jill’s, yet no experiment would reveal this difference. This entails that conscious experiences are not identical with functional relations. We propose a new theory.

Reductive Functionalism

Reductive functionalism claims that the type identity conditions for mental states refer only to relations between inputs, outputs and each other (e.g., Block & Fodor, 1972). That is, it claims that each mental state is numerically identical to some functional relation. The Scrambling Theorem proves that reductive functionalism is false: color experiences, a type of mental state, are not identical to any functional relations.

Scrambling Theorem

Scrambling Theorem. Denote by X the color experiences of one person and by Y the color experiences of a second. Let the following describe the functional relations among color experiences of the first: (1) f: X1 × ... × Xn → W, where X1, ..., Xn are copies of X, n is a positive integer, and W is any set; (2) X′ ⊆ 2^X; and (3) f′: X′1 × ... × X′m → Z, where X′1, ..., X′m are copies of X′, m is a positive integer, and Z is any set. Let b: X → Y be any bijection, i.e., any scrambling, between the two sets of color experiences. Then there exist (1) g: Y1 × ... × Yn → W, where Y1, ..., Yn are copies of Y; (2) Y′ ⊆ 2^Y; and (3) g′: Y′1 × ... × Y′m → Z, where Y′1, ..., Y′m are copies of Y′, such that (1) f = gb, (2) X′ = b⁻¹(Y′), and (3) f′ = g′b.

Proof. Let g = fb⁻¹; Y′ = b(X′); g′ = f′b⁻¹.

[Commuting diagrams: f = gb (X → W through Y) and f′ = g′b (X′ → Z through Y′).]
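To make the construction concrete, here is a minimal Python sketch. It is not part of the poster; the toy sets X and Y, the bijection b, and the sample relation f are invented for illustration. It builds g = f∘b⁻¹ from an arbitrary f and an arbitrary scrambling b on a finite set, then checks that f = g∘b on every input, i.e., that a subject using g on the scrambled experiences responds exactly as one using f on the originals.

```python
from itertools import product

X = ["red", "green", "blue"]   # Jack's color experiences (toy set)
Y = ["R", "G", "B"]            # Jill's color experiences (toy set)
b = dict(zip(X, Y))            # an arbitrary scrambling (bijection) b: X -> Y
b_inv = {v: k for k, v in b.items()}

# An arbitrary functional relation f: X x X -> W, here W = {0, 1},
# e.g., "are these two experiences judged the same?"
def f(x1, x2):
    return int(x1 == x2)

# The theorem's construction g = f o b^{-1}, applied componentwise.
def g(y1, y2):
    return f(b_inv[y1], b_inv[y2])

# Check f = g o b for every input: Jill's responses match Jack's in every task.
assert all(f(x1, x2) == g(b[x1], b[x2]) for x1, x2 in product(X, X))
print("f = g∘b on all inputs: responses are indistinguishable.")
```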

Meaning of Theorem

Suppose we show Jack a display, and he has color experiences x1, ..., xn. If we show Jill the same display, then she will have color experiences b(x1), ..., b(xn), since b describes how Jill’s color experiences are scrambled relative to Jack’s. Now Jack’s performance in color tasks is determined by the functional relations f, X′, f′, and Jill’s by g, Y′, g′. The Scrambling Theorem says it is always possible to choose g, Y′, g′ so that Jill responds identically to Jack in all color tasks, despite their different color experiences. Therefore their conscious experiences cannot be identical to the functional relations.

Example of Theorem

Let f′ be a metric that gives distances, d, between color experiences. Suppose that d is induced by symmetric difference from a probability measure, p, on (X, X′), where X′ is a σ-algebra on X:

d(x′i, x′j) = p(x′i △ x′j) = p(x′i ∪ x′j − x′i ∩ x′j), for x′i, x′j ∈ X′.

For any scrambling b: X → Y, let the probability, q, on Y be the distribution of b, i.e., for any measurable set A of Y, q(A) = p(b⁻¹(A)). Then taking g′ to be the metric d′ on Y induced by q satisfies the Theorem.

[Figure: Scrambling Colors, But Not Functions. Jack’s measure p assigns .1, .2, .3, .4 to four color experiences; Jill’s measure q assigns .2, .4, .1, .3 to the scrambled experiences under b. The induced distances agree pairwise: d = d′ for every pair (.3, .4, .5, .5, .6, .7). Even though Jill sees red what Jack sees blue and Jack sees red what Jill sees green, Jack and Jill still both call grass green and the sky blue.]
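The figure’s numbers can be checked directly. The following Python sketch is not from the poster: the color labels and the specific permutation b are assumptions chosen to reproduce the quoted measures, and each color experience is treated as a singleton measurable set, so d({x}, {y}) = p({x}) + p({y}) for distinct x, y. It computes the symmetric-difference metric for Jack’s p and for Jill’s pushforward q and confirms the pairwise distances agree.

```python
from itertools import combinations

colors = ["c1", "c2", "c3", "c4"]                  # four color experiences
p = {"c1": 0.1, "c2": 0.2, "c3": 0.3, "c4": 0.4}   # Jack's probability measure p

# A scrambling b chosen so that Jill's pushforward measure q is
# .2, .4, .1, .3 on c1..c4, as in the figure.
b = {"c1": "c3", "c2": "c1", "c3": "c4", "c4": "c2"}
b_inv = {v: k for k, v in b.items()}
q = {y: p[b_inv[y]] for y in colors}               # q(A) = p(b^{-1}(A)) on singletons

def dist(measure, A, B):
    """Metric induced by symmetric difference: d(A, B) = measure(A \u25b3 B)."""
    return sum(measure[x] for x in (A | B) - (A & B))

# The distances Jack computes with (p, d) equal those Jill computes with (q, d')
# on the scrambled experiences, pair by pair.
for x1, x2 in combinations(colors, 2):
    d_jack = dist(p, {x1}, {x2})
    d_jill = dist(q, {b[x1]}, {b[x2]})
    assert abs(d_jack - d_jill) < 1e-12
    print(f"d({x1},{x2}) = {d_jack:.1f} = d'({b[x1]},{b[x2]})")
```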

Objections and Replies

1. Conscious experiences cannot be represented mathematically.
• If so, then there is no science of consciousness. But it is premature to claim this. Give science a chance.

2. X could represent unconscious states of robots, so the Theorem is not about conscious experiences.
• Using integers to count apples does not preclude using integers to count oranges.

3. The Theorem is not adequately constrained by empirical facts about color perception.
• The Theorem establishes the logical possibility of inverted spectra, not their empirical existence.

4. Since I can’t imagine a red that looks more similar to a green than another red, it must not be possible.
• Proof establishes logical possibility and, ipso facto, imaginability. Proof trumps weak imagination.

5. The Scrambling Theorem is mathematically trivial.
• Yes. The real work is the proper mathematical formulation. It is a bonus that the answer is then obvious.


Nonreductive Functionalism

Nonreductive functionalism claims that conscious experiences are determined by, but not identical to, the functional organization of a physical system. Such functionalists argue that Wittgenstein’s color inversion could not happen without a change in functional organization (e.g., Chalmers, 1995; see also Tye, 1995). For if Jack’s color experiences changed but his functional organization did not, then his beliefs about his color experiences would not change. He would believe, e.g., that he experienced red when he in fact experienced blue. This, they conclude, is a reductio ad absurdum. But the reductio fails, for if one grants for the sake of argument that experience can change without changes in functional organization, then surely one must grant that beliefs can so change as well.

Wittgenstein’s Question

“Consider this case: someone says “I can’t understand it, I see everything red blue today and vice versa.” We answer “it must look queer!” He says it does and, e.g., goes on to say how cold the glowing coal looks and how warm the clear (blue) sky. I think we should under these or similar circumstances be inclined to say that he saw red what we saw blue. And again we should say that we know that he means by the words ‘blue’ and ‘red’ what we do as he has always used them as we do.”
Philosophical Review, 3, 284, 1968.

If Jack’s color experiences suddenly change, he will report the changes. He will say, e.g., that the sky is red and the grass yellow. This contradicts the prediction of nonreductive functionalism.

[Noncommuting Diagram: Jack1 and Jack2 (Jack before and after the color inversion), related by the scrambling b.]

The MUI Theory of Consciousness

The MUI theory of consciousness states that the conscious experiences of an agent are a multi-modal user interface (MUI) between that agent and an objective world, objective in the sense that its existence does not depend on that agent. MUI theory makes no claim about the nature of that objective world. In particular, it does not claim that the format or contents of the MUI in any way resemble or approximate that objective world. Conscious experiences of space and time, rocks and trees, colors and smells, might not resemble or approximate anything in that objective world.

A Virtual-Reality Illustration of MUI

Imagine that you and some friends play virtual volleyball at a VR arcade. You don a helmet and body suit, and find yourself at the beach on a volleyball court. You serve the volleyball and a friend across the net hits it back. You consciously experience a volleyball, the net, the sand, your body, your friends, and palm trees in the background. These are all contents of your MUI. The objective world, in this example, is a computer with megabytes of software running a VR program. The volleyball of your MUI does not resemble or approximate the diodes, resistors, or software of the computer. Nor do the net, sand, and your body. But these elements of your MUI inform you how to act so as to alter registers in the computer to win the game (Hoffman, 1998).

MUI, Bayes and Evolution

Many researchers model perceptual experience as the result of Bayesian estimation. They assume the objective world contains physical objects with determinate properties. Perceptual experiences approximate, or reconstruct, these properties. This physicalist model of the objective world is one of countless models allowed by MUI theory. On evolutionary grounds, however, the physicalist model is unlikely. Evolutionary pressures dictate that our MUI, this world of our daily experience, should itself be a radical simplification, selected not for the exhaustive depiction of the objective world but for the mutable pragmatics of survival. Humans, like roaches, are a species in a niche. We do not expect the perceptions of roaches to be insights into the true nature of the objective world; it would be a miracle if the perceptions of humans were such insights.
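For readers unfamiliar with the framework being criticized, here is a toy Python sketch of the kind of Bayesian estimation the paragraph refers to. It is not part of the poster, and the prior, likelihood, and measurement values are invented: a perceiver infers a determinate physical property (here, a surface reflectance with two candidate values) from one noisy sensory datum via posterior ∝ prior × likelihood.

```python
from math import exp

reflectances = [0.3, 0.7]           # hypothesized "objective" property values
prior = {0.3: 0.5, 0.7: 0.5}        # perceiver's prior over the property

def likelihood(measurement, reflectance, noise=0.1):
    """Gaussian likelihood of a noisy sensory measurement given the property."""
    return exp(-((measurement - reflectance) ** 2) / (2 * noise ** 2))

measurement = 0.62                  # one noisy sensory datum (made up)
unnorm = {r: prior[r] * likelihood(measurement, r) for r in reflectances}
Z = sum(unnorm.values())
posterior = {r: v / Z for r, v in unnorm.items()}

# The Bayesian perceiver "reconstructs" the property as, e.g., the posterior mode.
estimate = max(posterior, key=posterior.get)
print(posterior, "estimate:", estimate)
```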



Conscious Realism

Conscious realism states that the objective world, the world behind the MUI, consists only of conscious agents. Conscious realism can be made mathematically precise, and used to derive quantum field theory (Bennett et al., 1989). It thus differs from Kant’s idealism, which claims that things-in-themselves cannot be mathematically modeled. It also differs from panpsychism.

The Mind-Body Problem

Physicalism has failed to produce a scientific theory describing how brain activity could cause, or be, conscious experience. It faces a mysterious gap between (1) the experience of red and (2) ions flowing through holes in neural membranes. MUI Theory and Conscious Realism together transform the mind-body mystery into a tractable scientific problem. Consciousness is primary, and can be mathematically modeled (Bennett et al., 1989). Tables, chairs and other “physical” objects are elements of a MUI, constructed by a conscious mind according to rules now being discovered in the cognitive sciences (e.g., Hoffman, 1998). Electrons and atoms are also elements of a MUI, constructed by a conscious agent, a claim compatible with the standard Copenhagen interpretation of quantum theory.


Epiphysicalism and the Brain

On your computer screen is an icon for a text file. The icon is, say, red and rectangular. The shape, color, and position of the icon do not, of course, resemble anything about the text file, but give you access to that file. You drag the text-file icon to the recycle icon, and your file is deleted. Did the motion of the icon on your screen cause the file to be deleted? No. Instead the icon informed your actions, guiding you to move and click a mouse, and your actions triggered the causal sequence that deleted the file. Likewise, the tables, chairs, rocks and other objects of your everyday MUI appear to have causal powers, but they do not. In particular, the brain has no causal powers. It is another icon of your MUI. Indeed, just as there is no cube in the figure above until someone looks and constructs a cube, so also you have no brain until someone observes it and, thereby, constructs it. Of course, if someone destroys your primary visual cortex, you go blind, just as when you drag the file icon to the recycle bin you delete the file. And if someone stimulates your somatosensory cortex, you feel a sensation in your body, just as when you click on the file icon you access the file. Nevertheless, the brain does not cause these sensations, any more than the text-file icon causes the text file. Consciousness does not supervene, logically or naturally, on the brain.

More Objections and Replies

1. If that speeding train is just an icon of your MUI, why don’t you step in front of it?
• For the same reason I don’t carelessly drag a file icon to the recycle bin. Not taking icons literally differs from not taking them seriously. Icons are there to inform our behavior.

2. How can this table be just a construct in my MUI, since you see the same table?
• We don’t see the same table, any more than we see the same volleyball in the VR volleyball game. You construct your ball and your table, I construct mine. We assume, though we cannot prove, that our constructs are relevantly similar.

3. If this table is just a construct in my MUI, why can’t I put my fist through it? Shouldn’t my constructs do what I want?
• The constructs in your MUI are constrained by your rules of construction and by your interactions with the objective world that triggers your construction. They are not subject to your every whim.

4. Physicalists don’t say that the objective world resembles our experiences. They claim instead, e.g., that this table is mostly empty space, with protons, neutrons and electrons whizzing around. That is the true objective reality.
• This claim makes the same mistake as someone who pulls out a magnifying glass, looks at a computer screen, and concludes that, although the file icon is just an icon, the true objective reality is the set of pixels now visible. Those pixels are not the computer to which the icon is an interface. The pixels are part of the Windows interface; similarly, protons, neutrons and electrons are part of your MUI.

References and Note

1. Bennett, B.M., Hoffman, D.D., & Prakash, C. (1989). Observer Mechanics: A Formal Theory of Perception. New York: Academic Press.
2. Chalmers, D.J. (1995). The Conscious Mind. Oxford University Press.
3. Hoffman, D.D. (1998). Visual Intelligence: How We Create What We See. New York: W.W. Norton.
4. Locke, J. (1690). An Essay Concerning Human Understanding.
5. Tye, M. (1995). Ten Problems of Consciousness. Cambridge, MA: MIT Press.
6. Wittgenstein, L. (1968). Notes for Lectures on ‘Private Experience’ and ‘Sense Data’ (Rush Rhees, ed.). Philosophical Review, LXXVII (3, July): 284.

Poster presented at the 17th Annual Convention of the American Psychological Society, Los Angeles, California, May 26–29, 2005, and at the 9th Annual Meeting of the Association for the Scientific Study of Consciousness, Caltech, Pasadena, California, June 24–27, 2005.


