Boeree, C. George - The History of Psychology (I-IV)


Description

"This is an e-text about the historical and philosophical background of Psychology. It was originally written for the benefit of my students at Shippensburg University, but I hope that it helps anyone with an intellectual interest in the field. The material is original and copyrighted by myself, and any distribution must be accompanied by my name and the copyright information. For personal educational use, it is free to one and all." - C. George Boeree (ONLINE VERSION: webspace.ship.edu/cgboer/historyofpsych.html)


C. George Boeree: History of Psychology Part One: The Ancients

E-Text Source: [ http://www.ship.edu/%7Ecgboeree/historyofpsych.html ]

© Copyright 2006 C. George Boeree

Index

Psyche and Eros
The Presocratics [ The Greeks | The Basics | The Ionians | The Greeks of Italy | The Abderans ]
The Alphabet
Two Poems by Sappho
Timeline: from 600 BC to 200 BC
Map: Ancient Mediterranean
Socrates, Plato, and Aristotle
Plato Selection: Allegory of the Cave
Logical Fallacies
Epicureans and Stoics [ Cynicism | Hedonism | Skepticism | Stoicism | Epicureanism ]
A Letter from Epicurus
Selections from Epictetus
Timeline: From 1 AD to 400 AD
Map: Roman Empire 116 AD
The Roman Empire [ Neo-Platonism | Mithraism | Christianity | Gnosticism | Manicheanism | St. Augustine | The Fall of Rome ]
A Brief History of Judaism
Early Christian Heresies
Sunnis and Shiites
Early History of China and India


The Story of Psyche and Eros

The so-called psyche or butterfly is generated from caterpillars which grow on green leaves, chiefly leaves of the raphanus, which some call crambe or cabbage. At first it is less than a grain of millet; it then grows into a small grub; and in three days it is a tiny caterpillar. After this it grows on and on, and becomes quiescent and changes its shape, and is now called a chrysalis. The outer shell is hard, and the chrysalis moves if you touch it. It attaches itself by cobweb-like filaments, and is unfurnished with mouth or any other apparent organ. After a little while the outer covering bursts asunder, and out flies the winged creature that we call the psyche or butterfly. (From Aristotle's History of Animals 551a.1)

Psyche was one of three sisters, princesses in a Grecian kingdom. All three were beautiful, but Psyche was the most beautiful. Aphrodite, the goddess of love and beauty, heard about Psyche and her sisters and was jealous of all the attention people paid to Psyche. So she summoned her son, Eros, and told him to put a spell on Psyche.

Always obedient, Eros flew down to earth with two vials of potions. Invisible, he sprinkled the sleeping Psyche with a potion that would make men avoid her when it came to marriage. Accidentally, he pricked her with one of his arrows (which make someone fall in love instantly) and she startled awake. Her beauty, in turn, startled Eros, and he accidentally pricked himself as well. Feeling bad about what he had done, he then sprinkled her with the other potion, which would provide her with joy in her life.

Sure enough, Psyche, although still beautiful, could find no husband. Her parents, afraid that they had offended the gods somehow, asked an oracle to reveal Psyche's future husband. The oracle said that, while no man would have her, there was a creature on the top of a mountain that would marry her.

Surrendering to the inevitable, she headed for the mountain. When she came within sight, she was lifted by a gentle wind and carried the rest of the way. When she arrived, she saw that her new home was in fact a rich and beautiful palace. Her new husband never permitted her to see him, but he proved to be a true and gentle lover. He was, of course, Eros himself.

After some time, she grew lonely for her family, and she asked to be allowed to have her sisters for a visit. When they saw how beautiful Psyche's new home was, they grew jealous. They went to her and told her not to forget that her husband was some kind of monster, and that, no doubt, he was only fattening her up in order to eat her. They suggested that she hide a lantern and a knife near her bed, so that the next time he visited her, she could look to see if he was indeed a monster, and cut off his head if it was so.

Her sisters convinced her this was best, so the next time her husband came to visit her, she had a lamp and a knife ready. When she raised the lamp, she saw that her husband was not a monster but Eros! Surprised, he ran to the window and flew off. She jumped out after him, but fell to the ground and lay there unconscious.

When she awoke, the palace had disappeared, and she found herself in a field near her old home. She went to the temple of Aphrodite and prayed for help. Aphrodite responded by giving her a series of tasks to do – tasks that Aphrodite believed the girl would not be able to accomplish.

The first was a matter of sorting a huge pile of mixed grains into separate piles. Psyche looked at the pile and despaired, but Eros secretly arranged for an army of ants to separate the piles. Aphrodite, returning the following morning, accused Psyche of having had help, as indeed she had.

The next task involved getting a snippet of golden fleece from each one of a special herd of sheep that lived across a nearby river. The god of the river advised Psyche to wait until the sheep sought shade from the midday sun. Then they would be sleepy and not attack her. When Psyche presented Aphrodite with the fleece, the goddess again accused her of having had help.


The third task Aphrodite set before Psyche was to get a cup of water from the river Styx, where it cascades down from an incredible height. Psyche thought it was all over, until an eagle helped her by carrying the cup up the mountain and returning it full. Aphrodite was livid, knowing full well that Psyche could never have done this alone!

Psyche's next task was to go into hell to ask Persephone, wife of Hades, for a box of magic makeup. Thinking that she was doomed, she decided to end it all by jumping off a cliff. But a voice told her not to, and gave her instructions on making her way to hell to get the box. But, the voice warned, do not look inside the box under any circumstances!

Well, Psyche received the box from Persephone and made her way back home. But, true to her nature, she was unable to restrain herself from peeking inside. To her surprise, there was nothing inside but darkness, which put her into a deep sleep. Eros could no longer restrain himself either and wakened her. He told her to bring the box to Aphrodite, and that he would take care of the rest.

Eros went to the heavens and asked Zeus to intervene. He spoke of his love for Psyche so eloquently that Zeus was moved to grant him his wish. Eros brought Psyche to Zeus who gave her a cup of ambrosia,

the drink of immortality. Zeus then joined Psyche and Eros in eternal marriage. They later had a daughter, who would be named Pleasure.

The Greek name for a butterfly is Psyche, and the same word means the soul. There is no illustration of the immortality of the soul so striking and beautiful as the butterfly, bursting on brilliant wings from the tomb in which it has lain, after a dull, grovelling, caterpillar existence, to flutter in the blaze of day and feed on the most fragrant and delicate productions of the spring. Psyche, then, is the human soul, which is purified by sufferings and misfortunes, and is thus prepared for the enjoyment of true and pure happiness.

(From Bulfinch's Mythology: The Age of Fable, chapter XI)

Image: The Abduction of Psyche, by William-Adolphe Bouguereau (1825 - 1905).


The Pre-Socratics


"Know thyself." – inscribed on the Temple of Apollo at Delphi

Psyche, from the Greek psu-khê, possibly derived from a word meaning "warm-blooded": Life, soul, ghost, departed spirit, conscious self, personality, butterfly or moth. Similar words: Thymos, meaning breath, life, soul, temper, courage, will; Pneuma, meaning breath, mind, spirit, angel; Noös, meaning mind, reason, intellect, or the meaning of a word.

The Greeks

Western intellectual history always begins with the ancient Greeks. This is not to say that no one had any deep thoughts prior to the ancient Greeks, or that the philosophies of ancient India and China (and elsewhere) were in any way inferior. In fact, philosophies from all over the world eventually came to influence western thought, but only much later. But it was the Greeks that educated the Romans and, after a long dark age, it was the records of these same Greeks, kept and studied by the Moslem and Jewish scholars as well as Christian monks, that educated Europe once again.

We might also ask, why the Greeks in the first place? Why not the Phoenicians, or the Carthaginians, or the Persians, or the Etruscans? There are a variety of possible reasons.

One has to do with the ability to read and write, which in turn has to do with the alphabet. It is when ideas get recorded that they enter intellectual history. Buddhism, for example, although a very sophisticated philosophy, was an oral tradition for hundreds of years until committed to writing, since the Brahmi alphabet was late in coming. It was only then that Buddhism spread throughout Asia.

The alphabet was invented by the Semites of the Mediterranean coast, including the Hebrews and the Phoenicians, who used simple drawings to represent consonants instead of words. The Phoenicians apparently passed it on to the Greeks. The Greeks improved on the idea by inventing vowels, using some extra letters their language had no use for.

Prior to the invention of the alphabet, reading and writing was the domain of specialized scribes, concerned mostly with keeping government records. Even in the case of the Phoenicians, writing was more a tool of the merchant class, to keep track of trade, than a means of recording ideas. In Greece, at least in certain city-states, reading and writing was something "everyone" did.

By everyone, of course, I mean upper class males. Women, peasants, and slaves were discouraged from picking up the skill, as they still are in many places around the world. If you wonder where all the women philosophers are, well, there were very few indeed! The poet Sappho of Lesbos is the closest we get to a female philosopher on record in the ancient world.

Still, the alphabet does not explain everything. Another thing that made the Greeks a bit more likely to start the intellectual ball rolling was the fact that they got into overseas trading early. Their land and climate were okay for agriculture, but not great, so the idea of trading for what you can’t grow or make yourself came naturally. Plus, Greece is practically all coastline and islands, so seafaring came equally naturally.

What sea trading gives you is contact with a great variety of civilizations, including their religions and philosophies and sciences. This gets people to thinking: If this one says x, and that one says y, and the third one says z, what then is the truth? Traders are usually skeptics.


Still, the Phoenicians (and their cousins, the Carthaginians) had the alphabet first, and were excellent sea traders as well. Why weren’t they the founders of western intellectual history? Perhaps it had to do with centralization. The Phoenicians had an authoritarian government controlled by the most powerful merchants. The Carthaginians had the same. Perhaps being surrounded by powerful authoritarian empires forced them to adopt that style of government to survive.

The Greeks, on the other hand, were divided into many small city-states, each unique, each fiercely independent, always bickering and often fighting. It may seem disadvantageous, but when it comes to ideas, diversity and even conflict can be invigorating! Consider that when Greece was finally united under Macedonian rule, the flurry of intellectual activity slowed. And when the Romans took over, it practically died.

The Basics

The ancient Greek philosophers gave us the basic categories of philosophy, beginning with metaphysics. Metaphysics is the part of philosophy that asks questions such as "What is the world made of?" and "What is the ultimate substance of all reality?"

In fact, the ancient Greeks were among the first to suggest that there is a "true" reality (noumenon) under the "apparent" reality (phenomenon), an "unseen real" beneath the "unreal seen." The question is, what is this true reality? Is it matter and energy, i.e. something physical? This is called materialism. Or something more spiritual or mental, such as ideas or ideals? This is called idealism. Materialism and idealism constitute the two extreme answers. Later, we will explore some other possibilities.

A second aspect of philosophy is epistemology. Epistemology is the philosophy of knowledge: How do we know what is true or false, what is real or not? Can we know anything for certain, or is it ultimately hopeless?

Again, the Greeks outlined two opposing approaches to the problem of knowledge. One is called empiricism, which says that all knowledge comes through the senses. The other is called rationalism, which says that knowledge is a matter of reason, thought. There are other answers in epistemology as well. In fact, empiricism and rationalism have never been entirely exclusive.

The third aspect of philosophy that we will be concerned with is ethics. Ethics is the philosophical understanding of good and bad, right and wrong. It is often called morality, and most consider the two words synonymous. After all, ethics comes from ethos, which is Greek for customs, and morality comes from mores, which is Latin for customs!

As we shall see, ethics is the most difficult of the three aspects of philosophy. For the present, we might want to differentiate the extremes of hedonism and cynicism. Hedonism says that good and bad come down to what I like and what I don’t like, what gives me pleasure and what gives me pain. Cynicism says that the world is essentially evil, and we can only work at distancing ourselves from it and moving towards the ultimate good, which is God.

There are many other aspects of philosophy – logic, for example, and esthetics, the study of beauty. But metaphysics, epistemology, and ethics are sufficient for now.


The Ionians

Greek philosophy didn’t begin in Greece (as we know it); it began on the western coast of what is now Turkey, an area known then as Ionia. In Ionia’s richest city, Miletus, lived a man of Phoenician descent called Thales (624-546). He studied in Egypt and other parts of the Near East, and learned geometry and astronomy.

His answer to the great question of what the universe is made of was water. Inasmuch as water is a simple molecule, found in gaseous, liquid, and solid forms, and found just about everywhere, especially in living things, this is hardly a bad answer! It makes Thales not only the nominal first philosopher, but the first materialist as well. Since ultimate nature was known in Greek as physis, he could also be considered the first physicist (or, as the Greeks would say, physiologist).

We should note, however, that he also believed that the whole universe of material things is alive, and that animals, plants, and even metals have souls – an idea called panpsychism.

His most famous student was Anaximander (611-549), also of Miletus. He is probably best known as having drawn the first known map of the inhabited world.

Image: a reconstruction of Anaximander's map of the inhabited world.

Anaximander added an evolutionary aspect to Thales’ materialism: The universe begins as an unformed, infinite mass, which develops over time into the many-faceted world we see around us. But, he warns, the world will eventually return to the unformed mass!

Further, the earth began as fluid, some of which dries to become earth and some of which evaporates to become atmosphere. Life also began in the sea, only gradually becoming animals of the land and birds of the air.

Like Thales, Heraclitus (540-475) was an Ionian, from Ephesus, a little north of Miletus. And, like Thales, he was searching for the ultimate substance that unifies all reality. He decided on fire, or energy – again, not a bad guess at all.

The multiplicity of reality comes out of fire by condensation, becoming humid air, then water, and finally earth. But this is balanced by rarefaction, and the earth liquefies, then evaporates, and finally returns to pure energy.

Taking fire as his ultimate substance led to a more dynamic view of reality. Change, for Heraclitus, is the only constant. "Panta rhei, ouden menei" – all things flow, nothing abides – is his most famous saying. He is also known for the saying that we cannot step into the same river twice, because new water is constantly flowing past us.

Fire is also associated in his theory with mind or spirit. And, just like any other fire, he points out that our individuality eventually dies. There is no personal immortality. Only God – the divine fire – is eternal.


In many ways, Heraclitus reminds me of a Greek Taoist. He believed that, although ultimate reality is One, the world we know is made up of dualities, with each pole requiring the existence of its opposite: Up requires down, white requires black, good requires bad, and so on.

And he sees these oppositions as being the source of harmony, pointing out that, unless you stretch your harp strings in two opposing directions, you cannot play music.

And, again like the Taoists, he believed that the best way to live one’s life is in harmony with nature. But he died alone, at the age of 70, due to his intense dislike for human company!

The Greeks of Italy

Another Ionian was Pythagoras (582-500). After travelling everywhere from Gaul (modern day France) to Egypt and India, he settled down in Crotona, a seaport of southern Italy. Southern Italy was the greatest settlement of Greeks outside of Greece, to the point that the Romans referred to the area as Magna Graecia ("greater Greece"). There, he set up his famous school.

His school was more like a large commune, and his philosophy more like a religion. Because they believed in reincarnation, all of his followers were vegetarians. They avoided wine, swearing by the gods, sexual misconduct, excesses and frivolity. For the first five years, a new pupil took a vow of silence. Women were treated as equals – a true rarity in the ancient world!

His philosophy was rooted in mathematics, which meant geometry to the ancient Greeks. Pythagoras is credited with a number of geometric proofs, most notably the Pythagorean theorem: The sum of the squares of the two sides of a right triangle is equal to the square of the hypotenuse. He discovered the mathematical basis of music, and saw the same patterns in the movements of the planets. He was the first person to realize that the earth, moon, and planets are all spheres (hence, the "music of the spheres!"). He saw the elegant lawfulness of geometry as the foundation of the entire universe.
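In modern notation (a restatement for clarity – the symbols are not Boeree's), the theorem says that a right triangle with legs a and b and hypotenuse c satisfies

\[ a^2 + b^2 = c^2 \]

For example, legs of 3 and 4 give a hypotenuse of 5, since \( 3^2 + 4^2 = 9 + 16 = 25 = 5^2 \).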

So, rather than look for an understanding of the universe in the movement of matter and energy, he looked for laws of nature, the form rather than the material. But, since these laws exist only in the mind as ideas, we call Pythagoras an idealist.

Although his life remains mysterious, his school lasted 300 years, and had a profound influence on all who followed, most particularly Plato.

In Elea, another Greek seaport in the south of Italy, lived Xenophanes (570-475). He is best known for his denial of the existence of the Greek gods.

"Mortals fancy that gods are born, and wear clothes, and have voice and form like themselves. Yet if oxen and lions had hands, and could paint and fashion images as men do, they would make the pictures and images of their gods in their own likenesses; horses would make them like horses, oxen like oxen. Ethiopians make their gods black and snub-nosed; Thracians give theirs blue eyes and red hair." (from Diogenes Laertes "Xenophanes," iii.)

There is only one God, he said, and that is the universe, Nature. This perspective is known as pantheism. Nevertheless, said Xenophanes, all things, even human beings, evolved from earth and water by means of natural laws. But things and people remain forever secondary to the ultimate reality that is God-or-Nature.

Parmenides (540-470) of Elea was a disciple of Xenophanes, and would have a particularly potent influence on Plato. He extended Xenophanes’ concept of the one God by saying "Hen ta panta," all things are One. Ultimate reality is constant. What we believe to be a world of things and motion and change is just an illusion.

One of Parmenides’ disciples was Zeno of Elea (490-430, not to be confused with Zeno of Citium, whom we will look at in a later chapter). Zeno wrote a book of famous paradoxes, including the story of Achilles and the tortoise: Let’s give the tortoise a head start. By the time Achilles gets to where the tortoise started, the tortoise will have moved a little further. By the time Achilles gets to where the tortoise had moved, the tortoise will have moved a little further still, and so on. Hence, Achilles can never catch up with the tortoise. The point of the story, and all the stories, is that motion is an illusion.

In making his point, he invented the form of argument known as "reduction to absurdity" (reductio ad absurdum). Note, however, that his arguments don’t hold up in the long run, because he mistakenly takes motion, time, and space as made up of an infinite number of points, rather than being continuous.
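The modern resolution can be sketched with a tool Zeno did not have, the convergent geometric series (the ten-to-one speed ratio below is an assumed example, not from the original): if Achilles runs ten times as fast as the tortoise, which starts a distance d ahead, the successive gaps he must close are d, d/10, d/100, and so on, and their total is finite:

\[ \sum_{k=0}^{\infty} \frac{d}{10^k} = \frac{d}{1 - \frac{1}{10}} = \frac{10d}{9} \]

Infinitely many stages, but a finite total distance – and, at constant speed, a finite time – so Achilles does overtake the tortoise.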

The Abderans

Leucippus (fl. c. 440) was from Miletus in Ionia, home of Thales and Anaximander. He studied with Zeno at Elea, then started teaching in Abdera, an Ionian Greek colony on the southern shore of Thrace (northeastern Greece).

Although only one sentence of his actual teachings remains, Leucippus will always be remembered as the man who invented the ideas of the atom, empty space, and cause-and-effect. Even the soul, he said, is made up of atoms!

It was Leucippus’ student, Democritus (460-370) of Abdera, who would take these ideas and develop them into a full-bodied philosophy. He travelled extensively, wrote books on every subject, and was considered the equal of the great Plato and Aristotle. But he never founded a school, and so his ideas never had quite the same impact as Plato’s and Aristotle’s on later civilization.

Democritus was quite skeptical of sense data, and introduced the idea of secondary qualities: Things like color and sound and taste are more in your mind than in the thing itself. Further, he said that sensations are a matter of atoms falling on the sense organs, and that all the senses are essentially forms of touch.

He also introduced the idea that we identify qualities by convention – i.e. we call sweet things "sweet," and that is what leads us to group them together, not some quality of the things themselves. This is called nominalism, from the Latin word for name. This way of thinking doesn't show up again till the late Middle Ages.

The soul or mind, he said, is composed of small, smooth, round atoms, a lot like fire or energy atoms, and can be found throughout the bodies of both humans and animals, and even the rest of the world.

Happiness comes from acquiring knowledge and ultimately wisdom. Sensual pleasure is way too short-lived and fickle to depend on. Instead, the wise man or woman should seek peace of mind (ataraxia) through cheerfulness, moderation, and orderly living. His moral theory is based on the sense of integrity: "A man should feel more shame in doing evil before himself than before all the world."

Democritus did not believe in gods or an afterlife. In fact, he formed an atheist organization called the Kakodaimoniotai – "the devils' club." He is sometimes called the laughing philosopher, because he found life much more cheerful without what he considered to be the depressing superstitions of religion.

He took Leucippus’ materialism very seriously, noting that matter can never be created nor destroyed, that there were an infinity of worlds like our own, and that there was no such thing as chance – only causation. It would be many centuries before these ideas would again become popular.

A little older than Democritus was Protagoras (480-411), also of Abdera. He is the most famous of the group of philosophers known as the sophists. The word comes from the Greek sophistai, which means teachers of wisdom – i.e. professors. Because some of these professors taught little more than how to win arguments in court, and did so for exorbitant fees, the name has become somewhat derogatory. Sophistry now means argument for argument’s sake, or for the sake of personal gain. But then, it is also the root of the word sophisticated!

Protagoras, although his teaching fees were in fact high, was a serious philosopher. He can be credited with founding the science of grammar, being the first to distinguish the various conjugations of verbs and declensions of nouns. He was also a major contributor to logic and was using the Socratic method (teaching by question and answer) before Socrates.

He was a skeptic, and believed that there were no ultimate truths, that truth is a relative, subjective thing. "Man is the measure of all things," is his most famous quote, meaning that things are what we say they are.

Applying this skepticism to the gods, he scared the Athenian powers-that-be, and he was ordered to leave Athens. Apparently, he drowned on his way to Sicily.

Into this idea-rich environment would come the three Athenians that would come to dominate philosophy for the next 2000 years: Socrates, Plato, and Aristotle.


The Origin of the Alphabet

The original alphabet was developed by a Semitic people living in or near Egypt.* They based it on the idea developed by the Egyptians, but used their own specific symbols. It was quickly adopted by their neighbors and relatives to the east and north, the Canaanites, the Hebrews, and the Phoenicians. The Phoenicians spread their alphabet to other people of the Near East and Asia Minor, as well as to the Arabs, the Greeks, and the Etruscans, and as far west as present day Spain. The letters and names on the left are the ones used by the Phoenicians. The letters on the right are possible earlier versions. If you don't recognize the letters, keep in mind that they have since been reversed (since the Phoenicians wrote from right to left) and often turned on their sides!

'aleph, the ox, began as the image of an ox's head. It represents a glottal stop before a vowel. The Greeks, needing vowel symbols, used it for alpha (A). The Romans used it as A.

Beth, the house, may have derived from a more rectangular Egyptian alphabetic glyph of a reed shelter (but which stood for the sound h). The Greeks called it beta (B), and it was passed on to the Romans as B.

Gimel, the camel, may have originally been the image of a boomerang-like throwing stick. The Greeks called it gamma (Γ). The Etruscans – who had no g sound – used it for the k sound, and passed it on to the Romans as C. They in turn added a short bar to it to make it do double duty as G.

Daleth, the door, may have originally been a fish! The Greeks turned it into delta (Δ), and passed it on to the Romans as D.

He may have meant window, but originally represented a man, facing us with raised arms, calling out or praying. The Greeks used it for the vowel epsilon (E, "simple E"). The Romans used it as E.

Waw, the hook, may originally have represented a mace. The Greeks used one version of waw which looked like our F, which they called digamma, for the number 6. This was used by the Etruscans for v, and they passed it on to the Romans as F. The Greeks had a second version – upsilon (Υ) – which they moved to the back of their alphabet. The Romans used a version of upsilon for V, which later would be written U as well, then adopted the Greek form as Y. In 7th century England, the W – "double-u" – was created.

* Until recently, it was believed that these people lived in the Sinai desert and began using their alphabet in the 1700's bc. In 1998, archeologist John Darnell discovered rock carvings in southern Egypt's "Valley of Horrors" that push back the origin of the alphabet to the 1900's bc or even earlier. Details suggest that the inventors were Semitic people working in Egypt, who thereafter passed the idea on to their relatives further east.


Zayin may have meant sword or some other kind of weapon. The Greeks used it for zeta (Z). The Romans only adopted it later as Z, and put it at the end of their alphabet.

Ḥeth, the fence, was a "deep throat" (pharyngeal) consonant. The Greeks used it for the vowel eta (H), but the Romans used it for H.

Teth may have originally represented a spindle. The Greeks used it for theta (Θ), but the Romans, who did not have the th sound, dropped it.

Yodh, the hand, began as a representation of the entire arm. The Greeks used a highly simplified version of it for iota (Ι). The Romans used it as I, and later added a variation for J.

Kaph, the hollow or palm of the hand, was adopted by the Greeks for kappa (K) and passed on to the Romans as K.

Lamedh began as a picture of an ox stick or goad. The Greeks used it for lambda (Λ). The Romans turned it into L.

Mem, the water, became the Greek mu (M). The Romans kept it as M.

Nun, the fish, was originally a snake or eel. The Greeks used it for nu (N), and the Romans for N.

Samekh, which also meant fish, is of uncertain origin. It may have originally represented a tent peg or some kind of support. It bears a strong resemblance to the Egyptian djed pillar seen in many sacred carvings. The Greeks used it for xi (Ξ) and a simplified variation of it for chi (X). The Romans kept only the variation as X.

'ayin, the eye, was another "deep throat" consonant. The Greeks used it for omicron (O, "little O"). They developed a variation of it for omega (Ω, "big O"), and put it at the end of their alphabet. The Romans kept the original for O.

Pe, the mouth, may have originally been a symbol for a corner. The Greeks used it for pi (Π). The Romans closed up one side and turned it into P.

Sade, a sound between s and sh, is of uncertain origin. It may have originally been a symbol for a plant, but later looks more like a fish hook. The Greeks did not use it, although an odd variation does show up as sampi, a symbol for 900. The Etruscans used it in the shape of an M for their sh sound, but the Romans had no need for it.


Qoph, the monkey, may have originally represented a knot. It was used for a sound similar to k but further back in the mouth. The Greeks only used it for the number 90 (Ϙ), but the Etruscans and Romans kept it for Q.

Resh, the head, was used by the Greeks for rho (P). The Romans added a line to differentiate it from their P and made it R.

Shin, the tooth, may have originally represented a bow. Although it was first pronounced sh, the Greeks used it sideways for sigma (Σ). The Romans rounded it to make S.

Taw, the mark, was used by the Greeks for tau (T). The Romans used it for T.

The Greek letter phi (Φ) was already common among the Anatolians in what is now Turkey. Psi (Ψ) appears to have been invented by the Greeks themselves, perhaps based on Poseidon's trident.

Image: the complete Greek alphabet, for comparison.


Two Poems by Sappho

Sappho was born somewhere around 630 bc on the Greek island Lesbos. She wrote many volumes of poetry that were admired throughout the ancient Greek world. Plato once suggested that she should be added to the list of muses said to inspire artists. Her home island even minted a coin with her likeness in her lifetime. Sappho had both male and female lovers, and it is her island which gave its name to the love between women. She is said to have committed suicide by leaping off of a high cliff, because of a broken heart.

Her poetry usually concerned love, and often refers to the goddess of love, Aphrodite. It was accompanied by simple music, played on the lyre, the small harp you see her holding in the painting below. Because her poetry only survives in fragments, modern translators have the difficult task of reconstructing her poetry on the basis of the bits and pieces.

Below are two such poems. The first is Sappho remembering a lost love; the second is an ode to her daughter, Cleis.

I have not had one word from her

Frankly I wish I were dead.
When she left, she wept

a great deal; she said to
me, "This parting must be
endured, Sappho. I go unwillingly."

I said, "Go, and be happy
but remember (you know
well) whom you leave shackled by love

"If you forget me, think
of our gifts to Aphrodite
and all the loveliness that we shared

"all the violet tiaras,
braided rosebuds, dill and
crocus twined around your young neck

"myrrh poured on your head
and on soft mats girls with
all that they most wished for beside them

"while no voices chanted
choruses without ours,
no woodlot bloomed in spring without song..."

–Translated by Mary Barnard

Source: http://www.sappho.com/poetry/historical/sappho.html

Sappho [ an 1877 painting by Charles-Auguste Mengin (1853-1933) ]


Sleep, darling

I have a small
daughter called
Cleis, who is

like a golden
flower
I wouldn't
take all Croesus'
kingdom with love
thrown in, for her

Don't ask me what to wear
I have no embroidered
headband from Sardis to
give you, Cleis, such as
I wore

and my mother
always said that in her
day a purple ribbon
looped in the hair was thought
to be high style indeed

but we were dark:
a girl
whose hair is yellower than
torchlight should wear no
headdress but fresh flowers

–Translated by Mary Barnard
Source: gopher://gopher.OCF.Berkeley.EDU:70/00/Library/Poetry/Sappho/sappho.Cleis


Timeline: 600 bc to 200 bc

Map: Ancient Mediterranean


Socrates, Plato, and Aristotle


"The unexamined life is not worth living."

– Socrates

The Athenians

When we think of ancient Greece, we think right away of Athens. Several of the philosophers we have already discussed considered it the pinnacle of their careers to come and teach in this great city.

But Athens wasn’t always great. It began as a collection of villages in some of the poorest agricultural land in Greece. Only carefully tended grapes and olives provided early Athens with a livelihood, that and trade.

The distance between the haves – the ruling aristocratic trading families – and the have nots – peasants working the land – and the accompanying feudal oppression, grew so great that it looked like the city and its surrounding area would collapse under the weight.

In 594 bc, the leaders of the middle class recruited a merchant named Solon to accept leadership of the city and restore some peace and prosperity. He began by canceling all debts and freeing all who had been enslaved on account of debt. Then he proceeded to draft a constitution in which the population was divided into four classes based entirely on economic worth, with the highest retaining the greatest power, but the lowest being exempt from taxes.

After a difficult transition, the world’s first democracy was established under the leadership of Cleisthenes in 507 bc, when he decreed that all free men would be permitted to vote. This, of course, falls short of a complete democracy, but don't judge them too harshly: Slavery would not be outlawed until 1814, when Mexico would become the very first sovereign nation to permanently ban slavery. The US wouldn't free its slaves until 1865 with the 13th amendment. And women didn't get to vote until New Zealand gave them the vote in 1893. It would take the US until 1919 and the 19th amendment.

Unfortunately, at about the same time the democratic experiment began, the great Persian empire to the east decided to expand into, first, Ionia, and then Greece itself. But in 490, 20,000 Greeks defeated 100,000 Persian troops at Marathon, north of Athens. (A messenger named Pheidippides ran the 26 miles – 42.195 km – to Athens to give them the good news, hence the sport of marathon running!)

In 481, the Persian emperor Xerxes sent an army of over two million men, assisted by a fleet of 1200 ships, to attack Greece again. The army ravaged the north of Greece and prepared to attack Athens. They found the city deserted. The Persian navy, however, found the Greek fleet waiting for it in the Bay of Salamis. The Greeks won the day against enormous odds. By 479, the Persians were forced back into Asia Minor.

If this seems like just a little piece of history, consider: This victory allowed the Greek adventure to continue to produce the kind of thinking that would set the tone for the next two millennia in Europe and the Mediterranean.

During the time period we are looking at in this chapter, Athens had as many as 300,000 people, making it one of the largest cities in the world. About half were free, one third were slaves, and one sixth were foreigners (metics). The free adult males who could vote numbered about 50,000.

Socrates

Socrates (470-399) was the son of a sculptor and a midwife, and served with distinction in the Athenian army during Athens’ clash with Sparta. He married, but had a tendency to fall in love with handsome young men, in particular a young soldier named Alcibiades. He was, by all accounts, short and stout, not given to good grooming, and a lover of wine and conversation. His famous student, Plato, called him "the wisest, and justest, and best of all men whom I have ever known" (Phaedo).

He was irritated by the Sophists and their tendency to teach logic as a means of achieving self-centered ends, and even more their promotion of the idea that all things are relative. It was the truth that he loved, desired, and believed in.

Philosophy, the love of wisdom, was for Socrates itself a sacred path, a holy quest – not a game to be taken lightly. He believed – or at least said he did in the dialog Meno – in the reincarnation of an eternal soul which contained all knowledge. We unfortunately lose touch with that knowledge at every birth, and so we need to be reminded of what we already know (rather than learning something new).

He said that he did not teach, but rather served, like his mother, as a midwife to truth that is already in us! Making use of questions and answers to remind his students of knowledge is called maieutics (midwifery), or the Socratic method.

One example of his effect on philosophy is found in the dialog Euthyphro. He suggests that what is to be considered a good act is not good because gods say it is, but is good because it is useful to us in our efforts to be better and happier people. This means that ethics is no longer a matter of surveying the gods or scripture for what is good or bad, but rather thinking about life. He even placed individual conscience above the law – quite a dangerous position to take!

Socrates himself never wrote any of his ideas down, but rather engaged his students – wealthy young men of Athens – in endless conversations. In exchange for his teaching, they in turn made sure that he was taken care of. Since he claimed to have few needs, he took very little, much to his wife Xanthippe’s distress.

Plato reconstructed these discussions in a great set of writings known as the Dialogs. It is difficult to distinguish what is Socrates and what is Plato in these dialogs, so we will simply discuss them together.

Socrates wasn’t loved by everyone by any means. His unorthodox religious views (that there was only one god behind the variety of Greek gods) gave the leading citizens of Athens the excuse they needed to sentence him to death for corrupting the morals of the youth of the city. In 399, he was ordered to drink hemlock, which he did in the company of his students.

Plato

Plato (427-347) was Socrates’ prized student. From a wealthy and powerful family, his actual name was Aristocles – Plato was a nickname, referring to his broad physique. When he was about twenty, he came under Socrates’ spell and decided to devote himself to philosophy. Devastated by Socrates’ death, he wandered around Greece and the Mediterranean and was captured by pirates. His friends raised money to ransom him from slavery, but when he was released without it, they bought him a small property called Academus to start a school – the Academy, founded in 386.

The Academy was more like Pythagoras’ community – a sort of quasi-religious fraternity, where rich young men studied mathematics, astronomy, law, and, of course, philosophy. It was free, depending entirely on donations. True to his ideals, Plato also permitted women to attend! The Academy would become the center of Greek learning for almost a millennium.

Plato can be understood as idealistic and rationalistic, much like Pythagorus but much less mystical. He divides reality into two: On the one hand we have ontos, idea or ideal. This is ultimate reality, permanent, eternal, spiritual. On the other hand, there’s phenomena, which is a manifestation of the ideal. Phenomena are appearances – things as they seem to us – and are associated with matter, time, and space.


Phenomena are illusions which decay and die. Ideals are unchanging, perfect. Phenomena are definitely inferior to Ideals! The idea of a triangle – the defining mathematics of it, the form or essence of it – is eternal. Individual triangles, the triangles of the day-to-day experiential world, are never quite perfect: They may be a little crooked, or the lines a little thick, or the angles not quite right.... They only approximate that perfect triangle, the ideal triangle.

If it seems strange to talk about ideas or ideals as somehow more real than the world of our experiences, consider science. The law of gravity, 1+1=2, "magnets attract iron," E=mc², and so on – these are universals, not true for one day in one small location, but true forever and everywhere! If you believe that there is order in the universe, that nature has laws, you believe in ideas!

Ideas are available to us through thought, while phenomena are available to us through our senses. So, naturally, thought is a vastly superior means to get to the truth. This is what makes Plato a rationalist, as opposed to an empiricist, in epistemology.

Senses can only give you information about the ever-changing and imperfect world of phenomena, and so can only provide you with implications about ultimate reality, not reality itself. Reason goes straight to the idea. You "remember," or intuitively recognize the truth, as Socrates suggested in the dialog Meno.

According to Plato, the phenomenal world strives to become ideal, perfect, complete. Ideals are, in that sense, a motivating force. In fact, he identifies the ideal with God and perfect goodness. God creates the world out of materia (raw material, matter) and shapes it according to his "plan" or "blueprint" – ideas or the ideal. If the world is not perfect, it is not because of God or the ideals, but because the raw materials were not perfect. I think you can see why the early Christian church made Plato an honorary Christian, even though he died three and a half centuries before Christ!

Plato applies the same dichotomy to human beings: There’s the body, which is material, mortal, and "moved" (a victim of causation). Then there’s the soul, which is ideal, immortal, and "unmoved" (enjoying free will).

The soul includes reason, of course, as well as self-awareness and moral sense. Plato says the soul will always choose to do good, if it recognizes what is good. This is a similar conception of good and bad as the Buddhists have: Rather than bad being sin, it is considered a matter of ignorance. So, someone who does something bad requires education, not punishment.

The soul is drawn to the good, the ideal, and so is drawn to God. We gradually move closer and closer to God through reincarnation as well as in our individual lives. Our ethical goal in life is resemblance to God, to come closer to the pure world of ideas and ideal, to liberate ourselves from matter, time, and space, and to become more real in this deeper sense. Our goal is, in other words, self-realization.

Plato talks about three levels of pleasure. First is sensual or physical pleasure, of which sex is a great example. A second level is sensuous or esthetic pleasure, such as admiring someone’s beauty, or enjoying one’s relationship in marriage. But the highest level is ideal pleasure, the pleasures of the mind. Here the example would be Platonic love, intellectual love for another person unsullied by physical involvement.

Paralleling these three levels of pleasure are three souls. We have one soul called appetite, which is mortal and comes from the gut. The second soul is called spirit or courage. It is also mortal, and lives in the heart. The third soul is reason. It is immortal and resides in the brain. The three are strung together by the cerebrospinal canal.

Plato is fond of analogies. Appetite, he says, is like a wild horse, very powerful, but likes to go its own way. Spirit is like a thoroughbred, refined, well trained, directed power. And reason is the charioteer, goal-directed, steering both horses according to his will.

Other analogies abound, especially in Plato’s greatest work, The Republic. In The Republic, he designs (through Socrates) a society in order to discover the meaning of justice. Along the way, he compares elements of his society (a utopia, Greek for "no place") to the three souls: The peasants are the foundation of the society. They till the soil and produce goods, i.e. take care of society’s basic appetites. The warriors represent the spirit and courage of the society. And the philosopher kings guide the society, as reason guides our lives.

Before you assume that we are just looking at a Greek version of the Indian caste system, please note: Everyone’s children are raised together and membership in one of the three levels of society is based on talents, not on one’s birth parents! And Plato includes women as men’s equals in this system. I leave you with a few quotes:

"Wonder is the feeling of a philosopher, and philosophy begins in wonder."

"...(I)f you ask what is the good of education in general, the answer is easy; that education makes good men, and that good men act nobly."

"(I) do to others as I would they should do to me."

"Our object in the construction of the State is the greatest happiness of the whole, and not that of any one class."

Aristotle

Aristotle (384-322) was born in a small Greek colony in Thrace called Stagira. His father was a physician and served the grandfather of Alexander the Great. Presumably, it was his father who taught him to take an interest in the details of natural life.

He was Plato’s prize student, even though he disagreed with him on many points. When Plato died, Aristotle stayed for a while with another student of Plato, who had made himself a dictator in northern Asia Minor. He married the dictator’s daughter, Pythias. They moved to Lesbos, where Pythias died giving birth to their only child, a daughter. Although he married again, his love for Pythias never died, and he requested that they be buried side by side.

For four years, Aristotle served as the teacher of a thirteen-year-old Alexander, son of Philip of Macedon. In 334, he returned to Athens and established his school of philosophy in a set of buildings called the Lyceum (from a name for Apollo, "the shepherd"). The beautiful grounds and covered walkways were conducive to leisurely walking discussions, so his followers became known as the Peripatetics (from peripatos, "covered walkway").

First, we must point out that Aristotle was as much a scientist as a philosopher. He was endlessly fascinated with nature, and went a long way towards classifying the plants and animals of Greece. He was equally interested in studying the anatomies of animals and their behavior in the wild.

Aristotle also pretty much invented modern logic. Except for its symbolic form, it is essentially the same today.
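To illustrate that claim (the symbolic rendering below is a modern restatement, not Boeree's or Aristotle's): the classic syllogism "All men are mortal; Socrates is a man; therefore Socrates is mortal" survives unchanged in modern notation,

\[ \forall x\,(\mathrm{Man}(x) \rightarrow \mathrm{Mortal}(x)),\ \ \mathrm{Man}(\mathrm{Socrates}) \ \vdash\ \mathrm{Mortal}(\mathrm{Socrates}) \]

Only the symbols are new; the pattern of inference is Aristotle's.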

Let’s begin with metaphysics: While Plato separates the ever-changing phenomenal world from the true and eternal ideal reality, Aristotle suggests that the ideal is found "inside" the phenomena, the universals "inside" the particulars.

What Plato called idea or ideal, Aristotle called essence, and its opposite, he referred to as matter. Matter is without shape or form or purpose. It is just "stuff": pure potential, no actuality. Essence is what provides the shape or form or purpose to matter. Essence is "perfect," "complete," but it has no substance, no solidity. Essence and matter need each other!

Essence realizes ("makes real") matter. This process, the movement from formless stuff to complete being, is called entelechy, which some translate as actualization.

There are four causes that contribute to the movement of entelechy. They are answers to the question "why?" or "what is the explanation of this?"

1. The material cause: what something is made of.
2. The efficient cause: the motion or energy that changes matter.
3. The formal cause: the thing’s shape, form, or essence; its definition.
4. The final cause: its reason, its purpose, the intention behind it.

1. The material cause: The thing’s matter or substance. Why a bronze statue? The metal it is made of. Today, we find an emphasis on material causation in reductionism, explaining, for example, thoughts in terms of neural activity, feelings in terms of hormones, etc. We often go down a "level" because we can’t explain something at the level it’s at.

2. The efficient cause: The motion or energy that changes matter. Why the statue? The forces necessary to work the bronze, the hammer, the heat, the energy.... This is what modern science focuses on, to the point where this is what cause now tends to mean, exclusively. Note that modern psychology usually relies on reductionism in order to find efficient causes. But it isn’t always so: Freud, for example, talked about psychosexual energy and Skinner talked about stimulus and response.

3. The formal cause: The thing’s shape, form, definition, or essence. Why the statue? Because of the plan the sculptor had for the bronze, its shape or form, the non-random ordering of its matter. In psychology, we see some theorists focus on structure – Piaget and his schema, for example. Others talk about the structure inherent in the genetic code, or about cognitive scripts.

4. The final cause: The end, the purpose, the teleology of the thing. Why the statue? The purpose of it, the intention behind making it. This was popular with medieval scholars: They searched for the ultimate final cause, the ultimate purpose of all existence, which they of course labeled God! Note that, outside of the hard sciences, this is often the kind of cause we are most interested in: Why did he do it, what was his purpose or intention? E.g. in law, the bullet may have been the "efficient" cause of death, but the intent of the person pulling the trigger is what we are concerned with. When we talk about intentions, goals, values, and so on, we are talking about final causes.

Aristotle wrote the first book on psychology (as a separate topic from the rest of philosophy). It was called, appropriately, Para Psyche, Greek for "about the mind or soul." It is better known in the Latin form, De Anima. In this book, we find the first mentions of many ideas that are basic to psychology today, such as the laws of association.

In it, he says the mind or soul is the "first entelechy" of the body, the "cause and principle" of the body, the realization of the body. We might put it like this: The mind is the purposeful functioning of the nervous system.

Like Plato, he postulates three kinds of souls, although slightly differently defined. There is a plant soul, the essence of which is nutrition. Then there is an animal soul, which contains the basic sensations, desire, pain and pleasure, and the ability to cause motion. Last, but not least, is the human soul. The essence of the human soul is, of course, reason. He suggests that, perhaps, this last soul is capable of existence apart from the body.

He foreshadowed many of the concepts that would become popular only two thousand years later. Libido, for example: "In all animals... it is the most natural function to beget another being similar to itself... in order that they attain as far as possible, the immortal and divine.... This is the final cause of every creature's natural life."

And the struggle of the id and ego: "There are two powers in the soul which appear to be moving forces – desire and reason. But desire prompts actions in violation of reason... desire... may be wrong."


And the pleasure principle and reality principle: "Although desires arise which are opposed to each other, as is the case when reason and appetite are opposed, it happens only in creatures endowed with a sense of time. For reason, on account of the future, bids us resist, while desire regards the present; the momentarily pleasant appears to it as the absolutely pleasant and the absolutely good, because it does not see the future."

And finally, self-actualization: We begin as unformed matter in the womb, and through years of development and learning, we become mature adults, always reaching for perfection. "So the good has been well explained as that at which all things aim."

Plato: Book VII of The Republic: The Allegory of the Cave

(Source: http://classics.mit.edu/Plato/republic.html)

Here's a little story from Plato's most famous book, The Republic. Socrates is talking to a young follower of his named Glaucon, and is telling him this fable to illustrate what it's like to be a philosopher – a lover of wisdom: Most people, including ourselves, live in a world of relative ignorance. We are even comfortable with that ignorance, because it is all we know. When we first start facing truth, the process may be frightening, and many people run back to their old lives. But if you continue to seek truth, you will eventually be able to handle it better. In fact, you want more! It's true that many people around you now may think you are weird or even a danger to society, but you don't care. Once you've tasted the truth, you won't ever want to go back to being ignorant!

[Socrates is speaking with Glaucon]

[Socrates:] And now, I said, let me show in a figure how far our nature is enlightened or unenlightened: – Behold! human beings living in an underground den, which has a mouth open towards the light and reaching all along the den; here they have been from their childhood, and have their legs and necks chained so that they cannot move, and can only see before them, being prevented by the chains from turning round their heads. Above and behind them a fire is blazing at a distance, and between the fire and the prisoners there is a raised way; and you will see, if you look, a low wall built along the way, like the screen which marionette players have in front of them, over which they show the puppets.

[Glaucon:] I see.

And do you see, I said, men passing along the wall carrying all sorts of vessels, and statues and figures of animals made of wood and stone and various materials, which appear over the wall? Some of them are talking, others silent.

You have shown me a strange image, and they are strange prisoners.

Like ourselves, I replied; and they see only their own shadows, or the shadows of one another, which the fire throws on the opposite wall of the cave?

True, he said; how could they see anything but the shadows if they were never allowed to move their heads?

And of the objects which are being carried in like manner they would only see the shadows?

Yes, he said.

And if they were able to converse with one another, would they not suppose that they were naming what was actually before them?

Very true.

And suppose further that the prison had an echo which came from the other side, would they not be sure to fancy when one of the passers-by spoke that the voice which they heard came from the passing shadow?

No question, he replied.

To them, I said, the truth would be literally nothing but the shadows of the images.

That is certain.

And now look again, and see what will naturally follow if the prisoners are released and disabused of their error. At first, when any of them is liberated and compelled suddenly to stand up and turn his neck round and walk and look towards the light, he will suffer sharp pains; the glare will distress him, and he will be unable to see the realities of which in his former state he had seen the shadows; and then conceive some one saying to him, that what he saw before was an illusion, but that now, when he is approaching nearer to being and his eye is turned towards more real existence, he has a clearer vision, – what will be his reply? And you may further imagine that his instructor is pointing to the objects as they pass and requiring him to name them, – will he not be perplexed? Will he not fancy that the shadows which he formerly saw are truer than the objects which are now shown to him?

Far truer.

And if he is compelled to look straight at the light, will he not have a pain in his eyes which will make him turn away to take refuge in the objects of vision which he can see, and which he will conceive to be in reality clearer than the things which are now being shown to him?

True, he said.

And suppose once more, that he is reluctantly dragged up a steep and rugged ascent, and held fast until he is forced into the presence of the sun himself, is he not likely to be pained and irritated? When he approaches the light his eyes will be dazzled, and he will not be able to see anything at all of what are now called realities.

Not all in a moment, he said.

He will require to grow accustomed to the sight of the upper world. And first he will see the shadows best, next the reflections of men and other objects in the water, and then the objects themselves; then he will gaze upon the light of the moon and the stars and the spangled heaven; and he will see the sky and the stars by night better than the sun or the light of the sun by day?

Certainly.

Last of all he will be able to see the sun, and not mere reflections of him in the water, but he will see him in his own proper place, and not in another; and he will contemplate him as he is.

Certainly.

He will then proceed to argue that this is he who gives the season and the years, and is the guardian of all that is in the visible world, and in a certain way the cause of all things which he and his fellows have been accustomed to behold?

Clearly, he said, he would first see the sun and then reason about him.

And when he remembered his old habitation, and the wisdom of the den and his fellow-prisoners, do you not suppose that he would felicitate himself on the change, and pity them?

Certainly, he would.

And if they were in the habit of conferring honours among themselves on those who were quickest to observe the passing shadows and to remark which of them went before, and which followed after, and which were together; and who were therefore best able to draw conclusions as to the future, do you think that he would care for such honours and glories, or envy the possessors of them? Would he not say with Homer,

Better to be the poor servant of a poor master, and to endure anything, rather than think as they do and live after their manner?

Yes, he said, I think that he would rather suffer anything than entertain these false notions and live in this miserable manner.

Imagine once more, I said, such an one coming suddenly out of the sun to be replaced in his old situation; would he not be certain to have his eyes full of darkness?

To be sure, he said.

And if there were a contest, and he had to compete in measuring the shadows with the prisoners who had never moved out of the den, while his sight was still weak, and before his eyes had become steady (and the time which would be needed to acquire this new habit of sight might be very considerable) would he not be ridiculous? Men would say of him that up he went and down he came without his eyes; and that it was better not even to think of ascending; and if any one tried to loose another and lead him up to the light, let them only catch the offender, and they would put him to death.

No question, he said.

This entire allegory, I said, you may now append, dear Glaucon, to the previous argument; the prison-house is the world of sight, the light of the fire is the sun, and you will not misapprehend me if you interpret the journey upwards to be the ascent of the soul into the intellectual world according to my poor belief, which, at your desire, I have expressed whether rightly or wrongly God knows. But, whether true or false, my opinion is that in the world of knowledge the idea of good appears last of all, and is seen only with an effort; and, when seen, is also inferred to be the universal author of all things beautiful and right, parent of light and of the lord of light in this visible world, and the immediate source of reason and truth in the intellectual; and that this is the power upon which he who would act rationally, either in public or private life must have his eye fixed.

Logical Fallacies

Fallacies are arguments that may sound logical, but are not. When you look at some of the examples below, you may see some with conclusions you agree with and some you don't. But the truth, in the empirical sense, is not what is at issue: What these examples are all about is logical argument. All these examples are illogical and based on fallacious thinking.

For example, one fallacy is called "sweeping generalization." Someone may argue: "That is the richest sorority on campus; so Sue, who belongs to that sorority, must be one of the richest women on campus." Well, Sue may be one of the richest; or she may be one of the poorest. It doesn't matter whether the conclusion is true or not in the literal sense. The argument is illogical. It means nothing at all to say that, if a group has a certain quality, then a member of the group must have that quality, too.

Probably everyone has been guilty of inadvertently using them. Most of us fall for them even if we know better. And there are some people (propagandists, advertisers, and many politicians) who use them all the time. It would be wise to become familiar with the fallacies in order to protect ourselves from the unscrupulous. But by no means is this list meant to encourage the use of fallacies!

Affirmation of the consequent: "A implies B, B is true, therefore A is true"

"If the universe had been created by a supernatural being, we would see order and organization everywhere. And we do see order, not randomness – so it's clear that the universe had a creator." (No: The order could have some other origin.)

"If there is indeed a collective unconscious, then we will find that the mythologies of all the world’s cultures have profound commonalities. And indeed they do – therefore, there must be a collective unconscious!" (No: There may be all sorts of other reasons for mythologies to have commonalities.)

This is the converse of denial of the antecedent (below).

A slight variation of affirming the consequent is converting a conditional: "If A then B, therefore if B then A".

"When educational standards are lowered, the quality of shows on television worsens. So if we see television getting worse over the next few years, we'll know that our educational standards are still falling." (No: The worsening of television could have other causes.)

"If the latest drugs work well, we will see a great improvement n mental health. So, if mental health improves, we will know that these drugs were effective!" (No again! Mental health may improve for other reasons.)

This fallacy is similar to the affirmation of the consequent, but phrased as a conditional statement.

Denial of the antecedent: "A implies B, A is false, therefore B is false"

"If the God of the Bible appeared to me, personally, that would certainly prove that Christianity was true. But God has never appeared to me, so the Bible must be a work of fiction." (Nope: God may not appear to you even if the Bible were true.)

"If there were such a thing as penis envy, we would expect women to be easier on their sons than on their daughters. But penis envy is, of course, not real – so naturally women do not treat their sons better than their daughters." (No: They may still do so, just for other reasons.)

This is the converse of the fallacy of affirmation of the consequent.

There is also a version that says "if A, then B, therefore, if not A, then not B."

"If you have a PhD in psychology, you must be pretty knowledgeable in the field. Therefore, if you don’t have the PhD, you must be abysmally ignorant of psychology." (No: Having that PhD may mean you have knowledge, but knowledge hardly depends on a degree.)

Fallacy of composition: the idea that a property shared by a number of individual items is also shared by a collection of those items; or that a property of the parts of an object must also be a property of the whole thing.

"This new truck is made entirely of lightweight aluminum components, and is therefore very lightweight." In fact, a truck is composed of so many "lightweight" parts, it is bound to be far from lightweight itself!

Note that a ton of feathers does NOT weigh less than a ton of lead!

"Since neurons are either excitatory or inhibitory, the brain itself must have basically excitatory or inhibitory states."

A variation of composition is the genetic fallacy: Drawing a conclusion about the goodness or badness of something on the basis of the goodness or badness of the thing’s origin. E.g. "The medicine made from that plant must be poisonous, because that plant is poisonous." (not actually ad hominem – see below – but often listed there)

"The humanitarian work we do may well come out of our need to look good in front of our fellow man. So humanitarian work is basically egotistical!"

The opposite of the fallacy of composition is the fallacy of division: assuming that a property of some thing must apply to its parts; or that a property of a collection of items is shared by each item.

"Humans are conscious and are made of cells; therefore, each cell has consciousness"

"You are studying at a rich college. Therefore you must be rich."

"Since the team could solve the problem so easily, I assume that each member of the team could do it just as well alone."

And a fallacy that totally confuses parts and wholes: the fallacy of the undistributed middle: Suggesting that things are in some way similar, but not actually specifying how. "A is a kind of C, B is a kind of C; therefore, A is B."

"Cats are a form of animal based on carbon chemistry, dogs are a form of animal based on carbon chemistry, so aren't dogs and cats basically identical?"

"They’re both students, so I can expect the same from both."

"Since they are both schizophrenics, they should both have the same reaction to this new medication."

Sweeping generalization (the fallacy of accident, dicto simpliciter): Applying a general rule to a special case: a general rule is applied to a particular situation, but the features of that particular situation mean the rule is inapplicable.

"Christians generally dislike atheists. You are a Christian, so you must dislike atheists."

Sweeping generalization includes a common misunderstanding of the nature of statistics:

"The majority of people in the United States die in hospitals, so stay out of them."

"Men are statistically more aggressive than women. Therefore, I, a male, must be more aggressive than you, a female."

Hasty generalization is the converse of sweeping generalization: A special case is used as the basis of a general rule. A general rule is created by examining only a few specific cases which aren't representative of all possible cases.

"I know a union representative and he's a terrible person. I wouldn't trust any of them."

"Jim Bakker was an insincere Christian. Therefore all Christians are insincere."

"This schizophrenic has paranoid delusions. It stands to reason that they all do."

Hasty generalization includes another common misunderstanding of statistics called the statistics of small numbers:

"My parents smoked all their lives and they never got cancer."

"The five subjects in our experiment responded well to our intervention. We can therefore recommend the procedure to everyone."

Another version is called observational selection: pointing out favorable circumstances while ignoring the unfavorable. For example, at any gambling institution, a great deal of fuss is made over those who win, while those who lose are quietly encouraged to sneak out the back. This way, winning seems much more likely than it is!

"All of these people who prayed for a cure survived their disease. Prayer is clearly to be recommended!"

And observational selection includes anecdotal evidence:

"Just last week I read about a girl who was dying of cancer. Her whole family went to church and prayed for her, and she was cured. That only proves the power of prayer!"

"Uncle Joe got over his rheumatism by drinking his own urine!"

"Urban myths" are usually good examples!

Bifurcation ("black or white," excluded middle, false dichotomy): Presuming an either-or distinction. Suggesting that there are only two alternatives, where in fact other alternatives exist or can exist. Instead of black or white, we can have shades of gray... or even rainbows of colors!

"We must choose between safety and freedom. And it is in the nature of good Americans to take the risk of freedom." Must we choose? Can't we have both?

"A patient either gets better or they don’t."

"Come on now– is he or isn’t he bipolar?"

Considering only the extremes:

"He's either guilty or not guilty."

Begging the question (petitio principii). Assuming as a premise the conclusion which you wish to reach. Instead of offering real proof, we can just restate the conclusion we are supposed to come to, and hope the listener doesn't notice.

"Government ownership of public utilities is dangerous, because it is socialistic." But government ownership of public utilities is socialism. You've just been told that it's dangerous because it is what it is.

"We must encourage our youth to worship God to instill moral behavior." But does religion and worship actually produce moral behavior? Of course not!

"Qualitative methods are essentially worthless because they don’t involve measurement or statistics."

The most obvious form of begging the question is the circular argument (vicious cycle, circulus in demonstrando): Stating in one's proof that which one is supposed to be proving.

"We know that God exists because the Bible tells us so. And we know that the Bible is true because it is the word of God."

"Your arguments against Freud are due to your unresolved unconscious conflicts."

"Your arguments against Skinner are due to your conditioning."

"Your arguments against existentialism are indicative of your inauthenticity."

There’s also the appeal to faith: Faith, by definition, relies on a belief that does not rest on logic or evidence. Faith depends on irrational thought.

"If you accept the Lord, you will understand!"

"If you would only take Maslow at his word, you would finally get it!"

And the most common way to use begging the question is question-begging epithets (loaded words, emotive language, etc.). Restating the conclusion in "hot" language: "This criminal is charged with the most vicious crime known to man." Does it prove something, or just get the blood flowing?

Often hard to identify (and so very dangerous) is the ad hoc argument: Giving an after-the-fact explanation which doesn't apply to other situations.

"I see that John’s cancer is in remission." "Yes, our prayers have been answered!" "But didn’t you pray for Susan, too, and look what happened to her." "I’m sure God had a special reason for taking her."

"Those people who don’t follow the expected pattern of strong-mother/weak-father leading to homosexuality are no doubt hiding their true orientation!"

Look out when people say "everything has a reason" or "God has a purpose for all of us."

Complex question (loaded question, trick question, leading question, fallacy of interrogation, fallacy of presupposition): Interrogative form of begging the question (above). Ask a question that leads others to believe that a previous question has been answered in a certain way.

"Answer yes or no: Did you ever give up your evil ways?" If you say yes, that tells us you had evil ways; if you say no, that tells us you still have them. What if you never had them?

"Have you stopped beating your wife yet?"

"So, are you gay, or just in denial?"

"And when will you come out of the closet?"

A variation on the complex question is the fallacy of many questions (plurium interrogationum): This fallacy occurs when someone demands a simple (or simplistic) answer to a complex question.

"Yes or no: Is democracy ultimately the best system of government?"

Another form of this fallacy is to ask for an explanation of something which is untrue or not yet established.

False cause (non causa pro causa, non sequitur): Something is identified as the cause of an event, but it has not actually been shown to be the cause. For example:

"I took an aspirin and prayed to God, and my headache disappeared. So God cured me of the headache."

"Artists often suffered from depression as adolescents. So, if you want your child to be a great artist, don’t put them on Prozac!"

The most common form of false cause is called post hoc ergo propter hoc: An inference or conclusion that does not follow from established premises or evidence. Assuming causal connections that haven't been demonstrated. The Latin phrase means "after this, therefore because of this."

"You should go to Harvard, because Harvard graduates make more money." Or could it be that they had more money before they went?

"She got sick after she visited China, so something in China caused her sickness." Or could it be that she was sick prior to leaving for China?

"There was an increase of births during the full moon. Therefore, full moons cause birth rates to rise."

A slight variation is cum hoc ergo propter hoc: Saying that, because two events occur together, they must be causally related. It's a fallacy because it ignores all the other possible causes of the events.

"Literacy rates have steadily declined since the advent of television. Clearly television viewing impedes learning."

"He started using drugs just about the time he started seeing that girl. I knew she was a bad influence!"

A common statistical version of this is confusion of correlation and causation: correlation cannot tell you anything about the direction of causality. If X is powerfully correlated with Y, X could be the cause of Y, Y could be the cause of X, or (most likely) something else is the cause of both. Possibly, the relationship is accidental!

"More chess players are men, therefore, men make better chess players than women."

"Far more women than men suffer from depression. We can assume that there is something about a woman’s physiology that leads to depression."

(Often followed by an ad hoc argument: The men with depression must in some way be effeminate!)
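
To see how easily a lurking third factor produces correlation without causation, here is a small simulation sketch (Python again, purely illustrative; the variable names are arbitrary). A confounder z drives both x and y; x and y never influence one another, yet they end up strongly correlated.

    import random

    random.seed(0)

    # z is the hidden common cause; x and y each follow z plus their own noise.
    z = [random.gauss(0, 1) for _ in range(10_000)]
    x = [zi + random.gauss(0, 0.5) for zi in z]
    y = [zi + random.gauss(0, 0.5) for zi in z]

    def pearson(u, v):
        # Plain Pearson correlation coefficient, computed from scratch.
        n = len(u)
        mu, mv = sum(u) / n, sum(v) / n
        cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
        var_u = sum((a - mu) ** 2 for a in u)
        var_v = sum((b - mv) ** 2 for b in v)
        return cov / (var_u * var_v) ** 0.5

    print(round(pearson(x, y), 2))  # about 0.8, with no causal link between x and y

With these numbers the expected correlation is 1 / 1.25 = 0.8: a "powerful" correlation, produced entirely by the confounder.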

Missing the point (irrelevant thesis, ignoratio elenchi, irrelevant conclusion, ignoring the issue, befogging the issue, diversion, red herring, etc.). Demonstrating a point other than the one at issue. Diverting attention by changing the subject. Escaped convicts in Elizabethan England would smear themselves with rotten (red) herring to throw the dogs off the scent.

"I fail to see why hunting should be considered cruel when it gives tremendous pleasure to many people and employment to even more." So we should stop talking about cruelty and start talking about pleasure and employment?

"Christianity is the only true religion: It has clearly been of great help to many people." No matter how well he argues how much it has helped people, he will not have shown that Christian teachings are true.

"It is very clear that we prescribe psychoactive medications to people who don’t really need them. We should outlaw these drugs altogether!"

One example is the straw man: Creating a false scenario and then attacking it. Misrepresenting someone else's position so that it can be attacked more easily.

"Evolutionists think that everything came about by random chance. How could that be?" Most evolutionists

33 | 71© Copyright 2006 C. George Boeree

Page 34: Boeree, C. George - The History of Psychology (I-IV)

C. George Boeree: History of Psychology Part One: The Ancients

think in terms of natural selection which may involve incidental elements, but does not depend entirely on random chance. Painting your opponent with false colors only deflects the purpose of the argument.

"To summarize Freud, he believed that it all boils down to sex. Let me show you why Freud is therefore full of crap!"

Another example is reification (hypostatization): when people treat an abstract concept or hypothetical construct as if it represented a concrete event or physical entity.

IQ tests are often presented as actual measures of intelligence, for example.

"What is consciousness? You can’t find it anywhere in the human brain, so we must reject the concept."

And another example, the meaningless question:

"How high is up?" "Up" describes a direction, not a measurable entity.

"Does anything really exist?"

"How can we experience the collective unconscious directly?"

A really tricky version of missing the point is the appeal to logic (argumentum ad logicam): This is the "fallacy fallacy" of arguing that a proposition is false because it has been presented as the conclusion of a fallacious argument. Remember that fallacious arguments can arrive at true conclusions.

"Take the fraction 16/64. Now, cancelling a six on top and a six on the bottom, we get that 16/64 = 1/4." "Wait a second! You can't just cancel the six! Your math is wrong: 16/64 does not equal 1/4!" Yes it does, even though the math is wrong.

Very common are half truths (suppressed evidence): A statement, usually intended to deceive, that omits some of the facts necessary for an accurate description.

And one of the worst versions of missing the point is false analogy: An analogy or metaphor illustrates or elaborates; it doesn't prove anything: "The American Indian had to make way for Western civilization; after all, you can't make an omelette without breaking a few eggs." Are the lives and cultures of millions comparable to eggs? What does making omelettes have to do with history and morality?

"Since the mind is essentially a wet computer, our task is to figure out how we can best program it!"

There are many fallacies that involve the misuse of words.

Very common is special pleading: Here, we use a double standard of words.

"The ruthless tactics of the enemy, his fanatical, suicidal attacks have been foiled by the stern measures of our commanders and the devoted self-sacrifice of our troops." Are ruthless tactics different from stern measures? Fanatical, suicidal attacks from devoted self-sacrifice?

"Ellis’s therapy is authoritarian and aggressive!"

"Rogers’s therapy is laissez faire, even lazy!"

This is not far from the fallacy of equivocation: Use of ambiguous words. A key word is used with two or more different meanings in the same argument. Shifting the meaning of the words.

"What could be more affordable than free software? But to make sure that it remains free, that users can do what they like with it, we must place a license on it to make sure that will always be freely redistributable."

One way to avoid this fallacy is to choose your terminology carefully before beginning the argument, and avoid words like "free" which have many meanings.

The "no true Scotsman..." fallacy: Suppose I assert that no Scotsman puts sugar on his porridge. You

34 | 71© Copyright 2006 C. George Boeree

Page 35: Boeree, C. George - The History of Psychology (I-IV)

C. George Boeree: History of Psychology Part One: The Ancients

counter this by pointing out that your friend Angus likes sugar with his porridge. I then say "Ah, yes, but no true Scotsman puts sugar on his porridge. I basically limit the meaning of the word "Scotsman."

"How can he do that to her if he loves her?" "Ah, but that’s not true love, see?"

"No caring therapist would use methods like that!"

"No well-trained scientist would come to those conclusions!"

"Christians turn the other cheek." "But I've seen many Christians strike others." "Yes, but those aren't good Christians. They aren't even real Christians at all!"

The previous example includes the use of accent – changing oral stress within a sentence to alter the meaning.

"All men are created equal..." implies that women are not. "All men are created equal..." suggests that they don’t end up equal.

An amusing misuse of words is amphiboly – use of ambiguous sentences.

"Two pizzas for one special price." Two for one? Or both at the same "special" price?

Personal attack (argumentum ad hominem): Attacking the person instead of the argument. In a personal attack, we ask the listener not to consider the argument, but to consider where it is coming from:

"This theory about a new cure for cancer has been introduced by a man known for his Marxist sympathies. I don't see why we should extend him the courtesy of our attention."

"You can’t trust Freud – he used cocaine!"

"You can’t trust Adler – he was a socialist!"

"You can’t trust Horney – she suffered from depression!"

But Marxists, cocaine users, socialists, and depressed people can be right!

Then there’s the abusive form of the personal attack:

"You claim that atheists can be moral – yet I happen to know that you abandoned your wife and children."

"You don’t agree with experimentation? I’ve read that you were never able to get any of your own research published!"

A little more clever is the circumstantial form of the personal attack:

"It is perfectly acceptable to kill animals for food. Since you are wearing leather shoes, I am sure you won’t argue with that."

"You don’t agree with Rogers – yet I notice you use reflection in your own practice!"

Very damaging is poisoning the well: The personal attack can also be used as an excuse to reject a particular conclusion, such as when you allege that someone is rationalizing a conclusion for selfish reasons. You’ve "poisoned the well" in that, from now on, people will tend to doubt his arguments.

"Of course you'd argue that affirmative action is a bad thing. You're white."

Note that if someone is a known perjurer or liar, that fact will reduce their credibility as a witness. It won't, however, prove that their testimony is false in this case. Liars can tell the truth!

"Don’t listen to her criticisms of existentialism – she’s an experimentalist!"

And every teenager’s favorite argument is called tu quoque (two wrongs make a right): Latin for "you, too!" or "look who's talking!"

"If you think communal living is such a great idea, why aren't you living in a commune?"

"If psychology is so great, how come YOU have so many problems?"

"If smoking is so bad for you, why do you smoke?"

But even a smoker can know that it isn't good for you!

Appeal to the masses (argumentum ad populum, appealing to the people, mob appeal, appealing to the gallery, appeal to popular pieties). This involves theatrical appeals to our lowest instincts, such as selfishness, greed, jealousy, or vanity, rather than facts and reasons. "Because you are a college audience, I know I can speak to you about difficult matters seriously." Oh, well, thank you very much; please do go on!

"The enormous popularity of books on dream analysis alone suggests its validity!"

One example of appeal to the masses is the bandwagon fallacy (consensus gentium, argumentum ad numerum): concluding that an idea has merit simply because many people believe it or practice it.

"Most people believe in a god; therefore, it must be true." Simply because many people may believe something says nothing about the fact of that something. Once upon a time, everyone thought the earth was flat!

"All I'm saying is that millions of people believe in astrology, so there must be something to it."

"Everyone is moving into cognitive style research – there must be something to it!"

Argument from omniscience: The "everybody" version of the preceding.

"Everyone knows that men and women are psychologically the same!"

"People need to believe in something. Everyone knows that." Beware of words like "all," "everyone," "everything."

Appeal to authority (argumentum ad verecundiam): This is where we bring up famous people, reference groups, science, tradition, religion, universality....

"Professor Boeree says behaviorism is dead." Simply because an authority says something does not necessarily mean it's correct.

The great philosopher Santayana said "Those who remain ignorant of history are doomed to repeat it." But Henry Ford said "History is bunk!" So who is right?

"Freud said.... – and who are we to argue with a genius of his caliber?"

This includes the famous technique called snob appeal: "Camel filters. They're not for everybody!"

"All those who can afford it prefer Freudian therapy!"

Variations include appeal to tradition (argumentum ad antiquitatem): This is the fallacy of asserting that something is right or good simply because it's old, or because "that's the way it's always been."

Just because people practice a tradition, says nothing about whether it is true. See, for example, astrology, slavery, superstition, human sacrifice....

"Psychologists have always agreed that...."

The opposite is called appeal to novelty (argumentum ad novitatem): The fallacy of asserting that something is better or more correct simply because it is new, or newer than something else.

"It’s the latest!"

"Windows 99 is much better than Windows 95. How could it not be, coming after so many years of experience!"

"The most recent studies show that...."

Appeal to riches (argumentum ad crumenam): The fallacy of believing that money is a criterion of correctness; that those with more money are more likely to be right, or that something that costs more is intrinsically better.

"Microsoft software is undoubtedly superior; why else would Bill Gates have gotten so rich?"

"It costs twice as much – it must be twice as good!" "Indeed. You get what you pay for!" Do you?

"I’ll have to side with the psychiatrists. After all, they make more money than the PhD psychologists!"

The opposite is appeal to poverty (argumentum ad lazarum): The fallacy of assuming that someone poor is sounder or more virtuous than someone who's wealthier, or that something inexpensive or plain is somehow naturally better. For example:

"Monks are more likely to possess insight into the meaning of life, as they have given up the distractions of wealth."

"A simple loaf of bread, made lovingly by hand – what could be better?"

"Since John does so much of his work pro bono, he must be a much more honest therapist."

Appeal to nature (the natural law fallacy): Arguing that, because human beings are products of the natural world, we must mimic behavior seen in the natural world, and that to do otherwise is 'unnatural'. A common fallacy in political arguments.

"The natural world is characterized by competition; animals struggle against each other for ownership of limited natural resources. Capitalism, the competitive struggle for ownership of capital, is simply an inevitable part of human nature. It's how the natural world works."

"Of course homosexuality is unnatural. When's the last time you saw two animals of the same sex mating?" (Actually, that’s much more common than people think! But that, too, is irrelevant.)

"Our attraction to 'beautiful' people parallels the instincts of birds and mammals. Love, therefore, is nothing but an instinct!"

Appeal to pity (argumentum ad misericordiam): This is an appeal to your tender emotions, your sympathy: Listen, if you can bear it, to any telethon. Or listen to advertisements that try to sell computers to parents.

"You wouldn't want your kids to be left behind on the information super-highway, would you? What kind of parent are you anyway?"

"I did not murder my mother and father with an axe! Please don't find me guilty; I'm suffering enough through being an orphan."

"Qualitative methods are used by a small group of dedicated researchers working in a hostile environment of experimentalism."

Appeal to ignorance (argumentum ad ignorantiam, argumentum ex silentio): Arguing that something must be true, simply because it hasn't been proved false. Or arguing that something must be false because it hasn't been proved true. That is, my position is right because there is no evidence against it. Or yours is wrong because there is no evidence for it.

"We have no evidence that God doesn't exist. Therefore, he must exist."

"There is intelligent life in outer space, for no one has been able to prove that there isn't." Fact of the matter is, you can't prove the non-existence of something: No matter how hard you look, I can always say you haven't looked hard enough. Go ahead: Prove to me that unicorns don't exist!

"We don’t know whether holistic medicines actually help psychological disorders, so we might as well use them!" (Followed by a pity argument: Would you deny people the chance of getting better, just because there’s no evidence?)

A common accompaniment to the appeal to ignorance is shifting the burden of proof: The burden of proof is always on the person asserting something. Shifting the burden of proof is the fallacy of putting the burden of proof on the person who denies or questions the assertion.

So, when an arguer cannot provide the evidence for his claims, he may challenge his opponent to prove him wrong.

"Prove God doesn't exist, then!"

"Prove UFO's aren't real, then!"

"I believe that homosexuality is based on biological differences – I dare you to prove me wrong!"

Appeal to fear (argumentum ad baculum, appeal to force): Don't argue with me, it's dangerous!

"If you do not convict this murderer, one of you may be his next victim." A similar argument is frequently used in deodorant ads.

"If you don't believe in God, you'll burn in hell"

"You better learn your stats: You’ll never be able to get your doctorate if you don’t!"

A little more subtle is the argument from adverse consequences:

"The accused must be found guilty, otherwise others will commit similar crimes"

And a common variation is the slippery slope: Arguing that a change in procedure, law, or action, will result in adverse consequences.

"Give ‘em an inch, and they’ll take a mile!"

"Pass the equal rights for women amendment and before you know it, we’ll all be using unisex bathrooms!"

"If we legalize marijuana, then more people would start to take crack and heroin, and we'd have to legalize those too. Before long we'd have a nation full of drug-addicts on welfare. Therefore we cannot legalize marijuana."

"If we allow doctor assisted suicide, then eventually the government will control how we die." It does not necessarily follow that just because we make changes that a slippery slope will occur.

"If you start people on Prozac, they will become dependent on it, then on drugs in general, and never learn to deal with their problems on their own!"

Argumentum ad nauseam: This is the incorrect belief that an assertion is more likely to be true, or is more likely to be accepted as true, the more often it is heard. So an argumentum ad nauseam is one that employs constant repetition in asserting something, saying the same thing over and over again until you're sick of hearing it. See almost any commercial, or take a look at the practice of having children memorize Bible verses.

"Classical conditioning must be at the root of all learning – I had that drummed into my head at Penn State!"

"All my life, people have told me: a man doesn’t show weakness!"

Epicureans and Stoics

Cynicism

After Plato and Aristotle, the concerns of the philosophers moved further and further from metaphysics, epistemology, and anything resembling modern science, to the issue that had always concerned the ancient Greeks the most – ethics. What is it to be virtuous, to have character, to live the good life, to have "arete" (nobility)?

Antisthenes (445-365) was the son of an Athenian citizen and a Thracian slave girl. After starting his own school, he came to recognize that Socrates was wiser than he. He went over, students and all, to learn from the master.

Antisthenes is the founder of cynicism. Cynic comes from the Greek word for dog, originally because Antisthenes taught at the Cynosarges (Dogfish) gymnasium, which had been set up for the poor of Athens.

Cynicism involves living the simple life in order that the soul can be set free. It is a "back to nature" type of philosophy, à la St. Francis of Assisi or the Hindu ascetics. By eliminating one’s needs and possessions, one can better concentrate on the life of philosophy.

Cynicism makes virtue the only good, the only true happiness. You can’t control the world and life’s ups and downs, so control yourself! Inhibit your desires! Become independent of the world! "I would rather go mad than feel pleasure!" said Antisthenes. Rejecting civilization, cynics tended to withdraw from society, even to live in the desert. In this, they may have influenced early Jewish and Christian monastics.

Cynicism wasn’t entirely negative (from today’s values perspective): They strongly encouraged individualism, believed that all men were brothers, were against war and slavery, and believed in free speech. They also believed in the legitimacy of suicide and, oddly, free love!

The most famous of the cynics was Diogenes (412-323), a student of Antisthenes. He saw himself as a citizen of the world (a "cosmopolitan"), yet for a time lived in a discarded clay jar. There is a famous story that has Alexander the Great finding him sleeping in the sun and announcing "I am Alexander the great king!" Diogenes replied "I am Diogenes the dog!" Alexander asked if there was anything he could do for him. Diogenes just asked him to move out of the sun.

Hedonism

Aristippus (435-355) was also a student of Socrates. Originally from Cyrene on the north coast of Africa, he returned there to found his own school, where he taught the philosophy of hedonism (from the Greek word for pleasure). Hedonism is very simple: Whatever we do, we do to gain pleasure or to avoid pain. Pleasure is the only good, and the achievement of pleasure the only virtue. Morality is only a matter of culture and customs and laws, something we now call ethical relativism. Further, science, art, civilization in general, are good only to the extent that they are useful in producing pleasure.

Note, however, that Aristippus also taught that some pleasures are higher than others, and that we should be slaves to none of them. He was equally cheerful in good times and in poverty, and despised useless displays of wealth.

He and his students lived as part of a commune-like school where all practiced what they preached, including free love, more than 2000 years before Woodstock! Women were the full equals of men, and not only hypothetically: His daughter Arete succeeded him in leadership of the school and commune. She wrote 40 books herself and was honored by the city of Cyrene with the title "Light of Hellas."

Skepticism

Skepticism today is usually considered a positive thing – not to accept anything on faith could be a motto for any number of famous philosophers! In its origin, however, it was a bit more extreme. Pyrrho of Elis (365-275) is usually credited with founding the "school" of skepticism. It is believed that he traveled to India and studied with the "gymnosophists" (naked lovers of wisdom), which could have been any number of Indian sects. From there, he brought back the idea that nothing can be known for certain. The senses are easily fooled, and reason too easily follows our desires.

If we cannot ever know anything for certain, then we may as well suspend our judgment, stop arguing over what will never be settled, and try to find a little peace and tranquility in life. That tranquility he called ataraxia. Note that, although we can't know anything for certain, we can know many things well enough to get by. The sun may or may not rise tomorrow – but the odds are good that it will, and what use would it serve to worry about it anyway!

Likewise, if no system is ultimately supportable, for the sake of peace, simply adopt whatever system is prevalent in your neck of the woods. Pyrrho lived out his life worshiping the gods of Elis, although he would certainly never acknowledge that they had any more likelihood of reality than any other gods, or no gods at all! There are many things a skeptic might accept for convenience, even though there is no ultimate proof.

Although at first glance this sounds positive, one of my students, Annie Lam, said this:

Using Pyrrho’s reasoning, slavery would still exist today because Black Americans should accept their role in life as chattel in order to preserve peace in the community. Most societies organize themselves into hierarchical systems, thus, those groups of individuals who are lower on the hierarchy typically experience oppression, and in some extreme examples may be dehumanized and brutalized. I agree with the idea that nothing can be known for certain; however, it is for this reason that I believe arguments and debates should occur as opposed to being discouraged as advocated by Pyrrho. It is only with the free and respectful exchange of ideas that individuals can develop their personal values and beliefs in an educated manner. I think if we sacrifice this exchange in order to acquire ataraxia, we also sacrifice our ability to develop a genuine self because self-reflection – judgment of self and others – is not encouraged.

Later skeptics became prevalent among the students in Plato’s Academy. One in particular, Carneades of Cyrene (c. 214-129), was notorious for arguing one side of an issue one day and the other the next day. He said "There is absolutely no criterion for truth. For reason, senses, ideas, or whatever else may exist are all deceptive."

Stoicism

"Only the educated are free." – Epictetus

The founder of stoicism is Zeno of Citium (333-262), in Cyprus. Zeno may have been Phoenician or part Phoenician. He was a student of the cynics, but was also influenced by Socrates. His philosophy was similar to that of Antisthenes, but tempered by reason. Basically, he believed in being virtuous, and that virtue was a matter of submitting to God’s will. As usual for Greeks who postulated a single god, Zeno did not strongly differentiate God from nature. So another way of putting it is to live according to nature ("zen kata physin").

The school got its name from the Painted Porch (stoa poikile) in Athens where Zeno taught. Walking up and down the open hallways, he lectured his students on the value of apatheia, the absence of passion, something not too different from the Buddhist idea of non-attachment. By passion Zeno meant uncontrolled emotion or physical desire. Only by taking this attitude, he felt, could we develop wisdom and the ability to apply it.

"Let no one break your will!" he said. Man conquers the world by conquering himself. Start by developing an indifference to pain and pleasure, through meditation. Wisdom occurs when reason controls passions; Evil occurs when passions control us.

Another aspect of Stoicism is its belief in the development of a universal state, in which all men were brothers. Stoics believed in certain "natural rights," a concept which we wouldn’t see again until the 18th century. They also believed in the right to commit suicide – an important part of Roman cultural tradition.

The best presentation of stoicism is by the Greek slave Epictetus (50-138 AD), who wrote during the Roman era. There is also a little book, Meditations, by the Roman emperor Marcus Aurelius (121-180 AD).

Epicureanism

"The gods are not to be feared;death cannot be felt;

the good can be won;all that we dread can be conquered."

– Epicurus

Epicurus (341-270) was born on the island of Samos in Ionia. At 19, he went to Athens to study at the Academy. It seemed, though, that he liked the philosophy of Democritus better. The school he founded was particularly egalitarian, accepting women and slaves. Epicurus, it is said, wrote 300 books. Sadly, only fragments survive.

Epicurus had little patience with religion, which he considered a form of ignorance. He was particularly eager to help people lose their fear of the gods. He did, however, also say that the gods existed, although they lived far away in space somewhere and had little or nothing to do with people on earth. Atheism, you see, was still illegal in Athens!

One of the most persistent issues concerning belief in God is the problem of evil. Epicurus's argument still holds up:

Is God willing to prevent evil, but not able? Then he is not omnipotent. Is he able, but not willing? Then he is malevolent. Is he both able and willing? Then whence cometh evil? Is he neither able nor willing? Then why call him God?

Epicurus felt that it was useless to argue over metaphysics, that there was no such thing as a soul that lived after death, that we arrived at our present condition by means of evolution, and that we had the quality of free will.

We can see an almost "modern" materialism and empiricism here: All things – including minds – are made of atoms and follow natural laws. All knowledge comes from the senses. Thoughts and memories are nothing but weak sensations....

Virtue for Epicurus was a means to an end. That end is happiness. It is good to feel pleasure and to avoid pain, but one needs to apply reason to life. Sometimes pain is necessary in order to gain happiness. Other times, pleasure leads to more suffering than it is worth.

And there are levels of pain and pleasure, smaller and greater happinesses. Friendship, for example, is rated one of the highest pleasures. "A sage loves his friends as he loves himself," he said, and "It is better to give than to receive." And "It is not possible to live pleasantly without living prudently, honorably, and justly; nor to live prudently, honorably, and justly without living pleasantly." He reminds me of Benjamin Franklin!

Society is seen as necessary: It protects one from injustices. He foreshadows utilitarianism by suggesting that a society should be arranged to provide the greatest happiness to the greatest number.

The ultimate happiness, though, is peace, and he borrows Pyrrho’s word for tranquility – ataraxia. His motto was "lathe biosas" – live unobtrusively. He may be considered the first true humanist, as witnessed by this quote: "Philosophy is an activity that uses reasoning and rigorous argument to promote human flourishing."

The best summary of epicureanism is the epic poem On the Nature of Things by the Roman poet Lucretius (95-52).

Note the practical similarities between stoicism and epicureanism, despite their theoretical differences! Both were popular in the Roman era, stoicism in Rome’s early, more vigorous years and continuing among the rank and file of Roman citizenry, and epicureanism (even hedonism) behind closed doors, especially at the highest levels of the empire.

Alexander the Great introduced what is called the Hellenistic* period of history: His empire brought Greek ideas, art, language, and habits to "the world," as far east as India and as far south as Egypt. But, with his death at the age of 33, his empire began to come apart, his generals dividing it amongst themselves and incompletely conquered nations reasserting their independence. And a new people stood in the wings to take over dominance of the Mediterranean: the Romans.

And yet the influence of the Greeks would outlast the empire of Alexander, its collapse, and even the Romans. But for now, the world had become a different place, a place of large powers maneuvering among themselves, centralized authorities just like those of Asia, huge trading and marketing conglomerates tightly tied to those authorities. Not quite the place for individualistic thinking and observation.

* Hellas is Greek for Greece, and the Hellenes are the Greeks. Greece and Greek are Roman names.

Epicurus: Letter to Menoeceus

(Source: The Internet Classics Archive, classics.mit.edu/Epicurus/menoec.html; translated by Robert Drew Hicks. © 1994-1998 by the Internet Classics Archive.)

Greetings.

Let no one be slow to seek wisdom when he is young nor weary in the search thereof when he is grown old. For no age is too early or too late for the health of the soul. And to say that the season for studying philosophy has not yet come, or that it is past and gone, is like saying that the season for happiness is not yet or that it is now no more. Therefore, both old and young ought to seek wisdom, the former in order that, as age comes over him, he may be young in good things because of the grace of what has been, and the latter in order that, while he is young, he may at the same time be old, because he has no fear of the things which are to come. So we must exercise ourselves in the things which bring happiness, since, if that be present, we have everything, and, if that be absent, all our actions are directed toward attaining it.

Those things which without ceasing I have declared to you, those do, and exercise yourself in those, holding them to be the elements of right life. First believe that God is a living being immortal and happy, according to the notion of a god indicated by the common sense of humankind.... For truly there are gods, and knowledge of them is evident; but they are not such as the multitude believe, seeing that people do not steadfastly maintain the notions they form respecting them. Not the person who denies the gods worshipped by the multitude, but he who affirms of the gods what the multitude believes about them is truly impious. For the utterances of the multitude about the gods are not true preconceptions but false assumptions; hence it is that the greatest evils happen to the wicked and the greatest blessings happen to the good from the hand of the gods, seeing that they are always favorable to their own good qualities and take pleasure in people like to themselves, but reject as alien whatever is not of their kind.

Accustom yourself to believe that death is nothing to us, for good and evil imply awareness, and death is the privation [removal] of all awareness; therefore a right understanding that death is nothing to us makes the mortality of life enjoyable, not by adding to life an unlimited time, but by taking away the yearning after immortality. For life has no terror for those who thoroughly apprehend that there are no terrors for them in ceasing to live. Foolish, therefore, is the person who says that he fears death, not because it will pain when it comes, but because it pains in the prospect. Whatever causes no annoyance when it is present, causes only a groundless pain in the expectation. Death, therefore, the most awful of evils, is nothing to us, seeing that, when we are [alive], death is not come, and, when death is come, we are not. It is nothing, then, either to the living or to the dead, for with the living it is not and the dead exist no longer. But in the world, at one time people shun death as the greatest of all evils, and at another time choose it as a respite from the evils in life. The wise person does not deprecate [devalue] life nor does he fear the cessation of life. The thought of life is no offense to him, nor is the cessation of life regarded as an evil. And even as people choose of food not merely and simply the larger portion, but the more pleasant, so the wise seek to enjoy the time which is most pleasant and not merely that which is longest. And he who admonishes the young to live well and the old to make a good end speaks foolishly, not merely because of the desirability of life, but because the same exercise at once teaches to live well and to die well. Much worse is he who says that it were good not to be born, but when once one is born to pass with all speed through the gates of Hades. For if he truly believes this, why does he not depart from life? It were easy for him to do so, if once he were firmly convinced. If he speaks only in mockery, his words are foolishness, for those who hear believe him not.

We must remember that the future is neither wholly ours nor wholly not ours, so that neither must we count upon it as quite certain to come nor despair of it as quite certain not to come.

We must also reflect that of desires some are natural, others are groundless; and that of the natural some are necessary as well as natural, and some natural only. And of the necessary desires some are necessary if we are to be happy, some if the body is to be rid of uneasiness, some if we are even to live. He who has a clear and certain understanding of these things will direct every preference and aversion toward securing health of body and tranquillity of mind, seeing that this is the sum and end of a happy life. For the end of all our actions is to be free from pain and fear, and, when once we have attained all this, the tempest of the soul is laid; seeing that the living creature has no need to go in search of something that is lacking, nor to look to anything else by which the good of the soul and of the body will be fulfilled. When we are pained, then, and then only, do we feel the need of pleasure. For this reason we call pleasure the alpha and omega of a happy life. Pleasure is our first and kindred good. It is the starting-point of every choice and of every aversion, and to it we come back, inasmuch as we make feeling the rule by which to judge of every good thing. And since pleasure is our first and native good, for that reason we do not choose every pleasure whatever, but often pass over many pleasures when a greater annoyance ensues from them. And often we consider pains superior to pleasures when submission to the pains for a long time brings us as a consequence a greater pleasure. While therefore all pleasure because it is naturally akin to us is good, not all pleasure is worthy of choice, just as all pain is an evil and yet not all pain is to be shunned. It is, however, by measuring one against another, and by looking at the conveniences and inconveniences, that all these matters must be judged. Sometimes we treat the good as an evil, and the evil, on the contrary, as a good. Again, we regard independence of outward things as a great good, not so as in all cases to use little, but so as to be contented with little if we have not much, being honestly persuaded that they have the sweetest enjoyment of luxury who stand least in need of it, and that whatever is natural is easily procured and only the vain and worthless hard to win. Plain fare gives as much pleasure as a costly diet, when once the pain of want has been removed, while bread and water confer the highest possible pleasure when they are brought to hungry lips. To habituate one's self, therefore, to simple and inexpensive diet supplies all that is needful for health, and enables a person to meet the necessary requirements of life without shrinking, and it places us in a better condition when we approach at intervals a costly fare and renders us fearless of fortune.

When we say, then, that pleasure is the end and aim, we do not mean the pleasures of the prodigal or the pleasures of sensuality, as we are understood to do by some through ignorance, prejudice, or willful misrepresentation. By pleasure we mean the absence of pain in the body and of trouble in the soul. It is not an unbroken succession of drinking-bouts and of merrymaking, not sexual love, not the enjoyment of the fish and other delicacies of a luxurious table, which produce a pleasant life; it is sober reasoning, searching out the grounds of every choice and avoidance, and banishing those beliefs through which the greatest disturbances take possession of the soul.... For this reason prudence is a more precious thing even than the other virtues, for one cannot lead a life of pleasure which is not also a life of prudence, honor, and justice; nor lead a life of prudence, honor, and justice, which is not also a life of pleasure. For the virtues have grown into one with a pleasant life, and a pleasant life is inseparable from them.

Who, then, is superior in your judgment to such a person? He holds a holy belief concerning the gods, and is altogether free from the fear of death. He has diligently considered the end fixed by nature, and understands how easily the limit of good things can be reached and attained, and how either the duration or the intensity of evils is but slight. Destiny which some introduce as sovereign over all things, he laughs to scorn, affirming rather that some things happen of necessity, others by chance, others through our own agency. For he sees that necessity destroys responsibility and that chance or fortune is inconstant; whereas our own actions are free, and it is to them that praise and blame naturally attach. It were better, indeed, to accept the legends of the gods than to bow beneath destiny which the natural philosophers have imposed. The one holds out some faint hope that we may escape if we honor the gods, while the necessity of the naturalists is deaf to all entreaties. Nor does he hold chance to be a god, as the world in general does, for in the acts of a god there is no disorder; nor to be a cause, though an uncertain one, for he believes that no good or evil is dispensed by chance to people so as to make life happy, though it supplies the starting-point of great good and great evil. He believes that the misfortune of the wise is better than the prosperity of the fool. It is better, in short, that what is well judged in action should not owe its successful issue to the aid of chance.

Exercise yourself in these and kindred precepts day and night, both by yourself and with him who is like to you; then never, either in waking or in dream, will you be disturbed, but will live as a god among people. For people lose all appearance of mortality by living in the midst of immortal blessings.

THE END

* © 1994-1998 by the Internet Classics Archive, at classics.mit.edu/Epicurus/menoec.html. Translated by Robert Drew Hicks.

Epictetus: Selections from The Enchiridion *

1. Some things are in our control and others not. Things in our control are opinion, pursuit, desire, aversion, and, in a word, whatever are our own actions. Things not in our control are body, property, reputation, command, and, in one word, whatever are not our own actions.

The things in our control are by nature free, unrestrained, unhindered; but those not in our control are weak, slavish, restrained, belonging to others. Remember, then, that if you suppose that things which are slavish by nature are also free, and that what belongs to others is your own, then you will be hindered. You will lament, you will be disturbed, and you will find fault both with gods and men. But if you suppose that only to be your own which is your own, and what belongs to others such as it really is, then no one will ever compel you or restrain you. Further, you will find fault with no one or accuse no one. You will do nothing against your will. No one will hurt you, you will have no enemies, and you will not be harmed.

....

Work, therefore, to be able to say to every harsh appearance, "You are but an appearance, and not absolutely the thing you appear to be." And then examine it by those rules which you have, and first, and chiefly, by this: whether it concerns the things which are in our own control, or those which are not; and, if it concerns anything not in our control, be prepared to say that it is nothing to you.

2. Remember that following desire promises the attainment of that of which you are desirous; and aversion promises the avoiding that to which you are averse. However, he who fails to obtain the object of his desire is disappointed, and he who incurs the object of his aversion wretched. If, then, you confine your aversion to those objects only which are contrary to the natural use of your faculties, which you have in your own control, you will never incur anything to which you are averse. But if you are averse to sickness, or death, or poverty, you will be wretched. Remove aversion, then, from all things that are not in our control, and transfer it to things contrary to the nature of what is in our control. But, for the present, totally suppress desire: for, if you desire any of the things which are not in your own control, you must necessarily be disappointed; and of those which are, and which it would be laudable to desire, nothing is yet in your possession. Use only the appropriate actions of pursuit and avoidance; and even these lightly, and with gentleness and reservation. ....

5. Men are disturbed, not by things, but by the principles and notions which they form concerning things. Death, for instance, is not terrible, else it would have appeared so to Socrates. But the terror consists in our notion of death that it is terrible. When therefore we are hindered, or disturbed, or grieved, let us never attribute it to others, but to ourselves; that is, to our own principles. An uninstructed person will lay the fault of his own bad condition upon others. Someone just starting instruction will lay the fault on himself. Someone who is perfectly instructed will place blame neither on others nor on himself.

....

8. Don't demand that things happen as you wish, but wish that they happen as they do happen, and you will go on well. ....

11. Never say of anything, "I have lost it"; but, "I have returned it." Is your child dead? It is returned. Is your wife dead? She is returned. Is your estate taken away? Well, and is not that likewise returned? "But he who took it away is a bad man." What difference is it to you who the giver assigns to take it back? While he gives it to you to possess, take care of it; but don't view it as your own, just as travelers view a hotel. ....

13. If you want to improve, be content to be thought foolish and stupid with regard to external things. Don't wish to be thought to know anything; and even if you appear to be somebody important to others, distrust yourself. For, it is difficult to both keep your faculty of choice in a state conformable to nature, and at the same time acquire external things. But while you are careful about the one, you must of necessity neglect the other.

14. If you wish your children, and your wife, and your friends to live for ever, you are stupid; for you wish to be in control of things which you cannot, you wish for things that belong to others to be your own. So likewise, if you wish your servant to be without fault, you are a fool; for you wish vice not to be vice, but something else. But, if you wish to have your desires undisappointed, this is in your own control. Exercise, therefore, what is in your control. He is the master of every other person who is able to confer or remove whatever that person wishes either to have or to avoid. Whoever, then, would be free, let him wish nothing, let him decline nothing, which depends on others else he must necessarily be a slave.

15. Remember that you must behave in life as at a dinner party. Is anything brought around to you? Put out your hand and take your share with moderation. Does it pass by you? Don't stop it. Is it not yet come? Don't stretch your desire towards it, but wait till it reaches you. Do this with regard to children, to a wife, to public posts, to riches, and you will eventually be a worthy partner of the feasts of the gods. And if you don't even take the things which are set before you, but are able even to reject them, then you will not only be a partner at the feasts of the gods, but also of their empire. For, by doing this, Diogenes, Heraclitus and others like them, deservedly became, and were called, divine.

....

22. If you have an earnest desire of attaining to philosophy, prepare yourself from the very first to be laughed at, to be sneered by the multitude, to hear them say, "He is returned to us a philosopher all at once," and "Whence this supercilious [high-brow] look?" Now, for your part, don't have a supercilious look indeed; but keep steadily to those things which appear best to you as one appointed by God to this station. For remember that, if you adhere to the same point, those very persons who at first ridiculed will afterwards admire you. But if you are conquered by them, you will incur a double ridicule.

....

30. Duties are universally measured by relations. Is anyone a father? If so, it is implied that the children should take care of him, submit to him in everything, patiently listen to his reproaches, his correction. But he is a bad father. Are you naturally entitled, then, to a good father? No, only to a father. Is a brother unjust? Well, keep your own situation towards him. Consider not what he does, but what you are to do to keep your own faculty of choice in a state conformable to nature. For another will not hurt you unless you please. You will then be hurt when you think you are hurt. In this manner, therefore, you will find, from the idea of a neighbor, a citizen, a general, the corresponding duties if you accustom yourself to contemplate the several relations. ...

Be for the most part silent, or speak merely what is necessary, and in few words. We may, however, enter, though sparingly, into discourse sometimes when occasion calls for it, but not on any of the common subjects, of gladiators, or horse races, or athletic champions, or feasts, the vulgar topics of conversation; but principally not of men, so as either to blame, or praise, or make comparisons. If you are able, then, by your own conversation bring over that of your company to proper subjects; but, if you happen to be taken among strangers, be silent.

Don't allow your laughter be much, nor on many occasions, nor profuse.

Avoid swearing, if possible, altogether; if not, as far as you are able.

Avoid public and vulgar entertainments; but, if ever an occasion calls you to them, keep your attention upon the stretch, that you may not imperceptibly slide into vulgar manners. For be assured that if a person be ever so sound himself, yet, if his companion be infected, he who converses with him will be infected likewise.

48 | 71© Copyright 2006 C. George Boeree

Page 49: Boeree, C. George - The History of Psychology (I-IV)

C. George Boeree: History of Psychology Part One: The Ancients

Provide things relating to the body no further than mere use; as meat, drink, clothing, house, family. But strike off and reject everything relating to show and delicacy.

As far as possible, before marriage, keep yourself pure from familiarities with women, and, if you indulge them, let it be lawfully. But don't therefore be troublesome and full of reproofs to those who use these liberties, nor frequently boast that you yourself don't.

If anyone tells you that such a person speaks ill of you, don't make excuses about what is said of you, but answer: "He does not know my other faults, else he would not have mentioned only these." ....

41. It is a mark of want of genius to spend much time in things relating to the body, as to be long in our exercises, in eating and drinking, and in the discharge of other animal functions. These should be done incidentally and slightly, and our whole attention be engaged in the care of the understanding.

42. When any person harms you, or speaks badly of you, remember that he acts or speaks from a supposition of its being his duty. Now, it is not possible that he should follow what appears right to you, but what appears so to himself. Therefore, if he judges from a wrong appearance, he is the person hurt, since he too is the person deceived. For if anyone should suppose a true proposition to be false, the proposition is not hurt, but he who is deceived about it. Setting out, then, from these principles, you will meekly bear a person who reviles you, for you will say upon every occasion, "It seemed so to him."

....

44. These reasonings are unconnected: "I am richer than you, therefore I am better"; "I am more eloquent than you, therefore I am better." The connection is rather this: "I am richer than you, therefore my property is greater than yours;" "I am more eloquent than you, therefore my style is better than yours." But you, after all, are neither property nor style.

45. Does anyone bathe in a mighty little time? Don't say that he does it ill, but in a mighty little time. Does anyone drink a great quantity of wine? Don't say that he does ill, but that he drinks a great quantity. For, unless you perfectly understand the principle from which anyone acts, how should you know if he acts ill? Thus you will not run the hazard of assenting to any appearances but such as you fully comprehend. ....

48. The condition and characteristic of a vulgar person, is, that he never expects either benefit or hurt from himself, but from externals. The condition and characteristic of a philosopher is, that he expects all hurt and benefit from himself. The marks of a proficient are, that he censures no one, praises no one, blames no one, accuses no one, says nothing concerning himself as being anybody, or knowing anything: when he is, in any instance, hindered or restrained, he accuses himself; and, if he is praised, he secretly laughs at the person who praises him; and, if he is censured, he makes no defense. But he goes about with the caution of sick or injured people, dreading to move anything that is set right, before it is perfectly fixed. He suppresses all desire in himself; he transfers his aversion to those things only which thwart the proper use of our own faculty of choice; the exertion of his active powers towards anything is very gentle; if he appears stupid or ignorant, he does not care, and, in a word, he watches himself as an enemy, and one in ambush.

* © 1994-1998 by the Internet Classics Archive, at classics.mit.edu. Translated by Elizabeth Carter.

Timeline: 1 to 400 AD

A note about dates: It has become a tradition in the west to use BC (Before Christ) and AD (Anno Domini, "year of our lord," the presumed year of Jesus' birth). This was started by a monk by the name of Dionysius Exiguus in 525 (AD, of course) while trying to figure out the appropriate dates to celebrate Easter. As an ancient Roman, he didn't use zero, so the first year was year one, which is the reason that the centuries and the millennia begin a year after they seem like they should (e.g. 2001 instead of 2000). Because not everyone is Christian and because Jesus was probably born several years BC (Before Christ!), many now use the initials BCE (before the common era) and CE (the common era) instead. Since I am old and cranky, I stubbornly continue to use BC and AD.
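
The year-zero arithmetic is easy to get wrong, so here is a minimal sketch in Python (my own illustration; the function names are invented for this example) of converting BC/AD years to the astronomers' numbering, which does include a year 0:

    def to_astronomical(year, era):
        # Astronomers include a year 0: 1 BC -> 0, 2 BC -> -1, AD 1 -> 1.
        # Dionysius' numbering has no year 0: 1 BC is followed directly by AD 1.
        if era == "AD":
            return year
        if era == "BC":
            return 1 - year
        raise ValueError("era must be 'BC' or 'AD'")

    def years_between(year1, era1, year2, era2):
        # Elapsed years between two dates, respecting the missing year 0.
        return to_astronomical(year2, era2) - to_astronomical(year1, era1)

    assert years_between(1, "BC", 1, "AD") == 1        # one year, not two
    assert years_between(1, "AD", 2001, "AD") == 2000  # two full millennia only at 2001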

Map: The Roman Empire under Trajan, 116 AD

The Philosophies and Religions of the Roman Empire

Rome was founded c. 500 BC. By 200 BC, it ruled most of Italy, and in 146 BC, it conquered Carthage, the greatest power of the western Mediterranean at the time. By 150 BC, only three cities had over 100,000 people: Antioch, Alexandria, and Rome. By 44 BC, Rome would rule them all.

When Julius Caesar was assassinated in 44 BC (pretty much as Shakespeare described it!), that ended the vigorous Roman Republic. His adopted heir, calling himself Augustus Caesar, became the first emperor. The Roman Empire would reach its greatest extent in 116 AD under the Emperor Trajan.

As you can imagine, the best minds of Rome were absorbed into politics, war, and economics. Few had the luxury of abstract philosophizing. Besides which, the Greeks had done that already, and look how far it got them: Quite a number of Greek philosophers wound up as Roman slaves, tutoring the youth of Roman aristocracy!

In this atmosphere, we find a powerful renewed interest, among the rich and poor alike, in religion. The old religion of Rome was given lip service, to be sure. But most saw the gods as little more than stories to scare naughty children (except when the adults themselves got frightened!). They were looking for comfort in uncertain times, and they found philosophy too dry. Many different cults – of the Great Mother, of Dionysus, of Isis from Egypt, Mithra from Persia, Baal from Syria, Yahweh from Palestine – became popular. Eventually, the Judaic sect we now call Christianity would prevail.

[Why talk about religions and religious philosophies in a book on the history of psychology? There are actually a number of reasons. First, religion, philosophy, science, and psychology all come from the same human roots: We have a strong desire, even need, to understand the nature of the universe, our place in that universe, and the meaning of our lives. Religion included answers to these issues that have been psychologically satisfying as well as socially and politically powerful. Philosophy began separating from religion in the Greek and Roman times, and yet the great majority of people stuck with religion for their answers. In the renaissance and enlightenment, science began to separate from both religion and philosophy, and still the great majority remained loyal to religious dogma. And throughout much of history, religions have often taken a strongly anti-philosophical and anti-scientific position. Psychology inherits some of these issues, even into the modern era. It is valuable to any student of the history of philosophy, science, and psychology to understand the roots of religious belief and the power of those beliefs. – CGB]

Neo-Platonism

Roman Philosophy was rarely more than a pale reflection of the Greek, with occasional flares of literary brilliance, but with few innovative ideas. On the one hand, there was the continuation of a sensible, if somewhat plodding, stoic philosophy, bolstered to some extent by the tendency to eclecticism (e.g. Cicero). On the other hand, there was the growing movement towards a somewhat mystical philosophy, an outgrowth of Stoicism usually referred to as Neo-Platonism. Its best known proponent was Plotinus.

Plotinus (204-269) was born in Lycopolis in Egypt. He studied in Alexandria with Ammonius Saccus, a philosopher and dock worker who also taught the church father Origen. Plotinus left for Rome in 244, where he would teach until his death. He would have considerable influence on the Emperor Julian "the Apostate," who tried unsuccessfully to return the Roman Empire to a philosophical version of Paganism, against the tide of Christianity.

On a military campaign to Persia, he encountered a variety of Persian and Indian ideas that he blended with Plato's philosophy:

God is the supreme being, the absolute unity, and is indescribable. Any words (even the ones I just used) imply some limitation. God is best referred to as "the One," eternal and infinite. Creation, Plotinus believed, is a continuous outflow from the One, with each "spasm" of creation a little less perfect than the one before.

The first outflow is called Nous (Divine Intelligence or Divine Mind, also referred to as Logos), and is second only to the One – it contemplates the One, but is itself no longer unitary. It is Nous that contains the Forms or Ideas that the earlier Greeks talked about. Then comes Psyche (the World Soul), projected from Nous into time. This Psyche is fragmented into all the individual souls of the universe. Finally, from Psyche emanates the world of space, matter, and the senses.

Spirituality involves moving from the senses to contemplation of one’s own soul, the World Soul, and Divine Intelligence – an upward flow towards the One. Ultimately, we require direct ecstatic communion with the One to be liberated. This made neo-Platonism quite compatible with the Christianity of ascetic monks and the church fathers, and with all the forms of mysticism that would flourish in the following 1800 years!

Another proponent of Neo-Platonism worth mentioning is Hypatia of Alexandria (370-415). A woman of great intellect, she became associated with an enemy of the Christian Bishop Cyril. He apparently ordered his monks to take care of her. They stripped her naked, dragged her from her home, beat her, cut her with tiles, and finally burned her battered body. Raphael thought enough of her to include her in his masterpiece, The School of Athens.

Mithraism

One of the most popular religions of the Roman Empire, especially among Roman soldiers, was Mithraism. Its origins are Persian, and it involves their ancient hierarchy of gods, as restructured by Zarathustra (c. 628-c. 551 BC) in the holy books called the Avestas.

The universe was seen as involved in an eternal fight between light and darkness, personified by Ahura-Mazda (good) vs. Ahriman (evil). This idea probably influenced Jews while they were in Babylon, which is when they adopted HaShatan – Satan – as the evil one!

Within the Persian pantheon, Mithra was "the judger of souls" and "the protector," and was considered the representative of Ahura-Mazda on earth.

Mithra, legend says, was incarnated into human form (as prophesied by Zarathustra) in 272 BC. He was born of a virgin, who was called the Mother of God. Mithra's birthday was celebrated December 25 and he was called "the light of the world." After teaching for 36 years, he ascended into heaven in 208 BC.

There were many similarities with Christianity: Mithraists believed in heaven and hell, judgement and resurrection. They had baptism and communion of bread and wine. They believed in service to God and others.

In the Roman Empire, Mithra became associated with the sun, and was referred to as the Sol Invictus, or unconquerable sun. The first day of the week – Sunday – was devoted to prayer to him. Mithraism became the official religion of Rome for some 300 years. The early Christian church later adopted Sunday as their holy day, and December 25 as the birthday of Jesus.

Mithra became the patron of soldiers. Soldiers in the Roman legions believed they should fight for the good, the light. They believed in self-discipline and chastity and brotherhood. Note that the custom of shaking hands comes from the Mithraic greeting of Roman soldiers.

It operated like a secret society, with rites of passage in the form of physical challenges. As in the gnostic sects (described below), there were seven grades, each protected by a planet.

Since Mithraism was restricted to men, the wives of the soldiers often belonged to clubs of Great Mother (Cybele) worshippers. One of the women’s rituals involved baptism in blood by having an animal slaughtered over the initiate in a pit. This blood baptism combined with the myth of Mithra killing the first living creature, a bull, and forming the world from the bull's body, and it was adopted by the Mithraists as well.

When Constantine converted to Christianity, he outlawed Mithraism. But a few Zoroastrians still exist today in India, and the Mithraic holidays were celebrated in Iran until the Ayatollah came into power. And, of course, Mithraism survives more subtly in various European – even Christian – traditions.

Christianity

Jesus was born, it is thought, about 6 BC. His name is a Latinization of the Hebrew name Yeshua, which we know as Joshua. Legend has it that he was born in the small town of Bethlehem, to a virgin named Mary, the fiancée of a carpenter, Joseph. He grew up in Nazareth, part of a large Jewish family. He was apparently very intelligent and learned, for example, to read without formal education.

As a young man, he became very religious, and joined a group of ascetic Jews led by a charismatic leader named John the Baptist. When John was beheaded by local authorities for "rabble-rousing," many began looking to Jesus for leadership.

He had 12 disciples from various towns and walks of life, and literally hundreds of other followers, men, women, and children. They wandered the area, in part to spread their beliefs, in part to stay ahead of unfriendly authorities.

At first, Jesus’s message was a serious, even fundamentalist, Judaism. He promoted such basic ethics as loving one’s neighbor and returning hatred with kindness. He particularly emphasized the difference between the formal religion of the priests and Jewish ruling class and the less precise, but more genuine, zeal of the simple people. Supporting the message was his apparent ability to heal the sick.

The Jews of his time felt oppressed by their Roman overlords, and many believed that their God would intervene on behalf of his people by sending a messiah – a charismatic leader who would drive out the Romans and establish a new Jewish state.

Many of Jesus’s followers, of course, believed that he was the messiah. At some point in his career, he began to believe this, too. Unfortunately, the Jewish authorities, answerable to the Romans, were concerned about his popularity, and had him arrested in Jerusalem.

He was condemned to death and crucified. His followers were clearly disappointed that the promised Jewish state was not delivered. But rumor of his coming back to life, and his appearance as a vision to several of his followers, reignited their faith. Many believed that he would return – soon! – to lead them.

As time went by, of course, it was clear that he wouldn’t be coming back in their lifetimes. The less messianic, more religious aspects of his teaching began to be emphasized, and his notion of the kingdom of God as within us, or at least as our heavenly reward, replaced the hoped-for Jewish state.

For better or worse, Judea was actually quite cosmopolitan – heavily "Hellenized" if not so "Romanized." The same currents of thought in other parts of the empire were felt here as well. So the story of Jesus, as recorded in the gospels of Mark, Matthew, and Luke, began to be attached to ideas that were more properly neo-Platonist, gnostic, or even Mithraist!

The gospel of John, for example, is very different from the others, and refers to Jesus as the word, or Logos – a common Greek idea. Revelation, also attributed to John, but very different in style and content, has all the complex imagery of gnostic and Mithraist end-of-the-world stories, popular among the Jews at this time. It includes the idea of an eventual resurrection of the body – a concept that Jesus of the gospels did not promote, and which most Christians today do not believe in.

But it was Paul (c. 10 - c. 64 AD), a Romanized Jew, who would be most responsible for re-creating Jesus, whom he had never met and whose earthly life he scarcely mentions. He is also responsible for divorcing this newly formed religion from its Jewish roots. It was Paul who introduced the idea that Jesus was the son of God and that only by faith in him could we hope to be "saved" from our inherent sinfulness.

For nearly a century, the early Christians were split into two hostile camps: One group followed Peter, one of Jesus’s original disciples. They were predominantly Jews and continued many Jewish traditions, as Jesus himself had done. The other group followed Paul, who was far more open to non-Jewish converts and waived much of Jewish law for those not born into it. The battle between these groups was, of course, won by Paul. Some critics suggest that Christianity ought to be called "Paulism!"

Both Peter and Paul were executed in Rome about 64 AD. Paul was beheaded. Peter was crucified upside-down (at his request, so as to avoid comparison with Jesus).

The Patrists, or church fathers, were the first Christian philosophers. In the eastern part of the empire, there was Origen of Alexandria (185-254); in the west, there was Tertullian of Carthage (165-220). Tertullian is best remembered for saying that he believed (in the death and resurrection of Jesus Christ) precisely because it was absurd. Origen, on the other hand, had much more of the Greek in him, and pointed out that much of the Bible should be understood metaphorically, not literally. Keep in mind, though, that Origen cut off his own genitals because he took Matthew XIX, 12 literally!

The idea of the trinity, not found in the Bible itself, preoccupied the Patrists after Theophilus of Antioch introduced the concept in 180 AD. Tertullian felt that the trinity referred to God, his word (Logos), and his wisdom (Sophia). Origen was more precise, and said that it refers to the One (the father), intelligence (Logos, here meaning the son), and soul (Psyche, the holy spirit), following the Neo-Platonic scheme. Because the concept of the trinity is a difficult one, it was the root of many different interpretations which did not coincide with the official explanation. These alternative interpretations were labelled heresies, of course, and their authors excommunicated and their books burned.

Origen also did not believe in hell: Like the Neo-Platonists, he thought that all souls will eventually return to the One. In fact, it is believed that Origen and the great neo-Platonist Plotinus had the same teacher – a dock worker/philosopher by the name of Ammonius Saccus.

The Patrists' philosophies were for the most part the same: All truth comes from God, through the mystical experience they called grace (intuition, interior sense, light of faith). This clearly puts the church fathers in the same league as the neo-Platonists, and contrasts Christian philosophy with that of the ancient Greeks: To take truth on faith would be a very odd idea indeed to the likes of Socrates, Plato, Democritus, and Aristotle!

Christianity had certain strengths, with strong psychological (rather than philosophical) messages of protection, hope, and forgiveness. But its greatest strength was its egalitarianism: It was first and foremost a religion of the poor, and the empire had plenty of poor! Despite incredible persecution, it kept on growing.

Then, on the eve of battle on October 27, 312, a few miles north of Rome, Emperor Constantine had a vision of a flaming cross. He won the battle, adopted Christianity, and made it a legal religion with the Edict of Milan. In 391, all other religions were outlawed. But even then, Christianity still had competition.

Gnosticism

Gnosticism refers to a variety of religio-philosophical traditions going back to the times of the Egyptians and the Babylonians. All forms of Gnosticism involved the idea that the world is made up of matter and mind or spirit, with matter considered negative or even evil, and mind or spirit positive. Gnostics believed that we can progress towards an ultimate or pure form of spirit (God) by attaining secret knowledge – "the way" as announced by a savior sent by God.

The details of the various gnostic sects depended on the mythological metaphors used – Egyptian, Babylonian, Greek, Jewish, Christian... Gnosticism overall was heavily influenced by Persian religions (Zoroastrianism, Mithraism) and by Platonic philosophy.

There was a strong dependence on astrology (which they inherited from the Babylonians). Especially significant are the seven planets, which represent the seven spheres the soul must pass through to reach God. Magical incantations and formulas, often of Semitic origins, were also important.

When Christianity hit the stage, gnosticism adapted to it quickly, and began to promote itself as a higher, truer form of Christianity. The theology looked like this:

At first, there was just God (a kind of absolute). Then there were emanations from God called his sons or aions. The youngest of these aions was Sophia, wisdom and the first female "son." Sophia had a flaw, which was pride, which then infected the rest of the universe. We need to undo this flaw (original sin) but we cannot do it on our own. We need a savior aion, who could release Sophia from the bonds of error and restore her to her status as an emanation of God.

Worship among the gnostics included baptism, confirmation, and the eucharist. In fact, it is likely that several of the non-canonical gospels were written by Christian gnostics, and some say that John was a gnostic.

Gnosticism was strongly refuted by the early Christian Church in the 100’s and 200’s, as well as by the neo-Platonists, like Plotinus, who saw it as a corruption of Plato’s thought. In fact, of course, the reason for the animosity was more a matter of how similar gnosticism was to Christianity and neo-Platonism!

Manicheanism

Manicheanism was founded by Mani, born in 215 AD in Persia. At 12, he was visited by an angel, who told him to be pure for 12 more years, at which time he would be rewarded by becoming a prophet. He would eventually consider himself the seal (i.e. the last) of the prophets, a title Mohammed would later claim for himself.

Forced to leave Persia, he wandered the east, preaching a gnostic version of Mithraism, with elements of Judaism, Christianity, and Buddhism. He considered himself an apostle of Jesus. When he returned to Persia, he was imprisoned and crucified.

In Manicheanism, Ormuzd (a corruption of the name Ahura Mazda) is the good god, the god of light, creator of souls. There is also a god of evil and darkness – sometimes referred to as Jehovah! – who created the material world, even trapping Ormuzd’s souls in bodies. Another tradition has Ormuzd placing fragments of light – reason – in the evil one’s mannequins.

So there is light trapped inside of darkness! Mani believed that salvation comes through knowledge, self-denial, vegetarianism, fasting, and chastity. The elect are those who follow the rules most stringently. Their ultimate reward is a release of the light from its prison.

His followers were severely persecuted, by Persians and Romans alike. Still, the religion spread to Asia Minor, India, China, the Middle East, even Spain. It lasted in Europe until the 10th century AD and influenced later Christian heresies such as the Bogomils and the Cathars.

St. Augustine

St. Aurelius Augustine of Hippo (354-430) was a Manichean for 10 years before converting to Christianity in 386 AD. He would go on to become the best known Christian philosopher prior to the Middle Ages.

He is best known to us for the first truly psychological, introspective account of his search for truth, in his Confessions. A hint of the intimate detail of his account can be gotten from one of his best known quotes: He prayed to God to "give me chastity and continence, but not yet!"

His philosophy is a loose adaptation of Plato to the requirements of Christianity. In order to reconcile the idea that God is good with the evil that obviously exists in the world, he turned to the concept of free will and our personal responsibility for sin. And he emphasized intentions over actions when it comes to assigning moral responsibility.

There are, of course, problems with his arguments: If God is omniscient and omnipotent, he knows what we will do and in fact made us this way, so isn’t he still responsible for evil? Besides which, despite the admittedly great evil we human beings do to each other, aren’t there also natural disasters and diseases that could be considered evil, yet have nothing to do with our free will? These arguments would trouble philosophers even into the twentieth century. (See Dostoevsky’s The Brothers Karamazov for examples!)

Augustine became bishop of Hippo Regius (west of Carthage) in 395. He died in 430, during the siege of Hippo by the Vandals, a Germanic tribe that conquered North Africa (which was the "breadbasket" of Italy in those times!). You could say he lived through the fall of the Roman Empire.

The Fall of Rome

The Roman Empire was seriously declining. The economy began to stagnate. Too much money was being used to simply maintain the borders and unity of the empire. The cities began to deteriorate. City services declined, and hunger and disease severely hurt the poor. Many moved out to the country, where they found themselves working in the great latifundia – what we might call agribusinesses – as peasants and artisans. Free peasants turned over their ownership of land to these powerful landlords, in exchange for protection. In turn, these latifundia were ready-made mini-kingdoms for the barbarian chieftains who would be coming soon!

By the third century, the empire was being attacked from every direction. It was nobly defended by 33 legions (5000 men each). Internally, it was suffering from sheer size, and in 395, it officially split into two halves, the Western Roman Empire and the Eastern Roman Empire.

In the 400s, the Huns entered Europe from the Russian steppes, and got as far as Chalons, near Paris. They spread terror everywhere they went. Their empire collapsed in 476, but not before they set dozens of German tribes in motion towards the Roman Empire.

The Romans fought some off, paid some off, and let some in to protect the borders. Most of the mighty legions were eventually composed of German soldiers! One rather large tribe, the Visigoths (western Goths), began to move towards Italy from their settlements in the Balkans. In 410, they sacked Rome. The western half of the Roman Empire was for all intents and purposes dead and in the hands of the various invaders.

The Eastern Roman Empire was also in decline and was plagued by wars, external and internal. Emperor Justinian (527 - 565) tried but failed to reconquer Italy and sent the Eastern Empire into financial crisis. His efforts to discourage pagan philosophies and eliminate Christian heresies would eventually lead to much dissatisfaction with his rule. On the other hand, Justinian codified Roman law and adapted it to Christian theology, and he promoted great works such as the building of the Hagia Sophia, with its incredibly large dome and beautiful mosaics.

Barbarians at the gates were only part of the Empire’s problems, however. There was famine in the remnants of the Roman Empire on and off from 400 to 800. There was a plague in the 500’s. The Empire’s population dropped by 50%. The city of Rome’s population dropped 90%. By 700, only Constantinople – capital of the eastern Roman Empire – had more than 100,000 people.

In the 600's, Arabs conquered Egypt and Syria (up till then still a part of the Eastern Empire), and even attempted to take Constantinople itself. In the 700’s, Europe was attacked by Bulgars (a Mongol tribe), Khazars (a Turkish tribe which had adopted Judaism), Magyars (the Hungarians), and others. The Eastern Empire would see the Turks take Anatolia (appropriately renamed Turkey) in 1071, and finally take Constantinople in 1453.

In the meantime, western Europe was ruled by variously sized, gangster-like hierarchies of illiterate warriors. The great mass of people were reduced to slave-like conditions, tilling the soil or working service jobs in the greatly reduced cities. We don’t call ‘em the dark ages for nothing!

But, when the sun sets on one civilization, it is usually rising somewhere else....

Islam

So, as the Roman Empire faded into the sunset, the opportunity for other civilizations to make a mark arose. I doubt that anyone at the time would have guessed that the major contender would come from the relatively desolate western coast of Arabia. Arabia could only marginally sustain its population agriculturally. But, positioned nicely between the wealthy empires to its north and the untapped resources of Africa to its south – and later the ocean routes to India and beyond – it managed to provide its people with the option of lucrative trade.

Mohammed was born in 569 AD in Mecca, a merchant town near the Red Sea. His mother died when he was six, so he was raised first by his grandfather and later by his uncle. He was probably illiterate, but that was the reality for most Arabs of the time.

At 26, he married a wealthy widow 14 years his senior, who would be his only wife until she died 26 years later. He would have ten more wives – but no living son. He and his first wife had a daughter, Fatima, who would become a significant character in Islamic history. She married Mohammed’s cousin Ali, who had been raised in Mohammed’s household.

As he got older, he became increasingly religious, and sought to learn about Judaism and Christianity. He began to meditate alone in the desert and local caves.

In 610 AD, Mohammed fell asleep in a cave, where tradition has it that the angel Gabriel appeared to him and told him he would be the messenger of God (Allah*). He would have this experience repeatedly throughout the rest of his life. Each time, the angel would provide him with a lesson (sura) which he was to commit to memory. These were eventually recorded, and after his death collected into the Islamic holy book, the Quran (or Koran).

*Allah is the Arabic word meaning "the God." It comes from the same root as the Hebrew Elohim, and ultimately comes from the Canaanite word El, which referred to the father of all the gods.

He preached to the people of Mecca, but was met with considerable opposition from pagan leaders. When the threat of violence became clear, he left Mecca for the town of Medina, to which he had been invited, with some 200 of his followers. Here, he was much more successful, and eventually he took over secular authority of the town.

Relations with the pagan families of Mecca continued to deteriorate, and relations with the Jews of Medina, at first promising, deteriorated as well. An alliance between the Meccan families and the Medina Jews fought Mohammed’s followers over the course of several years.

In 630, Mohammed took Mecca. Within two more years, all of Arabia was under his control, and Islam was a force to be reckoned with. Mohammed died June 7, 632.

Mohammed’s basic message was simple enough: We must accept Allah as the one and only God, and accept that Mohammed was his prophet. Say words to this effect three times, and you are a Moslem.

Islam means surrender, meaning that we are saved only by faith. Allah, being all-knowing, knows in advance who will and who will not be saved. This idea (which we will see again among the Protestants in Europe) tends to encourage bravery in battle, but it also tends to lead a culture into pessimistic acceptance of the status quo. But that would not happen to Islam for many hundreds of years!

The Quran says that some day (only Allah knows when), the dead will rise and be reunited with their souls. They will be judged. Some will be cast into one of the seven levels of hell. Some will be admitted into paradise – described in very physical, even hedonistic, terms. Much of this scenario came from the Jews, who in turn got it from the Persians.

Islam is very rule-oriented, blending the religious with the secular. Church and State are one. In the Quran, there are rules for marriage, commerce, politics, war, hygiene – very similar to the Jewish laws, which Mohammed imitated. Among those rules, Moslems are not to eat pork or dog meat and may not have sex during a woman’s period, just like the Jews. Mohammed added a rule against alcohol. The society Mohammed envisioned is approximated by such authoritarian states as Saudi Arabia and Iran today.

Marriage was encouraged, and celibacy considered sinful. Polygamy was permitted, within limits. Women, as in Judaism and Christianity, were clearly secondary to men, but were not to be considered property. They were equal to men in most legal and financial dealings, and divorce, while easy, was strongly discouraged. Likewise, although slavery was not condemned, many rules were designed to humanize the institution.

Mohammed and the Moslems were generally accepting of Jews and Christians ("people of the book"), but intolerant of pagans. War and capital punishment were clearly condoned and practiced by the prophet: "And one who attacks you, attack him in like manner" (ii, 194).

The Arabic culture and language, and the religion of Islam, soon would dominate much of the world, from Spain and Morocco to Egypt and Palestine to Persia and beyond. For a while, it would present a progressive, tolerant face, and Moslem philosophy would rival that of the ancient Greeks.

A Brief History of Judaism

Palestine*1 was a fertile area, warm and watered by Mediterranean rains – a most desirable location. It lay between the sophisticated societies of Egypt and Mesopotamia, making it an ideal location for trade and, of course, war.

Tradition has it that the Hebrews came with father Abraham from Ur in Mesopotamia around 2000 BC, along with Abraham's El Shaddai ("god of the mountain"). It is more likely that they were natives to the area just to the east and conquered their close relatives the Canaanites*2 to establish their historical domain. Constant warfare with neighboring peoples apparently resulted in a large number of Hebrews being enslaved by the Egyptians, which sets the stage for the singular event of Jewish history, the Exodus.

Moses, probably an Egyptian, assisted the captive Hebrew population in its hour of need, possibly by introducing Egyptian cleanliness laws in a time of plague. Around 1300 BC, he led them, it is said, back into Palestine, where they would be of enormous influence on their settled brethren.

The Hebrews organized themselves into 12 tribes, with warrior-priest chieftains, referred to in the Bible as Judges. Intertribal wars led them to seek a monarch similar to the ones they had observed in Egypt and Mesopotamia. In about 1010 BC, they found that monarch in a ruthless warlord named Saul.

Only four years later, his seat was taken by David. After defeating the Philistines – the "Sea People" (possibly early Greeks) who had settled the coast – he established Jerusalem as his capital.

In 966 BC, David was succeeded by Solomon. Under his rule, the Hebrews became rich, investing in the trade between Phoenicia and Egypt, as well as in sea routes to Arabia and east Africa. Solomon had a temple built in Jerusalem to contain the Ark of the Covenant. The Ark was a gold-covered wooden box that presumably contained the tablets of the Law that Moses received from God Himself at Mt. Sinai. It was the most sacred symbol of Yahweh, and was believed to give the Hebrews power over their enemies.

The Hebrews were originally polytheistic, even animistic. They believed in spirits and, as pastoralists, were particularly devoted to cults of the bull, the sheep, and so on. Animal sacrifice was the tradition, mostly at local altars and wilderness sites. They performed divination using dice, something which they would continue to do for many centuries.

It should be noted that much of Genesis consists of the common myths of the region (and many other regions), such as the creation story, the fall of man, the flood, and so on.

Yahweh, possibly the Canaanite god Yehu or Yaw, became the "national" god of the Hebrews. With Solomon and the Temple, he was made into the greatest god of all. He retained, as the Bible demonstrates profusely, very human characteristics: Jealousy, regret, anger, love of the scent of burnt offerings, and openness to bribery were among his qualities.

*1 Palestine is the name that the Romans gave to the area. It comes from their name for the Philistines, the people who once occupied the coast, and who may have been Greeks from Crete or Cyprus. The earliest name for Palestine was Canaan, and today, of course, we call most of it Israel.

*2 The Hebrews, the Canaanites, and the Phoenicians were ethnically the same people. Their languages were merely dialects of each other, and they shared in the use of the first alphabet.

Early beliefs did not involve the concept of hell as we now know it. There was instead Sheol, a land of darkness beneath the ground. But, like Hades among the Greeks and Hel among the Germans, it was home to nearly all who died, not just those who sinned. Unpleasant, it was not yet a place of eternal torture. But only a very few people went to heaven to live with the gods.

The religion revolved around laws – many of them, and not unlike the laws of the Hindus. Sin could be lifted by means of prayer and sacrifice, and uncleanness (such as menstruation and childbirth) by ritual purification, all controlled by the priestly caste. Beyond the Commandments, the Laws of Moses regulated all of life for the Hebrews – diet, hygiene, medicine, even sexuality.

After Solomon, the condition of the Hebrew tribes began to deteriorate. Rich and poor classes developed, and the caste of priests (descendants of Levi) became increasingly powerful. Solomon's kingdom split into Israel in the north and Judah in the south. In 722 BC, Sargon II, the Assyrian emperor, overwhelmed the entire area.

The Assyrians were a particularly brutal group and the Hebrews, like others, suffered greatly. In the era of their overlordship of Palestine, a number of religious fanatics became influential among the Hebrews. They were disdainful of the rich and of the priests, and preached that the downfall of the Hebrews was due to their own sinfulness. These preachers were, of course, the prophets of the Bible: Amos, Hosea, Elijah, and Isaiah – all preaching in the 800's and 700's bc. They introduced an idea borrowed from the larger cultures around them: the Messiah, including virgin birth and all. In the Greek translations of the Bible, Mashiach would be translated as Christos, "the anointed one."

King Josiah ruled the area from 639 to 609. He and his priests saw the need for a codification of Hebrew traditions to provide solidarity among the people. In 622, they "discovered" (or created) a scroll presumably written by Moses, and called it the Book of the Covenant or the Law. It was probably much of Deuteronomy, and parts of Exodus (xx to xxiii?). The scroll was read aloud over two days and proved to be a hit! With that support, Josiah went on to destroy the idols to other gods in Palestine.

In 587 bc, in the midst of a war between Egypt and Babylonia, the Babylonian King Nebuchadnezzar invaded Palestine and destroyed most of Jerusalem, including the Temple. He took much of Jerusalem's population to Babylon as slaves. This was the Babylonian Captivity.

Just prior to the captivity, Jeremiah gave his warnings, and later, Ezekiel reprimanded the Jews for bringing this on themselves once again. Also around this time there was a prophet who also wrote under the name Isaiah, and who developed a new image of Yahweh. His God was the only God, and he was the embodiment of love and kindness. And his ultimate victory over the evil of this world would be brought about by a Messiah.

In 539, Cyrus, King of Persia, conquered Babylonia and made Palestine part of the Persian Empire. He freed the Babylonian Jews and restored their wealth, and they returned to Jerusalem. They supplanted the non-Jewish settlers, rebuilt the Temple, and reestablished priestly rule and the Law of Moses.

Ezra, in 458 bc, had this Law read aloud. This time, it took two weeks, because the collection included all five books of the Torah. The present form of the Torah (the first five books of the Bible) was developed by 300 bc.

Modern scholars view the Torah as having four authors (or groups of authors):

• "J" (for Jehovah) called God Yahweh and was likely from Judah. He was responsible for much of Genesis, Exodus, and Numbers.

• "E" (for Elohim) used Elohim (God) instead, and was likely from Israel. He wrote the rest of Genesis, Exodus, and Numbers. J and E were probably integrated soon after 722 bc.

• "D" represents the Levite priests who put together Deuteronomy. It probably dates from not long before 622, when King Josiah "discovered" it.

• "P" (for priestly code) covers geneologies and rituals in the preceding books, plus Leviticus. It was probably written not long before King Josiah died, in 609. Some believe "P" may have been Jeremiah.


• "R" (for redactor) combined J, E, and P into the first four books of the Torah, and then added D. Some scholars believe he may have been Ezra.

In 332 bc, Alexander the Great took Jerusalem. It surrendered without a fight. Alexander was supposedly an admirer of the Jews and their God. This introduced a long period of Greek rule – and accompanying Hellenization – which would affect Judaism greatly. Besides the appearance of a Greek translation called the Septuagint in about 200 bc, the prophets were added to the collection of scriptures during this period, as were Proverbs, Psalms, the Song of Solomon, Job, and Ecclesiastes.

The development of a Hellenized Jewish community in Alexandria (Egypt) led to a split between those liberal Jews and the more conservative Jews of Palestine. Also, the Samaritans, who inhabited what was originally Israel, broke ranks with the Jews of Judea (Judah), keeping only the original Torah as their scripture.

In 167 bc, the Maccabees revolted and took Judea out of the hands of Alexander's successors (the Seleucids); Simon Maccabee would later establish his own dynasty. But in 63 bc, Pompey conquered the area and made Judea a part of the Roman province of Syria.

The next hundred or so years were crucial ones for the Jews. In 37 bc, nationalistic Jews, in league with Parthian invaders, revolted. The Romans had appointed Herod ("the Great") as King of the Jews a few years earlier, and he repelled the invaders and eliminated their Jewish supporters. He ruled the area until 4 bc, which may have been the year in which Jesus was born.

Palestine had a population of about two and one half million at this time, with some 100,000 people in Jerusalem. Three sects became influential:

• The Sadducees were a conservative, highly nationalistic group. They did not believe in immortality.

• The Pharisees believed in strict application of the Law, and added an oral tradition. They did believe in immortality, and were more conciliatory towards the Romans.

• The Essenes were an extremist monastic tradition, possibly influenced by Buddhist monastics. They believed that a Messiah would establish the Kingdom of Heaven, to which only the "pure" would be admitted.

Over time, the government of Palestine – mostly Roman-appointed Jews – would degenerate into incompetence and corruption. Groups of Zealots (fanatics) arose who swore to kill all disloyal Jews. They killed quite a few, and many Gentiles as well. The Gentiles of the area responded in kind. Emperor Vespasian sent his son Titus with Roman legions to Palestine and Titus offered the Jews a lenient settlement. The Zealots turned him down, so the legionnaires slaughtered them.

In 70 ad, Titus ordered the Temple destroyed and the Jews dispersed – the Diaspora. Millions of Jews spread throughout the Empire, which already contained some seven million Jews – roughly 7% of the Empire's population. With the Diaspora, the Sadducees disappeared and the Pharisees, by means of their teachers (rabbis), kept the flame alive by preaching the Law in thousands of synagogues.

Around 132 ad, there was another uprising by Jews in the Near East. The Emperor Hadrian outlawed the teaching of the Law, and destroyed most of Judea. Many Jews went to Babylon, where they were fairly well treated and did quite well. Around 500 ad, they completed the Babylonian Talmud, a collection of commentaries on and explanations of the Law.

Within the Roman Empire, the Jews were granted citizenship (like everyone else) in 212 ad. They were, however, greatly disliked by other Roman citizens: They insisted on dressing differently, celebrating different holidays, eating different foods. Even more annoying was their exclusivity, their firm conviction that they were better than everyone else, and their disdain for anyone else's gods. The increasing popularity of one Jewish messianic sect – Christianity – only made things worse.

Beginning with Constantine, the first Christian emperor, and continuing under his successors, the Jews' status was lowered to that of secondary citizens of the Empire. They remained in that precarious position for the next 1400 years or so.


Early Christian Heresies

A heresy is a belief that deviates from some standard, official belief. When religious authorities decide that a belief is heretical, they usually take active steps to eradicate it, often including the removal of the offending believers (by excommunication or worse). Of course, one man's orthodoxy is another man's heresy!

Most Christian heresies centered around the twin issues of the nature of the trinity and, more specifically, the nature of Jesus Christ. The official stand on these issues (according to all the Catholic, Orthodox, and Protestant churches) is as follows: God is a trinity, three persons but one essence; Jesus Christ was one person, simultaneously human and divine. That these two statements are not particularly rational was considered irrelevant. The trinity was seen as mysterious and a matter of faith, not reason.

What follows are eight heresies, ranging from sects that see Jesus Christ as purely divine, to others which see him as purely human.

Sabellianism: Sabellianism is named for its founder Sabellius (fl. early 3rd century). It is sometimes referred to as modalistic monarchianism. The father, son, and holy ghost are three modes, roles, or faces of a single person, God. This, of course, implies that Jesus Christ was purely divine, without humanness, and therefore could not truly have suffered or died.

Docetism: The name comes from the Greek word dokein, meaning "to seem." Along the same lines as Sabellianism, Docetism says that Christ was not a real human being and did not have a real human body. He only seemed to be human to us. In a nutshell...

Christ only (no Jesus)

Monophysitism: Monophysite comes from the Greek words for "one nature." This heresy says that Jesus Christ was a joining of the eternal Logos with the human person Jesus, which occurred at incarnation. He is therefore two natures joined into one. Monophysitism is very much alive in several present-day Egyptian and Middle Eastern sects of Christianity.

Jesus > Jesus Christ

Adoptionism: Adoptionism says that Jesus was a human being who was "adopted" by God at his conception, at which point he developed a divine nature. Later versions sometimes suggest that he was adopted later, such as when he was baptized by John the Baptist.

Jesus > Christ

Nestorianism: Supposedly, Nestorius, Patriarch of Constantinople (fl. 428), believed that Jesus Christ had two natures – man and God – which remained separate throughout his period on earth. This is not really what Nestorius said (although he did object to calling Mary the mother of God), but the name stuck. You can still find a few Nestorian churches in Iran.

Jesus......Christ......

Apollinarianism: Named for Apollinaris of Laodicea (fl. 350), this heresy says that Jesus Christ was not a real man, but not totally divine either. Apollinarians suggested that he had a human body and a human soul, but his mind was taken over by the eternal Logos.

Je(Christ)sus

Arianism: Arianism is named after Arius (c. 250 - c. 336), a priest in Alexandria. This is considered the most serious heresy. Jesus Christ was thought of as a special creation by God for man's salvation. Arianism was the form of Christianity that the Goths adhered to, and it was popular in all the areas they conquered, including Italy, Spain, and Africa.

Socinianism: A version of Arianism called Socinianism (named for the Italian theologians Laelius and Faustus Socinus) simply says that Jesus was an extraordinary man. This heresy still lives on in two very different forms, the Unitarians and the Jehovah's Witnesses.

Jesus only (no Christ)

Other Heresies

Not all heresies focussed on the issues of the trinity and Christ's nature. Here are the leading examples.

Donatism: Named for its leader, the theologian Donatus the Great (d. 355), Donatism included a group of extremist sects, mostly in North Africa, that emphasized asceticism. They valued martyrdom, found lapses of faith (even under torture or threat of death) inexcusable, and believed that the sacraments required a pure priest to be effective.

Pelagianism: Another group of sects, centered in Gaul, Britain, and Ireland, is associated with the Irish monk Pelagius (fl. 410). He believed that original sin was not transmitted from Adam and Eve to their children (and thereby to us). Baptism was not considered necessary, and people could be "saved" by their own efforts, that is, they did not necessarily require the grace of God. Many modern liberal Christians agree with Pelagius.

Gnosticism: Discussed in my article on Roman philosophy and religion, the Christian versions were, obviously, considered serious heresies. Gnosticism has never entirely disappeared, and can be seen in the traditions of Alchemy and Astrology, and even in modern times in the works of Carl Jung.

Manicheanism: Also discussed in that article, Manicheanism is actually a separate religion which blends Christianity with Gnosticism, Mithraism, neo-Platonism, and even Buddhism. Again, it was considered a very serious heresy. It survived well into the Middle Ages, where it strongly influenced the Bogomils in the Balkans and the Cathars in southern France.

The Bulgarian Heresy: This heresy is worth a few extra paragraphs!

In the 10th century, there arose in Bulgaria a gnostic heresy credited to a priest by the name of Bogomil. The beliefs of the Bogomils, as they were called, were adoptionist, meaning that they considered Jesus to have been "adopted" by God at the time of his baptism, but did not consider him to be a part of a trinity. Neither did they consider Mary in any way the mother of God.


Simplicity and strict adherence characterized their practices, with priests elected from their own groups and congregations meeting at homes rather than churches. Infant baptism was not practiced, marriage was not considered a sacrament, and saints were considered false idols.

The heresy had a strong Manichean flavor to it. They believed that God had two sons, Michael and Satan. Satan created the material world and attempted to create Adam, but was unable to create a soul. God added the soul to Adam, but mankind was bound in service to Satan. Michael came to earth in the form of the holy spirit, which entered into Jesus. As Christ, he broke the original agreement which bound mankind to Satan. But it was Satan who orchestrated the crucifixion, and he is still working to recapture mankind by means of the mainstream churches.

The basic ideas of this Bulgarian heresy spread rapidly west, through northern Italy to southern France. There, the believers called themselves Cathars, from the Greek word meaning pure. Others called them Albigensians, after the town of Albi, or Bougres, for Bulgarians. This last name is the source of the word bugger, due to accusations of sodomy.

Even stricter than the Bogomils, the Cathars attempted to live simple, exemplary lives, with the most serious believers refraining from sex and other physical pleasures. Many adopted strict veganism. They had only one sacrament, the consolamentum, something like last rites, in which sin was removed.

The Cathars believed that the God of the Old Testament was actually Satan, and that he was responsible for the creation of the material world. Jesus was therefore purely spirit (Docetism), since he would have been tainted if he had had a real body. By purity of living, anyone could cast off the physical body and awaken in heaven. The impure were doomed to rebirth into this physical world. One interesting side effect of this belief was that women were treated as equal to men, since we have all been men or women at some time in our past lives.

The Bogomils and the Cathars were harshly persecuted by the Orthodox church in the east and the Catholic church in the west. By the 14th century, the Bulgarians were absorbed by the Islamic Ottoman Empire, and the Cathars were virtually eliminated by Crusades and the Inquisition. They had laid the foundations, however, for the Reformation.

For considerably more detail on these and other heresies (from an admittedly Catholic perspective) see the online Catholic Encyclopedia at http://www.newadvent.org


Sunnis and Shiites

The major split in Islam is that between the majority Sunnis and the minority Shiites. The split goes back to events in the 7th century:

After Mohammed’s death, leadership of the Islamic community passed to Abu Bakr, one of Mohammed’s closest companions. Some in the community felt that this succession was not legitimate, and that the title of caliph really belonged to Ali. Ali’s claim was supported by the fact that he was Mohammed’s cousin, his adopted son, his first convert (at the age of nine), and husband of his daughter Fatima. Both sides believe that Mohammed specifically designated their man: Supporters of Abu Bakr became the Sunnis, those of Ali the Shiites.

The Caliphate passed from Abu Bakr to Umar, and from Umar to Uthman. Uthman at last passed the torch to Ali. When Ali was murdered in 661, the Caliphate passed to Muawiya, who would found the famous Umayyad Caliphate. Ali was buried in Najaf in what is now Iraq, and his tomb remains a major Shiite holy site.

Sunni refers to the sunnas, or oral traditions and interpretations of the Koran – a body of work similar to the Jewish Talmud. Sunnis believe that the position of Caliph should be a position to which one is elected by the religious leaders of the Islamic community, and not dependent on direct lineage from Mohammed.

Shiite comes from the word shia, which means "the party (of Ali)". They are mostly found in Iran and Iraq, and in Lebanon. They consider certain direct descendants of Ali – the Imams – infallible and the true inheritors of Mohammed. Ali was the first Imam, his son Hassan the second, his second son Hussein the third. Ali’s sons were killed in the conflicts with Caliph Muawiya and his successors. The succession ended with the 12th Imam, who went into hiding in 940. Most Shiites believe that the 12th Imam will reemerge someday as the Mahdi or Messiah, and reassert his leadership of the Islamic world. In the meantime, ayatollahs are elected to serve as caretakers of the faith.

Sunnis tend to be a bit more liberal (though not by modern western standards) and keep their religion simpler than the Shiites. The Shiites tend to be more intense about their religion and have a tradition of valuing martyrdom that came out of their early experiences of conflict with the Sunnis.

There are a number of splinter groups of both sects. Perhaps the most famous today is the Wahhabi sect (a Sunni splinter), of which Osama bin Laden is possibly a member. It is characterized by radical fundamentalism: The Koran is not to be interpreted but rather taken literally. There are to be no prayers or other appeals to prophets, saints, or any entity other than God. There are to be no images of or monuments to any supposed Islamic leaders, not even elaborate tombs for famous Moslems. And the Koran is to be the sole source of secular as well as religious law.

Another famous group is the Sufi movement, which can be Sunni or Shiite. Sufis are mystics who believe that God’s love shines through everything, even ugliness and evil, and that by attaining a certain state of mind, one can directly experience this. In this sense, they resemble Zen Buddhism. Sufism is also noted for its use of stories that have layered meanings, much like the parables of Jesus. One subgroup of the Sufis is the "whirling dervishes," whose mystical practice includes religious dance.


Early Chinese and Indian History

Early Chinese History

Around 1500 bc, we see the rise of the semi-mythological Shang dynasty. This was a feudal kingdom that dominated the Yellow River basin, and established a number of small cities, most of which were in what is now Henan province. It is during the Shang dynasty that Chinese symbolic writing was developed by the dynasty priests.

In about 1100 bc, we see a new dynasty – called the western Chou – centered in the Wei River valley, near present-day Xi'an. It consisted of many smaller feudal kingdoms with allegiance to a "head king" or emperor. Much of their cohesiveness was due to the constant need to defend themselves against the surrounding barbarians.

The eastern Chou dynasty began in 770 bc. This period was marked not only by constant warfare with the barbarians, but considerable warfare amongst the various parts of China as well. Culturally, peasants became more valued in this period (due to their importance in warfare), and the merchant class became more important. It is this period that saw the introduction of money.

During this dynasty, some of the most significant philosophers made their appearance. Confucius (551 to 479 bc) introduced a philosophy that combined ethics with religious traditions, a philosophy that would dominate Chinese political structure until the 20th century.

At about the same time, we also see Laotze introducing a more sophisticated version of traditional nature worship called Taoism, in one of the greatest books ever written, the Tao te Ching. While Confucianism would be the formal philosophy of the high court, Taoism would eventually have a profound influence on the Buddhism introduced later.

From 403 to 221 bc, China was split into a number of warring states. In 221 bc, the Ch'in dynasty established its rule. Ch'in was a border state to the west of the previous centers of Chinese civilization, and we get the name China from their dynasty. The Ch'in established a highly centralized state, along the same lines as the Roman Empire, and standardized measurements, weights, and money. It was also during this time that construction of the Great Wall began, in an effort to keep out the Huns – the same people that would threaten Rome not too much later.

From 206 bc to 9 ad, we see the western Han dynasty. Han was a kingdom just south of the Chou kingdom, again in what is now Henan. The Han dynasty defeated the Huns in approximately 100 bc (sending them on their way towards Europe) and expanded their territory to the west. They also established the famous Silk Roads – routes to the Middle East used for trade with Persia, Rome, and India.

From 25 to 220 ad, the eastern Han dynasty took over, and oversaw a great "flowering" of their civilization. Trade with Rome and others in silk and porcelain was booming. Paper was invented about 100 ad, and Buddhism began to make inroads from northwestern India and the Greek kingdom of Bactria (part of what is now Afghanistan).

From 220, we have the period of three kingdoms, followed by a period where China was divided into separate northern and southern empires. The north was invaded by a combination of Huns and Turkish tribes, while the south went through a series of dynastic changes. In 379 ad Mahayana Buddhism became the official religion (living in harmony with Confucianism and blending with Taoism).


China was reunified in 581 under the Sui dynasty, whose policies were taken over in 618 by the T'ang dynasty. Notably, the written examination system for the civil service was established in 606 ad; it would continue until it was finally abolished in 1905. The T'ang dynasty lasted until 907.

The 900's were a period of rapid dynastic turnover, and we see a reversal of the fortunes of the Buddhists, who were actively persecuted. In 960, the northern Sung dynasty provided stability, although only by paying tribute to powerful northern neighbors (first the Khitans, later the Jurchens). The southern Sung took over from 1127 until 1279, still paying tribute, but overseeing a second renaissance of culture and economics. During this period, the Confucian canon was codified by Chu Hsi (1131 - 1200); literature, painting, and porcelain flourished; and both printing and gunpowder were invented.

In 1206, Genghis Khan became the supreme ruler of the Mongols and their Turkish and Tartar allies, and proceeded to lead them into China, taking Beijing in 1215. He also sent his troops west, and under his successors they would reach as far as Poland and Hungary. When he died in 1227, his empire was split into several smaller units ruled by his various sons. The Mongols would continue to rule the steppes well into the 1400's, with Ivan III finally liberating Moscow in 1480!

Marco Polo, a Venetian adventurer, visited China during this period, and brought back stories of wealth that would make Chinese goods nearly as sought after as they had been during the Roman Empire. Sadly, in 1325, China suffered one of its greatest famines, which killed 8 million out of a population of 45 million.

In 1368, the Mongols were driven out of China, and the Ming dynasty began. It had a strong centralized government founded on solid Confucian principles. The capital was moved to Beijing in 1421, where it would remain until the present day. The Great Wall was extended to 2450 km (about 1500 miles).

The Ming dynasty oversaw another renaissance, with novels, maps, great architecture, porcelain, and further development of the medical technique we call acupuncture. On the other hand, they didn't want too much to do with the world beyond the empire: European trade was limited to the Portuguese colony of Macao.

From 1644 all the way to 1911, China was again ruled by "barbarians," this time the Manchu from the northeast of China. The Manchus, being of limited numbers, were anxious to use the existing structures of Chinese bureaucracy and blended themselves with the native population as much as possible. In fact, they saw the greatest population growth in history and expanded the empire to its present extent. At first, they encouraged trade with the Europeans, but later would close the empire to foreign trade. As we know, the Europeans are rarely deterred when such a vast market looms on the horizon, and the colonial empires – especially the British – would chip away at the glory that had been China.

Early Indian History

Somewhere around 1500 bc, a group of people who called themselves Aryans invaded the Indian subcontinent, and came to dominate most of the original Dravidian people. The Aryans spoke a language distantly related to the western European languages, and came from the Russian steppes. They brought with them what is known as the Vedic religion, which would eventually result in a series of books called the Vedas.

As the Aryans settled in, they developed the caste system. The top two castes were composed entirely of Aryans: the Kshatriyas or warriors, and the Brahmins or priests. Below them were a mixed group of peasants called the Vaishyas, and the subject Dravidians, called the Shudras. Below all of these were the various people of the jungles, as well as the slaves of the original Dravidians, who were called the Pariahs or outcastes.


This hierarchical society would last officially until British rule, and it continues informally even today.

Around 500 bc, several people, in the process of searching for enlightenment, would shake the caste system: First, there was Siddhartha Gautama, better known as the Buddha. He preached a stoic lifestyle involving moral living and meditation that would develop into the rich philosophy of Buddhism. The other was Vardhamana, called Mahavira and also Jina ("the conqueror"), who believed that suffering was due to the mixing of spirit with base matter, which must be separated from each other by means of fasting, asceticism, and chastity. His followers' beliefs would become the religion called Jainism.

In the late 300's bc, the troops of Alexander the Great knocked at India's door, and would remain a significant presence in Bactria, just northwest of India. These Greeks would be the only westerners to adopt Buddhism, and they would take part in introducing Buddhism into China. Chandragupta, king of Magadha in eastern India (where Buddha preached), established the Maurya Empire, controlling most of northern India.

His grandson Ashoka (272 - 231 bc) is one of the most famous figures in Indian history. After a particularly bloody battle, he swore off killing and embraced Buddhism. Among other things, he established laws based on Buddhism and recorded them on stone pillars and monuments all over northern India. He also sent missionaries as far west as Egypt and Greece, whose effects on western thought are still unknown. Unfortunately, his empire was divided among his descendants after his death, and India again became a land of many small feudal states.

The next major event comes around 50 ad, when the Yüeh-chih (an Indo-European people from western China called the Tocharians or Kushans) invaded India from their base in Bactria. From 320 ad until 535, the Gupta Empire would oversee a cultural renaissance, including a blossoming of poetry, drama, and other literature.

Beginning around 430 ad, the Huns would start nibbling away at the Gupta Empire until its collapse. This was followed by another period of short-lived empires and smaller states.

From 700 ad on, we see a major change in the subcontinent. First, Buddhism, the dominant religion of India, would be gradually driven out by the Brahmin caste and its supporters, and replaced with a revitalized, if very conservative, Hinduism. Second, the Moslems would enter India from the west and slowly expand to rule over the northern half of the subcontinent, all the way to Bengal (what is now Bangladesh). In 1206, the Sultanate of Delhi was established, an empire based on Moslem theocracy and military might. Nevertheless, India prospered during this period, and greatly expanded trade with the Near East. The Sultanate would last until 1526.

Despite Moslem rule, the caste system continued, now with Moslem rulers at the top, and the native Indians were kept poor through harsh taxation. The Moslems accepted Hindus as "people of the book" (what they called Jews and Christians in the west, because they shared the same Biblical traditions as the Moslems), as long as they kept to their place in society. Buddhism, however, they found threatening, and Buddhist monasteries, temples, and books were destroyed. This has continued even to the present, as with the destruction of the ancient giant Buddhist statues in Afghanistan in 2001.

It was in 1498 that the Portuguese discovered the sea route to India, circumventing the hostile Moslem empires in between, and established a trading settlement at Calicut on the southwestern coast. In the early 1500's they went on to colonize Goa, Ceylon, Bombay, and other coastal spots.

The Moguls – led by Babar, descendant of the Khans – invaded India from their stronghold in Kabul (Afghanistan), and defeated the Sultan of Delhi. By 1576, they would take over all of northern India. The Moguls, although Moslem, were very tolerant of the Hindus and even the Jesuits, and declared the Edict of Toleration in 1583. A number of syncretic sects developed during this time, the most famous of which is Sikhism. Sikhism was founded by Nanak (1469 - 1538), who blended Islam and Hinduism and other philosophies into a strong egalitarian religious culture, in which each man takes as his last name Singh ("lion") and each woman Kaur ("princess"). To this day, the Sikhs provide the backbone of the Indian military.

The Arab Moslems and the Moguls, although outsiders, brought another period of renaissance to India. They established libraries and universities, contributed greatly to literature (including updates of the great Indian religious texts), and founded a new style of Indian architecture, exemplified by the great Taj Mahal.

In 1612, a new player entered the scene: The British took over the Portuguese colonies. They would eventually rule all of India and much more.

The History of Psychology Part Two: The Rebirth

Part Three: The 1800's

Part Four: The 1900's

[ http://www.ship.edu/%7Ecgboeree/historyofpsych.html ]



C. George Boeree: History of Psychology Part Two: The Rebirth

E-Text Source:[ http://www.ship.edu/%7Ecgboeree/historyofpsych.html ]



Index

Index 2

The Middle Ages 3

[ The Dark Ages | The Universities | The problem of universals | Nominalism | Abelard | The Moslems | St. Thomas | The Beginning of the End of the Middle Ages ]

A Letter from Heloise to Abelard 11

Timeline: From 1000 to 1400 13

Map: Europe 1278 13

The Beginnings of Modern Philosophy [ Humanism | The Reformation | Science | Francis Bacon | Galileo Galilei | René Descartes | Education ] 14

A Letter from Galileo Galilei 23

René Descartes Selection: Meditations 25

Quotations from John Comenius 28

Timeline: From 1400 to 1800 30

Map: Europe 1700 30

Epistemology 31

The Enlightenment 36

[ Thomas Hobbes | Benedictus Spinoza | John Locke | George Berkeley | Gottfried Wilhelm Leibniz | Pierre Bayle ]

Auguste Comte's Calendar 44

Benedict Spinoza's Emotions 47

Metaphysics 49

David Hume and Immanuel Kant 55

The Rights of Man 63

The Rights of Woman 65

Ethics 66

[ Theological Theories | Moral Relativism | Moral Realism ]

Overlapping Moralities 72


The Middle Ages


The Dark Ages

Sometime after the fall of Rome, we come to the Dark Ages. Most of Europe was decentralized, rural, parochial. Life was reduced to the "laws of nature": The powerful ruled, while the powerless looked only to survive. There was no sense of history or progress. Superstition and fatalism prevailed. Belief in the imminent end of the world was common in every century. You can get a fair approximation of European life in the dark and early middle ages by looking at some of the developing nations of the world, although you would have to take away all signs of the past thousand years of technological development!

Alcuin (735-804) – Charlemagne’s head scholar – is one of the few names that come down to us from this period. Other than his Christianity, a glimmer of his view of reality can be gleaned from this quote: "What is man? The slave of death, a passing wayfarer. How is man placed? Like a lantern in the wind."

Nevertheless, Charlemagne (768-814) provided a political unity, and the Pope a religious unity, and a new era slowly began. Eventually, the Church took over Europe, and the Pope replaced the emperor as the most important figure. By 1200, the Church would own a third of the land area of Europe! The power of the church and its common creed meant enormous pressures to conform, backed up by fear of supernatural sanctions. But on the positive side, the papacy helped establish stability and ultimately prosperity.

We now turn to what are called the Middle Ages, roughly the period from 1000 to 1400 ad.

The Universities

Universities developed out of monastery and cathedral schools – really what we would call elementary schools, but attended by adolescents and taught by monks and priests. The first was in Bologna, established in 1088 (see map below).

In these schools and universities, students began (with the always-present threat of flogging!) with the trivium – grammar (the art of reading and writing, focussing on the psalms, other parts of the Bible, and the Latin classics), rhetoric (what we would call speech), and logic. Trivium, of course, is the origin of the word trivia – the stuff beginners deal with!

Beyond that, they would study the quadrivium: arithmetic, geometry, music, and astronomy. All together, these subjects make up the seven liberal arts. Liberal referred to the free man, the man of some property, and liberal arts were in contrast to the practical arts of the working poor.


The problem of universals

The major philosophical issue of the time was the nature of universals. This concerns the meaning of a word. What in the real world does a word refer to? This is easy with proper nouns (names): George, for example, refers to this person here, me myself. But what about other, more general words? What does cat refer to? This was by no means a new issue, but the scholars of the Middle Ages began without the benefit of nice Greek sources!

St. Anselm of Canterbury (1033-1109) was a neo-Platonist, and he is best known for his efforts at coming up with a logical proof of God’s existence – the famous ontological proof: Because we can think of a perfect being, that being must exist, since perfection implies existence.

In regard to the question of universals, he was a proponent of realism. Realism was Plato’s perspective: There is a real universal or ideal (somewhere) to which a word refers. This usually fits in well with Christianity. If humanity is real beyond being just the collection of individual human beings, we can talk about a human nature, including, for example, the idea of original sin. If there were no such thing as humanity, if each person were a law unto him or herself, then we could hardly lay the sins of Adam and Eve on anyone but Adam and Eve!

Likewise, if God is a real universal, then there is no logical incongruity about saying he is Father, Son, and Holy Spirit all at once.

Mind you, the argument isn’t without problems. For example, the ultimate universal – All – is then logically greater than God, because All must include God and creation! But Christianity says that God and creation are separate and fundamentally different.

Anselm’s motto was Augustine’s "I believe in order that I may understand" (credo ut intelligam): Faith is an absolute requirement, and is the standard for all thinking. Truth is revealed by God, so submit yourself to the church.


Nominalism

Roscellinus of Armorica in Brittany (1050-1121) was the founder of nominalism, another approach to universals. A universal, he said, is just a "flatus vocis" (a vocal sound – i.e. a word). Only individuals actually exist. Words, and the ideas they represent, refer to nothing, really. This is quite compatible with materialism and empiricism, but not, really, with Christianity.

It, too, is not without problems: If words are nothing but air, then reason (and philosophy), which is the manipulation of these words, is nothing but blowing air (as many students in fact believe). That includes, of course, the reasoning it took to come to the nominalist conclusion!

Regarding the church, nominalism means that the church is nothing but the people that compose it, and religion is just what individuals think. And, if God is Father, Son, and Holy Ghost, then we can't be monotheists.

Abelard

Peter Abelard (1079-1142) was a student of both Anselm and Roscellinus. A brilliant thinker and speaker and a canon (priest) of the Cathedral of Notre Dame, he became a popular teacher at the University of Paris.

In 1117, he met a sixteen-year-old girl named Heloise. An orphan, she was being raised by her uncle Fulbert. She was particularly intelligent, as well as beautiful, and so her uncle asked Abelard if he would tutor her in exchange for room and board. Abelard himself commented that this was like entrusting a lamb to a wolf!

His teaching suffered a bit. He was more likely to compose love poems than lectures! But Heloise became pregnant and had a son they named Astrolabe (!). Her uncle was furious, but Abelard promised to marry Heloise if Fulbert would keep the marriage a secret. The only way he could become a priest while married would be for her to become a nun, which was unacceptable to both of them. She was willing to be his mistress, but he convinced her to marry him in secret.

Well, Fulbert remained upset by all this, and eventually sent some men to teach Abelard a lesson: They cut off his genitals! The people of Paris (being French, even in the Middle Ages) had complete sympathy with their hero Abelard, but Abelard himself was mortified. Heloise became a nun, and Abelard a monk, in order to pay for their sins. They exchanged letters for many years, and a selection from her first letter to him appears below.

Abelard was, however, persuaded to continue teaching and writing. Arguing, among other things, that the trinity referred not to the Father, Son, and Holy Spirit, but to God’s power, wisdom, and love, he began to irritate some of the people with power in the church. The Pope issued an order condemning Abelard to perpetual silence and confinement to a monastery (the usual punishment for heresy at this time). On his way to Rome to defend himself, he died at 63. Heloise convinced his abbot to bury him at her convent, and twelve years later, she died and was buried next to him.

Abelard invented "sic et non" – yes and no, pros and cons – in a book by the same name. Sic et non is a Socratic method that lays the arguments of two opposing points of view side by side for comparison. Abelard is very much the rationalist, and he made his motto "I understand in order to believe" (intelligo ut credam).


He believed that the truth of faith and reason must still agree, as did all his teachers, but reason has precedence. It is faith that has to adapt, i.e. the church must re-evaluate the meaning of its teachings when they fail to measure up to reason.

For Abelard, ethics is a matter of conduct inspired by a good heart, good will, good intentions. If you have a good conscience, you can do no wrong (sin). You can only be mistaken. He had said, for example, that when Romans killed Christians (including Christ himself), they were only acting according to their conscience, and therefore were not guilty of sin!

He is best known, however, for conceptualism, his attempt to synthesize nominalism and realism. Although the thing and its name have a reality of their own, universals exist in the mind as ideas, he said, which refer to groups of things and are represented by words. The mind creates abstractions out of real things by detecting similarities, so the meaning of the word cat is the mental abstraction we created by looking at individual cats and noting that they all have four legs, fur, pointy ears, two eyes with funny pupils, meow, etc. This is still an important perspective in modern cognitive psychology.

This answer to the question of universals is, as you might have guessed by now, still not without problems. Notice that we are assuming that we can use words like legs, ears, eyes.... But what do they refer to? They can only refer to the mental abstractions we make of individual legs, ears, eyes.... So how do you tell you are looking at a leg? Well, it's a mental abstraction we make out of flesh with a hip joint, a knee, and a foot at the end. So what is a knee? Well, it's.... At what point do we reach a unique thing?

[Personally, I believe that these abstractions or characteristics are based on errors, that is, when individual things are easily mistaken for each other!]

The Moslems

The Near-Eastern and North African remnants of the Roman Empire fell as far as any other part. Mohammed (570-632) brought Islam – "Surrender" – into the world, and it spread like wildfire, both by sword and by persuasion. So, with Islam and reunification under a series of Arab caliphates, the dark of the dark ages lifted a bit earlier there. In Baghdad, Damascus, Cairo, even Seville in newly-conquered Spain, scholars turned to the ancient Greeks and began again to reason and observe. The security, stability, wealth, and relative tolerance of their society inspired them to produce literature, including philosophy, that by the millennium nearly equalled that of ancient Greece.

Avicenna (Ibn Sina, 980-1037), the great Persian physician and philosopher, was one of these great thinkers. Thoroughly familiar with Aristotle, he was nonetheless a neo-Platonist and a gnostic, as it seems all Moslem philosophers must be in order to remain Moslem. Generally, he felt that reason and faith could not conflict, as the Christian thinkers had concluded as well. But he hints at heresy by suggesting that such items of faith as the physical paradise after death that Mohammed promised his followers were necessary in order to win over the masses, but are just stories to the mature believer.

Averroes of Cordova (Ibn Rushd, 1126-1198) is the greatest of the Islamic philosophers. He began as a lawyer, and was chief justice of Seville and later of Cordova. He was also a physician, and served as the court physician in Marrakesh. He was the first to recognize that if a person survived smallpox, they would be immune thereafter. He described for the first time the purpose of the retina. He wrote an encyclopedia of medicine used in both Moslem and Christian universities.

Averroes begins, of course, with God. God is what sustains reality. God is the order of the universe. But, he says, creation is just a myth. The universe has always existed, and will always exist.

The human mind has two aspects. There is a passive intellect, which is composed of the potential for thought and carries the details that make one personality different from another, both physically and psychologically. It is a part of the body and dies with it. And there is an active intellect, which energizes the passive intellect. It is actually the same in each person, is the only part of us that survives death, and is, in fact, God.

But Islam’s openness to philosophy was not to last. The Emir of Baghdad ordered Averroes’ books burned, and his example was followed by other leaders all the way back to Averroes’ homeland of Spain. The world of Islam had achieved what the Christian world failed to achieve: complete domination by religion.

By means of Moslem Spain and Sicily, Avicenna and Averroes and others would come to inspire, in turn, the Christian scholars of the new universities of Europe. These scholars would consume the writings of Greek, Jewish, and Arabic scholars.

St. Thomas

In the late Middle Ages (the 1200s), Aristotle excited a lot of thought in the monks and scholars of the universities. These neo-Aristotelians were called schoolmen, or scholastics. By studying Aristotle and his Arab and Jewish commentators, they learned to think more logically, but their goals were still essentially theological.

The scholastic par excellence was St. Thomas Aquinas (1225 - 1274).

Of German stock, he was the son of the Count of Aquino, a town between Rome and Naples. He went to the University of Naples, where there was great interest in Arab and Jewish philosophers – and, of course, Aristotle. He became a monk of the Dominican order and went to Paris to study.

His mother was so upset by this turn of events that she sent his brothers to kidnap him and bring him home. (Contrary to what we might assume, families were seldom happy when sons or daughters went off to become monks or nuns. They often grieved for them as if they had died!) He escaped and continued his studies in Paris and elsewhere.

He was known to be a very pious and modest man, with no ambitions for church promotions – unlike the ambitious Abelard! He wrote a great deal, but is best known for the Summa Theologiae, usually just called the Summa, a work of 21 volumes in which he uses Abelard's Sic et Non method to reconcile Aristotle and Christianity.

Thomas believed that the soul is the form of the body, as Aristotle said, and gives it life and energy. But the soul and the body are totally linked together. This flies in the face of the Platonic and neo-Platonic ideas of the church fathers, and irritated the mystical Franciscan monks most of all.

Thomas added that the soul without the body would have no personality, because individuality comes from matter, not spirit, which represents the universal in us. For this reason, resurrection of the body is crucial to the idea of personal immortality. Averroes’ idea that only an impersonal soul survives death was, in other words, quite wrong.

Thomas saw five faculties of the soul:

1. The vegetative faculty, which is involved in food, drink, sex, and growth.


2. The sensitive faculty, i.e. our senses, plus the common sense that binds sensations together.

3. The locomotor faculty, which permits movement.

4. The appetitive faculty, which consists of our desire and will.

5. The intellectual faculty, i.e. thought, reason.

For St. Thomas, reason or intellect is man’s greatest treasure, that which raises him above the animals. In keeping with conceptualism, he felt it was the intellect that abstracts the idea (form or universal) from its individual appearance, so that, even though day-to-day experience can tell us about the particulars of reality, only reason or intellect can lead us to universal laws of the physical, or the human, world.

Ultimately, we do need direct, intuitive knowledge of God. Reason depends on sensory experience, and sensory experience is of matter, not spirit. So reason, like all things human, is imperfect, and cannot comprehend the perfection that is God. Faith is our ultimate refuge. Nevertheless, he insisted, faith and reason do not conflict, since God would not have made a world that did not ultimately match up with revealed truth.

In spite of his obvious brilliance, St. Thomas (like all philosophers in all ages) was a man of his time. For example, he was as chauvinistic as any of his predecessors regarding women: He considered women inferior by nature (and God’s design), and saw them as a serious threat to the moral progress of men. He also devoted a significant portion of the Summa to angels and demons, which he thought of as every bit as real as anything else. Among other things, he believed that the angels moved the planets, that they had no bodies, that they moved instantaneously, and that each person had his or her very own guardian angel.

His ideas threatened many in the church, most especially the Franciscans. His works emphasized reason too much and faith too little. He put too much stock in pagans like Aristotle and Averroes. And he taught that the soul and the body were unified! After his death (at the age of 49), the Franciscans convinced the Pope to condemn him and his writings. But the Dominicans rallied to his defense, and in 1323 Thomas was canonized.

(In 1879, Pope Leo XIII made Thomism the official philosophy of the Catholic church. It is, with Marxism, positivism, and existentialism, one of the four most influential philosophies of the 20th century).

The Beginning of the End of the Middle Ages

The Franciscans, as I said, were the primary critics of St. Thomas. Roger Bacon (1214-1294), a Franciscan monk and scientist, pointed out that reason does actually need experience in order to have something to reason about – a hint of modern empiricism in the Middle Ages!

But St. Thomas’ severest critic was John Duns Scotus (1265-1308), a Franciscan monk and professor at Oxford, Paris, and Cologne. He believed that the authority of the church was everything. The will is supreme and intellect is subordinate to it. Although a conceptualist (like Thomas), he felt that, of the thing, the idea, and the name, it was the individual thing that was the most real. His student William would take that and run with it.

William of Occam in England (1280-1347) was another Franciscan monk. Like Roger Bacon, he believed that, without sensory contact with things, the universal is inconceivable. In fact, he said universals are only names we give groups of things – a return to the nominalism of Roscellinus.

William is best known for the principle that is named for him, Occam’s razor: "Don’t multiply causes unnecessarily," usually interpreted to mean that the simplest explanation is the best. Over time, this came to mean "if you don’t need a supernatural explanation, don’t use it!"


The result of William's thinking is skepticism: Without universals, there are no generalizations, categories, classifications, theories, laws of nature, etc. All we can have is an accumulation of facts about individual entities. We will see this again in the philosophy of David Hume.

William of Occam, although he was a devout Christian, is often considered the turning point from the religious worldview of the Middle Ages to the scientific worldview of the Renaissance and the Modern era.

You could say that philosophy rested a while around this time, not for a lack of ideas, but because of over a hundred years of Troubles. There was a great famine in Europe from 1315 to 1317. The economy spiralled downward and the banks collapsed in the first few decades of the 1300s. The Hundred Years War began in 1337 and lasted about 120 years (despite the name). The Black Death, a plague carried by the fleas on rats, came from the Near East and killed over one third of the population between 1347 and 1352. Peasant revolts in England, France, and elsewhere were cruelly suppressed between 1378 and 1382. The Church was split between two popes, one in Rome and one in Avignon, between 1378 and 1417.

But these events, horrible as they were, turned out to be temporary setbacks, and an even greater explosion of intellectual activity was about to begin!


Heloise's First Letter to Abelard * (a selection)

To her master, nay father, to her husband, nay brother; from his handmaid, nay daughter, his spouse, nay sister: to ABELARD, from HELOISE. ....

And if the name of wife appears more sacred and more valid, sweeter to me is ever the word friend, or, if you be not ashamed, concubine or whore. To wit that the more I humbled myself before you the fuller grace I might obtain from you, and so also damage less the fame of your excellence. And you yourself were not wholly unmindful of that kindness in the letter of which I have spoken, written to your friend for his comfort. Wherein you have not disdained to set forth sundry reasons by which I tried to dissuade you from our marriage, from an ill-starred bed; but were silent as to many, in which I preferred love to wedlock, freedom to a bond. I call God to witness, if Augustus, ruling over the whole world, were to deem me worthy of the honor of marriage, and to confirm the whole world to me, to be ruled by me forever, dearer to me and of greater dignity would it seem to be called your strumpet than his empress.

For it is not by being richer or more powerful that a man becomes better; one is a matter of fortune, the other of virtue. Nor should she deem herself other than venal who weds a rich man rather than a poor, and desires more things in her husband than himself. Assuredly, whomsoever this concupiscence leads into marriage deserves payment rather than affection; for it is evident that she goes after his wealth and not the man, and is willing to prostitute herself, if she can, to a richer. As the argument advanced (in Aeschines) by the wise Aspasia to Xenophon and his wife plainly convinces us. When the wise woman aforesaid had propounded this argument for their reconciliation, she concluded as follows: "For when you have understood this, that there is not a better man nor a happier woman on the face of the earth; then you will ever and above all things seek that which you think the best; you to be a husband of so excellent a wife, and she to be married to so excellent a husband." A blessed sentiment, assuredly, and more than philosophic, expressing wisdom itself rather than philosophy. A holy error and a blessed fallacy among the married, that a perfect love should preserve their bond of matrimony unbroken, not so much by the continence of their bodies as by the purity of their hearts. But what error shows to the rest of women the truth has made manifest to me. Since what they thought of their husbands, that I, that the entire world not so much believed as knew of you. So that the more genuine my love was for you, the further it was removed from error.

For who among kings or philosophers could equal you in fame? What kingdom or city or village did not burn to see you? Who, I ask, did not hasten to gaze upon you when you appeared in public, nor on your departure with straining neck and fixed eye follow you? What wife, what maiden did not yearn for you in your absence, nor burn in your presence? What queen or powerful lady did not envy me my joys and my bed? There were two things, I confess, in you especially, wherewith you could at once captivate the heart of any woman; namely the arts of making songs and of singing them. Which we know that other philosophers have seldom followed. Wherewith as with a game, refreshing the labor of philosophic exercise, you have left many songs composed in amatory measure or rhythm, which for the suavity both of words and of tune being oft repeated, have kept your name without ceasing on the lips of all; since even illiterates the sweetness of your melodies did not allow to forget you. It was on this account chiefly that women sighed for love of you. And as the greater part of your songs descanted of our love, they spread my fame in a short time through many lands, and inflamed the jealousy of many against me. For what excellence of mind or body did not adorn your youth? What woman who envied me then does not my calamity now compel to pity one deprived of such delights? What man or woman, albeit an enemy at first, is not now softened by the compassion due to me?

* Source: The Letters of Abelard and Heloise, translated from the Latin by C.K. Scott Moncrieff, (New York: 1925). Made available by Miss MariLi Pooler, Brooklyn NY. Language modernized slightly by myself.

This text is part of the Internet Medieval Source Book. © Paul Halsall December 1997. Available at: http://www.fordham.edu/halsall/source/heloise1.html


And, though exceedingly guilty, I am, as you know, exceeding innocent. For it is not the deed but the intention that makes the crime. It is not what is done but the spirit in which it is done that equity considers. And in what state of mind I have ever been towards you, only you, who have knowledge of it, can judge. To your consideration I commit all, I yield in all things to your testimony. Tell me one thing only, if you can, why, after our conversion, which you alone did decree, I am fallen into such neglect and oblivion with you that I am neither refreshed by your speech and presence nor comforted by a letter in your absence. Tell me, one thing only, if you can, or let me tell you what I feel, nay what all suspect. Concupiscence joined you to me rather than affection, the ardor of desire rather than love. When therefore what you desired ceased, all that you had exhibited at the same time failed. This, most beloved, is not mine only but the conjecture of all, not peculiar but common, not private but public. Would that it seemed thus to me only, and your love found others to excuse it, by whom my grief might be a little quieted. Would that I could invent reasons by which in excusing you I might cover in some measure my own vileness.

Give your attention, I beseech you, to what I demand; and you will see this to be a small matter and most easy for you. While I am cheated of your presence, at least by written words, whereof you have an abundance, present to me the sweetness of your image. In vain may I expect you to be liberal in things if I must endure you niggardly in words. Until now I believed that I deserved more from you when I had done all things for you, persevering still in obedience to you. Who indeed as a girl was allured to the asperity of monastic conversation not by religious devotion but by your command alone. Wherein if I deserve nought from you, you may judge my labor to have been vain. No reward for this may I expect from God, for the love of Whom it is well known that I did not anything. When you hastened to God, I followed you in the habit, nay preceded you. For as though mindful of the wife of Lot, who looked back from behind him, you delivered me first to the sacred garments and monastic profession before you gave yourself to God. And for that in this one thing you should have had little trust in me I vehemently grieved and was ashamed. For I (God knows) would without hesitation precede or follow you to the Vulcanian fires according to your word. For not with me was my heart, but with you. But now, more than ever, if it be not with you, it is nowhere. For without you it cannot anywhere exist. But so act that it may be well with you, I beseech you. And well with you will it be if it find you propitious, if you give love for love, little for much, words for deeds. Would that your love, beloved, had less trust in me, that it might be more anxious! But the more confident I have made you in the past, the more neglectful now I find you. Remember, I beseech you, what I have done, and pay heed to what you owe me. While with you I enjoyed carnal pleasures, many were uncertain whether I did so from love or from desire. But now the end shows in what spirit I began. I have forbidden myself all pleasures that I might obey your will. I have reserved nothing for myself, save this, to be now entirely yours. Consider therefore how great is your injustice, if to me who deserve more you pay less, nay nothing at all, especially when it is a small thing that is demanded of you, and right easy for you to perform.

And so in His Name to whom you have offered yourself, before God I beseech you that in whatsoever way you can you restore to me your presence, to wit by writing me some word of comfort. To this end alone that, thus refreshed, I may give myself with more alacrity to the service of God. When in time past you sought me out for temporal pleasures, you visited me with endless letters, and by frequent songs did set your Heloise on the lips of all men. With me every public place, each house resounded. How more rightly should you excite me now towards God, whom you excited then to desire. Consider, I beseech you, what you owe me, pay heed to what I demand; and my long letter with a brief ending I conclude. Farewell, my all.


Timeline: 1000 to 1400 AD

Map: Europe, 1278


The Beginnings of Modern Philosophy


Things began to get better: the economy improved; there was a return to building – cathedrals, universities, cities...; a return to "luxuries" – paper, plays, music...; a return to invention – the compass, printing...; a return to exploration – Africa, the New World, the Pacific.... It was the renaissance, which we date (roughly!) from 1400 to 1600 or so. They were vigorous times, interesting times, dangerous times!

The aristocracy had won the day over the fledgling monarchies and even the church’s heavy hand, at least for now. So there were just tons of these upper-crust types, often with lots of money, totally in love with the idea of themselves. Religious and other thinkers were freed, to one extent or another, from the powerful central authority of the church to create their own, very reasonable or totally outlandish, religious philosophies. And merchants found that money can buy almost anything, including the traditional respect that the aristocracy received. In fact, aristocratic title and merchant wealth were a perfect combination for a good marriage!

These aristocrats and merchants believed in the "perfectibility" of mankind: We could become better human beings! Most importantly, we could become more powerful, richer – more aristocratic, if you will. Much attention was paid to behaving like a gentleman or a lady, as reflected, for example, in Baldesar Castiglione’s guide to proper conduct, The Book of the Courtier.

They were practical, interested in real events and real people in the real world. Individualistic and competitive (and very "dog eat dog"), they liked their politics, and they liked to play rough.

But, they were also anti-intellectual, even cocky in their ignorance. They tended to think of scholars as dry, impractical types, who might be able to forecast eclipses, but probably couldn’t tie their own shoes, much less make money or run estates!

And they were superstitious, spiritualistic, fascinated by astrology, ancient Egypt, the Kabbalah, alchemy, magic – a Renaissance version of our "New Age" movement.

Two events in particular stand out as representative of the renaissance: The first was printing. Johann Gutenberg (c. 1400-1468) of Mainz invented the printing press and movable type, and printed the Gutenberg Bible in 1455.

The second was the discovery of the New World, which meant lots of gold and silver and a stoked-up international economy, as well as an outlet for those discontented with life in Europe ("Lebensraum" – room to live, as the Germans call it). This, of course, is usually credited to Christopher Columbus (1451-1506).

Humanism

Another aspect of the renaissance was its humanism, meaning an interest in or focus on human beings and their well being, here and now, rather than in God and the afterlife, or the activities of saints and Biblical heroes of eons ago. Petrarch (1304-1374), for example, wrote history with an emphasis on personality, and is often considered the first humanist (at least since the ancients!).

There were several philosophers in the early renaissance who particularly express this idea of humanism. Giovanni Pico della Mirandola (1463-1494), for example, believed that the philosophers (that is, Plato and Aristotle) and Christianity basically agree. He argued for free will and saw mankind as the connection between the physical world and the spiritual world.

Desiderius Erasmus of Rotterdam (1467-1536) recommended a compromise between faith and humanism, and tried hard to prevent the excesses of the reformation. A strong believer in free will, he wrote against the concept of justified wars, and asked his readers to exercise tolerance, friendliness, and gentleness.

Sir Thomas More (1478-1535), friend of Erasmus and chancellor to the infamous King Henry VIII, wrote a story called Utopia, in which he described a perfect society much along the lines of Erasmus' compromise between faith and humanism. When he refused to recognize his sovereign as the head of the English church, Henry VIII had him beheaded. The Catholic church made him a saint.

Niccolo Machiavelli (1469-1527) had quite a different outlook on things than these philosophers – still, however, typifying humanism. He wrote about hardball politics in a book called The Prince (1513). His reputation became so negative that "Old Nick" became a nickname (!) for the devil himself. But few people realize that he followed up The Prince with The Discourses, in which he discusses democracy as the political system he would prefer to see! I myself think he deserves credit as the first social psychologist after the Greeks.

And this was particularly the heyday of artists and authors. Among the artists were Leonardo da Vinci (1452-1519) and Michelangelo (1475-1564) in Italy, and Albrecht Dürer (1471-1528) in Germany. Later in the renaissance, we have El Greco (1541-1614) in Spain, and later still, Rembrandt van Rijn (1606-1669) in Holland. Among the literary types we have Montaigne (1533-1592) in France, Cervantes (1547-1616) in Spain, and none other than William Shakespeare (1564-1616) in England. There were many others.

The Reformation

In the Middle Ages, the ultimate authority was pretty clearly God, and the pope was his mouthpiece. Heresy was not uncommon, of course, but excommunication and monastic imprisonment were the major punishments. Then, in 1215, came the Inquisition, and heresy was punishable by death. Spain in particular was a land of religious fanatics. Torquemada, the Grand Inquisitor of Spain from 1483 to 1498, made the Spanish Inquisition a household word.

Martin Luther (1483-1546) posted his 95 Theses (points of disagreement with the way the church was doing things) on the door of the castle church of Wittenberg in 1517. His Theses focused on the sale of indulgences and denied the primacy of the pope. He emphasized the idea that we are born in sin, our lack of free will, and our absolute need for the grace of God. Luther furthermore translated the Bible into German, and his dialect became the basis for standard literary German to the present day! He also wrote some pretty nasty papers condemning peasants and Jews.

John Calvin (Jean Cauvin, 1509-1564), from northern France, became Protestant and was forced to flee to Switzerland. There, he preached unconditional obedience to God and the odd doctrine of predestination, which says that since God is omniscient, he already knows who is going to heaven and who is not. Gaining political as well as spiritual power, he ruled Geneva as a religious dictatorship, not unlike Iran or Afghanistan recently: no drinking, dancing, or gambling; no icons, candles, or incense; obligatory church attendance for everyone.... He condemned the Spanish unitarian Michael Servetus, who came to him for protection, to burn at the stake for heresy! (A unitarian is someone who does not believe in the Trinity – the worst heresy of all. Even today, many Protestant churches in the US won't accept the Unitarians as Christians! This despite the fact that the Trinity is mentioned nowhere in the Bible.)

Henry VIII ruled England from 1509 to 1547. Having a hard time producing a male heir, he divorced (and executed) one wife after another. When the pope refused to give him an easy divorce from Catherine of Aragon, he declared himself the head of the English Church and took all monastic property for his treasury! But the doctrines remained fundamentally Catholic. (Although usually considered Protestant, the Anglican Church and its offspring, the Episcopalian Church, maintain good relationships with the Catholic Church to this day.)

On the other hand, Philip II of Spain (ruled 1556-1598) wished to restore Catholicism to its former glory. His domestic policy consisted of encouraging the inquisition – resulting in the mass burning of heretics and severe oppression of remaining Moors and Jews in Spain.

Philip was especially ticked off by the war for independence of the Netherlands from Spain, which was led by Protestants. Elizabeth I of England (ruled 1558-1603), whom he courted, secretly encouraged piracy against his fleets, which were coincidentally bringing loads of silver from the New World. The hostilities culminated in the destruction of his Great Armada in 1588.

The reformation led the Catholic church to reform itself, but not before executing a very large number of Protestants for heresy. The Protestants executed Catholics and other Protestants as well. Catholic or Protestant, these were not proud days for religion!

Science

In mathematics, a number of advances were made: Francis Pellos of Nice invented the decimal point in 1492. Thomas Harriot, the astronomer who discovered sunspots, created some of what are now the standard symbols used in algebra (including the < and > signs). John Napier of Scotland invented logarithms, which in turn permitted William Oughtred to develop the slide rule – which could be considered a simple analog computer – in 1622. Descartes himself invented analytic geometry.
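To see why logarithms mattered so much, note that they turn multiplication into addition: log(ab) = log(a) + log(b). A slide rule exploits exactly this, adding two lengths proportional to the logarithms of the factors. Here is a minimal sketch of the principle in Python (a modern illustration, of course, not anything Napier or Oughtred wrote):

    import math

    def slide_rule_multiply(a, b):
        # A slide rule adds physical lengths proportional to log(a) and log(b);
        # the product is then read off the scale as the antilogarithm.
        length = math.log10(a) + math.log10(b)   # "sliding" the two scales
        return 10 ** length                      # reading off the answer

    print(slide_rule_multiply(3.0, 7.0))   # ~21.0 (a real slide rule gives 2-3 digits)

The trick reduced the tedious multiplications of astronomy and navigation to simple additions – which is why the astronomers of the day were so grateful to Napier.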

Biology and medicine also had a few breakthroughs: Paracelsus (Theophrastus Bombastus von Hohenheim was his real name! 1493-1541) recognized that life was based on chemical and physical sources, and should be explained thus. In 1553, Michael Servetus – the same one that Calvin had burned at the stake in Geneva – discovered pulmonary circulation. William Harvey (1578-1657), physician to King James I and King Charles I (and Francis Bacon, below), explained the circulation of the blood for the first time. He also promoted the idea that every animal comes from an egg – in an age when spontaneous generation of flies was the established belief.

Instrumentation drove much of the progress in science. The compound microscope was invented in 1595 by Zacharias Janssen of Middelburg in Holland. The telescope was invented by his neighbor, a German named Hans Lippershey, in 1608. Galileo invented a thermometer in 1603, and his student Evangelista Torricelli invented the barometer in 1643.

[A note: Glass lenses had been around for some time. "Reading stones" (magnifying glasses) exist that date from 1000 ad in Venice. Roger Bacon suggested the principle of reading spectacles in 1264, and the first spectacles show up in Florence, Italy, around 1280. A nobleman named Amati is suggested as a possible inventor. They were considered a near miracle by the elderly of the time. On the other hand, spectacles for the near-sighted only show up in the 1500's (on the nose of Pope Leo, no less!), and bifocals have to wait for Ben Franklin to invent them in the 1780s.]

And then there were the great astronomers! Nicholas Copernicus of Poland (1473-1543) introduced the heliocentric solar system. The church, of course, asked: why would God not put us – his special creation – in the center? How can this be reconciled with scripture? And doesn't this conflict with direct experience?

Johannes Kepler (1571-1630) added the laws of planetary motion – for example, that the planets travel in elliptical, not circular, paths. Note that this implies something less than perfection, not what God would do, even if he did put the sun in the center!
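Kepler's third law, to give a taste of what these laws actually say, relates a planet's period to the size of its orbit: the square of the period is proportional to the cube of the mean distance from the sun. A quick check in Python, using modern measured values (not figures from the text):

    # Kepler's third law: T^2 is proportional to a^3. With periods in years
    # and mean distances in astronomical units, T^2 / a^3 should come out
    # (very nearly) the same for every planet.
    planets = {
        "Mercury": (0.387, 0.241),   # (a in AU, T in years), modern values
        "Earth":   (1.000, 1.000),
        "Mars":    (1.524, 1.881),
        "Jupiter": (5.203, 11.862),
    }

    for name, (a, T) in planets.items():
        print(f"{name:8s} T^2/a^3 = {T**2 / a**3:.3f}")   # all close to 1.000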

If astronomers were having a hard time with the church, heaven forbid you elaborated on Copernicus: Giordano Bruno (1548-1600), of Nola near Naples, believed in an infinite universe without center, with innumerable earths traveling around innumerable suns, each with plants and animals and people. And he was a pantheist. Pantheism is the belief that God is found throughout nature, that he is, in fact, identical with the universe. When people say "God is in everything and everyone," they are in fact making a pantheist statement that could have gotten them killed until fairly recently! He had a particularly powerful effect on Spinoza, whom we will discuss in the next chapter.

After a brief stint as a Dominican monk, he wandered around the cities of Europe until a Venetian aristocrat invited him to return. That same aristocrat turned him in to the Inquisition in 1592. He was imprisoned for eight years, but refused to recant. Finally, on February 17, 1600, he was burned at the stake in the Square of the Flowers in Rome, naked and with a nail through his tongue. In 1889, a statue of him was erected in that same square, and his death has been commemorated by free-thinkers worldwide every year since.

Francis Bacon

Francis Bacon (1561-1626) was born January 22, 1561. His father was "Lord Keeper of the Royal Seal" under Elizabeth I, something similar to the Secretary of the Treasury in the US President's Cabinet today. When his father died early, Francis was left without an estate, and so went to study law.

At the ripe old age of 23, he was elected to Parliament, where he was a strong advocate of religious toleration, and his fortunes began to improve. In 1607, King James I made him Solicitor General; in 1613, Attorney General; and in 1617, Lord Keeper of the Royal Seal! The following year, the King made him a Baron and the Lord Chancellor – basically the King’s right hand man.

His real love, however, was science and philosophy. His Novum Organum of 1620 refined the art of logical thinking and proposed a "new method" for science. Bacon suggested that we use induction – working from facts to theory (instead of from theory, or the Bible, to "facts"). He was wary of hypotheses – which he felt were as likely to be superstition or wishful thinking as anything else – but in fact suggested what we would now call the testing of hypotheses, in the form of a process of elimination of alternative explanations!
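This "process of elimination" can be illustrated with a toy example: keep every candidate explanation that fits all the observed facts, and discard the rest. The Python sketch below is my own modern illustration – the data and the candidate hypotheses are hypothetical, not Bacon's:

    # Eliminative induction, Bacon-style: a hypothesis survives only if it
    # is consistent with every observation collected so far.
    observations = [(1, 3), (2, 5), (3, 7)]   # hypothetical (input, result) facts

    hypotheses = {
        "doubling":        lambda x: 2 * x,
        "double plus one": lambda x: 2 * x + 1,
        "squaring":        lambda x: x * x,
    }

    surviving = [name for name, h in hypotheses.items()
                 if all(h(x) == y for x, y in observations)]
    print(surviving)   # ['double plus one'] - the alternatives are eliminated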


In 1621, soon after the King raised him to Viscount, the Parliament impeached him for taking bribes. He had indeed taken many bribes – but so had everyone else, so the impeachment was really a political slap at the King.

But being out of political office allowed him to pursue full time the science and philosophy he had loved part time. He began a project, with the help of the King, called The Great Renewal, which was to be a review of all the sciences.

Basically, The Great Renewal involved purging ourselves, our intellects, of our biases, which he called idols. He named four:

1. Idols of the Tribe. The tribe he is referring to is us, the human tribe. So the idols of the tribe are our natural tendencies towards bias, such as reading our own wishes into what we suppose we see, looking for patterns or a purpose to everything, and so on.

2. Idols of the Cave. The cave is the little box we each live in as individuals. So the idols of the cave are the distortions and biases we have as individuals, such as those based on our peculiar backgrounds and educations, as well as the intellectual heroes we emulate.

3. Idols of the Marketplace. The marketplace is society, and the main threat to clear thinking from society is its use of language. The common uses of words are not necessarily fit for scientific and philosophical use, and "common sense" or the logic we presume we are using when we speak is not that logical. And words can exist that have references that do not exist – a great root of confusion.

4. Idols of the Theater. The theater refers to the showplaces of scientific ideas and theories – journals and books, famous names and theories, particular scientific designs or methods that have won recognition – the appearances of truth! Bacon says we should take care not to idolize or dogmatize whatever theories are presently accepted, even if they are promoted by "authorities" in their field or appear to be accepted "universally."

Around 1624, he wrote The New Atlantis (published a few years later), a utopian fiction about an island in the South Pacific ruled by scientists. They lived in a university-like setting called Salomon's House (after their founder), and were chosen for their position by tests of their merits – just like the philosopher-kings in Plato's Republic. This may have been the model for England's Royal Society (of scientists).

In The New Atlantis, incidentally, he predicted quite a few modern inventions, including cars, planes, radio, and anesthetics.

Bacon died in 1626, at the age of 65, after catching cold while experimenting with preserving chicken by freezing. He is considered the father of British philosophy, and the intellectuals of France dedicated their monumental Encyclopédie to him in 1751.

Galileo Galilei

Galileo Galilei (1564-1642) was born in Pisa, Italy, on February 15, 1564 – the same year as William Shakespeare, and just three days before Michelangelo died. When he was 18, he discovered the principle of the pendulum. At 22, he invented the hydrostatic balance. Perfecting his telescopes, he managed, in 1610, to discover four of the moons of Jupiter (the Galilean moons!), the rings of Saturn, and the phases of Venus!

He is most famous, of course, for the law of falling bodies, stating that two things of the same size and shape, but of different weights, will fall at the same speed through the same medium. That he demonstrated this by dropping things off the leaning tower of Pisa is probably a myth – but who knows?
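The point of the law is that fall time depends on the height and on gravity, never on the weight: distance fallen is d = (1/2)gt^2, so t = sqrt(2d/g). A small sketch in modern notation (the tower's height here is only an approximation):

    import math

    def fall_time(height_m, g=9.81):
        # Galileo's law of falling bodies: d = (1/2) * g * t^2, so
        # t = sqrt(2d / g). Mass never enters the formula - a cannonball
        # and a musket ball dropped together land together (ignoring air).
        return math.sqrt(2 * height_m / g)

    # The Leaning Tower of Pisa is roughly 56 m tall.
    print(f"{fall_time(56):.2f} s")   # about 3.4 s, whatever the weight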


On the philosophical front, he was the first to make the distinction, which would become so important in English philosophy, between primary and secondary qualities. Primary qualities were physical properties of matter that could be measured, and therefore be made the subject of scientific analysis. Secondary qualities, on the other hand, were things that required the presence of a conscious, living creature: tastes, odors, colors, sounds... Only if these could be converted into primary qualities could they, he believed, become the subjects of science.

Galileo considered Copernicus’ theory as a proven fact, and taught it as such. The church, however, and especially the Jesuits, would only accept it if stated as a hypothesis, in the same way that some fundamentalists today will only tolerate the teaching of evolution if it is presented as just one theory among many.

Galileo pointed out to his critics that the Bible shouldn't be read literally. If you do, you will end up with no end of absurdities and contradictions! It is meant to be taken metaphorically. Uh oh. Here are some quotes from a letter he wrote to the Grand Duchess Christina of Tuscany in 1615:

...(N)othing physical which sense-experience sets before our eyes, or which necessary demonstrations prove to us, ought to be called in question (much less condemned) upon the testimony of Biblical passages which may have some different meaning beneath their words.

...I do not feel obliged to believe that that same God who has endowed us with sense, reason, and intellect has intended us to forgo their use.

(From p. 607 of Durant’s The Age of Reason Begins, and originally from Galileo’s Discoveries and Opinions, edited by Stillman Drake, pp. 177 and 183)

In 1616, the Inquisition told Galileo to stop teaching Copernicus’ theory, and in fact condemned all publications and books, by any author, that did so. Galileo, probably recalling Bruno’s fate only 16 years earlier, quieted down. An essay by a student of his stirred things up again, so Galileo spoke to the Pope himself. The Pope wouldn’t budge.

He finished his book on Copernicus’ theory anyway, but presented it as a hypothesis only, even putting it in the form of a dialog between supporters and detractors. Of course, naming the anti-Copernican speaker Simplicio – "Simpleton" – didn’t help. The Jesuits attacked it, even saying that Galileo was a greater danger to the church than Luther and Calvin. (They were probably, in the long run, right.)

At 68, he was, over four interrogations, threatened with torture (though not tortured), and asked to recant. He refused, a little less intensely each time. They pronounced him guilty of heresy. Eventually, he was put under house arrest, but otherwise free to teach and write. He was lucky.

Galileo died January 8, 1642. Science suffered quite a blow in the Catholic countries, with many scientists fearful of stating their views. This moved the center of scientific discovery to the Protestant North, not because Protestantism was more tolerant of science, but because the churches of those countries had less legal authority. In 1835, the church did finally take Galileo’s books off the banned books list. The church apologized in 1992.


René Descartes

René Descartes (1596-1650) was born March 31, 1596 in La Haye, France, the son of a wealthy lawyer. Sadly, his mother contracted tuberculosis and died while he was still an infant. The baby nearly died as well, and René remained a weak child.

After a good Jesuit education, at 17 he began to wander Europe, including a stint in the Bavarian army. In 1628, he moved to Holland, where he stayed for most of his life. René never married, but he did have a mistress and a daughter, who died at the age of five.

His major contribution, for which he will forever be known as the father of modern philosophy, is the method of doubt. In his book Meditations, he decided to start philosophy from scratch by doubting everything he could – things, God, self, the church, Aristotle... – until he found something he could not doubt and from which he could build a new philosophy.

His conclusion was, of course, that there was one thing he could not doubt: the fact that he was there doing the doubting! Cogito ergo sum – I think, therefore I am. From there, he went on to conclude that there were a number of things equally certain: God, time and space, the world, mathematics. These things, he said, were innate – in-born – to the mind. We derive them not from experience but from the nature of the mind itself.

But there’s more to Descartes: He was a mathematician as well as a philosopher, and he made a variety of mathematical discoveries, most especially analytic geometry (applying algebra to geometry – remember Cartesian coordinates?), which he supposedly discovered while in what he called a stove – perhaps a sauna. He was also a scientist, and made a number of innovations in mechanics and optics as well. And he was the first to note the idea of the reflex.
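To illustrate the analytic geometry: its basic move is to turn shapes into equations. A circle of radius r centered at the origin becomes x^2 + y^2 = r^2, a line becomes y = mx + c, and asking where the line meets the circle becomes solving an ordinary quadratic. A minimal Python sketch of the idea (my example, not Descartes’ own):

    import math

    def line_circle_intersections(m, c, r):
        # Substituting y = m*x + c into x^2 + y^2 = r^2 gives the quadratic
        # (1 + m^2) x^2 + 2mc x + (c^2 - r^2) = 0. Geometry becomes algebra.
        A, B, C = 1 + m**2, 2 * m * c, c**2 - r**2
        disc = B**2 - 4 * A * C
        if disc < 0:
            return []                    # the line misses the circle entirely
        xs = [(-B + s * math.sqrt(disc)) / (2 * A) for s in (1, -1)]
        return [(x, m * x + c) for x in xs]

    print(line_circle_intersections(1, 0, 1))   # the line y = x meets the unit
    # circle at (0.707..., 0.707...) and (-0.707..., -0.707...)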

The idea that some of our actions are reflexive leads inevitably to the possibility that all actions are reflexive. Descartes theorized that animals (at least) have no need for a soul: They are automatons. Being a good Catholic, he exempted human beings. We do have a soul, although he acknowledged that he did not know how the soul and the body interacted.

In history, people use what is most interesting around them to theorize about other things, especially themselves. Today, everyone talks about psychological issues using computer analogies and information processing models. In Descartes’ day, it was the mechanics of clockworks and hydraulic systems that were the cutting edge of technology. So he and others basically suggested that life – including at least much of human life – was mechanical, i.e. functioned by the same natural laws as did physical entities.

Descartes went one step further: He made the deist hypothesis. He suggested that (outside of the human soul and free will) all of creation works mechanically, and that God designed and set it all in motion – but certainly would have no need to step in and intervene once things are going. Of course, that says God would have no need of miracles, that Christ was not some great intervention in the course of history, and that prayers don’t really do anything. Uh oh.

The Calvinist theologians of Holland attacked him: besides this untraditional idea of deism and a mechanical universe, as a good Catholic Descartes also believed in free will, which doesn’t jibe well with Calvinist predestination (the idea that God knows exactly who is going to hell or to heaven). His friends, fortunately, intervened on his behalf.

In 1649, Descartes was invited by Queen Christina of Sweden to tutor her majesty. Unfortunately, she wanted him at five in the morning three days a week, through sleet and rain and snow. Descartes caught pneumonia and died in 1650, at the age of 53.


Education

I should add a little note on education during these heady times. In the 1600’s, despite all the great scientists and philosophers, about 80% of the population was illiterate. (Not the modern "functionally illiterate," meaning they are not very good at it, but totally unable to read and write.) Nevertheless, change was coming. For example, in 1619, the Duke of Saxe-Weimar (in Germany) ordered compulsory education for all children between six and twelve years old, with a month of "vacation" at harvest time so they could work on the farm – practically the same system we have today, including the summer vacation!

John Comenius (Jan Komensky, 1592-1670), a bishop of the Moravian Brethren, wrote the first illustrated textbook, which was used for 250 years. In the Didactica Magna ("The Great Didactic"), he outlined principles of education that could be used by most any school board today.

Note, however, for all the religious reform and scientific progress, over one million people, mostly older women, were executed as witches from the time of the Papal Bull concerning witches, issued in 1484, through the 1700s. There was even a witch hunting manual first published in 1486 called the Malleus Maleficarum (the Witches’ Hammer) – how to recognize them, how to torture them into a confession, how to effectively kill them.... Women continue to be mistreated today, of course, although we seem to be coming to our senses as the world moves into the third millennium.

Note as well that slavery, which was a minor issue in the Middle Ages (what with the convenience of serfs), had reared its ugly head again with the Spanish and Portuguese conquest of vast areas of Africa and the Americas. Many people in Spain (where else?) believed that no decent Christian should perform manual labor! Protestant nations and their colonies found the practice of slavery equally profitable. Although slavery still exists in some third-world nations, it has died out in most of the world, mostly because the industrial revolution made it too costly, not because we were offended by the practice.


Selections from a letter to the Grand Duchess Christina of Tuscany, 1615 *

Galileo Galilei

To the Most Serene Grand Duchess Mother:

Some years ago, as Your Serene Highness well knows, I discovered in the heavens many things that had not been seen before our own age. The novelty of these things, as well as some consequences which followed from them in contradiction to the physical notions commonly held among academic philosophers, stirred up against me no small number of professors – as if I had placed these things in the sky with my own hands in order to upset nature and overturn the sciences. They seemed to forget that the increase of known truths stimulates the investigation, establishment, and growth of the arts; not their diminution or destruction.

Showing a greater fondness for their own opinions than for truth they sought to deny and disprove the new things which, if they had cared to look for themselves, their own senses would have demonstrated to them. To this end they hurled various charges and published numerous writings filled with vain arguments, and they made the grave mistake of sprinkling these with passages taken from places in the Bible which they had failed to understand properly, and which were ill-suited to their purposes.

These men would perhaps not have fallen into such error had they but paid attention to a most useful doctrine of St. Augustine's, relative to our making positive statements about things which are obscure and hard to understand by means of reason alone. Speaking of a certain physical conclusion about the heavenly bodies, he wrote: "Now keeping always our respect for moderation in grave piety, we ought not to believe anything inadvisedly on a dubious point, lest in favor to our error we conceive a prejudice against something that truth hereafter may reveal to be not contrary in any way to the sacred books of either the Old or the New Testament."

Well, the passage of time has revealed to everyone the truths that I previously set forth; and, together with the truth of the facts, there has come to light the great difference in attitude between those who simply and dispassionately refused to admit the discoveries to be true, and those who combined with their incredulity some reckless passion of their own. Men who were well grounded in astronomical and physical science were persuaded as soon as they received my first message. There were others who denied them or remained in doubt only because of their novel and unexpected character, and because they had not yet had the opportunity to see for themselves. These men have by degrees come to be satisfied. But some, besides allegiance to their original error, possess I know not what fanciful interest in remaining hostile not so much toward the things in question as toward their discoverer. No longer being able to deny them, these men now take refuge in obstinate silence, but being more than ever exasperated by that which has pacified and quieted other men, they divert their thoughts to other fancies and seek new ways to damage me.

.... To this end they make a shield of their hypocritical zeal for religion. They go about invoking the Bible, which they would have minister to their deceitful purposes. Contrary to the sense of the Bible and the intention of the holy Fathers, if I am not mistaken, they would extend such authorities until even in purely physical matters – where faith is not involved – they would have us altogether abandon reason and the evidence of our senses in favor of some biblical passage, though under the surface meaning of its words this passage may contain a different sense.

.... I think that in discussions of physical problems we ought to begin not from the authority of scriptural passages but from sense-experiences and necessary demonstrations; for the holy Bible and the phenomena of nature proceed alike from the divine Word, the former as the dictate of the Holy Ghost and the latter as the observant executrix of God's commands. It is necessary for the Bible, in order to be accommodated to the understanding of every man, to speak many things which appear to differ from the absolute truth so far as the bare meaning of the words is concerned. But Nature, on the other hand, is inexorable and immutable; she never transgresses the laws imposed upon her, or cares a whit whether her abstruse reasons and methods of operation are understandable to men. For that reason it appears that nothing physical which sense-experience sets before our eyes, or which necessary demonstrations prove to us, ought to be called in question (much less condemned) upon the testimony of biblical passages which may have some different meaning beneath their words. For the Bible is not chained in every expression to conditions as strict as those which govern all physical effects; nor is God any less excellently revealed in Nature's actions than in the sacred statements of the Bible....

If in order to banish the opinion in question from the world it were sufficient to stop the mouth of a single man – as perhaps those men persuade themselves who, measuring the minds of others by their own, think it impossible that this doctrine should be able to continue to find adherents – then that would be very easily done. But things stand otherwise. To carry out such a decision it would be necessary not only to prohibit the book of Copernicus and the writings of other authors who follow the same opinion, but to ban the whole science of astronomy. Furthermore, it would be necessary to forbid men to look at the heavens, in order that they might not see Mars and Venus sometimes quite near the earth and sometimes very distant, the variation being so great that Venus is forty times and Mars sixty times as large at one time as at another. And it would be necessary to prevent Venus being seen round at one time and forked at another, with very thin horns; as well as many other sensory observations which can never be reconciled with the Ptolemaic system in any way, but are very strong arguments for the Copernican. And to ban Copernicus now that his doctrine is daily reinforced by many new observations and by the learned applying themselves to the reading of his book, after this opinion has been allowed and tolerated for these many years during which it was less followed and less confirmed, would seem in my judgment to be a contravention of truth, and an attempt to hide and suppress her the more as she revealed herself the more clearly and plainly. Not to abolish and censure his whole book, but only to condemn as erroneous this particular proposition, would (if I am not mistaken) be a still greater detriment to the minds of men, since it would afford them occasion to see a proposition proved that it was heresy to believe. And to prohibit the whole science would be to censure a hundred passages of holy Scripture which teach us that the glory and greatness of Almighty God are marvelously discerned in all his works and divinely read in the open book of heaven....

* Available at http://www.fordham.edu/halsall/mod/galileo-tuscany.html © Paul Halsall Aug 1997 [email protected]


René Descartes: Selections from Meditations *

Meditation I.

1. SEVERAL years have now elapsed since I first became aware that I had accepted, even from my youth, many false opinions for true, and that consequently what I afterward based on such principles was highly doubtful; and from that time I was convinced of the necessity of undertaking once in my life to rid myself of all the opinions I had adopted, and of commencing anew the work of building from the foundation, if I desired to establish a firm and abiding superstructure in the sciences. But as this enterprise appeared to me to be one of great magnitude, I waited until I had attained an age so mature as to leave me no hope that at any stage of life more advanced I should be better able to execute my design. On this account, I have delayed so long that I should henceforth consider I was doing wrong were I still to consume in deliberation any of the time that now remains for action. To-day, then, since I have opportunely freed my mind from all cares [and am happily disturbed by no passions], and since I am in the secure possession of leisure in a peaceable retirement, I will at length apply myself earnestly and freely to the general overthrow of all my former opinions. ....

4. But it may be said, perhaps, that, although the senses occasionally mislead us respecting minute objects, and such as are so far removed from us as to be beyond the reach of close observation, there are yet many other of their informations (presentations), of the truth of which it is manifestly impossible to doubt; as for example, that I am in this place, seated by the fire, clothed in a winter dressing gown, that I hold in my hands this piece of paper, with other intimations of the same nature. But how could I deny that I possess these hands and this body, and withal escape being classed with persons in a state of insanity, whose brains are so disordered and clouded by dark bilious vapors as to cause them pertinaciously to assert that they are monarchs when they are in the greatest poverty; or clothed [in gold] and purple when destitute of any covering; or that their head is made of clay, their body of glass, or that they are gourds? I should certainly be not less insane than they, were I to regulate my procedure according to examples so extravagant.

5. Though this be true, I must nevertheless here consider that I am a man, and that, consequently, I am in the habit of sleeping, and representing to myself in dreams those same things, or even sometimes others less probable, which the insane think are presented to them in their waking moments. How often have I dreamt that I was in these familiar circumstances, that I was dressed, and occupied this place by the fire, when I was lying undressed in bed? At the present moment, however, I certainly look upon this paper with eyes wide awake; the head which I now move is not asleep; I extend this hand consciously and with express purpose, and I perceive it; the occurrences in sleep are not so distinct as all this. But I cannot forget that, at other times I have been deceived in sleep by similar illusions; and, attentively considering those cases, I perceive so clearly that there exist no certain marks by which the state of waking can ever be distinguished from sleep, that I feel greatly astonished; and in amazement I almost persuade myself that I am now dreaming. ....

12. I will suppose, then, not that Deity, who is sovereignly good and the fountain of truth, but that some malignant demon, who is at once exceedingly potent and deceitful, has employed all his artifice to deceive me; I will suppose that the sky, the air, the earth, colors, figures, sounds, and all external things, are nothing better than the illusions of dreams, by means of which this being has laid snares for my credulity; I will consider myself as without hands, eyes, flesh, blood, or any of the senses, and as falsely believing that I am possessed of these; I will continue resolutely fixed in this belief, and if indeed by this means it be not in my power to arrive at the knowledge of truth, I shall at least do what is in my power, viz., [suspend my judgment], and guard with settled purpose against giving my assent to what is false, and being imposed upon by this deceiver, whatever be his power and artifice. But this undertaking is arduous, and a certain indolence insensibly leads me back to my ordinary course of life; and just as the captive, who, perchance, was enjoying in his dreams an imaginary liberty, when he begins to suspect that it is but a vision, dreads awakening, and conspires with the agreeable illusions that the deception may be prolonged; so I, of my own accord, fall back into the train of my former beliefs, and fear to arouse myself from my slumber, lest the time of laborious wakefulness that would succeed this quiet rest, in place of bringing any light of day, should prove inadequate to dispel the darkness that will arise from the difficulties that have now been raised.

Meditation II

1. The Meditation of yesterday has filled my mind with so many doubts, that it is no longer in my power to forget them. Nor do I see, meanwhile, any principle on which they can be resolved; and, just as if I had fallen all of a sudden into very deep water, I am so greatly disconcerted as to be unable either to plant my feet firmly on the bottom or sustain myself by swimming on the surface. I will, nevertheless, make an effort, and try anew the same path on which I had entered yesterday, that is, proceed by casting aside all that admits of the slightest doubt, not less than if I had discovered it to be absolutely false; and I will continue always in this track until I shall find something that is certain, or at least, if I can do nothing more, until I shall know with certainty that there is nothing certain. Archimedes, that he might transport the entire globe from the place it occupied to another, demanded only a point that was firm and immovable; so, also, I shall be entitled to entertain the highest expectations, if I am fortunate enough to discover only one thing that is certain and indubitable.

2. I suppose, accordingly, that all the things which I see are false (fictitious); I believe that none of those objects which my fallacious memory represents ever existed; I suppose that I possess no senses; I believe that body, figure, extension, motion, and place are merely fictions of my mind. What is there, then, that can be esteemed true? Perhaps this only, that there is absolutely nothing certain.

3. But how do I know that there is not something different altogether from the objects I have now enumerated, of which it is impossible to entertain the slightest doubt? Is there not a God, or some being, by whatever name I may designate him, who causes these thoughts to arise in my mind? But why suppose such a being, for it may be I myself am capable of producing them? Am I, then, at least not something? But I before denied that I possessed senses or a body; I hesitate, however, for what follows from that? Am I so dependent on the body and the senses that without these I cannot exist? But I had the persuasion that there was absolutely nothing in the world, that there was no sky and no earth, neither minds nor bodies; was I not, therefore, at the same time, persuaded that I did not exist? Far from it; I assuredly existed, since I was persuaded. But there is I know not what being, who is possessed at once of the highest power and the deepest cunning, who is constantly employing all his ingenuity in deceiving me. Doubtless, then, I exist, since I am deceived; and, let him deceive me as he may, he can never bring it about that I am nothing, so long as I shall be conscious that I am something. So that it must, in fine, be maintained, all things being maturely and carefully considered, that this proposition (pronunciatum) I am, I exist, is necessarily true each time it is expressed by me, or conceived in my mind.

4. But I do not yet know with sufficient clearness what I am, though assured that I am; and hence, in the next place, I must take care, lest perchance I inconsiderately substitute some other object in room of what is properly myself, and thus wander from truth, even in that knowledge (cognition) which I hold to be of all others the most certain and evident. For this reason, I will now consider anew what I formerly believed myself to be, before I entered on the present train of thought; and of my previous opinion I will retrench all that can in the least be invalidated by the grounds of doubt I have adduced, in order that there may at length remain nothing but what is certain and indubitable. ....

6. But [as to myself, what can I now say that I am], since I suppose there exists an extremely powerful, and, if I may so speak, malignant being, whose whole endeavors are directed toward deceiving me? Can I affirm that I possess any one of all those attributes of which I have lately spoken as belonging to the nature of body? After attentively considering them in my own mind, I find none of them that can properly be said to belong to myself. To recount them were idle and tedious. Let us pass, then, to the attributes of the soul. The first mentioned were the powers of nutrition and walking; but, if it be true that I have no body, it is true likewise that I am capable neither of walking nor of being nourished. Perception is another attribute of the soul; but perception too is impossible without the body; besides, I have frequently, during sleep, believed that I perceived objects which I afterward observed I did not in reality perceive. Thinking is another attribute of the soul; and here I discover what properly belongs to myself. This alone is inseparable from me. I am–I exist: this is certain; but how often? As often as I think; for perhaps it would even happen, if I should wholly cease to think, that I should at the same time altogether cease to be. I now admit nothing that is not necessarily true. I am therefore, precisely speaking, only a thinking thing, that is, a mind (mens sive animus), understanding, or reason, terms whose signification was before unknown to me. I am, however, a real thing, and really existent; but what thing? The answer was, a thinking thing. ....

8. But what, then, am I? A thinking thing, it has been said. But what is a thinking thing? It is a thing that doubts, understands, [conceives], affirms, denies, wills, refuses; that imagines also, and perceives.

9. Assuredly it is not little, if all these properties belong to my nature. But why should they not belong to it? Am I not that very being who now doubts of almost everything; who, for all that, understands and conceives certain things; who affirms one alone as true, and denies the others; who desires to know more of them, and does not wish to be deceived; who imagines many things, sometimes even despite his will; and is likewise percipient of many, as if through the medium of the senses. Is there nothing of all this as true as that I am, even although I should be always dreaming, and although he who gave me being employed all his ingenuity to deceive me? Is there also any one of these attributes that can be properly distinguished from my thought, or that can be said to be separate from myself? For it is of itself so evident that it is I who doubt, I who understand, and I who desire, that it is here unnecessary to add anything by way of rendering it more clear. And I am as certainly the same being who imagines; for although it may be (as I before supposed) that nothing I imagine is true, still the power of imagination does not cease really to exist in me and to form part of my thought. In fine, I am the same being who perceives, that is, who apprehends certain objects as by the organs of sense, since, in truth, I see light, hear a noise, and feel heat. But it will be said that these presentations are false, and that I am dreaming. Let it be so. At all events it is certain that I seem to see light, hear a noise, and feel heat; this cannot be false, and this is what in me is properly called perceiving (sentire), which is nothing else than thinking.

* From http://philos.wright.edu/Descartes/Meditations.html

Quotes from Comenius *

Education for Everyone

Not the children of the rich or of the powerful only, but of all alike, boys and girls, both noble and ignoble, rich and poor, in all cities and towns, villages and hamlets, should be sent to school.

Education is indeed necessary for all, and this is evident if we consider the different degrees of ability. No one doubts that those who are stupid need instruction, that they may shake off their natural dullness. But in reality those who are clever need it far more, since an active mind, if not occupied with useful things, will busy itself with what is useless, curious, and pernicious.

Learning is Natural

Who is there that does not always desire to see, hear, or handle something new? To whom is it not a pleasure to go to some new place daily, to converse with someone, to narrate something, or have some fresh experience? In a word, the eyes, the ears, the sense of touch, the mind itself, are, in their search for food, ever carried beyond themselves; for to an active nature nothing is so intolerable as sloth.

The proper education of the young does not consist in stuffing their heads with a mass of words, sentences, and ideas dragged together out of various authors, but in opening up their understanding to the outer world, so that a living stream may flow from their own minds, just as leaves, flowers, and fruit spring from the bud on a tree.

Learning by Easy Stages

There is in the world no rock or tower of such a height that it cannot be scaled by any man (provided he lack not feet) if ladders are placed in the proper position or steps are cut in the rock, made in the right place, and furnished with railings against the danger of falling over.

If we examine ourselves, we see that our faculties grow in such a manner that what goes before paves the way for what comes after.

Play

Much can be learned in play that will afterwards be of use when the circumstances demand it.

A tree must also transpire, and needs to be copiously refreshed by wind, rain, and frost; otherwise it easily falls into bad condition, and becomes barren. In the same way the human body needs movement, excitement, and exercise, and in daily life these must be supplied, either artificially or naturally.


Lifelong Learning

If, in each hour, a man could learn a single fragment of some branch of knowledge, a single rule of some mechanical art, a single pleasing story or proverb (the acquisition of which would require no effort), what a vast stock of learning he might lay by. Seneca is therefore right when he says: "Life is long, if we know how to use it." It is consequently of importance that we understand the art of making the very best use of our lives.

Aristotle compared the mind of man to a blank tablet on which nothing was written, but on which all things could be engraved. There is, however, this difference, that on the tablet the writing is limited by space, while in the case of the mind, you may continually go on writing and engraving without finding any boundary, because, as has already been shown, the mind is without limit.

Humanity

We are all citizens of one world, we are all of one blood. To hate a man because he was born in another country, because he speaks a different language, or because he takes a different view on this subject or that, is a great folly. Desist, I implore you, for we are all equally human.... Let us have but one end in view, the welfare of humanity; and let us put aside all selfishness in considerations of language, nationality, or religion.

Timeline: 1400 to 1800 AD

Map: Europe 1700

Epistemology

Epistemology is that part of philosophy that asks "what can we know?" "What can we be sure of?" "How do we get beyond mere opinion to real knowledge?"

Traditionally, there are two approaches to epistemology: rationalism, which says we gain knowledge through reasoning, and empiricism, which says we gain knowledge through sensory experience. Although there are a few extremists, most philosophers agree that both of these approaches to knowledge are needed, and that to some extent they support and correct each other. More on that in a moment.

Rationalists focus on what they call necessary truth. By that they mean that certain things are necessarily true, always, universally. Another term that means the same thing is a priori truth. A priori is Latin for "beforehand," so a priori truth is something you know must be true before you even start looking at the world the senses reveal to us.

The most basic form of necessary truth is the self-evident truth. Self-evident means you don’t really even have to think about it. It has to be true. The truths of mathematics, for example, are often thought of as self-evident. One plus one equals two. You don’t need to go all over the world counting things to prove this. In fact, one plus one equals two is something you need to believe before you can count at all!

(One of the criticisms that empiricists would put forth is that "one plus one is two" is trivial. It is tautological, meaning it is true, sure, but not because it is self-evident: It is true because we made it that way. One plus one is the definition of two, and so with the rest of mathematics. We created math in such a way that it works consistently for us!)

Other self-evident truths that have been put forth over the years include "you can’t be in two places at once," "something either is or it isn’t," "everything exists." These are pretty good candidates, don’t you think? But often, what is self-evident to one person is not self-evident to another. "God exists" is perhaps the most obvious one – some people disagree with it quite vigorously. Or "the universe had to have a beginning" – some people believe it has always been. A familiar use of the phrase "self-evident" is Thomas Jefferson's use of it in the Declaration of Independence: "We hold these truths to be self-evident: That all men are created equal...." But it is pretty obvious to most that this is not, really, true. Instead, it is a rhetorical device, that is, it sounds good to put it that way!

In order to reason our way to more complex knowledge, we have to add deduction (also known as analytic truth) to the picture. This is what we usually think of when we think of thinking: With the rules of logic, we can discover what truths follow from other truths. The basic form of this is the syllogism, a pattern invented by Aristotle which has continued to be the foundation of logic to the present day.

The traditional example is this one, called modus ponens: "Men are mortal. Socrates is a man. Therefore Socrates is mortal." If x, then y (if you are human, then you are mortal). X (you are human). Therefore, y (you are mortal). This result will always be true, if the first two parts are true. So we can create whole systems of knowledge by using more and more of these logical deductions!

Another syllogism that always works is in the form "If x, then y. Not y. Therefore not x." If you are human, then you are mortal. You are not mortal. Therefore, you are not human. If the first two parts are true, then the last one is necessarily true. This one is called modus tollens.

On the other hand, there are two examples that don’t work, even though they sound an awful lot like the ones I just showed you: If x, then y. Not x. Therefore not y. If you are human, then you are mortal. You are not human. Therefore you are not mortal. That, of course, would come as a big surprise to animals! Or look at this example: "If God would show himself to me personally, that would prove the truth of religion. But he hasn’t done so. Therefore, religion is false." It sounds like a reasonable argument, but it is not. (This is called denial of the antecedent.)

Another goes like this: If x, then y. Y. Therefore x. If you are human, then you are mortal. You are mortal. Therefore you are human. Or try this one: "If God created the universe, we would see order in nature. We do in fact see order in the universe – the laws of nature! Therefore, God must have created the universe." It sounds good, doesn’t it? But it is not at all logical: The order in the universe could have another cause. (This is called affirmation of the consequent.)
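
These valid and invalid forms are easy to check mechanically. Here is a minimal sketch (my own illustration in Python, not part of the original text) that tests all four patterns by brute-force truth tables: a form is valid only if the conclusion comes out true in every case where all of its premises are true.

```python
# Check the four argument forms discussed above with brute-force truth tables.
from itertools import product

def implies(p, q):
    # Material implication: "if p then q" fails only when p is true and q is false.
    return (not p) or q

# Each form yields (premises, conclusion) for given truth values of x and y.
FORMS = {
    "modus ponens":                  lambda x, y: ([implies(x, y), x], y),
    "modus tollens":                 lambda x, y: ([implies(x, y), not y], not x),
    "denial of the antecedent":      lambda x, y: ([implies(x, y), not x], not y),
    "affirmation of the consequent": lambda x, y: ([implies(x, y), y], x),
}

def is_valid(form):
    # Try every assignment of truth values to x and y.
    for x, y in product([True, False], repeat=2):
        premises, conclusion = form(x, y)
        if all(premises) and not conclusion:
            return False    # counterexample found: true premises, false conclusion
    return True

for name, form in FORMS.items():
    print(f"{name}: {'valid' if is_valid(form) else 'INVALID'}")
# Only modus ponens and modus tollens come out valid, just as described above.
```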

There are many types of rationalism, and we usually refer to them by their creators. The best known, of course, is Plato’s (and Socrates’). Aristotle, although he pretty much invented modern logic, is not entirely a rationalist – he was also interested in the truths of the senses. The most magnificent example of rationalism is Benedict Spinoza’s. In a book called Ethics, he began with one self-evident truth: God exists. By God, he meant the entire universe, both physical and spiritual, so his truth does seem pretty self-evident: Everything that is, is! But from that truth, he carefully, step by step, reasons his way to a very sophisticated system of metaphysics, ethics, and psychology.

Now let’s turn to empiricism. Empiricism focuses, logically enough, on empirical truth (also known as synthetic truth), which we derive from our sensory experience of the world.

Many people think that empiricism is the same thing as science. That is an unfortunate mistake. The reason that empiricism is so closely tied in our minds to science is really more historical than philosophical: After many centuries of religious rationalism dominating European thinking, people like Galileo and Francis Bacon came out and said, hey, how about paying some attention to the world out there, instead of just trying to derive truth from the scriptures? The stage for this change in attitude was, in fact, already set by St. Thomas Aquinas, who at least felt that scriptural truth and empirical truth need not conflict!

The simplest form of empirical truth is that based on direct observation – taking a good hard look. Now this is not the same as anecdotal evidence, such as "I know a guy who has a cousin in Topeka who married a woman whose college roommate saw a UFO." It’s not really even the same as "I saw a UFO." It means that there is an observation that I made that you can make, too, and that, were it possible, everyone should be able to make. In other words, here’s a UFO: Take a look!

(Rationalists would argue, of course, that we could very well ALL be having an hallucination!)

In order to build a more complex body of knowledge from these direct observations, we must make use of induction, also known as indirect empirical knowledge. We take the observations and carefully stretch them to cover more ground than we could actually cover directly. The basic form of this is called generalization. Say you see that a certain metal melts at a certain temperature. In fact, you’ve seen it many times, and you’ve shown it to others. At some point, you make the inductive leap and say "the melting point of this metal is so many degrees." Now it’s true that you haven’t melted every bit of this metal in the universe, but you feel reasonably confident that (under the same conditions) it will melt at so many degrees. That’s generalization.

You can see that this is where statistics comes in, especially in a more wishy-washy science like psychology. How many observations do you need to make before you can comfortably generalize? How many exceptions to the desired result can you explain away as some sort of methodological error before it gets to be too much? What are the odds that my observation is actually true beyond these few instances of it?
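
To make the melting-point example concrete, here is a small sketch of how such a generalization is usually reported (the measurements are invented for illustration): as a sample mean with a confidence interval, rather than as a certainty.

```python
# Generalizing from repeated observations: report a mean and a rough interval.
import statistics

# Hypothetical repeated measurements of a melting point, in degrees Celsius.
observations = [231.8, 232.1, 231.9, 232.0, 232.2, 231.7, 232.0, 231.9]

n = len(observations)
mean = statistics.mean(observations)
sd = statistics.stdev(observations)    # sample standard deviation
sem = sd / n ** 0.5                    # standard error of the mean

# Rough 95% interval using the normal approximation (1.96 standard errors).
low, high = mean - 1.96 * sem, mean + 1.96 * sem

print(f"After {n} observations: melting point is about {mean:.2f} degrees")
print(f"Rough 95% confidence interval: ({low:.2f}, {high:.2f})")
```

The more observations we make, the narrower the interval gets – which is the statistical answer to "how many observations do you need before you can comfortably generalize?"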

Just as there are different styles of rationalism, there are different types of empiricism. In this case, the types have names of their own. Most empirical approaches are forms of epistemological realism, which says that what the senses show us is reality, is the truth.

The basic form of realism is direct realism (also known as simple or "naive" realism – the latter obviously used by those who disagree with it!). Direct realism says that what you see is what you get: The senses portray the world accurately. The Scottish philosopher Thomas Reid is the best known direct realist.

The other kind is called critical (or representative) realism, which suggests that we see sensations, the images of the things in the real world, not the things directly. Critical realists, like rationalists, point out how often our eyes (and other senses) deceive us. One famous example is the way a stick jutting out of the water seems to be bent at the point it comes out. Take it out of the water, and you find it is straight. Apparently, something about the optics of air and water leads to this illusion. So what we really see are sensations, which are representations of what is real. Descartes and Locke were both critical realists. So are the majority of psychologists who study sensation, perception, and cognition.

But, to give Reid his due, a direct realist would respond to the critical realist that what we call illusions are actually matters of insufficient information. We don’t perceive the world in flash photos: We move around, move our eyes and ears, use all our senses.... To go back to the stick, a complete empirical experience of it would include seeing it from all directions, perhaps even removing it. Then we will see not only the real stick, just as it is, but the laws of air-water optics as well! A modern direct realist is the psychologist J. J. Gibson.

There is a third, rather unusual form of empiricism called subjective idealism that is most associated with Bishop George Berkeley. As an idealist in terms of his metaphysics, he argued that what we see is actually already a psychological or mental thing to begin with. In fact, if we don’t see it, it isn’t really there: "To be is to be perceived" is how he put it. Now, this doesn’t mean that the table you are sitting at simply ceases to be when you leave the room: God’s mind is always present to maintain the table’s existence!

There is this famous question: "If a tree falls in the woods, and there is no one there to hear it, does it make a sound?" The subjective idealist answer is yes, it does, because God is always there.

Another way to look at these three empirical approaches is like this: Critical realism postulates two steps to experiencing the world. First there is the thing itself and the light or sounds (etc.) it gives off. Second, there is the mental processing that goes on sometime after that light hits our retinas, or the sound hits our eardrums. Direct realism says that the first step is enough. Subjective idealism says that the second step is all there is.

(An old story tells about three baseball umpires bragging about their abilities. The first one says "I call 'em as I see 'em!" The second one says "Well, I call 'em as they are!" And the third one says "Shoot, they ain't anything till I call 'em!" The first is a critical realist, the second a direct realist, and the third is a subjective idealist.)

As I said at the beginning of this section, rationalism and empiricism don’t really have to remain antagonistic, and in fact they haven’t. It could even be said that science is a very well balanced blend of the two, where each serves, like the branches of government, as a check and balance to the other.

The traditional, ideal picture of science looks like this: Let’s start with a theory about how the world works. From this theory we deduce, using our best logic, a hypothesis, a guess, regarding what we will find in the world of our senses, moving from the general to the specific. This is rationalism. Then, when we observe what happens in the world of our senses, we take that information and inductively support or alter our theory, moving from the specific to the general. This is empiricism. And then we start again around the circle. So science combines empiricism and rationalism into a cycle of progressive knowledge.

Now notice some of the problems science runs into: If my theory is true then my hypothesis will be supported by observation and/or experiment. But notice: If my hypothesis is supported that does not mean that my theory is true. It just means that my theory is not necessarily wrong! On the other hand, if my hypothesis is not supported, that does in fact mean that my theory is wrong (assuming everything else is right and proper). So, in science, we never have a theory we can say is unequivocally true. We only have theories that have stood the test of time. They haven’t been shown to be false... yet!

This is one of the things that most people don’t seem to understand about science. For example, people who prefer creationism over evolution will say that, since evolution is "only a theory," then creationism is just as legitimate. But evolution has been tested again and again and again, and the observations scientists have made since Darwin have held up tremendously well. It's like saying that a thoroughbred race horse is "just a horse," and therefore any old nag is just as good!

On the other hand, creationism fails quickly and easily. Carbon dating shows that the world is far older than creationists suggest. There are fossils of species that no longer exist. There is a notable lack of fossils of human beings during the dinosaur age. There are intermediate fossils that show connections between species. There are examples of species changing right before our eyes. There is a vast body of related knowledge concerning genetics. But with every piece of evidence shown to the creationists, they respond with what the logicians call an ad hoc argument.

An ad hoc argument is one that is created after the fact, in an attempt to deal with an unforeseen problem, instead of being a part of the theory from the beginning. So, if there is a rock that is too old, or a fossil that shouldn’t be, the creationist might respond with "well, God put that there in order to test our faith," or "the days in Genesis were actually millions of years long" or "mysterious are the ways of the Lord." Obviously, creationism is based on faith, not science.

Science is always a work in progress. No one believes in evolution, or the theory of relativity, or the laws of thermodynamics, the same way that someone believes in God, angels, or the Bible. Rather, we accept evolution (etc.) as the best explanation available for now, the one that has the best reasoning working for it, the one that fits best with the evidence we have. Science is not a matter of faith.

Science is, of course, embedded in society and influenced by culture and, like any human endeavor, it can be warped by greed and pride and simple incompetence. Scientists may be corrupt, scientific organizations may be dominated by some special interest group or another, experimental results may be falsified, studies may be poorly constructed, scientific results may be used to support bad policy decisions, and on and on. But science is really just this method of gaining knowledge – not knowledge we can necessarily be certain about, but knowledge that we can rely upon and use with some confidence. For all the negatives, it has been the most successful method we have tried.

Modern Philosophy: The Enlightenment

The 1600's were among the most exciting times for philosophy since ancient Athens. Although the power of religion was still immense, we begin to see pockets of tolerance in different places and at different times, where a great mind could really fly. England was fairly tolerant, if only because of its diversity. Holland was the best place to be: a small country fighting off attacks, military and economic, from every side, it needed all the support it could get, whatever one's religion, denomination, or even heresy.

The central issues were the same as those of the ancient Greeks: What is the world made of? How do we know anything for certain? What is the difference between good and evil? But they are now informed with centuries of science, literature, history, multicultural experiences, and, of course, written philosophy. Perhaps we have to admit that the modern philosophers are only elaborating on the ancient Greeks, but what elaboration! Was Rembrandt only doodling?

I will approach this era philosopher-by-philosopher, showing, I hope, the "battles" between materialism (e.g. Hobbes) and idealism (Berkeley), between empiricism (Locke) and rationalism (Spinoza), and between faith (Leibniz) and atheism (Bayle).

Thomas Hobbes (1588-1679)

Thomas Hobbes was born on April 5, 1588. His father, an Anglican clergyman, left the family when Thomas was still young. Fortunately, his older brother did well for himself, and sent Thomas to Oxford. He served for a while as secretary to Francis Bacon. Travelling around Europe, he paid a visit to Galileo. He spent eleven years in Paris, and was tutor there to the exiled Prince of Wales (who would become Charles II).

In 1651, he wrote The Leviathan, a book presumably concerning politics, but covering much else besides. The book is named for a sea monster in the book of Job in the Bible. It was meant to be a symbol of God’s power, but Hobbes used it to symbolize the state.

Hobbes thought of himself as a scientist, but he was really more of a rationalist: Truth can be had if we only make sure to define our terms well and reason logically! But his conclusions were empiricist: Nothing is in the mind that isn’t first in the senses. This in turn led him to a pure materialism: All qualities are really matter in motion. Things "of the mind," such as memories and imagination, are just sense images decaying, and all in the form of matter in motion in the brain.

Will to Hobbes is just the last desire you have before you take action on it – hence free will is an absurdity. All motivation is selfish, and ultimately tied to survival. The basic negative emotion is fear, the basic positive emotion is desire for power. Good and bad are purely subjective matters. And so he goes beyond Descartes: Not only are animals just machines, so are we. B. F. Skinner was an admirer of Hobbes.

Because good and bad are subjective and we are selfishly motivated, we will do whatever we need to do to satisfy our needs. Society must therefore control the individual if we are to have any peace at all! So society develops systems of rewards and punishments, social approval and social censure. Leviathan – the commonwealth – is that necessary evil.

Presaging Rousseau, he suggested we submit to society in order to avoid a purely primitive life, which he characterized as "nasty, brutish, and short." But, in contrast to Rousseau, he felt that society is an arrangement made between ruler and ruled, not among equals. Ultimately, the king must have absolute power for civilization to survive. Democracy, he says, is just rule by orator-demagogues who easily manipulate the mob.

Religion, too, is a device for keeping the peace. It is nothing more than a fear of invisible powers that the mob has accepted as legitimate. Superstition is the same thing, just not accepted as legitimate! I should note that Hobbes was not an atheist: He was a deist, meaning that he believed in a creator, an intelligent prime mover who started all this, but one who does not need to intervene once his mechanical laws of nature take effect.

When he returned to England, he found himself confronted with many critics. Fortunately for Hobbes, his old pupil, now King Charles II, took him in and set him up with a nice pension. He died December 4, 1679, at the age of 91.

Benedictus Spinoza (1632-1677)

Baruch Spinoza was born in Amsterdam on November 24, 1632. His parents were Portuguese Jews who had escaped from the persecution they suffered in their homeland. Sadly, his mother died when Baruch was only six.

He received a religious education, but his father instructed him in various secular subjects in the hopes that Baruch would take on a business career. Baruch became fluent in many languages, and had a particular love for math, especially geometry. His father died in 1654, when Baruch was 22.

Discussing his beliefs with his friends, he admitted to doubting many of their traditional religious beliefs, such as life after death. They reported him to the synagogue soon after. After trying to persuade him to keep his opinions to himself, the rabbis excommunicated him in 1656. At that time, excommunication (Jewish or Christian) included the practice of shunning – i.e. no one in the community was to speak or correspond with him in any fashion.

But Baruch – now called Benedictus ("blessed," the Latin for the Hebrew baruch) – had many friends outside the Jewish community, and they would protect him all his life. Nevertheless, he was forced to move to Rijnsburg, a small town, in 1660 after a death threat, and again in 1663 to Voorburg near the Hague, and finally to the Hague itself.

He supported himself throughout as a lens-maker. At this time, that occupation included not only the making of glasses, but of lenses for telescopes and microscopes – the latest thing in technology! He conducted a variety of experiments as well. Unfortunately, the constant exposure to glass dust was to take a toll on Benedictus’ lungs.

He published, anonymously, the Treatise on Theology and Politics in 1670. This was a devastating critique of Biblical literalism, and was immediately condemned by the religious community of Holland.

His most important work, Ethics, was begun all the way back in 1662. He tried to publish it in 1675, but was frightened off by rumors that his life would be in jeopardy if he did so. He died of tuberculosis two years later, February 20, 1677, at the age of 45. His friends published Ethics and other unpublished works in his honor in that same year.

The full title of the book was Ethics, Demonstrated in the Fashion of Geometry, because he laid out his arguments in the same way that a mathematician might lay out a geometric proof. This is certainly a rigorous way of writing philosophy, but it does make it hard to read. (Dagobert Runes edited The Ethics of Spinoza in 1957 so that it would be more readable for modern students.)

According to Spinoza, Substance (that which underlies all reality, also known as Existence or Being) has two attributes (sides or aspects). If we look at reality from one angle, through the senses, we see it as matter. But if we look at it within ourselves, we see it as thought. He suggested that there were an infinite number of aspects, but those two are the only ones evident to human beings.

So, the body (or brain) and the mind (or soul) are one and the same thing seen from two different perspectives. Where there is material activity, there is thought. Where there is thought, there is material activity. Not all thought is available to what I perceive as myself: Much of it remains unconscious. But it goes on nonetheless.

This "double-aspectism" sounds great, but it does bring us to panpsychism. Panpsychism is the idea that every material thing has a mental side to it (and vice versa). People have minds, animals have minds, plants have minds, even rocks and houses have minds. The earth itself has a mind. Of course, as we move away from people, those minds are increasingly unconscious and lacking in a sense of self, but still....

It also leads to Spinoza’s most famous concept, the one he actually based the rest of his theory upon: God and Nature are one and the same, and identical with all of Existence, mental and physical. God is the mind of the universe; the universe is the body of God. This is often called pantheism – God is everywhere and in everything – but in his day, it was called atheism.

Like Hobbes, Spinoza is a mechanist. He believes only in determinism, not free will. For us as humans, this determinism comes in the form of desires, which derive from our need to survive. All things, he says, have the motive of self-preservation, all things are "selfish."

He says that we strive to increase our power, that is, our capacity to preserve ourselves. Then he identifies this power with virtue! So the good is defined as what is useful to us, and the bad as what is damaging to us. Good advances our well-being, bad decreases our well-being. Good we perceive as pleasure, bad is perceived as pain.

But, we have many desires. Usually, one outweighs the others and we do what we desire most. But often they conflict. This conflict itself decreases our well-being and so is painful. What do we do to make our lives less painful then?

Society helps to some extent. By providing rewards and punishments, praise and blame, it adds new items to our list of desires that may outweigh certain desires and support others. Ultimately, society instills a conscience in most of us. Spinoza saw conscience as learned, not innate.

Ultimately, we must rely on ourselves: First, Spinoza says, we have to gain some control over our desires. When they are out of our control, when they instead control us, he calls them passions. They are out of our control because they operate unconsciously and so are not available to reason. By getting a "clear idea" of them, we turn them into simple emotions, which are amenable to reason. Freud would say, three centuries later, that we must "make the unconscious conscious!"

One way to turn a passion into an emotion, incidentally, is to trace its roots. If you can see where it came from, its operations become clear – conscious – and you can better deal with it.

Another way to deal with passions is to see the necessity of things. Nature is what it is, God wills what he wills, and no one can change that. Surrender to the inevitable, and you will be much more peaceful, at least. A wise person, for example, sees that getting angry at unpleasant people isn’t going to change them. In fact, it only hurts you. Being kind to others, on the other hand, is usually rewarded, and it takes much less out of you. Along with Buddha and Jesus, Spinoza said that love can conquer hate.

He also said that wise people "desire nothing for themselves which they do not also desire for the rest of mankind" (Ethics, iv, 48). This presages Kant’s categorical imperative by a century.

But only an emotion can overcome another emotion. Therefore, reason must itself become an emotion – a powerful one – in order that it may outweigh others. He calls this powerful emotion "the intellectual love of God," which of course means love of nature as well. It also includes the acceptance of God’s will – or natural law. Knowledge of God/Nature is the ultimate virtue, and the ultimate pleasure!

Dismissed by the English as an atheist and by the French as too religious, Spinoza would have great influence on upcoming German philosophers, including Goethe, Hegel, Schopenhauer, and Nietzsche. And it is in Germany where psychology was to flourish.

John Locke (1632-1704)

John Locke is sometimes called "the father of the enlightenment." He was born August 29, 1632, the same year as Spinoza. His father was an attorney and a Puritan, who taught young John the value of representation and religious freedom. John’s father died of tuberculosis when John was 29, leaving him with a small inheritance.

John went to Oxford, received his master's degree, and taught there. He later studied medicine and became the personal physician to the Earl of Shaftesbury (grandfather to the philosopher of the same title).

Beginning in 1675, Locke studied in France. When he returned, he found the political climate under James II less than congenial, and so moved to Holland. It was there that he wrote his great psychological work, Essay concerning Human Understanding.

In 1689, he returned to England after William and Mary took the throne from James II. There he published his works – the Essay, his two Treatises on Government, and two letters concerning the need for religious tolerance. In 1691, he retired to a friend’s mansion, and died in 1704 at the age of 72.

His Treatises alone would assure him a place in history near the top. In them, he outlined the basics of representative government, including natural rights, consent of the governed, the protection of property, religious tolerance, separation of church and state, and the checks and balances between executive and legislative branches. His ideas would become the foundation of the Declaration of Independence, the American Constitution, especially the Bill of Rights, and the French Declaration of the Rights of Man. Not bad.

Unlike Hobbes, Locke sees people as having a positive nature, one that contains instincts for social good and the ability to reason. Since our nature is positive, we should allow ourselves and others the freedom to develop that nature. For this reason, we must each surrender some degree of freedom in order that others may likewise be free to develop their potentials.

Laws are created, not to keep us from destroying each other, but to allow us to express our positive, rational natures. And so a government is legitimate only if its laws promote that which is our nature – to be free and rational. And it can do this only if it is based on the consent of the governed! If Hobbes reminds you of Skinner, Locke should remind you of Carl Rogers.

His Essay concerning Human Understanding attacked another popular idea of his time: Many scholars believed that the idea of God and the ideas of good and evil are planted in our minds at birth, perhaps by God himself. It was said that these ideas were innate. But when Locke looked at the variety of beliefs, non-beliefs, and moralities, he concluded that these things could not possibly be innate.

He admits, of course, that there are reflexes and instincts and the like, but these are just physiological sequences of movement, not ideas! There are some ideas, learned from experience, that are learned so early and reinforced so consistently, that they have the appearance of being innate. But that’s only an appearance!

In the course of arguing that there are no innate ideas, he also sets the stage for two future arguments, taken up by Berkeley and Hume. First, he notices that if we try to find matter, we see nothing but qualities that we attribute to matter – but never matter itself. The idea of matter is not empirical! This would be elaborated by Berkeley.

Second, he notices that if we try to find mind, all we see are the qualities we attribute to mind. Never do we see, empirically, a mind at all! This would be elaborated by Hume.

Locke doesn’t make the leaps that Berkeley and Hume will, however. He is too practical for that. He says we are no doubt correct in believing in matter and mind. Life makes little sense without them. And yet, they are not empirically verifiable. He is sometimes called a metaphysical agnostic: He believes that there is mind and matter (and that they do interact, somehow), but no one can prove their existence.

Locke’s ideas were adopted enthusiastically by French philosophers as well as English (and American) thinkers. They would translate him into a revolutionary, and his philosophy of human nature into Sensationism and Mechanism.

George Berkeley (1685-1753)

George Berkeley was born March 12, 1685 at Dysert Castle in Ireland. He went to Trinity College in Dublin, where, among other things, he studied John Locke.

In 1709, he wrote An Essay Towards a New Theory of Vision. He asked, if a man, born blind, recovered his sight, what would he see? Berkeley reasoned he would see a meaningless array of qualities, which he would interpret as in his mind, and certainly not extended further than his eyes. Only repeated connection between the sights he sees and those same objects touched would lead him to learn shapes, distances, and so on. Later operations actually restoring people’s sight supported his theory.

Space (extension), therefore, is a mental construct, a matter of coordinating the relationships between what we see and what we experience through touch. We will see this idea of space as a mental thing again in Kant’s theory.

In 1710, he wrote The Principles of Human Knowledge. If, as Locke said, all knowledge comes through the senses, then we can know nothing that does not come through the senses. Extension in space, the shapes of things, their resistance to touch, their colors, tastes, smells,... all these do in fact come through the senses. But when does matter come through the senses? When do you see matter, or feel it, or taste it? All you ever experience through the senses are qualities, never a substance!

Matter is therefore a theory without evidence. Since the atheism of Berkeley’s day relied a great deal on materialism, he felt he had landed a knock-out punch!

Of course, it’s not just atheists who believe in matter – nearly everyone does. It’s "common sense." Dr. Johnson thought he gave the perfect rebuttal to Berkeley’s idea when he kicked a rock as hard as he could: The pain that rock caused him could hardly be denied! But Berkeley would (and did) note that all anyone could know about the rock was its shape, location, color, i.e. information of the senses, including the sense of pain if you are stupid enough to kick it.

Esse est percipi, Berkeley said: To be is to be perceived.

So what happens to things when we are not looking at them, touching them, or kicking them? Do they vanish every time we turn around? Berkeley said of course not! Things – as collections of qualities – always remain, but in God’s mind, which encompasses everything.

When a tree falls in the forest, and there’s no one to hear it, does it make a sound? Berkeley would say it does, because God hears it. This is perhaps the purest, and most eloquent, version of idealism ever. Only the Mahayana (northern) Buddhists have a similar idea in their "mind-only" philosophy. In their case, they refer not to God but to Buddha-mind.

Berkeley went on to spend some time in Rhode Island, waiting for a grant to start up a college in Bermuda – a grant which never arrived. Berkeley, California, was named for him. He became the (Anglican) Bishop of Cloyne in 1734, and died at Oxford in 1753 at the age of 67.

Gottfried Wilhelm Leibniz (1646-1716)

Leibniz was born June 21, 1646. His father was a professor of philosophy at the University of Leipzig. Little Gottfried was a boy genius (of course) and received his doctorate at the age of 20. He spent some time gallivanting around Europe, tasting just about every philosophy the continent had to offer.

In 1672, he went to France as a diplomat. There he would begin to invent calculus, as well as a calculator that could multiply and divide. In 1676, he visited Spinoza in Holland (where he read the manuscript for Ethics), and then went on to Hanover to serve the prince there. In 1700, he founded the Berlin Academy.

His major life’s project was to reconcile Catholicism and Protestantism. He failed, obviously. It will take a lot more than genius to reconcile those two!

His major work, as far as psychology is concerned, is New Essays on Human Understanding, a refutation of Locke written in 1703, but not published until 1765.

His basic point was that the mind is not a passive "tabula rasa" (clean slate or piece of paper) upon which experience writes, as Locke and Aristotle suggested. The mind is a complex thing that works on and transforms experience. "Nothing is in the mind that has not been in the senses," he said, paraphrasing Locke, "except the mind itself." This would inspire Kant, and many psychologists in more recent times.

Leibniz also suggested that while we are alive, the mind is never entirely at rest, even in deep sleep. In fact, it is often functioning even when we are not conscious of it doing so. It was this conception of the unconscious that would most influence Schopenhauer and, later, Freud.

Leibniz had a very unusual metaphysics. He started with the same sort of skeptical approach as Descartes. But he ended with an idealistic metaphysics called monadology that outdoes even Berkeley's metaphysics. Monads are souls. Each soul contains within it the "perception" of the entire universe. It's not that there is an entire universe outside our souls which we all perceive as an object – souls are all there is!

We often experience ourselves as interacting with others – "monad a monad," you might say. But Leibniz makes it clear that we are only apparently interacting, each within our own internal universe. Monads, he tells us, are "window-less."

We consciously perceive only a small piece of this internal universe – our "point-of-view," we could say. I am not aware, however, of what the insides of my stomach look like, or what thoughts you are having at this moment, or what's happening on some planet circling Alpha Centauri. All that and more is "in" me, but is only perceived unconsciously.

Although each soul has its own "point-of-view," all souls contain the same total perception of the universe. This is what he called harmony. But some souls have a clearer, more complete, more conscious, view of the universe within than others do. Only one soul is totally conscious, or, if you like, contains all "points-of-view." That soul is God.

Leibniz became increasingly isolated and impoverished over time, being without a political sponsor. He died alone in 1716, and his funeral was attended only by his secretary.

Pierre Bayle (1647-1706)

Pierre Bayle was born November 18, 1647, the son of a Huguenot (Protestant) minister in southern France. He was sent to a Jesuit college in order to get the best education, and was converted to Catholicism there. When he returned, he converted back to Protestantism! This made him a relapsed heretic, a very dangerous thing to be at the time.

So his father sent him to Geneva to study, where he discovered Descartes. He taught for a while in France, but then found it necessary to escape to Rotterdam, in Holland, where he eventually became a professor. He suffered from headaches and depression and never married.

In 1682, he anonymously published Diverse Thoughts on the Comet. Referring to a recent comet that had everyone abuzz, he wrote against the various superstitions of his day and the belief in miracles. In the book, he noted that as far as actions and morality are concerned, he could see no difference between Catholics and Protestants, Christians and Jews and Moslems and pagans and even atheists!

In Amsterdam in 1684 he began a magazine called News of the Republic of Letters. He wrote all the articles himself! In the meantime, both his parents and his brother were killed during the persecution of the Huguenots. So he wrote a book on tolerance. But tolerance was not on the Protestant agenda, either, and he lost his professorship. "God preserve us from the Protestant Inquisition!" he wrote.

His major work was The Dictionary, which was really more of an encyclopedia of philosophy, religion, literature, etc. Writing 14 hours a day, he wrote 2600 pages. In this massive work, he "deconstructed" (as we would say nowadays) a great number of Biblical stories, religious beliefs, and philosophical theories, including such tidbits as the doctrine of original sin and the trinity. He even suggested that, if God and Satan actually exist, Satan is winning! He would always add, after making these extreme statements, that of course no good Christian would ever believe such a thing!

After years of condemnation by the religious establishment, he died of tuberculosis on December 28, 1706. But The Dictionary would become immensely popular among intellectuals throughout Europe, and have a great influence on thinkers for more than a century.

As we enter the 1700's, we find religion fighting a losing battle against the forces of reason and science. While average people still went to church, baptized their babies, and prayed for forgiveness, the educated elite turned to deism, pantheism, and even atheism. This included the intellectuals of Catholic France as well as future "founding fathers" in colonial America: Ben Franklin, Thomas Jefferson, James Madison, and even George Washington were deists, and John Adams was a Unitarian. Scientific discovery and invention would steamroller traditional society for the next 300 years. Psychology would attempt to follow, but would lag behind for some time to come!

Auguste Comte’s Positivist Calendar

Since the Enlightenment, philosophers and scientists have been trying to make human life a little more rational. Decimal-based monetary systems, for example, began in the US in 1786, at the insistence of Thomas Jefferson. The system was adopted in France in 1793, with decimes and centimes. These terms, meaning tenths and hundredths, became our dimes and cents. The Italians didn't adopt it until 1862, and it didn't reach the British until 1971!

Similarly, the French introduced the metric system in 1795. By the end of the 1900s, almost all countries had adopted it – the US being the notable exception this time!

Far more resistant to "rationalization" has been time. It does not seem that we will ever change the 60-second minute, 60-minute hour, and 24-hour day, but these are at least consistent and international. The calendar has likewise resisted change, but not for a lack of ideas! You see, the year of 365 days is 4 x 7 x 13 plus 1 (and another 1 on leap years), which means that there are several simple schemes we could be using. For example, we could have four seasons of thirteen weeks of seven days each (plus one, and another on leap years). Or....

Auguste Comte, in 1849, published a 13-month calendar, which he called The Positivist Calendar. It consisted of 13 months of 28 days each (exactly four weeks). There was one extra day at the end of the year which had no weekday assigned to it, and one more extra day in leap years. Every year begins on Monday, Moses 1. It begins with 1789 as year one, so the year 2000 would be 212. Each month would look exactly like this:

Mon Tue Wed Thu Fri Sat Sun
 1   2   3   4   5   6   7
 8   9  10  11  12  13  14
15  16  17  18  19  20  21
22  23  24  25  26  27  28

Here are the names of the months Comte proposed:

1. Moses
2. Homer
3. Aristotle
4. Archimedes
5. Caesar
6. St. Paul
7. Charlemagne
8. Dante
9. Gutenberg
10. Shakespeare
11. Descartes
12. Frederick II
13. Bichat

Individual days were dedicated to significant persons in fields related to the month, e.g...

• Moses 14 – Buddha
• Aristotle 21 – Socrates
• Gutenberg 7 – Columbus
• Shakespeare 28 – Mozart
• Descartes 28 – Hume
• Bichat 7 – Galileo

It never caught on.
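
Even so, the arithmetic behind Comte's scheme is simple enough to put into a few lines of code. This sketch (my own illustration of the rules described above, not anything Comte wrote) converts a day of the year into a Positivist date:

```python
# Convert a day of the year (1-based) into Comte's Positivist calendar.
MONTHS = ["Moses", "Homer", "Aristotle", "Archimedes", "Caesar", "St. Paul",
          "Charlemagne", "Dante", "Gutenberg", "Shakespeare", "Descartes",
          "Frederick II", "Bichat"]
WEEKDAYS = ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday",
            "Saturday", "Sunday"]

def positivist_date(day_of_year, leap=False):
    if day_of_year <= 13 * 28:
        month, day = divmod(day_of_year - 1, 28)
        # Every month is exactly four weeks, so each month (and year) starts Monday.
        return f"{WEEKDAYS[day % 7]}, {MONTHS[month]} {day + 1}"
    extra = day_of_year - 13 * 28    # day 365, or day 366 in leap years
    if extra == 1 or (leap and extra == 2):
        return "extra day (no weekday assigned)"
    raise ValueError("day_of_year out of range")

print(positivist_date(1))      # Monday, Moses 1
print(positivist_date(365))    # extra day (no weekday assigned)
# Year one is 1789, so the Gregorian year y is Positivist year y - 1788:
print(2000 - 1788)             # 212, matching the text above
```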

There were calendar suggestions both before and after Comte's. Perhaps the most famous is the French Revolutionary Calendar. Invented by a committee led by Fabre d'Églantine, it was adopted by the Convention in October, 1793. The calendar was divided into twelve months of thirty days each, leaving five days (six in leap years) over at the end of the last month. These five or six days were to be known as the Sans-culottides, and were to be a series of national holidays. Each month had three ten-day weeks called décades named arithmetically--primidi, duodi, tridi, quartidi, quintidi, sextidi, septidi, octidi, nonidi, décadi. The last day, décadi, was designated a day of rest. Dating was begun with Vendémiaire 1, year 1, corresponding to the fall equinox, September 22, 1792. Each month looked like this:

primidi duodi tridi quartidi quintidi sextidi septidi octidi nonidi décadi
   1      2     3      4        5        6       7      8      9     10
  11     12    13     14       15       16      17     18     19     20
  21     22    23     24       25       26      27     28     29     30

The names of the months:

Vendémiaire (vintage) sep 22 - oct 21
Brumaire (misty) oct 22 - nov 20
Frimaire (frosty) nov 21 - dec 20
Nivôse (snowy) dec 21 - jan 19
Pluviôse (rainy) jan 20 - feb 18
Ventôse (windy) feb 19 - mar 20
Germinal (seeds) mar 21 - apr 19
Floréal (blossoms) apr 20 - may 19
Prairial (meadows) may 20 - jun 18
Messidor (harvest) jun 19 - jul 18
Thermidor (hot) jul 19 - aug 17
Fructidor (fruitful) aug 18 - sep 16

The five Sans-culottides ran from sep 17 to sep 21, with a sixth day added in leap years. Sans-culottides literally means the time of no breeches. The "No-Breeches" were the working class revolutionaries, who wore long pants instead of breeches (culottes) that tied or buttoned just below the knee, which is what upper class men wore. These days were named...

Jour de la vertu (Virtue Day)
Jour du génie (Genius Day)
Jour du travail (Labor Day)
Jour de l'opinion (Reason Day)
Jour des récompenses (Rewards Day)
Jour de la révolution (Revolution Day) (the leap day)

It was used in France for twelve years, until Napoleon changed it back.
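
The Republican months were just as mechanical. Here is a tiny sketch of the décade cycle described above (illustrative only):

```python
# The ten-day "week" of the French Revolutionary calendar.
DECADE_DAYS = ["primidi", "duodi", "tridi", "quartidi", "quintidi",
               "sextidi", "septidi", "octidi", "nonidi", "décadi"]

def republican_day_name(day_of_month):
    # Each 30-day month holds exactly three décades, so the names just cycle.
    return DECADE_DAYS[(day_of_month - 1) % 10]

print(republican_day_name(1))     # primidi (opens the first décade)
print(republican_day_name(10))    # décadi (the day of rest)
print(republican_day_name(21))    # primidi (opens the third décade)
```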

Much more recently, the World Calendar was introduced by Elisabeth Achelis in 1930. She founded the World Calendar Association on October 21 of the same year. It was gaining great international support until World War II interrupted civilized discussion. It was reintroduced to the United Nations after the war, but world-wide adoption was thwarted by the United States in 1955. American politicians could not afford to alienate the religious right, who were upset by the idea that, once a year, there would be an extra day, making it eight days between one Sunday and the next – not in keeping with Biblical tradition! The organization moved to Ottawa and became the International World Calendar Association. Sadly, the UN setback led to the demoralization of World Calendar supporters. Now, with the advent of the internet, there is again a small movement voicing support for adoption of the World Calendar.

The World Calendar consists of 12 months, divided into four quarters. Each quarter begins on a Sunday, with a 31-day month. This is followed by two 30-day months. At the end of the year, an extra day is appended to bring the total number of days to 365. This is called World Day and does not have a weekday designation. It is conceived of as an international holiday much like New Year's Eve. Every fourth year, an extra day is added to the sixth month. It too has no weekday designation, and is thought of as an international holiday. By this method, we would have the same calendar for every year.

January/April/July/October
Su Mo Tu We Th Fr Sa
 1  2  3  4  5  6  7
 8  9 10 11 12 13 14
15 16 17 18 19 20 21
22 23 24 25 26 27 28
29 30 31

February/May/August/November
Su Mo Tu We Th Fr Sa
          1  2  3  4
 5  6  7  8  9 10 11
12 13 14 15 16 17 18
19 20 21 22 23 24 25
26 27 28 29 30

March/June/September/December
Su Mo Tu We Th Fr Sa
                1  2
 3  4  5  6  7  8  9
10 11 12 13 14 15 16
17 18 19 20 21 22 23
24 25 26 27 28 29 30 *

*Plus World Day at the end of December every year, and Leap Day at the end of June every fourth year.

The World Calendar would present us with the fewest changes to how we perceive the weeks, months, and year, and would leave us with four identical seasons or quarters – a help in business finances! Unfortunately, there is still enormous resistance to simple, sensible ideas like these!
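
Because every quarter is 31 + 30 + 30 = 91 days – exactly thirteen weeks – the weekday of any World Calendar date is fixed forever. A minimal sketch of that perpetual property (again my own illustration of the rules above):

```python
# The World Calendar's perpetual weekday: every quarter is exactly 13 weeks.
MONTH_LENGTHS = [31, 30, 30]    # the three months of every quarter
WEEKDAYS = "Su Mo Tu We Th Fr Sa".split()

def weekday(month, day):
    # month is 1-12, day is 1-based; World Day and Leap Day fall outside the week.
    quarter_month = (month - 1) % 3    # position of the month within its quarter
    offset = sum(MONTH_LENGTHS[:quarter_month]) + day - 1
    return WEEKDAYS[offset % 7]        # every quarter starts on a Sunday

assert weekday(1, 1) == "Su"           # January 1 is always a Sunday
assert weekday(2, 1) == "We"           # February 1 is always a Wednesday
assert weekday(12, 30) == "Sa"         # December 30 is always a Saturday
assert 4 * sum(MONTH_LENGTHS) == 364   # plus World Day makes 365

print([weekday(m, 1) for m in (1, 2, 3)])    # ['Su', 'We', 'Fr'], every quarter
```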

For more information, see Rick McCarty's wonderful Home Page for Calendar Reform at personal.ecu.edu/mccartyr/calendar-reform.html

Spinoza’s Emotions *

i. Desire is the essence of man insofar as it is conceived as determined to any action by any one of its modifications. [I.e., when there is change, we become motivated, and that is called desire.]

ii. Joy is man’s passage from a less to a greater perfection. [We feel joy when we improve our abilities to deal with what life hands us.]

iii. Sorrow is man’s passage from a greater to a less perfection. [We feel sorrow when we find we are not able to deal with life.]

iv. Love is joy with the accompanying idea of an external cause. [When something, or someone, gives us joy, we feel love towards that thing or person.]

v. Hatred is sorrow with the accompanying idea of an external cause. [When something, or someone, gives us sorrow, we feel hatred towards that thing or person.]

vi. Hope is a joy not constant, arising from the idea of something future or past about the issue of which we sometimes doubt. [When we detect the possibility of joy in an otherwise uncertain situation, we feel hope.]

vii. Fear is a sorrow not constant, arising from the idea of something future or past about the issue of which we sometimes doubt. [When we detect the possibility of sorrow in an uncertain situation, we feel fear.]

viii. Confidence is a joy arising from the idea of a past or future object from which cause for doubting is removed. [Confidence happens when hope conquers fear.]

ix. Despair is sorrow arising from the idea of a past or future object from which cause for doubting is removed. [Despair happens when fear overwhelms hope.]

x. Gladness is joy with the accompanying idea of something past which, unhoped for, has happened. [Gladness is the recognition that things have gone well.]

xi. Remorse is sorrow with the accompanying idea of something past which, unhoped for, has happened. [Remorse is the recognition that things have gone wrong. It might include regret and even guilt, if we had some responsibility in the matter.]

xii. Favor is love toward those who have benefited others. [It is the appreciation we feel towards good people.]

xiii. Indignation is hatred toward those who have injured others. [It is the hatred we feel towards bad people.]

xiv. Overestimation consists of thinking too highly of another person in consequence of our love for him. [This might include infatuation.]

xv. Contempt consists in thinking too little of another person in consequence of our hatred for him. [To have contempt for someone is the same as despising them.]

xvi. Envy is hatred in so far as it affects a man so that he is sad at the good fortune of another person and is glad when any evil happens to him. [Envy may include jealousy and lead to spitefulness.]

xvii. Compassion is love in so far as it affects a man so that he is glad at the prosperity of another person and is sad when any evil happens to him. [This, which many would call love, is no doubt the most worthy emotion.]

xviii. Self-satisfaction is the joy which is produced by contemplating ourselves and our own power of action. [Today, we might refer to this as self-esteem or self-worth.]

xix. Humility is the sorrow which is produced by contemplating our impotence or helplessness. [Although humility sounds negative, it involves a realistic understanding of our limitations.]

xx. Pride is thinking too much of ourselves, through self-love. [We often use the word to mean something positive, but traditionally pride is undeserved or excessive self-esteem.]

xxi. Despondency is thinking too little of ourselves through sorrow. [This corresponds to that unrealistic sense of guilt that plagues so many people.]

xxii. Self-exaltation is joy with the accompanying idea of some action which we imagine people praise. [Self-esteem based on others’ opinions of particular behaviors.]

xxiii. Shame is sorrow with the accompanying idea of some action which we imagine people blame. [Like humility, but based on others’ opinions of particular behaviors. We call it guilt if it is entirely internalized.]

xxiv. Benevolence is the desire to do good to those whom we pity. [Benevolence is the emotion behind our good deeds. Pity here does not carry the negative tone it often does today.]

xxv. Anger is the desire by which we are impelled, through hatred, to injure those whom we hate. [Anger is the emotion behind aggression. It includes the desire for revenge.]

* From Spinoza's Ethics (Elwes, Trans.) [All comments in brackets are my own.]

Metaphysics

Metaphysics is that part of philosophy that examines the composition of the universe, and asks "what is the world – including us – made of?" "What is the ultimate substance?"

You might assume that this is more interesting to a physicist than a psychologist. Physics, in fact, gets its name from the Greek word physis, which means "nature" – the ultimate stuff of things. But for psychology, one of the enduring problems is the relationship between the mind and the body. Is the mind, for example, just the activity of the brain, as many suppose? Or is it more than that? This is one of the issues that psychology has inherited from religion as well as philosophy: We can just as well ask about the separate existence of the soul, and its relationship to mind and body. Psyche, after all, is the Greek word for soul!

For a variety of reasons, philosophers generally would like it if there were exactly one ultimate substance in the universe – an idea called monism. Call it a love of simplicity. But the problem is, of course, which one? There are two major competitors for the title:

Materialism says that the universe is made entirely of matter. Matter, for philosophers, includes energy and anything else in the physical sense. Some early Greek philosophers, for example, thought that the whole world (including us) was made of water. Others thought it was fire. Others still thought the universe was composed of tiny, indivisible particles – neither created nor destroyed – called atoms. Today, physicists (and chemists, and biologists, and most psychologists) have agreed on more complex explanations which, nonetheless, boil down to physical reality.

(Please note that "materialism" does not refer here to the love of material things!)

Idealism is the other competitor for the title. Idealism says that the universe is made of the spiritual or the mental, which idealists refer to as idea or the ideal. Early Greeks also had a variety of ideas regarding what particular brand of ideal constituted the universe. Some would say the entire world was nothing more than God's dream (as some Hindu philosophers would argue). Others saw it as a sort of life force. Others still saw it as the perfection behind the flawed world we perceive. Modern idealist philosophers talk in terms of a world of persons, or a world of qualities.


(Again a note: "Idealism" does not here refer to living by high ideals!)

Although it may seem to you that materialism is the obviously better answer, that is more a matter of culture than philosophy. The majority of philosophers have been idealists, because idealism is a bit more reasonable than materialism! Consider: Have you ever seen "matter?" If you look at a chair, for example, you see its shape, its colors. If you touch it, you feel resistance, warmth or coolness. You can tap it and hear sounds, smell it or lick it (if you really want to), and so on. You experience many mental events, but never, all by its lonesome, matter! But ideas – all you have to do is have a thought, and you experience it as self-evident!

(We’ll come back to some more kinds of monism in a bit.)

The usual alternative to monism is called – logically enough – dualism. It is simply a matter of saying that there are, in fact, two different substances in this universe: material and ideal. For psychology, this would be the idea that the mind (or spirit) and the body are both equally real, and that neither can be reduced to the other.

Now, this sounds like the obvious solution to the dilemma. But there is a serious philosophical problem: If there are two different substances in the universe, how could they possibly interact? How does the soul, which is presumed to be without mass or extension, cause the body to act? And how do the things that happen to the body somehow change from physical activities into a mental thing?


Think about it for a minute. It is easy to say that, when we see a red apple, the light waves cause chemical reactions in the retina, which cause sensory neurons to fire, which causes neurotransmitters to sail across synapses, which send the neural signals deeper and deeper into the brain, the activity of which is the thing we call "seeing the red apple." But no matter how much detail you provide, at no point do you convert all this physical activity into the experience of an apple!

Likewise, if I have a thought that says "I’m going to throw the apple at you," there’s no question that there will be neural activity, translated into muscular activity, translated into the flight of the apple. But when, where, and how did that thought become a neural activity? Some refer to this problem as the mind-body problem. Others call it the ghost in the machine.

Descartes, in addition to being the father of modern philosophy, also took time out to promote the idea of the reflex. In his day, hydraulic devices were all the rage – fountains with moving characters. Descartes simply suggested that living creatures are similar mechanisms (no different from what we do today when we suggest that the brain is just a wet computer).

But Descartes was also a devout Catholic who believed we had an immortal soul. How that soul influenced the body or the body the soul remained a mystery. Descartes thought that perhaps the pineal gland (a few inches behind your eyes) was a conduit that let in the "animal spirits" from our souls, which traveled through the nerves and made our muscles move. A bad guess.

Descartes' type of dualism is called interactionism: There are two substances, they interact, I don't know how. That, of course, is less than satisfactory. So other philosophers put in their two cents. A French priest named Nicolas Malebranche suggested that God intervenes, and makes us experience things when stuff happens to our bodies, and makes our bodies move when we will it. Since these interactions occur in all of us every day, a million times a day, God must be very busy. But God is, well, God... so it is clearly a possibility. This type of dualism is called occasionalism.


Another explanation was given by the German genius Leibniz. He suggested that, rather than have God intercede a gazillion times a day, He could have simply set the entire universe going in two coordinated paths, one material, one spiritual. Just as I can set two clocks – one an antique pendulum clock and one an electric digital – to keep the same time, even though they are completely different mechanisms and have no contact with each other, so God could have done the same with the body and the mind. This is called parallelism. Again, not a bad explanation.

But philosophers (and psychologists) desire more certain knowledge than faith. So the search for an answer went on. Perhaps the most impressive of the Enlightenment answers came from the lens-grinder Benedict Spinoza. His theory is called double-aspectism. It is a monism that looks like a dualism: The mind and the body, he said, are two sides of one "coin," which is the true ultimate substance of the universe.

So, if a brick hits you on the forehead, the physical things happening inside your head have another side to them, which is the pain you experience. And the thought you have to raise your hand to touch the bruise has another side to it, which is the physical act of doing so. Problem solved!

But perhaps not quite. If you say that the entire universe has two sides to it, you have to include not only mind and brain. Spinoza believed that God is what we call the mental side of the universe, and nature is what we call the physical side. God is the mind of nature, and nature is God’s body! This is called pantheism. In Spinoza’s day, it was called atheism, and was grounds in most countries for a bonfire, with you as the guest of honor.

Even if you kind of like the idea of pantheism, keep in mind that it also implies panpsychism – everything has to have its mental side. So animals and plants have souls and rocks have thoughts (albeit slow and simple ones!). On the other hand, there can be no soul in heaven that is not attached to some body. These ideas are a little harder to take.

Much later, William James, "the father of American psychology," and our best philosopher in his spare time, came up with neutral monism. He suggested that Spinoza was nearly right, but not quite. The physical is the one ultimate substance as seen from one perspective, and the mental or spiritual is that substance seen from another perspective. The ultimate substance is something else, something neutral. This means that it is quite alright to say that some things can only be seen as physical, others only as mental, and some as both.

The problem that remains is the question we started out with: What then is the ultimate substance? One recent suggestion is information. This doesn't stray too far from materialism and is popular with the artificial intelligence movement and cognitivists in psychology. Another suggestion is more idealist, and offers quality as the ultimate substance, quality sometimes having physical characteristics, sometimes mental ones.

William James also came up with another idea called pluralism. Strictly speaking, of course, dualism is a pluralism. But he suggested that there were many more than two "ultimate substances." There is matter, of course, and mind. But there is also math and logic – are they physical or are they mental, or are they something else? And there's space and time – what are they? Even the material can be divided into matter, energy, gravity, and so on. And the mental includes thoughts, perceptions, imagery, feelings, will, choice, etc. Some of these things may interact (matter and energy, for example, via E = mc²). Others may not interact with anything else. The problem? Now instead of having two ultimate substances we need to reconcile, we have hundreds.

Perhaps the most popular metaphysics among researchers in psychology is called epiphenomenalism. This approach suggests that, while materialism is clearly the way to go in the sciences, it is also undeniable that there is something real about our inner, psychological life. So, say the epiphenomenalists, let's allow that there is something called mind which we have yet to pin down, but let's also say that the mind is nothing more than a byproduct of the brain! Sort of like heat is a byproduct of an engine's operation: If we could design a perfect engine, all energy would translate into motion instead of heat! So, if we completely understood the brain, we would no longer need the concept of mind. This is just a form of materialism, although a more humble form.

So the problem remains. "How does all this relate to ordinary psychology?" you might ask. Well, think about anything having to do with psychology – love, anger, perception, mental illness, psychopharmacology.... What is depression? Is it a perceptual or emotional problem? Or is it a matter of serotonin availability? Should we use drugs to alter people with problems such as these, or is it a matter of helping them to change their perspectives on life? If it’s a combination of the two, how do we know how much of the problem is one or the other? Is it the same for everyone? The mind-body problem does indeed remain, right at the very heart of psychology.


Hume and Kant


The 1700s saw many great thinkers who have left a lasting impact on modern philosophy and science – and psychology. But there were two who would, between them, define the nature of science, especially psychology. They are, of course, David Hume and Immanuel Kant.

David Hume

David Hume was born April 26, 1711 in Edinburgh, Scotland. His father died the following year and left the estate to his eldest son, John. John ensured that David would receive a good Presbyterian upbringing and sent him – at the age of 12 – to the University of Edinburgh. David left three years later, to become a philosopher!

His family suggested he try law, and he tried, but found that it – as he put it – made him sick. So he went off to travel a few years in England and France. It was at a Jesuit College in France that he wrote A Treatise of Human Nature (in two parts), which he published anonymously in London in 1739.

Hume was the ultimate skeptic, convincingly reducing matter, mind, religion, and science to a matter of sense impressions and memories. First, he agreed with Bishop Berkeley that matter, or the existence of a world beyond our perceptions, is an unsupportable concept. Further, cause and effect were likewise unsupportable. We see sequences of events, but can never see the necessity that determinism requires. Further still, his investigations led him to dismiss the existence of a unifying mind within us. What we call mind is just a collection of mental perceptions. And finally, without mind, there can be no free will.

I will let him speak for himself. Pay close attention to some really good arguments!

All ideas are copies of impressions...it is impossible for us to think of anything which we have not antecedently felt by our senses....

When we entertain any suspicion about a philosophical term, we need but inquire from what impression that supposed idea is derived. If it be not possible to assign any, this will serve to confirm our suspicion that it is employed without meaning....


Some philosophers found much of their reasonings on the distinction of substance and quality. I would fain ask them whether the idea of substance be derived from impressions of sensations or impressions of reflection. Does it arise from an impression? Point it out to us, that we may know its nature and qualities. But if you cannot point out any such impression, you may be certain you are mistaken when you imagine you have any such idea.

The idea of substance is nothing but a collection of ideas of qualities, united by the imagination and given a particular name by which we are able to recall that collection. The particular qualities which form a substance are commonly referred to an unknown something in which they are supposed to "inhere." This is a fiction.

And so...no matter!

There are some philosophers (e.g. Berkeley) who imagine we are every moment intimately conscious of what we call our self; that we feel its existence and its continuance in existence, and are certain of its identity and simplicity.

For my part, when I enter most intimately into what I call my self, I always stumble on some particular perception or other, of heat or cold, light or shade, love or hatred, pain or pleasure, color or sound, etc. I never catch my self, distinct from some such perception.

I may venture to affirm of the rest of mankind that they are nothing but a bundle or collection of different perceptions which succeed each other with an inconceivable rapidity and are in a perpetual flux and movement. Our eyes cannot turn in their sockets without varying their perceptions. Our thoughts are still more variable. And all our other senses and powers contribute to this change.

The mind (or self) is a kind of theatre where perceptions make their appearances, pass, repass, glide away, and mingle in an infinite variety. But there is no simplicity, no one simple thing present or pervading this multiplicity; no identity pervading this process of change; whatever natural inclination we may have to imagine that there is. The comparison of the theatre must not mislead us: it persists, while the actors come and go. Whereas, only the successive perceptions constitute the mind.

As memory alone acquaints us with the continuance and extent of a succession of perceptions, it is to be considered, on that account chiefly, as the source of personal identity. Had we no memory, we should never have any notion of that succession of perceptions which constitutes our self or person. But having once acquired this notion from the operation of memory, we can extend the same beyond our memory and come to include times which we have entirely forgot. And so arises the fiction of person and personal identity.

And no mind!

There is no idea in metaphysics more obscure or uncertain than necessary connection between cause and effect. We shall try to fix the precise meaning of these terms by producing the impression from which it is copied. When we look at external objects, and consider the operation of causes, we are never able, in a single instance, to discover a necessary connection; any quality which binds the effect to the cause, and renders one a necessary consequence of the other. We find only that the effect does, in fact, follow the cause. The impact of one billiard ball upon another is followed by the motion of the second. There is here contiguity in space and time, but nothing to suggest necessary connection.

Why do we imagine a necessary connection? From observing many constant conjunctions? But what is there in a number of instances which is absent from a single instance? Only this: After a repetition of similar instances the mind is carried by habit, upon the appearance of the cause, to expect the effect. This connection, which we feel in the mind, this customary and habitual transition of the imagination from a cause to its effect, is the impression from which we form the idea of necessary connection. There is nothing further in the case.

Out with cause and effect!


The most irregular and unexpected resolutions of men may be accounted for by those who know every particular circumstance of their character and situation. A genial person, contrary to expectation, may give a peevish answer, but he has a toothache or has not dined. Even when, as sometimes happens, an action cannot be accounted for, do we not put it down to our ignorance of relevant details?

Thus it appears that the conjunction between motive and action is as regular and uniform as between cause and effect in any part of nature. In both cases, constant conjunction and inference from one to the other.

Free will is only our ignorance of cause and effect, and cause and effect is an illusion, so free will is an illusion. Simple.

In all reasonings from experience, then, there is a step taken by the mind (that the future resembles the past) which is not supported by any argument. Nevertheless, we take this step. There must therefore be some other principle (than rational or demonstrative argument). This principle is custom....

What, then, is the conclusion of the whole matter? A simple one, though, it must be confessed, pretty remote from the common theories of philosophy. All belief concerning matters of fact or real existence, is derived merely from some object present to the memory or the senses, and a customary conjunction between that and some other object. Having found, in many instances, that two kinds of objects have been conjoined (say, flame and heat), the mind is carried by custom to expect the same in the future. This is the whole operation of the mind in all our conclusions concerning matters of fact and existence.

So long, science!

If we take in hand any volume, of divinity or metaphysics, for instance, let us ask: Does it contain any reasoning concerning quantity or number? No. Does it contain any experimental (probable) reasoning concerning matter of fact? No. Commit it then to the flames: for it can contain nothing but sophistry and illusion.

I am at first affrighted and confounded with that forlorn solitude in which I am placed by my philosophy, and fancy myself some strange uncouth monster, utterly abandoned and disconsolate. Fain would I run into the crowd for shelter and warmth. I call upon others to join me. But no one will hearken to me. Everyone keeps at a distance, and dreads that storm which beats upon me from every side. I have exposed myself to the enmity of all metaphysicians, logicians, mathematicians, and theologians. Can I wonder at the insults I must suffer? I have declared my disapprobation of their systems. Can I be surprised if they should express a hatred of my ideas and my person? When I look about me, I foresee on every hand, dispute, contradiction, anger, calumny, detraction. When I turn my eye inward, I find only doubt and ignorance. Every step I take is with hesitation; every new reflection makes me dread an error and absurdity in my reasoning.

In 1739, he returned to Edinburgh, where he added a third part to A Treatise, on morality. He suggested that morality comes from sympathy, which is an instinct for association with others. He goes on to say that it is emotions that move us, not reason, and he presages Jeremy Bentham's utilitarianism by defining virtue as "every quality of the mind which is useful or agreeable to the person himself or others." Even beauty is based on pleasure and pain, and love is based on our desire to reproduce – shades of Freud! What little attention this part received was negative.

At this point in his life, he went through several minor political positions. And he gained a great deal of weight – something unusual among philosophers! Then, in 1748, he published An Enquiry Concerning Human Understanding, followed in 1751 by An Enquiry Concerning the Principles of Morals. These were essentially a rewrite of A Treatise. In the first of these, he included a new essay, "Of Miracles," wherein he portrays some of Christianity's most basic beliefs as nothing but superstition!


He continued on that subject with Dialogues Concerning Natural Religion, in which he compared Christianity, Deism, and Atheism. Among other things, he suggests that the world we know – including ourselves – is the result of eons of nature’s experiments. His friends asked him not to publish it. They published it for him posthumously (no pun intended).

In 1752, he wrote Political Discourses. Although he liked egalitarianism (roughly, communism) and democracy, he felt that both were too idealistic. This book influenced Adam Smith, the father of modern capitalism.

In 1754, he published the first volume of the History of England, a book admired by such notables as Voltaire and Gibbon (the author of The Decline and Fall of the Roman Empire).

In 1763 he went to Paris, where he soon became the talk of the town and was especially popular at the salons of the great aristocratic women of France, who apparently took a liking to his grand body as well as his great mind. Several years later, he brought the nearly insane Rousseau to England, which turned out to be a disagreeable adventure for both of them.

He died August 25, 1776, of ulcerative colitis. His friends found the great atheist polite, pleasant, even cheerful, to the end.

Immanuel Kant

Immanuel Kant was born on April 22, 1724, in Königsberg, Prussia (now Kaliningrad, Russia). He was of Scottish descent and had a Pietist upbringing and education. (Pietism is a form of Protestantism similar to Methodism, i.e. very conservative.) He went to the University of Königsberg, where he received his PhD. He taught as a privatdozent, which is a private teacher or tutor, paid by his students. This meant a poor life, boardinghouses, and bachelorhood.

He began with an interest in science – physics, astronomy, geology, biology. In fact, he introduced the nebular hypothesis, suggesting that originally swirling gases condensed into the sun and the planets – basically, what we understand to be the reality today. He also reintroduced Lucretius's idea of the evolution of plant and animal life.

In 1781, he published the Critique of Pure Reason. Critique means a critical or careful analysis, and pure reason means reason which leads to knowledge that doesn’t require experiential proof, what is also called a priori (before-hand) knowledge.

He said he had been "awakened from his dogmatic slumbers" by reading Hume. This is frequently misunderstood to mean that he was outraged. Actually, he said that he had been dogmatically accepting of the traditional ideas about reason. Hume enlightened him! However, it is also true that Hume challenged him, in a sense, to rescue such concepts as cause and effect, which Kant felt were essential to the existence of science. He took as his life's task the saving of the universe from Hume's pervasive skepticism.

First, he makes a distinction between a posteriori and a priori knowledge:

It is a question worth investigating, whether there exists any knowledge independent of experience and all sense impressions. Such knowledge is called a priori and is distinguished from a posteriori knowledge which has its sources in experience. That there is genuine a priori knowledge, that we can advance independent of all experience, is shown by the brilliant example of mathematics....


Although all our knowledge begins with experience, it does not follow that it arises entirely from experience. For it is quite possible that our empirical knowledge is a compound of that which we receive through impressions and that which our own faculty of knowing (incited by impressions) supplies from itself – a supplement to impressions which we do not distinguish from that raw material (i.e. impressions) until long practice has roused our attention and rendered us capable of separating one from the other.

What then are these a priori faculties of our minds? The first stage of mind's operation on experience is the transcendental aesthetic, which states that all sense experience is synthesized "through" the concepts of time and space.

Space does not represent any property of things in themselves, nor does it represent them in their relation to one another.... Space is nothing but the form of all appearances of outer sense. It is the subjective condition of sensibility under which alone outer perception is possible for us.

Since the capacity to be affected by objects must precede all perception of these objects, it can readily be understood how the form of all appearances (i.e., space) can be given prior to all perceptions, and so exist in the mind a priori; and how, as a pure intuition, in which all objects must be determined, it can contain, prior to all experience, principles which determine the relations of these objects. It is, therefore, solely from the human standpoint that we can speak of space, of extended things. If we depart from the subjective, the representation of space stands for nothing whatsoever.

Time is a purely subjective condition of our human perception, and, in itself, apart from the subject, is nothing.... What we are maintaining is the empirical reality of time, its objective validity in respect of all objects which allow of ever being given to our senses. Since our perception is always sensible (i.e., by the senses), no object can ever be given to us in experience which does not conform to the condition of time. On the other hand, we deny to time any claim to absolute reality; that is to say, we deny that it belongs to things absolutely, as their condition or property independently of any reference to the form of our perception. Properties that belong to things in themselves can never be given to us through the senses. This, then, is what constitutes the ideality of time.

So time and space are necessary to perception, even though they themselves cannot be perceived apart from the events "in" them. The next step is the transcendental analytic, which says that the mind applies certain categories of thought to ideas. Without these categories, Kant says, we would not be able to think at all, and Hume couldn't have come up with his arguments. Hume, for example, felt that cause and effect were not objectively real; Kant says right! – they are a priori, in the mind:

1. Quantity: unity, plurality, totality.

2. Quality: reality, negation, limitation.

3. Relation: substance and accidents, cause and effect, reciprocity between active and passive.

4. Modality: possible-impossible, existence-nonexistence, necessity-contingency.

Finally comes the transcendental dialectic. Kant believed that the mind seeks complete knowledge. But it is limited to dealing with phenomena, appearances, only. It can't reach to noumena, the thing-in-itself. Phenomena are all you have, but they are not real; noumena are real, but you can't have them. So, to discover that real world, we try to construct it. Unfortunately, we err by trying to use the categories (logic), "designed" for phenomena, on the ultimate reality! So we end up with contradictions that are irreconcilable. Regarding cause and effect and free will:

If, however, we may legitimately take an object in two senses, namely, as phenomena and as thing-in-itself; and if the principle of causality applies to things only as phenomena and not as noumena, then we can, without any contradiction, think one and the same thing when phenomenal as necessarily conforming to the principle of causality and so far not free, and yet, in itself not subject to that principle and therefore free.

Suppose morality necessarily presupposed freedom of the will while speculative reason had proved that such freedom cannot even be thought. In such case freedom, and with it morality, would have to make room for the mechanical interpretation of nature. But our critique has revealed our inevitable ignorance of things-in-themselves, has limited our knowledge to mere phenomena. So, as morality requires only that freedom should not entail a contradiction, there is no reason why freedom should be denied to will, considered as a thing-in-itself, merely because it must be denied to it as a phenomenon.

Ultimately, Kant found that the existence of God, the soul, and ultimate reality is not something you can prove, because proof is based on phenomena and the categories. Instead, they are heuristic, that is, we believe in these things because they are useful to us! In saving science and religion from Hume, he proved that they had to be taken on faith! Scholars and churchmen on all sides of the issues criticized the Critique, which ironically guaranteed its success. Kant had no censorship problems to worry about, because Frederick the Great – a brilliant man himself – ruled Prussia at the time. Unfortunately for Kant and many others, Frederick died in 1786.

(A note on Frederick the Great: King of Prussia, whose realm included much of Germany as well, he was, besides a consummate leader and politician, an accomplished philosopher and a passionate amateur musician. He corresponded with Voltaire and Rousseau, and Bach wrote "The Musical Offering" for him, based on a melody the King had challenged him with. He wrote a number of books, including A History of My Times and The Anti-Machiavelli.)

In 1788, Kant wrote the Critique of Practical Reason. Practical reason refers to the making of moral decisions. In this book, he argues that everyone has a conscience, a moral law within their souls, not unlike the categories of the Critique of Pure Reason. This moral law he calls the Categorical Imperative, which is phrased two ways. The first is a variation on the Golden Rule: Whatever you do, consider what kind of world this would be if everybody did the same. The other is a little deeper: Treat people (including oneself) only as ends, never as means to an end. Never use them, we would say today.

In order to have morality, Kant believed we needed free will. If you can't make choices, how could you be responsible? If you aren't responsible for anything you do, like an animal or a robot, then what you do is neither bad nor good! Also, he felt we needed the idea of immortality: Since justice rarely happens within a lifetime, we need life after death to take care of that. And, in order for eternal life to exist, or free will, or good and bad at all, we need to believe in God.

Notice that he doesn’t say that, first God exists, therefore.... He is actually saying that, although we can never prove the existence of God (or immortality, or free will, or good and bad), we must act as if he (and they) existed. Religious thinkers of the time did not care for this way of thinking at all!

Kant wrote a good deal more. In 1790, he wrote the Critique of Judgement, regarding judgements of beauty. He notes that our sense of beauty is based on feeling, not reason. We seem to "see" the harmony, the power, the miraculous in some things. It is as if God so arranged things!

In 1793, when he was 69 years old, Religion Within the Limits of Reason Alone came out. Here he argues (unlike Hobbes and unlike Rousseau) that we are born with the potential for both good and bad. He does acknowledge, though, that a great deal of evil comes out of civilization, rather than our primitive nature. In fact, a lot of what we now consider bad was probably originally necessary for survival in primitive conditions!

He also said that, although there is an inborn moral sense, it must be developed by moral instruction. For this reason, he believes that religion is necessary – although he also points out that religion shouldn’t be dogmatic, and that beliefs such as original sin, the divinity of Christ, and the efficacy of prayer are mere superstitions.

In 1795, he wrote On Perpetual Peace, outlining the basis for international law. In 1798, he came out with The Conflict of the Faculties, arguing for the importance of academic freedom. He died February 12, 1804, after a long illness, and was buried ceremoniously at Königsberg Cathedral. Over his grave is written

The starry heavens above me; The moral law within me.

The great modern historian of psychology, Dan Robinson, once said that today nearly every psychologist is either a Humean or a Kantian. Humeans see their science as the statistical analysis of a collection of experiences. All we can ever know is probabilities based on what happened in the past. Kantians see their science as more firmly based, in a sense, in the structure of the mind. And yet they, too, can permit themselves little certainty. Humeans can be found among most experimentalists, including the behaviorists. Kantians are more likely to be found among cognitivists and psychoanalysts. There are, as we shall see, alternatives. But they remain very much in the minority.


Declaration of the Rights of Man *

Approved by the National Assembly of France, August 26, 1789

The representatives of the French people, organized as a National Assembly, believing that the ignorance, neglect, or contempt of the rights of man are the sole cause of public calamities and of the corruption of governments, have determined to set forth in a solemn declaration the natural, unalienable, and sacred rights of man, in order that this declaration, being constantly before all the members of the Social body, shall remind them continually of their rights and duties; in order that the acts of the legislative power, as well as those of the executive power, may be compared at any moment with the objects and purposes of all political institutions and may thus be more respected, and, lastly, in order that the grievances of the citizens, based hereafter upon simple and incontestable principles, shall tend to the maintenance of the constitution and redound to the happiness of all. Therefore the National Assembly recognizes and proclaims, in the presence and under the auspices of the Supreme Being, the following rights of man and of the citizen:

Articles:

1. Men are born and remain free and equal in rights. Social distinctions may be founded only upon the general good.

2. The aim of all political association is the preservation of the natural and imprescriptible rights of man. These rights are liberty, property, security, and resistance to oppression.

3. The principle of all sovereignty resides essentially in the nation. No body nor individual may exercise any authority which does not proceed directly from the nation.

4. Liberty consists in the freedom to do everything which injures no one else; hence the exercise of the natural rights of each man has no limits except those which assure to the other members of the society the enjoyment of the same rights. These limits can only be determined by law.

5. Law can only prohibit such actions as are hurtful to society. Nothing may be prevented which is not forbidden by law, and no one may be forced to do anything not provided for by law.

6. Law is the expression of the general will. Every citizen has a right to participate personally, or through his representative, in its foundation. It must be the same for all, whether it protects or punishes. All citizens, being equal in the eyes of the law, are equally eligible to all dignities and to all public positions and occupations, according to their abilities, and without distinction except that of their virtues and talents.

7. No person shall be accused, arrested, or imprisoned except in the cases and according to the forms prescribed by law. Any one soliciting, transmitting, executing, or causing to be executed, any arbitrary order, shall be punished. But any citizen summoned or arrested in virtue of the law shall submit without delay, as resistance constitutes an offense.

8. The law shall provide for such punishments only as are strictly and obviously necessary, and no one shall suffer punishment except it be legally inflicted in virtue of a law passed and promulgated before the commission of the offense.

9. As all persons are held innocent until they shall have been declared guilty, if arrest shall be deemed indispensable, all harshness not essential to the securing of the prisoner's person shall be severely repressed by law.

10. No one shall be disquieted on account of his opinions, including his religious views, provided their manifestation does not disturb the public order established by law.

11. The free communication of ideas and opinions is one of the most precious of the rights of man. Every citizen may, accordingly, speak, write, and print with freedom, but shall be responsible for such abuses of this freedom as shall be defined by law.

12. The security of the rights of man and of the citizen requires public military forces. These forces are, therefore, established for the good of all and not for the personal advantage of those to whom they shall be intrusted.

13. A common contribution is essential for the maintenance of the public forces and for the cost of administration. This should be equitably distributed among all the citizens in proportion to their means.

14. All the citizens have a right to decide, either personally or by their representatives, as to the necessity of the public contribution; to grant this freely; to know to what uses it is put; and to fix the proportion, the mode of assessment and of collection and the duration of the taxes.

15. Society has the right to require of every public agent an account of his administration.

16. A society in which the observance of the law is not assured, nor the separation of powers defined, has no constitution at all.

17. Since property is an inviolable and sacred right, no one shall be deprived thereof except where public necessity, legally determined, shall clearly demand it, and then only on condition that the owner shall have been previously and equitably indemnified.

* ASCII Text Prepared by Gerald Murphy (The Cleveland Free-Net - aa300) Distributed by the Cybercasting Services Division of the National Public Telecomputing Network (NPTN). HTML Markup: by The Avalon Project at the Yale Law School © 1996 The Avalon Project. William C. Fray and Lisa A. Spar, Co-Directors. Available at http://www.yale.edu/lawweb/avalon/rightsof.htm

Selection from A Vindication of the Rights of Woman *

by Mary Wollstonecraft (1759-1797)

I love man as my fellow; but his scepter, real, or usurped, extends not to me, unless the reason of an individual demands my homage; and even then the submission is to reason, and not to man. In fact, the conduct of an accountable being must be regulated by the operations of its own reason; or on what foundation rests the throne of God?

It appears to me necessary to dwell on these obvious truths, because females have been insulated, as it were; and, while they have been stripped of the virtues that should clothe humanity, they have been decked with artificial graces that enable them to exercise a short-lived tyranny. Love, in their bosoms, taking place of every nobler passion, their sole ambition is to be fair, to raise emotion instead of inspiring respect; and this ignoble desire, like the servility in absolute monarchies, destroys all strength of character. Liberty is the mother of virtue, and if women be, by their very constitution, slaves, and not allowed to breathe the sharp invigorating air of freedom, they must ever languish like exotics, and be reckoned beautiful flaws in nature.

As to the argument respecting the subjection in which the sex has ever been held, it retorts on man. The many have always been enthralled by the few; and monsters, who scarcely have shewn any discernment of human excellence, have tyrannized over thousands of their fellow-creatures. Why have men of superiour endowments submitted to such degradation? For, is it not universally acknowledged that kings, viewed collectively, have ever been inferior, in abilities and virtue, to the same number of men taken from the common mass of mankind – yet, have they not, and are they not still treated with a degree of reverence that is an insult to reason? China is not the only country where a living man has been made a God. Men have submitted to superior strength to enjoy with impunity the pleasure of the moment – women have only done the same, and therefore till it is proved that the courtier, who servilely resigns the birthright of a man, is not a moral agent, it cannot be demonstrated that woman is essentially inferior to man because she has always been subjugated.

Brutal force has hitherto governed the world, and that the science of politics is in its infancy, is evident from philosophers scrupling to give the knowledge most useful to man that determinate distinction.

I shall not pursue this argument any further than to establish an obvious inference, that as sound politics diffuse liberty, mankind, including woman, will become more wise and virtuous.

Mary Wollstonecraft was a leading feminist, revolutionary, and Unitarian in 18th century England. In addition to A Vindication of the Rights of Woman (1792), Mary Wollstonecraft wrote Thoughts on the Education of Daughters (1787) and A Vindication of the Rights of Men (1790). The latter inspired Thomas Paine to write The Rights of Man. She had two daughters, Fanny and Mary. She died after giving birth to Mary, who would go on to become Mary Wollstonecraft Shelley, the author of Frankenstein.

* Selection taken from full text available at http://www.baylor.edu/BIC/WCIII/Essays/rights_of_woman.html


Ethics


Ethics is the philosophical study of good and bad, right and wrong. It is commonly used interchangeably with the word morality. It differs from other aspects of philosophy in being more concerned with what should be than with what actually is. This makes it a good deal more slippery as well!

Theological Theories

There are three broad categories of ethical philosophies. The first is the theological theories. As the name tells you, these are moral philosophies that begin with the idea that right and wrong derive from God or some other higher power.

The simplest theological theory is the divine command theory. This theory says that God has revealed his will in the form of commands that are made available to us through oral tradition, holy scripture, or church law. All we need to do to be good is to follow those commands. Most of the church fathers held such a belief, as do most religious people today. Its major advantage is its simplicity and solidity.

A more complex theological theory is called natural law. This goes back to St. Thomas Aquinas, and is a part of traditional Catholic philosophy. St. Thomas felt that God would not give us one set of rules through scripture and the church only to have them contradicted by our experiences and reason. Nature, as God’s creation, is in complete agreement with his moral commands. People who believe in natural law would point out that there are people from other cultures, not exposed to our traditions of morality, who nonetheless reason their way to the very same conclusions as to what is right and wrong!

The difficulty with natural law has become obvious: Science does occasionally produce theories that blatantly contradict scripture, and the church does occasionally produce events (religious wars and burning heretics spring to mind) that blatantly contradict our "common sense" sort of morality.

The difficulty with both divine command theory and natural law is that, as society becomes more pluralistic, we come into more and more contact with a greater and greater variety of religious traditions, each with their own scriptures and traditions, and not all of them agreeing all the time. The majority of religious people are good-hearted souls, who are reluctant to believe that God would condemn entire nations for not having been lucky enough to hear the right message! This feeling is especially poignant when people come to know very decent people of different religions or even no religion at all. As long as we remain generous and humble, there is no real problem.

But some people find themselves retreating to what some consider a defensive position called absolutism. Absolutism is divine command theory, but without the generous and humble spirit. In other words, it’s my way or else. We have had many examples of absolutism in history, and we have many examples still today.

Moral Relativism

Diametrically opposed to the theological theories are various forms of moral relativism. Moral relativism says that there are no universal moral principles. Morality is a matter of customs or opinions or habits or emotions. There is a range of opinions here: Relativism is sometimes considered a kind of moral skepticism, which would say that we never truly know what is good or bad. Others see it as a moral nihilism, which says that there simply is no such thing as good and bad, that those words are just misleading labels for other, simpler, things.

One brand of relativism is called conventionalism. This says that what we call morality is really a matter of our cultural or social norms. What our traditions say is right and wrong (for whatever reason) is right and wrong. Often, along with that, comes the idea that cultures and societies should not interfere with each other ("when in Rome..."), but that is not necessary.

Another brand is called prescriptivism (or imperativism), which looks at morality more in terms of power within a society. What we call right and wrong are essentially prescriptions as to what we want others to do, which we then enforce with the powers at our command. So we call theft "bad" so that we have a justification to put people who take our stuff in jail!

Of course it is inevitable that we come across other societies who believe that what they want is their "right" regardless of what we want. Or we come across situations where there are two subcultures or societal groups whose moral beliefs come into conflict. One of the difficulties of conventionalism is defining what constitutes a society or culture and what, if any, are the rules of interaction between or among them.

One "solution" is to reduce the culture or society to a culture or society of one – that is, the individual. This is called subjectivism. Here, each person has his or her own morality. It may be a matter of individual beliefs, or a matter of habit, but each person makes their own choices. That does take care of the problem of what is a culture, but it only makes the problem of rules of interaction worse!

Another brand of relativism goes even further: emotivism says that what we call good and bad are just labels for certain emotional responses we have to certain acts. If the idea of eating puppies makes you sick, you call it bad. If it makes you salivate, you call it good. If having sex with teenagers makes your day, you call it good. If it gets you all upset, you call it bad.

Among my students, I find that freshmen often bring with them their home town religious beliefs. They tend to like the divine command theory, with a few absolutists thrown in for spice. But by the time they are juniors, most of them have become relativists. The home town crowd often blames this change on professors, but it is more a matter of exposure to the pluralistic mini-society of college.

The freshmen see that there are many people who disagree about one detail or another of their childhood moral codes, yet appear to be decent people, or at least have not been struck down by thunderbolts. So, being decent folk, they begin to emphasize tolerance for the variety of moralities they see, and relativism seems to be the best format for this tolerance. For example, if you were raised to believe that homosexuality is wrong, yet you find many people who believe that it is okay (and some who think it’s the only thing to be), you may develop a live-and-let-live attitude that says "to each his own."

But not everything is as innocuous as sexual preferences. There are people whose moral codes say we must sacrifice chickens to the Gods, or we must convert the non-believers, or we must burn witches at the stake, or we must destroy the infidel.... What do we do then with our kind tolerance? Let them be because "to each his own?" What if we had done that when Adolf Hitler had his time at bat? Or what if Jeffrey Dahmer's neighbors decided that, well, if he wants to kill and eat his lovers, what business is it of ours?

A sophisticated relativist would respond, however, by pointing out that this tolerance business really has no place in a relativistic moral theory – that tolerance is itself a moral value that one may or may not adhere to! So, if it is Hitler’s moral code to exterminate innocent people and invade neighboring countries, it’s our moral code to make him stop it! No logical problems here.

As you can see, though, relativism does take a risk. Relativism can become moral nihilism in the same way that divine command can become absolutism. Nevertheless, relativism is the moral theory followed by the majority of people in the hard sciences, including the more experimental, physiological side of psychology.


Moral realism

The third main category of moral theory is moral realism. Moral realism says that good and bad, right and wrong, exist in some fashion in this world, and independently of things like social customs, beliefs, or opinions. On the other hand, moral realism does not propose something as simple as a list of commandments delivered directly from God! Moral realism is the middle ground between the theological theories and moral relativism, and is the most common approach of philosophers.

But, as is usually the case with the middle ground, that is not an easy position to take. The big question that moral realists have to answer is "how do we know good and bad? how do we recognize right and wrong?" Because of the difficulty of this question, there are quite a few forms of moral realism.

Rationalist morality theories

The first group of theories I'd like to look at are the rationalist moral theories. As the name indicates, these theories view morality as coming out of our capacity to think. As in rationalist epistemology, the most basic form of rational moral truth is the one that is self-evident. This is the theory of intuitionism, which is best exemplified by the modern British philosopher G. E. Moore.

Just as in rationalist epistemology, we can also deduce from intuitions with formal logic. In other words, we can think our way to various moral principles. Kant promotes such an approach in what is known as formalism.

A particularly popular form of rationalist morality is called contractarianism. It is associated with several influential philosophers such as John Locke and Jean-Jacques Rousseau. Rousseau is responsible for the title and the basic idea: He suggested that, once upon a time, humanity was in a state of savage anarchy. Each person felt free to do whatever they needed to do to get what they wanted. However, the fact that everyone else was doing the same meant that no-one was really free at all. Whatever time they weren’t spending on getting what they needed would be spent protecting themselves from each other!

So, says Rousseau, our ancestors got together, sat down, and thought this through – at least metaphorically. More literally, certain ways of dealing with anarchy evolved over thousands of years. But the principle is the same: We each agree to give up some of our freedom to take whatever we want, in order that we all can get what we need. The Social Contract, it’s called.

This idea was very influential in its time, especially on the American and French Revolutions. Our founding fathers quite literally outlined the processes of our government and the rights and obligations of the citizenry in a social contract known as the Constitution. We call our system democracy, of course, but the Constitution limits our democratic freedom – the freedom of the majority – in order to protect the minority. And since you never know when it’ll be your turn to be the minority, it has worked out quite well!

Naturalistic moral theories

The next group of theories, as you might suspect, are founded on ideas of a more empirical nature. Here, morality is something you experience in some fashion. These theories are called naturalistic. The simplest suggests that we perceive good and bad quite directly, with a "sixth sense," a moral sense. This is the brainchild of the Earl of Shaftesbury. We often say to each other "that doesn't look right," and "can't you see that that's wrong?"

Egoism says that right and wrong can be perceived in terms of certain special feelings we call happiness. The term egoism is unfortunate here, because we tend to think in terms of selfishness and hedonism, which would be more appropriately placed under the subjectivist or emotivist form of relativism. The Epicureans are examples of egoism: Things like friendship, honor, and even altruism give us certain positive emotions by which we recognize that they are good. Other things make us feel guilty or ashamed.


Analogous to contractarianism in the rational view, there is utilitarianism in the natural view. Invented by Jeremy Bentham and developed by the Mills, utilitarianism is best known for the phrase "the greatest happiness of the greatest number." Like egoism, happiness is seen as the way in which we perceive good and bad. This time, however, it is not our own happiness alone, but the happiness of those around us as well.

Intuitively, it is hard to disagree with the notion. But it is in fact a difficult one. How do you know if others are happy? We're often not even certain if we ourselves are happy! What makes others happy may not be the same as what makes us happy. How are we to add up the various kinds of happiness? Is every person equal in the equation, or is some people's happiness more important than others'? What about the poor minority in this case: Is it okay for them to be unhappy, as long as the majority is happy? Bentham thought that we could develop a "hedonistic calculus" to figure these things out – others are far from certain about that.
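To make the difficulty concrete, here is a minimal sketch in Python of what a "hedonistic calculus" would have to commit to. This is not Bentham's actual method – everything in it (the happiness scores, the per-person weights, the rule of simple addition) is an invented assumption, and each assumption is exactly what the questions above put in doubt:

    # A toy "hedonistic calculus" – not Bentham's actual method, just an
    # illustration of the commitments any such calculus must make.

    def total_utility(happiness, weights=None):
        # happiness: person -> score (how would we ever measure this?)
        # weights: person -> importance (is everyone equal in the equation?)
        if weights is None:
            weights = {person: 1.0 for person in happiness}  # assume equality
        return sum(weights[p] * h for p, h in happiness.items())

    # Invented numbers: a policy that pleases the majority but hurts a minority.
    outcome = {"Ann": 8, "Bob": 7, "Cal": 6, "Dee": -9}
    print(total_utility(outcome))  # 12.0 – but is Dee's misery acceptable?

Notice that simple addition quietly answers the minority question for us: any one person's misery can be outweighed by enough mild pleasure elsewhere.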

Again, our founding fathers were influenced by utilitarianism as well as the social contract, and the Declaration of Independence is loaded with utilitarian concepts (and contractarian ones!). Thomas Jefferson in particular was very interested in these issues.

There are many additional details to utilitarianism, and to many of these moral theories. But you will have to go to your local philosophy professor for those.

One of the things you may have spotted as you read the preceding paragraphs is that these rationalist and naturalist theories are not terribly exclusive: In fact, we could combine them all without stretching them too far out of shape. Just like the US has the Declaration of Independence and the Constitution, and just like science is a blend of rationalism and empiricism, we can use all six of the theories under moral realism at once!

Virtue ethics

There is one more branch of moral realism to talk about. This one is called virtue ethics. Instead of looking at good and bad as something impersonal that we need to recognize via reason or a moral sense, virtue theory sees good and bad as a quality of the person him or herself. It is a virtuous person who creates good acts, not good acts that add up to a virtuous person! This is also often called perfectionism.

It is found in a variety of interesting places: Aristotle proposed a virtue ethics in his famous Nicomachean Ethics; Buddha outlined a virtue ethics in his sutras; Plato has a virtue ethics, as do the Stoics; and Friedrich Nietzsche promotes a virtue ethics in Thus Spake Zarathustra, the book that introduced the "Superman" to the world! The idea is simple: Follow certain practices and you will become a virtuous man or woman. Then do what you will, and the results will be good.

I like virtue ethics a lot, but I have to admit there's a danger in it. Who decides what constitutes a virtuous person? The Nazis read Nietzsche and decided that they were the master race and could do no wrong. Never mind that Nietzsche would never recognize his Superman in boot-stomping blackshirts – Nietzsche was dead by then! Even the gentle Buddhists have had to face the problem: If a certified enlightened master decides it might be a good idea to sleep with his students or take all their money, does that make these things moral? To respond by saying we were mistaken about his enlightenment is too easy a way out of the dilemma!

Another version of virtue ethics, called situational ethics, was developed recently by a Christian theologian named Joseph Fletcher. Uncomfortable with the "follow these rules or burn in hell" theology of some Christians, he said that Jesus had a quite different idea of morality (one quite like Buddha, actually). If you cultivate a loving attitude, you will naturally begin to do more good and less bad. In fact, whatever is done with love is by definition a good act. You could point out that some people do pretty awful things in the name of love, but we could consider these mistaken examples of love. But you could also argue that this is an example of the "No True Scotsman" fallacy: If something good comes out of love, fine; If something bad comes out of love, then, well, that wasn't real love!

Another aspect of his theory is that morality is always situational. He means that morality is always a matter of a real person in a real situation, and we can't really judge them from outside that situation. Hypothetical moral situations, he says, are never real. There are always more details to be taken into account! This sounded way too much like moral relativism to conservative Christians, and so today many people misunderstand poor Fletcher and assume he was some kind of nasty nihilist!


Overlapping Moralities

There are three overlapping "moralities":

A. Individual "morality" – individual opinion
B. Society's "morality" – social convention
C. Morality (the real deal)

For example: A plantation owner in 1850's Alabama.

A represents the plantation owner.
B represents the laws and social standards of 1850's Alabama.
C represents true morality.

1. Individual opinion only: The plantation owner believes that he can do anything he wants with his own property, including his slaves – something most of his neighbors find a bit extreme.

2. Social convention only: Slaves must be registered with the state and, although most find this reasonable, it is something the plantation owner never bothers with.

3. Individual opinion and social convention: Slavery was accepted and permitted in Alabama, which the plantation owner certainly agrees with.

4. Individual opinion and true morality: The plantation owner believes he has a moral duty to care for his elderly father, even though most of his neighbors would be just as happy to see the old coot croak.

5. Social convention and true morality: The state – and most people of that time and place – says that all human beings, slaves included, should be well treated. The plantation owner doesn't always agree.

6. True morality, separate from both individual opinion and social convention: Slavery is wrong and should be eliminated – even though that is something neither the plantation owner nor his neighbors nor his state legislators find acceptable!

7. All three: There are laws against murder, following general moral principles, and the plantation owner agrees.


How certain overlaps can be interpreted:

1. A good fit between person (A) and society (B): the conformist, the solid citizen.

2. A poor fit between person and society: the misfit, the outlaw, the insane.

3. A good person.

4. A bad person.

5. A good society.

6. A bad society.

7. Moving towards the ideal.

8. Moral perfection: The individual and society are in complete agreement, and the principles involved reflect true, "ultimate" morality. Not expected anytime soon!
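The eight interpretations above are simply the 2 x 2 x 2 = 8 ways a given practice can fall inside or outside the three circles. A short Python sketch – purely illustrative, with the matching of regions to the labels above left to the reader – makes that combinatorial structure explicit:

    from itertools import product

    # For any given practice, ask three yes/no questions:
    #   A: does the individual approve of it?
    #   B: does society permit it?
    #   C: is it truly moral?
    # The eight answer patterns are the eight regions of the diagram.
    for a, b, c in product([True, False], repeat=3):
        circles = [name for name, inside in (("A", a), ("B", b), ("C", c)) if inside]
        print(" + ".join(circles) or "outside all three")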

C. George Boeree: History of Psychology Part Three: The 1800's

E-Text Source: [ http://www.ship.edu/%7Ecgboeree/historyofpsych.html ]


Index

Index 2
Timeline: The 1800s 3
Map: Europe 1815 3
Early Medicine and Physiology [ The ancients | The rebirth of medicine | The 1800's | Hermann von Helmholtz ] 6
The Hippocratic Oath 12
Phrenology Diagram 13
A Brief History of Psychopharmacology 14
Charles Darwin and Evolution [ Charles Robert Darwin | Alfred Russel Wallace | Thomas Henry Huxley | Herbert Spencer ] 17
Charles Darwin Selection: Descent of Man 22
Primal Patterns of Behavior 25
Sociobiology 27
The Romantic Philosophers [ Jean-Jacques Rousseau | Johann Wolfgang von Goethe | Arthur Schopenhauer | Søren Aabye Kierkegaard | Nietzsche | Romanticism in General ] 34
The Quotable Friedrich Nietzsche 46
Thus Spoke Zarathustra: A Selection 49
The Beginnings of Psychology [ Associationism | John Stuart Mill | Hermann Ebbinghaus | Psychophysics | Gustav Fechner | Sir Francis Galton | Alfred Binet ] 56
Sir Francis Galton Selection: Hereditary Talent 65
The History of Statistics 68
Wilhelm Wundt and William James [ Wilhelm Wundt | William James | Structuralism or Voluntarism | Functionalism | Commonalities ] 69
William James Selection: The Stream of Consciousness 80
Free will 85


Map: Europe 1815

Timeline: The 1800s

1801 Pinel writes text on Moral Therapy

1804 Immanuel Kant dies

1807 Hegel completes The Phenomenology of Spirit

1808 Reil coins term "psychiatry"

1810 Gall publishes the first volume of Anatomie et Physiologie du Système Nerveux

1811 Sir Charles Bell reports to associates at a dinner party the anatomical separation of sensory and motor functions of the spinal cord

1815 Napoleon surrenders at Waterloo

1816 Johann Friedrich Herbart publishes Lehrbuch zur Psychologie

Herbart's text introduces the concept of repression


1819 Schopenhauer writes The World as Will and Idea

1822 François Magendie publishes an article which postulates the separation of sensory and motor function of the spinal cord

1831 Goethe completes Faust – he dies the following year

1834 Johannes Müller publishes Handbuch der Physiologie des Menschen

1835 Colt invents the revolver

1842 Auguste Comte completes his six-volume Course in Positive Philosophy

1843 Kierkegaard publishes Either/Or and Fear and Trembling

1845 The Irish famine – over one million die and another million leave Ireland

1846 Morton uses ether as an anesthetic

1848 Marx and Engels publish The Communist Manifesto

1848 Hancock performs first appendix operation

1855 Herbert Spencer publishes the two volumes of the Principles of Psychology

Alexander Bain publishes The Senses and the Intellect

1856 Hermann Ludwig Ferdinand von Helmholtz publishes the first volume of the Handbuch der physiologischen Optik

1859 Charles Darwin publishes On the Origin of Species

Alexander Bain publishes The Emotions and the Will

1860 Gustav Fechner publishes The Elements of Psychophysics

1861 Paul Broca shows that the loss of speech in one individual is due to a lesion in the third convolution of the left frontal lobe

1861 Italy is united under Victor Emmanuel II for the first time since the Roman Empire

1861 The abolition of serfdom in Russia frees 40 million serfs

1861-1865 The American Civil War frees 4 million slaves – over 600,000 soldiers die

1863 Wilhelm Wundt publishes Lectures on Human and Animal Psychology

I. M. Sechenov publishes a monograph Reflexes of the Brain, in which he attempted to analyze the higher order functions in terms of the reflex schema

1864 Louis Pasteur invents "pasteurization"

1865 Mendel discovers the laws of genetics

1867 Lister invents antiseptic surgery

1869 Francis Galton publishes Hereditary Genius and uses the normal distribution for purposes of classification

Von Hartmann writes Philosophy of the Unconscious

1870 G. Fritsch and E. Hitzig perform the first direct electric stimulation of the brain

1871 Charles Darwin publishes The Descent of Man


1871 Germany finally united under Prussian leadership: "The Second Reich"

1873 Wundt publishes Principles of Physiological Psychology

1874 Franz Brentano publishes Psychology from an Empirical Standpoint

1876 Alexander Bain establishes Mind, the first journal devoted to psychological research

1879 Wundt establishes the first psychological laboratory at the University of Leipzig in Germany

1882 Charcot opens clinic at the Salpêtrière

Christine Ladd-Franklin completes the doctoral program in mathematics at Johns Hopkins – no degree granted due to the prohibition against granting doctorates to women!

1883 Francis Galton publishes Inquiries into Human Faculty and Its Development

Wundt establishes the journal Philosophische Studien to publish the results of his laboratory research

Kraepelin publishes list of disorders

Nietzsche publishes Thus Spake Zarathustra

1884 William James publishes What is an Emotion?

1885 Hermann Ebbinghaus writes On Memory

1885-6 Freud studies hypnotism under Charcot

1886 Louis Pasteur cures rabies

1890 William James publishes The Principles of Psychology

1890 Ehrenfels writes About the Qualities of the Gestalt

1892 The American Psychological Association is founded with 42 members

Edward Titchener introduces his version of Wundt's structuralism to America.

1893 Oswald Külpe publishes Outline of Psychology

1894 John Dewey publishes The Ego as Cause

Margaret Floy Washburn becomes the first woman to receive a PhD in psychology; her dissertation was supervised by Titchener

1895 Josef Breuer and Sigmund Freud publish Studies in Hysteria

Gustave Le Bon publishes Psychologie des Foules

1896 Dewey publishes in the Psychological Review his famous article The Reflex Arc Concept in Psychology

Lightner Witmer establishes at the University of Pennsylvania a clinic of psychology, the first psychological clinic in America and perhaps in the world – and uses the term clinical psychology for the first time

1897 Wundt publishes Outlines of Psychology


1898 Titchener publishes The Postulates of a Structural Psychology

E. L. Thorndike publishes Animal Intelligence

Early Medicine and Physiology


The ancients

Although always a part of philosophy, psychology has close ties as well to biology, especially human physiology and medicine. As long as the mind is in some way attached to a body, this is inevitable. But, as you know, it took quite a bit of prying the mind apart from its religious connection with an immortal soul before that intimate connection would be acknowledged!

"The First Physician," at least as far as the Greeks were concerned, was Asclepius. He started a partially mystical society or guild of physicians that was to have an influence for many centuries to come. During that time, he gained god-like status. Even Socrates, as he lay dying of the overdose of hemlock, told his student Crito to sacrifice a cock to Asclepius, presumably in thanks for an easy death.

More clearly historical is Alcmaeon of Croton (fl. 5th century bc) in southern Italy. A Pythagorean by philosophy, he was known for his anatomical studies. He is the first person we have record of who dissected the eye and discovered the optic nerve. His theory of the mind included the idea that the brain is the seat of perception and thought, and that there are connections from all the sense organs to the brain. He believed that pneuma, meaning breath or animal spirits, ran through the body like neural signals.

Disease, he theorized, is at least in part due to a loss of balance in the body. He postulated a set of opposites, especially hot and cold, wet and dry, and bitter and sweet, that we need to balance in order to maintain health, by controlling our temperature, nutrition, and so on.

Hippocrates (b. 460 bc) of Cos in Asia Minor is better known. He was an Asclepiad – i.e. a member of the medical guild – and is the originator of the Hippocratic Oath. (But note: Contrary to popular belief, few if any doctors are required to take this or any other oath!) Despite his background, he preferred to avoid mystical interpretations and stick close to the empirical evidence. For example, in a treatise called "On the sacred disease" (meaning epilepsy), he dismissed the usual demonic-possession theory and suggested that it was a hereditary disease of the brain.

He is also known for his theory of humors. According to Greek tradition, there are four basic substances: earth, water, air, and fire. Each of these has a corresponding "humor" or biological liquid in the body: black bile, phlegm, blood, and yellow bile, in that order.

These humors, just like the four basic substances, vary along two dimensions: hot or cold, and wet or dry, like this...

          wet            dry
hot       air/blood      fire/yellow bile
cold      water/phlegm   earth/black bile

As Alcmaeon had said, the task of the physician is to restore the balance whenever the relative proportions of these humors are disturbed. Hippocrates also noted some emotional connections to these humors.

It should be noted, despite the odd humor theory, that Hippocrates and with him Plato correctly recognized the significance of the brain. A bit later, around 280 bc, Erasistratus of Chios dissected the brain and differentiated the various parts.

For the most part, of course, medicine in these centuries, and for many centuries to come, consisted of a blend of first aid – the setting of bones, for example – and herbal remedies, plus a considerable amount of praying to the gods for miraculous intervention!

In the Roman Empire, another physician gained fame that would last well into the Middle Ages: Galen was born 130 ad in Pergamon in Asia Minor – a major center of learning at the time. He went to Alexandria – THE center of learning – to study anatomy. In the Roman Empire, dissection of humans was not allowed – based, of course, on superstitious fear of retribution, not on any feelings of human dignity! So Galen studied the great apes instead.

At the age of 28, he returned home for a while to serve as surgeon to the gladiators. His fame spread, and he went to Rome.

In addition to a great deal of fairly decent, concrete advice, he theorized that all life is based on pneuma or spirit. Plants had natural spirit, which causes growth. Animals have vital spirit, which is responsible for movement. And human beings have animal spirit – from the word anima, meaning soul – which is responsible for thought.

He believed that cerebrospinal fluid was the animal spirit, and noted that it was to be found in the cerebral ventricles of the brain as well as the spinal cord. He believed it traveled out through the nerves to the muscles, as well as in from the sensory organs. Not bad.

It was Galen who added the idea of temperaments to Hippocrates’ four humors:

Blood         sanguine, cheerful
Phlegm        phlegmatic, sluggish
Yellow bile   choleric, angry
Black bile    melancholy, sad

Note how these words have come down to us. Note also how we use terms like "he is in a good humor," "he has a bad temper" (as in temperature), "he has a dry wit" (referring to the wet-dry dimension), and "he is a hot-head" (the cool-warm dimension). Imbalances among these psychological states, he believed, were one more cause for diseases. Of course, this is the first known personality typology! It had some influence on people as varied as Alfred Adler, Ivan Pavlov, and Hans Eysenck.

The rebirth of medicine

It is some time before we again see real progress in medicine and physiology. In 1316, Mondino de Luzzi came out with the first European textbook on anatomy, appropriately called Anatomia. Early in the 1500's, Da Vinci, naturally, plays a part with numerous drawings of skulls and brains, and even a wax casting of the ventricles. In 1561, Gabriele Fallopio published Observationes Anatomicae, wherein he describes, among many other things, the cranial nerves and, of course, the fallopian tubes.

Real progress had to wait for the invention of the microscope by Zacharias Jansen of Middelburg, Holland, in 1595 (or by his father, Hans). It would be refined by Antonie van Leeuwenhoek in Holland, Galileo in Italy, and Robert Hooke in England.

(Soon afterwards, in 1608, a colleague of Zacharias Jansen in Middelburg, a German by the name of Hans Lippershey, invented the telescope.)

Another major event was William Harvey’s (1578-1657) explanation of the circulation of the blood in 1628.


Most physicians, still using Galen’s text, believed that the blood ebbed and flowed like a tide through the whole body!

Centers of medical education developed in the universities at Padua, Italy and Leyden, Holland. Here, students studied anatomy, did post-mortems, and even dabbled in what we would now call pathology. They performed careful case-studies, with detailed measurements.

Neurophysiology developed in parallel to all the other medical and physiological developments. We could point to Thomas Willis’s anatomical description of the brain in 1664 as the first major step. His book was illustrated by Christopher Wren, the famous English artist and architect. Willis coined the term neurology in 1681.

A very significant contributor to the development of our understanding of the brain was none other than our old friend René Descartes. He postulated a dualistic system, with a mind/soul interacting with the brain/body by means of animal spirits (pneuma). The will (an aspect of our souls) enters the brain as animal spirits via the pineal gland, interacts with the organization of nerves that represent established habits, courses through the nerves (viewed as tiny tubes) to the muscles, causing them to contract and so produce a behavior!

Likewise, actions upon the sensory neurons cause increases in pressure on the animal spirits, which course through the nerves to the brain, influencing the structure of the brain by repetition, as well as passing on to the soul as perceptions.

Sometimes, the actions of the senses led to rather immediate responses by the muscles. These would be called reflexes by Descartes' countryman, Jean Astruc, and were defined as cycles of action that do not require the intervention of the mind or soul. Descartes did include far more complex behavior as reflexes than we would today.

Passions (roughly, emotions) also come from outside the body, essentially as sensations. They lead to a variety of physiological changes as well as reflex actions: We see a bear, we run! In animals, these passions are just sensations and reflexes. We, however, experience them with our mind/soul as wonder, love, hate, desire, joy, and sadness, as well as hundreds of combinations.

Descartes' ideas, minus the soul, would be promoted by Julien Offray de la Mettrie (1709-1751) in a landmark book called Man a Machine (1748). Robert Whytt (1714-1766) would later lay down the neurological basics of the reflex, and introduce the terms stimulus and response. In 1791, Luigi Galvani clinched these concepts with his famous experiments involving the electrical stimulation of frogs' nerves.

About 1721, Lady Mary Montagu introduced a strange medical practice she had seen while visiting in Turkey: inoculation. Instead of letting a full-blown case of smallpox damage their lovely skin, young women had pus from someone with a mild case of smallpox injected just under the skin. (Don't laugh: Today, people have themselves injected with the poison botox to erase wrinkles!) Edward Jenner later began inoculating people against smallpox by vaccinating them with cowpox material. The antibodies produced made one immune to smallpox as well as further cases of cowpox.


The 1800's

Medicine got its greatest boost in the 1800’s, especially after Louis Pasteur (1822-1895) came up with the theory that diseases were caused by micro-organisms. The new field of bacteriology continued with Pasteur’s friend Joseph Lister (1827-1912), who introduced the novel idea of antiseptic conditions in surgery – especially washing one’s hands!

Charles Bell (1774-1855) and François Magendie (1783-1855) independently clarified the distinction between sensory and motor nerves. They noted that sensory fibers enter the posterior roots of the spinal cord, and motor fibers leave the anterior roots. Bell is also the first person to describe the facial paralysis we now call Bell's palsy. And Magendie was the first to test the cerebellum's functions.

Franz Joseph Gall (1758-1828) of Vienna and, later, Paris, studied the shapes of skulls and concluded that the various bumps and depressions in each person's head related to certain psychological and personality characteristics. This would become very popular as phrenology, even though serious scientists such as Bell and Flourens thought it absurd. (Please don't misunderstand: There is little, if any, truth to this map!)

Marie-Jean-Pierre Flourens (1794-1867) concluded that the cerebrum was in fact responsible for thought and will, and that it operates holistically – not as Gall would have it! He noted that the other parts – cerebellum, medulla, etc. – had different functions, but that each also works holistically within itself. It is also Flourens who introduces ablation as a way of studying the connection between the brain and behavior.

However, things just never seem to be that simple. Paul Broca (1824-1880), a French surgeon, had a patient who lost the power of speech due to a lesion in what is now called Broca's Area. Another physician, Carl Wernicke, published a book on aphasia in 1874. He, of course, discovered the significance of Wernicke's area.

In 1870, two researchers, Gustav Fritsch and Eduard Hitzig, used direct electrical stimulation of the brain in a dog to discover, among other things, the motor and sensory cortices. Four years later, Robert Bartholow did the same with a human brain. Their work established that there is indeed some localization of function – it just doesn't have anything to do with bumps on the head.

Johannes Müller (1801-1858), working in Berlin, developed the doctrine of specific energy of nerves. Each nerve, when stimulated, leads to only one sensory experience, even if it is stimulated in another manner than usual. A simple example is the light flashes you see when you press against your eyeballs! This (I think unfortunately) led to increased belief in indirect realism – i.e. that we don’t actually experience the world directly.

Hermann von Helmholtz

Hermann von Helmholtz is arguably the most famous German scientist of the 19th century. He was born in 1821 in Potsdam, Germany, to Caroline and August Helmholtz. His father, a teacher as well as an officer in the Prussian army, began schooling young Hermann at home because of health problems.

He did attend Gymnasium from the ages of nine to 17. He wanted to study physics, but entered medical school in Berlin in 1838. His parents could not afford to send him without the scholarship given to medical students who promised to serve in the army after graduating.

Helmholtz befriended several other young men – including Emil Du Bois-Reymond and Ernst Brücke – who were students of Johannes Müller at the nearby University of Berlin. These students, in contrast to their professor, swore a solemn oath to avoid vitalism, the belief that there was something unique about living, as opposed to non-living, matter: "No other forces than common physical chemical ones are active within the organism." Helmholtz adopted their position as well.

In 1842, he became an army surgeon at Potsdam, and continued studying math and physics on his own. In 1847, he read a paper at the Physical Society of Berlin on the conservation of energy. This alone would have won him an honored place in history!

Soon after, he became an associate professor of physiology at Königsberg, and married. During this period of his life, he measured the speed of the neural impulse. Previously, it had been thought to be either infinite or the speed of light. He found it to be a paltry 90 feet per second. This put neurological activity well within the limits of ordinary physical and chemical sciences!
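The logic of the measurement is worth spelling out, because it is so simple: stimulate the nerve at two points lying at different distances from the muscle, record the two response delays, and divide the extra distance by the extra time. Here is a rough sketch in Python, with invented numbers – Helmholtz actually worked with a frog's nerve-muscle preparation and a mechanical recording device:

    # Conduction velocity = extra distance / extra delay between two
    # stimulation sites. All numbers below are invented for illustration.

    near_distance_m = 0.03   # stimulation site 3 cm from the muscle
    far_distance_m = 0.06    # stimulation site 6 cm from the muscle
    near_latency_s = 0.0030  # hypothetical measured response delays
    far_latency_s = 0.0041

    velocity = (far_distance_m - near_distance_m) / (far_latency_s - near_latency_s)
    print(round(velocity, 1), "m/s")         # about 27 m/s
    print(round(velocity / 0.3048), "ft/s")  # about 90 ft/s – the "paltry" figure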

Along the way, in 1851, he invented the ophthalmoscope – the device doctors use to look into your eye.

In 1855, he moved to Bonn to be professor of anatomy and physiology. Here he began his research into sight and hearing. In 1856, he published the first of three volumes called the Handbook of Physiological Optics.

He moved once again in 1858, this time to Heidelberg as professor of physiology. During this period, his wife died, and he later married a young socialite. His philosophical work focused on epistemology, and he continued his research on sight and hearing. His explanation of color vision – that it is based on three cones sensitive to red, green, and violet – is still remembered as the Young-Helmholtz theory. He became quite famous.

In 1870, he was offered the chair in physics (his first love) at the University of Berlin. In addition to a huge salary, he was offered living quarters and a new Institute of Physics.

He published a number of papers on geometry, especially the non-Euclidean kind that would be so important to people like Einstein in the twentieth century. His main focus was physics, of course, and one of his prize students was Heinrich Hertz, who was the first person to actually generate radio waves in 1888.

Helmholtz traveled to the US in 1893 as the German representative to the Chicago World's Fair. A bad fall aboard ship put his health in serious jeopardy. He died of a cerebral hemorrhage in September of 1894.


The Hippocratic Oath *

I swear by Apollo the physician, and Aesculapius, Hygeia and Panacea and all the gods and goddesses, that, according to my ability and judgement, I will keep this Oath and this covenant.

To reckon him who taught me this Art equally dear to me as my parents, to share my substance with him, and relieve his necessities if required; to look upon his offspring on the same footing as my own brothers, and to teach them this Art, if they shall wish to learn it, without fee or stipulation; and that by precept, lecture, and every other mode of instruction, I will impart a knowledge of the Art to my own sons, and those of my teachers, and to disciples who have signed the covenant and have taken an oath according to the law of medicine, but no one else.

I will follow that system of regimen which, according to my ability and judgment, I consider for the benefit of my patients, and abstain from whatever is deleterious and mischievous. I will give no deadly medicine to anyone if asked, nor suggest any such counsel; and in like manner I will not give to a woman an abortive remedy. With purity and with holiness I will pass my life and practise my Art.

I will not cut persons labouring under the stone, but will leave this to be done by such men as are practitioners of this work. Into whatever houses I enter, I will go into them for the benefit of the sick, and will abstain from every voluntary act of mischief and corruption; and, further, from the seduction of females or males, of freemen and slaves.

Whatever, in connection with my professional practice, or not in connection with it, I see or hear, in the life of men, which ought not to be spoken of abroad, I will not divulge, as reckoning that all such should be kept secret. While I continue to keep this Oath unviolated, may it be granted to me to enjoy life and practice of the Art, respected by all men, in all times. But should I trespass and violate this Oath, may the reverse be my lot.

* Source: http://www.usmedstudents.com/links/hippocraticoath.htm


Phrenology Diagram

For much more on phrenology, go to http://www.bc.edu/bc_org/avp/cas/fnart/phrenology/


A Brief History of Psychopharmacology

The Ancient World

Drugs and medicines have always been with us. Where there were plants with psychoactive properties, there were people willing to use them, for pleasure or relief, or to kill.

Recorded history is filled with descriptions of potent psychopharmaceuticals, but some have been outstanding. Alcohol has been nearly universal in use, and was already presenting itself as a problem among ancient Greeks and Romans. There are records of cannabis use in the ancient Middle East. Opium was known to the ancients, but seems to have been restricted to medicinal use. Hemlock was certainly known – Socrates met his death with a cup of hemlock.

More exotic substances were also available. An extract of the nightshade or belladonna plant called atropine was used everywhere from Rome to India as a poison – and as a cosmetic device: women sometimes put a drop of weak solution in their eyes to dilate their pupils! It is still used for the same reason today by eye doctors.

Another favorite was the extract of the foxglove plant, called digitalis. A powerful poison, it was also used to treat various ailments.

And mushrooms provided many of our ancestors with interesting hallucinogenic experiences (and serious illnesses!). Some believe that the holy drink of the ancient Aryans mentioned in the Vedas – soma – was a concoction involving mushrooms.

The Middle Ages

Alcohol continued to be used with great gusto during the Middle Ages in Europe. Around 1250, Europeans developed the process of distillation and added brandy and other liquors to the already popular wine and beer. The generic term for these distilled products was "the water of life" – aqua vitae.

Early in the Middle Ages, Arab traders and warriors introduced the use of the opium poppy to India and China. In China, it was used primarily as a medicine. But in India, it became a widespread habit of the rich, and soldiers used it to bolster their fighting spirit. At this time, opium was ingested primarily as a drink; sometimes it was eaten.

The Age of Exploration

By the sixteenth century, alcohol had developed into a serious social problem. Worthies from Martin Luther to King James I of England condemned drunkenness. And yet society at large continued to see alcohol as a gift from God. Attempts to control its use invariably failed, and authorities were limited to regulating and taxing its sale.

Around 1650, a new leap was taken by the Dutch in the form of inexpensive distilled grain flavored with the berries of the juniper bush: genever or, in English, gin. It was an immediate success in England as well.

But new forms of psychoactive substances were pouring in from all over the world. Coffee, for example, was introduced into Europe from Arabia, where they had invented coffee roasting centuries before. Although Moslem religious figures condemned it, it was so popular among Moslems as a substitute for alcohol that it was dubbed "the wine of the Arabs."


Coffee was considered by Europeans and Arabs alike as healthful and therapeutic. It also wound up being the focal point of a new social institution, the coffee house or café. It was particularly praised as the long-sought substitute for the evils of alcohol.

In the latter part of this era, the East India Company and other trading companies began importing tea from China and India. It, too, was praised as a medicinal drink, but would not compete with coffee for some time to come.

One of the first things that Columbus and his emulators discovered, after they discovered America itself, was tobacco. The first seeds were brought to Europe by a French adventurer named André Thevet. It was deemed a potent medicine, good for a great number of ailments, especially those involving the lungs, by Jean Nicot of France – from whose name we get nicotine.

Tobacco seeds came to England ten years later, and spread throughout the upper classes through the salesmanship of a certain Sir Walter Raleigh. It was praised as a panacea, and became a major crop for settlers in Virginia and other New World locales. In an effort to control its use, it was heavily taxed.

Smoking also spread throughout Asia, from Turkey to China. The response was far more negative than in Europe: Selling tobacco was punishable by decapitation in China, for example, and carried the death penalty in the Ottoman Empire. In Russia, one could be tortured and exiled for using it. And the pope made excommunication the punishment for clergy who took up the habit.

None of this, of course, actually did any good.

Another major drug to enter the Western arena in this period is coca. Coca leaves had been chewed for ages in South America, especially among the Incas. After Pizarro destroyed the Inca Empire in 1533, a Spanish adventurer named Monardes brought the plant to Europe, but it failed to catch on – at this point!

The 1800s

In 1859, the alkaloid cocaine was isolated from the coca leaf, and Dr. Paolo Mantegazza wrote about its wonderful powers to combat fatigue, depression, and impotence. A few decades later, a Viennese physician by the name of Sigmund Freud sang its praises as an anesthetic and a restorative. With these and many other supporters, cocaine became quite popular. It even became a part of the formula for a popular tonic in the US known as Coca-Cola. Until 1903, "coke" contained 60 milligrams of cocaine per 8 ounce serving! After causing a number of deaths by overdose, cocaine was outlawed in 1914.

A more serious issue in the 1800s was opium. In 1820, the Chinese, in an effort to stop the spread of opium addiction, prohibited the importation of opium. The British – who seem to play the part of the culprit in many of these situations – actually declared war on China in order to protect their precious opium trade. They ended up with their market intact and a piece of China called Hong Kong.

The use of opium was recommended by the medical profession in Europe and America, and few challenged them. The problem was exacerbated by a number of novelties: Friedrich Sertürner's discovery of morphine, an opium derivative, in 1803; the practice of smoking opium rather than drinking or eating it; and the invention of the hypodermic syringe in 1853.

Opium and its derivatives began to receive some well-deserved negative attention when the British author De Quincey wrote his best-selling Confessions of an English Opium Eater in 1822. By this time, opium was available in the form of hundreds of different non-prescription medicines, and was quite popular among both the upper and working classes.

In 1874, heroin was synthesized from morphine, and was touted as a less dangerous form than opium or morphine itself. The name, in fact, refers to its supposed potential as the hero of medicines. In 1898, the Bayer company began marketing heroin.


Several other drugs became available to the European and American public in the 1800s. For one, laborers from India brought cannabis to Jamaica in the form of ganja. The Indian Hemp Drugs Commission put its stamp of approval on cannabis, saying its use was accompanied by practically no negative consequences, and the drug spread among the Jamaican lower class. It was available in the US and Europe, but it would not become popular until the next century.

Amphetamine, the first major synthetic drug, was discovered in 1887. Its use as a stimulant quickly became widespread. It was used in World War II to help energize soldiers and industrial workers alike.

Ether had been discovered earlier, in the 1700s. When its effectiveness as an anesthetic became known in the 1840s, inhaling it or mixing a few drops in water became popular among upper class youth in the US and Europe. It would later spread to the poor of Ireland and other countries as a cheap alternative to alcohol.

And last, but not least, Claude Bernard began in 1856 to experiment with a poison from the Amazon jungles of South America called curare.

Psychedelic drugs

Psychedelic drugs or hallucinogens have been with us since ancient times, as mentioned above. But it wasn't until the 1900's – especially the 1960's – that they became as popular as they are now. Here is a partial list:

Scopolamine, an anticholinergic drug, is found in Atropa belladonna (belladonna or deadly nightshade), Datura stramonium (jimsonweed), and Mandragora officinarum (mandrake).

A large number of modern drugs have catecholamine-like effects. The oldest is peyote (from the Lophophora williamsii plant), used by Mexican Indians. Mescaline is derived from peyote. There are two drugs, myristicin and elemicin, which are found in nutmeg and mace. And there are the methamphetamines with their endless initials (DOM, MDA, DMA, TMA, MDE, and MDMA – the last best known as ecstasy).

Arguably the most famous hallucinogens are the serotonin-like drugs. Some have ancient roots: Psilocybin and psilocin are derived from the mushroom Psilocybe mexicana; Ololiuqui was used by Central and South American Indians, and is better known as morning glory seeds; Harmine comes from the Middle Eastern plant called Peganum harmala; And bufotenine comes from the skin secretions of the South American bufo toad!

In 1938, however, all these began to pale in comparison with the discovery by one Albert Hofmann, a Swiss chemist, of a derivative of ergot (a rye fungus), which he called lysergic acid diethylamide – LSD.

And finally, we have the very dangerous psychedelic anesthetic drugs such as phencyclidine, discovered in 1956, and better known as PCP or angel dust.


Charles Darwin and Evolution


Charles Robert Darwin (1809-1882) *

Charles Darwin was born in Shrewsbury, England, on February 12, 1809. His father was Robert Waring Darwin, a physician and son of the famous Erasmus Darwin, also a physician, as well as a respected writer and naturalist. His mother was Susannah Wedgwood Darwin. She died when Charles was eight.

Charles was educated in the local school, taught by Dr. Samuel Butler. In 1825, he went to Edinburgh to start studying medicine, but he soon realized that he did not have the stomach for it! So he switched to Cambridge, ostensibly to become a clergyman. He was actually more interested in entomology – especially beetles – and in hunting. He graduated from Christ’s College in 1831.

It is said that even when he was a young man, he had a patient and open mind, spending many hours collecting specimens of one sort or another and pondering over new ideas. The idea of evolution was very much in the air in those times: It was increasingly clear to naturalists that species change and have been changing for many millennia. The question was, how did this happen?

One of his mentors, John Henslow, encouraged him to apply for the (unpaid!) position of naturalist on a surveying expedition on the now-famous vessel, the Beagle, under the command of Capt. Robert Fitz-Roy. Charles left England for the first time in his life on December 27, 1831. He wouldn't return until October 2, 1836!

Most of the ship’s time was spent surveying the coasts of South America and nearby islands, but it would also visit various Pacific islands, New Zealand, and Australia. It was the Galapagos Islands that most impressed him. There he found that finches had evolved a variety of beaks – each suited to a particular food source. Natural variation had somehow been selected to fit the ecological niches available on the tiny islands.

Upon returning, Darwin wrote several books based on his surveys on geology and the plant and animal species he had observed and collected. He also published his journal as Journal of a Naturalist. He notes that he was most impressed by the ways similar animals adapted to different ecologies.

From early on, Darwin recognized that selection was the principle men used so successfully when breeding animals. What he needed now was an idea as to how nature could perform that task without the benefit of intelligence!

In 1838, he read a book by Malthus called Population. Malthus introduced the idea that competition over limited resources would, in nature, keep populations stable. He also warned that human populations, when straining resources, would suffer as well!

On January 29, 1839, Darwin married Emma Wedgwood, a cousin. They lived in London for a few years, then settled in the village of Down, 15 miles outside London, where they lived the rest of their lives. Darwin began suffering from an illness he had probably contracted from an insect bite in the Andes many years before. Darwin's son, Francis, later could not say enough about his mother's dedication to his father's well-being. Without her, he would have been considerably less productive. They would go on to have two daughters and five sons.

Darwin wrote a sketch of his theory in 1842. In 1844, he wrote in a letter "At last gleams of light have come, and I am almost convinced (quite contrary to the opinion I started with) that species are not (it is like confessing a murder) immutable."

* A major resource for Darwin's biography was the 11th edition (1910/1911) of the Encyclopedia Britannica, available online at http://www.gennet.org/darwin.htm


He was about half done with a full exposition of his ideas when he received an essay from A. R. Wallace, with a request for comments. The essay outlined a theory of natural selection! Wallace, too, had read Malthus, and in 1858, while sick from fever, had the whole idea come to him in one flash. Darwin, in his reluctance, had postponed revealing his ideas to the scientific public for 20 years!

Darwin forwarded the essay to his friend, Sir Charles Lyell of the Geological Society, as Wallace had requested. Lyell sent the essay on, with an essay by Darwin, for presentation at a scientific conference.

The point they jointly made was clear: Just as men can exaggerate one or another minor variation by selectively breeding dogs or cattle, so nature selects similar variations – by only permitting the most successful variations to survive and reproduce in the struggle over limited resources. Although the changes would be slight and slow, the millennia would permit the obvious diversity of nature! Darwin named this natural selection.

In 1859, Darwin finally published his master work, On the Origin of Species by Means of Natural Selection. The book was an instant success. There was also, of course, a great deal of debate – mostly concerning the contrast with traditional religious explanations of the natural world.

Natural selection was often confused with an earlier idea of the French naturalist Lamarck. He suggested that characteristics acquired during an animal’s life were passed on to its offspring. The famous example is how the constant stretching of the neck over many generations explains the giraffe’s unlikely physique. This theory – Lamarckianism – was natural selection’s major competitor for decades to come!

In 1868, he published The Variation of Animals and Plants under Domestication. Then, in 1871, he came out with The Descent of Man, and Selection in Relation to Sex. This was really two books in one. The second part is about sexual selection. This is what accounts for, for example, the bright colors of many male birds: Both the males' coloring and the attraction to the coloring on the part of the females during courtship are selected for because these variations benefit the offspring.

The Descent of Man portion of the book is a brief introduction to the idea that we, too, are the results of natural selection. This part would lead to a lot of heated arguments!

In 1872, The Expression of the Emotions in Man and Animals was published. This time, Darwin talks about the evolution of the signals that animals use to communicate – and relates those signals to human emotional expression. This is the first step towards what we now call sociobiology (and evolutionary psychology).

In addition to these influential books, Darwin also enjoyed studying and writing about plants. In 1862, he wrote Fertilization of Orchids. In 1875, he came out with both Climbing Plants and Insectivorous Plants. In 1877 came Different Forms of Flowers on Plants of the Same Species. In 1880, he wrote, with his son Francis, The Power of Movement in Plants. And in 1881, he published the famous Formation of Vegetable Mould through the Action of Worms!

Charles Darwin died April 19, 1882. He was buried in Westminster Abbey. He was apparently a kind and gentle man, beloved by his family and friends alike. Outside of his voyage on the Beagle, he rarely left his home in Down. Reluctantly, he surrendered his religious beliefs and settled into an agnosticism that did not prevent him from participating with his parish in charitable works.

Alfred Russel Wallace

Alfred Wallace was born in 1823 in the village of Usk in Monmouthshire, England. His options were limited as his father died when Alfred was still a young man. So, taking advantage of a natural talent, he became a drawing teacher.


He went on an expedition to South America with his friend Henry Bates. He spent four years in the jungles of Brazil. On his way home, the ship caught fire and sank – with four years of notes and specimen collections. The crew and passengers were fortunately rescued by a passing vessel. These adventures were the basis of Travels on the Amazon and Rio Negro, published in 1853.

Soon afterwards, he left on a second voyage, this time to Malaysia. This one would last eight years! It was during this expedition that he, sick with fever, had the idea for natural selection, and in two days, wrote an essay on the topic and sent it off to Darwin. After his return from Malaysia, he published The Malay Archipelago, a detailed journal on the plants, animals, and people of the islands.

He died November 7, 1913. Although offered a place at Westminster Abbey, his family preferred that he be buried near his home. His grave is appropriately marked by a fossilized tree trunk.

Thomas Henry Huxley

Thomas Henry Huxley was born May 4, 1825, the son of George Huxley, a schoolmaster, and Rachel Huxley. He received two years of formal education at his dad's school, and was for the rest self-educated.

Although he was raised Anglican, he became interested in Unitarianism and its naturalistic thinking. This interest led him to begin studying biology with his brother-in-law. His studies led to a scholarship to Charing Cross Hospital in London, where he won awards in physiology and organic chemistry.

He served as assistant surgeon on the HMS Rattlesnake, which was surveying the waters around Australia and New Guinea. To pass the time, he began investigating the various forms of sea life.

Huxley met and fell in love with Nettie Heathorn in Sydney in 1847. He then continued his biological research in that part of the world. After returning to England, he was elected to the Royal Society in 1850, but could not find an academic position. Depressed and angry, he began taking on controversial stands – including denial of the Christian version of geology.

In 1854, he began teaching at the Government School of Mines. Finally established as a gentleman, he brought the patient Nettie to England and they married in 1855.

Huxley met Darwin in 1856, and they developed a long and close friendship. He took it upon himself to begin a campaign for Darwin's theory, which earned him the nickname "Darwin's bulldog." In particular, he fought against the church and for the concept of human evolution from apes. All the while he was a great promoter of science in general and scientific education in particular.

Huxley was responsible for a great deal of research, from his original work on sea creatures to later work on the evolution of vertebrates. He also came up with the idea of agnosticism – by which he meant the belief that ultimate reality would always be beyond human grasp. And he is responsible for the popular metaphysical point of view known as epiphenomenalism.

In 1882, his daughter went mad. She would die five years later under the care of Jean-Martin Charcot, the great French neurologist. Huxley became very depressed and retired from his professorship. For a while, he promoted Social Darwinism (see below), but backed away years later to say, with Darwin, that humanity is best served by promoting ethics, rather than instincts.

He died of a heart attack during a speech, June 29, 1895.


Herbert Spencer

Herbert Spencer was born April 27, 1820, in Derby, England. His father was a schoolmaster, and both his parents were "dissenters" (i.e. religious non-conformists). Spencer was clearly gifted and was mostly self-educated.

An excellent writer, he wrote articles on social issues for various magazines of the day and even became editor of The Economist for several years. In 1855, he published The Principles of Psychology. This became part of a series of books, which he called The Synthetic Philosophy, and included biology and sociology as well as psychology.

Originally believing in the inheritance of acquired characteristics (Lamarckianism), he became a follower of Darwin's theory. It was, in fact, Spencer who coined the phrase "survival of the fittest." But he also transformed Darwin's theory into a social theory that encouraged extreme individualism and laissez-faire economic policies, called Social Darwinism.

Basically, survival of the fittest was to apply to people competing against people as well, and he implied that it was something of a social duty to accept the fact that some would be rich and some poor – and that the consequences of poverty should not be interfered with. Even whole societies – such as England – were engaged in a struggle for survival that did not allow for weakness of will or sentimentality.

Social Darwinism is not something Darwin would have approved of. It has in it the fallacy of false analogy: Human society is not a neat parallel to the non-human biological world. Unfortunately, Social Darwinism seems to be here to stay, and can be found within Fascist, Conservative, and Libertarian political agendas, and in personal philosophies such as that of Ayn Rand.

Spencer is, nevertheless, considered one of the great productive thinkers of his day. He died Dec. 8, 1903, in Brighton, Sussex.


A Selection from The Descent of Man by Charles Darwin*

The main conclusion here arrived at, and now held by many naturalists who are well competent to form a sound judgment, is that man is descended from some less highly organised form. The grounds upon which this conclusion rests will never be shaken, for the close similarity between man and the lower animals in embryonic development, as well as in innumerable points of structure and constitution, both of high and of the most trifling importance, - the rudiments which he retains, and the abnormal reversions to which he is occasionally liable, - are facts which cannot be disputed. They have long been known, but until recently they told us nothing with respect to the origin of man. Now when viewed by the light of our knowledge of the whole organic world their meaning is unmistakable. The great principle of evolution stands up clear and firm, when these groups of facts are considered in connection with others, such as the mutual affinities of the members of the same group, their geographical distribution in past and present times, and their geological succession. It is incredible that all these facts should speak falsely. He who is not content to look, like a savage, at the phenomena of nature as disconnected, cannot any longer believe that man is the work of a separate act of creation. He will be forced to admit that the close resemblance of the embryo of man to that, for instance, of a dog - the construction of his skull, limbs and whole frame on the same plan with that of other mammals, independently of the uses to which the parts may be put - the occasional re-appearance of various structures, for instance of several muscles, which man does not normally possess, but which are common to the Quadrumana - and a crowd of analogous facts - all point in the plainest manner to the conclusion that man is the co-descendant with other mammals of a common progenitor.

We have seen that man incessantly presents individual differences in all parts of his body and in his mental faculties. These differences or variations seem to be induced by the same general causes, and to obey the same laws as with the lower animals. In both cases similar laws of inheritance prevail. Man tends to increase at a greater rate than his means of subsistence; consequently he is occasionally subjected to a severe struggle for existence, and natural selection will have effected whatever lies within its scope. A succession of strongly-marked variations of a similar nature is by no means requisite; slight fluctuating differences in the individual suffice for the work of natural selection; not that we have any reason to suppose that in the same species, all parts of the organisation tend to vary to the same degree.

By considering the embryological structure of man, - the homologies which he presents with the lower animals, - the rudiments which he retains, - and the reversions to which he is liable, we can partly recall in imagination the former condition of our early progenitors; and can approximately place them in their proper place in the zoological series. We thus learn that man is descended from a hairy, tailed quadruped, probably arboreal in its habits, and an inhabitant of the Old World. This creature, if its whole structure had been examined by a naturalist, would have been classed amongst the Quadrumana, as surely as the still more ancient progenitor of the Old and New World monkeys. The Quadrumana and all the higher mammals are probably derived from an ancient marsupial animal, and this through a long line of diversified forms, from some amphibian-like creature, and this again from some fish-like animal. In the dim obscurity of the past we can see that the early progenitor of all the Vertebrata must have been an aquatic animal, provided with branchiæ, with the two sexes united in the same individual, and with the most important organs of the body (such as the brain and heart) imperfectly or not at all developed. This animal seems to have been more like the larvæ of the existing marine Ascidians than any other known form.

The high standard of our intellectual powers and moral disposition is the greatest difficulty which presents itself, after we have been driven to this conclusion on the origin of man. But every one who admits the principle of evolution, must see that the mental powers of the higher animals, which are the same in kind with those of man, though so different in degree, are capable of advancement....

The moral nature of man has reached its present standard, partly through the advancement of his reasoning powers and consequently of a just public opinion, but especially from his sympathies having been rendered more tender and widely diffused through the effects of habit, example, instruction, and reflection. It is not improbable that after long practice virtuous tendencies may be inherited. With the more civilised races, the conviction of the existence of an all-seeing Deity has had a potent influence on the advance of morality. Ultimately man does not accept the praise or blame of his fellows as his sole guide though few escape this influence, but his habitual convictions, controlled by reason, afford him the safest rule. His conscience then becomes the supreme judge and monitor. Nevertheless the first foundation or origin of the moral sense lies in the social instincts, including sympathy; and these instincts no doubt were primarily gained, as in the case of the lower animals, through natural selection.

The belief in God has often been advanced as not only the greatest but the most complete of all the distinctions between man and the lower animals. It is however impossible, as we have seen, to maintain that this belief is innate or instinctive in man. On the other hand a belief in all-pervading spiritual agencies seems to be universal, and apparently follows from a considerable advance in man's reason, and from a still greater advance in his faculties of imagination, curiosity and wonder. I am aware that the assumed instinctive belief in God has been used by many persons as an argument for His existence. But this is a rash argument, as we should thus be compelled to believe in the existence of many cruel and malignant spirits, only a little more powerful than man; for the belief in them is far more general than in a beneficent Deity. The idea of a universal and beneficent Creator does not seem to arise in the mind of man, until he has been elevated by long-continued culture....

I am aware that the conclusions arrived at in this work will be denounced by some as highly irreligious; but he who denounces them is bound to shew why it is more irreligious to explain the origin of man as a distinct species by descent from some lower form, through the laws of variation and natural selection, than to explain the birth of the individual through the laws of ordinary reproduction. The birth both of the species and of the individual are equally parts of that grand sequence of events, which our minds refuse to accept as the result of blind chance. The understanding revolts at such a conclusion, whether or not we are able to believe that every slight variation of structure, - the union of each pair in marriage, - the dissemination of each seed, - and other such events, have all been ordained for some special purpose.

Sexual selection has been treated at great length in this work, for, as I have attempted to shew, it has played an important part in the history of the organic world. I am aware that much remains doubtful, but I have endeavoured to give a fair view of the whole case. In the lower divisions of the animal kingdom, sexual selection seems to have done nothing: such animals are often affixed for life to the same spot, or have the sexes combined in the same individual, or what is still more important, their perceptive and intellectual faculties are not sufficiently advanced to allow of the feelings of love and jealousy, or of the exertion of choice. When, however, we come to the Arthropoda and Vertebrata, even to the lowest classes in these two great Sub-Kingdoms, sexual selection has effected much....

Sexual selection depends on the success of certain individuals over others of the same sex, in relation to the propagation of the species; whilst natural selection depends on the success of both sexes, at all ages, in relation to the general conditions of life. The sexual struggle is of two kinds; in the one it is between the individuals of the same sex, generally the males, in order to drive away or kill their rivals, the females remaining passive; whilst in the other, the struggle is likewise between the individuals of the same sex, in order to excite or charm those of the opposite sex, generally the females, which no longer remain passive, but select the more agreeable partners....

The main conclusion arrived at in this work, namely that man is descended from some lowly organised form, will, I regret to think, be highly distasteful to many. But there can hardly be a doubt that we are descended from barbarians. The astonishment which I felt on first seeing a party of Fuegians on a wild and broken shore will never be forgotten by me, for the reflection at once rushed into my mind - such were our ancestors. These men were absolutely naked and bedaubed with paint, their long hair was tangled, their mouths frothed with excitement, and their expression was wild, startled, and distrustful. They possessed hardly any arts, and like wild animals lived on what they could catch; they had no government, and were merciless to every one not of their own small tribe. He who has seen a savage in his native land will not feel much shame, if forced to acknowledge that the blood of some more humble creature flows in his veins. For my own part I would as soon be descended from that heroic little monkey, who braved his dreaded enemy in order to save the life of his keeper, or from that old baboon, who descending from the mountains, carried away in triumph his young comrade from a crowd of astonished dogs - as from a savage who delights to torture his enemies, offers up bloody sacrifices, practises infanticide without remorse, treats his wives like slaves, knows no decency, and is haunted by the grossest superstitions.

Man may be excused for feeling some pride at having risen, though not through his own exertions, to the very summit of the organic scale; and the fact of his having thus risen, instead of having been aboriginally placed there, may give him hope for a still higher destiny in the distant future. But we are not here concerned with hopes or fears, only with the truth as far as our reason permits us to discover it; and I have given the evidence to the best of my ability. We must, however, acknowledge, as it seems to me, that man with all his noble qualities, with sympathy which feels for the most debased, with benevolence which extends not only to other men but to the humblest living creature, with his god-like intellect which has penetrated into the movements and constitution of the solar system - with all these exalted powers - Man still bears in his bodily frame the indelible stamp of his lowly origin.

* From Charles Darwin, The Descent of Man and Selection in Relation to Sex (New York: Appleton and Co., 1883), pp. 7, 609, 612-614, 618-619. This text is part of the Internet Modern History Sourcebook, available at http://www.fordham.edu/halsall/mod/1871darwin.html. © Paul Halsall, Aug 1997.

Primal Patterns of Behavior

John and Mary: A typical American couple

John and Mary picked out a lot in suburbia (1) and had a nice two-story house built on it (2). They have since moved in and made it a real home, with their own furniture, decoration, landscaping, and so on (3). They keep their home neat and clean, and are especially proud of their fancy master bathroom, with jacuzzi (4). They put up a fence to keep their dog in and their neighbor's kids out, and have installed burglar and fire alarm systems (5). They even had a nice mailbox sign made up that says, in gold lettering, "The Smiths" (6). They like their town and spend some leisure time taking scenic drives (7) and frequenting their favorite "watering holes" (8).

Both John and Mary work (9) – they are "yuppies" – and although they like their work, they would have to confess that their true goal is to make a "killing" in the market (10). They bring home (11) quite a bit of money every week, and they have considerable investments and a sizable "nest egg" (12). Their jobs are demanding, the business world being highly competitive and "dog-eat-dog" (13). When they finalize a profitable deal, they like to celebrate afterwards (14). But they've also had their share of failures, and have had to skulk off with their "tails between their legs" (15).

They have quite a few friends at work (16) and like to get together (17). Most of these friends are people of their own status, because it's hard to be comfortable with someone who is your boss or who works for you (18). But they are polite people, and are always pleased to see an acquaintance (19). An attractive, well-groomed couple (20), John and Mary dated for about a year (21). They soon became a "serious item" and finally got married (22). They are thinking about having children (23), but are concerned that they will need to move if they want to provide their children with the best possible environment (24).

The Komodo Dragon

An eight-foot lizard with a brain the size of a walnut: Basic instincts

1. Selection and preparation of homesite

2. Establishment of territory

3. Marking of territory

4. Use of defecation posts

5. Patrolling territory

6. Ritualistic display in defense of territory, commonly involving the use of coloration and adornments

7. Trail making

8. Showing place preferences

9. Foraging


10. Hunting

11. Homing

12. Hoarding

13. Formalized intraspecific fighting in defense of territory

14. Triumphal display in successful defense

15. Assumption of distinctive postures and coloration in signalling surrender

16. Formation of social groups

17. Flocking

18. Establishment of social hierarchy by ritualistic display and other means

19. Greeting

20. Grooming

21. Courtship, with displays using coloration and adornments

22. Mating

23. Breeding and, in isolated instances, attending offspring

24. Migration

Is there a komodo dragon inside YOU?


Sociobiology

Ever since Darwin came out with his theory of evolution, people – including Darwin himself – have been speculating on how our social behaviors (and feelings, attitudes, and so on) might also be affected by evolution. After all, if the way our bodies look and work as biological creatures can be better understood through evolution, why not the things we do with those bodies?

The entomologist E. O. Wilson was the first to formalize the idea that social behavior could be explained evolutionarily, and he called his theory sociobiology. At first, it gained attention only in biological circles – and even there it had strong critics. When sociologists and psychologists caught wind of it, the controversy really got started. At that time, sociology was predominantly structural-functionalist, with a smattering of Marxists and feminists. Psychology was still dominated by behaviorist learning theory, with humanism starting to make some headway. Not one of these theories had much room for the idea that we, as human beings, could be so strongly determined by evolutionary biology!

Over time, Wilson's sociobiology found more and more supporters among biologists, psychologists, and even anthropologists. Only sociology has remained relatively unaffected.

Instinct

Let's begin with an example of instinctual behavior in animals: The three-spined stickleback is a one-inch long fish that one can find in the rivers and lakes of Europe. Springtime is, as you might expect, the mating season for the mighty stickleback and the perfect time to see instincts in action.

Certain changes occur in their appearance: The male, normally dull, becomes red below the midline. He stakes out a territory for himself, from which he will chase any similarly colored male, and builds a nest by depositing weeds in a small hollow and running through them repeatedly to make a tunnel. This is all quite built-in: males raised all alone will do the same. We find, in fact, that the male stickleback will, in the mating season, attempt to chase anything red from his territory (including the reflection of a red truck on the aquarium's glass).

But that's not the instinct of the moment. The female undergoes a transformation as well: She, normally dull like the male, becomes bloated by her many eggs and takes on a certain silvery glow that apparently no male stickleback can resist. When he sees a female, he will swim towards her in a zigzag pattern. She will respond by swimming towards him with her head held high. He responds by dashing towards his nest and indicating its entrance. She enters the nest, her head sticking out one end, her tail the other. He prods at the base of her tail with rhythmic thrusts. She releases her eggs and leaves the nest. He enters and fertilizes the eggs, and then, a thorough chauvinist, chases her away and waits for a new partner.

What you see working here is a series of sign stimuli and fixed actions: His zigzag dance is a response to her appearance and becomes a stimulus for her to follow, and so on. Perhaps I'm being perverse, but doesn't the stickleback's instinctive courtship remind you of some of our human courtship rituals? I'm not trying to say we are quite as mindless about it as the stickleback seems to be – just that some similar patterns may form a part of or basis for our more complex, learned behaviors.
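
The chain structure is easy to make explicit. Below is a minimal Python sketch – the sequence of steps is taken from the description above, but the representation (a list of actor/stimulus/action triples) is mine, purely for illustration:

    # Toy model: stickleback courtship as a chain of sign stimuli and
    # fixed actions, each response serving as the partner's next stimulus.
    COURTSHIP_CHAIN = [
        # (actor, sign stimulus perceived, fixed action produced)
        ("male",   "silvery, egg-swollen female", "zigzag dance"),
        ("female", "zigzag dance",                "approach, head held high"),
        ("male",   "head-up approach",            "dash to nest, show entrance"),
        ("female", "nest entrance indicated",     "enter nest"),
        ("male",   "female in nest",              "prod base of tail rhythmically"),
        ("female", "rhythmic prodding",           "release eggs and leave"),
        ("male",   "eggs in nest",                "fertilize eggs, chase female off"),
    ]

    for actor, stimulus, action in COURTSHIP_CHAIN:
        print(f"{actor}: sign stimulus '{stimulus}' -> fixed action '{action}'")

Notice that nothing in the chain requires foresight or learning: each animal only ever reacts to the signal in front of it.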

Ethologists – people who study animal behavior in natural settings – have been studying behaviors such as the sticklebacks' for over a century. One, Konrad Lorenz, developed a hydraulic model of how an instinct works. We have a certain amount of energy available for any specific instinctual system, as illustrated by a reservoir of water. There is a presumably neurological mechanism that allows the release of some or all of that energy in the presence of the appropriate sign stimulus: a faucet. There are further mechanisms – neurological, motor, hormonal – that translate the energy into specific fixed actions. Today, we might suggest that hydraulic energy is a poor metaphor and translate the whole system into an information-processing one – each era has its favorite metaphors. But the description still seems sound.
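
As a rough illustration of how the hydraulic metaphor works, here is a small Python sketch. It is only a caricature of Lorenz's model, with made-up numbers: action-specific energy accumulates over time, and the product of stored energy and stimulus strength opens the "faucet" – so a strong stimulus can trigger the fixed action early, while a full reservoir responds even to a weak one:

    class InstinctReservoir:
        """Caricature of Lorenz's hydraulic model (illustrative numbers only)."""
        def __init__(self, fill_rate=1.0, threshold=10.0):
            self.energy = 0.0           # action-specific energy (the water level)
            self.fill_rate = fill_rate  # how fast the reservoir refills
            self.threshold = threshold  # how hard the valve is to open

        def tick(self):
            self.energy += self.fill_rate   # the reservoir slowly fills over time

        def stimulus(self, strength):
            # Stored energy and stimulus strength jointly open the valve.
            if self.energy * strength >= self.threshold:
                released, self.energy = self.energy, 0.0  # the fixed action drains it
                return f"fixed action! (released {released:.0f} units)"
            return "no response"

    drive = InstinctReservoir()
    for day in range(1, 6):
        drive.tick()
        print(day, drive.stimulus(strength=3.0))  # fires once energy * 3 >= 10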

Does any of this apply to human courtship and sexual behavior? I leave it up to you. But what about other examples? Two possibilities stand out:

1. There are certain patterns of behavior found in most, if not all, animals, involving the promotion of oneself, the search for status or raw power, epitomized in aggression. Let's call this the assertive instinct.

2. There are other patterns of behavior found in, it seems, somewhat fewer species, involving care for someone other than oneself, epitomized in a mother's care for her babies. Let's call this the nurturant instinct.

Evolution

The basics of evolution are quite simple. First, all animals tend to over-reproduce, some having literally thousands of offspring in a lifetime. Yet populations of animals tend to remain quite stable over the generations. Obviously, some of these offspring aren't making it!

Second, there is quite a bit of variation within any species. Much of the variety is genetically based and passed on from one generation to another. Included in that variety are traits that help some individuals to survive and reproduce, and other traits that hinder them.

Put the two ideas together, and you have natural selection: Nature encourages the propagation of the positive traits and discourages the negative ones. As long as variety continues to be created by sexual recombination and mutation, and the resources for life remain limited, evolution will continue.

One sociobiologist, David Barash, suggests a guiding question for understanding the possible evolutionary roots of any behavior: "Why is sugar sweet?" – that is, why do we find it attractive? One hypothesis is that our ancestors ate fruit to meet their nutritional needs. Fruit is most nutritious when it is ripe. When fruit is ripe, it is loaded with sugars. Any ancestor who had a taste for sugar would be a little more likely to eat ripe fruit. His or her resulting good health would make him or her stronger and more attractive to potential mates. He or she might leave more offspring who, inheriting this taste for ripe fruit, would be more likely to survive to reproductive age, etc. A more general form of the guiding question is to ask of any motivated behavior: "How might that behavior have aided ancestral survival and/or reproduction?"
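
The "sugar is sweet" story can even be turned into a toy simulation. The following Python sketch is purely illustrative – every number in it is invented – but it shows the two ingredients above (over-reproduction plus heritable, survival-relevant variation) driving a heritable "taste for ripe fruit" upward over generations:

    import random

    def simulate(generations=50, capacity=200):
        # Each individual is reduced to one heritable trait: a taste
        # for ripe fruit, coded as a number between 0.0 and 1.0.
        pop = [random.random() for _ in range(capacity)]
        for _ in range(generations):
            offspring = []
            for taste in pop:
                for _ in range(3):                          # over-reproduction
                    child = min(1.0, max(0.0, taste + random.gauss(0, 0.05)))
                    # Survival chance rises with the trait: ripe fruit
                    # means better nutrition, health, and mating success.
                    if random.random() < 0.2 + 0.5 * child:
                        offspring.append(child)
            random.shuffle(offspring)
            pop = offspring[:capacity] or pop               # limited resources
        return sum(pop) / len(pop)

    print(f"mean taste after selection: {simulate():.2f}")  # drifts toward 1.0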

A curious point to make about the example used is that today we have refined sugar – something which was not available to our ancestors, but which we discovered and passed on to our descendants through learned culture. It is clear that today a great attraction to sugar no longer serves our survival and reproduction. But culture moves much more quickly than evolution: It took millions of years to evolve our healthy taste for sugar; it took only thousands of years to undermine it.

Attraction

Let's start by looking at mate selection. It is obvious that we are attracted to some people more than others. Sociobiologists have the same explanation for this as for everything else, based on the archetypal question "why is sugar sweet?" We should be sexually attracted to others whose characteristics would maximize our genetic success, that is, would give us many healthy, long-lived, fertile children.

We should find healthiness attractive and, conversely, illness unattractive. We should find "perfect" features attractive, and deformities unattractive. We should find vitality, strength, vigor attractive. We should find "averageness" attractive – not too short, not too tall, not too fat, not too thin.... Quasimodo, for all his decency, had a hard time getting dates.

We are also attracted to certain people for less "logical" reasons, such as the degree to which they have strong masculine or feminine physical – and behavioral – characteristics. Women prefer men who are taller, with broad shoulders, a square jaw.... Men prefer women who are shorter than themselves, softer, rounder....

These differences between the sexes are known as sexual dimorphism, and the process that leads to them is called sexual selection. Small functional differences between the sexes can become large nonfunctional ones over many generations. If female birds are instinctively inclined to prefer colorful males – perhaps because colorful males have served to distract predators from ancestral females and their chicks – then a more colorful male will have a better chance, and a female with a more intense attraction to color a better chance, and their offspring will inherit both the colors and the intense attraction to colors, and so on and so on... until you reach a point where the colors and the attraction are no longer a plus, but become a minus, as in the birds of paradise. Some males cannot even fly under the weight of all their plumage.

Human beings are only modestly dimorphic. But boy are we aware of the dimorphisms!

The dimorphism is also found in our behaviors. David Barash puts it so: "Males tend to be selected for salesmanship; females for sales resistance." Females have a great deal invested in any act of copulation: the limited number of offspring she can carry, the dangers of pregnancy and childbirth, the increased nutritional requirements, the danger from predators...all serve to make the choice of a mate an important consideration. Males, on the other hand, can and do walk away from the consequences of copulation. Note, for example, the tendency of male frogs to try to mate with wading boots: As long as some sperm gets to where it should, the male is doing alright.

So females tend to be more fussy about whom they have relations with. They are more sensitive to indications that a particular male will contribute to their genetic survival. One of the most obvious examples is the attention many female animals pay to the size and strength of males, and the development of specialized contests, such as those of antlered and horned animals, to demonstrate that strength.

There are less obvious things as well. In some animals, males have to show, not just strength, but the ability to provide. This is especially true in any species which has the male providing for the female during her pregnancy and lactation – like humans! Sociobiologists suggest that, while men find youth and physical form most attractive, women tend to look for indications of success, solvency, savoir-faire. It might not just be a cultural fluke that men bring flowers and candies, pay for dinner, and so forth.

Further, they suggest, women may find themselves more interested in the "mature" man, as he is more likely to have proven himself, and less interested in the "immature" man, who presents a certain risk. And women should be more likely to put up with polygyny (i.e. other wives) than men with polyandry (other husbands): Sharing a clearly successful man is better in some cases than having a failure all to yourself. And, lo and behold, polygyny is even more common than monogamy, while polyandry is found in perhaps two cultures (one in Tibet and the other in Africa), and in both it involves brothers "sharing" a wife in order not to break up tiny inherited properties.

Taking it from the other direction, males will tolerate less infidelity than females: Females "know" their children are theirs; males never know for sure. Genetically, it matters less if males "sow wild oats" or have many mates or are unfaithful. And, sure enough, most cultures are harder on women than men when it comes to adultery. In most cultures, in fact, it is the woman who moves into the husband's family (virilocality) – as if to keep track of her comings and goings.

From our culture's romantic view of love and marriage, it is interesting to note that in most cultures a failure to consummate a marriage is grounds for divorce or annulment. In our own culture, infertility and impotence are frequent causes of divorce. It seems reproduction is more important than we like to admit.

Of course, there is a limit to the extent to which we generalize from animals to humans (or from any species to any other), and this is especially true regarding sex. We are very sexy animals: Most animals restrict their sexual activity to narrowly defined periods of time, while we have sex all month and all year round. We can only guess how we got to be this way. Perhaps it has to do with the long-term helplessness of our infants. What better way to keep a family together than to make it so very reinforcing!

Children

That brings us to children, our attraction to them, and their attraction to us. Adults of many species, including ours, seem to find small representatives of their species, with short arms and legs, large heads, flat faces, and big, round eyes... "cute" somehow – "sweet," the sociobiologist might point out. It does make considerable evolutionary sense that, in animals with relatively helpless young, the adults should be attracted to their infants.

The infants, in turn, seem to be attracted to certain things as well. Goslings, as everyone knows, become attached to the first large moving object they come across in the first two days of life – usually mother goose (occasionally Konrad Lorenz or other ethologists). This is called imprinting. Human infants respond to pairs of eyes, female voices, and touch.

The goslings respond to their sign-stimulus with the following response, literally following that large moving object. Human infants, of course, are incapable of following, so they resort to subterfuge: the broad, full bodied, toothless smile which parents find overwhelmingly attractive.

Sociobiologists go on to predict that mothers will care for their children more than fathers (they have more invested in them, and are more certain of their maternity); that older mothers will care more than younger mothers (they have fewer chances of further procreation); that we will be more solicitous of our children when we have few (or only one!) than when we have many; that we will increase our concern for our children as they get older (they have demonstrated their survival potential); and that we will tend to push our children into marriage and children of their own.

Helping

Care – helping behavior – is likely when it involves our children, parents, spouses, or other close relations. It is less and less likely when it involves cousins or unrelated neighbors. It is so unusual when it involves strangers or distant people of other cultures and races that we recall one story – the good Samaritan – nearly 2000 years after the fact.

Sociobiologists predict that helping decreases with kinship distance. In fact, it should occur only when the sacrifice you make is outweighed by the advantage that sacrifice provides the genes you share with those relations. The geneticist J. B. S. Haldane supposedly once put it this way: "I'd gladly give my life for three of my brothers, five of my nephews, nine of my cousins...." This is called kin selection. Altruism based on genetic selfishness!
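
Haldane's arithmetic was later formalized by W. D. Hamilton (this formalization is not in the text above, but it makes the logic explicit). An inclination to help spreads when

    r × B > C

where r is the genetic relatedness between helper and helped (1/2 for a full sibling, 1/4 for a nephew or niece, 1/8 for a first cousin), B is the reproductive benefit to the recipient, and C the reproductive cost to the helper. Giving your life pays off genetically only for more than two siblings, more than four nephews, or more than eight cousins – which is exactly Haldane's count of three, five, and nine.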

One kind of "altruistic" behavior is herd behavior. Some animals just seem to want to be close, and in dangerous times closer still. It makes sense: By collecting in a herd, you are less likely to be attacked by a predator. Mind you, sometimes you may find yourself on the outside of the herd – but the odds are good that the next time you'll be snugly inside.

Another kind is called reciprocal altruism. A prairie dog who sees a predator will begin to yelp loudly, for example. This warns the rest of his community, although it draws the predator's attention to the one doing the yelping!

Herd behavior and reciprocal altruism work for the same reason that kin selection works: They cater to inclusive fitness. A slight reduction of my own survival probabilities is more than balanced by the survival of relatively close relations. Some animals even help any member of their own species, with the instinctual "understanding" that they may be the beneficiaries the next time they need help themselves.

Robert Trivers has suggested that people engage in a more sophisticated form of reciprocal altruism, shared only with a few of the more advanced creatures of the world. Here you would be willing to sacrifice for someone else if it is understood that that specific other will do the same for you, or reciprocate in some other way, "tit for tat." Clearly, this requires the ability to recognize individuals and to recall debts!
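
Trivers's "tit for tat" is usually illustrated with the iterated Prisoner's Dilemma. Here is a minimal Python sketch under that standard framing – the payoff numbers are the conventional textbook values, and the strategy names are mine, purely for illustration:

    # C = cooperate (help your partner), D = defect (cheat).
    PAYOFF = {  # (my move, their move) -> my payoff
        ("C", "C"): 3, ("C", "D"): 0,
        ("D", "C"): 5, ("D", "D"): 1,
    }

    def tit_for_tat(partner_history):
        # Cooperate first; afterwards, simply repeat the partner's last move.
        return "C" if not partner_history else partner_history[-1]

    def always_cheat(partner_history):
        return "D"

    def play(strategy_a, strategy_b, rounds=10):
        hist_a, hist_b, score_a, score_b = [], [], 0, 0
        for _ in range(rounds):
            move_a = strategy_a(hist_b)   # each sees only the partner's past moves
            move_b = strategy_b(hist_a)
            score_a += PAYOFF[(move_a, move_b)]
            score_b += PAYOFF[(move_b, move_a)]
            hist_a.append(move_a)
            hist_b.append(move_b)
        return score_a, score_b

    print(play(tit_for_tat, tit_for_tat))   # (30, 30): mutual reciprocity pays
    print(play(tit_for_tat, always_cheat))  # (9, 14): the cheat gains only once

The point of the sketch is Trivers's requirement made concrete: the strategy needs a memory of individual partners – exactly the "ability to recognize individuals and to recall debts."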

Other geneticists have pointed out that, if there is a genetic basis for reciprocal altruism, there will also be some individuals that cheat by allowing others to do for them without ever meeting their own obligations. In fact, depending on the advantages that reciprocal altruism provides and the tendency of altruists to get back at cheaters, cheaters will be found in any population. Other studies have shown that "sociopathy," the guiltless ignoring of social norms, is found in a sizable portion of the human population.

There is, of course, no need for a human being to be 100% altruist or 100% cheat. Most of us (or is it all of us?), although we get angry at cheats, are quite capable of cheating when the occasion arises. We feel guilt, of course, but we can cheat. A large portion of the human psyche seems to be devoted to calculating our chances of success or failure at such shady maneuvers. More on this later.

Aggression

Like many concepts in social psychology, aggression has many definitions, even many evaluations. Some think of aggression as a great virtue (e.g. "the aggressive businessperson"), while others see aggression as symptomatic of mental illness.

The fact that we keep the same word anyway suggests that there is a commonality: Both positive and negative aggression serve to enhance the self. The positive version, which we could call assertiveness, is acting in a way that enhances the self, without the implication that we are hurting someone else. The negative version, which we might call violence, focuses more on the "disenhancement" of others as a means to the same end.

Although the life of animals often seems rather bloody, we must take care not to confuse predation – the hunting and killing of other animals for food – with aggression. Predation in carnivorous species has more in common with grazing in vegetarian species than with aggression between members of the same species. Take a good look at your neighborhood cat hunting a mouse: He is cool, composed, not hot and crazed. In human terms, there is not the usual emotional correlate of aggression: anger. He is simply taking care of business.

That taken care of, there remains remarkably little aggression in the animal world. But it does remain. We find it most often in circumstances of competition over a resource. This resource must be important for "fitness," that is, relevant to one's individual or reproductive success. Further, it must be restricted in abundance: Animals do not, for example, compete for air, but may for water, food, nesting areas, and mates.

It is the last item – mates – that accounts for most aggression in mammals. And it is males that are most noted for this aggression. As we mentioned earlier, females have so much at stake in any act of copulation – so many months gestation, the increased energy requirement, susceptibility to attack, the dangers of birth, the responsibility of lactation – that it serves their fitness to be "picky" when looking for a partner. If females are picky, males must be show-offs: The male must demonstrate that he has the qualities that serve the female's fitness, in order to serve his own fitness. Deer are a good example. Mind you, this need not be conscious or learned; in all likelihood, it is all quite instinctual in most mammals. It may possibly have some instinctual bases in us as well.

Some of this male aggressiveness may in fact be mediated by testosterone, the "male" hormone. Inject testosterone into female mice and their threshold for aggressive behavior goes down. Remove testosterone from male mice (by castrating the poor things) and their thresholds go up. But I must add that testosterone does not cause aggression; it just lowers the threshold for it.

But females in many species can be quite aggressive (such as female guinea pigs), and females in any species can be extremely aggressive in certain circumstances (such as when facing a threat to her infants). In human societies, the sociological statistics are clear: Most violent crime is committed by men. But we have already noticed that, as women assert their rights to full participation in the social and economic world, those statistics are changing. Time will tell the degree to which testosterone is responsible for aggression in people.

Nevertheless, males engage in a great deal of head-butting. But one can't help but notice that these contests "over" females seldom end in death or even serious injury in most species. That is because these contests are just that: contests. They are a matter of displays of virtues, and they usually include actions that serve as sign stimuli to the opponent that the contest has ended in his favor: surrender signals. Continued aggression is of little advantage to either the loser or the winner. Even male rattlesnakes don't bite each other!

Territoriality and dominance hierarchies – once thought to be major focuses of aggressive behavior – seem to be relatively less significant. Animals tend to respect territorial and status claims more than dispute them. It is only when circumstances, whether natural or humanly created, are out of the ordinary that we see much aggression. And low food supplies likely have little to do with aggression. Southwick, studying Rhesus monkeys in the London Zoo, found that reducing the food supplies by 25% had no effect on the amount of aggression found, and reducing the food supplies by 50% actually decreased aggression! We find the same thing among primitive people.

Aggression in Human Beings

So why so much aggression in people? One possibility is our lack of biological restraints. Sociobiologists predict that animals that are poorly equipped for aggression are unlikely to have developed surrender signals. Man, they say, is one of these creatures. But we developed technology, including a technology of destruction, and this technology "evolved" much too quickly for our biological evolution to provide us with compensating restraints on aggression. Experience tells us that guns are more dangerous than knives, though both are efficient killing machines, because a gun is faster and provides us with less time to consider our act rationally – the only restraint left to us.

Another problem is that we humans live not just in the "real" world, but in a symbolic world as well. A lion gets aggressive about something here-and-now. People get aggressive about things that happened long ago, things that they think will happen some day in the future, or things that they've been told are happening. Likewise, a lion gets angry only about concrete, physical things. Calling him a name won't bother him a bit.

A lion gets angry about something that happens to him personally. We get angry about things that happen to our cars, our houses, our communities, our nations, our religious establishments, and so on. We have extended our "egos" way beyond our selves and our loved ones to all sorts of symbolic things. The response to flag burning is only the latest example.


If aggression has an instinctual basis in human beings, we would expect there to be a sign stimulus. It would certainly not be something as simple as bright red males during mating season, as in stickleback fish. If we go back to the idea of competition as a fertile ground for aggression, we notice that frustration is a likely candidate. There are two of you who want the same thing; if one grabs it, the other doesn't get it and is unhappy; so he takes it, and now the other is unhappy; and so on. Goal-directed behavior has been blocked, and that is frustration.

Variations on that theme abound: We can be frustrated when an on-going behavior is interrupted (try tripping someone); we can be frustrated by a delay of goal achievement (cut in front of someone on line at the supermarket); or we can be frustrated by the disruption of ordinary behavior patterns (cause me to forego my morning coffee). We are flexible creatures.

But we must beware here: Other things can lead to aggression besides frustration (or aren't highly paid boxers engaged in aggression?) and frustration can lead to other things besides aggression (or doesn't social impotence lead to depression?). Further, as Fromm points out, frustration (and aggression) is in the eyes of the beholder. He feels that the frustration must be experienced as unjust or as a sign of rejection for it to lead to aggression.

Sociobiology "versus" Culture

Many psychologists, sociologists, anthropologists, and others are wary of the explanations – convincing though they sometimes are – of the sociobiologists: For every sociobiological explanation, we can find a cultural explanation as well. After all, culture operates by the same principles as evolution.

There are many different ways to do any one task, but in the context of a certain physical environment and a certain culture, some ways of doing things work better than others. These are more likely to be "passed on" from one generation to the next, this time by learning.

Now, cultures need to accomplish certain things if they are to survive at all. They must assure effective use of natural resources, for example, which might involve the learning of all sorts of territorial and aggressive behaviors, just like in sociobiological explanations. And they must assure a degree of cooperation, which might involve learning altruistic behaviors, rules for sharing resources and for other social relationships, just like the ones in sociobiological explanations. And they must assure a continuation of the population, which might involve certain courtship and marital arrangements, nurturant behaviors, and so on, just like in sociobiological explanations.

If a society is to survive – and any existing society has at least survived until now – it must take care of the very same issues that genetics must take care of. And, because learning is considerably more flexible than evolutionary adaptation, we would expect culture to tend to replace genetics. That is, after all, only evolutionary good sense!

So do we have instincts? If instincts are defined as automatic reflex-like connections – no, probably not. But define instincts as "strong innate tendencies toward certain behaviors in certain situations" – yes, we probably do. The important point is that we (unlike animals) can always say no to our instinctual behaviors, just like we can say no to our learned ones!

If you are interested in learning more about sociobiology and its impact on psychology, go to the Center for Evolutionary Psychology. See especially their Primer for a more sophisticated overview of the topic!


The Romantic Philosophers


Romanticism

Empiricism would continue on to the present day. It would become increasingly materialistic in French philosophy, culminating in the reductionism of Auguste Comte (1798-1857), wherein all human experience is reduced to biology, chemistry, and ultimately physics. Rationalism, too, continues to the present day, reaching its peak in Georg Hegel's (1770-1831) idealism of the Absolute. Hegel held that all human activity is nothing more than the working of the universe as it slowly and inevitably progresses towards ultimate Godhood.

In both empiricism and rationalism (and materialism and idealism), the human, especially the individual human person, gets lost – either in the eternal bumping of atoms or in the grand scheme of God-making. Our thoughts and feelings are nothing of any importance either way! We are just carbon molecules or the twitchings of eternity.

Some philosophers were taken aback by this tendency, both before and after Comte and Hegel. They felt that, for human beings, it was our own day-to-day living that was the center of our search for the truth. Reason and the evidence of our senses were important, no doubt, but they mean nothing to us unless they touch our needs, our feelings, our emotions. Only then do they acquire meaning. This "meaning" is what the Romantic movement is all about.

I will focus on several philosophers that I believe most influenced psychology. First is Jean-Jacques Rousseau, who is often considered the father of Romanticism. And the last is Friedrich Nietzsche, who is sometimes considered the greatest Romantic. Afterwards, we will look at the commonalities among these philosophers that let us talk of a Romantic Movement.

Jean-Jacques Rousseau (1712-1778)

No history of psychology is complete without a look at Jean-Jacques Rousseau. He has influenced education to the present day, philosophy (Kant, Schopenhauer...), political theory (the French Revolution, Karl Marx...), and he inspired the Romantic Movement in Philosophy, which in turn influenced all these things, and psychology, once again.

Plus, he’s one of the most colorful characters we have and, as an added bonus, he has left a particularly revealing autobiography in The Confessions.

He was born in Geneva, Switzerland, in 1712 to the watchmaker Isaac Rousseau and his wife Suzanne Bernard Rousseau. Although a Calvinist, Isaac was also a bit unstable: he left his wife and first son, returned long enough to father Jean-Jacques, then left again. His mother died one week after Jean-Jacques was born, and he was raised by an aunt and uncle.

They sent him off to boarding school in the country where, he says, he learned "all the insignificant trash that has obtained the name of education." The experience did, however, serve as the start of his love affair with rural life.

At twelve years old, he returned to his aunt and uncle. There, apprenticed to an engraver, he developed two other personal qualities: The constant beatings from his master (as well as at school) led him to lying and idleness; and adolescence led him to develop a rather bizarre romantic streak. He would spend much of his life falling in love.


At sixteen, he ran away from home with no money or possessions. A priest led him to the baroness Mme de Warens, a 29-year-old beauty who apparently had a soft spot for losers and potential converts. Her influence led him to convert to Catholicism, though he was not yet ready to give up his exhibitionism or his desire to be spanked by lovely ladies. He entered a seminary in 1729, but was promptly dismissed. He eventually developed an on-again, off-again physical relationship with the lovely Mme de Warens.

In the meantime, he walked all over the countryside, often covering long distances. He loved the woods, mountains, and nature itself. He served as an occasional tutor and music teacher, but spent much of his time reading Enlightenment authors. Voltaire's work turned him to a Nature worship quite congenial to his personality.

In 1742, when he was 30 years old, he left for Paris. He quickly befriended the political writer Diderot, who managed to help him get a job as a secretary at the French Embassy in Venice. He was dismissed because of his insolent nature.

In 1746 he met and fell in love with Therese Levasseur, a simple-minded laundress and seamstress. Together they had four children, all of whom were sent to orphanages. Keep in mind that that was a common response to poverty in those days (i.e. from the fall of Rome to World War II!). He did feel considerable remorse about it later, but admitted that he would have made a really lousy father! No one doubts him on that.

He worked as a secretary to various aristocrats and spent quite some time composing music. He even rewrote an operetta by Voltaire and wrote to him. A literary contest with a monetary prize caught his attention and, in 1750, he won with Discours sur les arts et les sciences – a powerful attack on civilization.

This was the first time we see his ideas about the natural goodness of man. And although we think of him as an Enlightenment thinker, this thesis was actually anti-Enlightenment, anti-philosophy, anti-reason, anti-Voltaire, and even anti-printing press! The good life, he was saying, is the simple life of the peasants. This conception of "back to nature" involved, of course, a romanticized notion of nature, and stands in stark contrast to the nature of jungles and deserts!

1752 was another active year. He wrote his comedy Narcisse. His operetta Le devin du village was successfully presented to the King. Unfortunately, his illness – he suffered from a variety of painful and humiliating bladder problems – kept him from meeting the King, and he forfeited a pension.

In 1753, another competition was announced. Rousseau's entry, Discours sur l'origine et les fondements de l'inegalite parmi les hommes, won and was published two years later.

In this piece, he accepted biological inequalities, but argued that there was no natural basis for any other inequalities – economic, political, social, or moral! These, he said, were basically due to the existence of private property and the need to defend it with force. Man is good, he argued, but society, which is little more than the reification of greed, corrupts us all.

He admits that it is no longer possible for us to leave civilized society now. It has, in fact, become a part of our nature! The best we can do is to lead simpler lives with fewer luxuries with the simple morality of the gospels to guide us.

In his article on economics for the Encyclopedia, he suggests that it would help if we had a graduated income tax, a tax on luxuries (and none on necessities), and free national public education.

In 1756, he moved with Therese and her elderly mother into "the Hermitage," a cottage lent to him by Mme d’Épinay. There he wrote a novel (or "romance") called Julie, ou la nouvelle Heloise, referring to the Heloise of Heloise and Abelard fame. It became perhaps the most famous novel of the 1700s.

On the other side, he alienated his friends with unpleasant letters and his rudeness towards his benefactress, Mme d'Épinay. Even his oldest friend, Diderot, called him mad. In a huff, he left the Hermitage.

In 1762, Rousseau published both Émile and The Social Contract. The first line of The Social Contract is the most famous: "Man is born free, and he is everywhere in chains." The purpose of the rest of the book was to describe a society that would instead preserve that freedom.


"The social contract" is an admittedly mythological contract among individuals to surrender some of their freedoms to ensure a community which respects the individual and, thereby, preserves as much freedom as possible. This idea, combined with Locke’s thoughts on government, were to inspire the founding fathers of the new United States.

It should be noted, though, that at the end of the book, Rousseau does prescribe death as the punishment for anyone who, by their actions, shows that they do not hold the common values of the community! The French Revolution would show more clearly than the American what a double-edged sword a philosophy such as Rousseau’s can be!

Émile was far more sedate. It is a treatise on child-rearing, from the man who sent his four children to orphanages! Turns out, though, he had some pretty good advice.

He condemned all forms of education that use force. Instead, he promoted education that nurtured the natural unfolding of a child’s potentials. This in a time when it was thought that if you didn’t beat children regularly with a good sized stick, they would grow up spoiled! And Nature, he said, is to be the child’s primary teacher, with freedom to explore the major teaching method.

Basically, he says, the child learns by gradual adaptation to necessities, and by imitation of those around him. Education should be primarily moral until the child is twelve, when intellectual education begins. Religious education should be held off until the child is 18. This way, the child can develop reasonable religious beliefs, rather than unthinking acceptance of mythology and miracles.

The book is beautifully written, but many would say almost naively idealistic. It would be a great influence in Europe and later in the United States. Maria Montessori in Italy, for example, based many of her ideas on Rousseau, as did John Dewey in the US. What we now call progressive education and learning by doing come basically from Émile!

The great philosophers of his time laughed at him – but the clergy was outraged! Rousseau’s friends warned him and encouraged him to flee. In 1762, the French parlement ordered all copies of Emile confiscated and burned. Rousseau fled to Switzerland, only to have both his books burned in Calvinist Geneva.

He begged Frederick the Great for asylum in Neuchâtel. There he lived, more eccentric than ever. And yet he was the idol of women everywhere, and his publishers begged him for more. He gave them more, primarily in the form of essays or letters to his critics.

But the local ministers in Neuchâtel were also upset about his writings, and a local sermon led to an attack on Rousseau’s house. He and Therese moved again, to a lone cottage on a tiny island in a lake in Switzerland. But he was again ordered to leave, which he did, first to Strasbourg, then to England at the invitation of David Hume in 1766.

At first in London he was the talk of the town, and everyone wanted to meet him. But he tired of this quickly and asked Hume to find him a place in the country. There, Rousseau, Therese, and their dog Sultan put quite a strain on their hosts’ hospitality.

Rousseau began to read critical articles in the British press. Already rather paranoid, he responded to them as if there were a conspiracy against him, and even accused Hume of being a part of it. He and Therese "escaped" from England back to France.

Although technically still in danger of arrest in France, he nevertheless enjoyed the reception his fans gave him. But fearing for his life, he fled into the countryside to wander anonymously. In 1768, he finally married his Therese.

She begged him to go back to Paris, so they did (under pseudonyms). There he copied music for a living, and also finally finished, in 1770, his autobiography.

He continued to write, some of his most beautiful work as well as some of his most paranoid, until 1778. He had moved into a cottage offered by the marquis de Girardin, where he happily studied the local flora, when he suffered a stroke. Therese tried to move him onto his bed, but he fell again and cut his head. By the time the Marquis got to him, he was dead.

He was buried on the estate, and his grave became a pilgrimage site. He was later moved to the Pantheon in Paris, and laid to rest not far from, of all people, Voltaire.

Johann Wolfgang von Goethe (1749-1832)

We do not have to visit a madhouse to find disordered minds; our planet is the mental institution of the universe.

Goethe

Goethe was born in 1749 in Frankfurt-am-Main in Germany, the oldest of six children – although only he and a sister survived into adulthood. His father, Johann Kaspar Goethe, was a well-to-do lawyer and amateur scholar, but a failure in politics and with an unpleasant disposition. His mother, Katharina Elisabeth Textor, was considerably more pleasant, and was the daughter of the Bürgermeister (mayor) of Frankfurt.

Young Goethe was a handsome and talented youth, learned languages easily, and was interested in music and art. He entered the University of Leipzig to study law, but a disappointment in love led him to sickness and depression, and he left school. In 1771, however, he received his law degree from the University of Strasbourg.

His early reading of Bayle's Dictionary led him to renounce his Christianity as a teenager and become an atheist. He later mellowed a bit, and adopted a pantheism modeled after Spinoza's.

In 1774, he wrote Die Leiden des jungen Werthers (The Sorrows of Young Werther), a tragic love story that, though panned by the critics, was wildly successful, especially among young romantic intellectuals. The book concludes with a suicide which was, sadly, imitated by a number of lovesick readers. Like many of his works, the story emphasized the tensions between the nature of the individual and the restrictions of society.

The following year, he was invited to join the Duke of Saxe-Weimar at court. At first, he was just an "ornament" there, but later he performed various real political duties, including inspections of mines and the establishment of weather observatories.

In 1782, he was inducted into the nobility, which permitted him to add "von" to his name. Because of his fame and status in Weimar, he met and befriended a number of young poets, including Schiller and Herder.

From his teens on, Goethe was given to falling in love, yet apparently unable to commit himself to one woman or to the institution of marriage. His longest and most intense relationship began around 1775 with Charlotte von Stein (born von Schardt), a married woman who had had seven children (though only four survived). He would write long and romantic letters to her for most of his life.

He did eventually set up a household with a young working-class girl named Christiane Vulpius. She bore a child on Christmas day in 1789.

In 1801, Goethe became quite ill, and his recovery took many years. Toward the end of his illness, Napoleon defeated the Prussians at Jena and marched into Weimar. His troops attempted to take over Goethe's house, and Christiane physically protected him. He finally married her.

Goethe was a strong admirer of Napoleon, and visited him in 1808 at the emperor's invitation. Goethe also visited with Beethoven in 1812.

Goethe's greatest work is his two-part play Faust. Although he began writing it in 1773, it would not be finished until 1831. The first part, however, could stand alone, and it was completed in 1808. Its theme was human freedom and the power of passion, which Faust discovers after he wagers his soul in a devil's bargain with Mephistopheles.

[An interesting aside: Goethe's Faust creates an artificial man in his laboratory. This influenced a certain Mary Shelley, author of Frankenstein (perhaps the first science fiction novel). She even places her story in a 13th century castle she had seen which belonged to the old (and colorful) German family Frankenstein, a castle Goethe was also quite familiar with!]

In addition to his poetry, novels, and plays, Goethe spent considerable time on science. He studied medicine, anatomy, physics, chemistry, botany, and meteorology.

In 1792, he completed the two-part Beiträge zur Optik (Contributions to Optics), and in 1810 the three-part Zur Farbenlehre (On the Theory of Colors). He truly believed that it was these works that would be his greatest contributions. Instead, few scientists approved of them, and they were to make little serious impact on the field. His work would make an impression on various artists, though, including Turner, Klee, and Kandinsky. His approach was really more phenomenological than experimental, and his work reflected more on the subjective experiences of color and light than on their physics.

He also wrote a book called The Metamorphosis of Plants, which suggested that all plants are just variations on a primitive plant he called the Urpflanze. He coined the term morphology along the way, and showed the relationship of human beings to animals with his discovery of the human intermaxillary bone (just above your upper teeth), just where it is in lower animals.

His wife Christiane died in 1816. His lifelong love Charlotte died in 1827. The Duke died the following year. And his last remaining child died in 1830. Suffering from sickness and depression, Goethe himself finally died, March 22, 1832, one year after finishing the second half of his masterpiece Faust.

Arthur Schopenhauer (1788-1860)

Arthur Schopenhauer was born February 22, 1788 in Danzig, Prussia (now Gdansk in Poland). His father was a successful businessman, and his mother a novelist. Young Arthur was moved around Europe quite a bit, which allowed him to become fluent in several languages, and to develop a deep love of nature.

In 1805, his father died, and he tried a business career. He lived with his mother for a while in Weimar, and she introduced him to Goethe. He went on to study medicine at the University of Göttingen and philosophy at the University of Berlin, and ultimately received his doctorate from the University of Jena in 1813. Later, he worked with Goethe on Goethe's studies on color.

In 1819, he published his greatest work, Die Welt als Wille und Vorstellung (The World as Will and Idea).

To Schopenhauer, the phenomenal world is basically an illusion. The true reality, Kant's "thing-in-itself," he refers to as Will. Will, perhaps an odd term to us today, is more like the Tao in Chinese philosophy: It is out of the Will that everything derives. But it has more the qualities of a force, and pushes or drives what we perceive as the phenomenal world.

Will is, you could say, the inner nature of all things. So, if you want to understand something's – or someone's – inner nature, you need only look within yourself. So the Will also drives us, through our instincts. This concept would influence a young Sigmund Freud a generation later.

Schopenhauer, profoundly influenced by his reading of Buddhist literature, saw life as essentially painful. We are forced by our natures, our instincts, to live, to breed, to suffer, and to die. Schopenhauer is often described as "the great pessimist!"

For the world is Hell, and men are on the one hand the tormented souls and on the other the devils in it....

If you imagine... the sum total of distress, pain, and suffering of every kind which the sun shines upon in its course, you will have to admit it would have been much better if the sun had been able to call up the phenomenon of life as little on the earth as on the moon....

To our amazement we suddenly exist, after having for countless millennia not existed; in a short while we will again not exist, also for countless millennia. That cannot be right, says the heart.

The question, of course, is how does one get past this suffering? One way he recommends is aesthetic salvation – seeing the beauty in something, or someone. When we do this, we are actually looking at the universal or essence behind the scene, which moves us in turn towards the universal subject within ourselves. This quiets the will that forces us into the phenomenal world. Schopenhauer believed that music was the purest art – only one step removed from the Will.

A second way to transcend suffering is through ethical salvation – compassion. Here, too, it is the recognition of self-in-others and others-in-self that leads to a quieting of the will.

But these are only partial answers. The full answer requires religious salvation – asceticism, the direct stilling of all desires by a life of self-denial and meditation. Without the will, only nothingness remains, which is Nirvana.

Schopenhauer lived much of his life as a bitter and reclusive man, unable to deal with his lack of success. He began publishing his works again in 1836, and intellectuals all over Europe began to develop an interest in him.

Sadly, Schopenhauer developed heart problems, and on September 21, 1860, he died. After his death, he would powerfully influence such notables as the composer Richard Wagner, the philosopher Friedrich Nietzsche, and writers like Thomas Mann.

Søren Aabye Kierkegaard (1813-1855)

There are, as is known, insects that die in the moment of fertilization. So it is with all joy: life's highest, most splendid moment of enjoyment is accompanied by death.

Kierkegaard

Søren Kierkegaard was born in Copenhagen on May 5, 1813, the youngest of seven children. His father, Michael Pedersen Kierkegaard, was in the hosiery business. He was a powerful man who held to a particularly gloomy Christianity, obsessed with guilt over having once cursed God. His mother was Ane Sørensdatter Lund, a servant of the Kierkegaards.

Two of Søren's brothers and two of his sisters died. By 1834, his mother had died as well, and Kierkegaard became nearly as depressed as his father. He lost his faith and turned to a hedonistic life-style, but had a religious experience in 1838. He received his theology degree in 1840, and proposed to Regine Olsen, daughter of a prominent Copenhagen government official.

No one knows precisely why, but in late 1841, he broke off the engagement, which led to considerable negative social press. It seems to have been the pivotal crisis in his life, and he abruptly left for Berlin to study.

When he returned, he finished a manuscript he had been working on, and in 1843 published Either/Or. It takes the form of an argument about how to live life between an "aesthetic" man and an "ethical" man – very probably reflecting two aspects of Kierkegaard's own soul.

The aesthetic man is basically a hedonist and an atheist. Although he is portrayed as a refined gentleman, his sections of the book are rambling, suggesting that his life is likewise without focus. The ethical man is a judge, and his arguments are far more orderly and eloquent: He spends considerable time analyzing the ancient Roman emperor Nero and his mental states.

Also in 1843, he published his famous book Fear and Trembling, which retells the story of Abraham and his near-sacrifice of his son. This time, Kierkegaard compares the ethical response – it is clearly wrong to kill one's own son – with a religious response, which is reflected in Abraham's faith in his God.

In his various books, Kierkegaard develops his three "stages" or competing life philosophies: The aesthetic person, who lives in the moment and lacks commitment; the ethical person, who is in fact committed to his ideals; and the religious person, who recognizes the transcendent nature of true ideals. Notice the similarity to Schopenhauer, although for Schopenhauer "aesthetic" refers to a love of art and music, not hedonism.

Throughout his work, he was concerned with passions. He defined anxiety, for example, as "the dizziness of freedom." Despair is what the hedonist feels when he finally recognizes the emptiness of his life. Guilt is what the ethical man feels when he inevitably discovers his inability to forgive himself. These definitions would profoundly influence a number of later philosophers and writers.

In 1849, he published Sickness unto Death, which was his strongest call to the conventional Christians of Copenhagen to take what Kierkegaard called "a leap of faith" into a more personal kind of religion. But his community was not quite ready for this passionate brand of Christianity, and he was severely criticized by the religious powers of Denmark.

Kierkegaard is often considered the first existentialist, mostly because of the way he used the word existence. He said that God doesn't exist because he is eternal. Only people exist, because they are always an unfinished product. And the nature of existence is, first, that it is the domain of the individual, and second, that the individual must take responsibility for his or her own creation.

But Kierkegaard noted that his was not a "system" of philosophy. Human existence is an ongoing process of creation, and cannot be encompassed by any "system." This has been a central theme in existentialism ever since.

Kierkegaard died on November 11, 1855, of spinal paralysis. He would not take communion, and he asked that no clergy participate in his funeral. His epitaph reads "The Individual."

Friedrich Wilhelm Nietzsche (1844-1900)

I fear animals regard man as a creature of their own kind which has in a highly dangerous fashion lost its healthy animal reason-as the mad animal, as the laughing animal, as the weeping animal, as the unhappy animal. – Nietzsche

Second only to Rousseau in the impact he had on Psychology is Friedrich Wilhelm Nietzsche. He was born in Röcken, in Prussian Saxony, on October 15, 1844, named after Friedrich Wilhelm IV, King of Prussia, who had the same birthday. Nietzsche's father was a minister – one of many in the family – who had tutored several members of the royal family. His mother was a puritanical housewife.

When Friedrich was 18, he lost his faith – which would remain a central issue for the rest of his life. And he said his life was changed as well by his reading of Schopenhauer a few years later while a student at the University of Leipzig.

When he was 23, he was drafted into the Prussian army – but he fell off a horse, hurt his chest, and was released.

He received an appointment as professor of philology (classical languages and literature) at the University of Basel at the tender age of 24, a year before he received his Ph.D. Near Basel lived the famous Richard Wagner, and Nietzsche was invited to Christmas dinner in 1869. Wagner’s grandiose and romantic operas were to influence Nietzsche’s view of life for some time to come.

He served a brief stint as a volunteer medical orderly during the Franco-Prussian War, during which he contracted diphtheria and dysentery, which damaged his health permanently.

After returning to Basel, he published his first book in 1872 – inspired by Wagner – called The Birth of Tragedy out of the Spirit of Music. It was in this book that he introduced the contrast of the Dionysian and Apollonian. Dionysus was the god of wine and revelry, living for the moment. Apollo was the god of peace, order, and art. The one lacks discipline, but the other lacks, as we would say today, soul.

In 1879, because of his seriously deteriorating health, he was forced to retire from teaching. He published Human, All Too Human – an analysis of emotion – in parts from 1878 through 1880. During this time also, he fell in love, although briefly, with the famous Lou Salomé (later a confidante of Sigmund Freud's!).

Heartbroken, and perhaps recognizing that he was destined for bachelorhood, he retired high into the Alps to write his master work, Thus Spake Zarathustra, published from 1883 through 1885. Here, he made a heroic effort at addressing the pessimism of Schopenhauer. Nietzsche felt that religion had failed miserably to provide man with meaning. So now that God was "dead," we needed to stop looking to the skies and start providing that missing meaning ourselves. The people he saw as having accomplished this transition he called "Übermenschen," usually translated as supermen. But, he notes, supermen have not arrived as yet, and we must be satisfied to serve as a bridge to that future.

The book is a masterpiece by any standard, yet Nietzsche remained an unknown. His health continuing to deteriorate, he was cared for by his sister, Lisbeth Förster-Nietzsche. She, however, married an anti-Semite whom Nietzsche abhorred and moved to a commune in Paraguay!

Nietzsche then lived in various rooming houses all over Italy and Switzerland. His eyesight went from bad to worse, and his headaches overwhelmed him. He stopped writing books and instead wrote aphorisms (short comments), which he then collected into books.

Beyond Good and Evil (the best introduction to his ideas) came out in 1886, and The Genealogy of Morals in 1887. In these books, he makes clear his great distinction between Herren-Moral and Herden-Moral, that is, the morality of lords and the morality of the herd.

The morality of the herd is what he calls traditional Judeo-Christian morality: It is, he says, an ethic of helplessness and fear. With this morality, we keep the powerful and talented under control by appealing to virtues such as altruism and egalitarianism. Secretly, it is, like all motives, a "will to power" – but a sly, manipulative one. We cry "I am weaker than you, but I am still better than you!"

The morality of lords, on the other hand, is based on the manly virtues of courage, honor, power, and the love of danger. It is pagan, western, Teutonic. The only rule, he said, is: do not betray a friend.

Although he was not anti-Semitic, his choice of words would lead the Nazis, many years after his death, to use some of them in ways he never intended. Ask yourself if the masses of people shouting "Heil Hitler!" and the acts of rounding up minority civilians for work camps and slaughter in any way make you think of courage and honor!

The contrast between these two moralities is in fact a very productive one:

Herden-Moral: bourgeoisie, democracy, welfare, socialism, egalitarianism, human rights, sympathy, comfort, decadence

Herren-Moral: aristocracy, laissez-faire, merit, freedom, honesty, purpose

Nietzsche became increasingly ill and bitter, blind and paranoid. In Turin in January of 1889, he attempted to protect a horse that was being whipped when he suffered an apoplectic stroke (just like Rousseau), which sent him to an asylum. Some believe his collapse was the result of syphilis, but it could just as well have been due to years of medication. His mother claimed him and took care of him until she died in 1897, when his sister, now back in Germany, took him in.

He was seldom lucid after that. He died August 25, 1900 at the age of 55, of stroke and pneumonia.

A number of his works were published after his collapse, including The Will to Power in 1901, which is a collection of aphorisms found in his notebooks, and his autobiography Ecce Homo in 1908. Ecce Homo illustrates both his brilliance and his insanity very dramatically. Freud called him the most brilliant psychologist who ever lived.

Romanticism in General

Beneath all the variety represented by the Romantics lies a common theme: Passion. While the empiricists were concerned with sensory data, and the rationalists were concerned with reason, the romantics looked at consciousness and saw first and foremost its dynamics, purposefulness, striving, desire... passion!

Goethe has Faust say, "Gefühl ist alles." Feeling is everything!

In fact, they saw passion in all life, as a basic category... life as a Darwinian struggle, not just to survive, but to overcome. As such, it could be called instinct; but in humanity, it goes further, and involves an overcoming of nature itself.

"The only reality is this: The will of every center of power to become stronger – not self-preservation, but the desire to appropriate, to become master, to become more, to become stronger," said Nietzsche.

Along with their love of passion came an impatience with, even disgust at, the mediocre, the weak, the irresponsible, the unpassionate.

The romantics' view of the world is a reflection of their view of humanity: The world is rich, full of qualities – color, sound, flavor, feeling – thick, you might say, and not the thin, gray, empty thing pictured by modern science. They tended to dismiss metaphysical speculation as an intellectual game. And for Schopenhauer, passion became the basic form of all reality: a universe pressing to be realized.

A passionate metaphysics requires a passionate epistemology (as opposed to an intellectual or empirical one). First, there is a preference for intuition or insight: As Pascal put it, "the heart has reasons that reason knows nothing of." A holistic understanding is more satisfying than logical, analytical, or experimental explanations. The world is too big for those and has to be embraced rather than picked apart.

And the importance of the subjective is emphasized. All experience is subjective as well as objective. This is a sort of "uncertainty principle" that applies to all sciences, and philosophy, and certainly psychology. Objectivity is simply a meaningless goal. So subjectivity is not something to eliminate, but to understand.

Hence we must go back to life as it is lived, the Lebenswelt. We must study whole, meaningful experiences. We might want to go back to ordinary people, perhaps children or primitives, to understand the lived world before it is tainted by our perpetual intellectualization. These tendencies would eventually lead to phenomenology and related methodologies.

Last (and far from least), we must have a passionate morality. The romantics tend to admire the heroic, taking a stand against nature, against the mediocre, against nothingness or meaninglessness. To some extent, the heroic is closely tied to futility: It is often Quixotic, or picaresque. There is an affection for the foolish or unconventional.

Romantic morality is more stoic than epicurean. Meaning, as expressed by virtue, purpose, and courage, is the highest value, not pleasure or happiness as we usually conceive of them.

Some romantics are suspicious of Asian philosophy to the extent that it represents surrender. Nietzsche, among them, considers even the Judeo-Christian tradition "Asian" and weak. Their suspicion is not entirely well-founded: In traditions such as Taoism and Zen Buddhism, for example, "surrender" is valued precisely for the strength it imparts, as demonstrated physically in judo ("gentle way"). Schopenhauer understood this, and his work is clearly colored by Buddhism in particular.

A passionate morality requires freedom, which Goethe considered the greatest happiness, and which was quickly disappearing from empiricist, rationalist, and even religious philosophy. I have to be free to take that courageous stand; to be determined is to be nothing at all.

A little Buddhism sneaks in when Nietzsche speaks of amor fati, love of fate: When choices are taken from you, you can still conquer the moment with your attitude.

Nietzsche said "God is dead!" Now, anything goes. You don't have to do anything. Be nice? Why? Be selfish? Why? As Sartre put it, we are "condemned" to freedom. Even when we choose to allow ourselves to be determined, it is our choice. Even Kierkegaard asks us to take a leap of faith that has no justification. So, we have nothing to lean on, no crutch, no "opiate," no excuses.

Freedom means responsibility. We create ourselves, or better, we overcome ourselves, or at least we should. Others just play out their "programs." Freedom requires that we be truly aware, fully conscious. It requires that we be fully feeling, that we not deny but experience our passion. It requires that we be active, involved.

Freedom means creativity, and the romantic prefers the artist over the scientist. These ideas are the foundation for the concept of self-actualization.

The heirs of the romantics are the phenomenologists, existentialists, and humanists of today.

The Quotable Friedrich Nietzsche

On Madness

Madness is something rare in individuals – but in groups, parties, peoples, ages it is the rule. Beyond Good and Evil.

I fear animals regard man as a creature of their own kind which has in a highly dangerous fashion lost its healthy animal reason – as the mad animal, as the laughing animal, as the weeping animal, as the unhappy animal. The Gay Science.

On Religion

After coming into contact with a religious man I always feel I must wash my hands. Ecce Homo.

Two great European narcotics, alcohol and Christianity. Twilight of the Idols.

Even today many educated people think that the victory of Christianity over Greek philosophy is a proof of the superior truth of the former – although in this case it was only the coarser and more violent that conquered the more spiritual and delicate. So far as superior truth is concerned, it is enough to observe that the awakening sciences have allied themselves point by point with the philosophy of Epicurus, but point by point rejected Christianity. Human, all too Human.

The spiritualization of sensuality is called love: it is a great triumph over Christianity. Twilight of the Idols.

On the Self

Active, successful natures act, not according to the dictum "know thyself," but as if there hovered before them the commandment: will a self and thou shalt become a self. Assorted Opinions and Maxims.

He who cannot obey himself will be commanded. That is the nature of living creatures. Thus Spoke Zarathustra.

I assess the power of a will by how much resistance, pain, torture it endures and knows how to turn to its advantage. The Will to Power.

To exercise power costs effort and demands courage. That is why so many fail to assert rights to which they are perfectly entitled – because a right is a kind of power but they are too lazy or too cowardly to exercise it. The virtues which cloak these faults are called patience and forbearance. The Wanderer and His Shadow.

On Death

To die proudly when it is no longer possible to live proudly. Death of one's own free choice, death at the proper time, with a clear head and with joyfulness, consummated in the midst of children and witnesses: so that an actual leave-taking is possible while he who is leaving is still there. Twilight of the Idols.

On Punishment

Distrust everyone in whom the impulse to punish is powerful! Thus Spoke Zarathustra.

A strange thing, our kind of punishment! It does not cleanse the offender, it is no expiation: on the contrary, it defiles more than the offense itself. Daybreak.

The Will to Power

What is good? – All that heightens the feeling of power, the will to power, power itself in man. The Anti-Christ.

Not necessity, not desire – no, the love of power is the demon of men. Let them have everything – health, food, a place to live, entertainment – they are and remain unhappy and low-spirited: for the demon waits and waits and will be satisfied. Daybreak.

My idea is that every specific body strives to become master over all space and to extend its force (its will to power) and to thrust back all that resists its extension. But it continually encounters similar efforts on the part of other bodies and ends by coming to an arrangement ("union") with those of them that are sufficiently related to it: thus they then conspire together for power. And the process goes on. The Will to Power.

[Anything which] is a living and not a dying body... will have to be an incarnate will to power, it will strive to grow, spread, seize, become predominant – not from any morality or immorality but because it is living and because life simply is will to power... 'Exploitation'... belongs to the essence of what lives, as a basic organic function; it is a consequence of the will to power, which is after all the will to life. Beyond Good and Evil.

On Truth

There are no facts, only interpretations. Daybreak.

It is not things, but opinions about things that have absolutely no existence, which have so deranged mankind! Daybreak.

Convictions are more dangerous enemies of truth than lies. Human, all too Human.

Extreme positions are not succeeded by moderate ones, but by contrary extreme positions. The Will to Power.

Why does man not see things? He is himself standing in the way: he conceals things. Daybreak.

Mystical explanations are considered deep. The truth is that they are not even superficial. The Gay Science.

What are man's truths ultimately? Merely his irrefutable errors. The Gay Science.

Over immense periods of time the intellect produced nothing but errors. A few of these proved to be useful and helped to preserve the species: those who hit upon or inherited these had better luck in their struggle for themselves and their progeny. Such erroneous articles of faith... include the following: that there are things, substances, bodies; that a thing is what it appears to be; that our will is free; that what is good for me is also good in itself. The Gay Science.

Eternal recurrence

Never yield to remorse, but at once tell yourself: remorse would simply mean adding to the first act of stupidity a second. The Wanderer and his Shadow.

What, if some day or night a demon were to steal after you into your loneliest loneliness and say to you: "This life as you now live it and have lived it, you will have to live once more and innumerable times more; and there will be nothing new in it, but every pain and every joy and every thought and sigh and everything unutterably small or great in your life will have to return to you, all in the same succession and sequence – even this spider and this moonlight between the trees, and even this moment and I myself. The eternal hourglass of existence is turned upside down again and again, and you with it, speck of dust!" Would you not throw yourself down and gnash your teeth and curse the demon who spoke thus?... Or how well disposed would you have to become to yourself and to life to crave nothing more fervently than this ultimate eternal confirmation and seal? The Gay Science.

The Bad Man

Whoever has overthrown an existing law of custom has always first been accounted a bad man: but when, as did happen, the law could not afterwards be reinstated and this fact was accepted, the predicate gradually changed; history treats almost exclusively of these bad men who subsequently became good men! Daybreak.

I know my fate. One day there will be associated with my name the recollection of something frightful – of a crisis like no other before on earth, of the profoundest collision of conscience, of a decision evoked against everything that until then had been believed in, demanded, sanctified. I am not a man. I am dynamite. Ecce Homo.

Friedrich Nietzsche: Selection from Thus Spake Zarathustra, part four

The Higher Man*

Nietzsche's masterpiece, Thus Spake Zarathustra, is the story of a sage who has been living on a mountain contemplating the fate of mankind for many years. When he feels he has some answers to share, he comes down and attempts to preach. First he discovers (in the market-place) that there doesn't seem to be anyone who wants to hear what he has to say. He realizes that he has come down from the mountain too soon, that the people his message was for – "the higher men" – simply don't exist yet. Nevertheless, he gives this speech.

As you read it, keep in mind that what he is preaching is not intended for the ordinary people of today, but for a better people of the future. Some of it seems harsh, even negative. But the message is one we can recognize and sympathize with: We should avoid getting sucked into the conventional, restrained, even shrivelled lives of the ordinary, mediocre people around us but, instead, strive to realize our fullest potentials.

1

When I came to men for the first time, then did I commit the hermit's folly, the great folly: I appeared in the market-place.

And when I spoke to all, I spoke to none. In the evening, however, rope-dancers were my companions, and corpses; and I myself almost a corpse.

With the new morning, however, there came to me a new truth: Then did I learn to say "Of what account to me are market-place and crowd and crowd-noise and long crowd-ears!"

You higher men, learn this from me: In the market-place no one believes in higher men. But if you will speak there, very well! The crowd, however, sputters "We are all equal."

"You higher men," – so sputters the crowd – "there are no higher men, we are all equal; man is man, before God – we are all equal!"

Before God! – Now, however, this God has died. Before the crowd, however, we will not be equal. You higher men, go away from the market-place!

2

Before God! – Now however this God has died! You higher men, this God was your greatest danger.

Only since he lay in the grave have you again arisen. Only now comes the great noontide, only now does the higher man become – master!

Have you understood this word, O my brothers? You are frightened: Do your hearts turn giddy? Does the abyss here yawn for you? Does the hell-hound here yelp at you?

Well! Take heart, you higher men! Only now does the mountain of the human future begin to work. God has died: Now we desire that the Superman live!

* Adapted from THUS SPAKE ZARATHUSTRA by Friedrich Nietzsche (1891), translated by Thomas Common. Available at http://members.aol.com/Magnetar/private/Zarathustra/Z73.html. Interpretation by C. George Boeree.

3

The most careful ask today "How is man to be maintained?" I, Zarathustra, ask, as the first and only one: "How is man to be surpassed?"

The Superman I have at heart – that is the first and only thing to me – and not man: Not the neighbour, not the poorest, not the sorriest, not the best.

O my brothers, what I can love in man is that he is an over-coming and a down-going. And also in you there is much that makes me love and hope.

In that you have despised, you higher men, that makes me hope. For the great despisers are the great reverers.

In that you have despaired, there is much to honour. For you have not learned to submit yourselves, you have not learned petty policy.

For today the petty people have become master: They all preach submission and humility and policy and diligence and consideration and the long et cetera of petty virtues.

Whatever is of the effeminate type, whatever originates from the servile type, and especially the crowd-mishmash – that is what wishes now to be master of all human destiny – O disgust! Disgust! Disgust!

They ask and ask and never tire of asking: "How is man to maintain himself best, longest, most pleasantly?" Thereby are they the masters of today.

These masters of today, surpass them, O my brothers: These petty people, they are the Superman's greatest danger!

Surpass, you higher men, the petty virtues, the petty policy, the sand-grain considerateness, the ant-hill politeness, the pitiable comfortableness, the "happiness of the greatest number!"

And rather despair than submit yourselves! And verily, I love you, because you do not know how to live today, you higher men! For thus do you live best!

4

Have you courage, O my brothers? Are you stout-hearted? Not the courage before witnesses, but hermit courage and eagle courage, which not even a God any longer beholds?

Cold souls, mules, the blind and the drunken, I do not call stout-hearted. He has heart who knows fear, but conquers it; who sees the abyss, but with pride.

He who sees the abyss, but with eagle's eyes, he who with eagle's talons grasps the abyss: He has courage.

5

"Man is evil" – so all the wisest ones said to me for consolation. Ah, if only it were still true today! For evil is man's best strength.

"Man must become better and more evil"- so do I teach. The most evil is necessary for the Superman's best.

It may have been well for the preacher of the petty people to suffer and be burdened by men's sin. I, however, rejoice in great sin as my great consolation!

Such things, however, are not said for long crowd-ears. Every word, also, is not suited for every mouth. These are fine far-away things: At them sheep's hooves shall not grasp!

6

You higher men, do you think that I am here to put right what you have put wrong?

Or that I wish henceforth to make snugger couches for you sufferers? Or show you restless, lost, and confused climbers new and easier footpaths?

No! No! Three times No! Always more, always better ones of your type shall succumb, for you shall always have it worse and harder.

Thus only does man grow upwards to the height where the lightning strikes and shatters him: High enough for the lightning!

Out to the few, the long, the remote go my soul and my seeking: Of what account to me are your many little, short miseries!

You do not yet suffer enough for me! For you suffer from yourselves, but you have not yet suffered from man. You would lie if you spoke otherwise! None of you suffers from what I have suffered.

7

It is not enough for me that the lightning no longer does harm. I do not wish to conduct it away: It shall learn to work for me.

My wisdom has accumulated long like a cloud: It becomes stiller and darker. So does all wisdom which shall one day bear lightning.

To these men of today will I not be light, nor be called light. Them will I blind: Lightning of my wisdom! Put out their eyes!

8

Do not will anything beyond your power: There is a bad falseness in those who will beyond their power.

Especially when they will great things! For they awaken distrust in great things, these subtle false-coiners and stage-players –

Until at last they are false towards themselves, squint-eyed, pale cankers, glossed over with strong words, parade virtues and brilliant false deeds.

Take good care there, you higher men! For nothing is more precious to me, and rarer, than honesty.

Is this today not that of the crowd? The crowd however knows not what is great and what is small, what is straight and what is honest: It is innocently crooked, it always lies.

9

Have a good distrust today, you higher men, you enheartened ones, you open-hearted ones! And keep your reasons secret! For this today is of the crowd.

What the crowd once learned to believe without reason, who could refute it to them by means of reason?

And on the market-place one convinces with grand gestures. But reason makes the crowd distrustful.

And when truth occasionally triumphs there, then ask yourselves with good distrust: "What strong error has fought for it?"

Be on your guard also against the intellectuals! They hate you, because they are unproductive! They have cold, withered eyes before which every bird is unplumed.

Such persons brag about not lying: but inability to lie is still far from being love of truth. Be on your guard!

Freedom from fever is still far from being knowledge! Icy spirits I do not believe in. He who cannot lie, does not know what truth is.

10

If you would go up high, then use your own legs! Do not get yourselves carried aloft; do not seat yourselves on other people's backs and heads! Are you mounted, however, on horseback? You now ride briskly up to your goal? Fine, my friend! But your lame foot is also with you on horseback! When you reach your goal, when you alight from your horse, precisely at your highest, you higher man, then will you stumble!

11

You creating ones, you higher men! One is only pregnant with one's own child.

Do not let yourselves be imposed upon or put upon! Who then is your neighbor? Even if you act "for your neighbor"– you still do not create for him!

Unlearn, I pray you, this "for," you creating ones: Your very virtue wishes you to have nothing to do with "for" and "on account of" and "because." Against these false little words shall you stop your ears.

"For one's neighbour," is the virtue only of the petty people: There it is said "birds of a feather," and "one hand washes the other." They have neither the right nor the power for your self-seeking!

In your self-seeking, you creating ones, there is the foresight and foreseeing of the pregnant! What no one's eye has yet seen – the fruit! – this, shelters and saves and nourishes your entire love.

Where your entire love is, namely with your child, there is also your entire virtue! Your work, your will is your "neighbour": Let no false values impose themselves upon you!

12

You creating ones, you higher men! Whoever has to give birth is sick; and whoever has given birth is unclean.

Ask women: one gives birth, not because it gives pleasure. The pain makes hens and poets cackle.

You creating ones, in you there is much uncleanliness. That is because you have had to be mothers.

A new child: Oh, how much new filth has also come into the world! Go apart! He who has given birth shall wash his soul!

13

Be not virtuous beyond your powers! And seek nothing from yourselves opposed to probability!

Walk in the footsteps in which your fathers' virtue has already walked! How will you rise high, if your fathers' will does not rise with you?

He, however, who would be a firstling, let him take care lest he also become a lastling! And where the vices of your fathers are, there should you not set yourself up as saints!

He whose fathers were inclined to women, and to strong wine and flesh of the wild boar – what would it be if he demanded chastity of himself?

A folly would it be! Rather, it seems to me, that he should be the husband of one or of two or of three women.

And if he founded monasteries, and inscribed over their portals: "The way to holiness" – I should still say: What good is it? It is a new folly!

He has founded for himself a penance-house and refuge-house: much good may it do! But I do not believe in it.

In solitude there grows what one brings into it – including the brute in one's own nature. Thus is solitude inadvisable to many.

Has there ever been anything filthier on earth than the saints of the wilderness? Around them was not only the devil loose – but also the swine.

14

Shy, ashamed, awkward, like the tiger whose spring has failed – thus, you higher men, have I often seen you slink aside. A cast which you made has failed –

But what does it matter, you dice-players! Have you not learned to play and joke, as one must play and joke? Do we not ever sit at a great table of joking and playing?

And if great things have been a failure with you, have you yourselves therefore been a failure? And if you yourselves have been a failure, has man therefore been a failure? If man, however, has been a failure – well then? Never mind!

15

The higher its type, always the less often does a thing succeed. You higher men here, have you not all been failures?

Be of good cheer; what does it matter? How much is still possible! Learn to laugh at yourselves, as you ought to laugh!

What wonder even that you have failed and only half succeeded, you half-shattered ones! Does not man's future strive and struggle within you?

Man's furthest, profoundest, star-highest issues, his prodigious powers, do not all these foam through one another in your cup?

What wonder that many a cup shatters! Learn to laugh at yourselves, as you ought to laugh! You higher men, oh, how much is still possible!

And verily, how much has already succeeded! How rich is this earth in small, good, perfect things, in well-constituted things!

Set around you small, good, perfect things, you higher men. Their golden maturity heals the heart. The perfect teaches one to hope.

16

What has until now been the greatest sin here on earth? Was it not the word of him who said: "Woe to them that laugh now!"

Did he himself find no cause for laughter on the earth? Then he sought badly. Even a child finds cause for it.

He did not love enough: Otherwise would he also have loved us, the laughing ones! But he hated and hooted us; wailing and teeth-gnashing did he promise us.

Must one then curse immediately, when one does not love? That seems to me in bad taste. Thus did he, however, this absolute one. He sprang from the crowd.

And he himself just did not love sufficiently; otherwise would he have raged less because people did not love him. Great love does not seek love – it seeks more!

Go out of the way of all such absolute ones! They are a poor sickly type, a crowd-type: They look at this life with ill-will, they have an evil eye for this earth.

Go out of the way of all such absolute ones! They have heavy feet and sultry hearts – they do not know how to dance. How could the earth be light to such ones!

17

Sinuously do all good things approach their goal. Like cats they curve their backs, they purr inwardly with their approaching happiness – all good things laugh.

His step betrays whether a person already walks on his own path: Just see me walk! He, however, who comes close to his goal, dances.

And verily, a statue have I not become, nor yet do I stand there stiff, stupid and stony, like a pillar; I love fast racing.

And though there be on earth swamps and thick melancholy, he who has light feet runs even across the mud, and dances, as upon well-swept ice.

Lift up your hearts, my brothers, high, higher! And do not forget your legs! Lift up also your legs, you good dancers, and better still, stand upon your heads!

18

This crown of the laughter, this rose-garland crown: I myself have put on this crown, I myself have consecrated my laughter. No one else have I found today potent enough for this.

Zarathustra the dancer, Zarathustra the light one, who beckons with his wings, ready for flight, beckoning to all birds, ready and prepared – a blissfully light-spirited one:

Zarathustra the soothsayer, Zarathustra the sooth-laugher, no impatient one, no absolute one, but one who loves leaps and somersaults; I myself have put on this crown!

19

Lift up your hearts, my brothers, high, higher! And do not forget your legs! Lift up also your legs, you good dancers, and better still if you stand upon your heads!

There are also heavy animals in this state of happiness, there are thoroughly heavy-footed ones. Curiously do they exert themselves, like an elephant which endeavours to stand upon its head.

Better, however, to be foolish with happiness than foolish with misfortune, better to dance awkwardly than walk lamely. So learn, I pray you, my wisdom, you higher men: Even the worst thing has two good reverse sides...

...Even the worst thing has good dancing-legs: So learn, I pray you, you higher men, to put yourselves on your proper legs!

So unlearn, I pray you, the melancholy and all the crowd-sadness! Oh, how sad the buffoons of the crowd seem to me today! This today, however, is that of the crowd.

20

Be like the wind when it rushes forth from its mountain-caves: To its own piping will it dance; the seas tremble and leap under its footsteps.

That which gives wings to asses, that which milks the lionesses: Praised be that good, unruly spirit, which comes like a hurricane to all the present and to all the crowd –

That which is hostile to thistle-heads and puzzle-heads, and to all withered leaves and weeds: Praised be this wild, good, free spirit of the storm, which dances upon swamps and afflictions, as upon meadows!

That which hates the consumptive crowd-dogs, and all their ill-constituted, sullen brood: Praised be this spirit of all free spirits, the laughing storm, which blows dust into the eyes of all the dark-sighted and melancholic!

You higher men, the worst thing in you is that you have, none of you, learned to dance as you ought to dance – to dance beyond yourselves! What does it matter that you have failed?

How many things are still possible! So learn to laugh beyond yourselves! Lift up your hearts, you good dancers, high! higher! And do not forget good laughter!

This crown of laughter, this rose-garland crown: to you, my brothers, do I cast this crown! Laughing have I consecrated: You higher men, learn, I pray you – to laugh!

The Beginnings of Psychology

Psychology as we know it didn't suddenly appear on the intellectual scene. It is impossible to say just when it began, or who was responsible for it. Instead, we can only point to a number of currents that take us from philosophy and the natural sciences into something recognizably psychological. This chapter looks at two of these "primordial" currents – associationism as the beginnings of a cognitive theory, and the introduction of quantification in the forms of psychophysics and intelligence testing.

Associationism

Associationism is the theory that the mind is composed of elements – usually referred to as sensations and ideas – which are organized by means of various associations. Although the original idea can be found in Plato, it is Aristotle who gets the credit for elaborating on it. Aristotle counted four laws of association when he examined the processes of remembrance and recall:

1. The law of contiguity. Things or events that occur close to each other in space or time tend to get linked together in the mind. If you think of a cup, you may think of a saucer; if you think of making coffee, you may then think of drinking that coffee.

2. The law of frequency. The more often two things or events are linked, the more powerful will be that association. If you have an eclair with your coffee every day, and have done so for the last twenty years, the association will be strong indeed – and you will be fat.

3. The law of similarity. If two things are similar, the thought of one will tend to trigger the thought of the other. If you think of one twin, it is hard not to think of the other. If you recollect one birthday, you may find yourself thinking about others as well.

4. The law of contrast. On the other hand, seeing or recalling something may also trigger the recollection of something completely opposite. If you think of the tallest person you know, you may suddenly recall the shortest one as well. If you are thinking about birthdays, the one that was totally different from all the rest is quite likely to come up.

Association, according to Aristotle, took place in the "common sense." It was in the common sense that the look, the feel, the smell, the taste of an apple, for example, came together to become the idea of an apple.
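
Just for fun, the first two laws are easy to make concrete in a few lines of modern code. The sketch below is purely illustrative – the class name, the weighting scheme, and the coffee-and-eclair data are my own inventions, not anything from Aristotle – but it shows contiguity (things experienced together get linked) and frequency (repetition strengthens the link) at work. Similarity and contrast would require some notion of feature overlap, which this toy omits.

from collections import defaultdict

class ToyAssociativeMemory:
    def __init__(self):
        # (idea, idea) -> link strength
        self.strength = defaultdict(float)

    def experience_together(self, a, b):
        # Law of contiguity: things experienced together get linked.
        # Law of frequency: every repetition strengthens the link.
        self.strength[tuple(sorted((a, b)))] += 1.0

    def recall(self, idea):
        # Thinking of one idea calls up its associates, strongest first.
        linked = [(b if a == idea else a, s)
                  for (a, b), s in self.strength.items()
                  if idea in (a, b)]
        return sorted(linked, key=lambda pair: -pair[1])

mem = ToyAssociativeMemory()
for _ in range(20):                         # the daily eclair, for years
    mem.experience_together("coffee", "eclair")
mem.experience_together("coffee", "saucer")
print(mem.recall("coffee"))                 # [('eclair', 20.0), ('saucer', 1.0)]

Thinking of "coffee" calls up "eclair" before "saucer" here simply because that pairing has occurred more often – the law of frequency in action.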

For 2000 years, these four laws were assumed to hold true. St. Thomas pretty much accepted it lock, stock, and barrel. No one, however, cared that much about association. It was seen as just a simple description of a commonplace occurrence. It was seen as the activity of passive reason, whereas the abstraction of principles or essences – far more significant to philosophers – was the domain of active reason.

During the Enlightenment, philosophers began to become interested in the idea again, as a part of their studies of vision as well as their interest in epistemology. Hobbes understood complex experiences as being associations of simple experiences, which in turn were associations of sensations. The basic means of association, according to Hobbes, was coherence (contiguity), and the basic strength factor was repetition (frequency).

John Locke, rejecting the possibility of innate ideas, made his entire system dependent on association of sensations into simple ideas. He did, however, distinguish between ideas of sensations and ideas of reflection, meaning active reason. Only by adding simple ideas of reflection to simple ideas of sensation could we derive complex ideas. He also suggested that complex emotions derived from pain and pleasure (simple ideas) associated with other ideas.

It was David Hume who really got into the issue. Recall that he saw all experiences as having no substantial reality behind them. So whatever coherence the world (or the self) seems to have is a matter of the simple application of these natural laws of association. He lists three:

1. The law of resemblance – i.e. similarity.

2. The law of contiguity.

3. The law of cause and effect – basically contiguity in time.

David Hartley (1705-1757) was an English physician who was responsible for making the idea of associationism popular, especially in a book called Observations on Man. His emphasis was on the law of contiguity (in time and space) and the law of frequency. But he added an idea he got from the famous Isaac Newton: This association was a matter of tuned "vibrations" within the nerves! His basic ideas are very similar to those of D. O. Hebb in the twentieth century.
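
To see the family resemblance, here is Hebb's rule – roughly, "cells that fire together wire together" – written out as a tiny sketch. The notation is modern and the numbers arbitrary; this is meant only to suggest the parallel, not to reconstruct Hartley's vibration theory.

import numpy as np

eta = 0.1                        # arbitrary learning rate
x = np.array([1.0, 0.0, 1.0])    # activity of three "input" units
y = np.array([1.0, 1.0, 0.0])    # activity of three "output" units

W = np.zeros((3, 3))             # connection strengths, initially zero
for _ in range(10):              # repetition: the law of frequency
    W += eta * np.outer(y, x)    # co-active pairs strengthen their link

print(W)   # only connections between units active together have grown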

James Mill (1773-1836) also elaborated on Hume's associationism. The elder Mill saw the mind as passively functioning by the law of contiguity, with the law of frequency and a law of vividness "stamping in" the association. His emphasis on the law of frequency as the key to learning makes his approach very similar to the behaviorists in the twentieth century. But he is most famous for being the father of...

John Stuart Mill

"That so few now dare to be eccentric marks the chief danger of the time."

John Stuart Mill was born May 20, 1806 in London. His father was James Mill, an historian, philosopher, and social theorist. His mother was Harriet Barrow, who seems to have had next to no influence on him! His father decided to use the principles of utilitarianism and associationism (in consultation with his good friend, Jeremy Bentham) to educate John "scientifically."

This seemed to work quite well: John began learning Greek at three, Latin at eight. At 14, he studied French, mathematics, and chemistry in France. At 16, he began working as a clerk for his father at India House, headquarters of the East India Company. By 18, John was publishing articles on utilitarian philosophy!

But at 20, he had a nervous breakdown, which he describes in his Autobiography (1873). He attributed it, no doubt rightly, to his rigid education.

In 1830, he met Harriet Taylor, a married woman. He remained loyal to her until her husband died 21 years later (!), at which point they married. Sadly, she died only seven years later.

During this time, he served as an examiner for the East India Company. He also served as a liberal member of Parliament from 1865 to 1868. ("Conservatives are not necessarily stupid, but most stupid people are conservatives.") He died at his home in Avignon, France, on May 8, 1873.

His best known work is On Liberty, published in 1859. His most important work as far as science and psychology are concerned is A System of Logic, first printed in 1843 and going through many more editions through the rest of the 1800's.

He began with the basics established by Hume, his father James Mill, and others:

1. A sensory impression leaves a mental representation (an idea or image).

2. If two stimuli are presented together repeatedly, they create an association in the mind.

3. The intensity of such a pairing can serve the same function as repetition.

But he adds that associations can be more than the simple sum of their parts. They can have attributes or qualities different from the parts, in the same way that water has different qualities than the hydrogen and oxygen that compose it. So J. S. Mill's associationism is more like "mental chemistry" than mental addition.

J. S. Mill agrees with Hume that all we can know about our world and ourselves is what we experience, but notes that generalization allows us to talk with some confidence about things beyond experience. And he believed that there are real causes for consistent phenomena!

This is often called phenomenalism. He defines matter, for example, as "the permanent possibility of sensation." This perspective would have profound effects on the 20th century logical positivists (Wittgenstein, Ayer, Schlick, Carnap, and others), who provided the philosophical foundation for most behaviorists.

He promotes a scientific method that focuses on induction: Generalizations from experiences lead to theory, from which we then develop alternative hypotheses; we go on to test these hypotheses by observation and experiment, the results of which allow us to improve the theory, and so on. This circular notion of scientific progress is known as the hypothetico-deductive method. In this way we slowly build up laws of nature in which we can be increasingly confident. This method proved to be very popular among the scientists of his day.

He more specifically outlines five procedures for establishing causation. The simpler ones go like this (a toy illustration in code follows the list):

1. The method of agreement: If a phenomenon occurs in two different situations, and those two situations have only one thing in common, that "thing" is the cause (or effect) of the phenomenon.

2. The method of differences: If a phenomenon occurs in one situation but not in another, and those two situations have everything in common except for one thing, then that "thing" is the cause (or effect) of the phenomenon.

3. The method of concomitant variations: If one phenomenon varies consistently with the variations of another phenomenon, one is the cause or effect, or is otherwise involved in the causation, of the other. This, of course, is the foundation for correlation which, although it cannot establish the direction of causality, does indicate some causal relationship.
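
Since these canons are essentially mechanical, they can be expressed as a procedure. Here is a minimal sketch in Python (my own illustration – the set-based encoding and the meal example are hypothetical, not Mill's):

    # Each situation is modeled as a set of candidate factors; each method
    # returns the factor(s) singled out as the candidate cause (or effect).

    def method_of_agreement(situation_a, situation_b):
        # Both situations show the phenomenon; whatever they share is suspect.
        return situation_a & situation_b

    def method_of_difference(with_phenomenon, without_phenomenon):
        # The situations are alike except for one factor; that factor is suspect.
        return with_phenomenon - without_phenomenon

    # Hypothetical example: illness follows two meals that share only the oysters.
    meal_1 = {"oysters", "bread", "wine"}
    meal_2 = {"oysters", "cheese", "beer"}
    print(method_of_agreement(meal_1, meal_2))              # {'oysters'}
    print(method_of_difference(meal_1, {"bread", "wine"}))  # {'oysters'}

The method of concomitant variations is, in effect, what correlation would later quantify.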

When it comes to psychology, he argued that it could indeed someday become a science, but was unlikely to ever be an exact science. Predicting the behavior of human beings may be forever beyond our abilities, leaving us to limit ourselves to talking about tendencies.

His utilitarianism recognizes that happiness is not restricted to physical pleasures (or the avoidance of pain), that there may be different kinds or qualities of happiness. "It is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied." So, although we certainly begin as simple pleasure-seeking creatures, over time we can acquire far more humanistic motivations. Ultimately, this means that high moral values can be taught, and are not dependent on innate qualities of character.

When looking at social issues, J. S. Mill applies his expanded utilitarianism: Does a certain institution add to human welfare? Or are there better alternatives? He argues, for example, that women should be allowed to vote because women's self-interests can add balance to men's self-interests, and lead to a better society. He argues for personal freedom because it allows creative individuals to better contribute to society. On the other hand, he notes that free-market capitalism tends to result in inequity and poverty, and we would be better served by some form of socialism.

Thomas Brown (1778-1820) of the Scottish School puts the finishing touches to associationism: His laws of suggestion (i.e. association) were resemblance, contrast, and nearness in space and time, just like Aristotle's. He added a set of secondary laws – duration, liveliness, frequency, and recency – that strengthened suggestions. Then he considered as well the degrees of coexistence with other associations, constitutional differences of mind or temperament, differing circumstances of the moment, state of health or efficiency of the body, and prior habits. Finally, he understood association as an active process of an active, holistic mind.

Alexander Bain (1818-1903), a lifelong friend of John Stuart Mill, connected associationism with physiology. Accepting the laws of contiguity, similarity, and frequency, he viewed them, as had Hartley, as neurological. He added the law of compound association, which says that most associations are among whole clusters of other associations. And he added the law of constructive association, which says that we can also actively, creatively add to our associations ourselves.

One of Bain's basic principles is immortalized as the Spencer-Bain principle: The frequency or probability of a behavior rises if it is followed by a pleasurable event, and decreases if it is followed by a painful event. This is, of course, the same principle that the behaviorists would elaborate on a century later.
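
Bain stated the principle informally, but it is easy to see its behaviorist future in a toy model. A minimal sketch (the update rule and the numbers are my own illustration, not anything Bain wrote):

    def update_probability(p, outcome, step=0.1):
        # Nudge the behavior's probability up after pleasure, down after pain.
        if outcome == "pleasure":
            return p + step * (1.0 - p)
        if outcome == "pain":
            return p - step * p
        return p

    p = 0.5
    for outcome in ["pleasure", "pleasure", "pain"]:
        p = update_probability(p, outcome)
        print(round(p, 3))  # 0.55, then 0.595, then 0.536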

Bain has an even larger role in the history of psychology. First, he is often given credit for having written two of the earliest textbooks in psychology – The Senses and the Intellect (1855) and The Emotions and the Will (1859), both of which went through many editions, and were used, for example, by William James. He also founded the first English-language psychological journal, called Mind, in January of 1876.

Hermann Ebbinghaus

The preceding people were essentially philosophers, not scientists. The first psychologist who made an effort to study association scientifically was Hermann Ebbinghaus.

Hermann Ebbinghaus was born on January 23, 1850, in Barmen, Germany. His father was a wealthy merchant, who encouraged his son to study. Hermann attended the University of Halle and the University of Berlin, and received his doctorate from the University of Bonn in 1873. While traveling through Europe, he came across a copy of Fechner’s Elements of Psychophysics, which turned him on to psychology.

Ebbinghaus worked on his research at home in Berlin and published a book called On Memory: An Investigation in Experimental Psychology in 1885. Basically, his research involved the memorization of nonsense syllables, each consisting of a consonant, a vowel, and another consonant. He would select a dozen of these syllables, then attempt to master the list. He recorded the number of trials it took, as well as the effects of variations such as relearning old material or the meaningfulness of the syllables. The results have been confirmed and are still valid today.
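
The consonant-vowel-consonant recipe is simple enough to automate. A quick sketch in the same spirit (Ebbinghaus's actual lists were prepared by hand, in German; everything here is just an illustration):

    import random

    CONSONANTS = "bdfghklmnprstw"
    VOWELS = "aeiou"

    def nonsense_syllable():
        # consonant + vowel + consonant, the pattern Ebbinghaus used
        return (random.choice(CONSONANTS) + random.choice(VOWELS)
                + random.choice(CONSONANTS))

    random.seed(1885)  # the year On Memory appeared
    print([nonsense_syllable() for _ in range(12)])  # a dozen-item list to master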

He also wrote the first article on intelligence testing of school children, and devised a sentence completion test that became a part of the Binet-Simon test. He also published textbooks on psychology in 1897 and 1902 that were very popular for many years. Hermann Ebbinghaus died in 1909, a clear precursor to today’s cognitive movement.

The laws of association would continue to have a powerful influence in psychology. The Behaviorists, of course, focused on stimulus-stimulus and stimulus-response associations. The Gestalt psychologists elaborated on the various associations they termed the laws of Prägnanz. Among the cognitive psychologists, there are various theories of semantic association. And the physiological psychologists talk about the neurological bases for association. The idea appears to be here to stay. But then, as Greek and Medieval philosophers knew, association is just a simple description of a commonplace occurrence!

Psychophysics

Again and again, philosophers stated unequivocally that psychology could never be a science. The activities and the contents of the mind could not be measured, and therefore an objectivity such as that achieved in physics and chemistry was out of reach. Psychology would forever remain subjective!

This would finally change in the early 1800s. Ernst Weber (1795-1878) was born June 24 in Wittenberg, Germany, the third of 13 children! He received his doctorate from the University of Leipzig in 1815, in physiology. He began teaching there right after graduation, and continued until he retired in 1871.

His research was predominantly concerned with the senses of touch and kinesthesia (the experience of muscle position and movement). He was the first to clearly demonstrate the existence of kinesthesia, and showed that touch was actually a conglomerate sense composed of senses for pressure, temperature, and pain.

His chosen interests led him to certain techniques: First, there is the two-point threshold, which is a matter of measuring the smallest distance noticeable to touch at various parts of the body. For example, the tongue had the smallest threshold (1 mm), and the back had the largest (60 mm).

A second technique involved kinesthesia: The just-noticeable difference is the smallest difference in weight a person is capable of perceiving when holding two things. He discovered that the just-noticeable difference was a constant fraction of the weights involved. If you are holding a 40 pound weight in one hand, you will be able to recognize that a 41 pound weight in the other hand is in fact different. But if it were a 20 pound weight, you could detect a mere half pound difference! In other words, as regards weight, we can recognize a 1/40 difference, whatever the weights.

This is known as Weber’s Law, and is the first such "law" relating a physical stimulus with a mental experience.
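
In modern notation, the just-noticeable difference is a constant fraction k of the stimulus intensity. Reproducing the arithmetic of the weight example (k = 1/40 comes from the passage above; the code itself is just an illustration):

    def jnd(intensity, k=1/40):
        # Weber's Law: the smallest detectable change is k times the intensity.
        return k * intensity

    print(jnd(40))  # 1.0 - a one pound difference is just detectable at 40 pounds
    print(jnd(20))  # 0.5 - half a pound suffices at 20 pounds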

Gustav Fechner

Gustav Fechner was born April 1, 1801. His father, a village pastor, died early in Gustav’s childhood, so he, with his mother and brother, went to live with their uncle. In 1817, at the age of 16, he went off to study medicine at the University of Leipzig (where Weber was teaching). He received his MD degree in 1822 at the age of 21.

But his interests moved to physics and math, so he made his living tutoring, translating, and occasionally lecturing. After writing a significant paper on electricity in 1831, he was invited to become a professor of physics at Leipzig. There, he became friends with a number of people, including Wilhelm Wundt, and his interests moved again, this time to psychology, especially vision.

In 1840, he had a nervous breakdown, and he had to resign his position due to severe depression. At his worst, he stayed in his rooms alone, avoiding light which hurt his eyes, and even painted his room black. While lying in bed one morning, October 22, 1850, he suddenly realized that it was indeed possible to connect the measurable physical world with the mental world, supposed to be inaccessible to scientific investigation! As his condition improved, he returned to writing and performing endless experiments, using mostly himself as a subject.

Like many people at the time, he found Spinoza’s double-aspectism convincing and found in panpsychism something akin to a personal religion. Using the pseudonym Dr. Mises, he wrote a number of satires about the medicine and philosophy of his day. But he also used it to communicate, often in an amusing way, his spiritual perspective. As a panpsychist, he believed that all of nature was alive and capable of awareness of one degree or another. Even the planet earth itself, he believed, had a soul. He called this the day-view, and opposed it to the night-view of materialism.

Further, he felt that our lives come in three stages – the fetal life, the ordinary life, and the life after death. When we die, our souls join with other souls as part of the supreme soul.

It was double-aspectism that led him to study (and name) psychophysics, which he defined as the study of the systematic relationships between physical events and mental events. In 1860, he topped his career by publishing the Elements of Psychophysics.

In this book, he introduced a mathematical expression of Weber’s Law. The expression looked like this...

ΔR / R = k

which means that the proportion of the minimum detectable change in stimulus (ΔR) to the strength of the stimulus (R) is a constant (k). (R is for the German Reiz, meaning stimulus.) Or...

S = k log R

where S is the experienced sensation.
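
The logarithm is what turns Weber's constant ratio into an additive scale: each doubling of the stimulus adds the same increment of sensation. A minimal sketch (the values of k and the threshold stimulus R0 are illustrative assumptions):

    import math

    def sensation(R, k=1.0, R0=1.0):
        # Fechner's Law: S = k log(R / R0), where R0 is the threshold
        # stimulus at which sensation is taken to be zero.
        return k * math.log(R / R0)

    for R in [1, 2, 4, 8]:
        print(R, round(sensation(R), 3))  # equal steps: 0.0, 0.693, 1.386, 2.079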

Fechner died November 28, 1887.

What makes Weber and Fechner far more significant than Weber’s Law itself is that they showed that psychological events are in fact tied to measurable physical events in a systematic way – something everyone had thought impossible. Psychology could be a science after all!

The second quantitative breakthrough would be the measurement of something far more complex, far more "psychological": intelligence. We owe this to two great minds in particular: Sir Francis Galton in England and Alfred Binet in France.

Sir Francis Galton

Francis Galton was born February 16, 1822 near Birmingham, England. He was the youngest of 7 children, and first cousin of Charles Darwin. His father, a wealthy banker, insisted on educating Francis at home, especially considering that Francis could read at 2 and a half years old!

Later in childhood, he was sent off to boarding school, which he despised and criticized even in adulthood. At 16, he went to medical school at King’s College, London. He finished his degree at Cambridge in 1843, at 21.

His father died, leaving Galton a wealthy young aristocrat. He traveled extensively and became a member of the Royal Geographical Society, for which he developed maps of new territories and accounts of his adventures. He became president of that organization in 1856.

Galton had a penchant for measuring everything – extending even to the behinds of women he encountered in his travels in Africa (something he had to do from a distance, of course, by means of triangulation). This interest in measurement led to his invention of the weather map (including highs, lows, and fronts – terms he introduced), and to his suggesting the use of fingerprints to Scotland Yard.

His obsession eventually led to his efforts at measuring intelligence. In 1869, he published Hereditary Genius: An Inquiry into its Laws and Consequences, in which he demonstrates that the children of geniuses tend to be geniuses themselves.

In 1874, he produced English Men of Science: Their Nature and Nurture, based on long surveys passed out to thousands of established scientists. In this volume, he noted that, although the potential for high intelligence is clearly inherited, it also needs to be nurtured to come to full fruition. In particular, the broad, liberal education provided by the Scottish school system proved far superior to the English school system he hated so much.

In 1883, he wrote Inquiries into Human Faculty and its Development. This would be the first time anyone compared identical and fraternal twins, a method now considered ideal when investigating nature vs nurture issues.

In 1888, he published Co-Relations and Their Measurement, Chiefly from Anthropometric Data. As the title suggests, it was Galton who invented correlation, as well as scatter plots and regression toward the mean. Later, Karl Pearson (1857-1936) would discover the mathematical formulation of correlation.
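
The coefficient Pearson formalized can be computed in a few lines. A from-scratch sketch (the parent and child heights are made-up numbers in the spirit of Galton's data, not his actual measurements):

    import math

    def pearson_r(xs, ys):
        # Pearson's r: covariance divided by the product of the standard deviations.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    parents  = [64.0, 66.0, 68.0, 70.0, 72.0]  # hypothetical heights in inches
    children = [66.0, 66.5, 68.0, 69.0, 70.5]
    print(round(pearson_r(parents, children), 3))  # 0.99: a strong co-relation

Notice that the children's heights are less spread out than the parents' – the same toy data also illustrate regression toward the mean.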

Sir Francis died in 1911, after an incredibly productive, if somewhat eccentric, life.

Alfred Binet

Born July 8, 1857 in Nice, France, Alfred was an only child. His mother, an artist, raised him by herself after a divorce from his father, a physician.

He started studying medicine, but decided to study psychology on his own – being independently wealthy left him free to do what he pleased! He worked with the neurologist Charcot at La Salpetriere, where he studied hypnosis.

In 1891, he moved to Paris to study at the physiological-psychology lab at the Sorbonne, where he developed a variety of research interests – especially, of course, individual differences. In 1899, he and his graduate student Theodore Simon (1873-1961) were commissioned by the French government to study retardation in the French schools, and to create a test to differentiate normal from retarded children.

After marriage, he began studying his own two daughters and testing them with Piaget-like tasks and other tests. This led to the publication of The Experimental Study of Intelligence in 1903.

In 1905, Binet and Simon came out with the Binet-Simon Scale of Intelligence, the first test permitting graduated, direct testing of intelligence. They expanded the test to normal children in 1908, and to adults in 1911.

Binet believed intelligence to be complex, with many factors, and not to be a simple, single entity. He didn’t like the use of a single number as developed by William Stern in 1911 – the intelligence quotient or IQ. He also believed that, though genetics may set upper limits on intelligence, most of us have plenty of room for improvement with the right kind of education.
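
The single number Binet distrusted is easy to state: Stern's quotient divides the mental age a child tests at by the child's chronological age, scaled by 100. A minimal sketch:

    def ratio_iq(mental_age, chronological_age):
        # Stern's intelligence quotient, scaled by 100
        return 100.0 * mental_age / chronological_age

    print(ratio_iq(10, 8))  # 125.0 - testing two years ahead of one's age
    print(ratio_iq(6, 8))   # 75.0 - testing two years behind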

He cautioned that his tests should be used with restraint: Even a child two years behind his age level may later prove to be brighter than most! He was afraid that IQ would prejudice teachers and parents, and that people would tend to view it as fixed and prematurely give up on kids who score low early on.

He suggested something he called mental orthopedics: Exercises in attention and thought that could help disadvantaged children "learn how to learn." He died in 1911, a man way ahead of his time, and wiser than most!

Binet’s fears were well founded. For example, Charles Spearman (1863-1945) introduced the idea that "general intelligence" (g) was real, unitary, and inherited.

Worse were the antics of Henry Goddard (1866-1957). He translated the Binet-Simon test into English. He studied a family in New Jersey he named the Kallikaks. Some were normal, but quite a few were "feebleminded" (Goddard’s term). He traced their genealogy to support the heredity position. Because he believed that there was a close connection between feeblemindedness and criminality, he recommended that states institute programs of sterilization of the feebleminded. Twenty states passed such laws.

Goddard also tested immigrants, at the request of the Immigration Service. His testers found 40 to 50% of immigrants feebleminded, and they were immediately deported. He also cited particular countries as being more feebleminded than others! Keep in mind that these immigrants rarely spoke much English and were tested during the grueling process of passing through the bureaucracy of Ellis Island after a long ocean voyage in miserable conditions!

Eugenics – a term coined by Galton – is the policy of intentionally breeding human beings according to some standard, and sterilizing those who do not meet that standard. It became an institutionalized reality in 1907, when the Indiana legislature passed a law that made sterilization of "defectives" possible. A federal Eugenics Record Office was established in Cold Spring Harbor, and in 1914 its lawyers designed a law that was promoted as a model for the entire country.

Virginia adopted such a law in 1924. Emma Buck, her daughter Carrie, and infant granddaughter Vivian were judged to be feebleminded, and their case (Buck vs Bell) was taken before the Supreme Court. The Supreme Court, in an opinion written by Oliver Wendell Holmes, came down in support of the sterilization laws. Holmes stated:

"It is better for all the world, if instead of waiting to execute degenerate offspring for crime, or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind. The principle that sustains compulsory vaccination is broad enough to cover cutting the Fallopian tubes. Three generations of imbeciles are enough."

Although scientists disputed the reasoning behind the sterilization laws, 33 states adopted them, and some 65,000 American citizens were sterilized. The Nazis based their eugenics laws on the American ones and sterilized 350,000. Eugenics gradually became unpopular as the horrors of Nazi Germany became public, and gradually ended in the 1940's. The Supreme Court has yet to reverse its opinion on the matter, however.

People reading about eugenics and the sterilization laws often think that this is a great example of how immoral scientists can be. In reality, the laws were based on biblical passages which say that "like comes from like," the very same passages used today by creationists.

Selection from: Hereditary Talent and Character by Francis Galton (1865) *

The power of man over animal life, in producing whatever varieties of form he pleases, is enormously great. It would seem as though the physical structure of future generations was almost as plastic as clay, under the control of the breeder's will. It is my desire to show more pointedly than – so far as I am aware – has been attempted before, that mental qualities are equally under control.

....

So far as I am aware, no animals have ever been bred for general intelligence. Special aptitudes are thoroughly controlled by the breeder. He breeds dogs that point, that retrieve, that fondle, or that bite; but, no one has ever yet attempted to breed for high general intellect, irrespective of all other qualities. It would be a most interesting subject for an attempt. We hear constantly of prodigies of dogs, whose very intelligence makes them of little value as slaves. When they are wanted, they are apt to be absent on their own errands. They are too critical of their master's conduct. For instance, an intelligent dog shows marked contempt for an unsuccessful sportsman. He will follow nobody along a road that leads on a well-known tedious errand. He does not readily forgive a man who wounds his self-esteem. He is often a dexterous thief and a sad hypocrite. For these reasons an over-intelligent dog is not an object of particular desire, and therefore, I suppose, no one has ever thought of encouraging a breed of wise dogs. But it would be a most interesting occupation for a country philosopher to pick up the cleverest dogs he could hear of, and mate them together, generation after generation – breeding purely for intellectual power, and disregarding shape, size, and every other quality.

....

[How to breed a better man: Find him a better woman!]

As we cannot doubt that the transmission of talent is as much through the side of the mother as through that of the father, how vastly would the offspring be improved, supposing distinguished women to be commonly married to distinguished men, generation after generation, their qualities being in harmony and not in contrast, according to rules of which we are now ignorant, but which a study of the subject would be sure to evolve!

It has been said by Bacon that "great men have no continuance." I, however, find that very great men are certainly not averse to the other sex, for some such have been noted for their illicit intercourses, and, I believe, for a corresponding amount of illegitimate issue. Great lawyers are especially to be blamed in this, even more than poets, artists, or great commanders. It seems natural to believe that a person who is not married, or who, if married, does not happen to have children, should feel himself more vacant to the attractions of a public or a literary career than if he had the domestic cares and interests of a family to attend to. Thus, if we take a list of the leaders in science of the present day, the small number of them who have families is very remarkable. Perhaps the best selection of names we can make, is from those who have filled the annual scientific office of President of the British Association. We will take the list of the commoners simply, lest it should be objected, though unjustly, that some of the noblemen who have occupied the chair were not wholly indebted to their scientific attainments for that high position. Out of twenty-two individuals, about one-third have children; one-third are or have been married and have no children; and one-third have never been married. Among the children of those who have had families, the names of Frank Buckland and Alexander Herschel are already well-known to the public.

There has been a popular belief that men of great intellectual eminence are usually of feeble constitution, and of a dry and cold disposition. There may be such instances, but I believe the general rule to be exactly the opposite. Such men, so far as my observation and reading extend, are usually more manly and genial than the average, and by the aid of these very qualities, they obtain a recognised ascendancy. It is a great and common mistake to suppose that high intellectual powers are commonly associated with puny frames and small physical strength. Men of remarkable eminence are almost always men of vast powers of work. Those among them that have fallen into sedentary ways will frequently astonish their friends by their physical feats, when they happen to be in the mood of a vacation ramble. The Alpine Club contains a remarkable number of men of fair literary and scientific distinction; and these are among the strongest and most daring of the climbers. I believe, from my own recollections of the thews and energies of my contemporaries and friends of many years at Cambridge, that the first half-dozen class-men in classics or mathematics would have beaten, out of all proportion, the last half-dozen class-men in any trial of physical strength or endurance. Most notabilities have been great eaters and excellent digesters, on literally the same principle that the furnace which can raise more steam than is usual for one of its size must burn more freely and well than is common. Most great men are vigorous animals, with exuberant powers, and an extreme devotion to a cause. There is no reason to suppose that, in breeding for the highest order of intellect, we should produce a sterile or a feeble race.

Many forms of civilization have been peculiarly unfavourable to the hereditary transmission of rare talent. None of them were more prejudicial to it than that of the Middle Ages, where almost every youth of genius was attracted into the Church, and enrolled in the ranks of a celibate clergy.

Another great hindrance to it is a costly tone of society, like that of our own, where it becomes a folly for a rising man to encumber himself with domestic expenses, which custom exacts, and which are larger than his resources are able to meet. Here also genius is celibate, at least during the best period of manhood.

A spirit of caste is also bad, which compels a man of genius to select his wife from a narrow neighborhood or from the members of a few families.

But a spirit of clique is not bad. I understand that in Germany it is very much the custom for professors to marry the daughters of other professors, and I have some reason to believe, but am anxious for further information before I can feel sure of it, that the enormous intellectual digestion of German literary men, which far exceeds that of the corresponding class of our own country-men, may, in some considerable degree, be traceable to this practice.

So far as beauty is concerned, the custom of many countries, of the nobility purchasing the handsomest girls they could find for their wives, has laid the foundation of a higher type of features among the ruling classes. It is not so very long ago in England that it was thought quite natural that the strongest lance at the tournament should win the fairest or the noblest lady. The lady was the prize to be tilted for. She rarely objected to the arrangement, because her vanity was gratified by the éclat of the proceeding. Now history is justly charged with a tendency to repeat itself. We may, therefore, reasonably look forward to the possibility, I do not venture to say the probability, of a recurrence of some such practice of competition. What an extraordinary effect might be produced on our race, if its object was to unite in marriage those who possessed the finest and most suitable natures, mental, moral, and physical!

Let us, then, give reins to our fancy, and imagine a Utopia – or a Laputa, if you will – in which a system of competitive examination for girls, as well as for youths, had been so developed as to embrace every important quality of mind and body, and where a considerable sum was yearly allotted to the endowment of such marriages as promised to yield children who would grow into eminent servants of the State. We may picture to ourselves an annual ceremony in that Utopia or Laputa, in which the Senior Trustee of the Endowment Fund would address ten deeply-blushing young men, all of twenty-five years old, in the following terms:

"Gentlemen, I have to announce the results of a public examination, conducted on established principles; which show that you occupy the foremost places in your year, in respect to those qualities of talent, character, and bodily vigour which are proved, on the whole, to do most honour and best service to our race. An examination has also been conducted on established principles among all the young ladies of this country who are now of the age of twenty-one, and I need hardly remind you, that this examination takes note of

66 | 88© Copyright 2006 C. George Boeree

Page 211: Boeree, C. George - The History of Psychology (I-IV)

C. George Boeree: History of Psychology Part Three: The 1800's

grace, beauty, health, good temper, accomplished housewifery, and disengaged affections, in addition to noble qualities of heart and brain. By a careful investigation of the marks you have severally obtained, and a comparison of them, always on established principles, with those obtained by the most distinguished among the young ladies, we have been enabled to select ten of their names with especial reference to your individual qualities. It appears that marriages between you and these ten ladies, according to the list I hold in my hand, would offer the probability of unusual happiness to yourselves, and, what is of paramount interest to the State, would probably result in an extraordinarily talented issue. Under these circumstances, if any or all of these marriages should be agreed upon, the sovereign herself will give away the brides, at a high and solemn festival, six months hence, in Westminster abbey. We, on our part, are prepared, in each case, to assign 5,000£ as a wedding-present, and to defray the cost of maintaining and educating your children, out of the ample funds entrusted to our disposal by the State."

If a twentieth part of the cost and pains were spent in measures for the improvement of the human race that is spent on the improvement of the breed of horses and cattle, what a galaxy of genius might we not create! We might introduce prophets and high priests of our civilization into a world as surely as we can propagate idiots by mating crétins. Men and women of the present day are, to those we might hope to bring into existence, what the pariah dogs of the streets of an Eastern town are to our own highly bred varieties.

The feeble nations of the world are necessarily giving way before the nobler varieties of mankind; and even the best of these, so far as we know them, seem unequal to their work. The average culture of mankind is become so much higher than it was, and the branches of knowledge and history so various and extended, that few are capable even of comprehending the exigencies of our modern civilization; much less fulfilling them. We are living in a sort of intellectual anarchy, for want of master minds. The general intellectual capacity of our leaders requires to be raised, and also to be differentiated. We want abler commanders, statesmen, thinkers, inventors, and artists. The natural qualifications of our race are no greater than they used to be in semi-barbarous times, though the conditions amid which we are born are vastly more complex than of old. The foremost minds of the present day seem to stagger and halt under an intellectual load too heavy for their powers.

[On Americans]

Let us consider an instance in which different social influences have modified the inborn dispositions of a nation. The North American people has been bred from the most restless and combative class of Europe. Whenever, during the last ten or twelve generations, a political or religious party has suffered defeat, its prominent members, whether they were the best, or only the noisiest, have been apt to emigrate to America, as a refuge from persecution. Men fled to America for conscience' sake, and for that of unappreciated patriotism. Every scheming knave, and every brutal ruffian, who feared the arm of the law, also turned his eyes in the same direction. Peasants and artisans, whose spirit rebelled against the tyranny of society and the monotony of their daily life, and men of a higher position, who chafed under conventional restraints, all yearned towards America. Thus the dispositions of the parents of the American people have been exceedingly varied, and usually extreme, either for good or for evil. But in one respect they almost universally agreed. Every head of an emigrant family brought with him a restless character, and a spirit apt to rebel. If we estimate the moral nature of Americans from their present social state, we shall find it to be just what we might have expected from such a parentage. They are enterprising, defiant, and touchy; impatient of authority; furious politicians; very tolerant of fraud and violence; possessing much high and generous spirit, and some true religious feeling, but strongly addicted to cant.

[As is mildly suggested by the preceding passage, Galton's discussions of the character of other "races" would no doubt offend the modern sensibilities!]

* Originally published in Macmillan's Magazine, 12, 157-166, 318-327. Classics in the History of Psychology, an internet resource developed by Christopher D. Green, York University, Toronto, Ontario.

The History of Statistics

1654 – Pascal – mathematics of probability, in correspondence with Fermat

1662 – William Petty and John Graunt – first demographic studies

1713 – Jakob Bernoulli – Ars Conjectandi

1733 – DeMoivre – Approximatio; law of error (similar to standard deviation)

1763 – Rev. Bayes – An essay towards solving a problem in the Doctrine of Chances, foundation for "Bayesian statistics"

1805 – A-M Legendre – least squares method

1809 – C. F. Gauss – Theoria Motus Corporum Coelestium

1812 – P. S. Laplace – Théorie analytique des probabilités

1834 – Statistical Society of London established

1853 – Adolphe Quetelet – organized first international statistics conference; applied statistics to biology; described the bell-shaped curve

1877 – F. Galton – regression to the mean

1888 – F. Galton – correlation

1889 – F. Galton – Natural Inheritance

1900 – Karl Pearson – chi square; applied correlation to natural selection

1904 – Spearman – rank (non-parametric) correlation coefficient

1908 – "Student" (W. S. Gossett) – The probable error of the mean; the t-test

1919 – R. A. Fisher – ANOVA; evolutionary biology

1930's – Jerzy Neyman and Egon Pearson (son of Karl Pearson) – type II errors, power of a test, confidence intervals

Wilhelm Wundt and William James

Wilhelm Wundt and William James are usually thought of as the fathers of psychology, as well as the founders of psychology’s first two great "schools." Although they were very different men, there are some parallels: Their lives overlap, for example, with Wilhelm Wundt born in 1832 and dying in 1920, while William James was born ten years later and died ten years earlier. Both have claims to having established the first psychology lab in 1875. And neither named his school. As you will see, there are other commonalities as well, personal and philosophical.

I believe we haven't seen thinkers of their stature in psychology since.*

Wilhelm Wundt

Wilhelm Wundt was born in the village of Neckarau in Baden, Germany on August 16, 1832. The son of a Lutheran pastor, he was a solitary and studious boy. He roomed with and was tutored by his father's assistant, the vicar of the church. He was sent off to boarding school at 13, and to the university at 19.

He studied medicine at Tübingen, Heidelberg, and Berlin, although interested more in the scientific aspect than in a medical career. In 1857, he was appointed dozent (instructor) at Heidelberg, where he lectured on physiology. From 1858 to 1864, he also served as an assistant to the famous physiologist Helmholtz, and studied the neurological and chemical stimulation of muscles.

In 1864, he became an assistant professor at Heidelberg. Three years later, he started a course he called physiological psychology, which focused on the border between physiology and psychology, i.e. the senses and reactions – an interest inspired by the work of Weber and Fechner. His lecture notes would eventually become his major work, the Principles of Physiological Psychology (Grundzüge der physiologischen Psychologie), which would be published in 1873 and 1874.

Like Fechner and many others at the time, Wundt accepted the Spinozan idea of psychophysical parallelism: Every physical event has a mental counterpart, and every mental event has a physical counterpart. And he believed, like Fechner, that the availability of measurable stimuli (and reactions) could make psychological events open to something like experimental methodology in a way earlier philosophers such as Kant thought impossible.

The method that Wundt developed is a sort of experimental introspection: The researcher was to carefully observe some simple event – one that could be measured as to quality, intensity, or duration – and record his responses to variations of those events. (Note that in German philosophy at that time, sensations were considered psychological events, and therefore "internal" to the mind, even though the sensation is of something that is "outside" the mind. Hence what we might call observation was called by Wundt introspection!)

*Sources: Blumenthal, Arthur L. (2001). A Wundt Primer: The Operating Characteristics of Consciousness. Chapter Four in Reiber, Robert W. and Robinson, David K. Wilhelm Wundt in History: The Making of a Scientific Psychology. Kluwer Academic Publishing.

William James (1890). The Principles of Psychology. As presented in Classics in the History of Psychology, an internet resource developed by Christopher D. Green of York University, Toronto, Ontario. Available at http://psychclassics.yorku.ca/James/Principles/prin4.htm

Calkins, Mary W. Autobiography of Mary Whiton Calkins, in Murchison, Carl. (Ed.) (1930). History of Psychology in Autobiography (Vol. 1, pp. 31-61). Worcester, MA: Clark University Press. [quoting James, Principles of Psychology, Vol. I, pp. 225 ff.]

To continue his story, Wundt went on to become chair of "inductive philosophy" at Zürich in 1874, and then professor of philosophy at Leipzig in 1875. It was there that he would stay and work for the next 45 years!

In 1875, a room was set aside for Wundt for demonstrations in what we now call sensation and perception. This is the same year that William James would set up a similar lab at Harvard. We can celebrate that year as the founding of experimental psychology!

In 1879, Wundt assisted his first graduate student in true psychological research – another milestone. In 1881, he started the journal Philosophische Studien. In 1883, he began the first course to be titled experimental psychology. And in 1894, his efforts were rewarded with the official establishment of an "Institute for Experimental Psychology" at Leipzig – the first such in the world.

Wundt was known to everyone as a quiet, hard-working, and very methodical researcher, as well as a very good lecturer. The latter comment is by the standards of the day, which were considerably different from today’s: He would go on in a low voice for a couple of hours at a time, without notes or audio-visual aids and without pausing for questions. His students loved him, but we would no doubt criticize him for not being sufficiently entertaining!

It is curious to note that during this same busy time period, Wundt also published four books in philosophy! Keep in mind that, at this time, psychology was not considered something separate from philosophy. In fact, Wundt rejected the idea when someone suggested it to him!

The studies conducted by Wundt and his now numerous students were mostly on sensation and perception, and of those, most concerned vision. In addition, there were studies on reaction time, attention, feelings, and associations. In all, he supervised 186 doctoral dissertations, most in psychology.

Among his better known students were Oswald Külpe and Hugo Münsterberg (whom James invited to teach at Harvard), the Russian behaviorists Bekhterev and Pavlov, as well as American students such as Hall ("father" of developmental psychology in America), James McKeen Cattell, Lightner Witmer (founder of the first psychological clinic in the US, at U of Penn), and Wundt’s main interpreter to the English speaking world, E. B. Titchener. Titchener is particularly responsible for interpreting Wundt badly!

Later in his career, Wundt became interested in social or cultural psychology. Contrary to what many believe, Wundt did not think that the experimental study of sensations was the be all and end all of psychology! In fact, he felt that that was only the surface, and additionally that most of psychology was not as amenable to experimental methods.

Instead, he felt that we had to approach cultural psychology through the products it produced – mythology, for example, cultural practices and rituals, literature and art.... He wrote a ten volume Völkerpsychologie, published between 1900 and 1920, which included the idea of stages of cultural development, from the primitive, to the totemic, through the age of heroes and gods, to the age of modern man.

In 1920, he wrote Erlebtes und Erkanntes, his autobiography. A short time later, on August 31, 1920, he died.

William James

William James was born in New York City on January 11, 1842. His father was a rich man who spent his time entertaining the intellectuals of the time and discussing the religious mysticism of Swedenborg. This wonderful atmosphere for a bright young boy was thanks to his grandfather, an Irish immigrant with a knack for real estate investment! William was soon joined by a younger brother, Henry, who would grow up to be one of America’s premier novelists. All the James children were sent to European boarding schools and traveled through all the great capitals.

At 19, after a stint as an art student, James enrolled at Harvard in chemistry, which he soon changed to medicine. He was not really interested in a career in medicine, but wanted to study the science that went with it.

In 1865, he took advantage of a marvelous opportunity to travel the Amazon River basin with the great biologist Louis Agassiz, to collect samples of new species. While there, he began to suffer from a variety of health problems. In 1867, he went to study physiology in Germany, under Helmholtz and others. He befriended several notable early German psychologists, including Carl Stumpf. On the other hand, he had little respect for Herbert Spencer, Wilhelm Wundt, G. E. Müller, and others.

In Germany, he began to suffer from serious depression, accompanied by thoughts of suicide. In addition, he had serious back pain, insomnia, and dyspepsia. In 1869, he came back to the US to finish up his MD degree, but continued to be plagued by depression. He had been reading a book by a French philosopher named Renouvier, which convinced him of the power of free will. He decided to apply this idea to his own problems, and seemed to improve.

(A personal aside: I also suffer from depression. Unlike James, however, I began to get a grip on my depression when I finally realized that it was biological, and therefore precisely not in my control!)

From 1871 through 1872, James was a part of "the Metaphysical Club," a group of Harvard grads who met in Boston to discuss the issues of the day. Included in the club were the philosopher Charles Peirce, Oliver Wendell Holmes, and Chauncey Wright. It was Wright who introduced the idea of combining Alexander Bain's concept of beliefs as the disposition to behave, with Darwin's concept of survival of the fittest: Ideas had to compete with each other, and the best would last. This is similar to a more recent idea called memes.

It was Peirce, on the other hand, who took Kant's idea that we can never really know the truth – that all our beliefs are maybes – and turned it into the basis for pragmatism. This is very similar to Hans Vaihinger's (1852-1933) philosophy of "as if" that so influenced Alfred Adler and George Kelly.

In 1872, James was appointed an instructor of physiology at Harvard. In 1875, he taught his first course in psychology, or "physiological psychology," à la Wundt, and established a demonstration laboratory – the same year Wundt established his at Leipzig. And in 1876, he became an assistant professor of physiology.

In 1878, he married Alice Gibbons, a Boston school teacher. She took particularly good care of him, and his depression lessened significantly. Despite his tender nature, he and Alice managed to raise five children.

In that same year, he signed on with the publisher Holt to write a psychology textbook. It was supposed to take two years – it took him 12.

In 1880, his title was changed to assistant professor of philosophy, which is where, in those days, psychology actually belonged. In 1885, he became a full professor.

Despite his battles with depression, he was well liked by his students and known for his great sense of humor. Even his textbook would have a certain lightness that we rarely find in textbooks. He seemed to enjoy teaching. On the other hand, he disliked research, did almost none of it, and said that labs were basically a waste of resources!

In 1889, his title changed again – to professor of psychology! The next year, his book was finally published – two volumes, to be exact, titled The Principles of Psychology. In 1892, he put out a shorter version subtitled The Briefer Course, which students would refer to for the next 50 years as "the Jimmy." Both are masterpieces of prose and were extremely popular among students of psychology and laypersons alike.

Despite his dislike of research, he did raise the money for a new and expanded lab at Harvard, but promptly arranged to hire one of Wundt’s students, Hugo Münsterberg, to be its director. He did not supervise many graduate students, but several were quite successful in their own right, including James Angell, Edward Thorndike, and Mary Calkins.

[Mary Calkins (1863-1930) was the first woman to complete the requirements for a PhD in psychology at Harvard. Unfortunately, she was denied the degree because (get ready...) she was a woman. She later became the first woman president of the APA. ]

James had always shared his father’s interest in mysticism, even in psychic phenomena. This has dampened his reputation among hard-core scientists in the psychological community, but it only endeared him more to the public. In 1897, he published The Will to Believe, and in 1902, Varieties of Religious Experience.

But James was never completely comfortable with being a psychologist, and preferred to think of himself as a philosopher. He is, in fact, considered America’s greatest philosopher, in addition to being the "father" of American psychology!

He was profoundly influenced by an earlier American philosopher, Charles Sanders Peirce, who founded the philosophy of Pragmatism. Pragmatism says that ideas can never be completely proven true or false. Rather, we should be looking to how useful an idea is – how practical, how productive. James called it the "cash value" of an idea! James popularized Pragmatism in books like Pragmatism in 1907 and The Meaning of Truth in 1909. In 1909, he also wrote A Pluralistic Universe, which was part Pragmatism and part an expression of his own beliefs in something not unlike Spinoza’s pantheism.

He had retired from teaching in 1907 because his heart was not what it used to be, not since a mild attack in 1898 while climbing in upstate New York. He did meet Freud when Freud came to visit Boston in 1909, and was very much impressed. The next year, he went to Europe for his health and to visit his brother Henry, but soon returned to his home in New Hampshire. Two days later, on August 26, 1910, he died in his wife Alice’s arms.

Several of his works were published posthumously, including Some Problems in Philosophy in 1911 and the magnificent Essays in Radical Empiricism in 1912. James' most famous students included John Dewey, the philosopher often considered the father of modern American education, and Edward Thorndike, whose work with cats opened the door to the Behaviorists.

Structuralism or Voluntarism

Wundt is undergoing a resurgence in popularity. Over 100 years after his work, we have finally caught up with him. Actually, he was massively misrepresented by poorly educated American students in Germany, and especially by a rather ego-driven Englishman named Titchener. Wundt recognized that Titchener was misrepresenting him, and tried to make people aware of the problem. But Boring – the premier American historian of psychology for many decades – knew Wundt only through Titchener.

One misunderstanding revolves around the title of one major work: Physiological Psychology. But physiological psychology originally meant experimental psychology – using the methods of physiology – although not the experimental psychology of the behaviorists in the twentieth century.

Wundt and his students used an experimental version of introspection – the careful observation of one’s perceptions – and outlined some pretty specific details to the method:

1. The observer must know when the experience begins and ends.

2. The observer must maintain "strained attention."

3. The phenomenon must bear repetition.

4. And the phenomenon must be capable of variation – i.e. experimentation.

Regarding sensations, for example, it was determined that there are seven "qualities" of sensations: The visual, auditory, olfactory, gustatory, cutaneous, kinesthetic, and organic. Several of these have additional aspects. Vision, for example, has hue, saturation, and value. And qualities could vary in intensity, duration, vividness, and (for the visual and cutaneous senses) extension.

Wundt's labs were enormously productive places, describing things like selective attention, short-term memory, etc. – even including the famous limitations on short-term memory to 7 or so "pieces" of information that would not be noticed again until the 1970's.

Consciousness

One of the things that would make Wundt's work so foreign to American psychologists was what he referred to as the principle of actuality: He said that consciousness is, in fact, a reality, and that it is the subject matter of psychology. This is, of course, true – although we managed to overlook it for a good 80 years or so when behaviorism ruled the academic world in the US, Britain, and Russia.

Mental processes are an activity of the brain, and not material. Wundt accepted Spinoza's metaphysics of parallelism and spent a great deal of effort refuting reductionism. He believed that consciousness and its activities simply did not fit the paradigms of physical science – even though psychology emerges from biology, chemistry, and physics. With that emergence, consciousness has gained a certain capacity for creative synthesis – another of Wundt's key concepts.

Although consciousness operates "in" and "through" the physical brain, its activities cannot be described in terms of chemistry or physics. The color blue, the sound of an E minor chord, the taste of smoked salmon, the meaning of a sentence.... are all eminently psychological or subjective events, with no simple physical explanations. When does that wavelength, retinal activity, neural firing, and so forth become "blue?"

Wundt also prefigures the Gestalt psychologists in rejecting the associationism of Locke and Hume: Psychological structures are more than just the sum of their parts!

He and his students concluded that consciousness is composed of two "stages:" First, there is a large capacity working memory called the Blickfeld. Then there is a narrower consciousness called Apperception, which we might translate as selective attention, which is under voluntary control and moves about within the Blickfeld.

This selective attention idea became very influential. It led, among other things, to Kraepelin's theory of schizophrenia as a breakdown of attention processes.

Psycholinguistics

Another aspect of Wundtian psychology was its psycholinguistics, which actually takes up the first two books of Völkerpsychologie. Wundt suggested that the fundamental unit of language is the sentence – not the word or the sound. He identified the sentence not just with a sequence of words and sounds, but as a special mental state. Sounds, words, the rules of grammar, etc., all have their meaning only in relation to that underlying mental sentence.

Wundt actually invented the tree diagram of syntax we are all familiar with in linguistics texts! Language starts with S (the sentence) at the top, and selective attention separates the subject (the focus or figure) from the predicate (the ground), and so on, in contrast to the popular bottom-up, associationistic conception the behaviorists proposed. Wundt's ideas are now the standard – yet no one remembers they were his in the first place!
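Here is a minimal sketch of that top-down idea; the tiny tree class and the example sentence are illustrative, not Wundt's own notation:

    # A minimal top-down syntax tree: analysis starts from the whole
    # sentence S, and attention splits it into subject (figure) and
    # predicate (ground), rather than assembling it bottom-up from sounds.
    class Node:
        """One node in a simple top-down parse tree."""
        def __init__(self, label, children=()):
            self.label = label
            self.children = list(children)

        def show(self, depth=0):
            print("  " * depth + self.label)
            for child in self.children:
                child.show(depth + 1)

    sentence = Node("S", [
        Node("Subject (figure)", [Node("the hound")]),
        Node("Predicate (ground)", [Node("follows the tracks")]),
    ])
    sentence.show()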

Looking at the language of children, Wundt and his students proposed that language has its origins in emotional sounds and gestures – another theory that is returning to favor.

Emotions

According to Wundt, we are first of all emotional creatures. All of our mental activities involve emotion. And emotion precedes cognition! He was very much the romantic (in the philosophical sense!).

He used a variety of terms: Feelings were what he called the basic, short-lived experiences; Moods were the more long-lived versions. Emotions proper were the more complex experiences. And motivations were the more "pressurized" versions of emotion that lead to behavior.

Wundt disagreed with William James and the James-Lange theory of emotions. James believed that we first respond to a situation, and then we experience the emotion. Wundt pointed out that introspection clearly shows that the emotion comes first – then we have physiological and behavioral consequences.

He felt that we could not come up with some organized list of emotions: They blend into each other too much. But we could determine several quality dimensions with which to describe them, three in particular:

1. pleasure vs. displeasure
2. high vs. low arousal
3. strained (or controlled) attention vs. relaxed attention
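A feeling can then be pictured as a point along these three axes. In this toy sketch, the -1.0 to +1.0 scaling and the example values are illustrative assumptions, not Wundt's:

    # A feeling located along Wundt's three quality dimensions.
    from dataclasses import dataclass

    @dataclass
    class Feeling:
        pleasure: float  # displeasure (-1.0) ... pleasure (+1.0)
        arousal: float   # low (-1.0) ... high (+1.0)
        strain: float    # relaxed (-1.0) ... strained/controlled (+1.0)

    dread = Feeling(pleasure=-0.8, arousal=0.7, strain=0.9)
    contentment = Feeling(pleasure=0.9, arousal=-0.5, strain=-0.7)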

Volition

Wundt felt that volition – acts of will, "decision and choice" – was so significant to understanding psychology that he wound up calling his theory voluntaristic psychology.

Volition is really motivation, and volitional action is motivated behavior. It comes out of a creative synthesis of other emotional qualities. Students of psychology often learn about Wundt's reaction time experiments – he really saw these as studies of volition.

The work done in his labs on volition would influence the Belgian phenomenologist Albert Michotte, who in turn would influence Heider, Lewin, and Festinger – central figures in the new specialty called social psychology.

Volition and volitional acts can range from impulses and automatic, nearly reflexive acts to complex decisions and acts that require great effort. Many controlled actions become automatic over time, which then frees us up for more complicated volitional work. In fact, it was the development of logical thought that Wundt considered the very highest form of will that humans are capable of. He was quite optimistic about our potential in that regard!

Functionalism

Functionalism as a psychology developed out of Pragmatism as a philosophy: To find the meaning of an idea, you have to look at its consequences (see where it leads). So truth is what is useful, practical, pragmatic. This led James and his students towards an emphasis on cause and effect, prediction and control, and observation of environment and behavior, over the careful introspection of the Structuralists.

Pragmatism blended easily with Darwinism: To understand an idea, ask "what is it good for?" – i.e. what is its function in the organism, what is its purpose in an ecosystem, how does it add to a creature's chances of survival and reproduction?

Some aspects of Functionalism were clearly just "anti-structuralism," a reflection, perhaps, of James' impatience with details and poor grasp of the German language. In particular, he felt that the structuralists were ignoring the whole and paying too much attention to the tidbits. The anti-structuralism of later functionalists was based more on Titchener's inaccurate interpretation of Wundt's work than on Wundt's work itself.

Emotion

An example of functionalist thinking can be found in James’ view of emotions (the James-Lange theory):

Our natural way of thinking about these standard emotions is that the mental perception of some fact excites the mental affection called the emotion, and that this latter state of mind gives rise to the bodily expression. My thesis on the contrary is that the bodily changes follow directly the PERCEPTION of the exciting fact, and that our feeling of the same changes as they occur IS the emotion. Common sense says, we lose our fortune, are sorry and weep; we meet a bear, are frightened and run; we are insulted by a rival, are angry and strike. The hypothesis here to be defended says that this order of sequence is incorrect, that the one mental state is not immediately induced by the other, that the bodily manifestations must first be interposed between, and that the more rational statement is that we feel sorry because we cry, angry because we strike, afraid because we tremble, and not that we cry, strike, or tremble, because we are sorry, angry, or fearful, as the case may be. Without the bodily states following on the perception, the latter would be purely cognitive in form, pale, colourless, destitute of emotional warmth. We might then see the bear, and judge it best to run, receive the insult and deem it right to strike, but we could not actually feel afraid or angry.

... To begin with, readers of the Journal do not need to be reminded that the nervous system of every living thing is but a bundle of predispositions to react in particular ways upon the contact of particular features of the environment. As surely as the hermit-crab's abdomen presupposes the existence of empty whelk-shells somewhere to be found, so surely do the hound's olfactories imply the existence, on the one hand, of deer's or foxes' feet, and on the other, the tendency to follow up their tracks. The neural machinery is but a hyphen between determinate arrangements of matter outside the body and determinate impulses to inhibition or discharge within its organs. When the hen sees a white oval object on the ground, she cannot leave it; she must keep upon it and return to it, until at last its transformation into a little mass of moving chirping down elicits from her machinery an entirely new set of performances. The love of man for woman, or of the human mother for her babe, our wrath at snakes and our fear of precipices, may all be described similarly, as instances of the way in which peculiarly conformed pieces of the world's furniture will fatally call forth most particular mental and bodily reactions, in advance of, and often in direct opposition to, the verdict of our deliberate reason concerning them. The labours of Darwin and his successors are only just beginning to reveal the universal parasitism of each creature upon other special things, and the way in which each creature brings the signature of its special relations stamped on its nervous system with it upon the scene.

... Whistling to keep up courage is no mere figure of speech. On the other hand, sit all day in a moping posture, sigh, and reply to everything with a dismal voice, and your melancholy lingers. There is no more valuable precept in moral education than this, as all who have experience know: if we wish to conquer undesirable emotional tendencies in ourselves, we must assiduously, and in the first instance cold-bloodedly, go through the outward motions of those contrary dispositions we prefer to cultivate. The reward of persistency will infallibly come, in the fading out of the sullenness or depression, and the advent of real cheerfulness and kindliness in their stead. Smooth the brow, brighten the eye, contract the dorsal rather than the ventral aspect of the frame, and speak in a major key, pass the genial compliment, and your heart must be frigid indeed if it do not gradually thaw!

In the first paragraph, note the holistic idea that emotion is nothing without the body. In the second, he points out that emotion has evolutionary purpose. And in the third, James emphasizes a practical application of his theory!

Habit

From a historical perspective, it was James' emphasis on habit that ignited the interest of his followers and paved the road for the development of American behaviorism. Again, here is James in his own words:

When we look at living creatures from an outward point of view, one of the first things that strike us is that they are bundles of habits. In wild animals, the usual round of daily behavior seems a necessity implanted at birth; in animals domesticated, and especially in man, it seems, to a great extent, to be the result of education. The habits to which there is an innate tendency are called instincts; some of those due to education would by most persons be called acts of reason. It thus appears that habit covers a very large part of life, and that one engaged in studying the objective manifestations of mind is bound at the very outset to define clearly just what its limits are.

... So nothing is easier than to imagine how, when a current once has traversed a path, it should traverse it more readily still a second time. But what made it ever traverse it the first time?[5] In answering this question we can only fall back on our general conception of a nervous system as a mass of matter whose parts, constantly kept in states of different tension, are as constantly tending to equalize their states. The equalization between any two points occurs through whatever path may at the moment be most pervious. But, as a given point of the system may belong, actually or potentially, to many different paths, and, as the play of nutrition is subject to accidental changes, blocks may from time to time occur, and make currents shoot through unwonted lines. Such an unwonted line would be a new-created path, which if traversed repeatedly, would become the beginning of a new reflex arc. All this is vague to the last degree, and amounts to little more than saying that a new path may be formed by the sort of chances that in nervous material are likely to occur. But, vague as it is, it is really the last word of our wisdom in the matter.[6]

... Habit is thus the enormous fly-wheel of society, its most precious conservative agent. It alone is what keeps us all within the bounds of ordinance, and saves the children of fortune from the envious uprisings of the poor. It alone prevents the hardest and most repulsive walks of life from being deserted by those brought up to tread therein. It keeps the fisherman and the deck-hand at sea through the winter; it holds the miner in his darkness, and nails the countryman to his log-cabin and his lonely farm through all the months of snow; it protects us from invasion by the natives of the desert and the frozen zone. It dooms us all to fight out the battle of life upon the lines of our nurture or our early choice, and to make the best of a pursuit that disagrees, because there is no other for which we are fitted, and it is too late to begin again. It keeps different social strata from mixing. Already at the age of twenty-five you see the professional mannerism settling down on the young commercial traveller, on the young doctor, on the young minister, on the young counsellor-at-law. You see the little lines of cleavage running through the character, the tricks of thought, the prejudices, the ways of the 'shop,' in a word, from which the man can by-and-by no more escape than his coat-sleeve can suddenly fall into a new set of folds. On the whole, it is best he should not escape. It is well for the world that in most of us, by the age of thirty, the character has set like plaster, and will never soften again.

Commonalities

In reality, structuralism and functionalism were more like each other – and different from modern mainstream psychology – in that both were free-willist and anti-materialistic, and both considered the proper study of psychology to be the mind:

Wundt:

"Mind," "intellect," "reason," "understanding," etc., are concepts... that existed before the advent of any scientific psychology. The fact that the naive consciousness always and everywhere points to internal experience as a special source of knowledge, may, therefore, be accepted for the moment as sufficient testimony to the right of psychology as a science... "Mind," will accordinly be the subject to which we attribute all the separate facts of internal observation as predicates. The subject itself is determined wholely and exclusively by its predicates.

James:

There is only one primal stuff or material in the world, a stuff of which everything is composed, and... we call that stuff "pure experience."

Both Wundt and James were empiricists, and considered their psychologies experimental. Neither liked the rationalistic systems prevalent in the philosophy of their day – such as Hegel's grand system. However, neither was anything like what most people understand as an experimentalist today, because neither of them was a materialist or a reductionist.

Wundt on materialism:

If we could see every wheel in the physical mechanism whose working the mental processes are accompanying, we should still find no more than a chain of movements showing no trace whatsoever of their significance for mind... (All) that is valuable in our mental life still falls to the psychical side.

James’ friend and teacher Peirce on materialism:

The materialistic doctrine seems to me quite as repugnant to scientific logic as to common sense; since it requires us to suppose that a certain kind of mechanism will feel, which would be a hypothesis absolutely irreducible to reason – an ultimate, inexplicable regularity; while the only possible justification of any theory is that it should make things clear and reasonable.


And Mary Calkins*, one of James' students, on James' view of introspection:

From introspection he derives the materials for psychology. "Introspective observation," he expressly asserts, "is what we have to rely on first and foremost and always...."

As for the historically influential differences between Wundt and James: While Wundt focused on the introspection of consciousness, James focused on behavior in its environment. This focus would lay the groundwork for a behaviorism that James would scarcely recognize.

It would be nearly a century before research psychology would come back from a long sojourn in materialistic, reductionistic, quantitative, physiological, behavioristic methods to something Wundt and James would recognize as psychology!

* Mary Whiton Calkins was one of the first female students of psychology, as well as the founder of the psychology program at Wellesley. She studied under James and Munsterberg at Harvard, but was not given the PhD she richly deserved – because she was a woman! After she died, students appealed to Harvard to grant her the PhD posthumously. They turned her down again. Shame on Harvard!


Selection from: The Stream of Consciousness (1892) by William James *

The first and foremost concrete fact which every one will affirm to belong to his inner experience is the fact that consciousness of some sort goes on. 'States of mind' succeed each other in him. If we could say in English 'it thinks,' as we say 'it rains' or 'it blows,' we should be stating the fact most simply and with the minimum of assumption. As we cannot, we must simply say that thought goes on.

....How does it go on? We notice immediately four important characters in the process, of which it shall be the duty of the present chapter to treat in a general way:

1) Every 'state' tends to be part of a personal consciousness.
2) Within each personal consciousness states are always changing.
3) Each personal consciousness is sensibly continuous.
4) It is interested in some parts of its object to the exclusion of others, and welcomes or rejects – chooses from among them, in a word – all the while.

In considering these four points successively, we shall have to plunge in medias res as regards our nomenclature and use psychological terms which can only be adequately defined in later chapters of the book. But every one knows what the terms mean in a rough way; and it is only in a rough way that we are now to take them. This chapter is like a painter's first charcoal sketch upon his canvas, in which no niceties appear.

[ Personal Nature of Consciousness ]

When I say every 'state' or 'thought' is part of a personal consciousness, 'personal consciousness' is one of the terms in question. Its meaning we know so long as no one asks us to define it, but to give an accurate account of it is the most difficult of philosophic tasks. This task we must confront in the next chapter; here a preliminary word will suffice.

In this room – this lecture-room, say – there are a multitude of thoughts, yours and mine, some of which cohere mutually, and some not. They are as little each-for-itself and reciprocally independent as they are all-belonging-together. They are neither: no one of them is separate, but each belongs with certain others and with none beside. My thought belongs with my other thoughts, and your thought with your other thoughts. Whether anywhere in the room there be a mere thought, which is nobody's thought, we have no means of ascertaining, for we have no experience of its like. The only states of consciousness that we naturally deal with are found in personal consciousness, minds, selves, concrete particular I's and you's.

Each of these minds keeps its own thoughts to itself. There is no giving or bartering between them. No thought even comes into direct sight of a thought in another personal consciousness than its own. Absolute insulation, irreducible pluralism, is the law. It seems as if the elementary psychic fact were not thought or this thought or that thought, but my thought, every thought being owned. Neither contemporaneity, nor proximity in space, nor similarity of quality and content are able to fuse thoughts together which are sundered by this barrier of belonging to different personal minds. The breaches between such thoughts are the most absolute breaches in nature. Every one will recognize this to be true, so long as the existence of something corresponding to the term 'personal mind' is all that is insisted on, without any particular view of its nature being implied. On these terms the personal self rather than the thought might be treated as the immediate datum in psychology. The universal conscious fact is not 'feelings and thoughts exist,' but 'I think' and 'I feel.' No psychology, at any rate, can question the existence of personal selves. Thoughts connected as we feel them to be connected are what we mean by personal selves. The worst a psychology can do is so to interpret the nature of these selves as to rob them of their worth.

* First published in Psychology, Chapter XI. (Cleveland & New York, World). Available on the internet at http://www.yorku.ca/dept/psych/classics/James/jimmy11.htm – Classics in the History of Psychology, an internet resource developed by Christopher D. Green, York University, Toronto, Ontario.

[ Consciousness in Constant Change ]

Consciousness is in constant change. I do not mean by this to say that no one state of mind has any duration – even if true, that would be hard to establish. What I wish to lay stress on is this, that no state once gone can recur and be identical with what it was before. Now we are seeing, now hearing; now reasoning, now willing; now recollecting, now expecting; now loving, now hating; and in a hundred other ways we know our minds to be alternately engaged....

....The grass out of the window now looks to me of the same green in the sun as in the shade, and yet a painter would have to paint one part of it dark brown, another part bright yellow, to give its real sensational effect. We take no heed, as a rule, of the different way in which the same things look and sound and smell at different distances and under different circumstances. The sameness of the things is what we are concerned to ascertain; and any sensations that assure us of that will probably be considered in a rough way to be the same with each other....

Such a difference as this could never have been sensibly learned; it had to be inferred from a series of indirect considerations. These make us believe that our sensibility is altering all the time, so that the same object cannot easily give us the same sensation over again. We feel things differently according as we are sleepy or awake, hungry or full, fresh or tired; differently at night and in the morning, differently in summer and in winter; and above all, differently in childhood, manhood, and old age. And yet we never doubt that our feelings reveal the same world, with the same sensible qualities and the same sensible things occupying it. The difference of the sensibility is shown best by the difference of our emotion about the things from one age to another, or when we are in different organic moods. What was bright and exciting becomes weary, flat, and unprofitable. The bird's song is tedious, the breeze is mournful, the sky is sad.

....From one year to another we see things in new lights. What was unreal has grown real, and what was exciting is insipid. The friends we used to care the world for are shrunken to shadows; the women once so divine, the stars, the woods, and the waters, how now so dull and common! – the young girls that brought an aura of infinity, at present hardly distinguishable existences; the pictures so empty; and as for the books, what was there to find so mysteriously significant in Goethe, or in John Mill so full of weight? Instead of all this, more zestful than ever is the work, the work; and fuller and deeper the import of common duties and of common goods.

[ The Continuity of Thought ]

....No doubt it is often convenient to formulate the mental facts in an atomistic sort of way, and to treat the higher states of consciousness as if they were all built out of unchanging simple ideas which 'pass and turn again.' It is convenient often to treat curves as if they were composed of small straight lines, and electricity and nerve-force as if they were fluids. But in the one case as in the other we must never forget that we are talking symbolically, and that there is nothing in nature to answer to our words. A permanently existing 'Idea' which makes its appearance before the footlights of consciousness at periodical intervals is as mythological an entity as the Jack of Spades.

Within each personal consciousness, thought is sensibly continuous. I can only define 'continuous' as that which is without breach, crack, or division. The only breaches that can well be conceived to occur within the limits of a single mind would either be interruptions, time-gaps during which the consciousness went out; or they would be breaks in the content of the thought, so abrupt that what followed had no connection whatever with what went before. The proposition that consciousness feels continuous, means two things:

a. That even where there is a time-gap the consciousness after it feels as if it belonged together with the consciousness before it, as another part of the same self;


b. That the changes from one moment to another in the quality of the consciousness are never absolutely abrupt.

The case of the time-gaps, as the simplest, shall be taken first.

....When Paul and Peter wake up in the same bed, and recognize that they have been asleep, each one of them mentally reaches back and makes connection with but one of the two streams of thought which were broken by the sleeping hours. As the current of an electrode buried in the ground unerringly finds its way to its own similarly buried mate, across no matter how much intervening earth; so Peter's present instantly finds out Peter's past, and never by mistake knits itself on to that of Paul. Paul's thought in turn is as little liable to go astray. The past thought of Peter is appropriated by the present Peter alone. He may have a knowledge, and a correct one too, of what Paul's last drowsy states of mind were as he sank into sleep, but it is an entirely different sort of knowledge from that which he has of his own last states. He remembers his own states, whilst he only conceives Paul's. Remembrance is like direct feeling; its object is suffused with a warmth and intimacy to which no object of mere conception ever attains. This quality of warmth and intimacy and immediacy is what Peter's present thought also possesses for itself. So sure as this present is me, is mine, it says, so sure is anything else that comes with the same warmth and intimacy and immediacy, me and mine. What the qualities called warmth and intimacy may in themselves be will have to be matter for future consideration. But whatever past states appear with those qualities must be admitted to receive the greeting of the present mental state, to be owned by it, and accepted as belonging together with it in a common self. This community of self is what the time-gap cannot break in twain, and is why a present thought, although not ignorant of the time-gap, can still regard itself as continuous with certain chosen portions of the past.

Consciousness, then, does not appear to itself chopped up in bits. Such words as 'chain' or 'train' do not describe it fitly as it presents itself in the first instance. It is nothing jointed; it flows. A 'river' or a 'stream' are the metaphors by which it is most naturally described. In talking of it hereafter, let us call it the stream of thought, of consciousness, or of subjective life....

[ Substantive and Transitive States of Mind ]

....When we take a general view of the wonderful stream of our consciousness, what strikes us first is the different pace of its parts. Like a bird's life, it seems to be an alternation of flights and perchings. The rhythm of language expresses this, where every thought is expressed in a sentence, and every sentence closed by a period. The resting-places are usually occupied by sensorial imaginations of some sort, whose peculiarity is that they can be held before the mind for an indefinite time, and contemplated without changing; the places of flight are filled with thoughts of relations, static or dynamic, that for the most part obtain between the matters contemplated in the periods of comparative rest.

Let us call the resting-places the 'substantive parts,' and the places of flight the 'transitive parts,' of the stream of thought. It then appears that our thinking tends at all times towards some other substantive part than the one from which it has just been dislodged. And we may say that the main use of the transitive parts is to lead us from one substantive conclusion to another.

Now it is very difficult, introspectively, to see the transitive parts for what they really are. If they are but flights to a conclusion, stopping them to look at them before the conclusion is reached is really annihilating them. Whilst if we wait till the conclusion be reached, it so exceeds them in vigor and stability that it quite eclipses and swallows them up in its glare. Let anyone try to cut a thought across in the middle and get a look at its section, and he will see how difficult the introspective observation of the transitive tracts is. The rush of the thought is so headlong that it almost always brings us up at the conclusion before we can arrest it. Or if our purpose is nimble enough and we do arrest it, it ceases forthwith to be itself. As a snowflake crystal caught in the warm hand is no longer a crystal but a drop, so, instead of catching the feeling of relation moving to its term, we find we have caught some substantive thing, usually the last word we were pronouncing, statically taken, and with its function, tendency, and particular meaning in the sentence quite evaporated. The attempt at introspective analysis in these cases is in fact like seizing a spinning top to catch its motion, or trying to turn up the gas quickly enough to see how the darkness looks....

We ought to say a feeling of and, a feeling of if, a feeling of but, and a feeling of by, quite as readily as we say a feeling of blue or a feeling of cold. Yet we do not: so inveterate has our habit become of recognizing the existence of the substantive parts alone, that language almost refuses to lend itself to any other use....

[ Fringes of Experience ]

The object before the mind always has a 'Fringe.' There are other unnamed modifications of consciousness just as important as the transitive states, and just as cognitive as they. Examples will show what I mean....

Suppose we try to recall a forgotten name. The state of our consciousness is peculiar. There is a gap therein; but no mere gap. It is a gap that is intensely active. A sort of wraith of the name is in it, beckoning us in a given direction, making us at moments tingle with the sense of our closeness, and then letting us sink back without the longed-for term. If wrong names are proposed to us, this singularly definite gap acts immediately so as to negate them. They do not fit into its mould. And the gap of one word does not feel like the gap of another, all empty of content as both might seem necessarily to be when described as gaps. When I vainly try to recall the name of Spalding, my consciousness is far removed from what it is when I vainly try to recall the name of Bowles. There are innumerable consciousnesses of want, no one of which taken in itself has a name, but all different from each other. Such feeling of want is toto cœlo other than a want of feeling: it is an intense feeling. The rhythm of a lost word may be there without a sound to clothe it; or the evanescent sense of something which is the initial vowel or consonant may mock us fitfully, without growing more distinct. Every one must know the tantalizing effect of the blank rhythm of some forgotten verse, restlessly dancing in one's mind, striving to be filled out with words.

....The traditional psychology talks like one who should say a river consists of nothing but pailsful, spoonsful, quartpotsful, barrelsful, and other moulded forms of water. Even were the pails and the pots all actually standing in the stream, still between them the free water would continue to flow. It is just this free water of consciousness that psychologists resolutely overlook. Every definite image in the mind is steeped and dyed in the free water that flows round it. With it goes the sense of its relations, near and remote, the dying echo of whence it came to us, the dawning sense of whither it is to lead. The significance, the value, of the image is all in this halo or penumbra that surrounds and escorts it, – or rather that is fused into one with it and has become bone of its bone and flesh of its flesh; leaving it, it is true, an image of the same thing it was before, but making it an image of that thing newly taken and freshly understood.

Let us call the consciousness of this halo of relations around the image by the name of 'psychic overtone' or 'fringe.'

[ Attention ]

....The last peculiarity to which attention is to be drawn in this first rough description of thought's stream is that – Consciousness is always interested more in one part of its object than in another, and welcomes and rejects, or chooses, all the while it thinks.

The phenomena of selective attention and of deliberative will are of course patent examples of this choosing activity. But few of us are aware how incessantly it is at work in operations not ordinarily called by these names. Accentuation and Emphasis are present in every perception we have. We find it quite impossible to disperse our attention impartially over a number of impressions. A monotonous succession of sonorous strokes is broken up into rhythms, now of one sort, now of another, by the different accent which we place on different strokes. The simplest of these rhythms is the double one, tick-tock, tick-tock, tick-tock. Dots dispersed on a surface are perceived in rows and groups. Lines separate into diverse figures. The ubiquity of the distinctions, this and that, here and there, now and then, in our minds is the result of our laying the same selective emphasis on parts of place and time.

But we do far more than emphasize things, and unite some, and keep others apart. We actually ignore most of the things before us. Let me briefly show how this goes on.

....what is called our 'experience' is almost entirely determined by our habits of attention. A thing may be present to a man a hundred times, but if he persistently fails to notice it, it cannot be said to enter into his experience. We are all seeing flies, moths, and beetles by the thousand, but to whom, save an entomologist, do they say anything distinct? On the other hand, a thing met only once in a lifetime may leave an indelible experience in the memory. Let four men make a tour in Europe. One will bring home only picturesque impressions – costumes and colors, parks and views and works of architecture, pictures and statues. To another all this will be non-existent; and distances and prices, populations and drainage-arrangements, door- and window-fastenings, and other useful statistics will take their place. A third will give a rich account of the theatres, restaurants, and public halls, and naught besides; whilst the fourth will perhaps have been so wrapped in his own subjective broodings as to be able to tell little more than a few names of places through which he passed. Each has selected, out of the same mass of presented objects, those which suited his private interest and has made his experience thereby....

If now we pass to the æsthetic department, our law is still more obvious. The artist notoriously selects his items, rejecting all tones, colors, shapes, which do not harmonize with each other and with the main purpose of his work. That unity, harmony, 'convergence of characters,' as M. Taine calls it, which gives to works of art their superiority over works of nature, is wholly due to elimination. Any natural subject will do, if the artist has wit enough to pounce upon some one feature of it as characteristic, and suppress all merely accidental items which do not harmonize with this.

Ascending still higher, we reach the plane of Ethics, where choice reigns notoriously supreme. An act has no ethical quality whatever unless it be chosen out of several all equally possible.... When he debates, Shall I commit this crime? choose that profession? accept that office, or marry this fortune? – his choice really lies between one of several equally possible future Characters.....The problem with the man is less what act he shall now resolve to do than what being he shall now choose to become.

[ Me and not-me ]

....One great splitting of the whole universe into two halves is made by each of us; and for each of us almost all of the interest attaches to one of the halves; but we all draw the line of division between them in a different place. When I say that we all call the two halves by the same names, and that those names are 'me' and 'not-me' respectively, it will at once be seen what I mean. The altogether unique kind of interest which each human mind feels in those parts of creation which it can call me or mine may be a moral riddle, but it is a fundamental psychological fact. No mind can take the same interest in his neighbor's me as in his own. The neighbor's me falls together with all the rest of things in one foreign mass against which his own me stands cut in startling relief. Even the trodden worm, as Lotze somewhere says, contrasts his own suffering self with the whole remaining universe, though he have no clear conception either of himself or of what the universe may be. He is for me a mere part of the world; for him it is I who am the mere part. Each of us dichotomizes the Kosmos in a different place.


Free Will

The concept of free will has undergone some hard times lately. The obvious success of science, and the materialistic, deterministic, reductionistic assumptions that usually accompany it, have made free will seem old-fashioned, associated more with scholastic theologians than modern men and women. But I find the concept impossible to ignore, much less dispose of.

Let’s begin by saying what free will is, and what it isn’t. Free will is not the same as freedom of action. Freedom of action refers to things that prevent a willed action from being realized. For example, being in prison means you are not free to paint the town red. Being in a straitjacket means you are not free to wave hello. Being paralyzed means not being able to move your limbs. These are not issues of free will. Free will means being free to try to escape (or not), to try to wave (or not), to try to move your limbs (or not).

Neither is free will the same as political or social freedom (better known as liberty). Just because you will be executed for taking the local dictator’s name in vain, doesn’t mean you aren’t free to try, or even free to actually do so. You’ll just wind up paying for the satisfaction.

On the other side of the argument, I need to point out that determinism is not the same thing as fatalism, destiny, or predestination. Determinism means that the way things are at one moment is the necessary result of the way things were the moment before. It means that every effect has its cause, and that nothing, not even the will, is exempt. It does not mean that the future is already established.

It might also be useful to define will. As I understand it, it is a matter of intent: The perceptual, cognitive, and emotional processes we engage in when confronted by a choice result in an intent to engage in certain actions or non-actions. I have before me a cheese danish and a poppy seed muffin. I look, I sniff, I consider past experiences, I feel good about both prospects, and then I decide. I intend to eat the cheese danish (or the muffin, or neither, or both...). Whether I am free to actually eat it, or whether I can expect severe punishment for doing so, is irrelevant. I have made up my mind!

Let’s run through some arguments for free will, followed by the determinist’s responses. Since the free willist is making a claim, and an exceptional one at that, the burden of proof is on him or her.

First, there is the experience argument: I experience something within myself that I understand as making choices, and those choices do not seem to be determined by anything other than myself.

The determinist will respond that you are simply not aware of the causes of your decisions, and have labeled that ignorance "free will." There were no doubt neurons firing and chemicals sailing across synapses and so forth, all very deterministically resulting in my choice of the danish.

The free willist might suggest that belief is a crucial part of free will. If you were to set me up with the danish and the muffin, knowing that I tend to choose danishes, you might very well say the end result was determined. But if I knew you were trying to prove your point, I would simply choose the muffin instead, or neither.

The determinist would simply say that this extra tidbit of knowledge – that I am trying to fool you – has replaced your usual causal factors. Instead, you are reacting, quite mechanically, to a threat to your beliefs.

Maybe so, says the free willist. But you must admit that I can be awfully random at times. I can suddenly jump out of my chair and scream "Tippecanoe and Tyler, too" at the top of my lungs. Let’s see you predict something like that!

The determinist would respond that indeterminism is far from free will. If that’s all there is to free will, then a roulette wheel is better at it than you are.

But I am unpredictable, says the free willist.

The determinist would point out that that is merely a practical problem, not a philosophical one. The fact that I cannot pinpoint the precise location and velocity, say, of all the particles in the universe, doesn’t mean that you aren’t determined by them. In fact, even if that were theoretically impossible (as suggested by the Heisenberg uncertainty principle), it only means I can’t predict, not that you have free will!

The free willist may point out that, without free will, morality has no meaning. All the best things about people – generosity, bravery, compassion – have no meaning. If we are as determined as falling bricks, then Adolf Hitler could no more be blamed for his evil actions than Mother Teresa could be praised for her good ones. What then of our world?

Simple, says the determinist. We will have to live without morality. Many people are already moral relativists, or even moral nihilists. Our societies can get along just fine with laws and judicial processes and prisons using nothing more than tradition, majority self-interest, reciprocity, and the rule of cover-your-ass. Maybe that’s all morality has ever been!

Another argument a free willist can make is that we have this unique ability to stop and think about a decision-making situation. We can exit the stream of cause-and-effect for a moment. We pause before the high-calorie meal to consider the advisability of diving in. Animals rarely do this: If a hungry lion has an antelope before her, she eats. And we can postpone the decision as long as we like. Even if the actual choice we make at some particular moment in time is determined, the length of time we wait for that moment to arrive is not.

Or is it? says the determinist. What caused you to wait exactly one minute before choosing? Or what caused you to stop your pausing and jump into things at just that moment? Besides, isn’t this pause just a matter of two forces of equal strength short-circuiting the normal processes?

Jean-Paul Sartre came up with an interesting free will argument. He said that we can ignore something real and we can pretend something unreal. For example, I could imagine that there is no danish before me – something I often need to do in the service of dieting. Or I can see the poppy seeds in the muffins as maggots. This imagination is a powerful thing! But the determinist would just say that imagination is just one more neurological mechanism, explainable by deterministic principles.

I must point out that, although the free willist has not exactly won any arguments so far, the determinist has put himself in a somewhat more defensive position. Some of that "burden of proof" is moving over to the determinist side. For example, he has claimed that imagination is something physical. That is a claim that we need not just accept: We can challenge him to demonstrate the validity of the claim.

Another possible foundation for free will is creativity. I can create a new option. I am not stuck with the cheese danish or the poppy seed muffin. I can throw them both and choose a bag of cheesy puffs. Or I can literally create a new concoction: Get out my mixing bowls and bake something no one has ever seen before, such as a poppy seed danish or a cheese muffin. Or I can get out my blender and make a muffin and danish slurpy.

Of course, the determinist, becoming rather tiresome by now, would just say that creativity is just a word we use to label unconscious neural events that surprise even us – an accident. If someone steps on your danish and muffin by accident, no one would think to call the wad on the bottom of his shoe a new creation!

(Of course, the determinist is claiming now that creativity is mechanical – something he could be challenged to defend.)

So, how about differentiating between causes and reasons? When I get myself a Big Mac, is it cause-and-effect determinism that led me there? Did the growling in my stomach force me into my car, the sight of the golden arches make me jerk my steering wheel in their direction? Or did I notice my appetite and conceive a plan: look through my repertoire of gastronomic delights, decide on a Big Mac, drive purposefully to the golden arches, and order what I want? Was I, in other words, "pushed from behind" by causes, or did I follow my reasons?

This is called teleology. Instead of reacting to stimuli, we project a future situation which we take as a goal. The connection between cause and effect is one of necessity. There is nothing necessary about purposes.


They can be accomplished – or not.

But the determinist would respond with the same argument he made with imagination and creativity: Your perceptions and cognitions and emotions, your past experiences, inevitably lead to your projecting that goal and working toward it. It only appears to you to be free of necessity. But note how quickly we give up our goals when other, more powerfully supported forces push in upon us.

One last try for free will: I suggest that, as we develop from babies into adults, we separate from the world. Our causal processes become increasingly independent of the causal processes outside of us, especially in the mental realm. A gap develops that allows us to be influenced by outside situations, but not determined by them. This gap is like a large river: The man on the opposite bank can wave and jump and yell all he wants – he cannot directly affect us. But we can listen to him or interpret his semaphore signals. We can treat his antics as information to add to all the information we have gathered over our lives, and use that information to guide our decisions – influenced, but not caused.

The baby begins life nearly as intimately connected with his or her world as in the womb. By the end of life, some of us are impervious to what others think about us, can rise above any threat or seductive promise, can ignore nearly any kind of urge or pain. In one sense, we are still determined – determined by that developing person we are, determined by our selves. But nothing else in our present circumstances, or even in our past going way back to some time in childhood when that gap was first fully realized, is more than information to utilize in making free decisions.

I know very well that the determinist can respond to this idea as well. But now he is as much on the defensive as the free willist has ever been. In fact, the undecided listener may begin to conclude that it is the deterministic stance – nothing is free! – that is the more extreme, less reasonable one.

Addendum

Some students have complained that I have left the job unfinished, and that I should continue the argument to a conclusion. In other words, they want to know what students always want to know: What is the answer? Or at least, what do I think is the answer? Although I would rather see students come up with their own answers, here is how I see the issue:


The argument of free will versus determinism is in some measure a false one. Both sides have been reduced to straw men (easily destroyed arguments) by oversimplification. For example, free will has never meant freedom to ignore the laws of nature, and determinism does not mean everything is predictable. Perhaps the best thing we can do to get past the stalemate is to develop a new concept that points to the complexity of the person and his or her interaction with the world. Instead of free will versus determinism, maybe we should adopt Albert Bandura's preferred term: Self-determination.

As a middle-aged man, I have dozens of years of experiences – my childhood, my cultural inheritance, the books I've read, conversations with friends, my own thoughts – that have made me who I am today. All this is on top of my unique genetics and other physical realities of who I am. The things that happen to me now are experienced through this mass of uniqueness, and my responses depend, not only on my present situation, but on all that I am. This may not be "free will" in the absolute sense, but it is certainly self-determination.

If we possess this (somewhat limited) freedom, we also possess a (somewhat limited) responsibility for our actions. For most adults, it can be legitimately claimed that who we are includes basic moral concepts and a rational respect for the law conveyed to us by our parents and others. These things are a part of who we are, and are available to us when we make a choice to behave one way or another. We are therefore culpable if we disregard these moral and legal concepts. This dovetails nicely into the legal tradition that asks whether or not a person actually knows right from wrong, and whether the person has the maturity or the cognitive wherewithal to choose right over wrong.

In other words, we don't have to be "above" the natural world in order to have a degree of freedom within that world.

C. George Boeree: History of Psychology Part Four: The 1900's

E-Text Source: [ http://www.ship.edu/%7Ecgboeree/historyofpsych.html ]

Index

Timeline: The 1900s

Map: Europe 1914

Sigmund Freud and Psychoanalysis
[ Precursors of Psychoanalysis | Mesmer | Pinel | Charcot | The Unconscious | Freud | Jung | Adler ]

Sigmund Freud Selection: The Structure of the Unconscious

Behaviorism
[ Pavlov | Thorndike | Watson | McDougall | Hull | Tolman | Skinner ]

B. F. Skinner Selection

Walt Whitman: There Was a Child Went Forth

Gestalt Psychology
[ Wertheimer | Köhler | Koffka | The Theory | Lewin | Goldstein ]

Wolfgang Köhler Selection: Gestalt Psychology Today

Carl Rogers Selection: The Organization of Personality

Phenomenological Existentialism
[ Brentano | Stumpf | Husserl | Phenomenology | Heidegger | Sartre ]

James Joyce Selection: Portrait of the Artist as a Young Man

Romance: A Partial Analysis

Modern Medicine and Physiology
[ Technology and the brain | The psychopharmacological explosion | Genetics and the human genome ]

A Brief History of the Lobotomy

The Cognitive Movement
[ Wiener | Turing | von Bertalanffy | Chomsky | Piaget | Hebb | Miller | Neisser ]

A Computer Timeline

Conclusions: Psychology Today and Tomorrow
[ From Logical Positivism to Postmodernism | The Situation for Psychology ]


Timeline: The 1900s

(In the original layout, these entries were arranged in five parallel columns: Psychiatry and Psychoanalysis | Behaviorism | Phenomenology, Gestalt, Humanism, and Existentialism | Cognitive Psychology/Artificial Intelligence | Modern Medicine and Physiology.)

1863 Sechenov: Reflexes of the Brain
1866 Gregor Mendel discovers the principles of heredity
1869 von Hartmann: Philosophy of the Unconscious
1874 Brentano: Psychology from an Empirical Standpoint
1882 Charcot opens clinic at Salpetriere
1883 Kraepelin publishes list of disorders
1883 Nietzsche publishes Thus Spake Zarathustra
1885 Hermann Ebbinghaus: On Memory
1885-6 Freud studies hypnosis with Charcot
1890 Ehrenfels: About the Qualities of the Gestalt
1895 Breuer and Freud: Studies in Hysteria
1895 Roentgen invents the X-ray
1900 Freud: Interpretation of Dreams
1900 Husserl: Logical Investigations
1906 Pavlov publishes first conditioning studies
1906 Golgi and Ramón y Cajal win the Nobel for discovering the synapse
1907 Jung meets Freud; Adler invited to join Freud's circle
1907 Bekhterev: Objective Psychology
1909 Freud, Jung, et al. speak at Clark University
1910 Thomas Morgan discovers chromosomes
1911 Adler forms his own Individual Psychology society
1911 Thorndike: Animal Intelligence
1912 McDougall: Psychology: The Study of Behavior
1912 Wertheimer publishes paper on perception of movement
1913 Watson: Psychology as the Behaviorist Views It
1913 Köhler does chimpanzee studies
1914 Jung splits from Freud, begins his "dark years"
(1914 to 1918 – WW I)
1921 The Gestalt journal Psychologische Forschung first published
1921 Loewi discovers the first neurotransmitter, acetylcholine
1922 Tolman presents "a new formula for behaviorism"
1923 Wertheimer: Laws of Organization
1924 Koffka: The Growth of the Mind
1926 Hermann J. Muller creates mutations in fruit flies with X-rays
1927 Alfred Adler: Understanding Human Nature
1927 Köhler: The Mentality of Apes
1927 Heidegger: Being and Time
1929 Berger invents the EEG
1930 Skinner publishes his first paper on conditioning
1932 Tolman: Purposive Behavior in Animals and Men
1932 Jean Piaget: The Moral Judgment of the Child
1935 Lewin: A Dynamic Theory of Personality
1935 Moniz performs the first lobotomy
1936 Anna Freud: The Ego and the Mechanisms of Defense
1936 Alan M. Turing of Cambridge publishes the paper that introduces the Turing machine
1937 Karen Horney: The Neurotic Personality of Our Time
1937 Allport: Personality
1938 Skinner: The Behavior of Organisms
1938 The first use of electroshock
(1939 to 1945 – WW II)
1940 Ludwig von Bertalanffy: Problems of Life
1941 Fromm: Escape from Freedom
1942 Jean Piaget: Psychology of Intelligence
1943 Hull: Principles of Behavior
1943 Binswanger: Grundformen und Erkenntnis menschlichen Daseins
1944 Turing: Machine Intelligence
1945 John W. Mauchly, J. Presper Eckert, and their team at the University of Pennsylvania complete ENIAC
1947 Goldstein: The Organism
1948 Skinner: Walden Two
1948 Frankl: Experiences in a Concentration Camp
1948 Norbert Wiener: Cybernetics
1949 Donald Hebb: The Organization of Behavior
1949 John Cade discovers the beneficial effects of lithium
1950 Erik Erikson: Childhood and Society
1950 Rollo May: The Meaning of Anxiety
1951 Rogers: Client-Centered Therapy
1952 Laborit discovers the first antipsychotic drug, chlorpromazine (Thorazine)
1953 Watson and Crick discover the structure of the DNA molecule
1954 Carl Jung: Von den Wurzeln des Bewusstseins
1954 Gordon Allport: The Nature of Prejudice
1954 Olds discovers the "pleasure center" of rats
1954 Abraham Maslow: Motivation and Personality
1955 George Kelly: Psychology of Personal Constructs
1956 George A. Miller publishes the "seven, plus or minus two" paper
1957 Albert Ellis: How to Live with a Neurotic
1957 Noam Chomsky: Syntactic Structures
1960 Miller: Plans and the Structure of Behavior
1961 May et al. edit Existential Psychology
1963 Sternbach discovers the antianxiety drug diazepam (Valium)
1967 Hans Eysenck: The Biological Basis of Personality
1967 Ulric Neisser: Cognitive Psychology
1969 ARPANET (future Internet) links first two computers at UCLA and Stanford Research Institute
1972 Hounsfield invents the CAT scan
1973 Albert Bandura: Aggression: A Social Learning Analysis
1973 Snyder and Pert discover endorphin
1974 D. T. Wong discovers fluoxetine (Prozac)
1976 Neisser: Cognition and Reality
1977 Damadian's first MRI
1977 A virus is the first creature to have its complete genome revealed
1980 First AAAI conference at Stanford
1981 The PET scan invented
1997 "Deep Blue" beats Kasparov, the best chess player in the world
2000 HGP and Celera announce that they have completed working drafts of the human genome
(The New Millennium Begins!)

Map: Europe 1914


Freud and Psychoanalysis


Precursors of Psychoanalysis

It often surprises students that psychiatry – meaning the doctoring of the mind – was not invented by Sigmund Freud. Psychoanalysis – a particular (and very significant) brand of psychiatry – was his baby. Psychiatrists existed before Freud, and most psychiatrists today are not Freudian.

The term psychiatry was coined by the German physician Johann Reil in 1808, and would slowly replace the older term "alienist." The new respect signalled by the new name was based on some significant improvements in the care of the mentally ill in the second half of the 1700's.

There are three people I would like to pay my respects to as important precursors to psychoanalysis: Franz Anton Mesmer, who discovered hypnotism; Philippe Pinel, who changed the way we thought of and treated the mentally ill; and Jean-Martin Charcot, who is often considered the father of neurology.

Franz Anton Mesmer

Franz Anton Mesmer was born May 23, 1734 in Iznang, Germany, near Lake Constance. He received his MD from the University of Vienna in 1766. His dissertation concerned the idea that the planets influenced the health of those of us on earth. He suggested that their gravitational forces could change the distribution of our animal spirits. Later, he changed his theory to emphasize magnetism rather than gravity – hence the term "animal magnetism." It would soon, however, come to be known as mesmerism.

He was, in fact, able to put people into trance states, even convulsions, by waving magnetized bars over them. His dramatic performances were quite popular for a while, although he believed that anyone could achieve the same results. And some of his patients did in fact get relief from their symptoms – a point that would later be investigated by others.

When accused of fraud by other physicians in Vienna, he went to Paris. In 1784, the King of France, Louis XVI, appointed a commission including Benjamin Franklin to look into Mesmer and his practices. They concluded that his results were due to nothing more than suggestion.

Despite condemnation by many of the educated elite, mesmerism became a popular fad in the salons of Europe. In order to serve the many poor people who came to him for help, he designed a sort of bathtub in which they could sit while holding the magnetic rods themselves. He eventually created an organization to train other mesmerists.

Mesmer died March 5, 1815 in Meersburg, also near Lake Constance, Germany.

A Scottish physician, James Braid (1795-1860), a much more careful researcher of Mesmer's phenomenon, termed it hypnotism. Disassociated from Mesmer, hypnotism would go on to have a long, if controversial, life into the twentieth century.


Philippe Pinel

Philippe Pinel was born on April 20, 1745, in the small town of Saint André. His father was both a barber and a surgeon, a common combination in those days, as both vocations required a steady hand with the razor. His mother was also from a long line of physicians.

Philippe began his studies more interested in literature – especially Jean-Jacques Rousseau – than in medicine. But, after a few years studying theology, he began the study of medicine, and he received his MD from the university at Toulouse in 1773.

Pinel moved to Montpellier in 1774, where he tutored wealthy students in anatomy and mathematics. He was admitted into the Montpellier Société Royale des Sciences after presenting two papers on the use of mathematics in anatomical studies. He moved to Paris in 1778, where he came into contact with a number of the renowned scientists and philosophers of the day (including Ben Franklin), as well as becoming familiar with the radical new ideas of John Locke and the French sensationalists. Although he could not practice in Paris, he became a well-respected medical writer, particularly known for his careful and exhaustive case studies.

A turning point in Pinel's life came in 1785, when a friend of his developed a mental illness ending in his death. He became devoted to the study of mental illness, and became the head of the Paris asylum for insane men at Bicêtre in 1792. In that year, he also married Jeanne Vincent, with whom he had three sons.

It was at Bicêtre that he made his place in history: Prior to his coming to Bicêtre, the men were kept in chains, treated abominably, and put on daily display to the public as curiosities. In 1793, Pinel instituted a new program of humane care, which he referred to as moral therapy. The men were given clean, comfortable accommodations, and were instructed in simple but productive work.

In 1795, he was appointed the head physician at the world famous hospital at Salpêtrière. Here, too, he provided his enlightened treatment conditions to the mentally ill. In that same year, he was made professor of medical pathology at Paris. In 1801, Philippe Pinel introduced the first textbook on moral therapy to the world.

Pinel is also remembered for dismissing the demonic possession theory of mental illness once and for all, and for eliminating treatments such as bleeding from his hospital. He also introduced other novelties to his hospital, such as vaccinations and the use of the stethoscope. He was a physician to Napoleon and was made a knight of the Légion d'Honneur in 1804. He died in Paris on October 25, 1826.

Pinel's innovations were soon imitated in other countries, by such notables as William Tuke in England, Vincenzo Chiarugi in Florence, and Dorothea Dix in the U.S.

Jean-Martin Charcot

Jean-Martin Charcot was born in Paris on November 29, 1825. He received his MD at the University of Paris in 1853. In 1860 he became a professor at his alma mater. Two years later, he began to work at Salpêtrière Hospital as well. In 1882, he opened a neurological clinic at Salpêtrière Hospital. It, and he, became known throughout Europe, and students came from everywhere to study the new field. Among them were Alfred Binet and a young Sigmund Freud.


Charcot is well known in medical circles for his studies of the neurology of motor disorders and their resulting diseases, of aneurysms, and of the localization of brain functions. He is considered the father of modern neurology, as well as the first person to diagnose multiple sclerosis.

In psychology, he is best known for his use of hypnosis in successfully treating women suffering from the psychological disorder then known as hysteria. Now called conversion disorder, hysteria involved a loss of some physiological function such as vision, speech, tactile sensation, or movement that was nonetheless not based in actual neurological damage.

Charcot believed that hysteria was due to a congenitally weak nervous system, combined with the effects of some traumatic experience. Hypnotizing these patients brought on a state similar to hysteria itself. He found that, in some cases, the symptoms would actually lessen after hypnosis – although he was only interested in studying hysteria, not in curing it! Others would later use hypnosis as a part of curing the problem.

Charcot died in Morvan, France, on August 16, 1893. The stamp bearing his image is from the web site of Michael Jacobson, MD, at http://www.journalclub.org/stamps/.

The Unconscious

Before we turn to the really big names, let's take a peek at the concept of the unconscious, so strongly associated with psychoanalysis. Most historians agree that the first mention of such a concept was Leibniz's discussion of "petites perceptions," or little perceptions. By this he meant certain very low-level stimuli that could enter the mind without the person's awareness – what today we would call subliminal messages. The reality of such things is very much in doubt.

Johann Friedrich Herbart (1776-1841) was the author of a textbook on psychology, published in 1816. But, following Kant, he did not believe psychology could ever be a science. He took the concepts of the associationists and blended them with the dynamics of Leibniz's monads. Ideas had an energy of their own, he said, and could actually force themselves on the person's conscious mind by exceeding a certain threshold. When ideas were incompatible, one or the other would be repressed, he said – meaning forced below the threshold into the unconscious. This should remind you of Freud's ideas – except that Herbart had them nearly a century earlier!
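If it helps to make the mechanics concrete, here is a toy sketch in Python of Herbart's threshold idea – the ideas, their "energy" values, and the cutoff are all invented for illustration, and this is not Herbart's actual mathematics:

    # A toy model of Herbart's threshold: each idea has an energy,
    # incompatible ideas suppress one another, and only ideas above
    # the threshold are conscious. All values here are invented.
    THRESHOLD = 1.0

    ideas = {"daydream": 1.4, "unfinished work": 1.2}
    incompatible = [("daydream", "unfinished work")]

    # The stronger of two incompatible ideas "represses" the weaker
    # by forcing its energy below the threshold.
    for a, b in incompatible:
        weaker = a if ideas[a] < ideas[b] else b
        ideas[weaker] = min(ideas[weaker], THRESHOLD - 0.1)

    conscious = [i for i, e in ideas.items() if e >= THRESHOLD]
    repressed = [i for i, e in ideas.items() if e < THRESHOLD]
    print(conscious)  # ['daydream']
    print(repressed)  # ['unfinished work']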

Schopenhauer is often seen as the originator of the unconscious. He spoke at great length about instincts and the irrational nature of man, and freely made use of words like repression, resistance, and sublimation! Nietzsche also spoke of the unconscious: One of his most famous statements is "My memory says I did it. My pride says I could not have done that. In the end, my memory yields."

One more pre-Freudian should be mentioned: Karl Eduard von Hartmann (1842-1906). He blended the ideas of Schopenhauer with Jewish mysticism (the Kabbalah) and wrote Philosophy of the Unconscious in 1869, just in time to influence a young neurologist named Sigmund Freud.

The reader should understand that there are many theorists with little or no use for the concept of the unconscious. Brentano, forefather of phenomenology and existentialism, did not believe in it. Neither did William James. Neither did the Gestalt psychologists. Memories, for example, can be understood as stored in some physical state, perhaps as traces in the brain. When activated, we remember – but they aren't in the mind – conscious or unconscious – until so activated.

In addition to the concept of the unconscious, another early landmark of psychiatry was the introduction of careful diagnosis of mental illness, beginning with the work of Emil Kraepelin (1856-1926). The first differentiated classification was of what he labelled dementia praecox, which meant the insanity of adolescence. Kraepelin also popularized the terms neurosis and psychosis, and named Alzheimer's disease after Alois Alzheimer, who first described it. I should also mention Eugen Bleuler, who coined the term schizophrenia to replace dementia praecox in 1911.

Now, on to Freud....

Sigmund Freud

Freud's story, like most people's stories, begins with others. In his case those others were his mentor and friend, Dr. Joseph Breuer, and Breuer's patient, called Anna O.

Anna O. was Joseph Breuer's patient from 1880 through 1882. Twenty-one years old, she spent most of her time nursing her ailing father. She developed a bad cough that proved to have no physical basis. She developed some speech difficulties, then became mute, and then began speaking only in English, rather than her usual German.

When her father died she began to refuse food, and developed an unusual set of problems. She lost the feeling in her hands and feet, developed some paralysis, and began to have involuntary spasms. She also had visual hallucinations and tunnel vision. But when specialists were consulted, no physical causes for these problems could be found.

If all this weren't enough, she had fairy-tale fantasies, dramatic mood swings, and made several suicide attempts. Breuer's diagnosis was that she was suffering from what was then called hysteria (now called conversion disorder), which meant she had symptoms that appeared to be physical, but were not.

In the evenings, Anna would sink into states of what Breuer called "spontaneous hypnosis," or what Anna herself called "clouds." Breuer found that, during these trance-like states, she could explain her day-time fantasies and other experiences, and she felt better afterwards. Anna called these episodes "chimney sweeping" and "the talking cure."

Sometimes during "chimney sweeping," some emotional event was recalled that gave meaning to some particular symptom. The first example came soon after she had refused to drink for a while: She recalled seeing a woman drink from a glass that a dog had just drunk from. While recalling this, she experienced strong feelings of disgust...and then had a drink of water! In other words, her symptom – an avoidance of water – disappeared as soon as she remembered its root event, and experienced the strong emotion that would be appropriate to that event. Breuer called this catharsis, from the Greek word for cleansing.

It was eleven years later that Breuer and his assistant, Sigmund Freud, wrote a book on hysteria. In it they explained their theory: Every hysteria is the result of a traumatic experience, one that cannot be integrated into the person's understanding of the world. The emotions appropriate to the trauma are not expressed in any direct fashion, but do not simply evaporate: They express themselves in behaviors that in a weak, vague way offer a response to the trauma. These symptoms are, in other words, meaningful. When the client can be made aware of the meanings of his or her symptoms (through hypnosis, for example) then the unexpressed emotions are released and so no longer need to express themselves as symptoms. It is analogous to lancing a boil or draining an infection.

In this way, Anna got rid of symptom after symptom. But it must be noted that she needed Breuer to do this: Whenever she was in one of her hypnotic states, she had to feel his hands to make sure it was him before talking! And sadly, new problems continued to arise.

According to Freud, Breuer recognized that she had fallen in love with him, and that he was falling in love with her. Plus, she was telling everyone she was pregnant with his child. You might say she wanted it so badly that her mind told her body it was true, and she developed an hysterical pregnancy. Breuer, a married man in a Victorian era, abruptly ended their sessions together, and lost all interest in hysteria. Please understand that recent research suggests that many of these events, including the hysterical pregnancy and Breuer's quick retreat, were probably Freud's "elaborations" on reality!

It was Freud who would later add what Breuer did not acknowledge publicly – that secret sexual desires lay at the bottom of all these hysterical neuroses.

To finish her story, Anna spent time in a sanatorium. Later, she became a well-respected and active figure – the first social worker in Germany – under her true name, Bertha Pappenheim. She died in 1936. She will be remembered, not only for her own accomplishments, but as the inspiration for the most influential personality theory we have ever had.

Biography

Sigmund Freud was born May 6, 1856, in a small town – Freiberg – in Moravia. His father was a wool merchant with a keen mind and a good sense of humor. His mother was a lively woman, her husband's second wife and 20 years younger. She was 21 years old when she gave birth to her first son, her darling, Sigmund. Sigmund had two older half-brothers and six younger siblings. When he was four or five – he wasn't sure – the family moved to Vienna, where he lived most of his life.

A brilliant child, always at the head of his class, he went to medical school, one of the few viable options for a bright Jewish boy in Vienna in those days. There, he became involved in research under the direction of a physiology professor named Ernst Brücke. Brücke believed in what was then a popular, if radical, notion, which we now call reductionism: "No other forces than the common physical-chemical ones are active within the organism." Freud would spend many years trying to "reduce" personality to neurology, a cause he later gave up on.

Freud was very good at his research, concentrating on neurophysiology, even inventing a special cell-staining technique. But only a limited number of positions were available, and there were others ahead of him. Brücke helped him to get a grant to study, first with the great neurologist Charcot in Paris, then with his rival Bernheim in Nancy. Both these gentlemen were investigating the use of hypnosis with hysterics.

After spending a short time as a resident in neurology and director of a children's ward in Berlin, he came back to Vienna, married his patient fiancée Martha Bernays, and set up a practice in neuropsychiatry, with the help of Joseph Breuer.

Freud's books and lectures brought him both fame and ostracism from the mainstream of the medical community. He drew around him a number of very bright sympathizers who became the core of the psychoanalytic movement. Unfortunately, Freud had a penchant for rejecting people who did not totally agree with him. Some separated from him on friendly terms; others did not, and went on to found competing schools of thought.

Freud emigrated to England just before World War II, when Vienna became an increasingly dangerous place for Jews, especially ones as famous as Freud. Not long afterward, he died of the cancer of the mouth and jaw that he had suffered from for the last 20 years of his life.


Theory

Freud didn't exactly invent the idea of the conscious versus unconscious mind, but he certainly was responsible for making it popular. The conscious mind is what you are aware of at any particular moment, your present perceptions, memories, thoughts, fantasies, feelings, what have you. Working closely with the conscious mind is what Freud called the preconscious, what we might today call "available memory:" anything that can easily be made conscious, the memories you are not at the moment thinking about but can readily bring to mind. Now no-one has a problem with these two layers of mind. But Freud suggested that these are the smallest parts!

The largest part by far is the unconscious. It includes all the things that are not easily available to awareness, including many things that have their origins there, such as our drives or instincts, and things that are put there because we can't bear to look at them, such as the memories and emotions associated with trauma.

According to Freud, the unconscious is the source of our motivations, whether they be simple desires for food or sex, neurotic compulsions, or the motives of an artist or scientist. And yet, we are often driven to deny or resist becoming conscious of these motives, and they are often available to us only in disguised form. We will come back to this.

The id, the ego, and the superego

Freudian psychological reality begins with the world, full of objects. Among them is a very special object, the organism. The organism is special in that it acts to survive and reproduce, and it is guided toward those ends by its needs – hunger, thirst, the avoidance of pain, and sex.

A part – a very important part – of the organism is the nervous system, which has as one of its characteristics a sensitivity to the organism's needs. At birth, that nervous system is little more than that of any other animal, an "it" or id. The nervous system, as id, translates the organism's needs into motivational forces called, in German, Triebe, which has been translated as instincts or drives. Freud also called them wishes. This translation from need to wish is called the primary process.

The id works in keeping with the pleasure principle, which can be understood as a demand to take care of needs immediately. Just picture the hungry infant, screaming itself blue. It doesn't "know" what it wants in any adult sense; it just knows that it wants it and it wants it now. The infant, in the Freudian view, is pure, or nearly pure id. And the id is nothing if not the psychic representative of biology.

Unfortunately, although a wish for food, such as the image of a juicy steak, might be enough to satisfy the id, it isn't enough to satisfy the organism. The need only gets stronger, and the wishes just keep coming. You may have noticed that, when you haven't satisfied some need, such as the need for food, it begins to demand more and more of your attention, until there comes a point where you can't think of anything else. This is the wish or drive breaking into consciousness.

Luckily for the organism, there is that small portion of the mind we discussed before, the conscious, that is hooked up to the world through the senses. Around this little bit of consciousness, during the first year of a child's life, some of the "it" becomes "I," some of the id becomes ego. The ego relates the organism to reality by means of its consciousness, and it searches for objects to satisfy the wishes that the id creates to represent the organism's needs. This problem-solving activity is called the secondary process.

The ego, unlike the id, functions according to the reality principle, which says "take care of a need as soon as an appropriate object is found." It represents reality and, to a considerable extent, reason.

However, as the ego struggles to keep the id (and, ultimately, the organism) happy, it meets with obstacles in the world. It occasionally meets with objects that actually assist it in attaining its goals. And it keeps a record of these obstacles and aids. In particular, it keeps track of the rewards and punishments meted out by two of the most influential objects in the world of the child – mom and dad. This record of things to avoid and strategies to take becomes the superego. It is not completed until about seven years of age. In some people, it never is completed.

There are two aspects to the superego: One is the conscience, which is an internalization of punishments and warnings. The other is called the ego ideal. It derives from rewards and positive models presented to the child. The conscience and ego ideal communicate their requirements to the ego with feelings like pride, shame, and guilt.

It is as if we acquired, in childhood, a new set of needs and accompanying wishes, this time of social rather than biological origins. Unfortunately, these new wishes can easily conflict with the ones from the id. You see, the superego represents society, and society often wants nothing better than to have you never satisfy your needs at all!

The stages

Freud noted that, at different times in our lives, different parts of our skin give us greatest pleasure. Later theorists would call these areas erogenous zones. It appeared to Freud that the infant found its greatest pleasure in sucking, especially at the breast. In fact, babies have a penchant for bringing nearly everything in their environment into contact with their mouths. A bit later in life, the child focuses on the anal pleasures of holding it in and letting go. By three or four, the child may have discovered the pleasure of touching or rubbing against his or her genitalia. Only later, in our sexual maturity, do we find our greatest pleasure in sexual intercourse. In these observations, Freud had the makings of a psychosexual stage theory.

The oral stage lasts from birth to about 18 months. The focus of pleasure is, of course, the mouth. Sucking and biting are favorite activities.

The anal stage lasts from about 18 months to three or four years old. The focus of pleasure is the anus. Holding it in and letting it go are greatly enjoyed.

The phallic stage lasts from three or four to five, six, or seven years old. The focus of pleasure is the genitalia. Masturbation is common.

The latent stage lasts from five, six, or seven to puberty, that is, somewhere around 12 years old. During this stage, Freud believed that the sexual impulse was suppressed in the service of learning. I must note that, while most children seem to be fairly calm, sexually, during their grammar school years, perhaps up to a quarter of them are quite busy masturbating and playing "doctor." In Freud's repressive era, these children were, at least, quieter than their modern counterparts.

The genital stage begins at puberty, and represents the resurgence of the sex drive in adolescence, and the more specific focusing of pleasure in sexual intercourse. Freud felt that masturbation, oral sex, homosexuality, and many other things we find acceptable in adulthood today, were immature.


This is a true stage theory, meaning that Freudians believe that we all go through these stages, in this order, and pretty close to these ages.
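If you like to see the ordering schematically, here is a small Python lookup using the approximate ages above – the exact cutoffs are fuzzy in Freud's account, so the numbers chosen here are only illustrative:

    # Upper age bounds (in years) for each stage, taken roughly from
    # the ranges described above; the cutoffs themselves are fuzzy.
    FREUD_STAGES = [
        (1.5, "oral"),     # birth to about 18 months
        (3.5, "anal"),     # about 18 months to three or four
        (6.0, "phallic"),  # three or four to five, six, or seven
        (12.0, "latent"),  # five, six, or seven to puberty (~12)
    ]

    def stage_at(age):
        """Return the psychosexual stage at a given age; 'genital' after puberty."""
        for upper_bound, stage in FREUD_STAGES:
            if age < upper_bound:
                return stage
        return "genital"

    print(stage_at(0.5))   # oral
    print(stage_at(4.0))   # phallic
    print(stage_at(15.0))  # genital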

The Oedipal crisis

Each stage has certain difficult tasks associated with it where problems are more likely to arise. For the oral stage, this is weaning. For the anal stage, it's potty training. For the phallic stage, it is the Oedipal crisis, named after the ancient Greek story of King Oedipus, who inadvertently killed his father and married his mother.

Here's how the Oedipal crisis works: The first love-object for all of us is our mother. We want her attention, we want her affection, we want her caresses, we want her, in a broadly sexual way. The young boy, however, has a rival for his mother's charms: his father! His father is bigger, stronger, smarter, and he gets to sleep with mother, while junior pines away in his lonely little bed. Dad is the enemy.

About the time the little boy recognizes this archetypal situation, he has become aware of some of the more subtle differences between boys and girls, the ones other than hair length and clothing styles. From his naive perspective, the difference is that he has a penis, and girls do not. At this point in life, it seems to the child that having something is infinitely better than not having something, and so he is pleased with this state of affairs.

But the question arises: where is the girl's penis? Perhaps she has lost it somehow. Perhaps it was cut off. Perhaps this could happen to him! This is the beginning of castration anxiety, a slight misnomer for the fear of losing one's penis.

To return to the story, the boy, recognizing his father's superiority and fearing for his penis, engages some of his ego defenses: He displaces his sexual impulses from his mother to girls and, later, women; And he identifies with the aggressor, dad, and attempts to become more and more like him, that is to say, a man. After a few years of latency, he enters adolescence and the world of mature heterosexuality.

The girl also begins her life in love with her mother, so we have the problem of getting her to switch her affections to her father before the Oedipal process can take place. Freud accomplishes this with the idea of penis envy: The young girl, too, has noticed the difference between boys and girls and feels that she, somehow, doesn't measure up. She would like to have one, too, and all the power associated with it. At very least, she would like a penis substitute, such as a baby. As every child knows, you need a father as well as a mother to have a baby, so the young girl sets her sights on dad.

Dad, of course, is already taken. The young girl displaces from him to boys and men, and identifies with mom, the woman who got the man she really wanted. Note that one thing is missing here: The girl does not suffer from the powerful motivation of castration anxiety, since she cannot lose what she doesn't have. Freud felt that the lack of this great fear accounts for the fact (as he saw it) that women were both less firmly heterosexual than men and somewhat less morally-inclined.

Before you get too upset by this less-than-flattering account of women's sexuality, rest assured that many people have responded to it. I will discuss it in the discussion section.

Therapy

Freud's therapy has been more influential than any other, and more influential than any other part of his theory. Here are some of the major points:

Relaxed atmosphere. The client must feel free to express anything. The therapy situation is in fact a unique social situation, one where you do not have to be afraid of social judgment or ostracism. In fact, in Freudian therapy, the therapist practically disappears. Add to that the physically relaxing couch, dim lights, sound-proof walls, and the stage is set.


Free association. The client may talk about anything at all. The theory is that, with relaxation, the unconscious conflicts will inevitably drift to the fore. It isn't far off to see a similarity between Freudian therapy and dreaming! However, in therapy, there is the therapist, who is trained to recognize certain clues to problems and their solutions that the client would overlook.

Resistance. One of these clues is resistance. When a client tries to change the topic, draws a complete blank, falls asleep, comes in late, or skips an appointment altogether, the therapist says "aha!" These resistances suggest that the client is nearing something in his free associations that he – unconsciously, of course – finds threatening.

Dream analysis. In sleep, we are somewhat less resistant to our unconscious and we will allow a few things, in symbolic form, of course, to come to awareness. These wishes from the id provide the therapist and client with more clues. Many forms of therapy make use of the client's dreams, but Freudian interpretation is distinct in the tendency to find sexual meanings.

Parapraxes. A parapraxis is a slip of the tongue, often called a Freudian slip. Freud felt that they were also clues to unconscious conflicts. Freud was also interested in the jokes his clients told. In fact, Freud felt that almost everything meant something almost all the time – dialing a wrong number, making a wrong turn, misspelling a word, were serious objects of study for Freud. However, he himself noted, in response to a student who asked what his cigar might be a symbol for, that "sometimes a cigar is just a cigar." Or is it?

Other Freudians became interested in projective tests, such as the famous Rorschach or inkblot tests. The theory behind these tests is that, when the stimulus is vague, the client fills it with his or her own unconscious themes. Again, these could provide the therapist with clues.

Transference, catharsis, and insight

Transference occurs when a client projects feelings toward the therapist that more legitimately belong with certain important others. Freud felt that transference was necessary in therapy in order to bring to the surface the repressed emotions that have been plaguing the client for so long. You can't feel really angry, for example, without a real person to be angry at. The relationship between the client and the therapist, contrary to popular images, is very close in Freudian therapy, although it is understood that it can't get out of hand.

Catharsis is the sudden and dramatic outpouring of emotion that occurs when the trauma is resurrected. The box of tissues on the end table is not there for decoration.

Insight is being aware of the source of the emotion, of the original traumatic event. The major portion of the therapy is completed when catharsis and insight are experienced. What should have happened many years ago – because you were too little to deal with it, or under too many conflicting pressures – has now happened, and you are on your way to becoming a happier person.

Freud said that the goal of therapy is simply "to make the unconscious conscious."

Discussion

The only thing more common than a blind admiration for Freud seems to be an equally blind hatred for him. Certainly, the proper attitude lies somewhere in between. Let's start by exploring some of the apparent flaws in his theory.

The least popular part of Freud's theory is the Oedipal complex and the associated ideas of castration anxiety and penis envy. What is the reality behind these concepts? It is true that some children are very attached to their opposite sex parent, and very competitive with their same-sex parent. It is true that some boys worry about the differences between boys and girls, and fear that someone may cut their penis off. It is true that some girls likewise are concerned, and wish they had a penis. And it is true that some of these children retain these affections, fears, and aspirations into adulthood.


Most personality theorists, however, consider these examples aberrations rather than universals, exceptions rather than rules. They occur in families that aren't working as well as they should, where parents are unhappy with each other and use their children against each other. They occur in families where parents literally denigrate girls for their supposed lack, and talk about cutting off the penises of unruly boys. They occur especially in neighborhoods where correct information on even the simplest sexual facts is not forthcoming, and children learn mistaken ideas from other children.

If we view the Oedipal crisis, castration anxiety, and penis envy in a more metaphoric and less literal fashion, they are useful concepts: We do love our mothers and fathers as well as compete with them. Children probably do learn the standard heterosexual behavior patterns by imitating the same-sex parent and practicing on the opposite-sex parent. In a male-dominated society, having a penis – being male – is better than not, and losing one's status as a male is scary. And wanting the privileges of the male, rather than the male organ, is a reasonable thing to expect in a girl with aspirations. But Freud did not mean for us to take these concepts metaphorically. Some of his followers, however, did.

Sexuality

A more general criticism of Freud's theory is its emphasis on sexuality. Everything, both good and bad, seems to stem from the expression or repression of the sex drive. Many people question that, and wonder if there are any other forces at work. Freud himself later added the death instinct, but that proved to be another one of his less popular ideas.

First let me point out that, in fact, a great deal of our activity is in some fashion motivated by sex. If you take a good hard look at our modern society, you will find that most advertising uses sexual images, that movies and television programs often don't sell well if they don't include some titillation, that the fashion industry is based on a continual game of sexual hide-and-seek, and that we all spend a considerable portion of every day playing "the mating game." Yet we still don't feel that all life is sexual.

But Freud's emphasis on sexuality was not based on the great amount of obvious sexuality in his society – it was based on the intense avoidance of sexuality, especially among the middle and upper classes, and most especially among women. What we too easily forget is that the world has changed rather dramatically over the last hundred years. We forget that doctors and ministers recommended strong punishment for masturbation, that "leg" was a dirty word, that a woman who felt sexual desire was automatically considered a potential prostitute, that a bride was often taken completely by surprise by the events of the wedding night, and could well faint at the thought.

It is to Freud's credit that he managed to rise above his culture's sexual attitudes. Even his mentor Breuer and the brilliant Charcot couldn't fully acknowledge the sexual nature of their clients' problems. Freud's mistake was more a matter of generalizing too far, and not taking cultural change into account. It is ironic that much of the cultural change in sexual attitudes was in fact due to Freud's work!

The unconscious

One last concept that is often criticized is the unconscious. What is argued is not whether something like the unconscious accounts for some of our behavior, but rather how much it accounts for, and the exact nature of the beast.

Behaviorists, humanists, and existentialists all believe that (a) the motivations and problems that can be attributed to the unconscious are much fewer than Freud thought, and (b) the unconscious is not the great churning cauldron of activity he made it out to be. Most psychologists today see the unconscious as whatever we don't need or don't want to see. Some theorists don't use the concept at all.

On the other hand, at least one theorist, Carl Jung, proposed an unconscious that makes Freud's look puny! But we will leave all these views for the appropriate chapters.


Positive aspects

People have the unfortunate tendency to "throw the baby out with the bath water." If they don't agree with ideas a, b, and c, they figure x, y, and z must be wrong as well. But Freud had quite a few good ideas, so good that they have been incorporated into many other theories, to the point where we forget to give him credit.

First, Freud made us aware of two powerful forces and their demands on us. Back when everyone believed people were basically rational, he showed how much of our behavior was based on biology. When everyone conceived of people as individually responsible for their actions, he showed the impact of society. When everyone thought of male and female as roles determined by nature or God, he showed how much they depended on family dynamics. The id and the superego – the psychic manifestations of biology and society – will always be with us in some form or another.

Second is the basic theory, going back to Breuer, of certain neurotic symptoms as caused by psychological traumas. Although most theorists no longer believe that all neurosis can be so explained, or that it is necessary to relive the trauma to get better, it has become a common understanding that a childhood full of neglect, abuse, and tragedy tends to lead to an unhappy adult.

Third is the idea of ego defenses. Even if you are uncomfortable with Freud's idea of the unconscious, it is clear that we engage in little manipulations of reality and our memories of that reality to suit our own needs, especially when those needs are strong. I would recommend that you learn to recognize these defenses: You will find that having names for them will help you to notice them in yourself and others!

Finally, the basic form of therapy has been largely set by Freud. Except for some behaviorist therapies, most therapy is still "the talking cure," and still involves a physically and socially relaxed atmosphere. And, even if other theorists do not care for the idea of transference, the highly personal nature of the therapeutic relationship is generally accepted as important to success.

Some of Freud's ideas are clearly tied to his culture and era. Other ideas are not easily testable. Some may even be a matter of Freud's own personality and experiences. But Freud was an excellent observer of the human condition, and enough of what he said has relevance today that he will be a part of personality textbooks for years to come. Even when theorists come up with dramatically different ideas about how we work, they compare their ideas with Freud's.

Carl Jung

Freud said that the goal of therapy was to make the unconscious conscious. He certainly made that the goal of his work as a theorist. And yet he makes the unconscious sound very unpleasant, to say the least: It is a cauldron of seething desires, a bottomless pit of perverse and incestuous cravings, a burial ground for frightening experiences which nevertheless come back to haunt us. Frankly, it doesn't sound like anything I'd like to make conscious!

A younger colleague of his, Carl Jung, was to make the exploration of this "inner space" his life's work. He went equipped with a background in Freudian theory, of course, and with an apparently inexhaustible knowledge of mythology, religion, and philosophy. Jung was especially knowledgeable in the symbolism of complex mystical traditions such as Gnosticism, alchemy, the Kabbalah, and similar traditions in Hinduism and Buddhism. If anyone could make sense of the unconscious and its habit of revealing itself only in symbolic form, it would be Carl Jung.

He had, in addition, a capacity for very lucid dreaming and occasional visions. In the fall of 1913, he had a vision of a "monstrous flood" engulfing most of Europe and lapping at the mountains of his native Switzerland. He saw thousands of people drowning and civilization crumbling. Then, the waters turned into blood. This vision was followed, in the next few weeks, by dreams of eternal winters and rivers of blood. He was afraid that he was becoming psychotic.

But on August 1 of the following year, World War I began. Jung felt that there had been a connection, somehow, between himself as an individual and humanity in general that could not be explained away. From then until 1928, he was to go through a rather painful process of self-exploration that formed the basis of all of his later theorizing.

He carefully recorded his dreams, fantasies, and visions, and drew, painted, and sculpted them as well. He found that his experiences tended to form themselves into persons, beginning with a wise old man and his companion, a little girl. The wise old man evolved, over a number of dreams, into a sort of spiritual guru. The little girl became "anima," the feminine soul, who served as his main medium of communication with the deeper aspects of his unconscious.

A leathery brown dwarf would show up guarding the entrance to the unconscious. He was "the shadow," a primitive companion for Jung's ego. Jung dreamt that he and the dwarf killed a beautiful blond youth, whom he called Siegfried. For Jung, this represented a warning about the dangers of the worship of glory and heroism which would soon cause so much sorrow all over Europe – and a warning about the dangers of some of his own tendencies towards hero-worship, of Sigmund Freud!

Jung dreamt a great deal about the dead, the land of the dead, and the rising of the dead. These represented the unconscious itself – not the "little" personal unconscious that Freud made such a big deal out of, but a new collective unconscious of humanity itself, an unconscious that could contain all the dead, not just our personal ghosts. Jung began to see the mentally ill as people who are haunted by these ghosts, in an age where no-one is supposed to even believe in them. If we could only recapture our mythologies, we would understand these ghosts, become comfortable with the dead, and heal our mental illnesses.

Critics have suggested that Jung was, very simply, ill himself when all this happened. But Jung felt that, if you want to understand the jungle, you can't be content just to sail back and forth near the shore. You've got to get into it, no matter how strange and frightening it might seem.

Biography

Carl Gustav Jung was born July 26, 1875, in the small Swiss village of Kesswil. His father was Paul Jung, a country parson, and his mother was Emilie Preiswerk Jung. He was surrounded by a fairly well educated extended family, including quite a few clergymen and some eccentrics as well.

The elder Jung started Carl on Latin when he was six years old, beginning a long interest in language and literature – especially ancient literature. Besides most modern western European languages, Jung could read several ancient ones, including Sanskrit, the language of the original Hindu holy books.

Carl was a rather solitary adolescent, who didn't care much for school, and especially couldn't take competition. He went to boarding school in Basel, Switzerland, where he found himself the object of a lot of jealous harassment. He began to use sickness as an excuse, developing an embarrassing tendency to faint under pressure.

Although his first career choice was archeology, he went on to study medicine at the University of Basel. While working under the famous psychiatrist Krafft-Ebing, he settled on psychiatry as his career.

After graduating, he took a position at the Burghölzli Mental Hospital in Zurich under Eugen Bleuler, an expert on (and the namer of) schizophrenia. In 1903, he married Emma Rauschenbach. He also taught classes at the University of Zurich, had a private practice, and developed his word association test at this time!

Long an admirer of Freud, he met him in Vienna in 1907. The story goes that after they met, Freud canceled all his appointments for the day, and they talked for 13 hours straight, such was the impact of the meeting of these two great minds! Freud eventually came to see Jung as the crown prince of psychoanalysis and his heir apparent.

But Jung had never been entirely sold on Freud's theory. Their relationship began to cool in 1909, during a trip to America. They were entertaining themselves by analyzing each others' dreams (more fun, apparently, than shuffleboard), when Freud seemed to show an excess of resistance to Jung's efforts at analysis. Freud finally said that they'd have to stop because he was afraid he would lose his authority! Jung felt rather insulted.

World War I was a painful period of self-examination for Jung. It was, however, also the beginning of one of the most interesting theories of personality the world has ever seen.

After the war, Jung traveled widely, visiting, for example, tribal people in Africa, America, and India. He retired in 1946, and began to retreat from public attention after his wife died in 1955. He died on June 6, 1961, in Zurich.

Ego, personal unconscious, and collective unconscious

Jung's theory divides the psyche into three parts. The first is the ego, which Jung identifies with the conscious mind. Closely related is the personal unconscious, which includes anything which is not presently conscious, but can be. The personal unconscious is like most people's understanding of the unconscious in that it includes both memories that are easily brought to mind and those that have been suppressed for some reason. But it does not include the instincts that Freud would have it include.

But then Jung adds the part of the psyche that makes his theory stand out from all others: the collective unconscious. You could call it your "psychic inheritance." It is the reservoir of our experiences as a species, a kind of knowledge we are all born with. And yet we can never be directly conscious of it. It influences all of our experiences and behaviors, most especially the emotional ones, but we only know about it indirectly, by looking at those influences.

There are some experiences that show the effects of the collective unconscious more clearly than others: The experiences of love at first sight, of deja vu (the feeling that you've been here before), and the immediate recognition of certain symbols and the meanings of certain myths, could all be understood as the sudden conjunction of our outer reality and the inner reality of the collective unconscious. Grander examples are the creative experiences shared by artists and musicians all over the world and in all times, or the spiritual experiences of mystics of all religions, or the parallels in dreams, fantasies, mythologies, fairy tales, and literature.

A nice example that has been greatly discussed recently is the near-death experience. It seems that many people, of many different cultural backgrounds, find that they have very similar recollections when they are brought back from a close encounter with death. They speak of leaving their bodies, seeing their bodies and the events surrounding them clearly, of being pulled through a long tunnel towards a bright light, of seeing deceased relatives or religious figures waiting for them, and of their disappointment at having to leave this happy scene to return to their bodies. Perhaps we are all "built" to experience death in this fashion.

Archetypes

The contents of the collective unconscious are called archetypes. Jung also called them dominants, imagos, mythological or primordial images, and a few other names, but archetypes seems to have won out over these. An archetype is an unlearned tendency to experience things in a certain way.


The archetype has no form of its own, but it acts as an "organizing principle" on the things we see or do. It works the way that instincts work in Freud's theory: At first, the baby just wants something to eat, without knowing what it wants. It has a rather indefinite yearning which, nevertheless, can be satisfied by some things and not by others. Later, with experience, the child begins to yearn for something more specific when it is hungry – a bottle, a cookie, a broiled lobster, a slice of New York style pizza.

The archetype is like a black hole in space: You only know it's there by how it draws matter and light to itself.

The mother archetype

The mother archetype is a particularly good example. All of our ancestors had mothers. We have evolved in an environment that included a mother or mother-substitute. We would never have survived without our connection with a nurturing-one during our times as helpless infants. It stands to reason that we are "built" in a way that reflects that evolutionary environment: We come into this world ready to want mother, to seek her, to recognize her, to deal with her.

So the mother archetype is our built-in ability to recognize a certain relationship, that of "mothering." Jung says that this is rather abstract, and we are likely to project the archetype out into the world and onto a particular person, usually our own mothers. Even when an archetype doesn't have a particular real person available, we tend to personify the archetype, that is, turn it into a mythological "story-book" character. This character symbolizes the archetype.

The mother archetype is symbolized by the primordial mother or "earth mother" of mythology, by Eve and Mary in western traditions, and by less personal symbols such as the church, the nation, a forest, or the ocean. According to Jung, someone whose own mother failed to satisfy the demands of the archetype may well be one that spends his or her life seeking comfort in the church, or in identification with "the motherland," or in meditating upon the figure of Mary, or in a life at sea.

Of the more important archetypes, we have the shadow, which represents our animal ancestry and is often the locus of our concerns with evil and our own "dark side;" there's the anima, representing the female side of men, and the animus, representing the male side of women; and the persona, which is the surface self, that part of us we allow others to see.

Other archetypes include father, child, family, hero, maiden, animal, wise old man, the hermaphrodite, God, and the first man.

The self

The goal of life is to realize the self. The self is an archetype that represents the transcendence of all opposites, so that every aspect of your personality is expressed equally. You are then neither and both male and female, neither and both ego and shadow, neither and both good and bad, neither and both conscious and unconscious, neither and both an individual and the whole of creation. And yet, with no oppositions, there is no energy, and you cease to act. Of course, you no longer need to act.

To keep it from getting too mystical, think of it as a new center, a more balanced position, for your psyche. When you are young, you focus on the ego and worry about the trivialities of the persona. When you are older (assuming you have been developing as you should), you focus a little deeper, on the self, and become closer to all people, all life, even the universe itself. The self-realized person is actually less selfish.

The Myers-Briggs test

Katharine Briggs and her daughter Isabel Briggs Myers found Jung's ideas about people's personalities so compelling that they decided to develop a paper-and-pencil test. It came to be called the Myers-Briggs Type Indicator, and is one of the most popular, and most studied, tests around.

On the basis of your answers on about 125 questions, you are placed in one of sixteen types, with the understanding that some people might find themselves somewhere between two or three types. What type you are says quite a bit about you – your likes and dislikes, your likely career choices, your compatibility with others, and so on. People tend to like it quite a bit. It has the unusual quality among personality tests of not being too judgmental: None of the types is terribly negative, nor are any overly positive. Rather than assessing how "crazy" you are, the "Myers-Briggs" simply opens up your personality for exploration.

The test has four scales. Extroversion - Introversion (E-I) is the most important. Test researchers have found that about 75% of the population is extroverted.

The next one is Sensing - Intuiting (S-N), with about 75% of the population sensing.

The next is Thinking - Feeling (T-F). Although these are distributed evenly through the population, researchers have found that two-thirds of men are thinkers, while two-thirds of women are feelers. This might seem like stereotyping, but keep in mind that feeling and thinking are both valued equally by Jungians, and that one-third of men are feelers and one-third of women are thinkers. Note, though, that society does value thinking and feeling differently, and that feeling men and thinking women often have difficulties dealing with people's stereotyped expectations.

The last is Judging - Perceiving (J-P), not one of Jung's original dimensions. Myers and Briggs included this one in order to help determine which of a person's functions is superior. Generally, judging people are more careful, perhaps inhibited, in their lives. Perceiving people tend to be more spontaneous, sometimes careless. If you are an extrovert and a "J," you are a thinker or feeler, whichever is stronger. Extroverted and "P" means you are a senser or intuiter. On the other hand, an introvert with a high "J" score will be a senser or intuiter, while an introvert with a high "P" score will be a thinker or feeler. J and P are equally distributed in the population.
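
Read as a rule, the J-P scale acts as a selector over the other scales: it tells you whether the judging pair (T-F) or the perceiving pair (S-N) supplies the dominant function. Here is a minimal sketch of that selection logic in Python; the function name and one-letter codes are conventions invented for this illustration, not part of the instrument itself:

    def dominant_function(attitude, jp, judging_fn, perceiving_fn):
        """Pick the superior function, following the rule described above.

        attitude:      'E' (extrovert) or 'I' (introvert)
        jp:            'J' (judging) or 'P' (perceiving)
        judging_fn:    the stronger judging function, 'T' or 'F'
        perceiving_fn: the stronger perceiving function, 'S' or 'N'
        """
        if attitude == 'E':
            # Extrovert + J -> thinker or feeler; extrovert + P -> senser or intuiter.
            return judging_fn if jp == 'J' else perceiving_fn
        # For introverts, the rule is reversed.
        return perceiving_fn if jp == 'J' else judging_fn

    # For example, an extroverted "J" whose thinking outscores feeling:
    print(dominant_function('E', 'J', judging_fn='T', perceiving_fn='N'))  # prints T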

Discussion

Quite a few people find that Jung has a great deal to say to them. They include writers, artists, musicians, film makers, theologians, clergy of all denominations, students of mythology, and, of course, some psychologists. Examples that come to mind are the mythologist Joseph Campbell, the film maker George Lucas, and the science fiction author Ursula K. Le Guin. Anyone interested in creativity, spirituality, psychic phenomena, the universal, and so on will find in Jung a kindred spirit.

But scientists, including most psychologists, have a lot of trouble with Jung. Not only does he fully support the teleological view (as do most personality theorists), but he goes a step further and talks about the mystical interconnectedness of synchronicity. Not only does he postulate an unconscious, where things are not easily available to the empirical eye, but he postulates a collective unconscious that never has been and never will be conscious.

In fact, Jung takes an approach that is essentially the reverse of the mainstream's reductionism: Jung begins with the highest levels – even spiritualism – and derives the lower levels of psychology and physiology from them.

Even psychologists who applaud his teleology and antireductionist position may not be comfortable with him. Like Freud, Jung tries to bring everything into his system. He has little room for chance, accident, or circumstances. Personality – and life in general – seems "over-explained" in Jung's theory.

I have found that his theory sometimes attracts students who have difficulty dealing with reality. When the world, especially the social world, becomes too difficult, some people retreat into fantasy. Some, for example, become couch potatoes. But others turn to complex ideologies that pretend to explain everything. Some get involved in Gnostic or Tantric religions, the kind that present intricate rosters of angels and demons and heavens and hells, and endlessly discuss symbols. Some go to Jung. There is nothing intrinsically wrong with this; but for someone who is out of touch with reality, this is hardly going to help.

These criticisms do not cut the foundation out from under Jung's theory. But they do suggest that some careful consideration is in order.

Alfred Adler

Alfred Adler was born in the suburbs of Vienna on February 7, 1870, the third child, second son, of a Jewish grain merchant and his wife. As a child, Alfred developed rickets, which kept him from walking until he was four years old. At five, he nearly died of pneumonia. It was at this age that he decided to be a physician.

Alfred was an average student and preferred playing outdoors to being cooped up in school. He was quite outgoing, popular, and active, and was known for his efforts at outdoing his older brother, Sigmund.

He received a medical degree from the University of Vienna in 1895. During his college years, he became attached to a group of socialist students, among whom he found his wife-to-be, Raissa Timofeyewna Epstein. She was an intellectual and social activist who had come from Russia to study in Vienna. They married in 1897 and eventually had four children, two of whom became psychiatrists.

He began his medical career as an ophthalmologist, but he soon switched to general practice, and established his office in a lower-class part of Vienna, across from the Prater, a combination amusement park and circus. His clients included circus people, and it has been suggested (Furtmuller, 1964) that the unusual strengths and weaknesses of the performers led to his insights into organ inferiorities and compensation.

He then turned to psychiatry, and in 1907 was invited to join Freud's discussion group. After writing papers on organic inferiority, which were quite compatible with Freud's views, he wrote, first, a paper concerning an aggression instinct, which Freud did not approve of, and then a paper on children's feelings of inferiority, which suggested that Freud's sexual notions be taken more metaphorically than literally.

Although Freud named Adler the president of the Viennese Analytic Society and the co-editor of the organization's newsletter, Adler didn't stop his criticism. A debate between Adler's supporters and Freud's was arranged, but it resulted in Adler, with nine other members of the organization, resigning to form the Society for Free Psychoanalysis in 1911. This organization became The Society for Individual Psychology in the following year.

During World War I, Adler served as a physician in the Austrian Army, first on the Russian front, and later in a children's hospital. He saw firsthand the damage that war does, and his thought turned increasingly to the concept of social interest. He felt that if humanity was to survive, it had to change its ways!

After the war, he was involved in various projects, including clinics attached to state schools and the training of teachers. In 1926, he went to the United States to lecture, and he eventually accepted a visiting position at the Long Island College of Medicine. In 1934, he and his family left Vienna forever. On May 28, 1937, during a series of lectures at Aberdeen University, he died of a heart attack.


Striving

Alfred Adler postulates a single "drive" or motivating force behind all our behavior and experience. By the time his theory had gelled into its most mature form, he called that motivating force the striving for perfection. It is the desire we all have to fulfill our potentials, to come closer and closer to our ideal. It is, as many of you will already see, very similar to the more popular idea of self-actualization.

"Perfection" and "ideal" are troublesome words, though. On the one hand, they are very positive goals. Shouldn't we all be striving for the ideal? And yet, in psychology, they are often given a rather negative connotation. Perfection and ideals are, practically by definition, things you can't reach. Many people, in fact, live very sad and painful lives trying to be perfect! As you will see, other theorists, like Karen Horney and Carl Rogers, emphasize this problem. Adler talks about it, too. But he sees this negative kind of idealism as a perversion of the more positive understanding. We will return to this in a little while.

Striving for perfection was not the first phrase Adler used to refer to his single motivating force. His earliest phrase was the aggression drive, referring to the reaction we have when other drives, such as our need to eat, be sexually satisfied, get things done, or be loved, are frustrated. It might be better called the assertiveness drive, since we tend to think of aggression as physical and negative. But it was Adler's idea of the aggression drive that first caused friction between him and Freud. Freud was afraid that it would detract from the crucial position of the sex drive in psychoanalytic theory. Despite Freud's dislike for the idea, he himself introduced something very similar much later in his life: the death instinct.

Another word Adler used to refer to basic motivation was compensation, or striving to overcome. Since we all have problems, shortcomings, inferiorities of one sort or another, Adler felt, earlier in his writing, that our personalities could be accounted for by the ways in which we do – or don't – compensate or overcome those problems. The idea still plays an important role in his theory, as you will see, but he rejected it as a label for the basic motive because it makes it sound as if it is your problems that cause you to be what you are.

One of Adler's earliest phrases was masculine protest. He noted something pretty obvious in his culture (and by no means absent from our own): Boys were held in higher esteem than girls. Boys wanted, often desperately, to be thought of as strong, aggressive, in control – i.e. "masculine" – and not weak, passive, or dependent – i.e. "feminine." The point, of course, was that men are somehow basically better than women. They do, after all, have the power, the education, and apparently the talent and motivation needed to do "great things," and women don't.

You can still hear this in the kinds of comments older people make about little boys and girls: if a baby boy fusses or demands to have his own way (masculine protest!), they will say he's a natural boy; if a little girl is quiet and shy, she is praised for her femininity; if, on the other hand, the boy is quiet and shy, they worry that he might grow up to be a sissy; or if a girl is assertive and gets her way, they call her a "tomboy" and will try to reassure you that she'll grow out of it!

But Adler did not see men's assertiveness and success in the world as due to some innate superiority. He saw it as a reflection of the fact that boys are encouraged to be assertive in life, and girls are discouraged. Both boys and girls, however, begin life with the capacity for "protest!" The fact that so many people misunderstood him to mean that men are, innately, more assertive led him to limit his use of the phrase.

The last phrase he used, before switching to striving for perfection, was striving for superiority. His use of this phrase reflects one of the philosophical roots of his ideas: Friedrich Nietzsche developed a philosophy that considered the will to power the basic motive of human life. Although striving for superiority does refer to the desire to be better, it also contains the idea that we want to be better than others, rather than better in our own right. Adler later tended to use striving for superiority more in reference to unhealthy or neurotic striving.


Life style

A lot of this playing with words reflects Adler's groping towards a really different kind of personality theory than that represented by Freud's. Freud's theory was what we nowadays would call a reductionistic one: He tried most of his life to get the concepts down to the physiological level. Although he admitted failure in the end, life is nevertheless explained in terms of basic physiological needs. In addition, Freud tended to "carve up" the person into smaller theoretical concepts – the id, ego, and superego – as well.

Adler was influenced by the writings of Jan Smuts, the South African philosopher and statesman. Smuts felt that, in order to understand people, we have to understand them more as unified wholes than as a collection of bits and pieces, and we have to understand them in the context of their environment, both physical and social. This approach is called holism, and Adler took it very much to heart.

First, to reflect the idea that we should see people as wholes rather than parts, he decided to label his approach to psychology individual psychology. The word individual means literally "un-divided."

Second, instead of talking about a person's personality, with the traditional sense of internal traits, structures, dynamics, conflicts, and so on, he preferred to talk about style of life (nowadays, "lifestyle"). Life style refers to how you live your life, how you handle problems and interpersonal relations. Here's what he himself had to say about it: "The style of life of a tree is the individuality of a tree expressing itself and molding itself in an environment. We recognize a style when we see it against a background of an environment different from what we expect, for then we realize that every tree has a life pattern and is not merely a mechanical reaction to the environment."

Teleology

The last point – that lifestyle is "not merely a mechanical reaction" – is a second way in which Adler differs dramatically from Freud. For Freud, the things that happened in the past, such as early childhood trauma, determine what you are like in the present. Adler sees motivation as a matter of moving towards the future, rather than being driven, mechanistically, by the past. We are drawn towards our goals, our purposes, our ideals. This is called teleology.

Moving things from the past into the future has some dramatic effects. Since the future is not here yet, a teleological approach to motivation takes the necessity out of things. In a traditional mechanistic approach, cause leads to effect: If a, b, and c happen, then x, y, and z must, of necessity, happen. But you don't have to reach your goals or meet your ideals, and they can change along the way. Teleology acknowledges that life is hard and uncertain, but it always has room for change!

Another major influence on Adler's thinking was the philosopher Hans Vaihinger, who wrote a book called The Philosophy of "As If." Vaihinger believed that ultimate truth would always be beyond us, but that, for practical purposes, we need to create partial truths. His main interest was science, so he gave as examples such partial truths as protons and electrons, waves of light, gravity as distortion of space, and so on. Contrary to what many of us non-scientists tend to assume, these are not things that anyone has seen or proven to exist: They are useful constructs. They work for the moment, let us do science, and hopefully will lead to better, more useful constructs. We use them "as if" they were true. He called these partial truths fictions.

Vaihinger, and Adler, pointed out that we use these fictions in day-to-day living as well. We behave as if we knew the world would be here tomorrow, as if we were sure what good and bad are all about, as if everything we see is as we see it, and so on. Adler called this fictional finalism. You can understand the phrase most easily if you think about an example: Many people behave as if there were a heaven or a hell in their personal future. Of course, there may be a heaven or a hell, but most of us don't think of this as a proven fact. That makes it a "fiction" in Vaihinger's and Adler's sense of the word. And finalism refers to the teleology of it: The fiction lies in the future, and yet influences our behavior today.

Adler added that, at the center of each of our lifestyles, there sits one of these fictions, an important one about who we are and where we are going.


Discussion

Criticisms of Adler tend to involve the issue of whether or not, or to what degree, his theory is scientific. The mainstream of psychology today is experimentally oriented, which means, among other things, that the concepts a theory uses must be measurable and manipulable. This in turn means that an experimental orientation prefers physical or behavioral variables. Adler, as you saw, uses basic concepts that are far from physical and behavioral: Striving for perfection? How do you measure that? Or compensation? Or feelings of inferiority? Or social interest? The experimental method also makes a basic assumption: that all things operate in terms of cause and effect. Adler would certainly agree that physical things do so, but he would adamantly deny that people do! Instead, he takes the teleological route, that people are "determined" by their ideals, goals, values, "final fictions." Teleology takes the necessity out of things: a person doesn't have to respond a certain way to a certain circumstance; a person has choices to make; a person creates his or her own personality or lifestyle. From the experimental perspective, these things are illusions that a scientist, even a personality theorist, dare not give in to.

There would be many more psychiatrists and psychoanalysts and other therapists going by other titles. It is impossible to overemphasize the impact that these gentlemen, especially Freud himself, would have on psychology, and in particular on clinical psychology. Following and offering their own slants on the issues would be Anna Freud, Heinz Hartman, Erik Erikson, Otto Rank, Sandor Ferenczi, Karen Horney, Erich Fromm, Harry Stack Sullivan, Henry Murray, Gordon Allport, Gardner Murphy, George Kelly, Carl Rogers, Ludwig Binswanger, and many, many more.


Sigmund Freud Selection: "The Structure of the Unconscious" *

Conscious, Unconscious, Preconscious

The starting point for this investigation is provided by a fact without parallel, which defies all explanation or description–the fact of consciousness. Nevertheless, if anyone speaks of consciousness, we know immediately and from our own most personal experience what is meant by it. Many people, both inside and outside the science of psychology, are satisfied with the assumption that consciousness alone is mental, and nothing then remains for psychology but to discriminate in the phenomenology of the mind between perceptions, feelings, intellective processes and volitions. It is generally agreed, however, that these conscious processes do not form unbroken series which are complete in themselves; so that there is no alternative to assuming that there are physical or somatic processes which accompany the mental ones and which must admittedly be more complete than the mental series, since some of them have conscious processes parallel to them but others have not. It thus seems natural to lay the stress in psychology upon these somatic processes, to see in them the true essence of what is mental and to try to arrive at some other assessment of the conscious processes. The majority of philosophers, however, as well as many other people, dispute this position and declare that the notion of a mental thing being unconscious is self-contradictory.

But it is precisely this that psychoanalysis is obliged to assert, and this is its second fundamental hypothesis. It explains the supposed somatic accessory processes as being what is essentially mental and disregards for the moment the quality of consciousness....

We are soon led to make an important division in this unconscious. Some processes become conscious easily; they may then cease to be conscious, but can become conscious once more without any trouble: as people say, they can be reproduced or remembered. This reminds us that consciousness is in general a very highly fugitive condition. What is conscious is conscious only for a moment. If our perceptions do not confirm this, the contradiction is merely an apparent one. It is explained by the fact that the stimuli of perception can persist for some time so that in the course of it the perception of them can be repeated. The whole position can be clearly seen from the conscious perception of our intellective processes; it is true that these may persist, but they may just as easily pass in a flash. Everything unconscious that behaves in this way, that can easily exchange the unconscious condition for the conscious one, is therefore better described as "capable of entering consciousness," or as preconscious. Experience has taught us that there are hardly any mental processes, even of the most complicated kind, which cannot on occasion remain preconscious, although as a rule they press forward, as we say, into consciousness. There are other mental processes or mental material which have no such easy access to consciousness, but which must be inferred, discovered, and translated into conscious form in the manner that has been described. It is for such material that we reserve the name of the unconscious proper.

Thus we have attributed three qualities to mental processes: they are either conscious, preconscious, or unconscious. The division between the three classes of material which have these qualities is neither absolute nor permanent. What is preconscious becomes conscious, as we have seen, without any activity on our part; what is unconscious can, as a result of our efforts, be made conscious, though in the process we may have an impression that we are overcoming what are often very strong resistances. When we make an attempt of this kind upon someone else, we ought not to forget that the conscious filling up of the breaks in his perceptions–the construction which we are offering him–does not so far mean that we have made conscious in him the unconscious material in question. All that is so far true is that the material is present in his mind in two versions, first in the conscious reconstruction that he has just received and secondly in its original unconscious condition.

* From An Outline of Psychoanalysis [1940], translated by James Strachey. N.Y.: Norton. And from New Introductory Lectures on Psychoanalysis [1933], translated by W. J. H. Sprott. N.Y.: Norton.


Id, Ego, Super-ego

[The id is] . . . a chaos, a cauldron of seething excitement. We suppose that it is somewhere in direct contact with somatic processes, and takes over from them instinctual needs and gives them mental expression, but we cannot say in what substratum this contact is made. These instincts fill it with energy, but it has no organisation and no unified will, only an impulsion to obtain satisfaction for the instinctual needs, in accordance with the pleasure-principle. The laws of logic – above all, the law of contradiction – do not hold for processes in the id. Contradictory impulses exist side by side without neutralising each other or drawing apart; at most they combine in compromise formations under the overpowering economic pressure towards discharging their energy. There is nothing in the id which can be compared to negation, and we are astonished to find in it an exception to the philosophers' assertion that space and time are necessary forms of our mental acts. In the id there is nothing corresponding to the idea of time, no recognition of the passage of time, and (a thing which is very remarkable and awaits adequate attention in philosophic thought) no alteration of mental processes by the passage of time. Conative impulses which have never got beyond the id, and even impressions which have been pushed down into the id by repression, are virtually immortal and are preserved for whole decades as though they had only recently occurred. They can only be recognised as belonging to the past, deprived of their significance, and robbed of their charge of energy, after they have been made conscious by the work of analysis, and no small part of the therapeutic effect of analytic treatment rests upon this fact.

It is constantly being borne in upon me that we have made far too little use of our theory of the indubitable fact that the repressed remains unaltered by the passage of time. This seems to offer us the possibility of an approach to some really profound truths. But I myself have made no further progress here.

Naturally, the id knows no values, no good and evil, no morality. The economic, or, if you prefer, the quantitative factor, which is so closely bound up with the pleasure-principle, dominates all its processes. Instinctual cathexes seeking discharge – that, in our view, is all that the id contains. It seems, indeed, as if the energy of these instinctual impulses is in a different condition from that in which it is found in the other regions of the mind. It must be far more fluid and more capable of being discharged, for otherwise we should not have those displacements and condensations, which are so characteristic of the id and which are so completely independent of the qualities of what is cathected....

As regards a characterization of the ego, in so far as it is to be distinguished from the id and the super-ego, we shall get on better if we turn our attention to the relation between it and the most superficial portion of the mental apparatus; which we call the Pcpt-cs (perceptual-conscious) system. This system is directed on to the external world, it mediates perceptions of it, and in it is generated, while it is functioning, the phenomenon of consciousness. It is the sense-organ of the whole apparatus, receptive, moreover, not only of excitations from without but also of such as proceed from the interior of the mind. One can hardly go wrong in regarding the ego as that part of the id which has been modified by its proximity to the external world and the influence that the latter has had on it, and which serves the purpose of receiving stimuli and protecting the organism from them, like the cortical layer with which a particle of living substance surrounds itself. This relation to the external world is decisive for the ego. The ego has taken over the task of representing the external world for the id, and so of saving it; for the id, blindly striving to gratify its instincts in complete disregard of the superior strength of outside forces, could not otherwise escape annihilation. In the fulfilment of this function, the ego has to observe the external world and preserve a true picture of it in the memory traces left by its perceptions, and, by means of the reality-test, it has to eliminate any element in this picture of the external world which is a contribution from internal sources of excitation. On behalf of the id, the ego controls the path of access to motility, but it interpolates between desire and action the procrastinating factor of thought, during which it makes use of the residues of experience stored up in memory. In this way it dethrones the pleasure-principle, which exerts undisputed sway over the processes in the id, and substitutes for it the reality-principle, which promises greater security and greater success.

The relation to time, too, which is so hard to describe, is communicated to the ego by the perceptual system; indeed it can hardly be doubted that the mode in which this system works is the source of the idea of time. What, however, especially marks the ego out in contradistinction to the id, is a tendency to synthesise its contents, to bring together and unify its mental processes which is entirely absent from the id. When we come to deal presently with the instincts in mental life, I hope we shall succeed in tracing this fundamental characteristic of the ego to its source. It is this alone that produces that high degree of organisation which the ego needs for its highest achievements. The ego advances from the function of perceiving instincts to that of controlling them, but the latter is only achieved through the mental representative of the instinct becoming subordinated to a larger organisation, and finding its place in a coherent unity. In popular language, we may say that the ego stands for reason and circumspection, while the id stands for the untamed passions....

The proverb tells us that one cannot serve two masters at once. The poor ego has a still harder time of it; it has to serve three harsh masters, and has to do its best to reconcile the claims and demands of all three. These demands are always divergent and often seem quite incompatible; no wonder that the ego so frequently gives way under its task. The three tyrants are the external world, the super-ego and the id. When one watches the efforts of the ego to satisfy them all, or rather, to obey them all simultaneously, one cannot regret having personified the ego, and established it as a separate being. It feels itself hemmed in on three sides and threatened by three kinds of danger, towards which it reacts by developing anxiety when it is too hard pressed. Having originated in the experiences of the perceptual system, it is designed to represent the demands of the external world, but it also wishes to be a loyal servant of the id, to remain upon good terms with the id, to recommend itself to the id as an object, and to draw the id's libido on to itself. In its attempt to mediate between the id and reality, it is often forced to clothe the Ucs. commands of the id with its own Pcs. rationalisations, to gloss over the conflicts between the id and reality, and with diplomatic dishonesty to display a pretended regard for reality, even when the id persists in being stubborn and uncompromising. On the other hand, its every movement is watched by the severe super-ego, which holds up certain norms of behaviour, without regard to any difficulties coming from the id and the external world; and if these norms are not acted up to, it punishes the ego with the feelings of tension which manifest themselves as a sense of inferiority and guilt. In this way, goaded on by the id, hemmed in by the super-ego, and rebuffed by reality, the ego struggles to cope with its economic task of reducing the forces and influences which work in it and upon it to some kind of harmony; and we may well understand how it is that we so often cannot repress the cry: "Life is not easy." When the ego is forced to acknowledge its weakness, it breaks out into anxiety: reality anxiety in face of the external world, normal anxiety in face of the super-ego, and neurotic anxiety in face of the strength of the passions in the id.

I have represented the structural relations within the mental personality, as I have explained them to you, in a simple diagram, which I here reproduce.

You will observe how the super-ego goes down into the id; as the heir to the Oedipus complex it has, after all, intimate connections with the id. It lies further from the perceptual system than the ego. The id only deals with the external world through the medium of the ego, at least in this diagram. It is certainly still too early to say how far the drawing is correct; in one respect I know it is not. The space taken up by the unconscious id ought to be incomparably greater than that given to the ego or to the preconscious. You must, if you please, correct that in your imagination.

And now, in concluding this certainly rather exhausting and perhaps not very illuminating account, I must add a warning. When you think of this dividing up of the personality into ego, super-ego and id, you must not imagine sharp dividing lines such as are artificially drawn in the field of political geography. We cannot do justice to the characteristics of the mind by means of linear contours, such as occur in a drawing or in a primitive painting, but we need rather the areas of colour shading off into one another that are to be found in modern pictures. After we have made our separations, we must allow what we have separated to merge again. Do not judge too harshly of a first attempt at picturing a thing so elusive as the human mind. It is very probable that the extent of these differentiations varies very greatly from person to person; it is possible that their function itself may vary, and that they may at times undergo a process of involution. This seems to be particularly true of the most insecure and, from the phylogenetic point of view, the most recent of them, the differentiation between the ego and the superego. It is also incontestable that the same thing can come about as a result of mental disease. It can easily be imagined, too, that certain practices of mystics may succeed in upsetting the normal relations between the different regions of the mind, so that, for example, the perceptual system becomes able to grasp relations in the deeper layers of the ego and in the id which would otherwise be inaccessible to it. Whether such a procedure can put one in possession of ultimate truths, from which all good will flow, may be safely doubted. All the same, we must admit that the therapeutic efforts of psycho-analysis have chosen much the same method of approach. For their object is to strengthen the ego, to make it more independent of the super-ego, to widen its field of vision, and so to extend its organisation that it can take over new portions of the id. Where id was, there shall ego be.

It is reclamation work, like the draining of the Zuyder Zee.


Behaviorism


Behaviorism is the philosophical position that says that psychology, to be a science, must focus its attentions on what is observable – the environment and behavior – rather than what is only available to the individual – perceptions, thoughts, images, feelings.... The latter are subjective and immune to measurement, and therefore can never lead to an objective science.

The first behaviorists were Russian. The very first was Ivan M. Sechenov (1829 to 1905). He was a physiologist who had studied at the University of Berlin with famous people like Müller, DuBois-Reymond, and Helmholtz. Devoted to a rigorous blend of associationism and materialism, he concluded that all behavior is caused by stimulation.

In 1863, he wrote Reflexes of the Brain. In this landmark book, he introduced the idea that there are not only excitatory processes in the central nervous system, but inhibitory ones as well.

Vladimir M. Bekhterev (1857 to 1927) is another early Russian behaviorist. He graduated in 1878 from the Military Medical Academy in St. Petersburg, one year before Pavlov arrived there. He received his MD in 1881 at the tender age of 24, then went to study with the likes of DuBois-Reymond in Berlin, Wundt in Leipzig, and Charcot in France.

He established the first psychology lab in Russia at the University of Kazan in 1885, then returned to the Military Medical Academy in 1893. In 1904, he published a paper entitled "Objective Psychology," which he later expanded into three volumes.

He called his field reflexology, and defined it as the objective study of stimulus-response connections. Only the environment and behavior were to be discussed! And he discovered what he called the association reflex – what Pavlov would call the conditioned reflex.

Ivan Pavlov

Which brings us to the most famous of the Russian researchers, Ivan Petrovich Pavlov (1849-1936). After studying for the priesthood, as had his father, he switched to medicine in 1870 at the Military Medical Academy in St. Petersburg. It should be noted that he walked from his home in Ryazan near Moscow hundreds of miles to St. Petersburg!

In 1879, he received his degree in natural science, and in 1883, his MD. He then went to study at the University of Leipzig in Germany. In 1890, he was offered a position as professor of physiology at his alma mater, the Military Medical Academy, which is where he spent the rest of his life. It was in 1900 that he began studying reflexes, especially the salivary response.

In 1904, he was awarded the Nobel Prize in physiology for his work on digestion, and in 1921, he received the Hero of the Revolution Award from Lenin himself.

Pavlovian (or classical) conditioning builds on reflexes: We begin with an unconditioned stimulus and an unconditioned response – a reflex! We then associate a neutral stimulus with the reflex by presenting it with the unconditioned stimulus. Over a number of repetitions, the neutral stimulus by itself will elicit the response! At this point, the neutral stimulus is renamed the conditioned stimulus, and the response is called the conditioned response.

Or, to put it in the form that Pavlov observed in his dogs, some meat powder on the tongue makes a dog salivate. Ring a bell at the same time, and after a few repetitions, the dog will salivate upon hearing the bell alone – without being given the meat powder!

Pavlov agreed with Sechenov that there was inhibition as well as excitation. When the bell is rung many times with no meat forthcoming, the dog eventually stops salivating at the sound of the bell. That’s extinction. But, just give him a little meat powder once, and it is as if he had never had the behavior extinguished: He is right back to salivating to the bell. This spontaneous recovery strongly suggests that the habit has been there all along. The dog had simply learned to inhibit his response.
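
Pavlov expressed none of this mathematically, but the point about inhibition can be made concrete with a toy model: track excitation and inhibition as separate quantities, let unpaired trials build inhibition rather than erase the habit, and let a single reinforced trial cancel the inhibition. A hedged sketch in Python – the linear update rule and all the numbers are invented for illustration, not Pavlov's own formalism:

    def trial(excitation, inhibition, reinforced, rate=0.3):
        """One conditioning trial; returns updated (excitation, inhibition)."""
        if reinforced:                        # bell followed by meat powder
            excitation += rate * (1.0 - excitation)
            inhibition = 0.0                  # reinforcement wipes out inhibition
        else:                                 # bell alone (extinction trial)
            inhibition += rate * (excitation - inhibition)
        return excitation, inhibition

    exc, inh = 0.0, 0.0
    for _ in range(10):                       # acquisition: salivation appears
        exc, inh = trial(exc, inh, reinforced=True)
    for _ in range(10):                       # extinction: salivation fades
        exc, inh = trial(exc, inh, reinforced=False)
    print(round(exc - inh, 2))                # net response near zero
    exc, inh = trial(exc, inh, reinforced=True)
    print(round(exc - inh, 2))                # one pairing: right back up

Because extinction only built up inhibition, the original habit was still there all along – which is exactly what the single-pairing recovery in the last two lines illustrates.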

Pavlov, of course, could therefore condition not only excitation but inhibition. You can teach a dog that he is NOT getting meat just as easily as you can teach him that he IS getting meat. For example, one bell could mean dinner, and another could mean dinner is over. If the bells, however, were too similar, or were rung simultaneously, many dogs would have something akin to a nervous breakdown, which Pavlov called an experimental neurosis.

In fact, Pavlov classified his dogs into four different personalities, à la the ancient Greeks: Dogs that got angry were choleric, ones that fell asleep were phlegmatic, ones that whined were melancholy, and the few that kept their spirits up were sanguine! The relative strengths of the dogs’ abilities to activate their nervous system and calm it back down (excitation and inhibition) were the explanations. These explanations would be used later by Hans Eysenck to understand the differences between introverts and extraverts!

Another set of terms that comes from Pavlov are the first and second signal systems. The first signal system is where the conditioned stimulus (a bell) acts as a "signal" that an important event is to occur – i.e. the unconditioned stimulus (the meat). The second signal system is when arbitrary symbols come to stand for stimuli, as they do in human language.

Edward Lee Thorndike

Over in America, things were happening as well. Edward Lee Thorndike, although technically a functionalist, was setting the stage for an American version of Russian behaviorism. Thorndike (1874-1949) got his bachelor's degree from Wesleyan University in Connecticut in 1895 and his master's from Harvard in 1897. While there he took a class from William James and they became fast friends. He received a fellowship at Columbia, and got his PhD in 1898. He stayed to teach at Columbia until he retired in 1940.

He will always be remembered for his cats and his poorly constructed "puzzle boxes." These boxes had escape mechanisms of various complexities that required the cats to do several behaviors in sequence. From this research, he concluded that there were two laws of learning:

1. The law of exercise, which is basically the same as Aristotle’s law of frequency. The more often an association (or neural connection) is used, the stronger the connection. Naturally, the less it is used, the weaker the connection. These two halves were referred to as the laws of use and disuse, respectively.

2. The law of effect. When an association is followed by a "satisfying state of affairs," the connection is strengthened. And, likewise, when an association is followed by an unsatisfying state of affairs, it is weakened. Except for the mentalistic language ("satisfying" is not behavioral!), it is the same thing as Skinner’s operant conditioning.
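
The law of effect lends itself to a small simulation. In this hypothetical puzzle box (the behavior names and increments are invented for illustration, not Thorndike's data), the cat emits behaviors in proportion to their connection strengths; the one followed by escape is strengthened, the others weakened:

    import random

    strengths = {"scratch": 1.0, "meow": 1.0, "pull_loop": 1.0}
    ESCAPE = "pull_loop"                      # the behavior that opens the box

    def run_trial():
        """One trial: the cat tries behaviors until one opens the box."""
        while True:
            behavior = random.choices(list(strengths),
                                      weights=list(strengths.values()))[0]
            if behavior == ESCAPE:
                strengths[behavior] += 0.5    # "satisfying state of affairs"
                return
            # "unsatisfying state of affairs": weaken the connection
            strengths[behavior] = max(0.1, strengths[behavior] - 0.05)

    for _ in range(100):
        run_trial()
    print(strengths)                          # pull_loop now dominates

Run long enough, the escape response is "stamped in" gradually – the smooth learning curves Thorndike observed, rather than any sudden flash of insight.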

In 1929, his research led him to abandon all of the above except what we would now call reinforcement (the first half of law 2).

He is also known for his study of transfer of training. It was believed back then (and is still often believed) that studying difficult subjects – even if you would never use them – was good for you because it "strengthened" your mind, sort of like exercise strengthens your muscles. It was used back then to justify making kids learn Latin, just like it is used today to justify making kids learn calculus. He found, however, that it was only the similarity of the second subject to the first that leads to improved learning in the second subject. So Latin may help you learn Italian, or algebra may help you learn calculus, but Latin won’t help you learn calculus, or the other way around.

John Broadus Watson

John Watson was born January 9, 1878 in a small town outside Greenville, South Carolina. He was brought up on a farm by a fundamentalist mother and a carousing father. When John was 12, they moved into the town of Greenville, but a year later his father left the family. John became a troublemaker and barely passed in school.

At 16, he began attending Furman University, also in Greenville, and he graduated at 22 with a master's degree. He then went on to the University of Chicago to study under John Dewey. He found Dewey "incomprehensible" and switched his interests from philosophy to psychology and neurophysiology. Dirt poor, he worked his way through graduate school by waiting tables, sweeping the psych lab, and feeding the rats.

In 1902 he suffered from a "nervous breakdown" which had been a long time coming. He had suffered from an intense fear of the dark since childhood – due to stories he had heard about the devil doing his work in the night – and this eventually deepened into depression.

Nevertheless, after some rest, he finished his PhD the following year, got an assistantship with his professor, the respected functionalist James Angell, and married a student in his intro psych class, Mary Ickes. They would go on to have two children. (The actress Mariette Hartley is his granddaughter.)

The following year, he was made an instructor. He developed a well-run animal lab where he worked with rats, monkeys, and terns. Johns Hopkins offered him a full professorship and a laboratory in 1908, when the previous professor was caught in a brothel.

In 1913, he wrote an article called "Psychology as a Behaviorist Views It" for Psychological Review. Here, he outlined the behaviorist program. This was followed the next year by the book Behavior: An Introduction to Comparative Psychology. In this book, he pushed the study of rats as a useful model for human behavior. Until then, rat research was not thought of as significant for understanding human beings. And, by 1915, he had absorbed Pavlov and Bekhterev’s work on conditioned reflexes, and incorporated that into his behaviorist package.

In 1917, he was drafted into the army, where he served until 1919. In that year, he came out with the book Psychology from the Standpoint of a Behaviorist – basically an expansion of his original article.

At this time, he expanded his lab work to include human infants. His best known experiment was conducted in 1920 with the help of his lab assistant Rosalie Rayner. "Little Albert," a nine-month-old child, was conditioned to fear a white rat by pairing it seven times with a loud noise made by hitting a steel bar with a hammer. His fear quickly generalized to a rabbit, a fur coat, a Santa Claus mask, and even cotton. Albert was never "deconditioned" because he and his mother moved away. It was clear, however, that the conditioning tended to disappear (extinguish) rather quickly, so we assume that Albert was soon over his fear. This suggests that conditioned fear is not really the same as a phobia. Later, another child, three-year-old Peter, was conditioned and then "de-conditioned" by gradually pairing his feared rabbit with milk and cookies and other positive things.

In that year, his affair with his lab assistant was revealed and his wife sued for divorce. The administration at Johns Hopkins asked him for his resignation. He immediately married Rosalie Rayner and began looking for business opportunities.

He soon found himself working for the J. Walter Thompson advertising agency. He worked in a great variety of positions within the company, and was made vice president in 1924. By all standards of the time, he was very successful and quite rich! He increased sales of such items as Pond’s cold cream, Maxwell House coffee, and Johnson’s baby powder, and is thought to have invented the slogan "LSMFT – Lucky Strike Means Fine Tobacco."

He published his book Behaviorism, designed for the average reader, in 1925, and revised it in 1930. This was his final statement of his position.

Psychology according to Watson is essentially the science of stimuli and responses. We begin with reflexes and, by means of conditioning, acquire learned responses. Brain processes are unimportant (he called the brain a "mystery box"). Emotions are bodily responses to stimuli. Thought is subvocal speech. Consciousness is nothing at all.

Most importantly, he denied the existence of any human instincts, inherited capacities or talents, and temperaments. This radical environmentalism is reflected in what is perhaps his best known quote:

Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I’ll guarantee to take any one at random and train him to become any type of specialist I might select – doctor, lawyer, artist, merchant-chief and, yes, even beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors. (In Behaviorism, 1930)

In addition to writing popular articles for McCall’s, Harper’s, Collier’s and other magazines, he published Psychological Care of the Infant and Child in 1928. Among other things, he saw parents as more likely than not to ruin their child’s healthy development, and argued particularly against too much hugging and other demonstrations of affection!

In 1936, he was hired as vice-president of another agency, William Esty and Company. He devoted himself to business until he retired ten years later. He died in New York City on September 25, 1958.

William McDougall

William McDougall doesn't belong in this chapter, really. But his dislike for Watson's brand of behaviorism and his efforts against it warrant his inclusion. He was born June 22, 1871 in Lancashire, England. He entered the University of Manchester at 15, and received his medical degree from St. Thomas's Hospital in London, in 1897. He usually referred to himself as an anthropologist, especially after a one-year Cambridge University expedition to visit the tribes of central Borneo.

From 1898, McDougall held lectureships in Cambridge and Oxford. His reputation developed in England with the publication of several texts, including Introduction to Social Psychology in 1908 and Body and Mind in 1911. In 1912, he was made a Fellow of the Royal Society.

During WWI, he served in the medical corps, treating soldiers suffering from "shell shock," what we now call post-traumatic stress disorder. After the war, he himself received therapy from Carl Jung!

He was offered a position as Professor of Psychology at Harvard in 1920. He considered himself a follower of William James, so he took this as a great honor. In that same year, he published The Group Mind, followed in 1923 by the Outline of Psychology.

In 1924, he participated in The Battle of Behaviorism (published in 1929). This was a debate with John Watson at the Psychology Club meeting in Washington DC that year. The audience narrowly voted McDougall the winner, but it would be Watson who would win the favor of American psychology for years to come!

McDougall resigned from his position as chair of Psychology at Harvard in 1926, and began teaching at Duke University in 1927. It should be noted that he had a particularly strong relationship with his wife, and in 1927 dedicated his book Character and the Conduct of Life to her with these words: "To my wife, to whose intuitive insight I owe whatever understanding of human nature I have acquired." He died in 1938.

McDougall was a hereditarian to the end, promoting a psychology based on instincts. He himself referred to his position as evolutionary psychology. Further, he was the leading critic of the behaviorism of his day. He particularly hated Watson's simplistic materialism.

McDougall was not well liked by his students or by his colleagues. The American press (notably the New York Times) was particularly antagonistic towards him. The reasons were clear: McDougall took his hereditarian position at a time when the environmental position ruled American psychology and popular opinion. He called himself a "democratic elitist" and considered a nation's intellectual aristocracy a treasure which should be protected. Further, he believed in the hereditary nature of group differences, both national and racial, and proposed the institution of eugenic programs. In his defense, however, he had no sympathy with Nazism and its version of eugenics!

McDougall has been largely forgotten – until recently, with genetics and evolutionary psychology on the rise.

McDougall saw instincts as having three components:

• perception – we pay attention to stimuli relevant to our instinctual purposes
• behavior – we perform actions that satisfy our instinctual purposes
• emotion – instincts have associated negative and positive emotions

Notice that instincts are purposive, i.e. goal-directed! This is not stimulus-response behaviorism!

Here is a list of instincts and accompanying emotions:

• escape – fear
• combat – anger
• repulsion – disgust
• parental (protective) – love, tenderness
• appeal for help – distress
• mating – lust
• curiosity – feeling of mystery
• submission – inferiority
• assertion – superiority
• gregariousness – loneliness
• food-seeking – appetite
• hoarding – greed
• construction – productivity
• laughter – amusement


Clark Hull

Clark Leonard Hull was born May 24, 1884 near Akron, New York, to a poor, rural family. He was educated in a one-room schoolhouse and even taught there one year, when he was only 17. While a student, he had a brush with death from typhoid fever.

He went on to Alma College in Michigan to study mining engineering. He had been working for a mining company for only two months when he developed polio. This forced him to look for a less strenuous career. For two years, he was principal of the same school he had gone to as a child – now consisting of two rooms! He read William James and saved up his money to go to the University of Michigan.

After graduating, he taught for a while, then went on to the University of Wisconsin. He got his PhD there in 1918, and stayed to teach until 1929. This was where his ideas on a behavioristic psychology were formed.

In 1929, he became a professor of psychology at Yale. In 1936, he was elected president of the APA. He published his masterwork, Principles of Behavior, in 1943. In 1948, he had a massive heart attack. Nevertheless, he managed to finish a second book, A Behavior System, in that same year. He died of a second heart attack May 10, 1952.

Hull’s theory is characterized by very strict operationalization of variables and a notoriously mathematical presentation. Here are the variables Hull looked at when conditioning rats:

Independent variables:

• S, the physical stimulus.
• Time of deprivation or the period and intensity of painful stimuli.
• G, the size and quality of the reinforcer.
• The time delay between the response and the reinforcer.
• The time between the conditioned and unconditioned stimulus.
• N, the number of trials.
• The amount of time the rat had been active.

The intervening variables:

• s, the stimulus trace.
• V, the stimulus intensity dynamism.
• D, the drive or primary motivation or need (dependent on deprivation, etc.).
• K, incentive motivation (dependent on the amount or quality of reinforcer).
• J, the incentive based on delay of reinforcement.
• sHr, habit strength, based on N, G (or K), J, and time between conditioned and unconditioned stimulus.
• Ir, reactive inhibition (e.g. exhaustion because the rat had been active for some time).
• sIr, conditioned inhibition (due to other training).
• sLr, the reaction threshold (minimum reinforcement required for any learning).
• sOr, momentary behavioral oscillation – i.e. random variables not otherwise accounted for.
• And the main intervening variable, sEr, excitatory potential, which is the result of all the above...

sEr = V x D x K x J x sHr - sIr - Ir - sOr - sLr.

The dependent variables:

• Latency (speed of the response).
• Amplitude (the strength of the response).
• Resistance to extinction.
• Frequency (the probability of the response).


All of which are measures of R, the response, which is a function of sEr.

Whew!

The essence of the theory can be summarized by saying that the response is a function of the strength of the habit times the strength of the drive. It is for this reason that Hull’s theory is often referred to as drive theory.
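
Plugging made-up numbers into the formula shows why that summary works: the core of sEr is multiplicative, so if drive or habit strength is zero, no amount of the other produces a response. A sketch in Python with illustrative values (Hull's own quantities were operationally defined from measured rat behavior):

    def excitatory_potential(V, D, K, J, sHr, sIr, Ir, sOr, sLr):
        """Hull's sEr = V x D x K x J x sHr - sIr - Ir - sOr - sLr."""
        return V * D * K * J * sHr - (sIr + Ir + sOr + sLr)

    # A moderately hungry, well-trained rat (all values invented):
    print(excitatory_potential(V=1.0, D=0.8, K=0.9, J=1.0, sHr=0.7,
                               sIr=0.05, Ir=0.1, sOr=0.02, sLr=0.1))  # about 0.23
    # The same rat with zero drive: the multiplicative core vanishes,
    # inhibitions and threshold dominate, and no response occurs:
    print(excitatory_potential(V=1.0, D=0.0, K=0.9, J=1.0, sHr=0.7,
                               sIr=0.05, Ir=0.1, sOr=0.02, sLr=0.1))  # about -0.27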

Hull was the most influential behaviorist of the 1940’s and 50’s. His student, Kenneth W. Spence, maintained that popularity through much of the 1960’s. But the theory, acceptable in its abbreviated form, was too unwieldy in the opinion of other behaviorists, and could not easily generalize from carefully controlled rat experiments to the complexities of human life. It is now a matter of historical interest only.

E. C. Tolman

A very different theory would also have some popularity before behaviorism left the experimental scene to the cognitivists: The cognitive behaviorism of Edward Chace Tolman. E. C. was born April 14, 1886 in Newton, Mass. His father was a businessman, his mother a housewife and fervent Quaker. He and his older brother attended MIT. His brother went on to become a famous physicist.

E. C. was strongly influenced by reading William James, so in 1911 he went to graduate school at Harvard. While there, he spent a summer in Germany studying with Kurt Koffka, the Gestalt psychologist. He received his PhD in 1915.

He went off to teach at Northwestern University. But he was a shy teacher, and an avowed pacifist during World War I, and the University dismissed him in 1918. He went to teach at the University of California at Berkeley. He also served in the OSS (Office of Strategic Services) for two years during World War II.

The University of California required loyalty oaths of the professors there (inspired by Joseph McCarthy and the "red scare"). Tolman led protests and was summarily suspended. The courts found in his favor and he was reinstated. In 1959 he retired, and received an honorary doctorate from the same University of California at Berkeley! Unfortunately, he died the same year, on November 19.

Although he appreciated the behaviorist agenda for making psychology into a true objective science, he felt Watson and others had gone too far.

1. Watson’s behaviorism was the study of "twitches" – stimulus-response is too molecular a level. We should study whole, meaningful behaviors: the molar level.

2. Watson saw only simple cause and effect in his animals. Tolman saw purposeful, goal-directed behavior.

3. Watson saw his animals as "dumb" mechanisms. Tolman saw them as forming and testing hypotheses based on prior experience.

4. Watson had no use for internal, "mentalistic" processes. Tolman demonstrated that his rats were capable of a variety of cognitive processes.

An animal, in the process of exploring its environment, develops a cognitive map of that environment. The process is called latent learning: learning in the absence of rewards or punishments. The animal develops expectancies (hypotheses) which are confirmed or not by further experience. Rewards (and punishments) come into play only as motivators for the performance of a learned behavior, not as the causes of learning itself.

He himself acknowledged that his behaviorism was more like Gestalt psychology than like Watson’s brand of behaviorism. From our perspective today, he can be considered one of the precursors of the cognitive movement.

B. F. Skinner

Burrhus Frederic Skinner was born March 20, 1904, in the small Pennsylvania town of Susquehanna. His father was a lawyer, and his mother a strong and intelligent housewife. His upbringing was old-fashioned and hard-working.

Burrhus was an active, out-going boy who loved the outdoors and building things, and actually enjoyed school. His life was not without its tragedies, however. In particular, his brother died at the age of 16 of a cerebral aneurysm.

Burrhus received his BA in English from Hamilton College in upstate New York. He didn’t fit in very well, not enjoying the fraternity parties or the football games. He wrote for the school paper, including articles critical of the school, the faculty, and even Phi Beta Kappa! To top it off, he was an atheist – in a school that required daily chapel attendance.

He wanted to be a writer and did try, sending off poetry and short stories. When he graduated, he built a study in his parents’ attic to concentrate, but it just wasn’t working for him.

Ultimately, he resigned himself to writing newspaper articles on labor problems, and lived for a while in Greenwich Village in New York City as a "bohemian." After some traveling, he decided to go back to school, this time at Harvard. He got his master’s in psychology in 1930 and his doctorate in 1931, and stayed there to do research until 1936.

Also in that year, he moved to Minneapolis to teach at the University of Minnesota. There he met and soon married Yvonne Blue. They had two daughters, the second of whom became famous as the first infant to be raised in one of Skinner’s inventions, the air crib. Although it was nothing more than a combination crib and playpen with glass sides and air conditioning, it looked too much like keeping a baby in an aquarium to catch on.

In 1945, he became the chairman of the psychology department at Indiana University. In 1948, he was invited to come to Harvard, where he remained for the rest of his life. He was a very active man, doing research and guiding hundreds of doctoral candidates as well as writing many books. While not successful as a writer of fiction and poetry, he became one of our best psychology writers, his books including Walden Two, a fictional account of a community run by his behaviorist principles.

On August 18, 1990, B. F. Skinner died of leukemia, after becoming perhaps the most celebrated psychologist since Sigmund Freud.

Theory

B. F. Skinner’s entire system is based on operant conditioning. The organism is in the process of "operating" on the environment, which in ordinary terms means it is bouncing around its world, doing what it does. During this "operating," the organism encounters a special kind of stimulus, called a reinforcing stimulus, or simply a reinforcer. This special stimulus has the effect of increasing the operant – that is, the behavior occurring just before the reinforcer. This is operant conditioning: "the behavior is followed by a consequence, and the nature of the consequence modifies the organism’s tendency to repeat the behavior in the future."

Imagine a rat in a cage. This is a special cage (called, in fact, a "Skinner box") that has a bar or pedal on one wall that, when pressed, causes a little mechanism to release a food pellet into the cage. The rat is bouncing around the cage, doing whatever it is rats do, when he accidentally presses the bar and – hey, presto! – a food pellet falls into the cage! The operant is the behavior just prior to the reinforcer, which is the food pellet, of course. In no time at all, the rat is furiously pedaling away at the bar, hoarding his pile of pellets in the corner of the cage.

A behavior followed by a reinforcing stimulus results in an increased probability of that behavior occurring in the future.

What if you don’t give the rat any more pellets? Apparently, he’s no fool, and after a few futile attempts, he stops his bar-pressing behavior. This is called extinction of the operant behavior.

A behavior no longer followed by the reinforcing stimulus results in a decreased probability of that behavior occurring in the future.

Now, if you were to turn the pellet machine back on, so that pressing the bar again provides the rat with pellets, the behavior of bar-pushing will "pop" right back into existence, much more quickly than it took for the rat to learn the behavior the first time. This is because the return of the reinforcer takes place in the context of a reinforcement history that goes all the way back to the very first time the rat was reinforced for pushing on the bar!
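These dynamics – acquisition, extinction, reacquisition – can be caricatured in a few lines of Python. This is only a toy model of my own (Skinner worked from cumulative records of actual presses, not from equations), and the learning and decay rates are invented:

import random

def run_trials(p, n_trials, reinforce, lr=0.2, decay=0.1):
    """Return the bar-pressing probability after n_trials."""
    for _ in range(n_trials):
        if random.random() < p:        # the rat happens to press the bar...
            if reinforce:
                p += lr * (1.0 - p)    # pellet drops: the operant is strengthened
            else:
                p -= decay * p         # no pellet: extinction sets in
    return p

p = 0.05                               # an occasional accidental press
p = run_trials(p, 100, reinforce=True)
print(f"after training:   {p:.2f}")    # climbs toward 1
p = run_trials(p, 100, reinforce=False)
print(f"after extinction: {p:.2f}")    # falls back down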

Schedules of reinforcement

Skinner likes to tell about how he "accidentally – i.e. operantly –" came across his various discoveries. For example, he talks about running low on food pellets in the middle of a study. Now, these were the days before "Purina rat chow" and the like, so Skinner had to make his own rat pellets, a slow and tedious task. So he decided to reduce the number of reinforcements he gave his rats for whatever behavior he was trying to condition, and, lo and behold, the rats kept up their operant behaviors, and at a stable rate, no less. This is how Skinner discovered schedules of reinforcement!

Continuous reinforcement is the original scenario: Every time that the rat does the behavior (such as pedal-pushing), he gets a rat goodie.

The fixed ratio schedule was the first one Skinner discovered: If the rat presses the pedal three times, say, he gets a goodie. Or five times. Or twenty times. Or "x" times. There is a fixed ratio between behaviors and reinforcers: 3 to 1, 5 to 1, 20 to 1, etc. This is a little like "piece rate" in the clothing manufacturing industry: You get paid so much for so many shirts.

The fixed interval schedule uses a timing device of some sort. If the rat presses the bar at least once during a particular stretch of time (say 20 seconds), then he gets a goodie. If he fails to do so, he doesn’t get a goodie. But even if he hits that bar a hundred times during that 20 seconds, he still only gets one goodie! One strange thing that happens is that the rats tend to "pace" themselves: They slow down the rate of their behavior right after the reinforcer, and speed up when the time for it gets close.

Skinner also looked at variable schedules. Variable ratio means you change the "x" each time – first it takes 3 presses to get a goodie, then 10, then 1, then 7 and so on. Variable interval means you keep changing the time period – first 20 seconds, then 5, then 35, then 10 and so on.

In both cases, it keeps the rats on their rat toes. With the variable interval schedule, they no longer "pace" themselves, because they can no longer establish a "rhythm" between behavior and reward. Most importantly, these schedules are very resistant to extinction. It makes sense, if you think about it. If you haven’t gotten a reinforcer for a while, well, it could just be that you are at a particularly "bad" ratio or interval! Just one more bar press, maybe this’ll be the one!

This, according to Skinner, is the mechanism of gambling. You may not win very often, but you never know whether and when you’ll win again. It could be the very next time, and if you don’t roll them dice, or play that hand, or bet on that number this once, you’ll miss out on the score of the century!
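All four schedules boil down to one question: given this bar press, at this moment, does a pellet drop? Here is an illustrative sketch in Python; the class names and parameter values are mine, not Skinner's:

import random

class FixedRatio:
    """Reinforce every n-th press."""
    def __init__(self, n):
        self.n = n
        self.count = 0
    def press(self, t):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True          # pellet!
        return False

class VariableRatio(FixedRatio):
    """Like FixedRatio, but the required "x" changes after each pellet."""
    def press(self, t):
        if super().press(t):
            self.n = random.randint(1, 10)
            return True
        return False

class FixedInterval:
    """Reinforce only the first press after `secs` have elapsed."""
    def __init__(self, secs):
        self.secs = secs
        self.last = 0.0
    def press(self, t):
        if t - self.last >= self.secs:
            self.last = t
            return True          # one pellet per interval, however many presses
        return False

class VariableInterval(FixedInterval):
    """Like FixedInterval, but the interval changes after each pellet."""
    def press(self, t):
        if super().press(t):
            self.secs = random.uniform(5, 35)
            return True
        return False

schedule = VariableRatio(5)
pellets = sum(schedule.press(t) for t in range(1000))
print(pellets)                   # roughly 1000 divided by the average ratio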

Shaping

A question Skinner had to deal with was how we get to more complex sorts of behaviors. He responded with the idea of shaping, or "the method of successive approximations." Basically, it involves first reinforcing a behavior only vaguely similar to the one desired. Once that is established, you look out for variations that come a little closer to what you want, and so on, until you have the animal performing a behavior that would never show up in ordinary life. Skinner and his students have been quite successful in teaching simple animals to do some extraordinary things. My favorite is teaching pigeons to bowl!
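In code, shaping is just a reinforcement criterion that keeps moving. The sketch below is my own illustration, with invented numbers: anything close enough to the current target gets "reinforced," and the target then creeps toward the goal.

import random

def shape(goal, start=0.0, tolerance=1.0, step=0.5, trials=1000):
    """Reinforce successive approximations until behavior nears `goal`."""
    target = start          # the current criterion for reinforcement
    behavior = start        # the animal's typical response
    for _ in range(trials):
        response = behavior + random.gauss(0.0, 1.0)   # natural variation
        if abs(response - target) <= tolerance:        # close enough? reinforce
            behavior = response                        # the reinforced variant sticks
            target = min(goal, target + step)          # then raise the criterion
    return behavior

print(round(shape(goal=10.0), 1))   # drifts from 0 to somewhere near 10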

I used shaping on one of my daughters once. She was about three or four years old, and was afraid to go down a particular slide. So I picked her up, put her at the end of the slide, asked if she was okay and if she could jump down. She did, of course, and I showered her with praise. I then picked her up and put her a foot or so up the slide, asked her if she was okay, and asked her to slide down and jump off. So far so good. I repeated this again and again, each time moving her a little up the slide, and backing off if she got nervous. Eventually, I could put her at the top of the slide and she could slide all the way down and jump off. Unfortunately, she still couldn’t climb up the ladder, so I was a very busy father for a while.

Beyond these fairly simple examples, shaping also accounts for the most complex of behaviors. You don’t, for example, become a brain surgeon by stumbling into an operating theater, cutting open someone's head, successfully removing a tumor, and being rewarded with prestige and a hefty paycheck, along the lines of the rat in the Skinner box. Instead, you are gently shaped by your environment to enjoy certain things, do well in school, take a certain bio class, see a doctor movie perhaps, have a good hospital visit, enter med school, be encouraged to drift towards brain surgery as a specialty, and so on. This could be something your parents were carefully doing to you, a la a rat in a cage. But much more likely, this is something that was more or less unintentional.

Aversive stimuli

An aversive stimulus is the opposite of a reinforcing stimulus, something we might find unpleasant or painful.

A behavior followed by an aversive stimulus results in a decreased probability of the behavior occurring in the future.

This both defines an aversive stimulus and describes the form of conditioning known as punishment. If you shock a rat for doing x, it’ll do a lot less of x. If you spank Johnny for throwing his toys he will throw his toys less and less (maybe).

On the other hand, if you remove an already active aversive stimulus after a rat or Johnny performs a certain behavior, you are doing negative reinforcement. If you turn off the electricity when the rat stands on his hind legs, he’ll do a lot more standing. If you stop your perpetual nagging when I finally take out the garbage, I’ll be more likely to take out the garbage (perhaps). You could say it "feels so good" when the aversive stimulus stops, that this serves as a reinforcer!

Behavior followed by the removal of an aversive stimulus results in an increased probability of that behavior occurring in the future.
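Lined up side by side, the three contingencies described above look like this – an illustrative tabulation in Python, using only the terms already introduced:

contingencies = {
    "reinforcement":          ("reinforcing stimulus presented", "behavior goes UP"),
    "punishment":             ("aversive stimulus presented",    "behavior goes DOWN"),
    "negative reinforcement": ("aversive stimulus removed",      "behavior goes UP"),
}

for name, (consequence, effect) in contingencies.items():
    print(f"{name:24}  {consequence:32} -> {effect}")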

Notice how difficult it can be to distinguish some forms of negative reinforcement from positive reinforcement: If I starve you, is the food I give you when you do what I want a positive – i.e. a reinforcer? Or is it the removal of a negative – i.e. the aversive stimulus of hunger?

Skinner (contrary to some stereotypes that have arisen about behaviorists) doesn’t "approve" of the use of aversive stimuli – not because of ethics, but because they don’t work well! Notice that I said earlier that Johnny will maybe stop throwing his toys, and that I perhaps will take out the garbage? That’s because whatever was reinforcing the bad behaviors hasn’t been removed, as it would’ve been in the case of extinction. This hidden reinforcer has just been "covered up" with a conflicting aversive stimulus. So, sure, sometimes the child (or I) will behave – but it still feels good to throw those toys. All Johnny needs to do is wait till you’re out of the room, or find a way to blame it on his brother, or in some way escape the consequences, and he’s back to his old ways. In fact, because Johnny now only gets to enjoy his reinforcer occasionally, he has gone onto a variable schedule of reinforcement, and he’ll be even more resistant to extinction than ever!

Behavior modification

Behavior modification – often referred to as b-mod – is the therapy technique based on Skinner’s work. It is very straightforward: Extinguish an undesirable behavior (by removing the reinforcer) and replace it with a desirable behavior by reinforcement. It has been used on all sorts of psychological problems – addictions, neuroses, shyness, autism, even schizophrenia – and works particularly well with children. There are examples of back-ward psychotics who hadn’t communicated with others for years being conditioned to behave in fairly normal ways, such as eating with a knife and fork, taking care of their own hygiene needs, dressing themselves, and so on.

There is an offshoot of b-mod called the token economy. This is used primarily in institutions such as psychiatric hospitals, juvenile halls, and prisons. Certain rules are made explicit in the institution, and behaving yourself appropriately is rewarded with tokens – poker chips, tickets, funny money, recorded notes, etc. Certain poor behavior is also often followed by a withdrawal of these tokens. The tokens can be traded in for desirable things such as candy, cigarettes, games, movies, time out of the institution, and so on. This has been found to be very effective in maintaining order in these often difficult institutions.
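A token economy is simple enough to sketch as a ledger. The behaviors, fines, and prices below are invented for illustration:

class TokenEconomy:
    """A bare-bones ledger: tokens in for good behavior, out for infractions,
    traded for backup reinforcers (candy, movies, time out of the institution)."""
    def __init__(self):
        self.balance = 0
    def award(self, tokens):             # e.g. ate with a knife and fork
        self.balance += tokens
    def fine(self, tokens):              # e.g. acting out
        self.balance = max(0, self.balance - tokens)
    def redeem(self, price):
        """Trade tokens for a backup reinforcer, if the balance allows."""
        if self.balance >= price:
            self.balance -= price
            return True
        return False

ward = TokenEconomy()
ward.award(3)                 # dressed himself
ward.award(2)                 # took care of his own hygiene
ward.fine(1)                  # shouting at breakfast
print(ward.redeem(price=4))   # True: enough tokens left for a movie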

There is a drawback to the token economy: When an "inmate" of one of these institutions leaves, they return to an environment that reinforces the kinds of behaviors that got them into the institution in the first place. The psychotic’s family may be thoroughly dysfunctional. The juvenile offender may go right back to "the ‘hood." No one is giving them tokens for eating politely. The only reinforcements may be attention for "acting out," or some gang glory for robbing a Seven-Eleven. In other words, the environment doesn’t travel well!

Walden Two

Skinner started his career as an English major, writing poems and short stories. He has, of course, written a large number of papers and books on behaviorism. But he will probably be most remembered by the general run of readers for his book Walden Two, wherein he describes a utopia-like commune run on his operant principles.

People, especially the religious right, came down hard on his book. They said that his ideas take away our freedom and dignity as human beings. He responded to the sea of criticism with another book (one of his best) called Beyond Freedom and Dignity. He asked: What do we mean when we say we want to be free? Usually we mean we don’t want to be in a society that punishes us for doing what we want to do. Okay – aversive stimuli don’t work well anyway, so out with them! Instead, we’ll only use reinforcers to "control" society. And if we pick the right reinforcers, we will feel free, because we will be doing what we feel we want!

Likewise for dignity. When we say "she died with dignity," what do we mean? We mean she kept up her "good" behaviors without any apparent ulterior motives. In fact, she kept her dignity because her reinforcement history has led her to see behaving in that "dignified" manner as more reinforcing than making a scene.

The bad do bad because the bad is rewarded. The good do good because the good is rewarded. There is no true freedom or dignity. Right now, our reinforcers for good and bad behavior are chaotic and out of our control – it’s a matter of having good or bad luck with your "choice" of parents, teachers, peers, and other influences. Let’s instead take control, as a society, and design our culture in such a way that good gets rewarded and bad gets extinguished! With the right behavioral technology, we can design culture.

Both freedom and dignity are examples of what Skinner calls mentalistic constructs – unobservable and so useless for a scientific psychology. Other examples include defense mechanisms, the unconscious, archetypes, fictional finalisms, coping strategies, self-actualization, consciousness, even things like hunger and thirst. The most important example is what he refers to as the homunculus – Latin for "the little man" – that supposedly resides inside us and is used to explain our behavior, ideas like soul, mind, ego, will, self, and, of course, personality.

Instead, Skinner recommends that psychologists concentrate on observables, that is, the environment and our behavior in it.

Skinner was to enjoy considerable popularity during the 1960's and even into the 70's. But both the humanistic movement in the clinical world, and the cognitive movement in the experimental world, were quickly moving in on his beloved behaviorism. Before his death, he publicly lamented the fact that the world had failed to learn from him.


B. F. Skinner Selection: Walden Two

Chapter 13

The quarters for children from one to three consisted of several small playrooms with Lilliputian furniture, a child's lavatory, and a dressing and locker room. Several small sleeping rooms were operated on the same principle as the baby cubicles. The temperature and the humidity were controlled so that clothes or bedclothing were not needed. The cots were double-decker arrangements of the plastic mattresses we had seen in the cubicles. The children slept unclothed, except for diapers. There were more beds than necessary, so that the children could be grouped according to developmental age or exposure to contagious diseases or need for supervision, or for educational purposes.

We followed Mrs. Nash to a large screened porch on the south side of the building, where several children were playing in sandboxes and on swings and climbing apparatuses. A few wore "training pants"; the rest were naked. Beyond the porch was a grassy play yard enclosed by closely trimmed hedges, where other children, similarly undressed, were at play. Some kind of marching game was in progress.

As we returned, we met two women carrying food hampers. They spoke to Mrs. Nash and followed her to the porch. In a moment five or six children came running into the playrooms and were soon using the lavatory and dressing themselves. Mrs. Nash explained that they were being taken on a picnic.

"What about the children who don't go?" said Castle. "What do you do about the green-eyed monster?"

Mrs. Nash was puzzled.

"Jealousy. Envy," Castle elaborated. "Don't the children who stay home ever feel unhappy about it?"

"I don't understand," said Mrs. Nash.

"And I hope you won't try," said Frazier with a smile. "I'm afraid we must be moving along."

We said good-bye, and I made an effort to thank Mrs. Nash, but she seemed to be puzzled by that too, and Frazier frowned as if I had committed some breach of good taste.

"I think Mrs. Nash's puzzlement?" said Frazier, as we left the building, "is proof enough that our children are seldom envious or jealous. Mrs. Nash was twelve years old when Walden Two was founded. It was a little late to undo her early training, but I think we were successful. She's a good example of the Walden Two product. She could probably recall the experience of jealousy, but it's not part of her present life."

"Surely that's going too far!" said Castle. "You can't be so godlike as all that! You must be assailed by emotions just as much as the rest of us!"

"We can discuss the question of godlikeness later, if you wish," replied Frazier. "As to emotions—we aren't free of them all, nor should we like to be. But the meaner and more annoying—the emotions which breed unhappiness–are almost unknown here, like unhappiness itself. We don't need them any longer in our struggle for existence, and it's easier on our circulatory system, and certainly pleasantry, to dispense with them."

"If you've discovered how to do that. you are indeed a genius," said Castle. He seemed almost stunned as Frazier nodded assent. "We all know that emotions are useless and bad for our peace of mind and our blood pressure ' he went on. "But how arrange things otherwise?"

"We arrange them otherwise here," said Frazier. He was showing a mildness of manner which I was coming to recognize as a sign of confidence.

"But emotions are—fun!" said Barbara. "Life wouldn't be worth living without them."

"Some of them, yes" said Frasier. "The productive and strengthening emotions—joy and love. But sorrow and hate—and the high-voltage excitements of anger, fear, and rage are out of proportion with the needs of

46 | 115© Copyright 2006 C. George Boeree

Page 279: Boeree, C. George - The History of Psychology (I-IV)

C. George Boeree: History of Psychology Part Four: The 1900's

modern life, and they're wasteful and dangerous. Mr. Castle has mentioned jealousy, a minor form of anger, I think we may call it. Naturally we avoid it. It has served its purpose in the evolution of man; we've no further use for it. If we allowed it to persist, it would only sap the life out of us. In a cooperative society there's no jealousy because there's no need for jealousy."

"That implies that you all get everything you want," said Castle. "But what about social possessions? Last night you mentioned the young man who chose a particular girl or profession. There's still a chance for jealousy there, isn't there?"

"It doesn't imply that we get everything we want," said Frazier. "Of course we don't. But jealousy wouldn't help. In a competitive world there's some point to it. It energizes one to attack a frustrating condition. The impulse and the added energy are an advantage. Indeed, in a competitive world emotions work all too well. Look at the singular lack of success of the complacent man. He enjoys a more serene life, but it's less likely to be a fruitful one. The world isn't ready for simple pacifism or Christian humility, to cite two cases in point. Before you can safely turn out the destructive and wasteful emotions, you must make sure they're no longer needed."

"How do you make sure that jealousy isn't needed in Walden Two?" I said.

"In Walden Two problems can't be solved by attacking others" said Frazier with marked finality.

"That's not the same as eliminating jealousy, though" I said.

"Of course it's not. But when a particular emotion is no longer a useful part of a behavioral repertoire, we proceed to eliminate it."

"Yes, but how?"

"It's simply a matter of behavioral engineering," said Frazier.

"Behavioral engineering?"

"You're baiting me, Burris. You know perfectly well what I mean. The techniques have been available for centuries. We use them in education and in the psychological management of the community. But you're forcing my hand" he added. "I was saving that for this evening. But let's strike while the iron is hot."

We had stopped at the door of the large children's building. Frazier shrugged his shoulders, walked to the shade of a large tree, and threw himself on the ground. We arranged ourselves about him and waited.

Chapter 14

Each of us," Frazier began, "is engaged in a pitched battle with the rest of mankind."

"A curious premise for a Utopia," said Castle. "Even a pessimist like myself takes a more hopeful view than that."

"You do, you do," said Frazier. "But lets be realistic. Each of us has interests which conflict with the interests of everybody else. That's our original sin, and it can't be helped. Now, 'everybody else' we call 'society.' It's a powerful opponent, and it always wins. Oh, here and there an individual prevails for a while and gets what he wants. Sometimes he storms the culture of a society and changes it slightly to his own advantage. But society wins in the long run, for it has the advantage of numbers and of age. Many prevail against one, and men against a baby. Society attacks early, when the individual is helpless. It enslaves him almost before he has tasted freedom. The 'ologies' will tell you how its done. Theology calls it building a conscience or developing a spirit of selfless. Psychology calls it the growth of the super ego.

"Considering how long society has been at it, you'd expect a better job. But the campaigns have been badly planned and the victory has never been secure. The behavior of the individual has been shaped according to revelations of 'good conduct,' never as the result of experimental study. But why not experiment? The questions are simple enough. What's the best behavior for the individual so far as the group is concerned?

47 | 115© Copyright 2006 C. George Boeree

Page 280: Boeree, C. George - The History of Psychology (I-IV)

C. George Boeree: History of Psychology Part Four: The 1900's

And how can the individual be induced to behave in that way? Why not explore these questions in a scientific spirit?

"We could do just that in Walden Two. We had already worked out a code of conduct—subject, of course, to experimental modification. The code would keep things running smoothly if everybody lived up to it. Our job was to see that everybody did. Now, you can't get people to follow a useful code by making them into so many jack-in-the-boxes. You can't foresee all future circumstances, and you can't specify adequate future conduct. You don't know what will be required. Instead you have to set up certain behavioral processes which lead the individual to design his own 'good' conduct when the time comes. We call that sort of thing 'self-control.' But don't be misled, the control always rests in the last analysis in the hands of society.

"One of our Planners, a young man named Simmons, worked with me. It was the first time in history that the matter was approached in an experimental way. Do you question that statement, Mr. Castle?"

"I'm not sure I know what you are talking about," said Castle.

"Then let me go on. Simmons and I began by studying the great works on morals and ethics—Plato, Aristotle, Confucius, the New Testament, the Puritan divines, Machiavelli, Chesterfield, Freud—there were scores of them. We were looking for any and every method of shaping human behavior by imparting techniques of self-control. Some techniques were obvious enough, for they had marked turning points in human history. 'Love your enemies' is an example–a psychological invention for easing the lot of an oppressed people. The severest trial of oppression is the constant rage which one suffers at the thought of the oppressor. What Jesus discovered was how to avoid these inner devastations. His technique was to practice the opposite emotion. If a man can succeed in loving his enemies and 'taking no thought for the morrow,' he will no longer be assailed by hatred of the oppressor or rage at the loss of his freedom or possessions. He may not get his freedom or possessions back, but he's less miserable. It's a difficult lesson. It comes late in our program."

"I thought you were opposed to modifying emotions and instinct until the world was ready for it," said Castle. "According to you, the principle of love your enemies' should have been suicidal."

"It would have been suicidal, except for an entirely unforeseen consequence. Jesus must have been quite astonished at the effect of his discovery. We are only just beginning to understand the power of love because we are just beginning to understand the weakness of force and aggression. But the science of behavior is clear about all that now. Recent discoveries in the analysis of punishment—but I am falling into one digression after another. Let me save my explanation of why the Christian virtues—and I mean merely the Christian techniques of self-control—have not disappeared from the face of the earth, with due recognition of the fact that they suffered a narrow squeak within recent memory.

"When Simmons and I had collected our techniques of control, we had to discover how to teach them. That was more difficult. Current educational practices were of little value, and religious practices scarcely any better. Promising paradise or threatening hell-fire is, we assumed, generally admitted to be unproductive. It is based upon a fundamental fraud which, when discovered, turns the individual against society and nourishes the very thing it tries to stamp out. What Jesus offered in return for loving one's enemies was heaven on earth, better known as peace of mind.

"We found a few suggestions worth following in the practices of the clinical psychologist. We undertook to build a tolerance for annoying experiences. The sun shine of midday is extremely painful if you come from a dark room, but take it in easy stages and you can avoid pain altogether. The analogy can be misleading, but in much the same way it's possible to build a tolerance to painful or distasteful stimuli, or to frustration, or to situations which arouse fear, anger or rage. Society and nature throw these annoyances at the individual with no regard for the development of tolerances. Some achieve tolerances, most fail. Where would the science of immunization be if it followed a schedule of accidental dosages?

"Take the principle of 'Get thee behind me, Satan,' for example," Frazier continued. "It's a special case of self-control by altering the environment. Subclass A 3, I believe. We give each child a lollipop which has been dipped in powdered sugar so that a single touch of the tongue can be detected. We tell him he may eat

48 | 115© Copyright 2006 C. George Boeree

Page 281: Boeree, C. George - The History of Psychology (I-IV)

C. George Boeree: History of Psychology Part Four: The 1900's

the lollipop later in the day, provided it hasn't already been licked. Since the child is only three or four, it is a fairly diff–– "

"Three or four!" Castle exclaimed.

"All our ethical training is completed by the age of six," said Frazier quietly. "A simple principle like putting temptation out of sight would be acquired before four. But at such an early age the problem of not licking the lollipop isn't easy. Now, what would you do, Mr. Castle, in a similar situation?"

"Put the lollipop out of sight as quickly as possible."

"Exactly. I can see you've been well trained. Or perhaps you discovered the principle for yourself. We're in favor of original inquiry wherever possible, but in this case we have a more important goal and we don't hesitate to give verbal help. First of all, the children are urged to examine their own behavior while looking at the lollipops. This helps them to recognize the need for self-control. Then the lollipops are concealed, and the children are asked to notice any gain in happiness or any reduction in tension. Then a strong distraction is arranged—say, an interesting game. Later the children are reminded of the candy and encouraged to examine their reaction. The value of the distraction is generally obvious. Well, need I go on? When the experiment is repeated a day or so later, the children all run with the lollipops to their lockers and do exactly what Mr. Castle would do—a sufficient indication of the success of our training."

"I wish to report an objective observation of my reaction to your story," said Castle, controlling his voice with great precision. "I find myself revolted by this display of sadistic tyranny."

"I don't wish to deny you the exercise of an emotion which you seem to find enjoyable," said Frazier. "So let me go on. Concealing a tempting but forbidden object is a crude solution. For one thing, it's not always feasible. We want a sort of psychological concealment—covering up the candy by paying no attention. In a later experiment the children wear their lollipops like crucifixes for a few hours."

" 'Instead of the cross, the lollipop, About my neck was hung,' " said Castle.

"I wish somebody had taught me that, though," said Rodge, with a glance at Barbara.

"Don't we all?" said Frazier. "Some of us learn control, more or less by accident. The rest of us go all our lives not even understanding how it is possible, and blaming our failure on being born the wrong way."

"How do you build up a tolerance to an annoying situation?" I said.

"Oh, for example, by having the children 'take' a more and more painful shock, or drink cocoa with less and less sugar in it until a bitter concoction can be savored without a bitter face."

"But jealousy or envy—you can't administer them in graded doses," I said.

"And why not? Remember, we control the social environment, too, at this age. That's why we get our ethical training in early. Take this case. A group of children arrive home after a long walk tired and hungry. They're expecting supper; they find, instead, that it's time for a lesson in self-control: they must stand for five minutes in front of steaming bowls of soup.

"The assignment is accepted like a problem in arithmetic. Any groaning or complaining is a wrong answer. Instead, the children begin at once to work upon themselves to avoid any unhappiness during the delay. One of them may make a joke of it. We encourage a sense of humor as a good way of not taking an annoyance seriously. The joke won't be much, according to adult standards—perhaps the child will simply pretend to empty the bowl of soup into his upturned mouth. Another may start a song with many verses. The rest join in at once, for they've learned that it's a good way to make time pass."

Frazier glanced uneasily at Castle, who was not to be appeased.

"That also strikes you as a form of torture, Mr. Castle?" he asked.


"I'd rather be put on the rack," said Castle.

"Then you have by no means had the thorough training I supposed. You can’t imagine how lightly the children take such an experience. It's a rather severe biological frustration, for the children are tired and hungry and they must stand and look at food; but it's passed off as lightly as a five-minute delay at curtain time. We regard it as a fairly elementary test. Much more difficult problems follow."

"I suspected as much," muttered Castle.

"In a later stage we forbid all social devices. No songs, no jokes—merely silence. Each child is forced back upon his own resources—a very important step."

"I should think so," I said. "And how do you know it's successful? You might produce a lot of silently resentful chidren. It's certainly a dangerous stage."

"It is, and we follow each child carefully. If he hasn't picked up the necessary techniques, we start back a little. A still more advanced stage"—Frazier glanced again at Castle, who stirred uneasily—"brings me to my point. When it's time to sit down to the soup, the children count off—heads and tails. Then a coin is tossed and if it comes up heads, the 'heads' sit down and eat. The 'tails' remain standing for another five minutes."

Castle groaned.

"And you call that envy?" I asked.

"Perhaps not exactly," said Frazier. "At least there's seldom any aggression against the lucky ones. The emotion, if any, is directed against Lady Luck herself, against the toss of the coin. That, in itself, is a lesson worth learning, for it's the only direction in which emotion has a surviving chance to be useful. And resentment toward things in general, while perhaps just as silly as personal aggression, is more easily controlled. Its expression is not socially objectionable."

Frazier looked nervously from one of us to the other. He seemed to be trying to discover whether we shared Castle's prejudice. I began to realize, also, that he had not really wanted to tell this story. He was vulnerable. He was treading on sanctified ground, and I was pretty sure he had not established the value of most of these practices in an experimental fashion. He could scarcely have done so in the short space of ten years. He was working on faith, and it bothered him.

I tried to bolster his confidence by reminding him that he had a professional colleague among his listeners. "May you not inadvertently teach your children some of the very emotions you're trying to eliminate?" I said. "What's the effect, for example, of finding the anticipation of a warm supper suddenly thwarted? Doesn't that eventually lead to feelings of uncertainty, or even anxiety?"

"It might. We had to discover how often our lessons could be safely administered. But all our schedules are worked out experimentally. We watch for undesired consequences just as any scientist watches for disrupting factors in his experiments.

"After all, it's a simple and sensible program," he went on in a tone of appeasement. "We set up a system of gradually increasing annoyances and frustrations against a background of complete serenity. An easy environment is made more and more difficult as the children acquire the capacity to adjust."

"But why?" said Castle. "Why these deliberate unpleasantnesses—to put it mildly? I must say I think you and your friend Simmons are really very subtle sadists."

"You've reversed your position, Mr. Castle," said Frazier in a sudden flash of anger with which I rather sympathized. Castle was calling names, and he was also being unaccountably and perhaps intentionally obtuse. "A while ago you accused me of breeding a race of softies," Frazier continued. "Now you object to toughening them up. But what you don't understand is that these potentially unhappy situations are never very annoying. Our schedules make sure of that. You wouldn't understand, however, because you're not so far advanced as our children."

Castle grew black.


"But what do your children get out of it?" he insisted, apparently trying to press some vague advantage in Frazier's anger.

"What do they get out of it!" exclaimed Frazier, his eyes flashing with a sort of helpless contempt. His lips curled and he dropped his head to look at his fingers, which were crushing a few blades of grass.

"They must get happiness and freedom and strength," I said, putting myself in a ridiculous position in attempting to make peace.

"They don't sound happy or free to me, standing in front of bowls of Forbidden Soup," said Castle, answering me parenthetically while continuing to stare at Frazier.

"If I must spell it out," Frazier began with a deep sigh, "what they get is escape from the petty emotions which eat the heart out of the unprepared. They get the satisfaction of pleasant and profitable social relations on a scale almost undreamed of in the world at large. They get immeasurably increased efficiency, because they can stick to a job without suffering the aches and pains which soon beset most of us. They get new horizons, for they are spared the emotions characteristic of frustration and failure. They get—"His eyes searched the branches of the trees. "Is that enough?," he said at last.

"And the community must gain their loyalty," I said, "when they discover the fears and jealousies and diffidences in the world at large."

"I'm glad you put it that way," said Frazier. "You might have said that they must feel superior to the miserable products of our public schools. But we're at pains to keep any feeling of superiority or contempt under control, too. Having suffered most acutely from it myself, I put the subject first on our agenda. We carefully avoid any joy in a personal triumph which means the personal failure of somebody else. We take no pleasure in the sophistical, the disputative, the dialectical." He threw a vicious glance at Castle. "We don't use the motive of domination, because we are always thinking of the whole group. We could motivate a few geniuses that way—it was certainly my own motivation—but we'd sacrifice some of the happiness of everyone else. Triumph over nature and over oneself, yes. But over others, never."

"You've taken the mainspring out of the watch," said Castle flatly.

"That's an experimental question, Mr. Castle, and you have the wrong answer."

Frazier was making no effort to conceal his feeling. If he had been riding Castle, he was now using his spurs. Perhaps he sensed that the rest of us had come round and that he could change his tactics with a single holdout. But it was more than strategy, it was genuine feeling. Castle's undeviating skepticism was a growing frustration.

"Are your techniques really so very new?" I said hurriedly. "What about the primitive practice of submitting a boy to various tortures before granting him a place among adults? What about the disciplinary techniques of Puritanism? Or of the modern school, for that matter?"

"In one sense you're right," said Frazier. "And I think you've nicely answered Mr. Castle's tender concern for our little ones. The unhappinesses we deliberately impose are far milder than the normal unhappinesses from which we offer protection. Even at the height of our ethical training, the unhappiness is ridiculously trivial—to the well-trained child.

"But there's a world of difference in the way we use these annoyances," he continued. "For one thing, we don't punish. We never administer an unpleasantness in the hope of repressing or eliminating undesirable behavior. But there's another difference. In most cultures the child meets up with annoyances and reverses of uncontrolled magnitude. Some are imposed in the name of discipline by persons in authority. Some, like hazings, are condoned though not authorized. Others are merely accidental. No one cares to, or is able to, prevent them.

"We all know what happens. A few hardy children emerge, particularly those who have got their unhappiness in doses that could be swallowed. They become brave men. Others become sadists or masochists of varying degrees of pathology. Not having conquered a painful environment, they become

51 | 115© Copyright 2006 C. George Boeree

Page 284: Boeree, C. George - The History of Psychology (I-IV)

C. George Boeree: History of Psychology Part Four: The 1900's

preoccupied with pain and make a devious art of it. Others submit—and hope to inherit the earth. The rest—the cravens, the cowards—live in fear for the rest of their lives. And that's only a single field—the reaction to pain. I could cite a dozen parallel cases. The optimist and the pessimist, the contented and the disgruntled, the loved and the unloved, the ambitious and the discouraged— these are only the extreme products of a miserable system.

"Traditional practices are admittedly better than nothing," Frazier went on. "Spartan or Puritan—no one can question the occasional happy result. But the whole system rests upon the wasteful principle of selection. The English public school of the nineteenth century produced brave men—by setting up almost insurmountable barriers and making the most of the few who came over. But selection isn't education. Its crops of brave men will always be small, and the waste enormous. Like all primitive principles, selection serves in place of education only through a profligate use of material. Multiply extravagantly and select with rigor. Its the philosophy of the 'big litter' as an alternative to good child hygiene.

"In Walden two we have a different objective. We make every man a brave man. They all come over the barriers. Some require more preparation than others, but they all come over. The traditional use of adversity is to select the strong. We control adversity to build strength. And we do it deliberately, no matter how sadistic Mr. Castle may think us, in order to prepare for adversities which are beyond control. Our children eventually experience the 'heartache and the thousand natural shocks that flesh is heir to.' It would be the cruelest possible practice to protect them as long as possible, especially when we could protect them so well."

Frazier held out his hands in an exaggerated gesture of appeal.

"What alternative had we?" he said, as if he were in pain. "What else could we do? For four or five years we could provide a life in which no important need would go unsatisfied, a life practically free of anxiety or frustration or annoyance. What would you do? Would you let the child enjoy this paradise with no thought for the future—like an idolatrous and pampering mother? Or would you relax control of the environment and let the child meet accidental frustrations? But what is the virtue of accident? No, there was only one course open to us. We had to design a series of adversities, so that the child would develop the greatest possible self-control. Call it deliberate, if you like, and accuse us of sadism; there was no other course." Frazier turned to Castle, but he was scarcely challenging him. He seemed to be waiting, anxiously, for his capitulation. But Castle merely shifted his ground.

"I find it difficult to classify these practices," he said. Frazier emitted a disgruntled "Ha!" and sat back. "Your system seems to have usurped the place as well as the techniques of religion."

"Of religion and family culture," said Frazier wearily. "But I don't call it usurpation. Ethical training belongs to the community. As for techniques, we took every suggestion we could find without prejudice as to the source. But not on faith. We disregarded all claims of revealed truth and put every principle to an experimental test. And by the way, I've very much misrepresented the whole system if you suppose that any of the practices I've described are fixed. We try out many different techniques. Gradually we work toward the best possible set. And we don't pay much attention to the apparent success of a principle in the course of history. History is honored in Walden Two only as entertainment. It isn't taken seriously as food for thought. Which reminds me, very rudely, of our original plan for the morning. Have you had enough of emotion? Shall we turn to intellect?"

Frazier addressed these questions to Castle in a very friendly way and I was glad to see that Castle responded in kind. It was perfectly clear, however, that neither of them had ever worn a lollipop about the neck or faced a bowl of Forbidden Soup.


Walt Whitman: There Was a Child Went Forth *

[Here's an example of a poet's understanding of how a person learns and grows. Truth be told, I think "Uncle Walt" is far more insightful than any behaviorist – perhaps more than any psychologist of any orientation!]

THERE was a child went forth every day,
And the first object he look'd upon, that object he became,
And that object became part of him for the day or a certain part of the day,
Or for many years or stretching cycles of years.

The early lilacs became part of this child,
And grass and white and red morning-glories, and white and red clover, and the song of the phoebe-bird,
And the Third-month lambs and the sow's pink-faint litter, and the mare's foal and the cow's calf,
And the noisy brood of the barnyard or by the mire of the pond-side,
And the fish suspending themselves so curiously below there, and the beautiful curious liquid,
And the water-plants with their graceful flat heads, all became part of him.

The field-sprouts of Fourth-month and Fifth-month became part of him,
Winter-grain sprouts and those of the light-yellow corn, and the esculent roots of the garden,
And the apple-trees cover'd with blossoms and the fruit afterward, and wood-berries, and the commonest weeds by the road,
And the old drunkard staggering home from the outhouse of the tavern whence he had lately risen,
And the schoolmistress that pass'd on her way to the school,
And the friendly boys that pass'd, and the quarrelsome boys,
And the tidy and fresh-cheek'd girls, and the barefoot negro boy and girl,
And all the changes of city and country wherever he went.

His own parents, he that had father'd him and she that had conceiv'd him in her womb and birth'd him,
They gave this child more of themselves than that,
They gave him afterward every day, they became part of him.

The mother at home quietly placing the dishes on the supper-table,
The mother with mild words, clean her cap and gown, a wholesome odor falling off her person and clothes as she walks by,
The father, strong, self-sufficient, manly, mean, anger'd, unjust,
The blow, the quick loud word, the tight bargain, the crafty lure,
The family usages, the language, the company, the furniture, the yearning and swelling heart,
Affection that will not be gainsay'd, the sense of what is real, the thought if after all it should prove unreal,
The doubts of day-time and the doubts of night-time, the curious whether and how,
Whether that which appears so is so, or is it all flashes and specks?
Men and women crowding fast in the streets, if they are not flashes and specks what are they?
The streets themselves and the facades of houses, and goods in the windows,
Vehicles, teams, the heavy-plank'd wharves, the huge crossing at the ferries,
The village on the highland seen from afar at sunset, the river between,
Shadows, aureola and mist, the light falling on roofs and gables of white or brown two miles off,
The schooner near by sleepily dropping down the tide, the little boat slack-tow'd astern,
The hurrying tumbling waves, quick-broken crests, slapping,
The strata of color'd clouds, the long bar of maroon-tint away solitary by itself, the spread of purity it lies motionless in,
The horizon's edge, the flying sea-crow, the fragrance of salt marsh and shore mud,
These became part of that child who went forth every day, and who now goes, and will always go forth every day.

* From http://www.princeton.edu/~batke/logr/log_190.html


Gestalt Psychology


Gestalt Psychology, founded by Max Wertheimer, was to some extent a rebellion against the molecularism of Wundt’s program for psychology, in sympathy with many others at the time, including William James. In fact, the word Gestalt means a unified or meaningful whole, which was to be the focus of psychological study instead.

It had its roots in a number of older philosophers and psychologists:

Ernst Mach (1838-1916) introduced the concepts of space forms and time forms. We see a square as a square, whether it is large or small, red or blue, in outline or technicolor... This is space form. Likewise, we hear a melody as recognizable, even if we alter the key in such a way that none of the notes are the same.

Christian von Ehrenfels (1859-1932), who studied with Brentano in Vienna, is the actual originator of the term Gestalt as the Gestalt psychologists were to use it. In 1890, in fact, he wrote a book called On Gestalt Qualities. One of his students was none other than Max Wertheimer.

Oswald Külpe (1862-1915) was a student of G. E. Müller at Göttingen and received his doctorate at Leipzig. He studied as well with Wundt, and served as Wundt’s assistant for many years. He did most of his work while at the University of Würzburg, between 1894 and 1909.

He is best known for the idea of imageless thoughts. Contrary to Wundtians, he showed that some mental activities, such as judgments and doubts, could occur without images. The "pieces" of the psyche that Wundt postulated – sensations, images, and feelings – were apparently not enough to explain all of what went on.

He oversaw the doctoral dissertation of one Max Wertheimer.

Max Wertheimer

So who was this Max Wertheimer? He was born in Prague on April 15, 1880. His father was a teacher and the director at a commercial school. Max studied law for more than two years, but decided he preferred philosophy. He left to study in Berlin, where he took classes from Stumpf, then got his doctoral degree (summa cum laude) from Külpe and the University of Würzburg in 1904.

In 1910, he went to the University of Frankfurt’s Psychological Institute. While on vacation that same year, he became interested in the perceptions he experienced on a train. While stopped at the station, he bought a toy stroboscope – a spinning drum with slots to look through and pictures on the inside, sort of a primitive movie machine or sophisticated flip book.

At Frankfurt, his former teacher Friedrich Schumann, now there as well, gave him the use of a tachistoscope to study the effect. His first subjects were two younger assistants, Wolfgang Köhler and Kurt Koffka. They would become his lifelong partners.

He published his seminal paper in 1912: "Experimental Studies of the Perception of Movement." That year, he was offered a lectureship at the University of Frankfurt. In 1916, he moved to Berlin, and in 1922 was made an assistant professor there. In 1925, he came back to Frankfurt, this time as a professor.

In 1933, he moved to the United States to escape the troubles in Germany. The next year, he began teaching at the New School for Social Research in New York City. While there, he wrote his best known book, Productive Thinking, which was published posthumously by his son, Michael Wertheimer, a successful psychologist in his own right. He died October 12, 1943 of a coronary embolism at his home in New York.


Wolfgang Köhler

Wolfgang Köhler was born January 21, 1887, in Reval, Estonia. He received his PhD in 1908 from the University of Berlin. He then became an assistant at the Psychological Institute in Frankfurt, where he met and worked with Max Wertheimer.

In 1913, he took advantage of an assignment to study at the Anthropoid Station at Tenerife in the Canary Islands, and stayed there till 1920. In 1917, he wrote his most famous book, Mentality of Apes.

In 1922, he became the chair and director of the psychology lab at the University of Berlin, where he stayed until 1935. During that time, in 1929, he wrote Gestalt Psychology. In 1935, he moved to the U.S., where he taught at Swarthmore until he retired. He died June 11, 1967 in New Hampshire.

Kurt Koffka

Kurt Koffka was born March 18, 1886, in Berlin. He received his PhD from the University of Berlin in 1909, and, just like Köhler, became an assistant at Frankfurt.

In 1911, he moved to the University of Giessen, where he taught till 1927. While there, he wrote Growth of the Mind: An Introduction to Child Psychology (1921). In 1922, he wrote an article for Psychological Bulletin which introduced the Gestalt program to readers in the U.S.

In 1927, he left for the U.S. to teach at Smith College. He published Principles of Gestalt Psychology in 1935. He died in 1941.

The Theory

Gestalt psychology is based on the observation that we often experience things that are not a part of our simple sensations. The original observation was Wertheimer’s, when he noted that we perceive motion where there is nothing more than a rapid sequence of individual sensory events. This is what he saw in the toy stroboscope he bought at the Frankfurt train station, and what he saw in his laboratory when he experimented with lights flashing in rapid succession (like the Christmas lights that appear to course around the tree, or the fancy neon signs in Las Vegas that seem to move). The effect is called the phi phenomenon, and it is actually the basic principle of motion pictures!

If we see what is not there, what is it that we are seeing? You could call it an illusion, but it's not a hallucination. Wertheimer explained that you are seeing an effect of the whole event, not contained in the sum of the parts. We see a coursing string of lights, even though only one light lights at a time, because the whole event contains relationships among the individual lights that we experience as well.
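To make the mechanism concrete, here is a minimal sketch in Python (my own illustration, not anything from the Gestaltists): only one "light" is ever on in a given frame, yet when the frames are redrawn in rapid succession, the dot appears to glide across the row.

    import time

    def phi_demo(n_lights=10, frames=30, delay=0.1):
        # One lamp lit per frame; rapid succession yields apparent motion.
        for frame in range(frames):
            lit = frame % n_lights  # index of the single lit lamp this frame
            row = "".join("o" if i == lit else "." for i in range(n_lights))
            print("\r" + row, end="", flush=True)  # redraw the row in place
            time.sleep(delay)
        print()

    if __name__ == "__main__":
        phi_demo()

No single frame contains any motion; the movement exists only in the relations between frames, which is exactly Wertheimer's point.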

Furthermore, say the Gestalt psychologists, we are built to experience the structured whole as well as the individual sensations. And not only do we have the ability to do so, we have a strong tendency to do so. We even add structure to events which do not have gestalt structural qualities.

In perception, there are many organizing principles called gestalt laws. The most general version is called the law of Prägnanz. Prägnanz is German for pregnant, but in the sense of pregnant with meaning, rather than pregnant with child. This law says that we are innately driven to experience things in as good a gestalt as possible. "Good" can mean many things here, such as regularity, order, simplicity, symmetry, and so on, which then refer to more specific gestalt laws.

For example, a set of dots outlining the shape of a star is likely to be perceived as a star, not as a set of dots. We tend to complete the figure, make it the way it "should" be, finish it. It is how we somehow manage to read a broken or incomplete letter as a "B."

The law of closure says that, if something is missing in an otherwise complete figure, we will tend to add it. A triangle, for example, with a small part of its edge missing, will still be seen as a triangle. We will "close" the gap.

The law of similarity says that we will tend to group similar items together, to see them as forming a gestalt, within a larger form. Here is a simple typographic example:

OXXXXXXXXXX
XOXXXXXXXXX
XXOXXXXXXXX
XXXOXXXXXXX
XXXXOXXXXXX
XXXXXOXXXXX
XXXXXXOXXXX
XXXXXXXOXXX
XXXXXXXXOXX
XXXXXXXXXOX
XXXXXXXXXXO

It is just natural for us to see the O’s as a line within a field of X’s.
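For what it is worth, the grid above is easy to generate; this little Python sketch (an illustration of mine, not part of the original text) builds it by marking the diagonal cells as O's in a field of X's:

    def similarity_grid(n=11):
        # O on the diagonal, X everywhere else; the similar items group
        # perceptually into a "line" even though no line is drawn.
        return "\n".join(
            "".join("O" if col == row else "X" for col in range(n))
            for row in range(n)
        )

    print(similarity_grid())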

Another law is the law of proximity. Things that are close together are seen as belonging together. For example...

**************

**************

**************

You are much more likely to see three lines of close-together *’s than 14 vertical collections of 3 *’s each.
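A simple way to mimic this judgment in code is to group items wherever the gap between neighbors stays small; the threshold rule below is my own stand-in, since the Gestaltists offered no formal algorithm:

    def group_by_proximity(xs, max_gap=2.0):
        # Sort the positions, then start a new perceived group at any
        # gap larger than max_gap.
        xs = sorted(xs)
        clusters = [[xs[0]]]
        for a, b in zip(xs, xs[1:]):
            if b - a > max_gap:
                clusters.append([])  # big gap: a new group begins
            clusters[-1].append(b)
        return clusters

    print(group_by_proximity([0, 1, 2, 10, 11, 12, 20, 21]))
    # -> [[0, 1, 2], [10, 11, 12], [20, 21]]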

Next, there’s the law of symmetry. Take a look at this example:

[ ][ ][ ]

Despite the pressure of proximity to group the brackets nearest each other together, symmetry overwhelms our perception and makes us see them as pairs of symmetrical brackets.

Another law is the law of continuity. When we can see a line, for example, as continuing through another line, rather than stopping and starting, we will do so. When two lines cross to form an X, for instance, we see two continuous lines crossing, not two angles meeting at a point.


Figure-ground is another Gestalt psychology principle. It was first introduced by the Danish phenomenologist Edgar Rubin (1886-1951). The classic example is Rubin's vase, a single image that can be seen either as a white vase or as two faces in profile.

Basically, we seem to have an innate tendency to perceive one aspect of an event as the figure or foreground and the other as the ground or background. There is only one image here, and yet, by changing nothing but our attitude, we can see two different things. It doesn’t even seem to be possible to see them both at the same time!

But the gestalt principles are by no means restricted to perception – that’s just where they were first noticed. Take, for example, memory. That too seems to work by these laws. If you see an irregular saw-tooth figure, it is likely that your memory will straighten it out for you a bit. Or, if you experience something that doesn’t quite make sense to you, you will tend to remember it as having meaning that may not have been there. A good example is dreams: Watch yourself the next time you tell someone a dream and see if you don’t notice yourself modifying the dream a little to force it to make sense!

Learning was something the Gestalt psychologists were particularly interested in. One thing they noticed right away is that we often learn, not the literal things in front of us, but the relations between them. For example, chickens can be made to peck at the lighter of two gray swatches. When they are then presented with another two swatches, one of which is the lighter of the two preceding swatches, and the other a swatch that is even lighter, they will peck not at the one they pecked at before, but at the lighter one! Even something as stupid as a chicken "understands" the idea of relative lightness and darkness.
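The relational rule the chickens seem to use can be stated in one line; the sketch below (illustrative numbers and conventions are mine) picks "the lighter one" rather than any absolute shade:

    def peck(swatch_a, swatch_b):
        # Relational rule: choose whichever swatch is lighter
        # (here, higher value = lighter gray on a 0-255 scale).
        return max(swatch_a, swatch_b)

    print(peck(120, 180))  # training pair: pecks 180, the lighter swatch
    print(peck(180, 230))  # test pair: pecks 230, not the familiar 180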

Gestalt theory is well known for its concept of insight learning. People tend to misunderstand what is being suggested here: They are not so much talking about flashes of intuition, but rather solving a problem by means of the recognition of a gestalt or organizing principle.

The most famous example of insight learning involved a chimp named Sultan. He was presented with many different practical problems (most involving getting a hard-to-reach banana). When, for example, he had been allowed to play with sticks that could be put together like a fishing pole, he appeared to consider the situation of the out-of-reach banana thoughtfully, in a very human fashion – and then rather suddenly jump up, assemble the poles, and reach the banana.

A similar example involved a five-year-old girl, presented with a geometry problem way over her head: How do you figure the area of a parallelogram? She considered, then excitedly asked for a pair of scissors. She cut off a triangle from one end, and moved it around to the other side, turning the parallelogram into a simple rectangle. Wertheimer called this productive thinking.
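Her scissors proved, in effect, that a parallelogram and a rectangle with the same base and height share one area formula; a trivial check in Python (the numbers are mine, for illustration):

    def parallelogram_area(base, height):
        # Cutting the triangle off one end and moving it to the other
        # turns the figure into a base-by-height rectangle.
        return base * height

    print(parallelogram_area(6, 4))  # -> 24, same as a 6-by-4 rectangle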

The idea behind both of these examples, and much of the gestalt explanation of things, is that the world of our experiencing is meaningfully organized, to one degree or another. When we learn or solve problems, we are essentially recognizing meaning that is there, in the experience, for the "dis-covering."

Most of what we’ve just looked at has been absorbed into "mainstream" psychology – to such a degree that many people forget to give credit to the people who discovered these principles! There is one more part of their theory that has had less acceptance: Isomorphism.

Isomorphism suggests that there is some clear similarity in the gestalt patterning of stimuli and of the activity in the brain while we are perceiving the stimuli. There is a "map" of the experience with the same structural order as the experience itself, albeit "constructed" of very different materials! We are still waiting to see what an experience "looks" like in an experiencing brain. It may take a while.

Kurt Lewin

Gestalt psychology, even though it no longer survives as a separate entity, has had an enormous impact. Two people in particular led the way in introducing it into other aspects of psychology: Kurt Goldstein and Kurt Lewin.

Kurt Lewin was born September 9, 1890, in Mogilno, Germany. He received his PhD from the University of Berlin under Stumpf. After military service, he returned to Berlin where he worked with Wertheimer, Koffka, and Köhler.

He went to the U.S. as a guest lecturer at Stanford and Cornell, and took a position at the University of Iowa in 1935. In 1944, he created and directed the Research Center for Group Dynamics at MIT. He died in 1947, just beginning his work there.

Lewin created a topological theory that expressed human dynamics in the form of a map representing a person’s life space. The map is patterned with one’s needs, desires, and goals, with vectors or arrows indicating the directions and strengths of these forces – all operating as a Gestalt.
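One might render the idea very loosely in code (this is my own construction, not Lewin's notation): each goal or fear exerts a vector force on the person, and the resultant of the whole field suggests the direction of behavior:

    from dataclasses import dataclass

    @dataclass
    class Force:
        name: str
        direction: float  # +1 toward the region, -1 away from it
        strength: float

    def resultant(forces):
        # The behavior tendency is the sum of all forces in the life space.
        return sum(f.direction * f.strength for f in forces)

    life_space = [Force("finish degree", +1.0, 0.8),
                  Force("fear of failure", -1.0, 0.3)]
    print(resultant(life_space))  # net positive: movement toward the goal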

This theory inspired any number of psychologists in the U.S., most particularly those in social psychology. Among the people he influenced were Muzafer Sherif, Solomon Asch, and Leon Festinger.

Kurt Goldstein

The other person was Kurt Goldstein. Born in 1878, he received his MD from the University of Breslau in 1903. He went to teach at the Neurological Institute of the University of Frankfurt, where he met the founders of Gestalt psychology.

He went to Berlin to be a professor there, and then went on to New York City in 1935. There, he wrote The Organism in 1939, and later Human Nature in the Light of Psychopathology in 1963. He died in 1965.

Goldstein developed a holistic view of brain function, based on research that showed that people with brain damage learned to use other parts of their brains in compensation. He extended his holism to the entire organism, and postulated that there was only one drive in human functioning, and coined the term self-actualization. Self-preservation, the usually postulated central motive, he said, is actually pathological!


Goldstein and his idea of self-actualization influenced quite a few young personality theorists and therapists. Among them were Gordon Allport, Carl Rogers, and Abraham Maslow, founders of the American humanistic psychology movement.


Wolfgang Köhler Selection: Gestalt Psychology Today (1959) *

...

I should like to begin with a few remarks about the history of Gestalt psychology – because not all chapters of this history are generally known. In the eighties of the past century, psychologists in Europe were greatly disturbed by von Ehrenfels' claim that thousands of percepts have characteristics which cannot be derived from the characteristics of their ultimate components, the so-called sensations. Chords and melodies in hearing, the shape characteristics of visual objects, the roughness or the smoothness of tactual impressions, and so forth were used as examples. All these "Gestalt qualities" have one thing in common. When the physical stimuli in question are considerably changed, while their relations are kept constant, the Gestalt qualities remain about the same. But, at the time, it was generally assumed that the sensations involved are individually determined by their individual stimuli and must therefore change when these are greatly changed. How, then, could any characteristics of the perceptual situation remain constant under these conditions? Where did the Gestalt qualities come from? Ehrenfels' qualities are not fancy ingredients of this or that particular situation which we might safely ignore. Both positive and negative esthetic characteristics of the world around us, not only of ornaments, paintings, sculptures, tunes, and so forth, but also of trees, landscapes, houses, cars – and other persons – belong to this class. That relations between the sexes largely depend on specimens of the same class need hardly be emphasized. It is, therefore, not safe to deal with problems of psychology as though there were no such qualities. And yet, beginning with Ehrenfels himself, psychologists have not been able to explain their nature.

This holds also for the men who were later called Gestalt psychologists, including the present speaker. Wertheimer's ideas and investigations developed in a different direction. His thinking was also more radical than that of Ehrenfels. He did not ask: How are Gestalt qualities possible when, basically, the perceptual scene consists of separate elements? Rather, he objected to this premise, the thesis that the psychologist's thinking must begin with a consideration of such elements. From a subjective point of view, he felt, it may be tempting to assume that all perceptual situations consist of independent, very small components. For, on this assumption, we obtain a maximally clear picture of what lies behind the observed facts. But, how do we know that a subjective clarity of this kind agrees with the nature of what we have before us? Perhaps we pay for the subjective clearness of the customary picture by ignoring all processes, all functional interrelations, which may have operated before there is a perceptual scene and which thus influence the characteristics of this scene. Are we allowed to impose on perception an extreme simplicity which, objectively, it may not possess?

Wertheimer, we remember, began to reason in this fashion when experimenting not with perceptual situations which were stationary, and therefore comparatively silent, but with visual objects in motion when corresponding stimuli did not move. Such "apparent movements," we would now say, occur when several visual objects appear or disappear in certain temporal relations. Again in our present language, under these circumstances an interaction takes place which, for instance, makes a second object appear too near, or coincident with, a first object which is just disappearing, so that only when the first object, and therefore the interaction, really fades, the second object can move toward its normal position. If this is interaction, it does not, as such, occur on the perceptual scene. On this scene, we merely observe a movement. That movements of this kind do not correspond to real movements of the stimulus objects and must therefore be brought about by the sequence of the two objects, we can discover only by examining the physical situation. It follows that, if the seen movement is the perceptual result of an interaction, this interaction itself takes place outside the perceptual field. Thus, the apparent movement confirmed Wertheimer's more general suspicion: we cannot assume that the perceptual scene is an aggregate of unrelated elements because underlying processes are already functionally interrelated when that scene emerges, and now exhibits corresponding effects.

* Address of the President at the sixty-seventh Annual Convention of the American Psychological Association, Cincinnati, Ohio, September 6, 1959. First published in American Psychologist, 14, 727-734. Abridged from Classics in the History of Psychology, an internet resource developed by Christopher D. Green, York University, Toronto, Ontario: http://psychclassics.yorku.ca

Wertheimer did not offer a more specific physiological explanation. At the time, this would have been impossible. He next turned to the problem of whether the characteristics of stationary perceptual fields are also influenced by interactions. I need not repeat how he investigated the formation of molar perceptual units, and more particularly of groups of such objects. Patterns which he used for this purpose are now reproduced in many textbooks. They clearly demonstrate that it is relations among visual objects which decide what objects become group members, and what others do not, and where, therefore, one group separates itself from another. This fact strongly suggests that perceptual groups are established by interactions; and, since a naive observer is merely aware of the result, the perceived groups, but not of their dependence upon particular relations, such interactions would again occur among the underlying processes rather than within the perceptual field.

Let me add a further remark about this early stage of the development. Surely, in those years, Gestalt psychologists were not satisfied with a quiet consideration of available facts. It seems that no major new trend in a science ever is. We were excited by what we found, and even more by the prospect of finding further revealing facts. Moreover, it was not only the stimulating newness of our enterprise which inspired us. There was also a great wave of relief – as though we were escaping from a prison. The prison was psychology as taught at the universities when we still were students. At the time, we had been shocked by the thesis that all psychological facts (not only those in perception) consist of unrelated inert atoms and that almost the only factors which combine these atoms and thus introduce action are associations formed under the influence of mere contiguity. What had disturbed us was the utter senselessness of this picture, and the implication that human life, apparently so colorful and so intensely dynamic, is actually a frightful bore. This was not true of our new picture, and we felt that further discoveries were bound to destroy what was left of the old picture.

Soon further investigations, not all of them done by Gestalt psychologists, reinforced the new trend. Rubin called attention to the difference between figure and ground. David Katz found ample evidence for the role of Gestalt factors in the field of touch as well as in color vision, and so forth. Why so much interest just in perception? Simply because in no other part of psychology are facts so readily accessible to observation. It was the hope of everybody that, once some major functional principles had been revealed in this part of psychology, similar principles would prove to be relevant to other parts, such as memory, learning, thinking, and motivation. In fact, Wertheimer and I undertook our early studies of intellectual processes precisely from this point of view; somewhat later, Kurt Lewin began his investigations of motivation which, in part, followed the same line; and we also applied the concept of Gestaltung or organization to memory, to learning, and to recall. With developments in America, Wertheimer's further analysis of thinking, Asch's and Heider's investigations in social psychology, our work on figural aftereffects, and eventually on currents of the brain, we are probably all familiar.

...

But I intended to discuss some trends in American psychology. May I confess that I do not fully approve of all these trends?

First, I doubt whether it is advisable to regard caution and a critical spirit as the virtues of a scientist, as though little else counted. They are necessary in research, just as the brakes in our cars must be kept in order and their windshields clean. But it is not because of the brakes or of the windshields that we drive. Similarly, caution and a critical spirit are like tools. They ought to be kept ready during a scientific enterprise; however, the main business of a science is gaining more and more new knowledge. I wonder why great men in physics do not call caution and a critical spirit the most important characteristics of their behavior. They seem to regard the testing of brakes and the cleaning of windshields as mere precautions, but to look forward to the next trip as the business for which they have cars. Why is it only in psychology that we hear the slightly discouraging story of mere caution over and over again? Why are just psychologists so inclined to greet the announcement of a new fact (or a new working hypothesis) almost with scorn? This is caution that has gone sour and has almost become negativism – which, of course, is no less an emotional attitude than is enthusiasm. The enthusiasm of the early Gestalt psychologists was a virtue, because it led to new observations. But virtues, it has been said, tend to breed little accompanying vices. In their enthusiasm, the Gestalt psychologists were not always sufficiently careful.

In American psychology, it is rightly regarded as a virtue if a man feels great respect for method and for caution. But, if this virtue becomes too strong, it may bring forth a spirit of skepticism and thus prevent new work. Too many young psychologists, it seems to me, either work only against something done by others or merely vary slightly what others have done before; in other words, preoccupation with method may tend to limit the range of our research. We are, of course, after clear evidence. But not in all parts of psychology can evidence immediately be clear. In some, we cannot yet use our most exact methods. Where this happens, we hesitate to proceed. Experimentalists in particular tend to avoid work on new materials resistant to approved methods and to the immediate application of perfectly clear concepts. But concepts in a new field can only be clarified by work in this field. Should we limit our studies to areas already familiar from previous research? Obviously, this would mean a kind of conservatism in psychology. When I was his student, Max Planck repeated this warning over and over again in his lectures.

Our wish to use only perfect methods and clear concepts has led to Methodological Behaviorism. Human experience in the phenomenological sense cannot yet be treated with our most reliable methods; and, when dealing with it, we may be forced to form new concepts which, at first, will often be a bit vague. Most experimentalists, therefore, refrain from observing, or even from referring to, the phenomenal scene. And yet, this is the scene on which, so far as the actors are concerned, the drama of ordinary human living is being played all the time. If we never study this scene, but insist on methods and concepts developed in research "from the outside," our results are likely to look strange to those who intensely live "inside."

To be sure, in many respects, the graphs and tables obtained "from the outside" constitute a most satisfactory material; and, in animal psychology, we have no other material. But this material as such contains no direct evidence as to the processes by which it is brought about. In this respect it is a slightly defective, I am tempted to say, a meager, material. For it owes its particular clearness to the fact that the data from which the graphs and tables are derived are severely selected data. When subjects are told to say no more than "louder," "softer," and perhaps "equal" in certain experiments, or when we merely count how many items they recall in others, then we can surely apply precise statistical techniques to what they do. But, as a less attractive consequence, we never hear under these circumstances how they do the comparing in the first case and what happens when they try to recall in the second case.

Are such questions now to be ignored? After all, not all phenomenal experiences are entirely vague; this Scheerer has rightly emphasized. And, if many are not yet accessible to quantitative procedures, what of it? One of the most fascinating disciplines, developmental physiology, the science investigating the growth of an organism from one cell, seldom uses quantitative techniques. And yet, nobody can deny that its merely qualitative description of morphogenesis has extraordinary scientific value. In new fields, not only quantitative data are relevant. As to the initial vagueness of concepts in a new field, I should like to add an historical remark. When the concept of energy was first introduced in physics, it was far from being a clear concept. For decades, its meaning could not be sharply distinguished from that of the term "force." And what did the physicists do? They worked and worked on it, until at last it did become perfectly clear. There is no other way of dealing with new, and therefore not yet perfect, concepts. Hence, if we refuse to study the phenomenal scene, because, here, few concepts are so far entirely clear, we thereby decide that this scene will never be investigated – at least not by us, the psychologists.

...

You will ask me whether my suggestions lead to any consequences in actual research. Most surely, they do. But, since I have lived so long in America, and have therefore gradually become a most cautious scientist, I am now preparing myself for the study of motivation by investigating, first of all, the action of dynamic vectors in simpler fields, such as cognition and perception. It is a most interesting occupation to compare motivational action with dynamic events in those other parts of psychology. When you do so, everything looks different, not only in perception but also in certain forms of learning. Specific work? There is, and will be more of it than I alone can possibly manage. Consequently, I need help. And where do I expect to find this help? I will tell you where.

The Behaviorist's premises, we remember, lead to certain expectations and experiments. What I have just said invites us to proceed in another direction. I suggest that, in this situation, we forget about schools. The Behaviorist is convinced that his functional concepts are those which we all ought to use. The Gestalt psychologist, who deals with a greater variety of both phenomenal and physical concepts, expects more from work based on such premises. Both parties feel that their procedures are scientifically sound. Why should we fight? Many experiments done by Behaviorists seem to me to be very good experiments. May I now ask the Behaviorists to regard the use of some phenomenal facts, and also of field physics, as perfectly permissible? If we were to agree on these points, we could, I am sure, do excellent work together. It would be an extraordinary experience – and good for psychology.


Carl Rogers Selection: The Organization of Personality *

The Relation of the Organized Perceptual Field to Behavior

One simple observation, which is repeated over and over again in each successful therapeutic case, seems to have rather deep theoretical implications. It is that as changes occur in the perception of self and in the perception of reality, changes occur in behavior. In therapy, these perceptual changes are more often concerned with the self than with the external world. Hence we find in therapy that as the perception of self alters, behavior alters. Perhaps an illustration will indicate the type of observation upon which this statement is based.

A young woman, a graduate student whom we shall call Miss Vib, came in for nine interviews. If we compare the first interview with the last, striking changes are evident. Perhaps some features of this change may be conveyed by taking from the first and last interviews all the major statements regarding self, and all the major statements regarding current behavior. In the first interview, for example, her perception of herself may be crudely indicated by taking all her own statements about herself, grouping those which seem similar, but otherwise doing a minimum of editing, and retaining, so far as possible, her own words. We then come out with this as the conscious perception of self which was hers at the outset of counseling.

I feel disorganized, muddled; I've lost all direction; my personal life has disintegrated.

I sorta experience things from the forefront of my consciousness, but nothing sinks in very deep; things don't seem real to me; I feel nothing matters; I don't have any emotional response to situations; I'm worried about myself.

I haven't been acting like myself; it doesn't seem like me; I'm a different person altogether from what I used to be in the past.

I don't understand myself; I haven't known what was happening to me.

I have withdrawn from everything, and feel all right only when I'm all alone and no one can expect me to do things.

I don't care about my personal appearance.

I don't know anything anymore.

I feel guilty about the things I have left undone.

I don't think I could ever assume responsibility for anything.

If we attempt to evaluate this picture of self from an external frame of reference various diagnostic labels may come to mind. Trying to perceive it solely from the client's frame of reference we observe that to the young woman herself she appears disorganized, and not herself. She is perplexed and almost unacquainted with what is going on in herself. She feels unable and unwilling to function in any responsible or social way. This is at least a sampling of the way she experiences or perceives herself.

Her behavior is entirely consistent with this picture of self. If we abstract all her statements describing her behavior, in the same fashion as we abstracted her statements about self, the following pattern emerges – a pattern which in this case was corroborated by outside observation.

* Excerpted from Carl R. Rogers (1947) Some Observations on the Organization of Personality. Address of the retiring President of the American Psychological Association at the September 1947 Annual Meeting. First published in American Psychologist, 2, 358-368. Reprinted in Classics in the History of Psychology, an internet resource developed by Christopher D. Green of York University, Toronto, Ontario. Posted March 2000 at http://www.yorku.ca/dept/psych/classics/Rogers/personality.htm

I couldn't get up nerve to come in before; I haven't availed myself of help.

Everything I should do or want to do, I don't do.

I haven't kept in touch with friends; I avoid making the effort to go with them; I stopped writing letters home; I don't answer letters or telephone calls; I avoid contacts that would be professionally helpful; I didn't go home though I said I would.

I failed to hand in my work in a course though I had it all done: I didn't even buy clothing that I needed; I haven't even kept my nails manicured.

I didn't listen to material we were studying; I waste hours reading the funny papers; I can spend the whole afternoon doing absolutely nothing.

The picture of behavior is very much in keeping with the picture of self, and is summed up in the statement that "Everything I should do or want to do, I don't do." The behavior goes on, in ways that seem to the individual beyond understanding and beyond control.

If we contrast this picture of self and behavior with the picture as it exists in the ninth interview, thirty-eight days later, we find both the perception of self and the ways of behaving deeply altered. Her statements about self are as follows:

I'm feeling much better; I'm taking more interest in myself.

I do have some individuality, some interests.

I seem to be getting a newer understanding of myself. I can look at myself a little better.

I realize I'm just one person, with so much ability, but I'm not worried about it; I can accept the fact that I'm not always right.

I feel more motivation, have more of a desire to go ahead.

I still occasionally regret the past, though I feel less unhappy about it; I still have a long ways to go; I don't know whether I can keep the picture of myself I'm beginning to evolve.

I can go on learning – in school or out.

I do feel more like a normal person now; I feel more I can handle my life myself; I think I'm at the point where I can go along on my own.

Outstanding in this perception of herself are three things – that she knows herself, that she can view with comfort her assets and liabilities, and finally that she has drive and control of that drive.

In this ninth interview the behavioral picture is again consistent with the perception of self. It may be abstracted in these terms.

I've been making plans about school and about a job; I've been working hard on a term paper; I've been going to the library to trace down a topic of special interest and finding it exciting.

I've cleaned out my closets; washed my clothes.

I finally wrote my parents; I'm going home for the holidays.

I'm getting out and mixing with people: I am reacting sensibly to a fellow who is interested in me – seeing both his good and bad points.

I will work toward my degree; I'll start looking for a job this week.

Her behavior, in contrast to the first interview, is now organized, forward-moving, effective, realistic and planful. It is in accord with the realistic and organized view she has achieved of her self.

It is this type of observation, in case after case, that leads us to say with some assurance that as perceptions of self and reality change, behavior changes. Likewise, in cases we might term failures, there appears to be no appreciable change in perceptual organization or in behavior.

What type of explanation might account for these concomitant changes in the perceptual field and the behavioral pattern? Let us examine some of the logical possibilities.

In the first place, it is possible that factors unrelated to therapy may have brought about the altered perception and behavior. There may have been physiological processes occurring which produced the change. There may have been alterations in the family relationships, or in the social forces, or in the educational picture or in some other area of cultural influence, which might account for the rather drastic shift in the concept of self and in the behavior.

There are difficulties in this type of explanation. Not only were there no known gross changes in the physical or cultural situation as far as Miss Vib was concerned, but the explanation gradually becomes inadequate when one tries to apply it to the many cases in which such change occurs. To postulate that some external factor brings the change and that only by chance does this period of change coincide with the period of therapy, becomes an untenable hypothesis.

Let us then look at another explanation, namely that the therapist exerted, during the nine hours of contact, a peculiarly potent cultural influence which brought about the change. Here again we are faced with several problems. It seems that nine hours scattered over five and one-half weeks is a very minute portion of time in which to bring about alteration of patterns which have been building for thirty years. We would have to postulate an influence so potent as to be classed as traumatic. This theory is particularly difficult to maintain when we find, on examining the recorded interviews, that not once in the nine hours did the therapist express any evaluation, positive or negative, of the client's initial or final perception of self, or her initial or final mode of behavior. There was not only no evaluation, but no standards expressed by which evaluation might be inferred.

There was, on the part of the therapist, evidence of warm interest in the individual, and thoroughgoing acceptance of the self and of the behavior as they existed initially, in the intermediate stages, and at the conclusion of therapy. It appears reasonable to say that the therapist established certain definite conditions of interpersonal relations, but since the very essence of this relationship is respect for the person as he is at that moment, the therapist can hardly be regarded as a cultural force making for change.

We find ourselves forced to a third type of explanation, a type of explanation which is not new to psychology, but which has had only partial acceptance. Briefly it may be put that the observed phenomena of changes seem most adequately explained by the hypothesis that given certain psychological conditions, the individual has the capacity to reorganize his field of perception, including the way he perceives himself, and that a concomitant or a resultant of this perceptual reorganization is an appropriate alteration of behavior. This puts into formal and objective terminology a clinical hypothesis which experience forces upon the therapist using a client-centered approach. One is compelled through clinical observation to develop a high degree of respect for the ego-integrative forces residing within each individual. One comes to recognize that under proper conditions the self is a basic factor in the formation of personality and in the determination of behavior. Clinical experience would strongly suggest that the self is, to some extent, an architect of self, and the above hypothesis simply puts this observation into psychological terms.

In support of this hypothesis it is noted in some cases that one of the concomitants of success in therapy is the realization on the part of the client that the self has the capacity for reorganization. Thus a student says:

You know I spoke of the fact that a person's background retards one. Like the fact that my family life wasn't good for me, and my mother certainly didn't give me any of the kind of bringing up that I should have had. Well, I've been thinking that over. It's true up to a point. But when you get so that you can see the situation, then it's really up to you.

Following this statement of the relation of the self to experience many changes occurred in this young man's behavior. In this, as in other cases, it appears that when the person comes to see himself as the perceiving, organizing agent, then reorganization of perception and consequent change in patterns of reaction take place.


On the other side of the picture we have frequently observed that when the individual has been authoritatively told that he is governed by certain factors or conditions beyond his control, it makes therapy more difficult, and it is only when the individual discovers for himself that he can organize his perceptions that change is possible. In veterans who have been given their own psychiatric diagnosis, the effect is often that of making the individual feel that he is under an unalterable doom, that he is unable to control the organization of his life. When however the self sees itself as capable of reorganizing its own perceptual field, a marked change in basic confidence occurs. Miss Nam, a student, illustrates this phenomenon when she says, after having made progress in therapy:

I think I do feel better about the future, too, because it's as if I won't be acting in darkness. It's sort of, well, knowing somewhat why I act the way I do ... and at least it isn't the feeling that you're simply out of your own control and the fates are driving you to act that way. If you realize it, I think you can do something more about it.

A veteran at the conclusion of counseling puts it more briefly and more positively: "My attitude toward myself is changed now to where I feel I can do something with my self and life." He has come to view himself as the instrument by which some reorganization can take place.

There is another clinical observation which may be cited in support of the general hypothesis that there is a close relationship between behavior and the way in which reality is viewed by the individual. It has been noted in many cases that behavior changes come about for the most part imperceptibly and almost automatically, once the perceptual reorganization has taken place. A young wife who has been reacting violently to her maid, and has been quite disorganized in her behavior as a result of this antipathy says:

After I ... discovered it was nothing more than that she resembled my mother, she didn't bother me any more. Isn't that interesting? She's still the same.

Here is a clear statement indicating that though the basic perceptions have not changed, they have been differently organized, have acquired a new meaning, and that behavior changes then occur. Similar evidence is given by a client, a trained psychologist, who after completing a brief series of client-centered interviews, writes:

Another interesting aspect of the situation was in connection with the changes in some of my attitudes. When the change occurred, it was as if earlier attitudes were wiped out as completely as if erased from a blackboard.... When a situation which would formerly have provoked a given type of response occurred, it was not as if I was tempted to act in the way I formerly had but in some way found it easier to control my behavior. Rather the new type of behavior came quite spontaneously, and it was only through a deliberate analysis that I became aware that I was acting in a new and different way.

Here again it is of interest that the imagery is put in terms of visual perception and that as attitudes are "erased from the blackboard" behavioral changes take place automatically and without conscious effort.

Thus we have observed that appropriate changes in behavior occur when the individual acquires a different view of his world of experience, including himself; that this changed perception does not need to be dependent upon a change in the "reality," but may be a product of internal reorganization; that in some instances the awareness of the capacity for reperceiving experience accompanies this process of reorganization; that the altered behavioral responses occur automatically and without conscious effort as soon as the perceptual reorganization has taken place, apparently as a result of this.

In view of these observations a second hypothesis may be stated, which is closely related to the first. It is that behavior is not directly influenced or determined by organic or cultural factors, but primarily (and perhaps only) by the perception of these elements. In other words the crucial element in the determination of behavior is the perceptual field of the individual. While this perceptual field is, to be sure, deeply influenced and largely shaped by cultural and physiological forces, it is nevertheless important that it appears to be only the field as it is perceived, which exercises a specific determining influence upon behavior. This is not a new idea in psychology, but its implications have not always been fully recognized.


It might mean, first of all, that if it is the perceptual field which determines behavior, then the primary object of study for psychologists would be the person and his world as viewed by the person himself. It could mean that the internal frame of reference of the person might well constitute the field of psychology, an idea set forth persuasively by Snygg and Combs in a significant manuscript as yet unpublished. It might mean that the laws which govern behavior would be discovered more deeply by turning our attention to the laws which govern perception.

Now if our speculations contain a measure of truth, if the specific determinant of behavior is the perceptual field, and if the self can reorganize that perceptual field, then what are the limits of this process? Is the reorganization of perception capricious, or does it follow certain laws? Are there limits to the degree of reorganization? If so, what are they? In this connection we have observed with some care the perception of one portion of the field of experience, the portion we call the self.


Phenomenological Existentialism


Phenomenology is a research technique that involves the careful description of aspects of human life as they are lived; Existentialism, deriving its insights from phenomenology, is the philosophical attitude that views human life from the inside rather than pretending to understand it from an outside, "objective" point of view. Phenomenological existentialism, as a philosophy or a psychology, is not a tightly defined system by any means. And yet its adherents are relatively easily identified by their emphasis on the importance of individuals and their freedom to participate in their own creation. It is a psychology that emphasizes our creative processes far more than our adherence to laws, be they human, natural, or divine.

Franz Brentano

Franz Brentano was born January 16, 1838 in Marienberg, Germany. He became a priest in 1864 and began teaching two years later at the University of Würzburg. Religious doubts led him to leave the priesthood and resign from his teaching position in 1873.

The following year, he wrote Psychology from an Empirical Standpoint. It was in this book that he introduced the concept that is most associated with him: intentionality or immanent objectivity. This is the idea that what makes mind different from things is that mental acts are always directed at something beyond themselves: Seeing implies something seen, willing means something willed, imagining implies something imagined, judging points at something judged. Intentionality links the subject and the object in a very powerful way. He was given a position as professor at the University of Vienna soon after.

In 1880, he tried to marry, but his marriage was forbidden by the Austrian government, who still considered him a priest. He left his professorship and moved to Leipzig to get married. The next year, he was permitted to come back to the University of Vienna, as a lecturer.

He was quite popular with students. Among them were Carl Stumpf and Edmund Husserl, the founders of phenomenology, and Sigmund Freud himself. Brentano retired in 1895, but continued to write until his death on March 17, 1917, in Zurich.

Carl Stumpf

Carl Stumpf was born April 21, 1848 in Wiesentheid in Bavaria. He was strongly influenced by Brentano. As a lecturer at the University of Göttingen, he published The Psychological Origins of Space Perception in 1870. In 1873, he became a professor at the University of Würzburg. His masterwork, Tone Psychology, was completed during a series of professorships at Prague, Halle, and Munich.

He became a professor and the director of the institute of experimental psychology at the Friedrich-Wilhelm University in Berlin in 1894, where he continued his work on the psychology of music, started a journal on the subject, and began an archive of primitive music.

Stumpf retired in 1921, continuing his work until his death on December 25, 1936, in Berlin. With Husserl, he is considered a cofounder of phenomenology and in particular an inspiration to the Gestalt psychologists.


Edmund Husserl

Edmund Husserl was born on April 8, 1859 in Prossnitz, Moravia. He studied philosophy, math, and physics at Leipzig, Berlin, and Vienna and received his doctorate in mathematics from the University of Vienna in 1882. He then studied under Franz Brentano in Vienna.

Husserl, born into a Jewish family, converted to Lutheranism in 1886, and in 1887 married Malvine Steinschneider, also a convert. They had three children. In these same years, he went to study with Carl Stumpf at the University of Halle and became a lecturer there. They became good friends and exchanged ideas.

While at Halle, he agonized over the connection between mathematics and the nature of the mind. He recognized that his original ideas, which involved mathematics as coming out of psychology, were misguided. So he began the development of his brand of phenomenology as a way of investigating the nature of experience itself. This led to the publication of Logical Investigations in 1900.

He was invited to a professorship at the University of Göttingen in 1901, where students began to form a circle around him and his work. He also developed a friendship with Wilhelm Dilthey, and was influenced by Dilthey’s ideas concerning the historical context of science.

In 1916, he went to the University of Freiburg. Here he wrote First Philosophy (1923-4), which outlined his belief that phenomenology offered a means towards moral development and a better world. He received many honors and gave guest lectures at the University of London, the University of Amsterdam, and the Sorbonne, making his ideas available to a new, wider audience.

He retired in 1928. Martin Heidegger, with Husserl’s strong approval, was appointed his successor. As Heidegger’s work developed into the basis of existentialism, Husserl distanced himself from the new movement.

When the Nazis took over in 1933, Husserl, born a Jew, was banned from the university. He nevertheless continued providing support to friends in the resistance. He spoke on the European crisis in Vienna in 1935 despite being under a rule of silence. He also spoke at the University of Prague that year, where his unpublished manuscripts were being collected and cataloged.

His last work, The Crisis of European Sciences and Transcendental Phenomenology (1936), introduced the concept of Lebenswelt. The next year, he became ill and, on April 27, 1938, he died.

Phenomenology

Phenomenology is an effort at improving our understanding of ourselves and our world by means of careful description of experience. On the surface, this seems like little more than naturalistic observation and introspection. Examined a little more closely, you can see that the basic assumptions are quite different from those of the mainstream experimentally-oriented human sciences: In doing phenomenology, we try to describe phenomena without reducing those phenomena to supposedly objective non-phenomena. Instead of appealing to objectivity for validation, we appeal instead to inter-subjective agreement.

Phenomenology begins with phenomena – appearances, that which we experience, that which is given – and stays with them. It doesn't prejudge an experience as to its qualifications to be an experience. Instead, by taking up a phenomenological attitude, we ask the experience to tell us what it is.


The most basic kind of phenomenology is the description of a particular phenomenon such as a momentary happening, a thing, or even a person, i.e. something full of its uniqueness. Herbert Spiegelberg (1965) outlines three "steps":

1. Intuiting – Experience or recall the phenomenon. "Hold" it in your awareness, or live in it, be involved in it; dwell in it or on it.

2. Analyzing – Examine the phenomenon. Look for...

the pieces, parts, in the spatial sense; the episodes and sequences, in the temporal sense; the qualities and dimensions of the phenomenon;
the settings, environments, surroundings; the prerequisites and consequences in time; the perspectives or approaches one can take;
the cores or foci and fringes or horizons; the appearing and disappearing of the phenomena; the clarity of the phenomenon.

And investigate these many aspects both in their outward forms – objects, actions, others – and in their inward forms – thoughts, images, feelings.

3. Describing – Write down your description. Write it as if the reader had never had the experience. Guide them through your intuiting and analyzing.

What makes these three simple steps so difficult is the attitude you must maintain as you perform them. First, you must have a certain respect for the phenomenon. You must take care that you are intuiting it fully, from all possible "angles," physically and mentally, and leaving nothing out of the analysis that belongs there. Herbert Spiegelberg said "The genuine will to know calls for the spirit of generosity rather than for that of economy...."*

Included in this "generosity" is a respect for both public and private events, the "objective" and the "subjective." A basic point in phenomenology is called intentionality, which refers to the mutuality of the subject and the object in experience: All phenomena involve both an intending act and an intended object. Traditionally, we have emphasized the value of the object-pole and denigrated that of the subject-pole. In fact, we have gone so far as to dismiss even the object-pole if it doesn't correspond to some physical entity! But, to quote Spiegelberg again, "Even merely private phenomena are facts which we have no business to ignore. A science which refuses to take account of them as such is guilty of suppressing evidence and will end with a truncated universe."*

On the other hand, we must also be on guard against including things in our descriptions that don't belong there. This is the function of bracketing: We must put aside all biases we may have about the phenomenon. When you have a prejudice against a person, you will see what you expect, rather than what is there. The same applies to phenomena in general: You must approach them without theories, hypotheses, metaphysical assumptions, religious beliefs, or even common sense conceptions. Ultimately, bracketing means suspending judgement about the "true nature" or "ultimate reality" of the experience – even whether or not it exists!

Although the description of individual phenomena is interesting in its own right – and when it involves people or cultures, a massive undertaking as well – we usually come to a point where we want to say something about the class the phenomenon is a part of. In phenomenology, we talk about seeking the essence or structure of something. So we might investigate the essence of triangularity, or of pizza, or of anger, or of being male or female. We might even, as the phenomenological existentialists have attempted, seek the essence of being human!

Husserl suggested a method called free imaginative variation: When you feel you have a description of the essential characteristics of a category of phenomena, ask yourself, "What can I change or leave out without losing the phenomenon? If I color the triangle blue, or construct it out of Brazilian rosewood, do I still have a triangle? If I leave out an angle, or curve the sides, do I still have a triangle?" This may seem trivial and easy, but now try it regarding "being human": Is a corpse human? A disembodied spirit? A person in a permanent coma? A porpoise with intelligence and personality? A just-fertilized egg? A six-month-old fetus?

* Spiegelberg, Herbert (1965). The Phenomenological Movement. The Hague: Martinus Nijhoff.

With phenomenology, the world regains some of its solidity, the mind is again permitted a reality of its own, and a rather paranoid skepticism is replaced with a more generous, and ultimately more satisfying, curiosity. By returning, as Husserl (1965, 1970) put it, to "the things themselves," or, to use another of his terms, to the lived world (Lebenswelt), we stand a better chance at developing a true understanding of our human existence.

Martin Heidegger

Martin Heidegger was born on September 26, 1889, in Messkirch, Germany. His father was the sexton of the local church, and Heidegger followed suit by joining the Jesuits. He studied the theology and philosophy of the Middle Ages, as well as the more recent work of Franz Brentano.

He studied with Heinrich Rickert, a well known Kantian, and with Husserl. He received his doctorate in 1914, and began teaching at the University of Freiburg the following year. Although he was strongly influenced by Husserl’s phenomenology, his interests lay more in the meaning of existence itself.

In 1923, he became a professor at the University of Marburg, and in 1927, he published his masterwork, Being and Time (Sein und Zeit). Influenced by the ancient Greeks, by Kierkegaard, Nietzsche, and Dilthey, and by Husserl, it was an exploration of the verb "to be," particularly from the standpoint of a human being in time. Densely and obscurely written, it was nevertheless well received all over Europe, though not in the English-speaking world.

In 1928, he returned to Freiburg as Husserl’s successor. In the 1930’s, the Nazis began pressuring German universities to fire their Jewish professors. The rector at Freiburg resigned in protest and Heidegger was elected to take his place. Although he strongly encouraged students and professors to be true to their search for truth, he nevertheless also encouraged loyalty to Hitler. He even joined the Nazi party. Many people, otherwise admirers of his thinking, have never forgiven him for that.

To be fair, he did resign from his position as rector in 1934, and after the war talked about Nazism as a symptom of the sickness of modern society. He stopped teaching in 1944, and after the war, the Allied forces prevented him from further teaching. But they later restored his right to teach in light of the fact that his support of Hitler was of a passive rather than active nature. He died in Messkirch on May 26, 1976.

Heidegger spent his entire life asking one question: What is it "to be?" Behind all our day-to-day living, for that matter, behind all our philosophical and scientific investigations of that life, how is it that we "are" at all?

Phenomenology reveals the ways in which we are. The first hurdle is our traditional contrast between subject and object, which splits man as knower from his environment as the known. But in the phenomenological attitude, experience doesn’t show this split. Knower and known are both inextricably bound together. Instead, it appears that the subject-object split is something we developed late in history, especially with the advent of modern science.

The problems of the modern world come from the "falling" of western thought: Instead of a concern with the development of ourselves as human beings, we have allowed technology and technique to rule our lives and lead us to a false way of being. This alienation from our true nature is called inauthenticity.

Much of what is difficult about reading Heidegger is that he tries to recover the kind of being that was before the subject-object split by looking at the origins of words, especially Greek words. In as much as the ancient Greeks were less alienated from themselves and their world, their language should offer us a clue to their relation to being.

Heidegger says that we have a special relationship to the world, which he refers to by calling human existence Dasein. Dasein means "being there," and emphasizes that we are totally immersed in the world, and yet we stand-out (ex-sist) as well. We are a little off-center, you might say, never quite stable, always becoming.

A big part of our peculiar nature is that we have freedom. We create ourselves by choosing. We are our own projects. This freedom, however, is painful, and we experience life as filled with anxiety (Angst, dread). Our potential for freedom calls us to authentic being by means of anxiety.

One of the central sources of anxiety is the recognition that we all have to die. Our limited time here on earth makes our choices far more meaningful, and makes the need to choose authenticity urgent. We are, he says, being-towards-death (Sein-zum-Tode).

All too often, we surrender in the face of anxiety and death, a condition Heidegger calls fallenness. We become "das Man" – "the everybody" – nobody in particular, the anonymous man, one of the crowd or the mob.

Two characteristics of "das Man" are idle talk and curiosity. Idle talk is small talk, chatter, gossip, shallow interaction, as opposed to true openness to each other. Curiosity refers to our need for distraction, novelty-seeking, busy-body-ness, as opposed to a true capacity for wonder.

We become authentic by thinking about being, by facing anxiety and death head-on. Here, he says, lies joy.

Jean-Paul Sartre

"Man is a useless passion."

Sartre is probably the best known of the existentialists, and clearly straddles the line between philosopher and psychologist (and social activist as well!). Jean-Paul Sartre was born on June 21, 1905, in Paris, France, the only child of Jean-Baptiste Sartre and his wife Anne-Marie. His father died one year later of colitis, so his mother took him to live with her father, Charles Schweitzer, a professor of German at the Sorbonne and the uncle of the famous missionary-philosopher Albert Schweitzer.

Rather lost in her father's disciplined household, Anne-Marie grew very close to her small but highly intelligent son. A childhood illness left Sartre blind in one eye, which drifted outward and up, so he forever seemed to be looking elsewhere. Lonely, he began to write stories and plays as an escape.

Anne-Marie escaped her father's house by getting remarried when Jean-Paul was twelve. Jean-Paul became rebellious and unmanageable, so he was sent to a boarding school. There, he continued his trouble-making ways, and frequently spent time in detention.

After lycée (roughly, high school), Sartre attended the École Normale Supérieure at the Sorbonne. Brilliant but disorganized and inattentive, he placed last out of 50 students on his exit exams. The following year, he studied with a young woman named Simone de Beauvoir, and graduated in 1929. He placed first this time, and she second. They would have a strong but open love relationship until their deaths.

After graduation, Sartre taught at a series of lycées for many years. He spent one year in Berlin studying the work of Edmund Husserl, the founder of phenomenology. This approach would figure prominently in several of his philosophical works, including Imagination (1936), Sketch for a Theory of Emotions (1939), and The Psychology of Imagination (1940).

In 1938, Sartre published his first novel, Nausea. In this novel, he writes about the nausea that his character feels when he contemplates the "thickness" of the material world, including other people and his own body. The novel is strange, but the descriptions are compelling, and Sartre began to make a name for himself.

In 1939, Sartre was drafted into the army. He was taken prisoner in 1940 and released a year later. His experiences as a participant in the resistance would color many of his later works. In June of 1943, his play The Flies opened in Paris. Even though it was blatantly anti-Nazi, the play was sometimes attended by Nazi officers!

Also in 1943, he published his masterpiece, L'être et le néant (Being and Nothingness). In this large and difficult work, he outlined his theory that human consciousness was a sort of no-thing-ness surrounded by the thickness of being. As a "nothingness," human consciousness is free from determinism, resulting in the difficult situation of our being ultimately responsible for our own lives. "Man is condemned to be free." On the other hand, without an "essence" to provide direction, human consciousness is also ultimately meaningless.

"All existing things are born for no reason, continue through weakness and die by accident.... It is meaningless that we are born; it is meaningless that we die."

Perhaps his best known philosophical point is "existence precedes essence." In the case of non-human entities, an essence is something that is prior to something's actual existence. A table's essence is the intention that its creator, builder, or user has for it, such as its general shape, components, and function. A woodchuck's essence is in its genetic inheritance, its instincts, and the conditions of its environment – and its entire life is sort of the playing out of a program. But a human being, according to Sartre, doesn't have a true essence. Oh, sure, we have our general shape, our genetics, our upbringings and the like. But they do not determine our lives; they only set the stage. It is we ourselves who shape our lives. We are the ones who choose what to do with the raw materials nature has provided us. We create ourselves. And our "essence" is only clear when our whole life is done. Another way to put it is that our "essence" is our lack of essence; our "essence" is our freedom.

In 1944, he produced one of his most famous plays, Huis clos (No Exit). This play, and several others, present the problem of living with one's fellow man, and are quite pessimistic. "Hell is other people" is the famous quote from No Exit.

After World War II, Sartre became increasingly concerned with the issue of social responsibility. He postulated that being free meant not only being responsible for your own life, but being responsible for the lives of all human beings. He outlined this idea in Existentialism and Humanism (1946) and a series of novels called Les Chemins de la liberté (The Roads to Freedom, begun in 1945 and never completed).

"But if existence really does precede essence, man is responsible for what he is. Thus, existentialism's first move is to make every man aware of what he is and to make the full responsibility of his existence rest on him. And when we say that a man is responsible for himself, we do not only mean that he is responsible for his own individuality, but that he is responsible for all men."

Sartre also wrote deep psychological examinations of famous French writers: Baudelaire (1947), Jean Genet (1952), and Flaubert (two of three volumes completed in 1971 and 1972). In these, he looks at these writers from existential, psychoanalytic, and Marxist perspectives, in an effort to create the most complete phenomenological portraits possible. The books are, unfortunately, practically unreadable!


Sartre was an admirer of Karl Marx's writings, and of the Soviet Union. His support of Russian communism ended in 1956 when the Russian army marched into Budapest to stop the Hungarian efforts at independence. (My own family emigrated to the US from the Netherlands in that year, fearful of a third world war.) Still hopeful, he wrote a critical analysis of Marxism in 1960 supporting the fundamental ideas of Marx, but criticizing the Russian form Marxism had taken.

In 1963, he published his autobiography, Les Mots (Words). He was awarded the Nobel Prize the following year, but he refused it on political grounds. Here is an example of his evocative style:

"I'm a dog. I yawn, the tears roll down my cheeks, I feel them. I'm a tree, the wind gets caught in my branches and shakes them vaguely. I'm a fly, I climb up a window-pane, I fall, I start climbing again. Now and then, I feel the caress of time as it goes by. At other times - most often - I feel it standing still. Trembling minutes drop down, engulf me, and are a long time dying. Wallowing, but still alive, they're swept away. They are replaced by others which are fresher but equally futile. This disgust is called happiness."

Toward the end of the 1970's, Sartre's health began to degenerate. His bad habits included smoking two packs of unfiltered French cigarettes a day, heavy drinking, and the use of amphetamines to help him stay awake while writing. He died on April 15, 1980, of lung cancer. Simone de Beauvoir tried to stay with his body and had to be taken away by attendants. His funeral procession was attended by over 50,000 mourners.

This philosophy of existentialism, as difficult as it is to express and live, had a great impact on any number of thinkers in the twentieth century. Among them are philosophers such as Simone de Beauvoir, Albert Camus, Martin Buber, Ortega y Gasset, Gabriel Marcel, Paul Tillich, and Merleau-Ponty, psychologists such as Ludwig Binswanger, Medard Boss, Erich Fromm, Rollo May, and Viktor Frankl, and even the post-modernist movement’s Foucault and Derrida. Less directly, existentialism has influenced American psychologists such as Carl Rogers. The influence continues to this day.


James Joyce Selection: Portrait of the Artist as a Young Man *

[Have you ever noticed how some writers seem to understand the human mind (and heart) far better than any psychologist or psychiatrist? Here's an incredible example of literary phenomenology by the amazing James Joyce.]

He was alone. He was unheeded, happy and near to the wild heart of life. He was alone and young and wilful and wildhearted, alone amid a waste of wild air and brackish waters and the sea-harvest of shells and tangle and veiled grey sunlight and gayclad lightclad figures of children and girls and voices childish and girlish in the air.

A girl stood before him in midstream, alone and still, gazing out to sea. She seemed like one whom magic had changed into the likeness of a strange and beautiful seabird. Her long slender bare legs were delicate as a crane's and pure save where an emerald trail of seaweed had fashioned itself as a sign upon the flesh. Her thighs, fuller and soft-hued as ivory, were bared almost to the hips, where the white fringes of her drawers were like feathering of soft white down. Her slate-blue skirts were kilted boldly about her waist and dovetailed behind her. Her bosom was as a bird's, soft and slight, slight and soft as the breast of some dark-plumaged dove. But her long fair hair was girlish: and girlish, and touched with the wonder of mortal beauty, her face.

She was alone and still, gazing out to sea; and when she felt his presence and the worship of his eyes her eyes turned to him in quiet sufferance of his gaze, without shame or wantonness. Long, long she suffered his gaze and then quietly withdrew her eyes from his and bent them towards the stream, gently stirring the water with her foot hither and thither. The first faint noise of gently moving water broke the silence, low and faint and whispering, faint as the bells of sleep; hither and thither, hither and thither; and a faint flame trembled on her cheek.

– Heavenly God! cried Stephen's soul, in an outburst of profane joy.

He turned away from her suddenly and set off across the strand. His cheeks were aflame; his body was aglow; his limbs were trembling. On and on and on and on he strode, far out over the sands, singing wildly to the sea, crying to greet the advent of the life that had cried to him.

* From http://www.bibliomania.com/Fiction/joyce/artist/index.html
[Image: Renoir's "Girl Braiding Hair"]


Romance: A Partial Analysis *

Romance is a mood or state of mind akin to several others, including love, friendship, sexual interest, contentment, self-assuredness, and so on.

It is normally experienced in the context of an actual relationship, although it may be experienced in other ways, such as in fantasy, expectation, or possibility. It may also be experienced vicariously, such as when watching a romantic movie or real couples in romantic situations. It is even experienced occasionally with friends or relations.

It is, more specifically, associated with courtship and with the intimations of sexuality that go with it. It is itself, however, not primarily sexual. In fact, it often has an innocent feel to it, and is associated with "puppy" love, first love, early flirtations, and the like.

Romance often involves courtship symbols, traditions, and stereotypes, such as flowers, gifts, hand-holding, candle-lit dinners, "romantic" music, .... These, however, are not essential, but rather seem to derive from certain natural ways of expressing romantic feelings. Once upon a time, they were probably original! These symbols, etc., are now often used to "set the stage" for romance.

The romantic state of mind often seems to come on suddenly, a matter of rather abruptly becoming aware of being in a romantic moment. It very often involves surprise. This is where many of the aforementioned symbols come into play: Romance often involves being surprised by signs of someone's affection, whether it be in the form of a gift, a helping hand, an appreciative glance, a confidence shared, or what have you.

Associated with surprise is the sense of great motion, lightness, being swept up in the moment, or swept off your feet! On the other hand, some people instead focus on a feeling of steadiness and solidity, reflecting the firmness of a commitment or the solidity of a relationship, especially in adversity. The lightness in oneself and the steadiness of the other are by no means incompatible.

There is often a degree of gender stereotyping involved in romance: "He made me feel pretty, feminine.... He is my knight in shining armor.... He swept me off my feet.... I found comfort in his broad shoulders...." These comments are used to good advantage in romance novels, but have their sources in ordinary experience. In men, we find similar statements, in reverse: "She made me feel strong, like a real man...." Please note that this is not to be understood as a "power thing," but rather an awareness of the need to care for a woman, to "nurture." The connection with courtship seems quite strong, despite the many exceptions.

The mood may come upon both people naturally, but it is often "arranged for" by one or the other. The structure of the romantic episode seems best left simple and it is greatly enhanced by at least the appearance of spontaneity.

Circumstances can be very important. A small gesture or sign of support in adverse circumstances can be far more valuable than great generosity in good circumstances. Romance seems, in fact, to thrive on adversity, as in our common recollections of our "poor days."

This introduces as well the symbolism of the hero and the fair maiden in fairy tales. Selfless help in adversity, revealing deep affection, is a theme common to most fairy tales, many movies, and many real-life romantic moments as well.

The key feeling would seem to be one of a heightened self-worth seen as coming from the other person. Examples would include feeling especially attractive, important, strong, interesting, intelligent, and so on. Even the sense that one has been involved in something important can bring on a sense of romance. The increase in self-worth, curiously, results in an increase in one's valuing of or affection for the other.

Paradoxically, these feelings can also occur in reverse, so that coming upon the other person in circumstances that lead you to particularly value him or her may lead to feelings of strength, security, confidence, etc., and this too is felt as romantic! Common to both is the sense of being fortunate or lucky to be you, to be there, to be with this person.

Other aspects of a romantic mental state include (a) lightness, airiness, giddiness, a glow, excitement, enchantment, joking and laughing; (b) coziness, cuddling, contentment, comfort, closeness; and (c) riskiness, danger, and naughtiness. Set (a) seems most common, with the others being variations, and (c) being the least common, but certainly not rare.

The essence of romance seems to me to be the sudden discovery or bringing to awareness (whether by accident or by arrangement) of your importance or value to another, along with an awareness of their value to you. It is a confirmation that one is "lovable" or worthy of affection, whether in the eyes of a desirable young man or woman or in the context of a long, comfortable marriage. This confirmation comes with many of the qualities associated with other kinds of "ego-transcendence" or "ego-expansion," such as love itself: By losing yourself in your affection for another, you become stronger as an individual. As is often mentioned, it is just one of those things that defies logic!

* Based on a class exercise


Modern Medicine and Physiology


Technology and the brain

In the 1800's, anatomy had reached a point of sophistication that allowed medical artists to make such intricate drawings that modern surgeons could still benefit from them. But there was always a limitation involved: It was one thing to carve up a dead brain – quite another to actually see a living brain at work. In the late 1800's and throughout the 1900's, we see some remarkable efforts at exploring the brain without removing it from its owner: First, Wilhelm Conrad Roentgen discovers the x-ray in 1895. A remarkable tool for physicians and researchers, it proves less useful when it comes to the soft tissues of the brain. In 1972, Godfrey Hounsfield added the computer to the x-ray and developed computerized (axial) tomography – the CT (or CAT) scan – which combines multiple x-ray images into a far more detailed three-dimensional image.

In a very different approach, Hans Berger developed the first electroencephalogram (EEG) in 1929. In 1932, Jan Friedrich Tönnies created the first modern version, with its moving paper and vibrating pens. The EEG records the minute, coordinated electrical pulses of large numbers of neurons on the surface of the cortex. It was only a matter of time before researchers added the computer to the equation.

In 1981, the team of Phelps, Hoffman, and TerPogossian developed the first PET scan. The PET scan (positron emission tomography) works like this: The doctor injects radioactive glucose (that’s sugar water) into the patient’s bloodstream. The device then detects the relative activity level – that is, the use of glucose – of different areas of the brain. The computer generates an image that allows the researcher to tell which parts of the brain are most active when we perform various mental operations, whether it’s looking at something, counting in our heads, imagining something, or listening to music!

In 1937, Isidor I. Rabi, a professor at Columbia University, noticed that atoms reveal themselves by emitting radio waves after first having been subjected to a powerful magnetic field. He called this nuclear magnetic resonance or NMR. This was soon used by scientists to identify chemical substances in the lab. It would be many years before a Dr. Raymond Damadian recognized the potential of NMR for medicine.

Damadian is an interesting and controversial person. He was born in New York City in 1936. When he was only eight years old, he was accepted by the Juilliard School of Music. He was awarded a scholarship to the University of Wisconsin at Madison, and then went on to medical school at the Albert Einstein College of Medicine of the Yeshiva University in the Bronx. He received his MD in 1960 at the tender age of 24. From there, he began medical research at Brooklyn's Downstate Medical Center.

Investigating tumors in rats, he noted that the NMR signals from cancerous tumors were significantly different from the signals from healthy tissue. He hypothesized that the reason was the larger number of water molecules (and therefore hydrogen atoms) in these tumors. His findings were published in Science in 1971.

Realizing that this was the basis for a non-surgical way to detect cancer, he got the idea for a large-scale NMR device that could record the radio waves coming from all the atoms in a human being. You only had to create a magnetic field big enough!

In 1977, he and his students built a temperamental prototype of the modern MRI – magnetic resonance imaging – which they called the Indomitable. He tried it, unsuccessfully, on himself first, then on a graduate student named Larry Minkoff. The result was a mere 106 data points (recorded first in colored pencils!) describing the tissues of Minkoff's chest. The Indomitable is now in the Smithsonian.

Damadian's story continues with his filing of a patent and years of litigation trying to fight off companies like Hitachi and General Electric who disputed his patent. He has also stirred up controversy by supporting the work of so-called "creation scientists."

There have been a number of other scientists studying NMR who were in fact heading in the same direction as Damadian. One person in particular with a legitimate claim to co-discovery is Paul Lauterbur. He developed the idea of using small NMR gradients to map the body while at SUNY Stony Brook. In 1973, he used his technique on a test tube of water, and then used it on a clam. His work was published in Nature, and it is his technique that is favored today. Lauterbur and British MRI researcher Peter Mansfield were awarded the Nobel Prize in 2003.

The MRI works like this: You create a strong magnetic field which runs through the person from head to toe. This causes the spinning hydrogen atoms in the person’s body to line up with the magnetic field. Then you send a radio pulse at a special frequency that causes the hydrogen protons to spin in a different direction. When you turn off the radio pulse, the protons will return to their alignment with the magnetic field, and release the extra energy they took in from the radio pulse. That energy is picked up by the same coil that produced the pulse, now acting like a three-dimensional antenna. Since different tissues have different relative amounts of hydrogen in them, they give a different density of energy signals, which the computer organizes into a detailed three-dimensional image. This image is nearly as detailed as an anatomical photograph!
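To put a number on that "special frequency": it is the resonance (or Larmor) frequency of hydrogen, which is simply proportional to the strength of the magnetic field – about 42.58 MHz per tesla. A minimal sketch in Python (the field strengths below are just typical clinical magnet values, not from the text):

    # Hydrogen protons resonate at the Larmor frequency:
    # f = (gamma / 2*pi) * B0, where gamma/2*pi is about
    # 42.58 MHz per tesla for hydrogen.
    GAMMA_OVER_2PI = 42.58  # MHz per tesla

    def larmor_mhz(b0_tesla):
        """Resonance frequency (in MHz) of hydrogen at field strength b0."""
        return GAMMA_OVER_2PI * b0_tesla

    for b0 in (0.5, 1.5, 3.0):  # common clinical magnet strengths
        print(f"{b0} T -> {larmor_mhz(b0):.1f} MHz")
    # 1.5 T, a common scanner strength, gives about 63.9 MHz -
    # a radio frequency, which is why the scanner works with radio pulses.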

On the more active side, direct electrical stimulation of the brain of a living person became a fine art in the 1900's. In 1909, Harvey Cushing mapped the somatosensory cortex. In 1954, James Olds produced a media sensation by discovering the so-called "pleasure center" of the hypothalamus. By the end of the century, the specialized areas of the brain were pretty well mapped.

Brain surgery also became more effective. In the process of looking for surgical relief for extreme epilepsy, it was discovered that cutting the corpus callosum, which joins the two hemispheres of the cerebral cortex, greatly improved the patients' condition. Roger Sperry was then able to discover the various differences between the left and right hemispheres in some of the most interesting studies in history. He was awarded the Nobel Prize for his work in 1981.

The other aspect of technology is its use in attempting to heal people with mental illness. Although extremely controversial to this day, the evidence strongly suggests that electroshock therapy, first used by Ugo Cerletti and Lucio Bini in 1938, can be effective in the care of very depressed patients. Electroshock (also known as electro-convulsive therapy or ECT) involves sending a strong electrical current through an anesthetized patient's brain. When they awake, they cannot seem to recall several hours of time before the procedure, but also feel much less depressed. We aren't sure why it works.

Less effective and much more radical is the lobotomy, first used on human beings by Antonio Egas Moniz of the University of Lisbon Medical School, who won the Nobel Prize for his work in 1949. The lobotomy was turned into a mass-production technique by Walter Freeman, who performed the first lobotomy in the U.S. in 1936.

The psychopharmacological explosion

In the 1800's, the basic principles of the nervous system were slowly being unraveled by people such as Galvani in Italy and Helmholtz in Germany. Toward the end of the 1800s, biologists were approaching an understanding of the details. In particular, Camillo Golgi (who believed that the nervous system was a single entity) invented a staining technique that allowed Santiago Ramon y Cajal to prove that the nervous system was actually composed of individual neurons. Together, they won the Nobel Prize in 1906.

The British biologist Sir Charles Sherrington had already named what Ramon y Cajal saw: the synapse. He, too, would win a Nobel Prize for his work on neurons with Edgar Douglas Adrian.

In 1921, the German biologist Otto Loewi completed the picture by discovering acetylcholine and the idea of the neurotransmitter. For this work, he received the Nobel Prize, shared with Henry Hallett Dale. Interestingly, acetylcholine is a relative of muscarine – the active ingredient of some of those mushrooms that some of our ancient ancestors liked so much. In 1946, another biologist, von Euler, discovered norepinephrine. And, in 1950, Eugene Roberts and J. Awapara discovered GABA.

In the early part of the 1900's, we see the beginnings of psychopharmacology as a medical science, with the use of bromide and chloral hydrate as sedatives. Phenobarbital enters the picture in 1912 as the first barbiturate. In the second half of the 1900s, with the basic mechanisms of the synapse understood, progress in the development of psychoactive drugs truly got underway. For example...

In 1949, John Cade, an Australian psychiatrist, found that lithium, a light metal, could lessen the manic aspect of manic-depression.

In 1952, a French Navy doctor, Henri Laborit, came up with a calming medication containing chlorpromazine, which was promoted as the antipsychotic Thorazine a few years later.

Imipramine, the first tricyclic antidepressant, was developed at Geigy Labs by R. Kuhn in the early 1950's, while he was trying to find a better antihistamine!

In the late 1950's, Nathan Kline, drawing on accounts of the traditional use in India of rauwolfia (the plant source of reserpine), found that reserpine reduced the symptoms of many of his psychiatric patients. Unfortunately, the side effects were debilitating.

In 1954, the drug meprobamate, better known as Miltown, became available on the market. Its chemical foundation was discovered a decade earlier by Frank Berger, while he was trying to discover a new antibiotic. He found a tranquilizer instead!

Iproniazid (an MAOI antidepressant) was developed in 1956 by the Hoffmann-La Roche pharmaceutical company for tuberculosis patients. It appeared to cheer them up a bit! Although it was banned because of side effects, it was the first in a long series of antidepressants.

Leo Sternbach also worked for Hoffmann-La Roche, and discovered the drug Valium (diazepam) in 1959, and Librium the following year – two of the most useful and used psychoactive drugs ever.

The progress of psychopharmacology was greatly aided by increased knowledge of the activities at the level of the synapse. John Eccles, Alan Lloyd Hodgkin and Andrew Fielding Huxley shared the Nobel Prize in 1963 for their work on the neuron's membrane. And in 1973, Solomon Snyder and Candace Pert of Johns Hopkins discovered the brain's receptors for "internal morphine" or endorphin, and the "lock-and-key" theory – the basic mechanism of psychoactive drugs – was confirmed.

In 1974, D. T. Wong at Eli Lilly labs discovered fluoxetine – Prozac – and its antidepressant effects. It was approved by the FDA in 1987. This substance and others like it – known as the selective serotonin re-uptake inhibitors or SSRIs – would dramatically change the care of people with depression, obsessive-compulsive disorder, social anxiety, and other problems.

In the 1990's, new neuroleptics (antipsychotic drugs) such as clozapine were developed which addressed the problems of schizophrenia more completely than the older drugs such as chlorpromazine, and with fewer side effects.

What is the future going to be like, in regards to psychopharmacology? Some say the major breakthroughs are over, and it is just a matter of producing better variations. But that has been said many times before. Biochemistry is still progressing, and every year brings something new. The rest of us can only hope that many more and better medications with psychiatric applications will be found.


Genetics and the human genome

The science of genetics begins in the garden of an Austrian monk named Gregor Mendel. In 1866, he published the results of his work suggesting the existence of "factors" – which would later be called genes – that are responsible for the physical characteristics of organisms.

A Columbia University professor, Dr. Thomas Morgan, provided the next step in 1910 by discovering that these genes are in fact carried within the structures called chromosomes. And in 1926, Hermann J. Muller discovered that he could create mutations in fruit flies by irradiating them with X-rays.

Finally, in 1953, Dr. James D. Watson and Dr. Francis Crick outline the structure of the DNA molecule. And Dr. Sydney Brenner completes the picture by helping to discover messenger RNA and the basic processes of protein construction.

The next phase of genetics involves the mapping of the DNA: What is the sequence of bases (A, T, G, and C) that make up DNA, and how do those sequences relate to proteins and ultimately to the traits of living organisms? Two researchers, Frederick Sanger and Walter Gilbert, independently discover a technique to efficiently "read" the bases, and in 1977, a bacteriophage virus is the first creature to have its genome revealed.
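To make the question concrete: the bases are read three at a time, and each triplet (a codon) names one amino acid of a protein. A minimal sketch in Python – the tiny codon table below is an accurate fragment of the standard genetic code, though real cells translate via an RNA intermediate rather than directly from DNA as shown here:

    # A small (but accurate) fragment of the 64-entry genetic code.
    CODON_TABLE = {
        "ATG": "Met",   # methionine; also the usual "start" signal
        "TTT": "Phe",   # phenylalanine
        "AAA": "Lys",   # lysine
        "GGC": "Gly",   # glycine
        "TAA": "STOP",  # one of the three "stop" codons
    }

    def translate(dna):
        """Read a DNA coding sequence three bases at a time."""
        protein = []
        for i in range(0, len(dna) - 2, 3):
            amino = CODON_TABLE.get(dna[i:i+3], "?")
            if amino == "STOP":
                break
            protein.append(amino)
        return "-".join(protein)

    print(translate("ATGTTTAAAGGCTAA"))  # prints: Met-Phe-Lys-Gly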

In the 1980’s, the Department of Energy reveals a plan to bring together researchers world-wide to learn the entire genome – of human beings! The NIH (National Institutes of Health) joins in, and makes Dr. James Watson the director of the Office of Human Genome Research.

In 1995, Dr. Hamilton Smith and Dr. J. Craig Venter read the genome of a bacterium. In 1998, researchers publish the genome of the first animal, a roundworm. In 2000, they have the genome of the fruit fly. And in the same year, researchers have the genome sequence of the first plant.

In June of 2000, at a White House ceremony hosted by President Clinton, two research groups – the Human Genome Project consortium and the private company Celera Genomics – announce that they have nearly completed working drafts of the human genome. In February of 2001, the HGP consortium publishes its draft in Nature and Celera publishes its draft in Science. The drafts describe some 90% of the human genome, although scientists know the function of less than 50% of the genes discovered.

There were a few surprises: Although the human genome is comprised of more than three billion bases, it contains only about 30,000 genes – a third as many as scientists had predicted, and only about twice as many as the roundworm has. It is also discovered that 99.9% of the sequences are exactly the same for all human beings. We are not as special as we like to think!

The human genome project is not just an intellectual exercise: Knowing our genetic makeup will allow us to treat genetic illnesses, custom design medicines, correct mutations, more effectively treat and even cure cancer, and more. It is an accomplishment that surpasses even the landing on the moon.


A Brief History of the Lobotomy

The idea of brain surgery as a means of improving mental health got started around 1890, when Friedrich Goltz, a German researcher, removed portions of his dogs’ temporal lobes, and found them to be calmer, less aggressive. It was swiftly followed by Gottlieb Burckhardt, the head of a Swiss mental institution, who attempted similar surgeries on six of his schizophrenic patients. Some were indeed calmer. Two died.

One would think that that would be the end of the idea. But in 1935, Carlyle Jacobsen of Yale University tried frontal and prefrontal lobotomies on chimps, and found them to be calmer afterwards. His colleague at Yale, John Fulton, attempted to induce "experimental neurosis" in his lobotomized chimps by presenting them with contradictory signals. He found that they were pretty much immune to the process.

It took a certain Antonio Egas Moniz of the University of Lisbon Medical School to really put lobotomy on the map. A very productive medical researcher, he invented several significant improvements to brain x-ray techniques prior to his work with lobotomy. He also served as the Minister of Foreign Affairs and the Ambassador to Spain. He was even one of the signers of the Treaty of Versailles, which marked the end of World War I.

He found that cutting the nerves that run from the frontal cortex to the thalamus in psychotic patients who suffered from repetitive thoughts "short-circuited" the problem. Together with his colleague Almeida Lima, he devised a technique involving drilling two small holes on either side of the forehead, inserting a special surgical knife, and severing the prefrontal cortex from the rest of the brain. He called it leukotomy, but it would come to be known as lobotomy.

Some of his patients became calmer, some did not. Moniz advised extreme caution in using lobotomy, and felt it should only be used in cases where everything else had been tried. He was awarded the Nobel Prize for his work on lobotomy in 1949. He retired early after a former patient paralyzed him by shooting him in the back.

Walter Freeman, an American physician, with his colleague James Watts, performed his first lobotomy operation in 1936. He was so satisfied with the results that he went on to do many thousands more, and in fact began a propaganda campaign to promote its use. He is also famous for inventing what is called ice pick lobotomy. Impatient with the difficult surgical methods pioneered by Moniz, he found he could insert an ice pick above each eye of a patient with only local anesthetic, drive it through the thin bone with a light tap of a mallet, swish the pick back and forth like a windshield wiper and – voilà – a formerly difficult patient is now passive.

Freeman recommended the procedure for everything from psychosis to depression to neurosis to criminality. He developed what others called assembly line lobotomies, going from one patient to the next with his gold-plated ice pick, even having his assistants time him to see if he could break lobotomy speed records. It is said that even some seasoned surgeons fainted at the sight. Even Watts thought he had gone too far.

Between 1939 and 1951, over 18,000 lobotomies were performed in the US, and many more in other countries. It was often used on convicts, and in Japan it was recommended for use on "difficult" children. There are still western countries that permit the use of the lobotomy, although its use has decreased dramatically worldwide. Curiously, the old USSR banned it back in the 1940s on moral grounds!

In the 1950s, people began getting upset about the prevalence of lobotomies. Protests began, and serious research supported the protesters. The general statistics showed roughly a third of lobotomy patients improved, a third stayed the same, and the last third actually got worse!

There have been a few famous cases over the years. For example, Rosemary Kennedy, sister to John, Robert, and Edward Kennedy, was given a lobotomy when her father complained to doctors about the mildly retarded girl’s embarrassing new interest in boys. Her father never informed the rest of the family about what he had done. She lived out her life in a Wisconsin institution and died January 7, 2005, at the age of 86. Her sister, Eunice Kennedy Shriver, founded the Special Olympics in her honor in 1968.

To learn more about lobotomy, try these sources:

Jack Pressman, Last Resort (1998).

Elliot Valenstein, Great and Desperate Cures (1986).

Renato Sabbatini, "The History of Psychosurgery" (Brain and Mind, June 1997). A selection from this article is available at http://www.epub.org.br/cm/n02/historia/lobotomy.htm.


The Cognitive Movement


In the latter half of the twentieth century, the advent of the computer and the way of thinking associated with it led to a new approach or orientation to psychology called the cognitive movement. Many are hoping that it will prove to be the paradigm – the unifying theory – we have been waiting for. It is still way too early to tell, but the significance of cognitive psychology is impossible to deny.

The roots of the cognitive movement are extremely varied: It includes gestalt psychology, behaviorism, even humanism; it has absorbed the ideas of E. C. Tolman, Albert Bandura, and George Kelly; it includes thinkers from linguistics, neuroscience, philosophy, and engineering; and it especially involves specialists in computer technology and the field of artificial intelligence. Let’s start by looking at three of the greatest information processing theorists: Norbert Wiener, Alan Turing, and Ludwig von Bertalanffy.

Norbert Wiener

Norbert Wiener was born November 26, 1894 in Columbia, Missouri. His father was a professor of Slavic languages who wanted more than anything for his son to be a genius. Fortunately, Norbert was up to the task. He was reading by age three, started high school at nine, graduated at 11, got his bachelor's at 14, and his master's – from Harvard! – at 17. He received his PhD a year later, in 1913, with a dissertation on mathematical logic.

(If it is any consolation, Norbert was near-sighted, very nervous, clumsy, insecure, and socially inept. However, people liked him anyway!)

After graduation, he went to Cambridge to study under Bertrand Russell, and then to the University of Göttingen to study under the famous mathematician David Hilbert. When he returned, he taught at Columbia, Harvard, and the University of Maine, spent a year as a staff writer for the Encyclopedia Americana, another year as a journalist for the Boston Herald, and (though a pacifist) worked as a mathematician for the army.

Finally, in 1919, he became a professor of mathematics at MIT, where he would stay put until 1960. He married Margaret Engemann in 1926, and they had two daughters.

He began by studying the movement of particles and quantum physics, which led him to develop an interest in information transmission and control mechanisms. While working on the latter, he coined the term cybernetics, from the Greek word for steersman, to refer to any system that has built-in correction mechanisms, i.e. is self-steering. Appropriately, he worked on control mechanisms for the military during World War II.

In 1948, he published Cybernetics: or Control and Communication in the Animal and the Machine. In this book, he popularized such terms as input, output, and feedback!

Later, in 1964, he published the book God and Golem, Inc., which he subtitled "a comment on certain points where cybernetics impinges on religion." He was concerned that someday machines might overtake us, their creators. That same year, he won the National Medal of Science. A few weeks later, March 18, he died in Stockholm, Sweden.

The idea of feedback is very old, and is hinted at in the works of Aristotle. It began to gain some notoriety in the 1700's, in the form of "the invisible hand," an idea introduced in Adam Smith's The Wealth of Nations, which some see as the roots of both control theory and game theory. Feedback is a simple idea: Take the output of some system, and feed it back as an input, in order to in some way alter the process. For example, homeostasis or the thermostat principle is a form of negative feedback: It gets cold in the house, which triggers the thermostat, which turns on the furnace. It gets warmer, which triggers the thermostat, this time to turn off the furnace. Then, it gets colder, and the cycle begins again. The goal of such a system is equilibrium (say, 70º F in the house), but it is actually an oscillating or "hunting" process.

Positive feedback occurs when the output tells the system to produce even more of something. Although the "positive" in positive feedback makes it sound like a good thing, if it isn't backed up with negative feedback, it tends to run out of control. A common example of positive feedback is the economic bubble, where something increases in value (such as tulips in 17th century Holland, or "dot-coms" in the recent past), everyone buys into the product, driving up the prices, leading to more investors, until finally the whole thing collapses.
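The thermostat loop is simple enough to sketch in a few lines of Python. This is only an illustration – the temperatures and the heating and cooling rates are invented – but it shows both the feedback step and the "hunting" oscillation described above:

    # Negative feedback: the system's output (room temperature) is fed
    # back in to decide its next input (furnace on or off).
    target = 70.0        # degrees F
    temp = 65.0          # starting room temperature
    furnace_on = False

    for hour in range(12):
        # The feedback step: compare the output to the goal.
        if temp < target - 1:
            furnace_on = True       # too cold: furnace on
        elif temp > target + 1:
            furnace_on = False      # too warm: furnace off
        # A crude model of the house: gain heat when on, lose it when off.
        temp += 2.0 if furnace_on else -1.5
        print(f"hour {hour:2d}: {temp:5.1f} F, furnace {'on' if furnace_on else 'off'}")

    # The temperature never settles at exactly 70; it oscillates
    # ("hunts") around it. Positive feedback would amount to flipping
    # the comparisons - output encouraging more output - and the
    # temperature would simply run away, like the bubbles just described.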

What Wiener did was recognize the larger significance of this feedback idea!

Alan M. Turing

Alan Turing was born June 23, 1912 in Paddington, London, the second child of Julius Mathison Turing and Ethel Sara Stoney. His parents met while his father and his mother’s father were serving in Madras, India, as part of the Civil Service. He and his brother were raised in other people’s homes while his parents continued their life in India.

A turning point in his life came when his best friend at Sherborne School, Christopher Morcom, died in 1930. This led him to think about the nature of existence and whether or not it ends at death.

He went to King’s College, Cambridge, in 1931, where he read books by von Neumann, Russell and Whitehead, Gödel, and so on. He also became involved in the pacifist movement at Cambridge, as well as coming to terms with his homosexuality. He received his degree in 1934, and stayed on for a fellowship in 1935.

The Turing Machine – the first description of what would become the modern computer – was introduced in a 1936 paper, after which he left for Princeton in the US. There, he received his PhD in 1938, and returned to King’s College, living on his fellowship.

He began working with British Intelligence on breaking the famous Enigma Code by constructing code-breaking machines. In 1944, he made his first mention of "building a brain."

It should be noted that Turing was also an amateur cross-country runner, and just missed representing the UK in the 1948 Olympics!

In 1948, he became the deputy director of the computing lab at Manchester University, where they were attempting to build the first true computer. In 1950, he published a paper, "Computing Machinery and Intelligence," in Mind. Turing saw the human brain as an "unorganized machine" that learned through experience.

Unfortunately, he was arrested and tried in 1952 – for homosexuality! He made no defense, but took an offer to stay out of jail if he would take estrogen injections to lower his supposedly overactive libido. He lost his security clearance because of his homosexuality as well.

He began working on pattern formation in biology – what we would now call mathematical biology, particularly reaction-diffusion models – and on quantum mechanics. But on June 7, 1954, he committed suicide by ingesting cyanide – making it look like an accident to spare his mother’s feelings. He was 41.

Today, he is considered the father of Computer Science. Let me let his biographer, Andrew Hodges, describe the famous Turing Machine:


His work introduced a concept of immense practical significance: the idea of the Universal Turing Machine. The concept of 'the Turing machine' is like that of 'the formula' or 'the equation'; there is an infinity of possible Turing machines, each corresponding to a different 'definite method' or algorithm. But imagine, as Turing did, each particular algorithm written out as a set of instructions in a standard form. Then the work of interpreting the instructions and carrying them out is itself a mechanical process, and so can itself be embodied in a particular Turing machine, namely the Universal Turing machine. A Universal Turing machine can be made to do what any other particular Turing machine would do, by supplying it with the standard form describing that Turing machine. One machine, for all possible tasks. It is hard now not to think of a Turing machine as a computer program, and the mechanical task of interpreting and obeying the program as what the computer itself does. Thus, the Universal Turing Machine embodies the essential principle of the computer: a single machine which can be turned to any well-defined task by being supplied with the appropriate program. Additionally, the abstract Universal Turing Machine naturally exploits what was later seen as the 'stored program' concept essential to the modern computer: it embodies the crucial twentieth-century insight that symbols representing instructions are no different in kind from symbols representing numbers. But computers, in this modern sense, did not exist in 1936. Turing created these concepts out of his mathematical imagination. Only nine years later would electronic technology be tried and tested sufficiently to make it practical to transfer the logic of his ideas into actual engineering. In the meanwhile the idea lived only in his mind.*

* Quoted from Andrew Hodges: "Alan Turing – a short biography," at http://www.turing.org.uk/turing/bio/part3.html

For much more on Turing, see the Turing Archive at http://www.cs.usfca.edu/www.AlanTuring.net/turing_archive/index.html
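To make the "instructions in a standard form" concrete, here is a minimal Turing machine simulator in Python. The states, symbols, and rule table are invented for illustration – this little machine just writes three 1's and halts – but the run() function is exactly the "mechanical process" of interpreting and carrying out instructions that Hodges describes:

    # A Turing machine: a tape, a head position, a current state, and a
    # rule table mapping (state, symbol) -> (write, move, next state).
    def run(rules, state, max_steps=100):
        tape, pos = {}, 0                # blank tape; '0' is the blank symbol
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = tape.get(pos, "0")
            write, move, state = rules[(state, symbol)]
            tape[pos] = write
            pos += 1 if move == "R" else -1
        return "".join(tape[i] for i in sorted(tape))

    # An illustrative rule table: write three 1's, moving right, then halt.
    rules = {
        ("a", "0"): ("1", "R", "b"),
        ("b", "0"): ("1", "R", "c"),
        ("c", "0"): ("1", "R", "halt"),
    }
    print(run(rules, "a"))  # prints: 111

A Universal Turing machine would simply be a run() whose rule table interprets another machine's rule table written out on the tape – the "stored program" idea in miniature.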

Ludwig von Bertalanffy

Ludwig was born near Vienna on September 19, 1901. In 1918, he went to the University of Innsbruck, and later transferred to the University of Vienna, where he studied the history of art, philosophy, and biology. He received his doctorate in 1926, with a PhD dissertation on Gustav Fechner.

In 1928, he published Modern Theories of Development, where he introduced the question of whether we could explain biology in purely physical terms. He suggested we could, if we see living things as endowed with self-organizational dynamics.

In 1937, he went to the University of Chicago, where he gave his first lecture on General Systems Theory, which he saw as a methodology for all sciences. In 1939, he became a professor at the University of Vienna and continued his research on the comparative physiology of growth. He summarized his work in Problems of Life, published in 1940.

In 1949, he emigrated to Canada, where he began research on cancer. Soon, he branched into cognitive psychology, where he introduced a holistic epistemology that he contrasted with behaviorism.

In 1960, he became professor of theoretical biology in the department of zoology and psychology at the University of Alberta. In 1967, he wrote Robots, Men, and Minds, and in 1968, he wrote General Systems Theory.



Ludwig von Bertalanffy died of a heart attack on June 12, 1972.

Once upon a time, it was possible for one bright individual – say an Aristotle or a da Vinci – to know everything that his or her culture had to offer. We still sometimes refer to people who have a particularly broad knowledge base as a renaissance man or woman. But this isn't really possible anymore, because there is simply too much information in the world. Everyone winds up a specialist. That isn't, of course, entirely bad; but it does mean that the various sciences (and arts and humanities as well) tend to become isolated. A new idea in one field stays in that field, even when it might mean a revolution for another field. The last time we saw a truly significant transfer of ideas from one science to others was when Darwin introduced the theory of evolution!

General Systems Theory was a proposal for a mathematical and logical means of expressing ideas in what we nowadays comfortably call systems. Bertalanffy believed that this was the way we could unify the sciences, including biology, history, sociology, and even psychology, and open the door to a new kind of scientist who is a generalist rather than a specialist. These generalists, by making use of these common systems models, would be able to transfer insights from one field to another.

Bertalanffy took concepts from cybernetics, information theory, game theory, decision theory, topology, factor analysis, systems engineering, operations research, and human engineering, and perfected the "flow diagram" idea that we all take for granted today. His most significant innovation, however, was the idea of the open system – a system in the context of a larger system. This allowed systems theory to be applied to animals within ecosystems, for example, or to people within their socio-cultural contexts. In particular, the idea of the open system gave the age-old metaphor of societies-as-organisms scientific legitimacy and a new lease on life.

Noam Chomsky

In addition to the input (no pun intended) from the "artificial intelligence" people, there was the input from a group of scientists in a variety of fields who thought of themselves as structuralists – not allying themselves with Wundt, but interested in the structure of their various topics. I'll call them neo-structuralists, just to keep them straight. For example, there's Claude Levi-Strauss, the famous French anthropologist. But the one everyone knows about is the linguist Noam Chomsky.

Avram Noam Chomsky was born December 7, 1928, in Philadelphia, the son of William Chomsky and Elsie Simonofsky. His father was a Hebrew scholar, and young Noam became so proficient that he was proof-reading his father’s manuscripts by the time he was in high school. Noam was also passionate about politics, especially when it concerned the potential for a state of Israel.

He received his BA from the University of Pennsylvania in 1949, whereupon he married a fellow linguist, Carol Schatz. They would go on to have three children. He received his PhD in 1955, also from the U of Penn.

That same year, he started teaching at MIT and began his work on generative grammar. Generative grammar was based on the question "how can we create new sentences which have never been spoken before?" How, in other words, do we get so creative, so generative? While considering this question, he familiarized himself with mathematical logic, the psychology of thought, and theories about thinking machines. He found himself, on the other hand, very critical of traditional linguistics and behavioristic psychology.

In 1957, he published his first book, Syntactic Structures. Besides introducing his generative grammar, he also introduced the idea of an innate ability to learn languages. We have born into us a "universal grammar" ready to absorb the details of whatever language is presented to us at an early age.

His book spoke about surface structure and deep structure and the rules of transformation that governed the relations between them. Surface structure is essentially language as we know it, particular languages with particular rules of phonetics and basic grammar. Deep structure is more abstract, at the level of meanings and the universal grammar.
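To see what "generative" means in practice, here is a minimal sketch in Python (my own toy illustration, not Chomsky's formalism) of a handful of rewrite rules that can produce sentences nobody wrote into the program:

    import random

    # A toy rewrite grammar (hypothetical rules, for illustration only).
    # Symbols in CAPS are rewritten until only plain words remain.
    RULES = {
        "S":  [["NP", "VP"]],
        "NP": [["the", "N"], ["a", "N"]],
        "VP": [["V", "NP"]],
        "N":  [["linguist"], ["child"], ["sentence"]],
        "V":  [["invents"], ["hears"], ["repeats"]],
    }

    def generate(symbol="S"):
        if symbol not in RULES:                # a plain word: emit it as-is
            return [symbol]
        words = []
        for part in random.choice(RULES[symbol]):
            words.extend(generate(part))       # rewrite recursively
        return words

    print(" ".join(generate()))   # e.g. "a child invents the sentence"

Even these five rules can produce sentences their author never typed, which is the sense in which a finite set of rules is "generative."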

In the 1960’s, Chomsky became one of the most vocal critics of the Vietnam War, and wrote American Power and the New Mandarins, a critique of government decision making. He is still at MIT today and continues to produce articles and books on linguistics – and politics!

Jean Piaget

Another neo-structuralist is Jean Piaget. Originally a biologist, he is now best remembered for his work on the development of cognition. Many would argue that he, more than anyone else, is responsible for the creation of cognitive psychology. If the English-speaking world had only learned to read a little French, this would be true without a doubt. Unfortunately, his work was only introduced in English after 1950, and only became widely known in the 1960's – just in time to be a part of the cognitive movement, but not of its creation.

Jean Piaget was born in Neuchâtel, Switzerland, on August 9, 1896. His father, Arthur Piaget, was a professor of medieval literature with an interest in local history. His mother, Rebecca Jackson, was intelligent and energetic, but Jean found her a bit neurotic – an impression that he said led to his interest in psychology, but away from pathology! The oldest child, he was quite independent and took an early interest in nature, especially the collecting of shells. He published his first "paper" when he was ten – a one page account of his sighting of an albino sparrow.

He began publishing in earnest in high school on his favorite subject, mollusks. He was particularly pleased to get a part-time job with the director of Neuchâtel's Museum of Natural History, Mr. Godel. His work became well known among European students of mollusks, who assumed he was an adult! All this early experience with science kept him away, he says, from "the demon of philosophy."

Later in adolescence, he faced a bit of a crisis of faith: Encouraged by his mother to attend religious instruction, he found religious argument childish. Studying various philosophers and the application of logic, he dedicated himself to finding a "biological explanation of knowledge." Ultimately, philosophy failed to assist him in his search, so he turned to psychology.

After high school, he went on to the University of Neuchâtel. Constantly studying and writing, he became sickly, and had to retire to the mountains for a year to recuperate. When he returned to Neuchâtel, he decided he would write down his philosophy. A fundamental point became a centerpiece for his entire life’s work: "In all fields of life (organic, mental, social) there exist ‘totalities’ qualitatively distinct from their parts and imposing on them an organization." This principle forms the basis of his structuralist philosophy, as it would for the Gestaltists, Systems Theorists, and many others.

In 1918, Piaget received his Doctorate in Science from the University of Neuchâtel. He worked for a year at psychology labs in Zurich and at Bleuler's famous psychiatric clinic. During this period, he was introduced to the works of Freud, Jung, and others. In 1919, he taught psychology and philosophy at the Sorbonne in Paris. Here he met Simon (of Binet-Simon fame) and did research on intelligence testing. He didn't care for the "right-or-wrong" style of the intelligence tests and started interviewing his subjects at a boys' school instead, using the psychiatric interviewing techniques he had learned the year before. In other words, he began asking how children reasoned.

In 1921, his first article on the psychology of intelligence was published in the Journal de Psychologie. In the same year, he accepted a position at the Institut J. J. Rousseau in Geneva. Here, with his students, he began to research the reasoning of elementary school children. This research became his first five books on child psychology. Although he considered this work highly preliminary, he was surprised by the strong positive public reaction to it.

In 1923, he married one of his student coworkers, Valentine Châtenay. In 1925, their first daughter was born; in 1927, their second daughter was born; and in 1931, their only son was born. They immediately became the focus of intense observation by Piaget and his wife. This research became three more books!

In 1929, Piaget began work as the director of the Bureau International d'Éducation (the International Bureau of Education, which would later be associated with UNESCO). He also began large scale research with A. Szeminska, E. Meyer, and especially Bärbel Inhelder, who would become his major collaborator. Piaget, it should be noted, was particularly influential in bringing women into experimental psychology. Some of this work, however, wouldn't reach the world outside of Switzerland until World War II was over.

In 1940, he became chair of Experimental Psychology, the Director of the psychology laboratory, and the president of the Swiss Society of Psychology. In 1942, he gave a series of lectures at the Collège de France, during the Nazi occupation of France. These lectures became The Psychology of Intelligence. At the end of the war, he was named President of the Swiss Commission of UNESCO.

Also during this period, he received a number of honorary degrees: one from the Sorbonne in 1946, and others from the University of Brussels and the University of Brazil in 1949, on top of an earlier one from Harvard in 1936. And, in 1949 and 1950, he published his synthesis, Introduction to Genetic Epistemology.

In 1952, he became a professor at the Sorbonne. In 1955, he created the International Center for Genetic Epistemology, of which he served as director the rest of his life. And, in 1956, he created the School of Sciences at the University of Geneva.

He continued working on a general theory of structures and tying his psychological work to biology for many more years. Likewise, he continued his public service through UNESCO as a Swiss delegate. By the end of his career, he had written over 60 books and many hundreds of articles. He died in Geneva, September 16, 1980, one of the most significant psychologists of the twentieth century.

Jean Piaget began his career as a biologist – specifically, a malacologist! But his interest in science and the history of science soon overtook his interest in snails and clams. As he delved deeper into the thought-processes of doing science, he became interested in the nature of thought itself, especially in the development of thinking. Finding relatively little work done in the area, he had the opportunity to give it a label. He called it genetic epistemology, meaning the study of the development of knowledge.

He noticed, for example, that even infants have certain skills in regard to objects in their environment. These skills were certainly simple ones, sensorimotor skills, but they directed the way in which the infant explored his or her environment, and so shaped how the infant gained more knowledge of the world and more sophisticated exploratory skills. These skills he called schemas.

For example, an infant knows how to grab his favorite rattle and thrust it into his mouth. He’s got that schema down pat. When he comes across some other object – say, daddy's expensive watch – he easily learns to transfer his "grab and thrust" schema to the new object. This Piaget called assimilation, specifically assimilating a new object into an old schema.

When our infant comes across another object again – say a beach ball – he will try his old schema of grab and thrust. This of course works poorly with the new object. So the schema will adapt to the new object: Perhaps, in this example, "squeeze and drool" would be an appropriate title for the new schema. This is called accommodation, specifically accommodating an old schema to a new object.

Assimilation and accommodation are the two sides of adaptation, Piaget’s term for what most of us would call learning. Piaget saw adaptation, however, as a good deal broader than the kind of learning that Behaviorists in the US were talking about. He saw it as a fundamentally biological process. Even one’s grip has to accommodate to a stone, while clay is assimilated into our grip. All living things adapt, even without a nervous system or brain.

Assimilation and accommodation work like pendulum swings, advancing our understanding of the world and our competency in it. According to Piaget, they are directed at a balance between the structure of the mind and the environment, at a certain congruency between the two that would indicate that you have a good (or at least good-enough) model of the universe. This ideal state he calls equilibrium.
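As a loose computational analogy (a sketch of my own in Python, not anything Piaget himself wrote), the adaptation loop might look like this:

    # A toy model of Piagetian adaptation, purely illustrative.
    # One schema, named after the action it performs, plus the set of
    # objects it currently handles.
    schema = {"action": "grab and thrust", "fits": {"rattle", "watch"}}

    def adapt(obj):
        if obj in schema["fits"]:
            # assimilation: the new object fits the existing schema
            return "assimilated %s into '%s'" % (obj, schema["action"])
        # accommodation: the schema itself changes to suit the object
        schema["action"] = "squeeze and drool"
        schema["fits"].add(obj)
        return "accommodated: schema is now '%s'" % schema["action"]

    print(adapt("watch"))       # assimilation: the old schema still works
    print(adapt("beach ball"))  # accommodation: the schema is revised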

As he continued his investigation of children, he noted that there were periods where assimilation dominated, periods where accommodation dominated, and periods of relative equilibrium, and that these periods were similar among all the children he looked at in their nature and their timing. And so he developed the idea of stages of cognitive development. These constitute a lasting contribution to psychology.

Donald O. Hebb

There are three psychologists who, in my opinion, are most responsible for the development of cognitive psychology as a movement as well as for its incredible popularity today. They are Donald Hebb, George Miller, and Ulric Neisser. There are no doubt others we could add, but I am sure no one would leave these three out!

Donald Olding Hebb was born in 1904 in Chester, Nova Scotia. He graduated from Dalhousie University in 1925, and tried to begin a career as a novelist. He wound up as a school principal in Quebec.

He began as a part-time graduate student at McGill University in Montreal. Here, he quickly became disillusioned with behaviorism and turned to the work of Kohler and Lashley. Working with Lashley, he received his PhD from Harvard in 1936.

He took on a fellowship with Wilder Penfield at the Montreal Neurological Institute, where his research noted that large lesions in the brain often have little effect on a person’s perception, thinking, or behavior.

Moving on to Queens University, he researched intelligence testing of animals and humans. He noted that the environment played a far more significant role in intelligence than generally assumed.

In 1942, he worked with Lashley again, this time at the Yerkes Lab of Primate Biology. He then returned to McGill as a professor of psychology, and became the department chairperson in 1948.

The following year, he published his most famous book, The Organization of Behavior: A Neuropsychological Theory. This was very well received and made McGill a center for neuropsychology.

The basics of his theory can be summarized by defining three of his terms: First, there is the Hebb synapse. Repeated firing of a neuron causes growth or metabolic changes at the synapse that increase the efficiency of that synapse in the future. This is often called consolidation theory, and is the most accepted explanation for neural learning today.


Second, there is the Hebb cell assembly. There are groups of neurons so interconnected that, once activity begins, it persists well after the original stimulus is gone. Today, people call these neural nets.

And third, there is the phase sequence. Thinking is what happens when complex sequences of these cell assemblies are activated.
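The Hebb synapse translates directly into a simple weight-update rule, often paraphrased as "cells that fire together, wire together." Here is a minimal sketch in Python (the learning rate and activity values are my own arbitrary choices, not Hebb's):

    # Minimal Hebbian learning: the connection between two units grows
    # whenever both are active at the same time.
    rate = 0.1       # learning rate (arbitrary)
    weight = 0.0     # strength of the synapse from unit A to unit B

    activity = [(1, 1), (1, 0), (1, 1), (0, 1), (1, 1)]  # (A, B) over time
    for a, b in activity:
        weight += rate * a * b    # increases only when A and B co-fire

    print(round(weight, 2))   # 0.3: three co-firings consolidated the synapse

Notice that the weight never grows when only one unit fires; repeated co-activity is what "consolidates" the connection.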

He humbly suggested that his theory is just a new version of connectionism – a neo- or neuro-connectionism. This connectionism is today the basic idea behind most models of neurological functioning. It should be noted that he was president of both the APA and its Canadian cousin, the CPA. Donald Hebb died in 1985.

George A. Miller

George A. Miller, born in 1920, began his career in college as a speech and English major. In 1941, he received his masters in speech from the University of Alabama. In 1946 he received his PhD from Harvard and began to study psycholinguistics.

In 1951, he published his first book, titled Language and Communication. In it, he argued that the behaviorist tradition was insufficient to the task of explaining language.

He wrote his most famous paper in 1956: "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information." In it, he argued that short-term memory could only hold about seven pieces – called chunks – of information: Seven words, seven numbers, seven faces, whatever. This is still accepted as accurate.
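Chunking is easy to demonstrate. Ten raw digits exceed the limit, but recoded into three familiar groups (the trick behind telephone-number formatting) they fit comfortably. A tiny sketch in Python, with a grouping of my own choosing:

    digits = "4125551212"                           # ten separate items
    chunks = [digits[:3], digits[3:6], digits[6:]]  # recode into 3 chunks
    print(chunks)   # ['412', '555', '1212']: only three things to hold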

In 1960, Miller founded the Center for Cognitive Studies at Harvard with the famous cognitive developmentalist Jerome Bruner. In the same year, he published Plans and the Structure of Behavior (with Eugene Galanter and Karl Pribram), which outlined their conception of cognitive psychology. They used the computer as their model of human learning, and used such analogies as information processing, encoding, and retrieval. Miller went so far as to define psychology as the study of the mind, as it had been prior to the behaviorist redefinition of psychology as the study of behavior!

George Miller served as the president of the APA in 1969, and received the prestigious National Medal of Science in 1991. He is still teaching as professor emeritus at Princeton University.


Ulric Neisser

Ulric Neisser was born in 1928 in Kiel, Germany, and moved with his family to the US at the age of three.

He studied at Harvard as a physics major before switching to psychology. While there, he was influenced by Koffka's work and by George Miller. In 1950, he received his bachelor's degree, and in 1956, his PhD. At this point, he was a behaviorist, which was basically what everyone was at the time.

His first teaching position was at Brandeis, where Maslow was department head. Here he was encouraged to pursue his interest in cognition. In 1967, he wrote the book that was to mark the official beginning of the cognitive movement, Cognitive Psychology.

Later, in 1976, he wrote Cognition and Reality, in which he began to express a dissatisfaction with the linear programming model of cognitive psychology at that time, and with its excessive reliance on laboratory work rather than real-life situations. Over time, he would become a vocal critic of cognitive psychology, and moved towards the ecological psychology of his friend J. J. Gibson.

He is presently at Cornell University, where his research interests include memory, especially memory for life events and in natural settings; intelligence, especially individual and group differences in test scores, IQ tests and their social significance; and self-concepts, especially as based on self-perception. His latest works include The Rising Curve: Long-Term Gains in IQ and Related Measures (1998) and, with L. K. Libby, "Remembering life experiences" (in E. Tulving & F. I. M. Craik's The Oxford Handbook of Memory, 2000).

Conclusions?

As I said at the beginning of this chapter, it is impossible to tell whether cognitive psychology will prove to be THE psychology of the future. In fact, as I pointed out with Ulric Neisser, even some of its major proponents have their doubts. Cognitive psychology is far more sophisticated, philosophically, than behaviorism. And yet it lacks in sophistication when compared, for example, to phenomenology and existentialism. It does, of course, have the tremendous advantage of being tied to the most rapidly developing technology we have ever seen – the computer. But few people see the computer as ultimately being a good model for human beings, in some ways not even as good as the old white rat, which at least was alive!


A Computer Timeline *

600's bc?

The abacus is developed in China. It was later adopted by the Japanese and the Russians.

600's ad?

Arabic numbers – including the zero (represented by a dot) – were invented in India. Arabic translations of Indian math texts brought these numbers to the attention of the Europeans. Arabic numbers entered Europe by means of Spain around 1000 ad and first became popular among Italian merchants around 1300. Until then, people used the Roman system in western Europe, and the Greek system in the east. The original numbers were similar to the modern Devanagari numbers used in northern India.

c. 1450

The moveable-type printing press is invented by Johannes Gutenberg.

1492

Francis Pellos of Nice invents the decimal point.

c. 1600

Thomas Harriot invents the symbols used in algebra. He also drew the first maps of the moon and discovered sunspots.

1600

Dr. William Gilbert discovers static electricity, and coins the term in De Magnete.

1614

John Napier invents logarithms.

1622

William Oughtred invents the slide rule.

1623

Wilhelm Schickard makes his "Calculating Clock."

1644-5

Blaise Pascal, a young French mathematician, develops the Pascaline, a simple mechanical device for the addition of numbers. It consists of several toothed wheels arranged side by side, each marked from 0 to 9 at equal intervals around its perimeter. The important innovation is an automatic 'tens-carrying' operation: when a wheel completes a revolution, it is turned past the 9 to 0 and automatically pulls the adjacent wheel on its left forward one tenth of a revolution, thus adding, or 'carrying'. (Pascal is also a respected philosopher and the inventor of the bus.)
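The tens-carrying operation is just positional addition, and it is easy to mimic in software. Here is a short Python sketch (a software stand-in for Pascal's gears, obviously, not a description of them):

    # Simulate the Pascaline's wheels: each holds a digit 0-9, and rolling
    # past 9 resets the wheel and nudges the wheel to its left forward.
    def pascaline_add(wheels, position, amount):
        """wheels holds digits, least-significant wheel first."""
        wheels[position] += amount
        i = position
        while i < len(wheels) - 1 and wheels[i] > 9:  # ripple the carry
            wheels[i] -= 10
            wheels[i + 1] += 1
            i += 1

    wheels = [8, 9, 0, 0]        # the number 98, units wheel first
    pascaline_add(wheels, 0, 5)  # add 5 on the units wheel
    print(wheels)                # [3, 0, 1, 0], i.e. the number 103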

*Major sources:

Chronology of Digital Computing Machines. Mark Brader. http://www.best.com/wilson/faq/chrono.html

From the Abacus to the Apple. Bobbi A. Kerlin. http://www.irn.pdx.edu/~kerlinb/myresearch/timeline.html

Global Networking: a Timeline. T. Matthew Ciolek. http://www.ciolek.com/PAPERS/milestones.html

And a variety of others whose names I no longer recollect. My sincere apologies!


1660

Otto von Guericke builds the first "electric machine."

1674

Gottfried Wilhelm von Leibniz designs his "Stepped Reckoner", a machine similar to Pascal's, with the added features of multiplication and division, which is constructed by a man named Olivier, of Paris. (Leibniz is also a respected philosopher and the co-inventor of calculus.)

1752

Ben Franklin captures lightning.

1786

J. H. Mueller, of the Hessian army, conceives the idea of what came to be called a "difference engine". That's a special-purpose calculator for tabulating values of a polynomial. Mueller's attempt to raise funds fails and the project is forgotten.

1790

Galvani discovers electric current, and uses it on frogs' legs.

1800

Alessandro Volta invents the battery.

1801

Joseph-Marie Jacquard develops the punch card system which programs and thereby automates the weaving of patterns on looms.

1809

Sir Humphry Davy invents the electric arc lamp.

1820

Charles Xavier Thomas de Colmar of France makes his "Arithmometer", the first mass-produced calculator. It does multiplication using the same general approach as Leibniz's calculator; with assistance from the user it can also do division. It is also the most reliable calculator yet. Machines of this general design, large enough to occupy most of a desktop, continue to be sold for about 90 years.

1822-23

Charles Babbage begins his government-funded project to build the first of his machines, the "Difference Engine", to mechanize solutions to general algebra problems.

The importance of his work is recognized by Ada Lovelace, Lord Byron's daughter, who, gifted in mathematics, devises a form of binary arithmetic which uses only the digits 1 and 0.

1825

The first railway is opened for public use.

1826

Photography is invented by Joseph Nicéphore Niépce.

1830

Thomas Davenport of Vermont invents the electric motor – calls it a toy.


1831

Michael Faraday produces electricity with the first generator.

1832-34

Babbage conceives, and begins to design, his "Analytical Engine". It could be considered a programmable calculator, very close to the basic idea of a computer. The machine could do an addition in 3 seconds and a multiplication or division in 2-4 minutes.

1837

The telegraph is invented by Samuel F. B. Morse.

1868

Christopher Latham Sholes (Milwaukee) invents the first commercial typewriter.

1872

One of the first large-scale analog computers is developed by Lord Kelvin to predict the height of tides in English harbors.

1876

Telephone is invented by Alexander Graham Bell.

1877

The phonograph is invented by Thomas Edison.

1881

Charles S. Tainter invents the dictaphone.

1886

Dorr E. Felt of Chicago makes his "Comptometer". This is the first calculator with keys.

1887

E. J. Marey invents the Motion Picture Camera.

Eastman patents the first box camera, moving photography from the hands of professionals to the general public.

1890

Herman Hollerith of MIT designs a punch card tabulating machine which is used effectively in the US census of this year. The cards are read electrically.

1891

Thomas Edison develops the Motion Picture Projector.

1896

Guglielmo Marconi develops the Radio Telegraph.

1899

Valdemar Poulsen develops the Magnetic Recorder.

1900

Rene Graphen develops the Photocopying Machine.

1901

Reginald A. Fessenden develops the Radio Telephone.

1906

Henry Babbage, Charles's son, with the help of the firm of R. W. Munro, completes his father's Analytical Engine, just to show that it would have worked.


1913

Thomas Edison invents Talking Motion Pictures.

1919

W. H. Eccles and F. W. Jordan publish the first flip-flop circuit design.

1924

Computing-Tabulating-Recording becomes International Business Machines.

1925

J. P. Maxfield develops the All-electric Phonograph.

1927

Philo T. Farnsworth, inventor of the television, gives first demonstration. See The Last Lone Inventor by Evan Schwartz (http://www.lastloneinventor.com)

1933

IBM introduces the first commercial electric typewriter.

Edwin H. Armstrong develops FM Radio.

1936

Robert A. Watson-Watt develops Radar.

Benjamin Burack builds the first electric logic machine.

In his thesis, Claude Shannon demonstrates the relationship between electrical circuitry and symbolic logic.

1937

Alan M. Turing, of Cambridge University, England, publishes a paper on "computable numbers" which introduces the theoretical simplified computer known today as a Turing machine.
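A Turing machine is nothing but a tape, a read/write head, and a table of rules saying what to write, where to move, and which state to enter next. Here is a minimal Python sketch (the rule table is an invented example that simply flips bits):

    # Rules: (state, symbol) -> (symbol to write, head move, next state)
    rules = {
        ("flip", "0"): ("1", +1, "flip"),
        ("flip", "1"): ("0", +1, "flip"),
        ("flip", "_"): ("_",  0, "halt"),   # "_" marks a blank cell
    }

    tape, head, state = list("1011") + ["_"], 0, "flip"
    while state != "halt":
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += move

    print("".join(tape))   # 0100_: every bit inverted, then halt

Simple as it is, this tape-and-rule-table design is, in principle, capable of any computation a modern machine can perform.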

1938

Claude E. Shannon publishes a paper on the implementation of symbolic logic using relays.

1939

John V. Atanasoff and graduate student Clifford Berry, of Iowa State College, complete a prototype 16-bit adder. This is the first machine to calculate using vacuum tubes.

1940s

First electronic computers in US, UK, and Germany

1941

Working with limited backing from the German Aeronautical Research Institute, Konrad Zuse completes the "V3" (better known today as the Z3), the first operational programmable calculator. Zuse is a friend of Wernher von Braun.

1943

Howard H. Aiken and his team at Harvard University, Cambridge, Mass., funded by IBM, complete the "ASCC Mark I" ("Automatic Sequence-Controlled Calculator Mark I"). The machine is 51 feet long, 8 feet high, weighs 5 tons, and incorporates 750,000 parts. It is the first large-scale automatic calculator built in the U.S. to be operated by electricity.


Max Newman, Wynn-Williams, and their team at the secret English Government Code and Cypher School, complete the "Heath Robinson". This is a specialized machine for cipher-breaking. (Heath Robinson was a British cartoonist known for his Rube-Goldberg-style contraptions.)

1945

John von Neumann drafts a report describing a stored-program computer, and gives rise to the term "von Neumann computer".

1945

John W. Mauchly and J. Presper Eckert and their team at the University of Pennsylvania complete a secret project for the US Army's Ballistics Research Lab: the ENIAC (Electronic Numerical Integrator and Calculator). It weighs 30 tons, is 18 feet high and 80 feet long, covers about 1000 square feet of floor, and consumes 130 or 140 kilowatts of electricity. Containing 17,468 vacuum tubes and over 500,000 soldered connections, it costs $487,000. It could perform five thousand additions in one second; today, the equivalent circuitry could be contained on a panel the size of a playing card, and a desktop computer stores millions of times more information and is 50,000 times faster. The ENIAC's clock speed is 100 kHz.


1946

Zuse invents Plankalkül, the first programming language, while hiding out in Bavaria.

The ENIAC is revealed to the public. A panel of lights is added to help show reporters how fast the machine is and what it is doing; and apparently Hollywood takes note.

1947

Two days before Christmas, the transistor is perfected at Bell Labs.

The magnetic drum memory is independently invented by several people, and the first examples are constructed.

1948

Newman, Freddie C. Williams, and their team at Manchester University, complete a prototype machine, the "Manchester Mark I". This is the first machine that everyone would call a computer, because it's the first with a true stored-program capability.

First tape recorder is sold

1949

A quote from Popular Mechanics:

"Where a computer like the ENIAC is equipped with 18,000 vacuum tubes and weighs 30 tons, computers in the future may have only 1,000 vacuum tubes and weigh only 1 1/2 tons."

Jay W. Forrester and his team at MIT construct the "Whirlwind" for the US Navy's Office of Research and Inventions. The Whirlwind is the first computer designed for real-time work; it can do 500,000 additions or 50,000 multiplications per second. This allows the machine to be used for air traffic control.

Forrester conceives the idea of magnetic core memory as it is to become commonly used, with a grid of wires used to address the cores.

1950

Alan Turing "Computing Machinery and Intelligence"

1951

The U.S. Census Bureau takes delivery of the first UNIVAC, originally developed by Eckert and Mauchly.

An Wang establishes Wang Laboratories


Ferranti Ltd. completes the first commercial computer. It has 256 40-bit words of main memory and 16K words of drum. An eventual total of 8 of these machines are sold.

Grace Murray Hopper, of Remington Rand, invents the modern concept of the compiler.

1952

The EDVAC is finally completed. It has 4000 tubes, 10,000 crystal diodes, and 1024 44-bit words of ultrasonic memory. Its clock speed is 1 MHz.

1953

Minsky and McCarthy get summer jobs at Bell Labs

1955

An Wang is issued Patent Number 2,708,722, including 34 claims for the magnetic memory core.

Shockley Semiconductor is founded in Palo Alto.


1956

Rockefeller funds Minsky and McCarthy's AI conference at Dartmouth.

John Bardeen, Walter Brattain, and William Shockley share the Nobel Prize in physics for the transistor.

CIA funds GAT machine-translation project.

Newell, Shaw, and Simon develop Logic Theorist.

1957

USSR launches Sputnik, the first earth satellite.

Newell, Shaw, and Simon develop General Problem Solver.

Fortran, the first popular programming language, hits the streets.

1958

McCarthy creates first LISP.

1959

Minsky and McCarthy establish MIT AI Lab.

Frank Rosenblatt introduces Perceptrons.

COBOL, a programming language for business use, and LISP, the first string processing language, come out.

1960s

Edsger Dijkstra suggests that software and data should be created in standard, structured forms, so that people could build on each other's work.

Algol 60, a European programming language and ancestor of many others, including Pascal, is released.

1962

First industrial robots.

1963-64

Doug Engelbart invents the computer mouse, first called the X-Y Position Indicator.

1964

Bobrow's "Student" solves math word-problems.


John Kemeny and Thomas Kurtz of Dartmouth College develop the first BASIC programming language. PL/I comes out the same year.

Wang introduces the LOCI (logarithmic calculating instrument), a desktop calculator at the bargain price of $6700, much less than the cost of a mainframe. In six months, Wang sells about twenty units.

The Sabre database system is brought online. It solves American Airlines' problem of coordinating information about hundreds of flight reservations across the continent every day.

Philips makes public the compact cassette.

1966

Weizenbaum and Colby create ELIZA.

Hewlett-Packard enters the computer market with the HP2116A real-time computer. It is designed to crunch data acquired from electronic test and measurement instruments. It has 8K of memory and costs $30,000.

Hewlett-Packard announces their HP 9100 series calculator with CRT displays selling for about $5000 each.

Intel is founded and begins marketing a semiconductor chip that holds 2,000 bits of memory. Wang is the first to buy this chip, using it in their business oriented calculators called the 600 series.

Late 1960s

IBM sells over 30,000 mainframe computers based on the 360 family which uses core memory.

1967

Greenblatt's MacHack defeats Hubert Dreyfus at chess.

IBM builds the first floppy disk

1969

Kubrick's "2001" introduces AI to mass audience.

Intel announces a 1 KB RAM chip, which has a significantly larger capacity than any previously produced memory chip

Unix operating system, characterised by multitasking (also called time-sharing), virtual memory, multi-user design and security, designed by Ken Thompson and Dennis Ritchie at AT&T Bell Laboratories, USA

ARPANET (the future Internet) links its first two computers, at UCLA and the Stanford Research Institute. Dr. Leonard Kleinrock, a UCLA-based pioneer of Internet technology, and his assistant Charley Kline manage to successfully send the command "login" to a Stanford machine set up and tuned by Bill Duvall, after solving an initial problem with an inadequate memory buffer. The first message over the net!

(UCLA, UCSB, University of Utah and SRI are the four original members of Arpanet.)

1970s

Commodore, a Canadian electronics company, moves from Toronto to Silicon Valley and begins selling calculators assembled around a Texas Instruments chip.

1970

Doug Engelbart patents his X-Y Position Indicator mouse.

Niklaus Wirth comes out with Pascal.

1971

The price of the Wang Model 300 series calculator drops to $600. Wang introduces the 1200 Word Processing System.


Stephen Wozniak and Bill Fernandez build their "Cream Soda computer."

Bowmar Instruments Corporation introduces the LSI-based (large scale integration) four function (+, -, *, /) pocket calculator with LED at an initial price of $250.

Intel markets the first microprocessor. Its speed is 60,000 'additions' per second.

1972

Ray Tomlinson, author of first email software, chooses @ sign for email addresses.

Dennis Ritchie invents C.

Bill Gates and Paul Allen form Traf-O-Data (which eventually becomes Microsoft).

Stephen Wozniak and Steven Jobs begin selling blue boxes.

Electronic mail!

1973

Stephen Wozniak joins Hewlett-Packard.

Radio Electronics publishes an article by Don Lancaster describing a "TV Typewriter."

IBM develops the first true sealed hard disk drive. The drive was called the "Winchester" after the rifle of the same name. It used two 30 Mb platters.

1975

MITS introduces the first personal computer, the Altair, in the form of a kit initially to be assembled by the buyer. It is based on Intel's 8-bit 8080 processor and includes 256 bytes of memory (expandable to 12 KB), a set of toggle switches, and an LED panel. A keyboard, screen, or storage device could be added using extension cards.

The Apple I....

1976

Greenblatt creates first LISP machine.

Queen Elizabeth is the first head of state to send email.

Shugart introduces 5.25" floppy.

IBM introduces a total information processing system. The system includes diskette storage, magnetic card reader/recorder, and CRT. The print station contains an ink jet printer, automatic paper and envelope feeder, and optional electronic communication.

Apple Computer opens its first offices in Cupertino and introduces the Apple II. It is the first personal computer with color graphics. It has a 6502 CPU, 4KB RAM, 16KB ROM, keyboard, 8-slot motherboard, game paddles, and built-in BASIC.

Commodore introduces the PET computer.

Tandy/Radio Shack announces its first TRS-80 microcomputer.

Ink-jet printing announced by IBM.

JVC introduces the VHS format for video recorders.

1977

The first digital audio disc prototypes are shown by Mitsubishi, Sony, and Hitachi at the Tokyo Audio fair.


1978

Apple introduces and begins shipping disk drives for the Apple II and initiates the LISA research and development project.


1979

Kevin MacKenzie invents the emoticon :-)

Usenet news groups.

1980

First AAAI conference at Stanford.

Telnet. Remote log-in and long-distance work (telecommuting) are now possible.

1981

Listserv mailing list software. Online knowledge-groups and virtual seminars are formed.

Osborne introduces first portable computer.

MS-DOS introduced.

BITNET (Because It's Time Network), a protocol for electronic mail, listserv servers, and file transfer, is established as a cooperative enterprise by the City University of New York and Yale University.

Xerox releases the 8010 Star and 820 computers.

IBM announces its Personal Computer.

DEC announces a line of personal computers.

HP introduces the HP 9000 technical computer with 32-bit "superchip" technology – it is the first "desktop mainframe", as powerful as room-sized computers of the 1960s.

1982

The CD (12 cm, 74 minutes of playing time) and its player are released by Sony and Philips in Europe and Japan. A year later, CD technology is introduced to the USA.

1983

IBM announces the PCjr.

Apple Computer Inc., Cupertino, California, announces Lisa, the first business computer with a graphical user interface. The computer has a 5MHz 68000 CPU, an 860KB 5.25" floppy, a 12" B&W screen, a detached keyboard, and a mouse.

1984

The Macintosh personal computer is launched by Apple Computer Inc. The first model has 128KB of memory and a 3.5" 400KB floppy disk drive. The OS, with its astounding graphical interface, is bundled with MacWrite (word processor) and MacPaint (free-hand, B&W drawing) software.

Apple introduces 3.5" floppy.

The domain name system is established.

1985

CD-ROM technology (disk and drive) for computers developed by Sony and Philips


File Transfer Protocol.

Microsoft ships Windows 1.01.


1988

The 386 chip brings PC speeds into competition with LISP machines.

1989

Tim Berners-Lee invents the WWW while working at CERN, the European Particle Physics Laboratory in Geneva, Switzerland. He won the Finnish Technology Award Foundation's first Millennium Technology Prize in April of 2004. The $1.2 million prize was presented by Tarja Halonen, president of Finland.

1990

The Archie FTP semi-crawler search engine is built by Peter Deutsch of McGill University.

1991

CD-recordable (CD-R) technology is released.

WAIS publisher-fed search engine, invented by Brewster Kahle of the Thinking Machines Co.

Gopher, created at University of Minnesota Microcomputer, Workstations & Networks Center.

WWW server combines URL (addressing) syntax, HTML (markup) language for documents, and HTTP (communications protocol). It also offers integration of earlier Internet tools into a seamless whole.

1992

There are about 20 Web servers in existence (Ciolek 1998).

1993

"Universal Multiple-Octet Coded Character Set" (UCS), aka ISO/IEC 10646 is published in 1993 by the International Organization for Standardization (ISO). It is the first officially standardized coded character set with the purpose to eventually include all characters used in all the written languages in the world (and, in addition, all mathematical and other symbols).

Mosaic graphic WWW browser developed by Marc Andreessen (Cailliau 1995). Graphics user interface makes WWW finally a competitor to Gopher. Production of web pages becomes an easy task, even to an amateur. (Mosaic was the first Explorer- or Netscape-like "browser.")

There are 200+ Web servers in existence (Ciolek 1998).

1994

Labyrinth graphic 3-D (vrml) WWW browser is built by Mark Pesce. It provides access to the virtual reality of three-dimensional objects (artifacts, buildings, landscapes).

Netscape WWW browser, developed by Marc Andreessen, Mountain View, California.

1995

RealAudio narrowcasting (Reid 1997:69).

Java programming language, developed by Sun Microsystems, Palo Alto, California. Client-side, on-the-fly supplementary data processing can be performed using safe, downloadable micro-programs (applets).


Metacrawler WWW meta-search engine. The content of WWW is actively and automatically catalogued.

The first online bookstore, Amazon.com, is launched in Seattle by Jeffrey P. Bezos.

The Altavista WWW crawler search engine is built by Digital around the Digital Alpha processor. A very fast search of 30-50% of the WWW is made possible.

1996

There are 100,000 Web servers in existence.

1997

There are 650,000 Web servers in existence.

"Deep Blue 2" beats Kasparov, the best chess player in the world. The world as we know it ends.

DVD technology (players and movies) is released. A DVD-recordable standard is created (Alpeda 1998).

Web TV introduced.

1998

Kevin Warwick, Professor of Cybernetics at the University of Reading in the U.K., becomes the first human to host a microchip. The approximately 23mm-by-3mm glass capsule containing several microprocessors stayed in Warwick's left arm for nine days. It was used to test the implant's interaction with computer-controlled doors and lights in a futuristic 'intelligent office building'.

There are 3.6 million Web servers in existence (Zakon 1998).

1999

There are 4.3 million Web servers in existence (Zakon 1999).

Netomat: The Non-Linear Browser, by the New York artist Maciej Wisniewski, is launched. The open-source software uses Java and XML technology to navigate the web in terms of the data (text, images, and sounds) it contains, as opposed to traditional browsers (Mosaic, Lynx, Netscape, Explorer), which navigate the web's pages.

1999/2000

A global TV programme, '2000Today', reports live for 25 hours non-stop on the New Year celebrations in 68 countries all over the world. It is the first ever show of that duration and geographical coverage. The programme involved round-the-clock work by over 6000 technical personnel, and used an array of 60 communication satellites to reach 1 billion viewers in all time-zones all over the globe (The Canberra Times, 1 Jan, 2000).


Conclusions:

Psychology Today and Tomorrow


From Logical Positivism to Postmodernism

The philosophy that came to dominate research in psychology in the first half of the 20th century was called logical positivism. This philosophy began with meetings of philosophers and physicists in Vienna and Berlin in the 1920’s. The names that come up most often in association with logical positivism are Moritz Schlick (the founder) and Rudolf Carnap.

The basic idea of logical positivism is that all knowledge is based on empirical observation, assisted by the rigorous use of logic and mathematics. The ideal method in science, in other words, is hypothesis testing. In fact, any theoretical statement is meaningful only if it can be tested empirically. This is called the verification principle.

What this meant in the larger scheme is that all metaphysical (and, of course, theological) statements are meaningless. The only purpose left to philosophy, according to the logical positivists, is the investigation of the meaningfulness of scientific statements. Over time, logical positivism came to dominate the thinking of most people in physics and chemistry, and many in biology and psychology. It was the behaviorists who adopted it most enthusiastically.

But in the second half of the 20th century, a new philosophy called postmodernism came in with some powerful criticism of logical positivism and all modern philosophy. The most familiar names associated with postmodernism are Michel Foucault and Jacques Derrida.

Postmodernism started in architecture, when some young architects in the late 1900’s rebelled against what their teachers told them about "right" and "wrong" ways to design buildings. Their teachers at the time were mostly modernists, who liked clean lines and pure geometric forms, such as we see in many modern skyscrapers. So the rebels started calling themselves postmodernists. Before, the emphasis was on keeping with one architectural philosophy or another, one style or another. The postmodernists said break the rules! mix up the styles! play with space! defy gravity if you like!

In philosophy, modernism refers to enlightenment philosophy. Back then, philosophers were seeking a single, monolithic Truth. But, beginning with Hume’s skepticism and Kant’s critical philosophy, philosophers became increasingly aware of the limitations of philosophy. Although often hidden by the popularity of approaches such as Hegel's absolutism and Comte's positivism, this skeptical or critical line of thought continued all the way through the 1800’s to Nietzsche's perspectivism and William James' pragmatism.

The fundamental point of postmodernism is that there is no objective reality or ultimate truth that we have direct access to. Truth is a matter of perspective or point-of-view. Each individual constructs his or her own understanding of reality, and no one is capable of rising above their perspectives.

In the course of history, some constructions of reality have been privileged, that is, supported by a powerful elite – wealthy European men, to use a common example. Other constructions have been suppressed. Examples of suppressed constructions include the points-of-view of women, the poor, and nonwestern cultures.

Everything is seen through "glasses" – social, cultural, even individual. Even science! Thomas Kuhn, a philosopher of science, pointed out that science is actually a messy business, full of personal, cultural and even political influences. "Truth" is whatever the scientists presently in power say it is – until this status quo is overwhelmed by contradictions. Then a scientific "revolution" – a paradigm shift – takes place. And things start all over again.

The major tool of postmodernism is deconstruction. Deconstruction is when you show that some system of thought is ultimately incomplete or irrational even by its own internal ideas and reasoning. It’s like an extended version of "reduction to absurdity" – criticism from the inside out. Or you can see it as an extension of nominalism: names refer to individuals, but words that pretend to refer to anything more (universals, ideals, forms, natural laws, Ultimate Truths...) are just empty noises!

By deconstructing some of our traditional philosophies, histories, literatures, and sciences, postmodernism made us aware of the biases we can’t easily see because those biases are too close to us, too much a part of us. This has been the task, for example, of feminism.

Feminism began as a call to take women seriously. After eons of women's lives being seen as little more than a footnote to men's, it is way past time to pay attention to them both as subjects of serious interest and as thinkers in their own right!

Feminists say that being male unconsciously biases men as philosophers (or historians, scientists...). If we want to improve our understanding of our world, we need to take the female perspective into account. These are very good points!

Another postmodernist movement is multiculturalism. It is argued that western thinkers are unconsciously biased by their common cultural assumptions, social structures, and histories. For many years, for example, there has been a tendency to see Europeans and their descendants as somehow "normal," with other peoples and civilizations in some way inferior or deviant.

Today, most social scientists are well aware of other cultural perspectives, and are careful to examine their own biases. Social science generally has welcomed the contributions of a constantly expanding number of scientists from non-western backgrounds.

A bias that interests me is the bias that comes from class. Until very recently, the majority of scientists and other scholars have been members of the upper classes, with little sympathy for, much less understanding of, the working class poor. Even today, we have to ask ourselves, who do we as scientists work for? More often than not, it is for establishments, academic or corporate. We do, consciously or not, what our lords demand of us!

Unfortunately, some argue that the view from the lower rungs of society is actually better than the view from the top. Similarly, some feminists have argued that the female perspective is intrinsically better than the male perspective. This point of view ignores the possibility that men may overcome their biases, and the possibility that women can be equally biased. We find the same tendency among advocates of other critical philosophies. It is not, for example, necessarily true that if a theory is clearly European it is wrong, or if it is non-western it is right. And even someone who does research for multinational corporations can occasionally be correct! Okay, probably not.

Furthermore, not all perspectives are equally valuable. Astrology and phrenology may be perspectives on personality, but they are, in fact, wrong! The explanations of human behavior given by Siberian shamans, although certainly interesting, are no more likely to be accurate than the explanations provided by Europe's own early thinkers.

Deconstructionism and postmodern philosophies in general tend to be negative philosophies. They criticize, but seldom offer alternatives. Their arguments often lack empirical support or even rational thinking: Remember that they are criticizing our very ability to be empirical or rational!

At first, traditionalists were impressed and became interested in recognizing their limitations. Men as well as women became feminists; westerners as well as others embraced multiculturalism. Most welcomed the variety of perspectives!

But eventually, some noticed: If all truth is relative (just as if all morality is relative), then feminism, multiculturalism, etc. are not intrinsically truer or more valuable than "masculinism" or Eurocentrism, etc. If we can’t make judgments as to what is or isn’t True, then how can we progress? How can we improve ourselves and our societies when "progress" is all in the eyes of the beholder?

If you believe that all perspectives are equally valid, then the only thing that raises one perspective over any other, as Nietzsche pointed out, is power. If philosophy and science are reduced to power struggles among "authorities," we are right back where we were on, say, February 17, 1600, when the church burned Giordano Bruno at the stake.

So, once we become aware (and stay aware!) of our limitations and biases, aware even of the limitations of empiricism and rationalism themselves, we must nevertheless return to empiricism and rationalism, as the only way we can at least approximate truth, perhaps as the only way we can survive as a species. We must learn our lesson and then get back to work!

The Situation for Psychology

So where are we today, in the first years of the new millennium?

Freudianism is slowly disappearing. Its insights have been absorbed into a general clinical psychology that is dominated by humanistic practices based more on Carl Rogers and Albert Ellis than on Sigmund Freud. The object relations school attempts to hang on to Freud, but is really little more than a belated recognition of humanist ideas, reconstructed into psychoanalytic language. Jungian psychology, too, is disappearing. Jung still lives on in the study of mythology and symbolism and in the amazing popularity of the Myers-Briggs categories. Adler, on the other hand, has been "rediscovered" and his insights thoroughly integrated into humanistic and existential psychology. The same can be said for "neo-Adlerian" theorists such as Karen Horney and Erich Fromm.

Sensation and perception, the concerns of most of the originators of psychology as a science, draw less and less attention over the years. Gestalt psychology has, for the most part, been absorbed into the mainstream and lost its status as a separate approach. Its two offspring, humanistic clinical psychology and the field of social psychology are, of course, alive and well. Humanistic psychology forms the bedrock of modern clinical practice, especially in the form of an eclectic blend of Rogers and Ellis (despite their outward incompatibility!), plus a few behavioristic techniques such as systematic desensitization.

Social psychology has become a blend of humanistic concerns and inventive experimental research. Unfortunately, it has rejected its phenomenological roots, and there is little in the way of coherent theorizing or long-term commitment to research programs. Much of social psychology is a matter of testing disconnected, intuitive hypotheses.

Other disciplines, such as personality and developmental psychology, follow the same pattern as social psychology. Not only is there little in the way of theorizing in personality, but the trend is toward quantitative research, almost all of it devoted to individual differences. The pet paradigm is test creation using factor analysis, despite the fact that factor analysis is a highly suspect methodology that may well relate more to word meanings than to constructs with real psychological referents.

Developmental psychology has become increasingly applied, especially, of course, in relation to education and parenting. One advance is the movement towards consideration of the entire life span. This change also has close ties to applied areas, this time the social problem of an increasingly elderly population.

Phenomenology as a method has become a part of a more general movement usually referred to as qualitative methods. These methods have become popular in certain fields, especially education and nursing, and in certain orientations, such as feminism and multiculturalism. Unfortunately, the methods are often poorly used. They are by nature far more susceptible to bias, and much of the research can only be taken as exploratory at best.

Existentialism has fused with humanism, sometimes contributing its philosophical depth, sometimes merely adding its confusing jargon. Many existentialists and humanists have drifted into the realm of transpersonal psychology, which investigates issues such as altered states of consciousness and spiritual experiences. Although there is legitimate and valuable research here, most of it is a form of new age mysticism in the guise of psychological science.


Behaviorism, much like gestalt psychology, has been absorbed into mainstream psychology. While students continue to memorize Pavlovian and Skinnerian conditioning paradigms, it is increasingly understood that these are not particularly useful for understanding human behavior. It is really Tolman and Bandura who appear to be having the long-term impact. Hard-core behaviorists are moving into the study of physiological processes.

The most disappointing area of psychology for me personally has been cognitive psychology. While it began promisingly with the works of psychologists like Ulric Neisser and the input from the artificial intelligence movement, it seems that both Neisser and AI researchers have abandoned the program! Neisser felt that cognitive psychology was ignoring reality and becoming a sort of intellectual game. AI researchers found that it simply wasn't necessary to model human cognitive processes in order to outdo human performance. When the Deep Blue computer beat grand master Garry Kasparov, humanity's secure place at the top of creation seemed to come to an end.

One offshoot of cognitive psychology is a new interest in such traditional philosophical issues as the nature of consciousness. Often considered the "ultimate" psychological question, it has generated a great deal of excitement at conferences. I may be alone in this, but the problem of consciousness is not a problem for me. It is only a problem if you insist, against all reason, on being a materialist!

The most active part of psychology today is physiological psychology. First, the remarkable progress in mapping even the living, working brain with CT scans, PET scans, and MRIs will soon result in a fairly complete picture of brain circuitry. Second, the discovery of effective new drugs operating at the synapse has revolutionized clinical psychology. And third, the completion of the mapping of the human genome heralds the beginning of a far more thorough understanding of the links between genetics and behavior. On the other hand, physiological psychologists are identifying themselves more and more with their biological and medical colleagues, and distancing themselves from the "softer" side of psychology.

Related to the developments in physiological psychology is the impact of sociobiology on psychological theory. Often called evolutionary psychology, this approach has produced a significant number of intriguing hypotheses about the origins of human behavior and the existence of possible instincts that delimit, if not define, our natures. Unfortunately, the approach has offered little in the way of testable hypotheses as yet.

As it stands right now, psychology is fragmented, with a particularly large divide between humanistic applied psychology and a highly reductionistic biological psychology. What is needed is a unifying theory, one that avoids the easy extremes. It has to be informed by postmodern criticism, but must ultimately base itself on a broad empiricism and rigorous rationalism. It has actually been done before: William James did it in the 1890’s; so did Gardner Murphy in the 1950’s. Apparently, the field was not ready to recognize the full implication of their efforts and others like them. Maybe we will be ready next time.

In the meantime, courses in the history of psychology (however painfully boring they may be!) have an important place in our educations: By looking at things from the big, historical perspective, and from the "aspect of eternity" we get by studying philosophy, perhaps we will have progress in psychology sooner rather than later.

See you in the future!

– George Boeree


The History of Psychology Part One: The Ancients

Part Two: The Rebirth

Part Three: The 1800's

[ http://www.ship.edu/%7Ecgboeree/historyofpsych.html ]
