Perspectives of Social Computing life-like characters as social actors
Perspectives of Social Computing
life-like characters as social actors
Helmut Prendinger and Mitsuru Ishizuka
Dept. of Information and Communication Eng.
Graduate School of Information Science and Technology
University of Tokyo
Social Computing
objective

Social Computing aims to support the tendency of humans to interact with computers as social actors. Technology that reinforces this bias towards social interaction by responding appropriately may improve communication between humans and computational devices.
Social Computing (cont.)
realization
Most naturally, social computing can be realized
by using life-like characters.
Life-like Characters
requirements for their believability

Features of life-like characters:

Embodiment
• Synthetic bodies
• 2D or 3D animations [realism not required]
• Affective voice
• Emotional display
• Gestures
• Posture

Artificial Mind
• Emotional response
• Personality
• Context- and situation-dependent response [social role awareness]
• Adaptive behavior [social intelligence]
Terms

• Life-likeness: providing the “illusion of life”
• Believability: allowing the “suspension of disbelief”
Scope of Applications
some examples
Outline
social computing
• Background– The Media Equation, Affective Computing, the Persona
Effect
• Artificial mind– An architecture for emotion-based agents
• Embodied behavior– Gestures, affective speech
• Implementation– Coffee shop demo, Casino demo
• Emotion recognition (sketch only)– Stereotypes, biosensors
• Environments with narrative intelligence (sketch only)– Character and story
Background
computers as social actors

• Psychological studies show that people are strongly biased to treat computers as social actors
– For a series of classical tests of human-human social interaction, results still hold if “human” is replaced by “computer”
– Computers with language output (human-sounding voice) and a role (companion, opponent, …)
– Tendency to be nicer in “face-to-face” interactions, …
• We hypothesize that life-like characters support the human tendency towards natural social interactions with computers
Ref.: B. Reeves and C. Nass, 1998. The Media Equation. Cambridge University Press, Cambridge.
Background (cont.)
computers that express and recognize emotions

• Affective Computing (R. Picard)
– “[…] computing that relates to, arises from, or deliberately influences emotions.”
– “[…] if we want computers to be genuinely intelligent, to adapt to us, and to interact naturally with us, then they will need to recognize and express emotions […]”
• We hypothesize that life-like characters constitute an effective technology to realize affect-based interactions with humans
Ref.: R. Picard, 1997. Affective Computing. The MIT Press.
Background (cont.)
the persona effect

• Experiment by J. Lester et al. on the “persona effect”
– “[...] which is that the presence of a lifelike character in an interactive learning environment - even one that is not expressive - can have a strong positive effect on students’ perception of their learning experience.”
– Dimensions: motivation, entertainment, helpfulness, …
Ref.: J. Lester et al., 1997. The persona effect: Affective impact of animated pedagogical agents. Proc. of CHI’97, 359-366.
J. Lester et al., 1999. Animated agents and problem-solving effectiveness: A large-scale empirical evaluation. Artificial Intelligence in Education, 23-30.
Herman the Bug watches as a student chooses roots for a plant in an Alpine Meadow
Life-like Characters
designing their mind

• Architecture for emotion-based behavior
– Affect processing
– Personality
– Awareness of social and contextual factors
– Adaptive to interlocutor’s emotional responses
• SCREAM: SCRipting Emotion-based Agent Minds
– Scripting tool to specify character behavior
– Encodes affect-related processes
– Allows the author to define a character profile for an agent
SCREAM System Architecture
SCRipting Emotion-based Agent Minds
Ref.: H. Prendinger, S. Descamps, M. Ishizuka, 2002. Scripting affective communication with life-like characters. Artificial Intelligence Journal. To appear.
H. Prendinger, M. Ishizuka, 2002. SCREAM: SCRipting Emotion-based Agent Minds. Proceedings 1st International Joint Conference on Autonomous Agents and Multi-Agent Systems (AAMAS’02). To appear.
Emotion Generation Component
elicitation and management of emotions

• Appraisal Module
– Process that qualitatively evaluates events according to their emotional significance for the character
– Outputs emotion types: joy, distress, angry at, happy for, resent, Schadenfreude, …
• Resolution Module
– Given that multiple emotions are active at a time, the most dominant emotion must be extracted
• Maintenance Module
– Emotions are short-lived; they decay
Appraisal Module
the cognitive structure of emotions

Ref.: A. Ortony, G. Clore, A. Collins, 1988. The Cognitive Structure of Emotions. Cambridge University Press, Cambridge.
Appraisal Rules
examples

joy(L,F,I,S) if           % emotion type
    wants(L,F,Des,S) and  % goal
    holds(F,S) and        % belief
    I = Des.              % intensity

happy-for(L1,L2,F,I,S) if        % emotion type
    likes(L1,L2,App,S) and       % attitude
    joy(L2,L1,F,Des,S) and       % belief (hypothesized emotion of L2)
    log-combination(App,Des,I).  % intensity
Appraisal Rules (cont.)
examples

angry-at(L1,L2,A,I,S) if            % emotion type
    holds(did(A,L2),S) and          % belief
    causes(A,F,S0) and              % belief
    precedes(S0,S) and              % formal condition
    blameworthy(A,Praise,L1) and    % standard
    wants(L1,Non-F,Des,S) and       % goal
    log-combination(Praise,Des,I).  % intensity
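The rules above can be sketched in Python. This is an illustrative reconstruction, not the SCREAM implementation: the dictionary-based state, the function names, and the base-2 `log_combination` are all assumptions.

```python
import math

def log_combination(a, b):
    # One plausible choice for combining two intensities on a log scale.
    return round(math.log2(2 ** a + 2 ** b))

def appraise_joy(agent, fact, state):
    # joy(L,F,I,S): L feels joy about F with intensity Des
    # if L wants F (with desirability Des) and F holds in S.
    des = state["wants"].get((agent, fact))
    if des is not None and fact in state["holds"]:
        return ("joy", des)
    return None

def appraise_happy_for(l1, l2, fact, state):
    # happy-for(L1,L2,F,I,S): L1 is happy for L2 if L1 likes L2
    # and L1 hypothesizes that L2 feels joy about F.
    app = state["likes"].get((l1, l2))
    joy = appraise_joy(l2, fact, state)
    if app is not None and joy is not None:
        return ("happy-for", log_combination(app, joy[1]))
    return None

state = {
    "wants": {("al", "win_game"): 4},
    "holds": {"win_game"},
    "likes": {("james", "al"): 3},
}
print(appraise_joy("al", "win_game", state))                 # ('joy', 4)
print(appraise_happy_for("james", "al", "win_game", state))  # ('happy-for', 5)
```

The hypothesized-emotion step mirrors the rule: `happy-for` calls the same `joy` appraisal on the other agent's assumed goals.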
Emotion Resolution/Maintenance
emotion dynamics

[Figure: active emotions (valence positive or negative) plotted over states 0 to 3; intensities decay over time, e.g. happy for (5) and hope (4) fading while distress and angry at (3) compete, and at each state the winning emotional state is extracted.]
Example of disagreeable character[agreeableness dimension of personality decides decay rate of pos./neg. emotions]
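The decay mechanics can be sketched as follows; the per-step decay rates and their link to the agreeableness dimension are assumptions for illustration, following the idea that a disagreeable character holds on to negative emotions longer.

```python
# Emotions are keyed by (name, valence sign); intensities decay each step.
def decay(emotions, agreeableness):
    # Assumed mapping: disagreeable characters decay positive emotions
    # quickly and negative emotions slowly; agreeable characters the reverse.
    pos_rate = 2 if agreeableness < 0 else 1
    neg_rate = 1 if agreeableness < 0 else 2
    out = {}
    for (name, valence), intensity in emotions.items():
        rate = pos_rate if valence > 0 else neg_rate
        if intensity - rate > 0:  # drop emotions that decayed away
            out[(name, valence)] = intensity - rate
    return out

active = {("happy for", 1): 5, ("angry at", -1): 3}
print(decay(active, agreeableness=-1))  # {('happy for', 1): 3, ('angry at', -1): 2}
```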
Emotion Regulation Component
interface between emotional state and expression

• “Display rules”
– Ekman and Friesen (’69): the expression and intensity of emotions are governed by social and cultural norms
• Linguistic style variations
– Brown and Levinson (’87): linguistic style is determined by an assessment of the seriousness of Face Threatening Acts (FTAs)
– Social variables (universal): distance, power, imposition of speech acts
• Emotion regulation studies
– J. Gross in psychology
– De Carolis, de Rosis in HCI
Social Filter Module
emotion expression modulating factors
Ref.: H. Prendinger, M. Ishizuka, 2001. Social role awareness in animated agents. Proceedings 5th International Conference on Autonomous Agents (Agents’01), 270-277.
Linear combination of parameters
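A minimal sketch of such a linear combination: external expression intensity is the internal intensity reduced by social threat. The weights and the choice of power and distance as the combined variables are assumptions for illustration.

```python
def filtered_intensity(internal, power, distance, w_p=0.5, w_d=0.5):
    # Social threat grows with the interlocutor's power and social distance;
    # the filtered (expressed) intensity never drops below zero.
    threat = w_p * power + w_d * distance
    return max(0, internal - threat)

# An angry waiter (internal anger 4): full expression toward a close friend,
# suppressed expression toward the powerful, distant manager.
print(filtered_intensity(4, power=0, distance=0))  # 4.0
print(filtered_intensity(4, power=4, distance=2))  # 1.0
```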
Social Filter Module (cont.)
alternative combination using a decision network
Agent Model Component
affective state management

• Character Profile
– Static and dynamic features
– Values of dynamic features are initialized
• Static features
– Personality traits, standards
• Dynamic features
– Goals, beliefs: updated by a surface consistency check
– Attitude, social distance: simple update mechanisms
Affect Dynamics
attitude and familiarity change

• Attitudes (liking, disliking)
– Attitudes are an important source of emotions
• Decisive for `happy for’ vs. resent, `sorry for’ vs. gloat
– On the other hand, an agent’s attitude changes as a result of the `affective interaction history’ (elicited emotions) with the interlocutor
– Implementation of the Signed Summary Record (Ortony ‘91)
• Familiarity (social distance)
– Source for some emotions
• attraction, aversion
– Positive emotions elicited with the interlocutor improve the social relationship and possibly increase familiarity
– Simplified implementation of Social Perlocutions (Pautler and Quilici ‘98)
– [A more sophisticated model is implemented by Cassell and Bickmore ’01, considering the variety and depth of topics]
Signed Summary Record
computing attitude

[Figure: winning emotional states over time (joy (2), distress (1), distress (3), angry at (2), hope (2), good mood (1), gloat (1), happy for (2)) are split into positive and negative emotions; their summary value yields the attitude: liking if positive, disliking if negative.]

Ref.: A. Ortony, 1991. Value and emotion. In: W. Kessen, A. Ortony, and F. Craik (eds.), Memories, Thoughts, and Emotions: Essays in Honor of George Mandler. Hillsdale, NJ: Erlbaum, 337-353.
Updating Attitude
weighted update rule

• What if a high-intensity emotion of opposite sign occurs? (a liked agent makes the character very angry)
– The character ignores the `inconsistent’ new information, or
– The character updates the summary value by giving greater weight to the `inconsistent’ information (primacy of recency, Anderson ‘65)
• Consequence for future interaction with the interlocutor
– Momentary disliking: the new value is active for the current situation only
– Essential disliking: the new value replaces the summary record

Example: liking (3) × h-weight (0.25) − anger (5) × r-weight (0.75) = −3, i.e., disliking (3)
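The weighted update can be sketched directly; the weights 0.25/0.75 come from the slide's example, while the function name and signed-intensity convention are assumptions.

```python
def update_attitude(summary, new_signed, w_history=0.25, w_recent=0.75):
    # Primacy of recency: the 'inconsistent' new emotion gets the larger
    # weight. Positive values mean liking, negative values mean disliking.
    return summary * w_history + new_signed * w_recent

# Liking of 3, then a high-intensity anger episode (signed intensity -5):
print(update_attitude(3, -5))  # -3.0, i.e. disliking of intensity 3
```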
Input and Output Components
receiving utterances and expressing emotions

• Input: formulas encoding
– speaker, hearer
– conveyed information
– modalities (facial display, linguistic style)
– hypothesized interlocutor goals, attitudes, …
• Output
– 2D animation sequences displaying the character
– Synthesized speech
Embodiment
characters that act and speak

• Realization of embodiment
– 2D animation sequences visually display the character
– Synthetic speech
• Technology
– Microsoft Agent package (installed client-side)
– JavaScript-based interface in Internet Explorer
• Microsoft Agent package
– Controls to trigger character actions and speech
– Text-to-Speech (TTS) Engine
– Voice recognition
• Multi-modal Presentation Markup Language (MPML)
– Easy-to-use XML-style authoring tool
– Supports multiple-character synchronization and simple synchronization of action and speech
– Interface with the SCREAM system
Gestures
non-verbal behaviors supporting speech
• Propositional gestures I
“there is a small difference” “there is a big difference”
Ref.: J. Cassell, 2000. Nudge nudge wink wink: Elements of face-to-face conversation for embodied conversational agents. In: J. Cassell, S. Prevost, J. Sullivan, and E. Churchill (eds.), Embodied Conversational Agents. The MIT Press, 1-27.
Gestures (cont.)
non-verbal behaviors supporting speech

• Propositional gestures II
“do you mean [this]”
“or do you mean [that]”
Gestures (cont.)
non-verbal behaviors supporting speech
• Gestures and posture for emotion expression
“happy”
“sad”
Gestures (cont.)
non-verbal behaviors supporting speech
• Communicative Behavior I
“greet”
“wantturn”
Communicative function
Gestures (cont.)
non-verbal behaviors supporting speech
• Communicative Behavior II
“taketurn”
“givefeedback”
Communicative function
Affective Speech
vocal effects associated with five emotions

Effect        | Fear             | Anger                        | Sadness              | Happiness                 | Disgust
Speech rate   | much faster      | slightly faster              | slightly slower      | faster or slower          | very much slower
Pitch average | very much higher | very much higher             | slightly lower       | much higher               | very much lower
Pitch range   | much wider       | much wider                   | slightly narrower    | much wider                | slightly wider
Intensity     | normal           | higher                       | lower                | higher                    | lower
Pitch changes | normal           | abrupt on stressed syllables | downward inflections | smooth upward inflections | wide downward terminal inflections

Ref.: I. R. Murray, J. L. Arnott, 1995. Implementation and testing of a system for producing emotion-by-rule in synthetic speech. Speech Communication (16), 369-390.
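One way to operationalize such a table is to map each emotion to parameter offsets for a TTS engine. The sketch below is illustrative only: the numeric offsets are invented stand-ins for the qualitative entries ("much faster", "slightly lower", …), not values from the paper.

```python
# Assumed offsets on an abstract -4..+4 scale; signs follow the table's
# qualitative directions (e.g. fear: much faster rate, very much higher pitch).
VOCAL_EFFECTS = {
    "fear":      {"rate": +3, "pitch_avg": +4, "pitch_range": +3, "intensity": 0},
    "anger":     {"rate": +1, "pitch_avg": +4, "pitch_range": +3, "intensity": +1},
    "sadness":   {"rate": -1, "pitch_avg": -1, "pitch_range": -1, "intensity": -1},
    "happiness": {"rate": +2, "pitch_avg": +3, "pitch_range": +3, "intensity": +1},
    "disgust":   {"rate": -4, "pitch_avg": -4, "pitch_range": +1, "intensity": -1},
}

def tts_settings(emotion, base_rate=0, base_pitch=0):
    # Apply the emotion's offsets to a neutral baseline.
    fx = VOCAL_EFFECTS[emotion]
    return {"rate": base_rate + fx["rate"], "pitch": base_pitch + fx["pitch_avg"]}

print(tts_settings("anger"))  # {'rate': 1, 'pitch': 4}
```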
Implementation
Implementation (cont.)
simple MPML script

<!-- Example MPML script -->
<mpml>
…
  <scene id="introduction" agents="james,al,spaceboy">
    <seq>
      <speak agent="james">Do you guys want to play Black Jack?</speak>
      <speak agent="al">Sure.</speak>
      <speak agent="spaceboy">I will join too.</speak>
      <par>
        <speak agent="al">Ready? You got enough coupons?</speak>
        <act agent="spaceboy" act="applause"/>
      </par>
    </seq>
  </scene>
…
</mpml>
Implementation (cont.)
interface between MPML and SCREAM

<!-- MPML script illustrating interface with SCREAM -->
<mpml>
…
  <consult target="[…].jamesApplet.askResponseComAct('james','al','5')">
    <test value="response25">
      <act agent="james" act="pleased"/>
      <speak agent="james">I am so happy to hear that.</speak>
    </test>
    <test value="response26">
      <act agent="james" act="decline"/>
      <speak agent="james">We can talk about that another time.</speak>
    </test>
    …
  </consult>
…
</mpml>
Life-like Characters in Inter-Action
three demonstrations

• Coffee Shop Scenario: animated agents with personality and social role awareness
• Japanese Comics Scenario: animated comics actors engaging in developing social relationships
• Casino Scenario: life-like characters that change their attitude during interaction
Coffee Shop Scenario
life-like characters with social role awareness

• User in the role of customer
• Animated waiter features
– Emotion, personality
– Social role awareness: respecting conventional practices depending on the interlocutor
• Aim of implementation
– Entertaining environment for language conversation training
• Aim of study
– Does social role awareness have an effect on the character’s believability?
Ref.: H. Prendinger, M. Ishizuka, 2001. Let’s talk! Socially intelligent agents for language conversation training. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 31(5), 465-471.
Experimental Study
user-agent and agent-agent interaction

Cast: James (waiter), Genie (manager), Al (waiter’s friend)

Unfriendly Waiter Version (C1)
– Description: James responds rudely to the user (ignores practices); changes his behavior to polite with his manager and Al
– Hypothesis: James’ behavior is unnatural towards the user but natural towards the other agents

Friendly Waiter Version (C2)
– Description: James displays polite behavior to the customer; disobeys the manager’s order and turns down Al (ignores practices)
– Hypothesis: James’ behavior is natural towards the user but unnatural towards the other agents
Example Conversation
unfriendly waiter version (excerpt only)

Customer: I would like a beer.
  [User selects drink.]
Waiter: No way, this is a coffee shop.
  [The waiter considers it blameworthy to be asked for alcohol and shows his anger; he ignores conventional practices toward the customer.]
Manager: Hello James!
  [The manager of the coffee shop appears.]
Waiter: Good afternoon. May I take a day off tomorrow?
  [Performs a welcome gesture. Being aware of the social threat from his manager, the waiter uses a polite linguistic style.]
Manager: It will be a busy day.
  [The manager implies that the waiter should not take a day off.]
Waiter: Ok, I will be there.
  [The waiter considers it blameworthy to be denied a vacation and is angry. As he is aware of the threat from his manager, he suppresses his anger.]
Results
social role awareness and believability

• Support for the effect of social role awareness
– Behavior more natural to the user in C2 [respects role]
– Behavior more agreeable in C2 [friendly behavior even though the user poses a low threat]
• Unexpected results
– James’ behavior slightly more natural to others in C2
– Personality and mood rated differently (despite the short interaction time)

Question                  | Unfriendly Waiter (C1) | Friendly Waiter (C2)
James natural to user     | 3.00 | 6.00
James natural to others   | 4.88 | 5.50
James in real life, movie | 5.00 | 4.63
James has good mood       | 2.25 | 2.25
James is agreeable        | 2.38 | 4.75
James likes his job       | 1.63 | 2.63

Mean scores for participants’ attitudes (8 subjects for each version); ratings range from 1 (disagreement) to 7 (agreement).
Casino Scenario
life-like characters with changing attitude

• User in the role of player of a Black Jack game
• Animated advisor features
– Emotion, personality
– Changes attitude depending on the interaction history with the user
• Advisor’s agent profile
– Agreeable, extrovert, initially slightly likes the user
– Wants the user to follow his advice (high intensity)
– Wants the user to win (low intensity)

Implemented with MPML and SCREAM
Casino Demo
Produced in cooperation with Sylvain Descamps
Emotional Arc
advisor’s winning emotions depending on attitude

• Figure shows the agent’s internal intensity values for dominant emotions
– Highly abstract description (influences of personality, context, … are left out)
• Values of expressed emotions differ depending on the agent’s personality and contextual features
– Since the character’s personality is agreeable, e.g., negative emotions are de-intensified

[Figure: the advisor’s attitude shifts from positive to negative over five games (Game 1: user rejects advice, loses; Games 2-3: rejects advice, loses; Game 4: follows advice, loses; Game 5: rejects advice, wins); dominant emotions include sorry for (4), distress (4), sorry for (5), good mood (5), and, under negative attitude, gloat (5).]
Japanese Comics Scenario
Japanese Manga for children: “Little Akko’s Got a Secret”

• User controls an avatar (“Kankichi”)
– The goal is to elicit Little Akko’s attraction emotion by guessing her wishes
– Correct guesses increase her liking and familiarity values
• Animated character features
– Emotion (joy, distress, attraction, aversion)
• Aim of game
– Develop a social relationship
– Entertainment
User makes a wrong guess …
Emotion Recognition
limitations of our characters as social actors

• Human social actors can recognize interlocutors’ emotions
– Humans recognize frustration (confusion, …) when interacting with others and typically react appropriately
– Our characters’ emotion recognition ability is very limited
– Characters make assumptions about other agents (incl. the user) and use emotion generation rules to detect their emotional state
• Stereotypes are used to reason about the emotions of others
– A typical visitor in a coffee shop wants to be served a certain beverage and is assumed to be distressed upon failure to receive it (the goal “get a certain beverage” is not satisfied)
– A typical visitor in a casino wants to win, …
– The very same appraisal rules are used to reason about the emotional state of the interlocutor
• Emotion recognition via physiological data from the user
– We have started to use bio-signals to detect users’ emotional states
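The stereotype-based inference can be sketched as follows: a role is mapped to an assumed goal, and the same appraisal logic applied to the characters is applied to the interlocutor. The role names and desirability values here are invented for illustration.

```python
# Hypothetical stereotype table: each role comes with an assumed goal
# and a default desirability for that goal.
STEREOTYPES = {
    "coffee_shop_visitor": {"goal": "get_beverage", "desirability": 3},
    "casino_visitor": {"goal": "win_game", "desirability": 4},
}

def infer_emotion(role, goal_satisfied):
    # Joy if the stereotyped goal is satisfied, distress otherwise,
    # reusing the appraisal-rule pattern on the interlocutor's assumed goals.
    st = STEREOTYPES[role]
    return ("joy" if goal_satisfied else "distress", st["desirability"])

# A coffee shop visitor who fails to receive the beverage:
print(infer_emotion("coffee_shop_visitor", goal_satisfied=False))  # ('distress', 3)
```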
Physiological Data Assessment
ProComp+

• EMG: Electromyography
• EEG: Electroencephalography
• EKG: Electrocardiography
• BVP: Blood Volume Pulse
• SC: Skin Conductance
• Respiration
• Temperature
Visualization of Physiological Data
Biograph Software
Emotion Model
Lang’s (’95) 2-dimensional model

• Valence: positive or negative dimension of feeling
• Arousal: degree of intensity of the emotional response

[Figure: valence-arousal plane with example emotions: excited, joyful, relaxed on the positive-valence side; enraged, sad, depressed on the negative-valence side, ordered by decreasing arousal.]
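A minimal sketch of labeling a (valence, arousal) reading in this 2-dimensional model; the thresholds and the choice of label per region are assumptions for illustration, not calibrated values.

```python
def label_emotion(valence, arousal):
    # Assumed convention: both axes range over [-1, 1]; arousal >= 0.5
    # counts as high, 0 <= arousal < 0.5 as moderate, below 0 as low.
    if valence >= 0:
        if arousal >= 0.5:
            return "excited"
        return "joyful" if arousal >= 0 else "relaxed"
    if arousal >= 0.5:
        return "enraged"
    return "sad" if arousal >= 0 else "depressed"

print(label_emotion(0.7, 0.8))    # excited
print(label_emotion(-0.6, -0.4))  # depressed
```

In practice the valence and arousal values would come from the bio-signals above (e.g. skin conductance correlating with arousal).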
Educational Games
recognizing students’ emotions (C. Conati)

• Computer games have high potential as educational tools
– May generate a high level of engagement and motivation
– Detect students’ emotions to improve the learning experience

Prime Climb Game: to teach number factorization (UBC)
Example Session
user’s traits

[Figure: a dynamic network relating the user’s traits (self-esteem, extraversion) and the agent’s action (provide help, do nothing) to the user’s emotional state (reproach, shame, relief) at times ti and ti+1; valence and arousal are evidenced by bodily expressions: skin conductivity (GSR), eyebrow position (vision-based recognizer, EMG sensors), and heart rate (HR monitor).]
Narrative Intelligence (sketch only)
limitations of our characters as social actors

• Our characters are embedded in quite simplistic scenarios
– The knowledge gain might be limited even if the characters are life-like
• “Knowledge is Stories” (R. Schank ‘95)
– Schank argues that knowledge is essentially encoded as stories
– This suggests designing `story-like’ interaction scenarios
• Narrative Intelligence (P. Sengers ’00)
– Humans have a tendency to interpret events in terms of narratives
– This suggests that characters should be designed to produce narratively comprehensible behavior, so that humans can easily create narrative explanations of them
• Applications
– Learning environments (users as co-constructors of narratives)
– Virtual sales agents (the story serves rapport building and credibility)
– Corporate memory (story-telling to enhance knowledge exchange in organizations, learning from mistakes, …)
Summary

• Social computing
– Humans are strongly biased to treat computational devices as social actors
– To achieve effective and efficient interaction between humans and computational devices, social computing aims to support the human tendency to communicate with such devices in an essentially natural way
• Approach
– Use of life-like characters as social actors
– Model and implement some aspects of the interaction capabilities and modalities of humans
– Many features of human-human interaction need further investigation …