
Page 1

Expressive gesture in interaction: the role of movement and gesture in emotion

Ginevra Castellano, Antonio Camurri, Gualtiero Volpe

WP6 HUMAINE Workshop, Paris, March 10-11, 2005

Infomus Lab, http://infomus.dist.unige.it

Department of Computer Science, Systems and Telematics, University of Genoa

Page 2

What is expressive gesture?

Gesture: a support to verbal communication; a movement of the body that conveys information

Expressive gesture: high-level non-verbal expressive and emotional communication (Camurri et al., 2004)

Artistic context: gesture as a conveyor of information related to the emotional domain (dance or music performances)

Page 3

Expressive gesture in HCI

Aims:

to communicate emotions to users

to recognize users’ emotional engagement

Expressive gesture: movement as

a conveyor of emotional information

a component of an emotional process

Page 4

Expressive gesture analysis: a layered approach (1) (Camurri et al., 2004, 2005)

The layers, from the highest to the lowest:

Modelling techniques: prediction of emotions (e.g., multiple regression, neural networks, decision trees)

Techniques for gesture segmentation and for the representation of gestures as trajectories in virtual, expressive spaces

Video and audio processing techniques (computer vision techniques on the incoming images, signal processing on audio signals)

Physical signals and video and audio pre-processing techniques (e.g., motion detection, audio filtering); a minimal sketch of this lowest layer follows
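The experiments themselves use Eyesweb blocks; purely as an illustration of what the lowest layer does, here is a minimal frame-differencing sketch in Python (the numpy-based implementation and the threshold value are assumptions of this sketch, not part of the platform):

```python
import numpy as np

def motion_mask(prev_frame: np.ndarray, frame: np.ndarray,
                threshold: int = 25) -> np.ndarray:
    """Binary mask of pixels that changed between two grayscale
    frames (uint8 arrays of equal shape)."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

def motion_amount(prev_frame: np.ndarray, frame: np.ndarray) -> float:
    """Fraction of pixels in motion: a crude overall activity cue
    that higher layers could build on."""
    return float(motion_mask(prev_frame, frame).mean())
```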

Page 5

Expressive gesture analysis: a layered approach (2)

Not only analysis but also synthesis of expressive gesture

Experiments on expressive gesture are carried out with the Eyesweb open software platform (Camurri et al., 2000)

Perspective: mapping information about users' behaviour onto the real-time generation of expressive behaviour of virtual agents such as ECAs

Page 6

Our activity in HUMAINE

Expressive gesture as a component of an emotional process

The component-process model of emotion proposed by Klaus Scherer (GERG) has been investigated (Scherer, 1984, 2000; Scherer and Zentner, 2001)

We used the motor activation component to evaluate the emotional engagement of users exposed to emotional stimuli

Page 7

Music, emotion and movement

Research in collaboration with Professor Klaus Scherer’s group (GERG, Geneva Emotion Research Group)

Aim: to investigate the relationship between emotions induced by musical stimuli and movement

Pilot experiment: are there correlations between the emotional characterizations of music excerpts and human movement?

Page 8

Continuous measures of emotions

Music as an induction technique

Music and emotion: a time-varying relationship

Several indicators (problem: conscious vs. unconscious measurements):

Verbal report

Physiological measures

Coding of non-verbal behavior; subject interfaces: from sliders to multimodal interfaces

Idea: the laser pointer as a semi-conscious interface through whose movement an emotional experience generated by music can be communicated

Page 9

A pilot experiment (1)

20 subjects were equipped with a laser pointer and asked to move it on a white wall in front of them while listening to music excerpts

Stimuli: a set of classical music excerpts provided by GERG

Grouped into four characterizations defined on the basis of valence and energy: slow positive, slow negative, fast positive, and fast negative

Page 10

A pilot experiment (2)

Method

Each subject listened to four music excerpts, one for each emotional characterization

The trajectories performed by the subjects moving the laser pointer on the wall were recorded

A questionnaire was used to indicate the emotions felt by the subjects while listening to the music

Page 11

Preliminary analysis

Aim: to look for correlations between the features of the trajectories performed by subjects with the laser pointer and the emotional characterization of the music excerpt a subject was listening to

Global and static analysis: integration of the laser trajectories over time

For each video file with the movement of the laser, a bitmap is obtained summarizing the trajectory followed during the whole listening

Each bitmap represents a graphical subject response (GSR) to the listening of a single music excerpt

Is it possible to separate the GSRs into classes and to verify whether these classes can be correlated with the characterizations of the music excerpts?

CLUSTERING ANALYSIS
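As an illustration of the integration step above, a minimal sketch, assuming the laser video is available as a sequence of grayscale frames (the frame format and the brightness threshold are assumptions; the actual processing was done with Eyesweb patches):

```python
import numpy as np

def integrate_trajectory(frames, threshold: int = 200) -> np.ndarray:
    """Accumulate bright laser-dot pixels over all frames into a
    single binary bitmap summarizing the whole trajectory (the GSR).

    `frames` is an iterable of grayscale uint8 arrays of equal shape;
    the threshold separating the laser dot from the wall is an
    assumption of this sketch."""
    bitmap = None
    for frame in frames:
        mask = frame > threshold               # bright laser dot
        bitmap = mask if bitmap is None else (bitmap | mask)
    return bitmap
```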

Page 12

Extraction of global trajectories

Patch summarizing the path followed by the laser pointer (Eyesweb platform)

..\Presentazione\Presentazione.eyw

Page 13

Identification and measurement of trajectory features

To identify a collection of descriptors related to specific features of the trajectory patterns: angularity, rarefaction, spatial occupation, vertical symmetry, horizontal symmetry, central symmetry, compactness, lateral location, vertical location, angular tendency, spatial extension

To provide measures for the relevant trajectory features: manual annotation with unambiguous criteria; each pattern is rated from 0 to 4 with respect to each specific feature by five evaluators

Page 14

An example: angularity

The trajectories drawn by the laser can be smooth (0) or angular (4) 

  Smooth trajectory: wavy, soft lines

Angular trajectory: direct, sharp, nervous lines

[Figure: example bitmaps of a smooth trajectory and an angular trajectory]
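In the experiment, angularity was rated manually by the evaluators; purely as an illustration, an automatic proxy could measure the turning angles along a sampled trajectory (a hypothetical measure, not the one used in the study):

```python
import numpy as np

def angularity_proxy(points: np.ndarray) -> float:
    """Mean absolute turning angle (radians) along a trajectory of
    (x, y) points: near 0 for smooth, wavy paths, larger for direct,
    sharp, nervous ones."""
    v = np.diff(points, axis=0)                # segment vectors
    headings = np.arctan2(v[:, 1], v[:, 0])    # segment directions
    turns = np.diff(headings)
    # wrap angle differences into (-pi, pi]
    turns = (turns + np.pi) % (2 * np.pi) - np.pi
    return float(np.mean(np.abs(turns)))
```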

Page 15

An example: rarefaction

The pattern can be thick and intense (0) or rarefied (4)

Measure: white pixels / total pixels in the bounding rectangle

Thick trajectory: high degree of filling of the occupied space

Rarefied trajectory: low degree of filling of the occupied space

[Figure: example bitmaps of a thick trajectory and a rarefied trajectory]
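This formula translates directly into code. A minimal sketch, assuming the GSR bitmap marks trajectory pixels as True against a white background (black-on-white rendering, so a higher white fraction means a more rarefied pattern):

```python
import numpy as np

def rarefaction(bitmap: np.ndarray) -> float:
    """White pixels / total pixels in the bounding rectangle of the
    trace. `bitmap` is boolean, True where the trajectory passed."""
    ys, xs = np.nonzero(bitmap)
    if len(xs) == 0:
        return 1.0                         # empty pattern: all white
    box = bitmap[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    white = box.size - np.count_nonzero(box)
    return white / box.size
```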

Page 16

Statistical analysis: mean of all the ratings of all the features for the four emotions

Useful for deciding how to set up the clustering analysis; a sketch of the averaging step follows
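A minimal sketch of this averaging, assuming the evaluator-averaged ratings are stored as a (pattern, feature) array with one characterization label per pattern (the data layout and names are hypothetical):

```python
import numpy as np

def mean_ratings_per_emotion(ratings: np.ndarray,
                             labels: np.ndarray) -> dict:
    """Mean rating of every feature for each emotional
    characterization. `ratings` has shape (n_patterns, n_features)
    with values 0-4 averaged over the five evaluators; `labels`
    holds one of the four characterizations per pattern."""
    return {emotion: ratings[labels == emotion].mean(axis=0)
            for emotion in np.unique(labels)}
```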

Page 17

Hypotheses to be verified during the clustering analysis

Angularity, rarefaction, and compactness seem to explain the motor activation observed in this static and global analysis: they are the critical features

Slow patterns: low angularity, high rarefaction, low compactness

Fast patterns: high angularity, low rarefaction, high compactness

Page 18

Clustering global trajectories

An Eyesweb patch with a block implementing the K-means algorithm

Aim: to verify whether the grouping creates clusters that are consistent with the emotional characterizations of the music excerpts used to induce the emotions in the subjects

Choice of the best type of clustering: two clusters, three features

Two different classifications: fast/slow and positive/negative (a generic k-means sketch follows)
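The actual clustering runs inside the Eyesweb patch; as a generic illustration of the "two clusters, three features" set-up, a plain k-means over the feature vectors (angularity, rarefaction, compactness), not the actual Eyesweb block:

```python
import numpy as np

def kmeans(X: np.ndarray, k: int = 2, iters: int = 100,
           seed: int = 0) -> np.ndarray:
    """Plain k-means over the (n_patterns, n_features) matrix of
    GSR features; returns one cluster label per pattern."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # assign each pattern to its nearest center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # recompute centers, keeping the old one if a cluster empties
        new_centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j)
            else centers[j]
            for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels
```

With k=2 over the three critical features, the resulting labels can be compared against the fast/slow and positive/negative characterizations of the excerpts.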

Page 19

Results

Fast/slow patterns are explained by angularity only

Positive and negative patterns do not distinguish themselves from each other

Subjects, moving the laser pointer, synchronize with the rhythm of the excerpts

If the tempo of the music increases, the velocity of the arm movement increases, as does the frequency of direction changes

There could be a correlation between the characteristics of the music listened to and the movement performed

Resonance between music and motor activation

Page 20

Future developments

Dynamic analysis of the laser pointer trajectories: how they can be correlated with the musical structure at different time scales

Aim: to discover how rules can be established to recognize users' emotions

Possible perspective: to contribute to defining the role of attention in emotion-oriented systems such as ECAs

Page 21

Applications

Motor rehabilitation

Multimedia content analysis through novel affective interfaces (e.g., mobiles, embedded systems, new media)

Music industry: music information retrieval from huge databases based on emotional responses

Artistic and musical applications

Cultural applications, museums, and science centers

Page 22

References

1. Camurri, A., Hashimoto, S., Ricchetti, M., Trocca, R., Suzuki, K., and Volpe, G. (2000), "Eyesweb – Toward Gesture and Affect Recognition in Interactive Dance and Music Systems", Computer Music Journal, 24:1, pp. 57-69, MIT Press, Spring 2000.

2. Camurri, A., Mazzarino, B., Ricchetti, M., Timmers, R., and Volpe, G. (2004), "Multimodal Analysis of Expressive Gesture in Music and Dance Performances", in A. Camurri, G. Volpe (Eds.), Gesture-based Communication in Human-Computer Interaction, LNAI 2915, Springer Verlag, 2004.

3. Camurri, A., De Poli, G., Leman, M., and Volpe, G. (2005), "Communicating Expressiveness and Affect in Multimodal Interactive Systems", IEEE MultiMedia, January-March 2005, pp. 43-53.

4. Scherer, K.R. (1984), "On the nature and function of emotion: a component process approach", in K.R. Scherer & P. Ekman (Eds.), Approaches to emotion (pp. 293-317). Hillsdale, NJ: Erlbaum.

5. Scherer, K.R. (2000), "Emotions as episodes of subsystem synchronization driven by nonlinear appraisal processes", in M. Lewis & I. Granic (Eds.), Emotion, Development, and Self-Organization (pp. 70-99). New York/Cambridge: Cambridge University Press.

6. Scherer, K.R., and Zentner, M.R. (2001), "Emotional effects of music: production rules", in P.N. Juslin & J.A. Sloboda (Eds.), Music and emotion: Theory and research (pp. 361-392). Oxford: Oxford University Press.