eNTERFACE ‘08, Project 4: Design and Usability Issues for Multimodal Cues in Interface Design / Virtual Environments


Transcript of the eNTERFACE ‘08 Project 4 presentation slides.

Page 1:

eNTERFACE ‘08: Project 4

Design and Usability Issues for multimodal cues in Interface Design/ Virtual Environments


Page 2:

Project Team

Team Leader:
• Catherine Guastavino, McGill University, Canada

Team Members:
• Emma Murphy, McGill University, Canada
• Charles Verron, France Telecom R&D and Laboratoire de Mécanique et d'Acoustique de Marseille (LMA), France
• Camille Moussette, Umeå Institute of Design, Sweden

Page 3:

Project Proposal

• The main aim of this project was to run usability tests investigating the integration and effectiveness of information delivery between the audio and haptic modalities.

• An interface built around a target-finding task with audio and haptic (touch) feedback was proposed as a suitable test bed for this aim.

• Furthermore, target finding using multimodal cues is relevant to the field of New Instrument Design and to the wider field of Human-Computer Interaction.

Page 4:

Literature

• Various studies have indicated that the use of non-speech audio and haptics can help improve access to graphical user interfaces (Mynatt and Weber, 1994; Ramstein et al., 1996) by reducing the burden on other channels, such as vision and speech.

• Studies have specifically investigated the use of audio and haptics to convey object location in a spatial structure (Wood et al., 2003; Lahav and Mioduser, 2004; Murphy et al., 2007).

• Previous studies have also investigated the use of 3D audio with gesture for target finding in virtual environments (Marentakis and Brewster, 2005).

Page 5:

Interface Design

• Audio Feedback
  • Non-speech auditory cues
  • Freesound
  • MAX/MSP
  • IRCAM Spat object

• Haptic Feedback
  • PHANTOM OMNI
  • H3D API

Page 6:

Audio-Haptic Design

• The proposed idea was to implement a target-finding task with haptic feedback using the PHANTOM and non-speech audio cues.

• A virtual environment composed of a number of parallel planes was created, with a target located randomly on one of the planes.

• Haptics: a magnetic effect was used to create a rigid surface for the planes and for the target.

• Audio: auditory cues based on a string instrument (a cello) were designed using 3D spatial audio.
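As a rough illustration, the scene just described (a stack of parallel planes with a target on one randomly chosen plane) can be sketched in a few lines of Python. All names and dimensions here are hypothetical; the actual environment was built with the H3D API, not Python:

```python
import random

class PlaneEnvironment:
    """Toy model of the audio-haptic scene: parallel planes with a
    target placed on one randomly chosen plane (illustrative sketch)."""

    def __init__(self, n_planes=4, spacing=0.05, orientation="horizontal"):
        self.orientation = orientation  # "horizontal" or "vertical"
        # plane offsets along the axis perpendicular to the planes
        self.plane_offsets = [i * spacing for i in range(n_planes)]
        self.target_plane = random.randrange(n_planes)
        # target's 2D position within its plane
        self.target_in_plane = (random.uniform(0, 1), random.uniform(0, 1))

    def stylus_plane(self, stylus_pos, tol=0.005):
        """Index of the plane the stylus is touching, or None.
        stylus_pos is (x, y, z); planes stack along y when horizontal,
        along x when vertical."""
        axis = 1 if self.orientation == "horizontal" else 0
        for i, offset in enumerate(self.plane_offsets):
            if abs(stylus_pos[axis] - offset) < tol:
                return i
        return None

env = PlaneEnvironment()
```

In the real system, the magnetic haptic effect would snap the stylus to whichever plane `stylus_plane` reports, so the tolerance plays the role of the magnetic capture distance.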

Page 7:

Audio-Haptic Design

Horizontal and vertical conditions

Audio:
• Crossing the planes
• Target location cue
• Target found cue

Haptics:
• Magnetic effect on the surface of the planes and the target

Page 8:

Audio-Haptic Design

We used the IRCAM Spat object for 3D sound spatialization over headphones, using binaural rendering with the HRTF databases of Martin et al. (1994) and Gardner et al. (1994).

The virtual sound source (the bowed cello sound) is spatialized using the "ears in hand" metaphor.

The virtual sound source is played only when the target and the stylus are located on the same plane (horizontal or vertical, according to the configuration).
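The gating rule on this slide can be stated in a few lines. This is a sketch with hypothetical names; the actual implementation lived in MAX/MSP with the Spat object:

```python
def should_play_source(stylus_offset, target_plane_offset, tol=0.005):
    """Play the spatialized bowed-cello source only while the stylus is
    on the same plane as the target. Offsets are measured along the axis
    perpendicular to the planes (y for horizontal, x for vertical
    configurations); tol stands in for the plane's capture distance."""
    return abs(stylus_offset - target_plane_offset) < tol

# on the target's plane the source plays; off it, silence
on_plane = should_play_source(0.1001, 0.1)      # True
off_plane = should_play_source(0.15, 0.1)       # False
```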

Page 9:

Experimental Design

• Independent variables
  • Feedback: audio-only or audio-haptic
  • Orientation: vertical or horizontal

• Resulting in 4 experimental conditions:
  • Audio-Haptic Vertical
  • Audio-Haptic Horizontal
  • Audio-Only Vertical
  • Audio-Only Horizontal

• Dependent variables
  • Completion times
  • Trajectories
  • Perceived effectiveness and ease of use
  • Cognitive strategies

Page 10:

Experimental Hypotheses

• Hypotheses:

1. Users would find the audio-only condition more difficult to navigate without the support of the haptic planes.

2. A further aim was to investigate the effect of flipping the planes from vertical to horizontal orientation in both the audio-only and audio-haptic conditions.

Page 11:

Demo Video


Page 12:

Experiment: Target Finding Task

• 23 Participants

• Training introduction: users became familiar with the audio-haptic cues using a visual representation of the planes. Users were asked to navigate the planes, first finding the plane containing the target and then locating the target.

• Trial experiment: users were presented with 8 trials, 2 per condition. Users were not given any information about the 3D audio mappings or the haptic feedback.

• Main experiment: 44 trials (11 per condition). Condition and target position (within and across planes) were randomised.
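A randomised trial list of this shape (44 trials, 11 per condition, random target plane per trial) could be generated along these lines; the function and label names are hypothetical, not taken from the project's software:

```python
import itertools
import random

def build_trial_list(n_planes=4, trials_per_condition=11, seed=None):
    """Build a randomised trial list for the 2x2 design:
    11 trials for each feedback x orientation condition, with the
    target plane drawn at random per trial, then the whole list
    shuffled so condition order is randomised too."""
    rng = random.Random(seed)
    conditions = list(itertools.product(
        ("audio-only", "audio-haptic"),   # feedback
        ("horizontal", "vertical")))      # orientation
    trials = [(feedback, orientation, rng.randrange(n_planes))
              for feedback, orientation in conditions
              for _ in range(trials_per_condition)]
    rng.shuffle(trials)
    return trials

trials = build_trial_list(seed=1)  # 44 (feedback, orientation, plane) tuples
```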

Page 13:

Initial Results: Completion Times

Initial completion time analyses confirm hypothesis; audio completion times are significantly longer than audio-haptic condition.Furthermore the vertical condition took significantly longer than the horizontal condition

Page 14:

Initial Results: Interaction effects

Factorial Anova: Significant effect of feedback Audio vs. audio-haptic (p=>.001) Orientation: vertical vs. horizontal (p=>.01)
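For a balanced design like this one, the main-effect F ratios of a two-way factorial ANOVA can be computed by hand. The sketch below is a generic balanced two-way ANOVA on synthetic data, not the project's actual analysis or data:

```python
def two_way_anova_balanced(cells):
    """Main-effect F ratios for a balanced two-factor design.
    `cells` maps (factor_a_level, factor_b_level) -> list of observations
    (e.g. completion times); every cell must have the same size."""
    a_levels = sorted({a for a, _ in cells})
    b_levels = sorted({b for _, b in cells})
    n = len(next(iter(cells.values())))            # observations per cell
    all_vals = [v for vs in cells.values() for v in vs]
    grand = sum(all_vals) / len(all_vals)

    def level_mean(factor, level):
        vals = [v for key, vs in cells.items()
                for v in vs if key[factor] == level]
        return sum(vals) / len(vals)

    # between-group sums of squares for each main effect
    ss_a = n * len(b_levels) * sum((level_mean(0, a) - grand) ** 2
                                   for a in a_levels)
    ss_b = n * len(a_levels) * sum((level_mean(1, b) - grand) ** 2
                                   for b in b_levels)
    # within-cell (error) sum of squares
    ss_within = sum((v - sum(vs) / len(vs)) ** 2
                    for vs in cells.values() for v in vs)
    df_a, df_b = len(a_levels) - 1, len(b_levels) - 1
    df_within = len(all_vals) - len(cells)
    ms_within = ss_within / df_within
    return (ss_a / df_a) / ms_within, (ss_b / df_b) / ms_within
```

In practice one would use a statistics package for this (which also yields the interaction term and p-values); the point here is only to make the "significant main effect of feedback and of orientation" computation concrete.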

Page 15:

Analysis

• Strategies
  • From observation, the most efficient users were those who immediately grasped the structure of the virtual environment and understood the 3D audio cues.
  • Spatial audio: elevation cues were more difficult to perceive without individualized HRTFs.

• Gestures
  • Interesting gestural use of the haptic device.
  • Some participants changed their hand movement according to the position of the planes.
  • One participant had an interesting gestural strategy of recreating the haptic planes with his free hand in the audio-only condition.

• Further Analysis
  • Post-task questionnaires.
  • Trajectories: to further analyse gestural control and to investigate random identifications of the target source.

Page 16:

Demo: Recreating User Trajectories


Page 17:

Future Work

• Further Analysis
  • Post-task questionnaires
  • Trajectories
  • Further quantitative results

• Further Evaluations
  • Develop the cues
  • Implement the experiment using other haptic devices

• Applications
  • Visually impaired users
  • Small-screen devices

Page 18:

Summary

• The aim of this project was to highlight usability issues for audio-haptic cues by conducting user evaluations of a multimodal interface.

• The experiments confirmed our initial hypothesis that the audio-only condition would be more difficult for users to navigate.

• Further analysis will focus on the qualitative comments from the post-task questionnaires, as well as trajectory analysis and gestural movements.

• We intend to develop this study in terms of the audio-haptic cues, use other haptic devices, and extend the perceptual evaluation to investigate other aspects of multimodal integration.