Steerable Interfaces for Interactive Environments
Steerable Interfaces for Interactive Environments
Stanislaw Borkowski
Thesis director: James L. Crowley
Jury: Andreas Butz (UM), Joëlle Coutaz (UJF), Alex Pentland (MIT), Pierre Wellner (IDIAP)
Institut National de Recherche en Informatique et Automatique
INRIA Rhône-Alpes, June 26, 2006
2
What is a user interface?
User Interface (UI): an aggregate of physical entities, or of information bound to these entities
3
Mobile UIs
Portable UI: can be relocated; its position is directly controlled through physical contact
Steerable UI: can be relocated in space; its position is mediated by the computer system
(diagram: portable UIs and steerable UIs as subsets of mobile UIs)
4
Mobility in current IT
Steerable interfaces: conventional GUI (steerable output); X11 session teleporting [Richardson93]
Portable interfaces: wearable computers, cell phones, Personal Digital Assistants, laptops, …
5
Mobility in ambient computing
Multiple displays embedded in the environment
Large-size displays
Mobile interaction resources, both portable and steerable
[Pinhanez01] [Streitz99] [Arias00]
6
Why steerable?
Flexibility in resource usage
New forms of human-computer interaction
New forms of human-human interaction
7
Current situation – summary
Problem: need for steerable UIs; no predictive models
Solution: provide enabling technology; explore interaction techniques; evaluate the value of steerable UIs
8
Outline
Mobility in IT
Steerable UIs
Mobile projected UI
Mobile UIs for collaborative work
Conclusions
9
State of the art
EasyLiving [Brumitt00]
Tic-Tac-Toe [Pinhanez05]
10
State of the art – limitations
UI is observable only at standstill
Limited spatial controllability: only predefined locations; planar surfaces only
Requirements for steerable UIs: continuous observability and controllability
11
Outline
Mobility in IT
Steerable UIs
Mobile projected UI
  Prototype implementation [in collaboration with J. Letessier]
  Evaluation – latency estimation
Mobile UIs for collaborative work
Conclusions
12
The Steerable Camera Projector (2002)
Other steerable projection systems:
The Everywhere Display (IBM, 2000)
Fluid Beam (Fluidum consortium, 2002)
SCP from Karlsruhe (UKA, 2004)
13
Steerable display (2002)
14
User-centric approach
End-users: latency below 50 ms; easy setup, no maintenance; reliability/predictability
Developers: abstraction (be relevant); isolation (allow integration); contract (offer quality of service)
15
Pragmatic approach
Black-box services
BIP (Basic Interconnection Protocol); the BIP implementation ≈ SOA middleware:
service/service and service/application communication
service discovery (standards-based)
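The slides don't detail BIP's wire format; as a rough illustration of the black-box service idea only, a minimal newline-delimited text service over TCP might look like the sketch below. The service name, port, and the DESCRIBE verb are all invented for illustration.

```python
import socket
import threading

SERVICE_NAME = "SCPdisplay"  # hypothetical service name
PORT = 5005                  # assumed port; real discovery is standards-based

def handle_client(conn: socket.socket) -> None:
    """Answer newline-delimited text requests, one request per line."""
    with conn, conn.makefile("rw") as stream:
        for line in stream:
            request = line.strip()
            if request == "DESCRIBE":        # invented request verb
                stream.write(f"service {SERVICE_NAME}\n")
            else:
                stream.write("ERROR unknown request\n")
            stream.flush()

def serve() -> None:
    """Accept clients forever, one thread per connection (black box)."""
    with socket.create_server(("", PORT)) as server:
        while True:
            conn, _addr = server.accept()
            threading.Thread(target=handle_client, args=(conn,),
                             daemon=True).start()
```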
16
Interactive system
Diagram: Application ↔ SCP software ↔ Human and Environment
Display orders flow from the application to the SCP software; interaction events flow back
17
Interactive system
Diagram: Application ↔ SCP software (SCPdisplay, SCPcontroller, Frame grabber, Interaction detector, SCPcalibrator) ↔ Human and Environment
18
Interactive system
Diagram: Application ↔ SCP software (SCPdisplay, SCPcontroller, Frame grabber, Interaction detector, SCPcalibrator) ↔ Human and Environment
19
Projection on arbitrarily oriented planar surfaces
Diagram: video projector (light source, source image), screen, user's perception
20
Projection on arbitrarily oriented planar surfaces
Diagram: SCPdisplay warps the image to project before projection, so that the user's perception matches the source image
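A minimal sketch of the pre-warping step, assuming the four corners of the display surface are already known in projector coordinates (the corner values below are invented); OpenCV's perspective tools do the work:

```python
import cv2
import numpy as np

def prewarp(source: np.ndarray, corners_proj: np.ndarray,
            proj_size: tuple[int, int]) -> np.ndarray:
    """Warp the source image so that, once projected, it appears
    undistorted on an obliquely oriented planar surface."""
    h, w = source.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    # Homography mapping the source rectangle onto the quadrilateral
    # that the display surface occupies in projector coordinates.
    H = cv2.getPerspectiveTransform(src, corners_proj.astype(np.float32))
    return cv2.warpPerspective(source, H, proj_size)

# Hypothetical corner positions of the surface in projector coordinates.
corners = np.array([[120, 80], [900, 140], [880, 700], [100, 640]])
# frame = prewarp(cv2.imread("ui.png"), corners, (1024, 768))
```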
21
Projection on arbitrarily oriented planar surfaces
Images: the image to project and the user's view
22
Interactive system
Diagram: Application ↔ SCP software (SCPdisplay, SCPcontroller, Frame grabber, Interaction detector, SCPcalibrator) ↔ Human and Environment
23
Sensor-centric environment model
24
Display surface detection
25
The Portable Display Surface
26
Interactive system
Diagram: Application ↔ SCP software (SCPdisplay, SCPcontroller, Frame grabber, Interaction detector, SCPcalibrator) ↔ Human and Environment
27
Interactive widgets projected on a portable display surface
28
Luminance-based button widget
29
Touch detection
Locate the widget in the camera image via the UI-to-camera homography $H_{IC}$
Estimate occlusion from the luminance balance $\Delta L(t) := L_o(t) - L_i(t)$
Update the widget state
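A sketch of this loop, assuming the UI-to-camera homography H_IC is known and occlusion is estimated from the drop in mean luminance over the widget region (the threshold value is made up):

```python
import cv2
import numpy as np

OCCLUSION_THRESHOLD = 30.0  # assumed luminance drop, in gray levels

def widget_luminance(frame_gray: np.ndarray, corners_cam: np.ndarray) -> float:
    """Mean luminance inside the widget's quadrilateral in the camera image."""
    mask = np.zeros(frame_gray.shape, dtype=np.uint8)
    cv2.fillConvexPoly(mask, corners_cam.astype(np.int32), 255)
    return float(cv2.mean(frame_gray, mask=mask)[0])

def update_button(frame_gray, corners_ui, H_ic, reference_luminance):
    """One touch-detection step: locate, estimate occlusion, update state."""
    # 1. Locate the widget in the camera image via the homography H_IC.
    corners_cam = cv2.perspectiveTransform(
        corners_ui.reshape(-1, 1, 2).astype(np.float32), H_ic).reshape(-1, 2)
    # 2. Estimate occlusion: a hand over the projected button darkens it.
    delta = reference_luminance - widget_luminance(frame_gray, corners_cam)
    # 3. Update the widget state.
    return delta > OCCLUSION_THRESHOLD  # True means "pressed"
```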
30
Robustness to clutter
31
Assembling occlusion detectors
32
Striplet – the occlusion detector
(diagram: striplet support in the x–y plane; response R)
33
Striplet-based SPOD
SPOD – Simple-Pattern Occlusion Detector
34
Striplet-based button
35
SPOD-based calculator
Accelerated video
36
Outline
Mobility in IT
Steerable UIs
Mobile projected UI
  Prototype implementation
  Evaluation – latency estimation
Mobile UIs for collaborative work
Conclusions
37
Latency estimation
Pipeline: PCI A/D converter → frame grabber → CPU (Imalab shell, image processing) → graphics card (OpenGL render)
Latency: $l = t_p - t_0$, where $t_0$ is the time of the physical event and $t_p$ the time its projected response appears
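One way to recover l = t_p − t_0 from the captured video, sketched under the assumption that the x-positions of the physical bar and of its projection have already been extracted per frame, and that the reference camera runs at 25 fps (a made-up figure):

```python
import numpy as np

FPS = 25.0  # assumed frame rate of the reference camera

def latency_ms(bar_x: np.ndarray, projection_x: np.ndarray) -> float:
    """Estimate latency as the frame lag that best aligns the projected
    bar's trajectory with the physical bar's trajectory."""
    best_lag, best_err = 0, np.inf
    for lag in range(len(bar_x) // 2):
        # Shift the projection back by `lag` frames and compare.
        err = np.mean((bar_x[:len(bar_x) - lag] - projection_x[lag:]) ** 2)
        if err < best_err:
            best_lag, best_err = lag, err
    return best_lag * 1000.0 / FPS  # l = t_p - t_0, in milliseconds
```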
38
Latency estimation – setup
Diagram: a fan (driven by a regulated power supply) rotates a plastic bar; the pipeline (PCI A/D converter → frame grabber → CPU: Imalab shell, image processing → graphics card: OpenGL render) projects the bar; a video sequence capture records the bar and the projection of the bar
39
Latency estimation – results
Acquisition (A): ~17 ms
Pipeline without image processing (PCI A/D converter → frame grabber → CPU: Imalab shell → graphics card): ~32 ms
Full pipeline with image processing and OpenGL render: ~70 ms
Plus up to 51 ms of additional delay!
40
Interactive system
Diagram: Application ↔ SCP software (SCPdisplay, SCPcontroller, Frame grabber, Interaction detector, SCPcalibrator) ↔ Human and Environment
41
Outline
Mobility in IT
Steerable UIs
Mobile projected UIs
Mobile UIs for collaborative work
  ContAct application
  User study – comparison of different take-over techniques
Conclusions
42
ContAct – a system for authoring presentations
Collaboration through interface mobility
43
ContAct application setup
Wide-angle camera
Tabletop camera
Steerable Camera Projector
Portable Display Surface
44
ContAct application GUI
45
Outline
Mobility in IT
Steerable interface prototype
Mobile UIs for collaborative work
  ContAct application
  Taking control: a comparative user study [in collaboration with J. Maisonnasse and J. Letessier]
Conclusions
46
Evaluation of techniques for taking control
Objectives:
Determine the preferred control-taking technique
Evaluate the impact on task-completion performance
Evaluate user acceptance of steerable interfaces
47
Experimental setup
GUI
Users
Hardware:
Steerable Camera Projector
Microphone headsets
Portable Display Surface
Software:
Speech detector [D. Vaufreydaz]
Conversation modeling [J. Maisonnasse]
Finger tracking [J. Letessier]
PDS tracking
Drawing application
48
The User Interface
49
The task
Collaborative reconstruction of a graph
50
The task
Collaborative reconstruction of a graph
(diagram: User 1, User 2, User 3)
51
Experimental conditions
Proposed techniques for taking control:
Baseline: fixed interface
Portable: PDS
Steerable: touch-based steering
Steerable: voice-based steering
52
Fixed interface
GUI
Users
53
Explicit direct manipulation
54
Explicit touch-based steering
55
Implicit voice-based steering
Rules controlling the interface location (sketched below):
The interface is steered toward the “main speaker”
Interruptions are ignored
Drawing inhibits vocal steering
Conflicts result in loss of interface control
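A toy sketch of these rules in Python; the speaker list, interruption flag, and drawing flag stand in for the outputs of the speech detector and conversation model and are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class State:
    drawing: bool = False        # someone is drawing on the interface
    ui_owner: str | None = None  # user the interface currently faces

def steer(state: State, speakers: list[str], interruption: bool) -> str | None:
    """Apply the four steering rules; returns the new interface owner."""
    if state.drawing:
        return state.ui_owner         # drawing inhibits vocal steering
    if interruption:
        return state.ui_owner         # interruptions are ignored
    if len(speakers) > 1:
        state.ui_owner = None         # conflict: loss of interface control
    elif speakers:
        state.ui_owner = speakers[0]  # steer toward the main speaker
    return state.ui_owner
```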
56
Subjects
12 groups of 3 people: 13 women, 23 men
Average age: 27.7
19 IT experts; 17 subjects familiar with IT
57
Results – user preference
Chart: mean preference rank per technique (fixed, button, PDS, voice), experts vs. non-experts
Rank scale: 1 = most liked, 4 = least liked
58
Results – PDS
Experts: fun to use; predictable
Non-experts: less intuitive; less reactive; not well suited for the task
59
Results – voice-based control
Experts: intimidating; limits collaboration
Non-experts: fun to use; enhances collaboration
Both groups: modified their behaviour; found it the least predictable
60
Example result
61
User performance – ability to duplicate
Chart: % of remembered elements (0–100) per technique (fixed, button, PDS, voice), experts vs. non-experts
62
Outline
Mobility in IT
Steerable UIs
Mobile projected UI
Mobile UIs for collaborative work
Conclusions
63
Conclusions 1/2
The steerable camera-projector pair enables mobile UIs:
Portable UIs (the PDS)
Steerable UIs
64
Conclusions 2/2
UI mobility can enhance the collaborative experience
Explicit control is preferred over implicit control
65
Future directions 1/2
The SCP: adapting to display-surface texture
The PDS: tracking of and interaction with multiple PDSs; high frame-rate tracking
Vision-based projected widgets: integration of multiple occlusion detectors
66
Future directions 2/2
Steerable interfaces:
Other applications for steerable interfaces
Alternative methods for controlling the interface location
Exploring links with plastic interfaces – dynamic interface adaptation
Creation of a “space manager”
67
Thank you for your attention
68
69
Results
Preference ranking: #1 the PDS, #2 button-based control, #3 voice-based control, #4 fixed interface
Chart: distribution of ranks (1–4) per technique (fixed, button, PDS, voice)
70
Sensor-centric environment model (diagram)
71
SPOD software components
Diagram: Frame Grabber; Client Application (calibration, GUI rendering, GUI); Striplets Engine and VEIL, together forming the SPOD
72
Striplet – the occlusion detector
Response: $R(t) = \iint f_{\mathrm{gain}}(x, y)\, L(x, y, t)\, dx\, dy$
Zero-integral gain constraint: $\iint f_{\mathrm{gain}}(x, y)\, dx\, dy = 0$
(diagram: gain profile along x and y)
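In discrete form the response becomes a zero-mean weighted sum of pixel luminances over the striplet's support. A minimal sketch; the particular gain profile (positive central band, negative flanks) is an assumption consistent with the zero-integral constraint:

```python
import numpy as np

def striplet_gain(height: int, width: int) -> np.ndarray:
    """Gain function f_gain with zero integral: a positive central band
    flanked by negative regions."""
    f = -np.ones((height, width))
    band = height // 3
    f[band:2 * band, :] = 2.0  # central band outweighs the flanks
    return f - f.mean()        # enforce sum(f_gain) == 0 exactly

def striplet_response(patch: np.ndarray, gain: np.ndarray) -> float:
    """Discrete R(t): zero on uniform luminance, non-zero when an
    occluder crosses the central band of the striplet."""
    return float(np.sum(gain * patch))
```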
73
VEIL – Vision Events Interpretation Layer
Diagram: Striplets Engine and VEIL inside the SPOD
Inputs: widget coordinates; scale and UI-to-camera mapping matrix; striplet occlusion events
Outputs: interaction events; striplet coordinates
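The slide only names VEIL's inputs and outputs; below is a sketch of what the interpretation step might look like, with the event record and the press/release mapping invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class OcclusionEvent:
    striplet_id: int
    occluded: bool  # raw occlusion state from the Striplets Engine

class Veil:
    """Maps low-level striplet occlusion events to widget-level
    interaction events (press / release)."""
    def __init__(self, striplet_to_widget: dict[int, str]):
        self.striplet_to_widget = striplet_to_widget
        self.pressed: set[str] = set()

    def interpret(self, event: OcclusionEvent) -> str | None:
        widget = self.striplet_to_widget.get(event.striplet_id)
        if widget is None:
            return None
        if event.occluded and widget not in self.pressed:
            self.pressed.add(widget)
            return f"press:{widget}"
        if not event.occluded and widget in self.pressed:
            self.pressed.discard(widget)
            return f"release:{widget}"
        return None
```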
76
Striplets Engine Service
Diagram: Striplets Engine and VEIL inside the SPOD
Inputs: striplet UI-coordinates; UI-to-camera mapping matrix; images from the camera service
Outputs: occlusion events
79
VEIL – Vision Events Interpretation Layer
Diagram: Striplets Engine and VEIL inside the SPOD
Inputs: widget coordinates; scale and UI-to-camera mapping matrix; striplet occlusion events
Outputs: interaction events; striplet coordinates
80
Striplet-based slider
81
Tracking the PDS
Tracking edges in the Hough space (sketched below)
+ Naturally robust to partial occlusions
- High computation cost
Line-segments-based tracking
+ Efficient quadrilateral detection
- Difficulties in handling occlusions
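A minimal sketch of the Hough-space side with OpenCV (the Canny and Hough thresholds are placeholder values): whole edges vote in (rho, theta) space, which is why partial occlusion of an edge still leaves its line detectable; the PDS corners then come from intersecting the detected lines.

```python
import cv2
import numpy as np

def hough_lines(frame_gray: np.ndarray) -> np.ndarray:
    """Detect strong lines as (rho, theta) pairs in the edge map.
    A partially occluded edge still accumulates enough votes."""
    edges = cv2.Canny(frame_gray, 50, 150)  # thresholds are placeholders
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 120)
    return lines[:, 0, :] if lines is not None else np.empty((0, 2))

def line_intersection(l1, l2):
    """Intersect two (rho, theta) lines; the four PDS corners come from
    intersecting roughly perpendicular pairs of detected lines."""
    (r1, t1), (r2, t2) = l1, l2
    A = np.array([[np.cos(t1), np.sin(t1)], [np.cos(t2), np.sin(t2)]])
    if abs(np.linalg.det(A)) < 1e-6:
        return None  # nearly parallel lines have no useful intersection
    return np.linalg.solve(A, np.array([r1, r2]))
```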
82
Pushing vs. pulling the UI
83
Results
Time performance chart: normalized trial time (0–1) vs. trial number (1–4)
84
Performance
• ±180° of pan; 1600 discrete positions (resolution); max pan speed of 90°/s, reached in 0.75 s
• 90° of tilt; 500 discrete positions; max tilt speed of 80°/s, reached in 0.60 s
Hardware: video projector, camera, pan stepper motor, tilt stepper motor, control and power supply
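From these figures the angular resolution follows directly: 360°/1600 ≈ 0.225° per pan position and 90°/500 = 0.18° per tilt position. A small conversion helper, assuming position 0 corresponds to one end of each range:

```python
PAN_RANGE_DEG, PAN_STEPS = 360.0, 1600   # +/-180 deg of pan
TILT_RANGE_DEG, TILT_STEPS = 90.0, 500   # 90 deg of tilt

def pan_to_position(angle_deg: float) -> int:
    """Map a pan angle in [-180, 180] to a discrete motor position
    (resolution 360/1600 = 0.225 deg per position)."""
    assert -180.0 <= angle_deg <= 180.0
    return round((angle_deg + 180.0) / PAN_RANGE_DEG * (PAN_STEPS - 1))

def tilt_to_position(angle_deg: float) -> int:
    """Map a tilt angle in [0, 90] to a discrete motor position
    (resolution 90/500 = 0.18 deg per position)."""
    assert 0.0 <= angle_deg <= 90.0
    return round(angle_deg / TILT_RANGE_DEG * (TILT_STEPS - 1))
```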