
Computer Science 425

Advanced User Interfaces

Topics in Human-Computer Interaction

Week 05: Context Aware Computing

Prof. Roel Vertegaal, PhD

Add discussion of the four laws of HCI: Hick's law, Fitts' law, the power law of learning (see Landauer in Toward the Year 2000; Marilyn has slides on it).

Week 05: Plan

Thanks to Saul Greenberg and Jason Hong

I. Context-Aware Computing

What is Context-Awareness

Taxonomy

Toolkit

II. Context-Aware Applications

Readings

Abowd, G. and Mynatt, E. Charting past, present, and future research in ubiquitous computing. ACM Transactions on Computer-Human Interaction (TOCHI) 7(1), 2000. Special issue on human-computer interaction in the new millennium, Part 1.

Salber, D., Dey, A. and Abowd, G. The Context Toolkit: Aiding the development of context-enabled applications. In Proceedings of CHI '99, Pittsburgh, PA, May 15-20, 1999. ACM Press.

I. Context-Aware Computing

Computers have extremely limited input

Aware of explicit input only

What's a vole?

Computers are Blind!

Computers too many

In the Judeo-Christian tradition, the term Ivory Tower is a symbol for noble purity. It originates with the Song of Solomon (7:4) ("Your neck is like an ivory tower") and was added to the epithets for Mary in the sixteenth-century Litany of the Blessed Virgin Mary ("tower of ivory", in Latin Turris eburnea).

Modern computers are divorced from reality

Computers do not take into account user interactions with other people or computers.

Context-Aware Computing

Making computers more aware of the physical and social worlds we live in

Allowing them to adapt to our situation

Would a Collaborative Model Apply to Ubiquitous Computing?

Computers have extremely limited input

Aware of explicit input only

Can take a lot of effort to do simple things

Modern computers are divorced from reality

Unaware of who, where, and what around them

Computers do not take into account user interactions with other people or computers.

Context-Aware Computing

Making computers more aware of the physical and social worlds we live in

Mismatch between how people do things and what today's computers can offer

• Screen saver activates at wrong times
• Cell phones interrupt us at bad times

• Modern computers are divorced from reality
• Unaware of who, where, and what around them

Computers have extremely limited input

Aware of explicit input only

Can take a lot of effort to do simple things

What is Context?

Merging of virtual with physical and social

Mostly implicit input, often secondary to task at hand

People / Places / Things / Virtual / Time

People: Location, Identity, Task, Affect

Places: Activity, People, Temperature, Audio

Things: Location, Identity

Virtual: Services, Bandwidth

Time: All of these in the past, present, and future

What is Context?

How to define it?

Workflows?

Intent and Desire?

Objects and People?

Attention?

What isn't context?

Instead, operationalize it

Distributed, multi-device, sensor-based

Fairly well-defined and computable concepts (e.g. location and identity, but not intent or workflow)
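To make "operationalize it" concrete, the sketch below (not from the readings; every field name is invented) shows a context record limited to well-defined, computable attributes such as identity, location, and time:

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import Optional

    # Illustrative context record: only computable attributes (identity,
    # location, coarse activity, time), deliberately not intent or workflow.
    @dataclass
    class ContextEvent:
        entity_id: str                  # who or what was sensed
        entity_kind: str                # "person", "place", "thing", or "virtual"
        location: Optional[str] = None  # e.g. a room name reported by a sensor
        activity: Optional[str] = None  # coarse activity label, if available
        timestamp: datetime = field(default_factory=datetime.now)

    # Example: a badge sighting becomes a computable context event.
    print(ContextEvent(entity_id="roel", entity_kind="person", location="Room 521"))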

Example of Context-Aware Things

Can you help me write down some current systems that may be considered context-aware?

Why Context-Aware Computing?

Context Types              Existing Examples      Human Concern
Room Activity              Smoke Alarm            Safety
Room Activity              Auto Lights On / Off   Convenience
Object Identity            Barcode Scanners       Efficiency
Personal Identity & Time   File Systems           Finding Info
Time                       Calendar Reminders     Memory

• Context-awareness isn't new; it's fundamental to how we build things (tools and applications)

Existing Examples

Why Context-Aware Computing?

Context types: Identity, Time, Location, Proximity, Activity, History

Context type -> human concern: Activity -> Convenience; Activity -> Finding Info; Identity -> Memory; Identity & Time -> Safety; Time -> Efficiency

Existing examples: Smoke Alarm, Auto Lights On / Off, Barcode Scanners, File Systems, Calendar Reminders

Potential examples: Health Alert, Auto Cell Phone Off In Meetings, Service Fleet Dispatching, Tag Photos, Proximal Reminders

•Here are some potential context-aware applications enabled by these new technologies

• Other concerns
• Rapid decision making
• Health for elderly and for young
• Multimodal interaction and disambiguation
• Efficiency
• Safety

Technology Trends

Sensors

GPS, Active Badges, Active Bats

Smart Dust

Cameras and microphones

Recognition algorithms

MSR RADAR: location from 802.11

Smart Floor: footstep force

Wireless technologies

Bluetooth, 802.11, cell phones

•Three trends are changing computer-based context-awareness

• Picture of BATS
• Other recognition algorithms
• Biometrics: speaker identification
• Extracting location from wireless LAN

•Vision recognition (object identification, face recognition)

II. Context-Aware Applications

Active Badge (Want, Olivetti 1992)

Want 1992 (Olivetti Research Lab, Cambridge)

Badges emit infrared signals

Gives rough location + ID

Teleport

Redirect screen output from "home" computer to nearby computer

Phone forwarding

Automatically forward phone calls to nearest phone

Efficient location and coordination of staff in any large organization is a difficult and recurring problem. Hospitals, for example, may require up-to-date information about the location of staff and patients, particularly when medical emergencies arise. In an office building, a receptionist is usually responsible for determining the location of staff members; in some organizations, public-address systems are provided to help a receptionist locate employees but, more frequently, a telephone is used to contact all the possible locations at which the required person might be found. These solutions can cause a great deal of irritation and disruption to other employees; a solution that provides direct location information is more desirable.

Location information for office staff that is available in a computer-readable format can also be used to improve the operation of the office telephone system. Integration of telephone systems with computer systems is also important in the development of the automated office. Much work has already been undertaken integrating digital voice and computer data into a single network [4], but there has been less commercial effort invested in improving the telephone interface. Although these interfaces are functionally sophisticated, they are cryptic and their operation is difficult to remember. The features most commonly used by PBX clients are 'call transfer' and 'call forward'. In most cases the execution of these features could be automated by the PBX if it had information about the current location of its clients.

ParcTab (Want et al. PARC 1993)

http://sandbox.xerox.com/parctab/

The Xerox PARCTAB

Introduction

The PARCTAB system is a research prototype developed at Xerox PARC to explore the capabilities and impact of mobile computers in an office setting. This research is part of PARC's Ubiquitous Computing research program.

The PARCTAB system consists of palm-sized mobile computers that can communicate wirelessly through infrared transceivers to workstation-based applications. Take a look at the hardware specifications or a picture gallery.

A small number of basic principles and assumptions have driven our design:

* Extreme portability. The device is designed to be carried or worn at all times, much like a pager. Its size, weight, and features are intended to promote casual, spur of the moment, computing. For example, it has no power switch and instead automatically turns itself on when a person starts interacting and off after a person has finished interacting.

* Constant connectivity. The system assumes the palm-top unit is always connected to the network infrastructure.

* Location reporting. The location of each PARCTAB is always known to system software.

Sample Context-Aware Apps ParcTabs

Active badge + wireless

Rough location + ID

Proximate selection

Interfaces for nearby objects

Auto-diaries

People, places, and time

Triggers

Alerts on preset events

Reconfiguration

Bind device to room

ParcTabs (Xerox PARC)

Want, Schilit, et al.

Proximate Selection

Proximate selection is a user interface technique where the located-objects that are nearby are emphasized or otherwise made easier to choose. In general, proximate selection involves entering two variables, the "locus" and the "selection." However, of particular interest are user interfaces that automatically default the locus to the user's current location.

There are at least three kinds of located-objects that are interesting to select using this technique. The first kind is computer input and output devices that require co-location for use. This includes printers, displays, speakers, facsimiles, video cameras, thermostats, and so on. The second kind is the set of objects that you are already interacting with, and which need to be addressed by a software process. This includes people in the same room to whom you would like to "beam" a document. The third kind is the set of places one wants to find out about: restaurants, night clubs, gas stations, and stores, or more generically, exits and entrances. Consider an electronic yellow pages directory that, instead of the "city" divisions of information, sorts represented businesses according to their distance from the reader.

Location information can be used to weight the choices of printers that are nearby. Figure 1 shows proximate selection dialogs for printers using three columns: the name of the printer, the location, and a distance from the user. One interface issue is how to navigate dialogs that contain this additional location information. For example, should dialogs use the familiar alphabetical ordering by name, or should they be ordered by location? Shown here are (a) alphabetical ordering by name; (b) ordered by proximity; (c) alphabetical with nearby printers emphasized; (d) alphabetical with selections scaled by proximity, something like a perspective view.
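The printer dialogs described above are easy to prototype. Here is a minimal sketch of proximate selection (not the ParcTab code; printer names, rooms, and distances are invented), showing alphabetical ordering versus ordering by proximity, with nearby printers emphasized:

    # Minimal proximate-selection sketch: the "locus" defaults to the user's
    # current location, and distances (in metres) are assumed precomputed.
    printers = [
        ("clover", "Room 2200", 42.0),
        ("daisy",  "Room 2107", 5.0),
        ("rose",   "Room 1100", 120.0),
    ]

    by_name = sorted(printers, key=lambda p: p[0])      # (a) alphabetical by name
    by_distance = sorted(printers, key=lambda p: p[2])  # (b) ordered by proximity

    # (c) alphabetical, with nearby printers (< 10 m) emphasized with a marker
    for name, room, dist in by_name:
        marker = "*" if dist < 10 else " "
        print(f"{marker} {name:8s} {room:10s} {dist:6.1f} m")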

Sample Context-Aware Apps ParcTabs

Active badge + wireless

Rough location + ID

Proximate selection

Interfaces for nearby objects

Auto-diaries

People, places, and time

Triggers

Alerts on preset events

Reconfiguration

Bind device to room

Explain only subject headings: rest comes later

Forget-me-not system

http://www.xrce.xerox.com/publis/cam-trs/html/epc-1994-103.htm

Sample Context-Aware Apps ParcTabs

Active badge + wireless

Rough location + ID

Proximate selection

Interfaces for nearby objects

Auto-diaries

People, places, and time

Triggers

Alerts on preset events

Reconfiguration

Bind device to room

"Like living in a rule-based expert system!"

Using predicate logic for declarative programming of contextual triggers.

Context-Aware Computing Applications (Bill Schilit, Norman Adams, Roy Want)

Entries are of the form: badge location event-type action

The badge and location are strings that match the badge wearer and sighting location. The event-type is a badge event type: arriving, departing, settled-in, missing, or attention. When a matching event occurs, Watchdog invokes the action with a set of Unix environment variables as parameters. These include the badge owner, owner's office, sighting location, and the name of the nearest host. For example:

Coffee Kitchen arriving "play -v 50 ~/sounds/rooster.au"
schilit * attention "emacs -display $NEARESTHOST:0.0"

The first example monitors the "coffee" badge, which is attached to the coffee maker in the kitchen, and plays the rooster sound whenever anyone makes coffee. The second starts an Emacs window at a nearby host whenever the attention signal is received.
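A Watchdog-style trigger engine can be sketched in a few lines. The following is an illustration of the badge / location / event-type / action pattern described above, not the original Unix implementation; the actions are placeholders that just print what they would do:

    import fnmatch

    # Each rule: (badge pattern, location pattern, event type, action).
    # "*" wildcards match any badge or location, as in the Watchdog entries above.
    rules = [
        ("coffee",  "kitchen", "arriving",  lambda ev: print("play rooster.au")),
        ("schilit", "*",       "attention",
         lambda ev: print(f"emacs -display {ev['nearest_host']}:0.0")),
    ]

    def dispatch(event):
        """Invoke the action of every rule matching a badge sighting event."""
        for badge, location, etype, action in rules:
            if (fnmatch.fnmatch(event["badge"], badge)
                    and fnmatch.fnmatch(event["location"], location)
                    and event["type"] == etype):
                action(event)

    dispatch({"badge": "schilit", "location": "room-35", "type": "attention",
              "nearest_host": "ivory"})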

Cyberguide (Georgia Tech)

GPS or infrared tracking

Fairly precise location

Display location on screen

Predefined points of interest

Automatically pop up if nearby

Travel journal

Keep log of places seen and photographs taken

Sample Context-Aware Apps Enhanced PDAs

Voice memo recording

Hold like phone near mouth to start recording

Portrait / Landscape mode

Just physically rotate screen

Tilt scrolling

Tilt instead of scrollbars (sketched in code after this slide)

Power management

Turn on if being held and tilted

Microsoft Research, Hinckley et al.
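The tilt-scrolling idea can be made concrete with a small mapping from tilt angle to scroll speed. This is an illustrative sketch only, not Hinckley et al.'s implementation; the dead-zone and saturation thresholds are invented:

    # Map the device's forward/back tilt (from an accelerometer) to a signed
    # scroll speed, with a dead zone so hand tremor does not cause scrolling.
    DEAD_ZONE_DEG = 5.0   # invented threshold
    MAX_TILT_DEG = 30.0   # tilt at which scroll speed saturates
    MAX_SPEED = 20.0      # lines per second at full tilt

    def scroll_speed(pitch_deg: float) -> float:
        """Return the scroll speed in lines per second for a given tilt."""
        if abs(pitch_deg) < DEAD_ZONE_DEG:
            return 0.0
        clamped = max(-MAX_TILT_DEG, min(MAX_TILT_DEG, pitch_deg))
        return MAX_SPEED * clamped / MAX_TILT_DEG

    for angle in (-40, -10, 2, 15, 35):
        print(angle, "deg ->", round(scroll_speed(angle), 1), "lines/s")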

Some Issues in Context-Aware Computing

Sensor ambiguity

Sensors not 100% reliable

Precision / Accuracy / Granularity

Sensor Fusion

Merging different sensor inputs together (sketched in code after this list)

Self-contained vs. distributed systems

PDA doesn't need location sensors if it can ask nearby sensors to approximate

Requires lots of knowledge and effort to build

Sensors, recognition algorithms, devices, application

Few kinds of context beyond location + ID used

There haven't been many rigorous evaluations of utility
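Sensor fusion, flagged in the list above, can be illustrated with a variance-weighted average of two unreliable estimates (a one-step, Kalman-style update). The sensors and numbers below are invented:

    # Combine two noisy 1-D position estimates by weighting each one
    # inversely to its variance; lower variance means more trust.
    def fuse(est_a, var_a, est_b, var_b):
        """Return (fused estimate, fused variance) for two independent estimates."""
        w_a, w_b = 1.0 / var_a, 1.0 / var_b
        return (w_a * est_a + w_b * est_b) / (w_a + w_b), 1.0 / (w_a + w_b)

    # e.g. 802.11 triangulation says 12.0 m (variance 3.0), a badge sighting
    # says 10.0 m (variance 1.0); the fused estimate leans toward the badge.
    print(fuse(12.0, 3.0, 10.0, 1.0))   # -> (10.5, 0.75)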

Building Context-Aware Apps

Describe support at app-level

ParcTab System

Context Toolkit

Cooltown

Social Floor

LAFCam

Try to make it easier to build a certain class of context-aware apps

A Rough Taxonomy of Context-Aware Apps

Triggers

Metadata Tagging

Reconfiguration and Streamlining

Input specification

Presentation

A Rough Taxonomy of Context-Aware Apps

Triggers

On X do Y

"Notify doctor and nearby ambulances if serious health problem detected"

"Remind me to talk to Chris about user studies next time I see him"

Metadata Tagging (sketched in code after this slide)

"Where was this picture taken?"

"Find all notes taken while Mae was talking"

Memory prosthesis

•http://www.kodak.com/US/en/corp/georgeFisher/dCarpAdp2000.shtml

•"Who is that person sitting next to Uncle Ralph?" •"Where on earth was that picture taken?"

•"Was this a photo from my second wedding? Or my third?"

A Rough Taxonomy of Context-Aware Apps

Reconfiguration and Streamlining

Telephone forwarding and Teleport

Turn off cell phone in theaters

Automatically adjust brightness / volume

Automatic file pre-caching

Select modes in multimodal interaction

Multimedia / Bandwidth adaptation

A Rough Taxonomy of Context-Aware Apps

Input specification

Send mail only to people in building now

Print to nearest printer

"Find gas stations nearest me"

Presentation

Current location

Activity

Presence

Contextual info about objects

Proximate selection

Context Toolkit (Dey, Salber, Abowd 2001)

Toolkit for distributed context-aware apps

Framework for acquiring and handling context

Standard components

Three key abstractions

Widgets

Interpreters

Aggregators

http://www.cs.cmu.edu/~anind/context.html

[Architecture sketch: GPS, Active Badge, and cell-tower location sensors feed a Location Widget, which multiple apps query through one interface]

Widgets abstract out sensors

[Architecture sketch: a Location-to-Room Interpreter sits between the Location Widget and the apps, converting raw locations into room names]

Interpreters transform context data

[Architecture sketch: a Person Aggregator has-a Location layer, Activity layer, and Affect layer, giving apps one place to ask about a person]

Aggregators collect all of the context about an entity
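The diagrams above can be summarized in code. The following is a rough sketch of the widget / interpreter / aggregator split, not the actual Context Toolkit API (which is a distributed framework); class names, method names, and the location data are all illustrative:

    class LocationWidget:
        """Widget: hides whether location comes from GPS, an Active Badge, or a cell tower."""
        def __init__(self):
            self.subscribers = []
        def subscribe(self, callback):
            self.subscribers.append(callback)
        def sensor_reading(self, person, raw_location):
            for callback in self.subscribers:
                callback(person, raw_location)

    class LocationToRoomInterpreter:
        """Interpreter: transforms low-level context (coordinates) into room names."""
        def __init__(self, room_map):
            self.room_map = room_map
        def interpret(self, raw_location):
            return self.room_map.get(raw_location, "unknown room")

    class PersonAggregator:
        """Aggregator: collects all of the context about one entity in one place."""
        def __init__(self, name, interpreter):
            self.name = name
            self.interpreter = interpreter
            self.context = {}
        def on_location(self, person, raw_location):
            if person == self.name:
                self.context["room"] = self.interpreter.interpret(raw_location)

    widget = LocationWidget()
    aggregator = PersonAggregator("roel", LocationToRoomInterpreter({"44.23,-76.50": "Room 521"}))
    widget.subscribe(aggregator.on_location)
    widget.sensor_reading("roel", "44.23,-76.50")  # a simulated badge or GPS sighting
    print(aggregator.context)                      # {'room': 'Room 521'}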

In/Out Board (Georgia Tech)

Dummbo

Cooltown (HP Labs)

Since the web is universal, link context awareness to the web

Literally everything has a URL

People, places, things

Infrared beacons, bar codes, etc

Literally everything has a web page

Current status, contact info, services offered, etc

Buses equipped with GPS and webservers

In bus: show location, show nearby points of interest

Waiting for bus: show location, show wait time

SEMANTIC WEB
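Cooltown's "everything has a URL" idea boils down to resolving a sensed identifier (an infrared beacon or barcode) to a web presence and then fetching that page. A minimal sketch; the beacon IDs and URLs are placeholders:

    # Invented beacon-ID -> web-presence directory; in practice a beacon could
    # broadcast the URL directly or an ID that a directory service resolves.
    BEACON_DIRECTORY = {
        "beacon:bus-17":   "http://example.org/buses/17",
        "beacon:room-521": "http://example.org/places/room521",
    }

    def web_presence(beacon_id: str) -> str:
        """Resolve a sensed beacon ID to the URL of its web presence."""
        return BEACON_DIRECTORY[beacon_id]

    url = web_presence("beacon:bus-17")
    print(url)   # a client would then fetch this page, e.g. with urllib.request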

LAFCam (Lockerd & Mueller, MIT Media Lab 2002)

LAFCam makes use of the involuntary contextual cues people utter.

It recognizes nonverbal utterances, such as laughter, while shooting video.

It finds the most engaging moments in the video based on these inadvertent utterances.

It highlights points of interest from the recorder's perspective for the audience.

Social Floor (Selker, MIT 2002)

Attentive UI as Context-Aware Interface

Open Research Challenges: Systems Issues

Programming model

Programming the physical world

Unreliable sensors, recognition algorithms

Interoperability

Sensors, services, and devices

Useless if everyone has proprietary / custom systems

Need standard data formats, protocols, and frameworks

Need clearer definitions of context

What is and isn't context?

Temperature? The monitor I'm looking at? Personal history?

Have to avoid the AI tarpit (i.e., does it matter?)

Open Research Challenges: People Issues

Making it predictable and understandable

Setting preferences

"I want my cell phone to ring except in theaters and when I'm in a meeting unless…"

Why the heck did it do that?

Privacy

What does the computer know about me?

What do others know about me?

What do I gain? What do I lose?
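The cell-phone preference quoted above shows why such settings are hard to make predictable: every rule accumulates exceptions. A toy sketch (the context flags and the family-call exception are invented) makes the structure visible:

    def should_ring(context):
        """Decide whether the phone rings, given boolean context flags (illustrative)."""
        if context.get("in_theater"):
            return False
        if context.get("in_meeting") and not context.get("caller_is_family"):
            return False   # the "unless..." part: an invented family-call exception
        return True

    print(should_ring({"in_meeting": True, "caller_is_family": True}))  # True
    print(should_ring({"in_theater": True}))                            # False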

Questions?