
Page 1: 426 lecture6b: AR Interaction

COSC 426: Augmented Reality

Mark Billinghurst

[email protected]

August 14th 2012

Lecture 6: AR Interaction

Page 2: 426 lecture6b: AR Interaction

Building Compelling AR Experiences

[Diagram (Sony CSL © 2004): a layered stack from components, to tools, to applications, to experiences, annotated with the corresponding building blocks: tracking and display, authoring, interaction.]

Page 3: 426 lecture6b: AR Interaction

AR Interaction

- Designing an AR system = interface design
  - Using different input and output technologies
- The objective is a high-quality user experience
  - Ease of use and learning
  - Performance and satisfaction

Page 4: 426 lecture6b: AR Interaction

User Interface and Tool

- Human ↔ User Interface/Tool ↔ Machine/Object
- This chain forms the Human-Machine Interface

[Diagram © Andreas Dünser: both tools and user interfaces mediate between the human and the machine or object.]

Page 5: 426 lecture6b: AR Interaction

User Interface: Characteristics

- Input: mono- or multimodal
- Output: mono- or multisensorial
- Technique / Metaphor / Paradigm

[Diagram © Andreas Dünser, driving example: input through the controls, output as a sensation of movement; metaphors: "push" to accelerate, "turn" to rotate.]

Page 6: 426 lecture6b: AR Interaction

Human Computer Interface

- Human ↔ User Interface ↔ Computer System
- This chain forms the Human-Computer Interface
- HCI = Hardware + Software
- The computer is everywhere now: HCI extends to electronic devices, home automation, transport vehicles, etc.

© Andreas Dünser

Page 7: 426 lecture6b: AR Interaction

More Terminology

- Interaction Device = input/output device of the user interface
- Interaction Style = a category of similar interaction techniques
- Interaction Paradigm
- Modality (human sense)
- Usability

Page 8: 426 lecture6b: AR Interaction

Back to AR

- You can see spatially registered AR... how can you interact with it?

Page 9: 426 lecture6b: AR Interaction

Interaction Tasks

- 2D (from [Foley]): selection, text entry, quantify, position
- 3D (from [Bowman]):
  - Navigation (travel/wayfinding)
  - Selection
  - Manipulation
  - System control / data input
- AR: 2D + 3D tasks and... more specific tasks?

[Foley] J. D. Foley, V. Wallace & P. Chan. "The Human Factors of Computer Graphics Interaction Techniques." IEEE Computer Graphics and Applications, Nov. 1984, pp. 13-48.
[Bowman] D. Bowman, E. Kruijff, J. LaViola, I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison-Wesley, 2005.

Page 10: 426 lecture6b: AR Interaction

AR Interfaces as Data Browsers

- 2D/3D virtual objects are registered in 3D: "VR in the real world"
- Interaction
  - 2D/3D virtual viewpoint control
- Applications
  - Visualization, training

Page 11: 426 lecture6b: AR Interaction

AR Information Browsers

- Information is registered to a real-world context
- Hand-held AR displays
- Interaction
  - Manipulation of a window into the information space
- Applications
  - Context-aware information displays

Rekimoto, et al. 1997

Page 12: 426 lecture6b: AR Interaction

Architecture

Page 13: 426 lecture6b: AR Interaction

Current AR Information Browsers

- Mobile AR
  - GPS + compass
- Many applications
  - Layar
  - Wikitude
  - Acrossair
  - PressLite
  - Yelp
  - AR Car Finder
  - ...

Page 14: 426 lecture6b: AR Interaction

Junaio

- AR browser from Metaio
- http://www.junaio.com/
- AR browsing
  - GPS + compass
  - 2D/3D object placement
  - Photos / live video
  - Community viewing

Page 15: 426 lecture6b: AR Interaction
Page 16: 426 lecture6b: AR Interaction

Web Interface

Page 17: 426 lecture6b: AR Interaction

Adding Models in Web Interface

Page 18: 426 lecture6b: AR Interaction

Advantages and Disadvantages

- An important class of AR interfaces
  - Wearable computers
  - AR simulation, training
- Limited interactivity
  - Modification of virtual content is difficult

Rekimoto, et al. 1997

Page 19: 426 lecture6b: AR Interaction

3D AR Interfaces

- Virtual objects displayed in 3D physical space and manipulated
  - HMDs and 6 DOF head tracking
  - 6 DOF hand trackers for input
- Interaction
  - Viewpoint control
  - Traditional 3D user interface interaction: manipulation, selection, etc.

Kiyokawa, et al. 2000

Page 20: 426 lecture6b: AR Interaction

AR 3D Interaction

Page 21: 426 lecture6b: AR Interaction

AR Graffiti

www.nextwall.net

Page 22: 426 lecture6b: AR Interaction

Advantages and Disadvantages

- An important class of AR interfaces
  - Entertainment, design, training
- Advantages
  - User can interact with 3D virtual objects anywhere in space
  - Natural, familiar interaction
- Disadvantages
  - Usually no tactile feedback
  - User has to use different devices for virtual and physical objects

Oshima, et al. 2000

Page 23: 426 lecture6b: AR Interaction

Augmented Surfaces and Tangible Interfaces

- Basic principles
  - Virtual objects are projected onto a surface
  - Physical objects are used as controls for virtual objects
  - Support for collaboration

Page 24: 426 lecture6b: AR Interaction

Augmented Surfaces

- Rekimoto, et al. 1998
- Front projection
- Marker-based tracking
- Multiple projection surfaces

Page 25: 426 lecture6b: AR Interaction

Tangible User Interfaces (Ishii 97)

- Create digital shadows for physical objects
- Foreground
  - Graspable UI
- Background
  - Ambient interfaces

Page 26: 426 lecture6b: AR Interaction

Tangible Interfaces - Ambient

- Dangling String
  - Jeremijenko 1995
  - Ambient Ethernet monitor
  - Relies on peripheral cues
- Ambient Fixtures
  - Dahley, Wisneski, Ishii 1998
  - Use natural material qualities for information display

Page 27: 426 lecture6b: AR Interaction

Tangible Interface: ARgroove

- Collaborative instrument
- Exploring physically based interaction
  - Maps physical actions to MIDI output
    - Translation, rotation
    - Tilt, shake

Page 28: 426 lecture6b: AR Interaction

ARgroove in Use

Page 29: 426 lecture6b: AR Interaction

Visual Feedback

- Continuous visual feedback is key
- A single virtual image provides:
  - Rotation
  - Tilt
  - Height

Page 30: 426 lecture6b: AR Interaction

i/O Brush (Ryokai, Marti, Ishii)

Page 31: 426 lecture6b: AR Interaction

Other Examples

- Triangles (Gorbet 1998)
  - Triangle-based storytelling
- ActiveCube (Kitamura 2000-)
  - Cubes with sensors

Page 32: 426 lecture6b: AR Interaction

Lessons from Tangible Interfaces

- Physical objects make us smart
  - Norman's "Things That Make Us Smart"
  - Encode affordances, constraints
- Objects aid collaboration
  - Establish shared meaning
- Objects increase understanding
  - Serve as cognitive artifacts

Page 33: 426 lecture6b: AR Interaction

TUI Limitations

- Difficult to change object properties
  - Can't tell the state of digital data
- Limited display capabilities
  - Projection screen = 2D
  - Dependent on a physical display surface
- Separation between object and display
  - e.g. ARgroove

Page 34: 426 lecture6b: AR Interaction

Advantages and Disadvantages

- Advantages
  - Natural: the user's hands are used for interacting with both virtual and real objects
  - No need for special-purpose input devices
- Disadvantages
  - Interaction is limited to the 2D surface
  - Full 3D interaction and manipulation is difficult

Page 35: 426 lecture6b: AR Interaction

Orthogonal Nature of AR Interfaces

Page 36: 426 lecture6b: AR Interaction

Back to the Real World

- AR overcomes the limitations of TUIs
  - Enhances display possibilities
  - Merges task and display space
  - Provides public and private views
- TUI + AR = Tangible AR
  - Apply TUI methods to AR interface design

Page 37: 426 lecture6b: AR Interaction

- Space-multiplexed
  - Many devices, each with one function
  - Quicker to use, more intuitive, but more clutter
  - e.g. a real toolbox
- Time-multiplexed
  - One device with many functions
  - Space efficient
  - e.g. the mouse

Page 38: 426 lecture6b: AR Interaction

Tangible AR: Tiles (Space-Multiplexed)

- Tile semantics
  - Data tiles
  - Operation tiles
- Operations on tiles
  - Proximity
  - Spatial arrangements
  - Space-multiplexed

Page 39: 426 lecture6b: AR Interaction

Space-multiplexed Interface

Data authoring in Tiles

Page 40: 426 lecture6b: AR Interaction

Proximity-based Interaction

Page 41: 426 lecture6b: AR Interaction

Object-Based Interaction: MagicCup

- Intuitive virtual object manipulation on a table-top workspace
- Time-multiplexed
- Multiple markers
  - Robust tracking
- Tangible user interface
  - Intuitive manipulation
- Stereo display
  - Good presence

Page 42: 426 lecture6b: AR Interaction

Our system

 Main table, Menu table, Cup interface

Page 43: 426 lecture6b: AR Interaction
Page 44: 426 lecture6b: AR Interaction

Tangible AR: Time-multiplexed Interaction

- Use of natural physical object manipulations to control virtual objects
- VOMAR demo
  - Catalog book: turn over the page
  - Paddle operations: push, shake, incline, hit, scoop

Page 45: 426 lecture6b: AR Interaction

VOMAR Interface

Page 46: 426 lecture6b: AR Interaction

Advantages and Disadvantages

- Advantages
  - Natural interaction with virtual and physical tools
    - No need for special-purpose input devices
  - Spatial interaction with virtual objects
    - 3D manipulation of virtual objects anywhere in physical space
- Disadvantages
  - Requires a head-mounted display

Page 47: 426 lecture6b: AR Interaction

Wrap-up

- Browsing interfaces
  - Simple (conceptually!), unobtrusive
- 3D AR interfaces
  - Expressive, creative, require attention
- Tangible interfaces
  - Embedded into conventional environments
- Tangible AR
  - Combines TUI input + AR display

Page 48: 426 lecture6b: AR Interaction

AR User Interface: Categorization

- Traditional desktop: keyboard, mouse, joystick (with or without a 2D/3D GUI)
- Specialized/VR devices: 3D VR devices, specially designed devices

Page 49: 426 lecture6b: AR Interaction

AR User Interface: Categorization

- Tangible interface: using physical objects
- Hand/touch interface: using pose and gesture of the hand and fingers
- Body interface: using movement of the body

Page 50: 426 lecture6b: AR Interaction

AR User Interface: Categorization

- Speech interface: voice, speech control
- Multimodal interface: gesture + speech
- Haptic interface: haptic feedback
- Eye tracking, physiological, brain-computer interfaces...

Page 51: 426 lecture6b: AR Interaction

OSGART: From Registration to Interaction

Page 52: 426 lecture6b: AR Interaction

Keyboard and Mouse Interaction

- Traditional input techniques
- OSG provides a framework for handling keyboard and mouse input events (osgGA)
  1. Subclass osgGA::GUIEventHandler
  2. Handle events:
     - Mouse up / down / move / drag / scroll-wheel
     - Key up / down
  3. Add an instance of the new handler to the viewer

Page 53: 426 lecture6b: AR Interaction

Keyboard and Mouse Interaction

class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
public:
    KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

    virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter& aa,
                        osg::Object* obj, osg::NodeVisitor* nv) {
        switch (ea.getEventType()) {
            // Possible events we can handle
            case osgGA::GUIEventAdapter::PUSH: break;
            case osgGA::GUIEventAdapter::RELEASE: break;
            case osgGA::GUIEventAdapter::MOVE: break;
            case osgGA::GUIEventAdapter::DRAG: break;
            case osgGA::GUIEventAdapter::SCROLL: break;
            case osgGA::GUIEventAdapter::KEYUP: break;
            case osgGA::GUIEventAdapter::KEYDOWN: break;
        }
        return false;
    }
};

viewer.addEventHandler(new KeyboardMouseEventHandler());

  Create your own event handler class

  Add it to the viewer to receive events

Page 54: 426 lecture6b: AR Interaction

Keyboard Interaction

- Handle the W, A, S, D keys to move an object:

case osgGA::GUIEventAdapter::KEYDOWN: {
    switch (ea.getKey()) {
        case 'w': // Move forward 5mm
            localTransform->preMult(osg::Matrix::translate(0, -5, 0));
            return true;
        case 's': // Move back 5mm
            localTransform->preMult(osg::Matrix::translate(0, 5, 0));
            return true;
        case 'a': // Rotate 10 degrees left
            localTransform->preMult(osg::Matrix::rotate(
                osg::DegreesToRadians(10.0f), osg::Z_AXIS));
            return true;
        case 'd': // Rotate 10 degrees right
            localTransform->preMult(osg::Matrix::rotate(
                osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
            return true;
        case ' ': // Reset the transformation
            localTransform->setMatrix(osg::Matrix::identity());
            return true;
    }
    break;
}

- The transform being manipulated sits between the marker transform and the model:

localTransform = new osg::MatrixTransform();
localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
arTransform->addChild(localTransform.get());

Page 55: 426 lecture6b: AR Interaction

Keyboard Interaction Demo

Page 56: 426 lecture6b: AR Interaction

Mouse Interaction

- The mouse is a pointing device...
- Use the mouse to select objects in an AR scene
- OSG provides methods for ray-casting and intersection testing
- These return an osg::NodePath (the path from the hit node all the way back to the root)

[Diagram: a ray is cast from the click position on the projection plane (screen) into the scene.]

Page 57: 426 lecture6b: AR Interaction

Mouse Interaction

case osgGA::GUIEventAdapter::PUSH: {
    osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
    osgUtil::LineSegmentIntersector::Intersections intersections;

    // Clear previous selections
    for (unsigned int i = 0; i < targets.size(); i++) {
        targets[i]->setSelected(false);
    }

    // Find new selection based on the click position
    if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
        for (osgUtil::LineSegmentIntersector::Intersections::iterator iter = intersections.begin();
             iter != intersections.end(); iter++) {
            if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
                std::cout << "HIT!" << std::endl;
                target->setSelected(true);
                return true;
            }
        }
    }
    break;
}

- Compute the list of nodes under the clicked position
- Invoke an action on the nodes that are hit, e.g. select, delete

Page 58: 426 lecture6b: AR Interaction

Mouse Interaction Demo

Page 59: 426 lecture6b: AR Interaction

Proximity Techniques

- Interaction based on
  - the distance between a marker and the camera
  - the distance between multiple markers

Page 60: 426 lecture6b: AR Interaction

Single Marker Techniques: Proximity

- Use the distance from the camera to the marker as an input parameter
  - e.g. lean in close to examine
- Can use the osg::LOD class to show different content at different depth ranges

(Image: OpenSG Consortium)

Page 61: 426 lecture6b: AR Interaction

Single Marker Techniques: Proximity

// Load some models
osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");

// Use a Level-Of-Detail node to show each model at a different distance range.
osg::ref_ptr<osg::LOD> lod = new osg::LOD();
lod->addChild(farNode.get(), 500.0f, 10000.0f);  // Show the "far" node from 50cm to 10m away
lod->addChild(closerNode.get(), 200.0f, 500.0f); // Show the "closer" node from 20cm to 50cm away
lod->addChild(nearNode.get(), 0.0f, 200.0f);     // Show the "near" node from 0cm to 20cm away

arTransform->addChild(lod.get());

- Define a depth range for each node
- Add as many as you want
- Ranges can overlap

Page 62: 426 lecture6b: AR Interaction

Single Marker Proximity Demo

Page 63: 426 lecture6b: AR Interaction

Multiple Marker Concepts

- Interaction based on the relationship between markers
  - e.g. when the distance between two markers drops below a threshold, invoke an action
  - Tangible user interface
- Applications:
  - Memory card games
  - File operations

Page 64: 426 lecture6b: AR Interaction

Multiple Marker Proximity

[Scene-graph diagram: under the virtual camera, Transform A and Transform B (one per marker) each hold a switch node (Switch A, Switch B) that selects between two models (Model A1/A2 and Model B1/B2). While distance > threshold, the switches show Model A1 and Model B1.]

Page 65: 426 lecture6b: AR Interaction

Multiple Marker Proximity

[Same scene-graph diagram: when distance <= threshold, the switches flip to Model A2 and Model B2.]

Page 66: 426 lecture6b: AR Interaction

Multiple Marker Proximity

virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {
    if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) {
        if (mMarkerA->valid() && mMarkerB->valid()) {
            osg::Vec3 posA = mMarkerA->getTransform().getTrans();
            osg::Vec3 posB = mMarkerB->getTransform().getTrans();
            osg::Vec3 offset = posA - posB;
            float distance = offset.length();

            if (distance <= mThreshold) {
                if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
                if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
            } else {
                if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
                if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
            }
        }
    }
    traverse(node, nv);
}

- Use a node callback to test for proximity and update the relevant nodes
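To run the test every frame, the callback is installed as an update callback on a node that is always traversed. A minimal set-up sketch; the class name MarkerProximityCallback, the constructor arguments, and the threshold value are assumptions for illustration, not the lecture's exact code:

// Hypothetical set-up: MarkerProximityCallback is the class containing the
// operator() above; the threshold is in the same units as the marker poses.
root->setUpdateCallback(new MarkerProximityCallback(
    markerA, markerB, switchA, switchB, 100.0f /* threshold, assumed */));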

Page 67: 426 lecture6b: AR Interaction

Multiple Marker Proximity

Page 68: 426 lecture6b: AR Interaction

Paddle Interaction

- Use one marker as a tool for selecting and manipulating objects (a tangible user interface)
- Another marker provides a frame of reference
- A grid of markers can alleviate problems with occlusion

MagicCup (Kato et al.)   VOMAR (Kato et al.)

Page 69: 426 lecture6b: AR Interaction

Paddle Interaction

- It is often useful to adopt a local coordinate system
  - Allows the camera to move without disrupting Tlocal
  - Places the paddle in the same coordinate system as the content on the grid
  - Simplifies interaction
- osgART computes Tlocal using the osgART::LocalTransformationCallback (see the sketch below)
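For intuition, Tlocal can also be computed by hand from the two camera-relative marker transforms. A minimal sketch, assuming baseTransform and paddleTransform are osg::MatrixTransform nodes whose matrices hold the camera-relative poses of the grid and the paddle (the names are illustrative, not osgART's API):

// Sketch: express the paddle pose in the grid's (base marker's) coordinate
// frame, so Tlocal is independent of camera motion.
// OSG uses row-vector convention: v' = v * M.
osg::Matrix cameraToBase   = baseTransform->getMatrix();    // base -> camera
osg::Matrix cameraToPaddle = paddleTransform->getMatrix();  // paddle -> camera

// paddle -> base = (paddle -> camera) * inverse(base -> camera)
osg::Matrix tLocal = cameraToPaddle * osg::Matrix::inverse(cameraToBase);

// The translation part of tLocal is the paddle position over the grid.
osg::Vec3 paddlePos = tLocal.getTrans();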

Page 70: 426 lecture6b: AR Interaction

Tilt and Shake Interaction

- Detect types of paddle movement:
  - Tilt: a gradual change in orientation
  - Shake: short, sudden changes in translation
- A detection sketch follows below
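A minimal sketch of how such gestures might be detected from successive paddle poses; the window size, thresholds, and class name are illustrative assumptions, not values from the lecture:

#include <osg/Matrix>
#include <osg/Vec3>
#include <deque>
#include <cmath>

// Classify paddle motion from a short history of per-frame poses.
// Tilt  = gradual change in orientation (paddle normal drifts from vertical).
// Shake = fast translations that keep reversing direction.
class PaddleGestureDetector {
public:
    void addPose(const osg::Matrix& tLocal) {
        mPositions.push_back(tLocal.getTrans());
        if (mPositions.size() > kWindow) mPositions.pop_front();

        // Row 2 of the matrix is the paddle's local z-axis in grid coordinates.
        osg::Vec3 up(tLocal(2, 0), tLocal(2, 1), tLocal(2, 2));
        up.normalize();
        mTiltAngle = std::acos(up * osg::Vec3(0.0f, 0.0f, 1.0f)); // radians
    }

    bool isTilted() const { return mTiltAngle > 0.5; } // ~30 degrees, assumed

    bool isShaking() const {
        if (mPositions.size() < kWindow) return false;
        int reversals = 0;
        for (size_t i = 2; i < mPositions.size(); i++) {
            osg::Vec3 d1 = mPositions[i - 1] - mPositions[i - 2];
            osg::Vec3 d2 = mPositions[i]     - mPositions[i - 1];
            // Fast movement that reverses direction counts towards a shake.
            if (d1.length() > 5.0 && d2.length() > 5.0 && (d1 * d2) < 0.0)
                reversals++;
        }
        return reversals >= 3; // threshold assumed
    }

private:
    static const size_t kWindow = 15; // ~0.5s at 30fps, assumed
    std::deque<osg::Vec3> mPositions;
    double mTiltAngle;
};

Feeding addPose() with Tlocal each frame keeps both tests independent of camera motion, which is exactly why the local coordinate system from the previous slide matters.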

Page 71: 426 lecture6b: AR Interaction

Building Tangible AR Interfaces with ARToolKit

Page 72: 426 lecture6b: AR Interaction

Required Code

- Calculating camera position
  - Range to marker
- Loading multiple patterns/models
- Interaction between objects
  - Proximity
  - Relative position/orientation
- Occlusion
  - Stencil buffering
  - Multi-marker tracking

Page 73: 426 lecture6b: AR Interaction

Tangible AR Coordinate Frames

Page 74: 426 lecture6b: AR Interaction

Local vs. Global Interactions

- Local
  - Actions determined from a single camera-to-marker transform
    - Shaking, appearance, relative position, range
- Global
  - Actions determined from two relationships
    - Marker-to-camera and world-to-camera coordinates
    - Marker transform determined in world coordinates (see the sketch below)
      - Object tilt, absolute position, absolute rotation, hitting
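For example, a marker's pose in world coordinates can be recovered by combining the two transforms. A sketch using ARToolKit's matrix utilities arUtilMatInv and arUtilMatMul; marker_trans and world_trans are assumed to come from arGetTransMat and arMultiGetTransMat respectively:

/* Sketch: express a marker's pose in world (multi-marker) coordinates.
   marker_trans: marker -> camera (from arGetTransMat)
   world_trans:  world  -> camera (from arMultiGetTransMat) */
double world_inv[3][4];    /* camera -> world */
double marker_world[3][4]; /* marker -> world */

arUtilMatInv(world_trans, world_inv);
arUtilMatMul(world_inv, marker_trans, marker_world);

/* The translation column gives the marker's absolute position; the 3x3
   rotation part gives its absolute orientation (e.g. for tilt tests). */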

Page 75: 426 lecture6b: AR Interaction

Range-based Interaction

- Sample file: RangeTest.c

/* get the camera transformation */
arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);

/* find the range */
Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
Zpos = marker_trans[2][3];
range = sqrt(Xpos*Xpos + Ypos*Ypos + Zpos*Zpos);

Page 76: 426 lecture6b: AR Interaction

Loading Multiple Patterns

- Sample file: LoadMulti.c
  - Uses object.c to load
- Object structure:

typedef struct {
    char   name[256];
    int    id;
    int    visible;
    double marker_coord[4][2];
    double trans[3][4];
    double marker_width;
    double marker_center[2];
} ObjectData_T;

Page 77: 426 lecture6b: AR Interaction

Finding Multiple Transforms

- Create the object list:
    ObjectData_T *object;
- Read in the objects - in init():
    read_ObjData( char *name, int *objectnum );
- Find the transforms - in mainLoop(), loop over the objects, check the detected patterns, and find the transform for each marker (a fuller sketch follows below):
    for( i = 0; i < objectnum; i++ ) {
        /* check patterns, find transforms for each marker */
    }
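A fuller sketch of what that loop typically looks like in the ARToolKit samples; the variable names marker_info and marker_num follow the detection code shown later, and the confidence test (cf) picks the best detection per pattern:

for( i = 0; i < objectnum; i++ ) {
    /* find the detected marker (if any) that matches this object,
       keeping the detection with the highest confidence value */
    int j, k = -1;
    for( j = 0; j < marker_num; j++ ) {
        if( object[i].id == marker_info[j].id ) {
            if( k == -1 || marker_info[k].cf < marker_info[j].cf ) k = j;
        }
    }
    if( k == -1 ) {            /* marker not visible this frame */
        object[i].visible = 0;
        continue;
    }
    object[i].visible = 1;
    arGetTransMat(&marker_info[k], object[i].marker_center,
                  object[i].marker_width, object[i].trans);
}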

Page 78: 426 lecture6b: AR Interaction

Drawing Multiple Objects

- Send the object list to the draw function:
    draw( object, objectnum );
- Draw each object individually:

for( i = 0; i < objectnum; i++ ) {
    if( object[i].visible == 0 ) continue;
    argConvGlpara(object[i].trans, gl_para);
    draw_object( object[i].id, gl_para );
}

Page 79: 426 lecture6b: AR Interaction

Proximity Based Interaction

- Sample file: CollideTest.c
- Detect the distance between markers:
    checkCollisions(object[0], object[1], DIST)
- If the distance < collide distance, then change the model / perform the interaction (a possible implementation is sketched below)
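The sample's checkCollisions is essentially a distance test on the two marker transforms. A minimal sketch of such a function, assuming the ObjectData_T structure shown earlier (the actual sample code may differ):

/* Sketch: return 1 if the two markers are closer than collide_dist.
   The translation column of each 3x4 transform holds the marker
   position in camera coordinates (sqrt from <math.h>). */
int checkCollisions( ObjectData_T object0, ObjectData_T object1,
                     double collide_dist )
{
    double dx = object0.trans[0][3] - object1.trans[0][3];
    double dy = object0.trans[1][3] - object1.trans[1][3];
    double dz = object0.trans[2][3] - object1.trans[2][3];

    if( sqrt(dx*dx + dy*dy + dz*dz) < collide_dist ) return 1;
    return 0;
}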

Page 80: 426 lecture6b: AR Interaction

Multi-marker Tracking

- Sample file: multiTest.c
- Multiple markers establish a single coordinate frame
  - Reading in a configuration file
  - Tracking from sets of markers
  - Careful camera calibration

Page 81: 426 lecture6b: AR Interaction

MultiMarker Configuration File

- Sample file: Data/multi/marker.dat
- Contains a list of all the patterns and their exact positions:

#the number of patterns to be recognized
6

#marker 1
Data/multi/patt.a
40.0
0.0 0.0
1.0000 0.0000 0.0000 -100.0000
0.0000 1.0000 0.0000 50.0000
0.0000 0.0000 1.0000 0.0000
…

- Each entry gives the pattern file, the pattern width + coordinate origin, and the pattern transform relative to the global origin

Page 82: 426 lecture6b: AR Interaction

Camera Transform Calculation

- Include <AR/arMulti.h>
- Link to libARMulti.lib
- In mainLoop():
  - Detect markers as usual:
    arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num);
  - Use the MultiMarker function:

if( (err = arMultiGetTransMat(marker_info, marker_num, config)) < 0 ) {
    argSwapBuffers();
    return;
}
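The config argument is the parsed configuration file, loaded once at start-up with ARToolKit's arMultiReadConfigFile. A sketch of the usual init() step, assuming the marker.dat path shown earlier:

/* in init(): parse the multi-marker configuration file */
ARMultiMarkerInfoT *config;

if( (config = arMultiReadConfigFile("Data/multi/marker.dat")) == NULL ) {
    printf("config data load error !!\n");
    exit(0);
}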

Page 83: 426 lecture6b: AR Interaction

Paddle-based Interaction

- Track a single marker relative to a multi-marker set: the paddle contains a single marker

Page 84: 426 lecture6b: AR Interaction

Paddle Interaction Code

- Sample file: PaddleDemo.c
- Get the paddle marker location + draw the paddle before drawing the background model:

paddleGetTrans(paddleInfo, marker_info, marker_flag, marker_num, &cparam);

/* draw the paddle */
if( paddleInfo->active ){
    draw_paddle( paddleInfo );
}

- draw_paddle uses a stencil buffer to increase realism (see the sketch below)
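A sketch of the stencil-buffer idea: the paddle's silhouette is written into the stencil buffer so that virtual content is not drawn over the real paddle in the video image. drawPaddleShape() and drawVirtualObjects() are hypothetical placeholders for the sample's drawing routines:

/* Pass 1: write the paddle silhouette into the stencil buffer only. */
glEnable(GL_STENCIL_TEST);
glStencilFunc(GL_ALWAYS, 1, 1);
glStencilOp(GL_REPLACE, GL_REPLACE, GL_REPLACE);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);  /* no color writes */
drawPaddleShape();

/* Pass 2: draw virtual content only where the paddle is NOT. */
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glStencilFunc(GL_EQUAL, 0, 1);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
drawVirtualObjects();

glDisable(GL_STENCIL_TEST);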

Page 85: 426 lecture6b: AR Interaction

Paddle Interaction Code II

- Sample file: paddleDrawDemo.c
  - Finds the paddle position relative to the global coordinate frame:
    setBlobTrans(Num, paddle_trans[3][4], base_trans[3][4]);
- Sample file: paddleTouch.c
  - Finds the paddle position:
    findPaddlePos(&curPadPos, paddleInfo->trans, config->trans);
  - Checks for collisions:
    checkCollision(&curPaddlePos, myTarget[i].pos, 20.0);

Page 86: 426 lecture6b: AR Interaction

General Tangible AR Library

- command_sub.c, command_sub.h
- Contains functions for recognizing a range of different paddle motions:

int check_shake( );
int check_punch( );
int check_incline( );
int check_pickup( );
int check_push( );

- e.g. to check the angle between the paddle and the base:
    check_incline(paddle->trans, base->trans, &ang)
- A usage sketch follows below