426 lecture 7: Designing AR Interfaces



Transcript of 426 lecture 7: Designing AR Interfaces

1. COSC 426: Augmented Reality
   Mark Billinghurst ([email protected])
   Sept 5th 2012
   Lecture 7: Designing AR Interfaces

2. AR Interfaces
   - Browsing interfaces: simple (conceptually!), unobtrusive
   - 3D AR interfaces: expressive, creative, require attention
   - Tangible interfaces: embedded into conventional environments
   - Tangible AR: combines TUI input + AR display

3. AR Interfaces as Data Browsers
   - 2D/3D virtual objects are registered in 3D ("VR in the real world")
   - Interaction: 2D/3D virtual viewpoint control
   - Applications: visualization, training

4. 3D AR Interfaces
   - Virtual objects displayed in 3D physical space and manipulated
   - HMDs and 6DOF head-tracking; 6DOF hand trackers for input
   - Interaction: viewpoint control; traditional 3D user interface interaction (manipulation, selection, etc.)
   (Kiyokawa et al. 2000)

5. Augmented Surfaces and Tangible Interfaces
   Basic principles:
   - Virtual objects are projected on a surface
   - Physical objects are used as controls for virtual objects
   - Support for collaboration

6. Tangible Interfaces - Ambient
   - Dangling String (Jeremijenko 1995): ambient Ethernet monitor; relies on peripheral cues
   - Ambient fixtures (Dahley, Wisneski, Ishii 1998): use natural material qualities for information display

7. Back to the Real World
   - AR overcomes limitations of TUIs:
     - enhances display possibilities
     - merges task/display space
     - provides public and private views
   - TUI + AR = Tangible AR: apply TUI methods to AR interface design

8. Space-multiplexed vs. Time-multiplexed
   - Space-multiplexed: many devices, each with one function
     (quicker to use, more intuitive, but cluttered; e.g. a real toolbox)
   - Time-multiplexed: one device with many functions
     (space efficient; e.g. the mouse)

9. Tangible AR: Tiles (Space-Multiplexed)
   - Tile semantics: data tiles, operation tiles
   - Operations on tiles: proximity, spatial arrangements, space-multiplexed

10. Tangible AR: Time-multiplexed Interaction
    - Use natural physical object manipulations to control virtual objects
    - VOMAR demo:
      - catalog book: turn over the page
      - paddle operations: push, shake, incline, hit, scoop

11. Building Compelling AR Experiences
    (Layered diagram, top to bottom: experiences; applications - Interaction; tools - Authoring; components - Tracking, Display. Sony CSL 2004)

12. Interface Design Path
    1/ Prototype demonstration
    2/ Adoption of interaction techniques from other interface metaphors (Augmented Reality)
    3/ Development of new interface metaphors appropriate to the medium (Virtual Reality)
    4/ Development of formal theoretical models for predicting and modeling user actions (desktop WIMP)

13. Interface Metaphors
    - Designed to be similar to a physical entity, but also have their own properties (e.g. desktop metaphor, search engine)
    - Exploit users' familiar knowledge, helping them to understand the unfamiliar
    - Conjure up the essence of the unfamiliar activity, enabling users to leverage what they already know to understand more of the unfamiliar functionality
    - People find it easier to learn, and to talk about what they are doing at the computer interface, in terms familiar to them

14. Example: The Spreadsheet
    - Analogous to a ledger sheet
    - Interactive and computational
    - Easy to understand
    - Greatly extended what accountants and others could do
    (www.bricklin.com/history/refcards.htm)

15. Why was it so good?
    - It was simple, clear, and obvious to users how to use the application and what it could do
    - "It is just a tool to allow others to work out their ideas and reduce the tedium of repeating the same calculations."
    - Capitalized on users' familiarity with ledger sheets
    - Got the computer to perform a range of different calculations in response to user input
16. Another classic: the 8010 Star office system
    - Targeted at workers not interested in computing per se
    - Several person-years were spent at the beginning working out the conceptual model
    - Simplified the electronic world, making it seem more familiar, less alien, and easier to learn
    (Johnson et al. 1989)

17. The Star interface

18. Benefits of Interface Metaphors
    - Make learning new systems easier
    - Help users understand the underlying conceptual model
    - Can be innovative, making the realm of computers and their applications accessible to a greater diversity of users

19. Problems with Interface Metaphors (Nielsen, 1990)
    - Break conventional and cultural rules (e.g. a recycle bin placed on the desktop)
    - Can constrain designers in the way they conceptualize a problem
    - Conflict with design principles
    - Force users to understand the system only in terms of the metaphor
    - Designers can inadvertently use bad existing designs and transfer the bad parts over
    - Limit designers' imagination with new conceptual models

20. Microsoft Bob

21. PSDoom - killing processes

22. AR Design Principles
    Interface components:
    - Physical components
    - Display elements (visual/audio)
    - Interaction metaphors
    (Diagram: Physical Elements -> Interaction Metaphor -> Display Elements; input -> output)

23. Back to the Real World
    - AR overcomes limitations of TUIs:
      - enhances display possibilities
      - merges task/display space
      - provides public and private views
    - TUI + AR = Tangible AR: apply TUI methods to AR interface design

24. AR Design Space
    (Diagram: Reality -- Augmented Reality -- Virtual Reality; Physical Design -- Virtual Design)

25. Tangible AR Design Principles
    Tangible AR interfaces use TUI principles:
    - Physical controllers for moving virtual content
    - Support for spatial 3D interaction techniques
    - Time- and space-multiplexed interaction
    - Support for multi-handed interaction
    - Match object affordances to task requirements
    - Support parallel activity with multiple objects
    - Allow collaboration between multiple users

26. Space-multiplexed vs. Time-multiplexed
    - Space-multiplexed: many devices, each with one function
      (quicker to use, more intuitive, but cluttered; e.g. Tiles interface, toolbox)
    - Time-multiplexed: one device with many functions
      (space efficient; e.g. VOMAR interface, mouse)

27. Design of Objects
    - Objects can be:
      - purposely built (designed affordances)
      - found (repurposed)
      - existing (already in use in the marketplace)
    - Make affordances obvious (Norman):
      - make object affordances visible
      - give feedback
      - provide constraints
      - use natural mappings
      - use a good cognitive model

28. Object Design

29. Affordances: To Give a Clue
    - Refers to an attribute of an object that allows people to know how to use it
      (e.g. a mouse button invites pushing, a door handle affords pulling)
    - Norman (1988) used the term to discuss the design of everyday objects
    - It has since been much popularised in interaction design to discuss how to design interface objects
      (e.g. scrollbars afford moving up and down, icons afford clicking on)

30. "...the term affordance refers to the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used. [...] Affordances provide strong clues to the operations of things. Plates are for pushing. Knobs are for turning. Slots are for inserting things into. Balls are for throwing or bouncing. When affordances are taken advantage of, the user knows what to do just by looking: no picture, label, or instruction needed." (Norman, The Psychology of Everyday Things, 1988, p. 9)

31. Physical Affordances
    - How do the following physical objects afford? Are the affordances obvious?
32. Affordance and Interface Design?
    - Interfaces are virtual and do not have affordances like physical objects
    - Norman argues it does not make sense to talk about interfaces in terms of real affordances
    - Instead, interfaces are better conceptualized as perceived affordances:
      - learned conventions of arbitrary mappings between action and effect at the interface
      - some mappings are better than others

33. Virtual Affordances
    - How do the following screen objects afford?
    - What if you were a novice user? Would you know what to do with them?

34. AR is a mixture of physical affordance and virtual affordance
    - Physical: tangible controllers and objects
    - Virtual: virtual graphics and audio

35. Case Study 1: 3D AR Lens
    Goal: develop a lens-based AR interface
    - MagicLenses: developed at Xerox PARC in 1993
    - View a region of the workspace differently from the rest
    - Overlap MagicLenses to create composite effects

36. 3D MagicLenses
    - MagicLenses extended to 3D (Viega et al. 1996)
    - Volumetric and flat lenses

37. AR Lens Design Principles
    - Physical components: lens handle (a virtual lens attached to a real object)
    - Display elements: lens view (reveals layers in the dataset)
    - Interaction metaphor: physically holding a lens

38. 3D AR Lenses: Model Viewer
    - Displays models made up of multiple parts
    - Each part can be shown or hidden through the lens
    - Allows the user to peer inside the model
    - Maintains focus + context

39. AR Lens Demo

40. AR Lens Implementation
    - Stencil buffer: one scene drawn outside the lens, another inside
    - Virtual magnifying glass
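Slide 40 names the stencil-buffer technique without showing code. A minimal OpenGL sketch of the idea (not the original implementation; drawLensShape, drawHiddenLayers, and drawNormalModel are hypothetical helpers):

      #include <GL/gl.h>

      /* Hypothetical scene helpers */
      extern void drawLensShape(void);     /* lens geometry on the handle marker */
      extern void drawHiddenLayers(void);  /* content revealed inside the lens  */
      extern void drawNormalModel(void);   /* content shown everywhere else     */

      void drawSceneWithLens(void)
      {
          glClear(GL_STENCIL_BUFFER_BIT);
          glEnable(GL_STENCIL_TEST);

          /* Pass 1: mark the lens region in the stencil buffer only */
          glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
          glDepthMask(GL_FALSE);
          glStencilFunc(GL_ALWAYS, 1, 0xFF);
          glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
          drawLensShape();

          /* Pass 2: draw the revealed layers only where the stencil is set */
          glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
          glDepthMask(GL_TRUE);
          glStencilFunc(GL_EQUAL, 1, 0xFF);
          glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
          drawHiddenLayers();

          /* Pass 3: draw the normal model outside the lens */
          glStencilFunc(GL_NOTEQUAL, 1, 0xFF);
          drawNormalModel();

          glDisable(GL_STENCIL_TEST);
      }

Order matters: the lens shape must be rendered first so the stencil mask exists before either scene is drawn.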
41. AR FlexiLens
    - Real handles/controllers with a flexible AR lens

42. Techniques Based on AR Lenses
    - Object selection: select objects by targeting them with the lens
    - Information filtering: show different representations through the lens; hide certain content to reduce clutter, look inside things
    - Move between AR and VR: transition along the reality-virtuality continuum; change our viewpoint to suit our needs

43. Case Study 2: LevelHead
    - Block-based game

44. Case Study 2: LevelHead
    - Physical components: real blocks
    - Display elements: virtual person and rooms
    - Interaction metaphor: blocks are rooms

45. Case Study 3: AR Chemistry (Fjeld 2002)
    - Tangible AR chemistry education

46. Goal: an AR application to test molecular structure in chemistry
    - Physical components: real book, rotation cube, scoop, tracking markers
    - Display elements: AR atoms and molecules
    - Interaction metaphor: build your own molecule

47. AR Chemistry Input Devices

48. Case Study 4: Transitional Interfaces
    Goal: an AR interface supporting transitions from reality to virtual reality
    - Physical components: real book
    - Display elements: AR and VR content
    - Interaction metaphor: book pages hold virtual scenes

49. Milgram's Continuum (1994)
    Reality (Tangible Interfaces) -- Augmented Reality (AR) -- Augmented Virtuality (AV) -- Virtual Reality (Virtual Interfaces), with the middle region labelled Mixed Reality (MR)
    Central hypothesis: the next generation of interfaces will support transitions along the reality-virtuality continuum

50. Transitions
    - Interfaces of the future will need to support transitions along the RV continuum
    - Augmented Reality is preferred for: co-located collaboration
    - Immersive Virtual Reality is preferred for: experiencing a world immersively (egocentric), sharing views, remote collaboration

51. The MagicBook
    Design goals:
    - Allow the user to move smoothly between reality and virtual reality
    - Support collaboration

52. MagicBook Metaphor

53. Features
    - Seamless transition between reality and virtuality: reliance on the real decreases as the virtual increases
    - Supports egocentric and exocentric views: the user can pick the appropriate view
    - Computer becomes invisible: consistent interface metaphors; virtual content seems real
    - Supports collaboration

54. Collaboration
    - Collaboration on multiple levels: physical object, AR object, immersive virtual space
    - Egocentric + exocentric collaboration: multiple, multi-scale users
    - Independent views: privacy, role division, scalability

55. Technology
    - Reality: no technology
    - Augmented Reality: camera tracking; switch to fly in
    - Virtual Reality: compass tracking; press pad to move; switch to fly out

56. Scientific Visualization

57. Education

58. Summary
    When designing AR interfaces, think of:
    - Physical components (physical affordances)
    - Virtual components (virtual affordances)
    - Interface metaphors

59. OSGART: From Registration to Interaction

60. Keyboard and Mouse Interaction
    - Traditional input techniques
    - OSG provides a framework for handling keyboard and mouse input events (osgGA):
      1. Subclass osgGA::GUIEventHandler
      2. Handle events: mouse up / down / move / drag / scroll-wheel; key up / down
      3. Add an instance of the new handler to the viewer

61. Keyboard and Mouse Interaction
    Create your own event handler class:

      class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
      public:
          KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

          virtual bool handle(const osgGA::GUIEventAdapter& ea,
                              osgGA::GUIActionAdapter& aa,
                              osg::Object* obj, osg::NodeVisitor* nv) {
              switch (ea.getEventType()) {
                  // Possible events we can handle
                  case osgGA::GUIEventAdapter::PUSH:    break;
                  case osgGA::GUIEventAdapter::RELEASE: break;
                  case osgGA::GUIEventAdapter::MOVE:    break;
                  case osgGA::GUIEventAdapter::DRAG:    break;
                  case osgGA::GUIEventAdapter::SCROLL:  break;
                  case osgGA::GUIEventAdapter::KEYUP:   break;
                  case osgGA::GUIEventAdapter::KEYDOWN: break;
                  default: break;
              }
              return false;
          }
      };

    Add it to the viewer to receive events:

      viewer.addEventHandler(new KeyboardMouseEventHandler());

62. Keyboard Interaction
    Handle the W, A, S, D keys to move an object:

      case osgGA::GUIEventAdapter::KEYDOWN: {
          switch (ea.getKey()) {
              case 'w': // Move forward 5mm
                  localTransform->preMult(osg::Matrix::translate(0, -5, 0));
                  return true;
              case 's': // Move back 5mm
                  localTransform->preMult(osg::Matrix::translate(0, 5, 0));
                  return true;
              case 'a': // Rotate 10 degrees left
                  localTransform->preMult(
                      osg::Matrix::rotate(osg::DegreesToRadians(10.0f), osg::Z_AXIS));
                  return true;
              case 'd': // Rotate 10 degrees right
                  localTransform->preMult(
                      osg::Matrix::rotate(osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
                  return true;
              case 'r': // Reset the transformation (reset key garbled in the transcript)
                  localTransform->setMatrix(osg::Matrix::identity());
                  return true;
          }
          break;
      }

    The transform being driven:

      localTransform = new osg::MatrixTransform();
      localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
      arTransform->addChild(localTransform.get());
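The handler above also receives SCROLL events, which the deck never uses. As a hedged extension (not from the original slides), the scroll wheel could scale the model; a sketch in the same fragment style, assuming the same localTransform:

      // Sketch only (not in the original deck): scroll wheel scales the model
      case osgGA::GUIEventAdapter::SCROLL: {
          float s = (ea.getScrollingMotion() ==
                     osgGA::GUIEventAdapter::SCROLL_UP) ? 1.1f : 0.9f;
          localTransform->preMult(osg::Matrix::scale(s, s, s)); // +/-10% per click
          return true;
      }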
63. Keyboard Interaction Demo

64. Mouse Interaction
    - The mouse is a pointing device: use it to select objects in an AR scene
    - OSG provides methods for ray-casting and intersection testing
    - These return an osg::NodePath (the path from the hit node all the way back to the root)
    (Diagram: ray from the projection plane (screen) into the scene)

65. Mouse Interaction
    Compute the list of nodes under the clicked position, then invoke an action on the nodes that are hit, e.g. select, delete:

      case osgGA::GUIEventAdapter::PUSH: {
          osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
          osgUtil::LineSegmentIntersector::Intersections intersections;

          // Clear previous selections
          for (unsigned int i = 0; i < targets.size(); i++) {
              targets[i]->setSelected(false);
          }

          // Find new selection based on click position
          if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
              for (osgUtil::LineSegmentIntersector::Intersections::iterator iter =
                       intersections.begin(); iter != intersections.end(); iter++) {
                  if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
                      std::cout << "Picked target" << std::endl;
                      target->setSelected(true); // select the node under the cursor
                  }
              }
          }
          break;
      }

    [Slides 66-68 are missing from the transcript.]

69. Single Marker Proximity
    Show different content depending on the range from the camera to the marker, using an osg::LOD node:

      osg::ref_ptr<osg::LOD> lod = new osg::LOD();
      lod->addChild(farNode.get(), 500.0f, 10000.0f);  // Show the "far" node from 50cm to 10m away
      lod->addChild(closerNode.get(), 200.0f, 500.0f); // Show the "closer" node from 20cm to 50cm away
      lod->addChild(nearNode.get(), 0.0f, 200.0f);     // Show the "near" node from 0cm to 20cm away
      arTransform->addChild(lod.get());

    - Define depth ranges for each node
    - Add as many as you want
    - Ranges can overlap

70. Single Marker Proximity Demo

71. Multiple Marker Concepts
    - Interaction based on the relationship between markers
      (e.g. when the distance between two markers drops below a threshold, invoke an action)
    - Tangible user interface
    - Applications: memory card games, file operations

72. Multiple Marker Proximity
    (Diagram: Virtual Camera -> Transform A, Transform B; Distance > Threshold; Switch A -> Models A1, A2; Switch B -> Models B1, B2)

73. Multiple Marker Proximity
    (Diagram: same scene graph with Distance <= Threshold; the switches show the alternate models)

74. Multiple Marker Proximity
    A node callback compares the two marker positions and flips each osg::Switch (fragment; the threshold comparison was garbled in the transcript and is reconstructed here as mThreshold):

      if (mMarkerA->valid() && mMarkerB->valid()) {
          osg::Vec3 posA = mMarkerA->getTransform().getTrans();
          osg::Vec3 posB = mMarkerB->getTransform().getTrans();
          osg::Vec3 offset = posA - posB;
          float distance = offset.length();

          if (distance <= mThreshold) {
              if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
              if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
          } else {
              if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
              if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
          }
      }
      traverse(node, nv);

75. Multiple Marker Proximity

76. Paddle Interaction
    - Use one marker as a tool for selecting and manipulating objects (tangible user interface)
    - Another marker provides a frame of reference
    - A grid of markers can alleviate problems with occlusion
    (MagicCup (Kato et al.), VOMAR (Kato et al.))

77. Paddle Interaction
    - Often useful to adopt a local coordinate system:
      - allows the camera to move without disrupting Tlocal
      - places the paddle in the same coordinate system as the content on the grid
      - simplifies interaction
    - osgART computes Tlocal using the osgART::LocalTransformationCallback

78. Tilt and Shake Interaction
    Detect types of paddle movement:
    - Tilt: gradual change in orientation
    - Shake: short, sudden changes in translation

79. Building Tangible AR Interfaces with ARToolKit

80. Required Code
    - Calculating camera position: range to marker
    - Loading multiple patterns/models
    - Interaction between objects: proximity, relative position/orientation
    - Occlusion: stencil buffering
    - Multi-marker tracking

81. Tangible AR Coordinate Frames

82. Local vs. Global Interactions
    - Local: actions determined from a single camera-to-marker transform
      (shaking, appearance, relative position, range)
    - Global: actions determined from two relationships (marker-to-camera and world-to-camera coordinates)
      - the marker transform is determined in world coordinates
      (object tilt, absolute position, absolute rotation, hitting)
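Slides 78 and 82 mention tilt detection but no code for it survives in the transcript. One way to measure tilt, sketched under the assumption that paddle and base transforms are the 3x4 matrices produced by arGetTransMat (paddle_tilt is a hypothetical helper, not an ARToolKit function), is the angle between the two markers' z-axes:

      #include <math.h>

      /* Sketch: tilt angle between paddle and base markers.
         Column 2 of the rotation part of a 3x4 marker transform is that
         marker's z-axis; both columns are unit vectors, so their dot
         product is the cosine of the angle between them. */
      double paddle_tilt(double paddle_trans[3][4], double base_trans[3][4])
      {
          int i;
          double dot = 0.0;

          for (i = 0; i < 3; i++)
              dot += paddle_trans[i][2] * base_trans[i][2];

          if (dot >  1.0) dot =  1.0;  /* guard against rounding error */
          if (dot < -1.0) dot = -1.0;

          return acos(dot);            /* tilt in radians */
      }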
83. Range-based Interaction
    Sample file: RangeTest.c

      /* get the camera transformation */
      arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);

      /* find the range */
      Xpos = marker_trans[0][3];
      Ypos = marker_trans[1][3];
      Zpos = marker_trans[2][3];
      range = sqrt(Xpos*Xpos + Ypos*Ypos + Zpos*Zpos);

84. Loading Multiple Patterns
    Sample file: LoadMulti.c (uses object.c to load)
    Object structure:

      typedef struct {
          char   name[256];
          int    id;
          int    visible;
          double marker_coord[4][2];
          double trans[3][4];
          double marker_width;
          double marker_center[2];
      } ObjectData_T;

85. Finding Multiple Transforms
    Create the object list:

      ObjectData_T *object;

    Read in the objects in init():

      read_ObjData( char *name, int *objectnum );

    Find the transforms in mainLoop():

      for( i = 0; i < objectnum; i++ ) {
          /* ..check patterns.. */
          /* ..find transforms for each marker.. */
      }

86. Drawing Multiple Objects
    Send the object list to the draw function:

      draw( object, objectnum );

    Draw each object individually:

      for( i = 0; i < objectnum; i++ ) {
          if( object[i].visible == 0 ) continue;
          argConvGlpara(object[i].trans, gl_para);
          draw_object( object[i].id, gl_para );
      }

87. Proximity-Based Interaction
    Sample file: CollideTest.c
    Detect the distance between markers:

      checkCollisions(object[0], object[1], DIST);

    If the distance < the collide distance, then change the model / perform the interaction.

88. Multi-marker Tracking
    Sample file: multiTest.c
    - Multiple markers establish a single coordinate frame
    - Reading in a configuration file
    - Tracking from sets of markers
    - Careful camera calibration required

89. MultiMarker Configuration File
    Sample file: Data/multi/marker.dat
    Contains a list of all the patterns and their exact positions:

      # the number of patterns to be recognized
      6

      # marker 1
      Data/multi/patt.a                   <- pattern file
      40.0                                <- pattern width
      0.0 0.0                             <- coordinate origin
      1.0000 0.0000 0.0000 -100.0000      <- pattern transform
      0.0000 1.0000 0.0000   50.0000         (relative to global origin)
      0.0000 0.0000 1.0000    0.0000

90. Camera Transform Calculation
    - Include / link libARMulti.lib
    - In mainLoop(), detect markers as usual:

        arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num);

    - Use the MultiMarker function:

        if( (err = arMultiGetTransMat(marker_info, marker_num, config)) < 0 ) ...

    [The rest of slide 90 and slides 91-92 are garbled in the transcript; the surviving paddle-drawing fragment:]

      if( paddleInfo->active ) {
          draw_paddle( paddleInfo );
      }

    draw_paddle uses a stencil buffer to increase realism.

93. Paddle Interaction Code II
    Sample file: paddleDrawDemo.c
    - Finds the paddle position relative to the global coordinate frame:

        setBlobTrans(Num, paddle_trans[3][4], base_trans[3][4]);

    Sample file: paddleTouch.c
    - Finds the paddle position:

        findPaddlePos(&curPadPos, paddleInfo->trans, config->trans);

    - Checks for collisions:

        checkCollision(&curPaddlePos, myTarget[i].pos, 20.0);

94. General Tangible AR Library
    command_sub.c and command_sub.h contain functions for recognizing a range of different paddle motions:

      int check_shake( );
      int check_punch( );
      int check_incline( );
      int check_pickup( );
      int check_push( );

    E.g. to check the angle between the paddle and the base:

      check_incline(paddle->trans, base->trans, &ang);
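The motion checks above are listed only as prototypes. As a closing illustration, a naive sketch of how a shake detector along one axis might work (hypothetical; not the command_sub.c implementation): track the paddle's frame-to-frame translation and flag short, sudden direction reversals.

      #include <math.h>

      /* Sketch only: feed the paddle's x position (in the base frame)
         once per frame; returns 1 when a shake is detected. The 10 mm/frame
         speed, 3-reversal count, and 15-frame decay are assumed values. */
      int my_check_shake(double x)
      {
          static double last_x = 0.0, last_dx = 0.0;
          static int reversals = 0, quiet_frames = 0;
          double dx = x - last_x;

          /* A "shake event" is a fast move that reverses direction */
          if (fabs(dx) > 10.0 && dx * last_dx < 0.0) {
              reversals++;
              quiet_frames = 0;
          } else if (++quiet_frames > 15) {
              reversals = 0;   /* forget stale reversals after ~0.5s at 30fps */
          }

          last_x  = x;
          last_dx = dx;

          if (reversals >= 3) { /* three quick reversals count as one shake */
              reversals = 0;
              return 1;
          }
          return 0;
      }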