Supporting Remote Manipulation with an Ecological Augmented Virtuality Interface


J. Alan Atherton and Michael Goodrich
Brigham Young University, Department of Computer Science
April 9, 2009

Funded in part by Idaho National Laboratory and the Army Research Laboratory.

This is an exploratory study.

Outline
- Background
- Related Work
- Ecological Interface
- User Study
- Interface Changes from Study
- Conclusions and Future Work

Background
- What is a remote manipulator?
- Applications: urban search and rescue (USAR), explosive ordnance disposal (EOD), planetary exploration

We show a remote manipulator and a mobile manipulator. Mobile manipulators are used in these applications. Our research is a step toward supporting mobile manipulation, although right now we focus only on the manipulation aspect. Mobile manipulation involves two phases for most tasks: navigation and manipulation. Operators must drive the robot to a particular place and then manipulate objects there.

Problem
- Remotely operating a robot is difficult: the "soda straw" camera view, maintaining situation awareness, time delay, and mental workload
- Why is this a problem? Collisions, slow performance, and operator stress

[Image: Foster-Miller Talon]

It is hard to get depth perception from 2D camera video.

Older Interfaces

All images adapted from: Yanco, H. A.; Drury, J. L. & Scholtz, J. Beyond usability evaluation: analysis of human-robot interaction at a major robotics competition. Human-Computer Interaction, L. Erlbaum Associates Inc., 2004, 19, 117-149.

Problems with these older interfaces:
- Hard to get context
- Hard to integrate information
- Unnatural interaction

Related Work

Idaho National Laboratory

UMass Lowell

Bruemmer, D. J. et al. Shared understanding for collaborative control. IEEE Transactions on Systems, Man and Cybernetics, Part A, 2005, 35, 494-504.
Yanco, H. A. et al. Analysis of Human-Robot Interaction for Urban Search and Rescue. Proceedings of the IEEE International Workshop on Safety, Security and Rescue Robotics, 2006.

These interfaces improve on the 2D operator control unit (OCU) style: they give much more information than cameras alone and are good for engineering.

Related Work

INL / BYU AV Interface; Ferland et al. - Sherbrooke

Nielsen, C. W.; Goodrich, M. A. & Ricks, B. Ecological Interfaces for Improving Mobile Robot Teleoperation. IEEE Transactions on Robotics, Vol. 23, No. 5, pp. 927-941, October 2007.
Ferland, F.; Pomerleau, F.; Dinh, C. T. L. & Michaud, F. Egocentric and exocentric teleoperation interface using real-time, 3D video projection. Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, ACM, 2009, 37-44.

Augmented virtuality (AV) interfaces show real elements in a virtual scene. They allow viewpoints you can't get with cameras and reduce the workload of associating data by reducing mental transforms. Our work extends AV interfaces to support manipulation.

Related Work

NASA Viz

Nguyen, L. A.; Bualat, M.; Edwards, L. J.; Flueckiger, L.; Neveu, C.; Schwehr, K.; Wagner, M. D. & Zbinden, E. Virtual Reality Interfaces for Visualization and Control of Remote Vehicles. Autonomous Robots, 2001, 11, 59-68.

Kelly, A.; Anderson, D.; Capstick, E.; Herman, H. & Rander, P. Photogeometric Sensing for Mobile Robot Control and Visualisation Tasks. Proceedings of the AISB Symposium on New Frontiers in Human-Robot Interaction, 2009.

CMU Robotics Institute
- Extremely high detail
- Careful planning
- Integrates data from multiple sources

Our work is like applying the NASA-style interface to a real-time task, with real-time controls.

Interface Design
- Requirements: ecological, increase situation awareness, manage workload
- Existing interfaces: lack depth information, no manipulation support, not designed for real-time operation

Situation awareness comes from multiple viewpoints and depth information. Workload is managed by integrating information into a grounded frame of reference.

Ecological Augmented Virtuality Interface

This interface targets real-time remote manipulation. "Ecological" means relationships in the environment are made perceptually evident. "Augmented virtuality" means real elements appear inside a virtual environment. The display combines an arm graphic, a 3D scan, and camera video with video rotation, providing multiple perspectives, integrated information, depth information, and a grounded frame of reference.

Ecological Interface

- Video rotates for view-dependent control
- Views can be taken from the top or the side
- Color indicates distance from the ranging camera (a sketch of one such mapping follows)
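The slides do not say which color map the interface uses; as a minimal sketch, assuming a linear near-red to far-blue blend and hypothetical `near`/`far` limits, the per-point coloring might look like this in Python/NumPy:

```python
import numpy as np

def depth_to_color(distances, near=0.3, far=5.0):
    """Map per-point distances (meters) from the ranging camera to RGB colors.

    Points at `near` render red, points at `far` render blue, with a linear
    blend in between. The actual color map and range limits in the interface
    are not specified in the slides; these are illustrative choices.
    """
    t = np.clip((np.asarray(distances) - near) / (far - near), 0.0, 1.0)
    near_color = np.array([1.0, 0.0, 0.0])   # red for close points
    far_color = np.array([0.0, 0.0, 1.0])    # blue for distant points
    return (1.0 - t)[:, None] * near_color + t[:, None] * far_color

# Example: color three points at 0.5 m, 2 m, and 4 m.
print(depth_to_color([0.5, 2.0, 4.0]))
```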

Infrastructure
- Robot: built from a kit and modified; Player driver; motion planning for the arm; Swiss Ranger driver
- Communication: integration with INL's system; network data transfer
- User interface: OpenGL display; experiment automation

[Diagram: Robot Controller and User Interface]

These are just the key points of what we had to build.

Experiment Setup
- Variants 1-6: visualization crossed with robot control

- Visualization: 3D + Video, 3D only, or Video only
- Robot control: End Effector or Joint
- Task: collect yellow blocks
- 30 participants, between-subject comparison

30 participants; each participant tested 3 of the 6 variants. The block-collection task is similar to collecting rock samples. A Swiss Ranger camera provides depth and builds up the 3D model, and a webcam is mounted on the arm. The task bases are reconfigurable.

Reconfiguration
- Reduces memorization effects
- Minimizes damage to the arm
- Quick to change

Task

Visualization

There are three visualizations, and not everyone tested all three. Calibration of the 3D model was not perfect, as much as 2 cm off in some places.

Robot Control
- Joint control
- View-dependent end effector control

Some people tested one control mode, some tested both. Joint control moves each joint independently and is very common. In end-effector control, the blue pad on the left controls the view angle.

Robot View-Dependent Control

- The robot reaches for a target point
- The user moves the point with a joystick
- Point movement depends on the view orientation (a sketch follows)
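The slides describe this behavior but not the math. A minimal sketch, assuming the view orientation reduces to a yaw angle about the vertical axis and using hypothetical names such as `move_target_point`, might be:

```python
import numpy as np

def move_target_point(target, stick_x, stick_y, view_yaw, gain=0.01):
    """Move the end-effector target point based on joystick input.

    `target` is the current goal point (x, y, z) in the world frame,
    `stick_x`/`stick_y` are joystick deflections in [-1, 1], and
    `view_yaw` is the virtual camera's heading in radians. Pushing the
    stick "up" moves the point away from the viewer along the view
    direction, so the mapping stays intuitive as the view rotates.
    The arm is then asked to reach for the updated point.
    """
    c, s = np.cos(view_yaw), np.sin(view_yaw)
    # Rotate the stick deflection from view coordinates into world coordinates.
    dx = gain * (c * stick_y - s * stick_x)
    dy = gain * (s * stick_y + c * stick_x)
    return np.array([target[0] + dx, target[1] + dy, target[2]])

# Example: with the view rotated 90 degrees, "forward" on the stick
# moves the target along the world y-axis instead of the x-axis.
print(move_target_point(np.array([0.5, 0.0, 0.2]), 0.0, 1.0, np.pi / 2))
```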

Results: Time

[Chart: time to completion for each of the six variants (3D + Vid. / End eff., 3D + Vid. / Joint, 3D / End eff., 3D / Joint, Video / End eff., Video / Joint)]

People work more slowly with the 3D scan and faster with video. This might have something to do with adjusting the virtual viewpoint.

Results: Collisions

[Charts: collisions with the world and collisions with the block for each of the six variants]

We counted collisions with the posts, box, and table, and collisions with the block during final adjustments. When video is present, people bump into things more (possibly because of the sensor field of view). Joint control correlates with more collisions, though it is faster and coarser. Video helps with final alignment when the 3D model is poorly calibrated.

Results: Comparison

[Table: rankings (1st through 6th) of the six interface variants on time to completion, world collisions, block collisions, preference, and subjective mental workload]

Changes Inspired by User Study
- Problems: alignment, time lag, cluttered 3D scan model
- Changes: stereo camera exterior orientation, interactive robot arm calibration, simple quickening, scan pruning

The first study brought out some problems to work on.

Camera and Robot Calibration
- Interactive stereo camera calibration
- Live robot arm calibration


The robot arm was measured as carefully as possible, but its error ended up being nonlinear. We calibrate only the graphic of the arm, because applying the piecewise transformation to commands could cause the arm to behave erratically. Camera calibration was just guess-and-check before; now the human specifies point correspondences and we use linear least squares to find the best transformation, as sketched below.
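A minimal sketch of that fit, assuming the transformation is parameterized as a 3D affine map from camera coordinates to model coordinates (the slides do not say which parameterization was actually used):

```python
import numpy as np

def fit_affine_transform(camera_pts, model_pts):
    """Fit a 3D affine transform T so that T(camera_pts) ~= model_pts.

    `camera_pts` and `model_pts` are (N, 3) arrays of user-specified
    corresponding points (N >= 4, not coplanar). Solves a linear
    least-squares problem for a 3x4 matrix [A | b] such that
    A @ p + b approximates the matching model point.
    """
    camera_pts = np.asarray(camera_pts, dtype=float)
    model_pts = np.asarray(model_pts, dtype=float)
    # Homogeneous coordinates: each row is (x, y, z, 1).
    ones = np.ones((camera_pts.shape[0], 1))
    X = np.hstack([camera_pts, ones])           # (N, 4)
    # Solve X @ M = model_pts in the least-squares sense; M is (4, 3).
    M, residuals, rank, _ = np.linalg.lstsq(X, model_pts, rcond=None)
    return M.T                                  # 3x4 matrix [A | b]

def apply_transform(T, pts):
    """Apply the fitted 3x4 transform to (N, 3) points."""
    pts = np.asarray(pts, dtype=float)
    return pts @ T[:, :3].T + T[:, 3]

# Example with a known translation of (0.1, 0, 0).
cam = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1.0]])
model = cam + np.array([0.1, 0.0, 0.0])
T = fit_affine_transform(cam, model)
print(apply_transform(T, cam))   # reproduces `model`
```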

Simple Quickening

Time lag leads to a move-stop-check cycle for control. Quickening shows a model-based prediction of the actual current state; we show the predicted end effector point (a sketch follows).
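The slides only say the prediction is model-based. A minimal sketch, assuming the model simply integrates the currently commanded end-effector velocity over an estimated feedback delay, might be:

```python
import numpy as np

def predicted_end_effector(last_reported_pos, commanded_velocity, delay_s):
    """Predict where the end effector is now, despite feedback lag.

    `last_reported_pos` is the most recent (x, y, z) position received
    from the robot, `commanded_velocity` is the velocity the operator is
    currently commanding (m/s), and `delay_s` is the estimated feedback
    delay in seconds. The predicted point is drawn in the interface so
    the operator does not have to move, stop, and wait to check the
    result. The prediction model here is an assumption for illustration.
    """
    return np.asarray(last_reported_pos) + np.asarray(commanded_velocity) * delay_s

# Example: commanding 5 cm/s forward with 0.8 s of lag.
print(predicted_end_effector([0.40, 0.10, 0.25], [0.05, 0.0, 0.0], 0.8))
```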

Scan Pruning

The original 3D scan had clutter: the floor, a curtain, and robot arm noise. The only things essential to this task are the shapes, posts, and box. One caveat is that the simple filter sometimes did not prune all of the floor, so some participants mistook the floor for the box. A sketch of such a filter follows.
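A minimal sketch of a simple pruning filter, assuming a floor height threshold plus an axis-aligned workspace box (the thresholds here are invented for illustration, not the values used in the study):

```python
import numpy as np

def prune_scan(points, floor_z=0.02, workspace_min=(-0.2, -0.6, 0.0),
               workspace_max=(1.0, 0.6, 0.8)):
    """Remove clutter from a 3D scan before display.

    `points` is an (N, 3) array in the robot's base frame. Points at or
    below `floor_z` are treated as floor, and points outside the
    axis-aligned workspace box (which excludes the curtain behind the
    task area and most arm self-returns) are dropped.
    """
    points = np.asarray(points)
    above_floor = points[:, 2] > floor_z
    in_box = np.all((points >= workspace_min) & (points <= workspace_max), axis=1)
    return points[above_floor & in_box]

# Example: the floor point and the far-away curtain point are removed.
scan = np.array([[0.5, 0.0, 0.30],    # block on the table: kept
                 [0.5, 0.2, 0.01],    # floor: pruned
                 [2.0, 0.0, 0.50]])   # curtain: pruned
print(prune_scan(scan))
```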

Second User Study

The important thing to notice is that everything is faster and more consistent. Even the video-only variant improved, due to smoother arm control.

Conclusions
- 3D visualization supports situation awareness
- Video is faster
- 3D + video is a good tradeoff
- 3D + video might reduce workload

We have already fixed several of the major design flaws and are currently running a follow-up study. The experiment also had poor coverage of the parameter space: six variations, only three tested per person, in essentially random order.

Future Work

- Head tracking
- Ecological camera video
- Haptics

Head tracking would use a Wii remote, following Johnny Lee's work at CMU. The video would be presented in a more integrated way and hidden when zoomed out.
