Mixed Reality Systems – Lab IV: Augmented Reality – Christoph Anthes


  • Slide 1
  • Mixed Reality Systems – Lab IV: Augmented Reality – Christoph Anthes
  • Slide 2
  • Overview
      – ARToolKit
      – Combining ARToolKit with OpenSG
      – Initialisation
      – ARToolKit loop
      – Helper functions
      – Tracking objects
      – Interaction with objects
      – Combining ARToolKit with inVRs
  • Slide 3
  • ARToolKit
      – Current version 2.72.1, available on the SourceForge page
      – C and C++ API
      – Cross-platform API (Windows, Linux, MacOS, IRIX)
      – OpenGL is used for rendering, GLUT for event handling
      – The video library used depends on the chosen platform
      – Architecture (general picture):
          – Built on GLUT with OpenGL, C and C++
          – Makes use of the device-specific video and graphics drivers
          – Often used only as an independent tracking library
          – Your own application is designed to be built on top of OpenGL and ARToolKit
          – Interfaces to larger tracking libraries exist (e.g. OpenTracker)
      – From http://www.hitl.washington.edu/artoolkit/documentation/
  • Slide 4
  • ARToolKit Architecture (ARToolKit focus)
      – AR module: core module with marker tracking, calibration and parameter collection
      – Video module: collection of video routines for capturing the video input frames; a wrapper around the standard platform SDK video capture routines
      – Gsub module: graphics routines based on the OpenGL and GLUT libraries
      – Gsub_lite module: replaces Gsub with a more efficient collection of graphics routines, independent of any particular windowing toolkit
      – From http://www.hitl.washington.edu/artoolkit/documentation/
  • Slide 5
  • ARToolKit Coordinate Systems
      – Several coordinate systems are used; the most important are camera and marker coordinates
      – From camera to screen coordinates, a transformation via a distortion function can be performed
      – The Z-axis of the marker points upward
      – The Z-axis of the camera points into the scene
      – The top-left corner of the screen is (0, 0)
      – arGetTransMat() returns the coordinates of the marker in the camera coordinate system
      – arMatrixInverse() returns the coordinates of the camera in the marker coordinate system
      – From http://www.hitl.washington.edu/artoolkit/documentation/
  • Slide 6
  • ARToolKit Basic Application (figure from http://www.hitl.washington.edu/artoolkit/documentation/)
  • Slide 7
  • ARToolKit: corresponding function calls in the plain ARToolKit API (figure from http://www.hitl.washington.edu/artoolkit/documentation/)
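The figure itself is not reproduced in this transcript. As a rough reminder of how those calls fit together, here is a hedged sketch of the per-frame part of a plain ARToolKit application; the pattern id, threshold and marker size are placeholder values, and initialisation/cleanup (arVideoOpen, arParamLoad, arLoadPatt, argInit / arVideoCapStop, arVideoClose, argCleanup) are assumed to happen elsewhere:

```cpp
// Hedged sketch of a plain ARToolKit main loop (not the lab's actual code).
#include <AR/ar.h>
#include <AR/gsub.h>
#include <AR/video.h>

static int    pattId        = 0;      // id returned by arLoadPatt() during init
static int    thresh        = 100;    // binarisation threshold
static double pattWidth     = 80.0;   // marker width in mm
static double pattCentre[2] = { 0.0, 0.0 };

static void mainLoop()
{
    ARUint8 *dataPtr = arVideoGetImage();            // grab a video input frame
    if (dataPtr == 0) { arUtilSleep(2); return; }    // no new frame yet

    argDrawMode2D();
    argDispImage(dataPtr, 0, 0);                     // draw the camera image as background

    ARMarkerInfo *markerInfo;
    int           markerNum;
    arDetectMarker(dataPtr, thresh, &markerInfo, &markerNum);  // detect the markers
    arVideoCapNext();                                // start capturing the next frame

    for (int i = 0; i < markerNum; ++i)
    {
        if (markerInfo[i].id != pattId)
            continue;
        double pattTrans[3][4];
        arGetTransMat(&markerInfo[i], pattCentre, pattWidth, pattTrans);  // camera transformation
        // ... draw the virtual object using pattTrans (e.g. via argConvGlpara + OpenGL) ...
    }

    argSwapBuffers();
}
```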
  • Slide 8
  • ARToolKit Connections to Scene Graphs in General
      – Three steps:
          – Initialising the camera
          – Transforming the object
          – Real occluders (slightly more advanced)
      – OpenSG: creation of a new node with fscEdit
      – OpenSceneGraph: extension of the scene view class
      – Display of occluding geometry
      – Overwrite the colour buffer with the video image
      – Finally, display of the recognised objects
      – Two major approaches:
          – OSGART (http://www.artoolworks.com/community/osgart/index.html)
          – OSGAR (http://www.gvu.gatech.edu/ael/projects/ARSceneGraph.html)
  • Slide 9
  • Combining ARToolKit and OpenSG
      – OpenSG provides examples for interconnecting either ARToolKit or ARToolKitPlus
      – We are going to work with ARToolKit
      – So we first need the additional include files (see the sketch below):
          – gsub.h contains the main display functions used in ARToolKit
          – video.h provides multi-platform video input support for ARToolKit
          – param.h contains the principal routines for loading, saving and modifying camera parameters
          – ar.h provides image analysis and marker detection routines
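As a quick reference, the corresponding include block could look roughly like this (the AR/ prefix depends on how ARToolKit is installed):

```cpp
#include <AR/gsub.h>    // main display functions used in ARToolKit
#include <AR/video.h>   // multi-platform video input support
#include <AR/param.h>   // loading, saving and modifying camera parameters
#include <AR/ar.h>      // image analysis and marker detection routines
```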
  • Slide 10
  • Combining ARToolKit and OpenSG
      – If we take a look at our example, we start with a set of forward declarations, some of which we have not seen in the previous OpenSG or inVRs tutorials
      – Additionally, more methods are used at the end of the code
      – We start with the setup and cleanup methods:
          – initARToolkit() is used to initialise the ARToolKit components of the example
          – initOpenSG() initialises the OpenSG setup of the example
          – initGlut() registers the GLUT callbacks, as we have seen in previous examples
          – setupCamera() provides a setup for our interconnected webcam
          – cleanupARToolkit() stops the video capture of ARToolKit and closes the video stream processing
          – cleanupOpenSG() frees used variables, stops the binding to ARToolKit, and calls osgExit()
      – Let's have a detailed look at the setup methods
  • Slide 11
  • Combining ARToolKit and OpenSG: initARToolkit()
      – Wraps and calls the different internal setup functions for ARToolKit
      – The cleanup method is registered as a callback at program termination
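The slide's listing is not transcribed here, so the following is only a sketch of what such an initARToolkit() could look like; the configuration string, calibration file and pattern file are assumptions, not taken from the example:

```cpp
// Hedged sketch of an initARToolkit()-style setup (file names are assumptions).
#include <AR/ar.h>
#include <AR/param.h>
#include <AR/video.h>
#include <cstdlib>

static ARParam cameraParam;   // active camera parameters
static int     pattId = -1;   // id of the loaded marker pattern

void cleanupARToolkit()
{
    arVideoCapStop();         // stop the video capture
    arVideoClose();           // close the video stream processing
}

void initARToolkit()
{
    static char videoConf[] = "";                  // empty string = platform default device
    if (arVideoOpen(videoConf) < 0) std::exit(EXIT_FAILURE);

    int xsize, ysize;
    arVideoInqSize(&xsize, &ysize);                // query the capture resolution

    ARParam wparam;                                // load and scale the camera calibration
    if (arParamLoad("Data/camera_para.dat", 1, &wparam) < 0) std::exit(EXIT_FAILURE);
    arParamChangeSize(&wparam, xsize, ysize, &cameraParam);
    arInitCparam(&cameraParam);

    if ((pattId = arLoadPatt("Data/patt.hiro")) < 0) std::exit(EXIT_FAILURE);

    arVideoCapStart();                             // start capturing frames
    std::atexit(cleanupARToolkit);                 // register cleanup at program termination
}
```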
  • Slide 12
  • Combining ARToolKit and OpenSG: setupCamera()
      – The camera setup is defined in this function
      – Camera parameters from the calibration file are parsed and evaluated
      – A conversion has to take place in order to write the data out in the right format
      – The ModelViewMatrix and the ProjectionMatrix are set
      – This setup is stored inside an OpenSG camera object
      – The camera parameters are stored in binary files, e.g. currentParams.mat
  • Slide 13
  • Combining ARToolKit and OpenSG: setupCamera()
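Since the listing on this slide is not transcribed, here is only a rough sketch of how such a camera setup might look; it uses ARToolKit's argConvGLcpara() for the intrinsics-to-projection conversion and an OpenSG MatrixCamera, while the example's exact parameter file (e.g. currentParams.mat) and conversion helpers may differ:

```cpp
// Hedged sketch of a setupCamera()-style function (not the example's actual code).
#include <AR/gsub.h>                 // argConvGLcpara()
#include <AR/param.h>
#include <OpenSG/OSGConfig.h>
#include <OpenSG/OSGMatrixCamera.h>

OSG::MatrixCameraPtr createARCamera(ARParam &cparam)
{
    double glProj[16];
    argConvGLcpara(&cparam, 0.1, 1000.0, glProj);   // intrinsics -> GL projection (column-major)

    OSG::Matrix proj;
    for (int c = 0; c < 4; ++c)                     // copy the column-major GL matrix
        for (int r = 0; r < 4; ++r)
            proj[c][r] = static_cast<OSG::Real32>(glProj[c * 4 + r]);

    OSG::MatrixCameraPtr cam = OSG::MatrixCamera::create();
    beginEditCP(cam);
    cam->setProjectionMatrix(proj);                   // projection from the calibration
    cam->setModelviewMatrix(OSG::Matrix::identity()); // marker transforms move the objects instead
    endEditCP(cam);
    return cam;
}
```

In the real example this camera would then be attached to the window's viewport together with the video background, which is done in initOpenSG().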
  • Slide 14
  • Combining ARToolKit and OpenSG: initOpenSG()
      – The standard OpenSG setup is performed
      – A GLUT window is created and initialised
      – A root node with an anonymous group core is created
      – The previously described camera setup is triggered
      – A background object is interconnected with a video texture
      – The SimpleSceneManager is initialised and interconnected with the window and the root node of the scene
      – The background image as well as the camera are attached to the viewport of the just-generated window
      – All changes are committed to OpenSG
      – Finally, the OpenSG cleanup function is registered to be triggered at the termination of the application
  • Slide 15
  • Combining ARToolKit and OpenSG: initOpenSG()
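Again only as a rough, hedged sketch (OpenSG 1.x API; the video-texture background and viewport wiring are just indicated in comments), a skeleton of such an initOpenSG() might look like this:

```cpp
// Hedged sketch of an initOpenSG()-style setup (OpenSG 1.x, GLUT window,
// SimpleSceneManager); the background and camera wiring is only hinted at.
#include <OpenSG/OSGConfig.h>
#include <OpenSG/OSGGLUT.h>
#include <OpenSG/OSGGLUTWindow.h>
#include <OpenSG/OSGGroup.h>
#include <OpenSG/OSGNode.h>
#include <OpenSG/OSGSimpleSceneManager.h>

OSG::SimpleSceneManager *mgr = 0;

void initOpenSG(int argc, char **argv, int glutWinId)
{
    OSG::osgInit(argc, argv);                        // standard OpenSG setup

    OSG::GLUTWindowPtr gwin = OSG::GLUTWindow::create();
    gwin->setId(glutWinId);                          // attach to the already created GLUT window
    gwin->init();

    OSG::NodePtr  root = OSG::Node::create();        // root node with an anonymous group core
    OSG::GroupPtr core = OSG::Group::create();
    beginEditCP(root);
    root->setCore(core);
    endEditCP(root);

    mgr = new OSG::SimpleSceneManager;               // drives rendering of root into gwin
    mgr->setWindow(gwin);
    mgr->setRoot(root);

    // In the real example, the calibrated camera and an image background fed by
    // the video texture are attached to the viewport here, all changes are
    // committed, and the OpenSG cleanup function is registered via atexit().
}
```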
  • Slide 16
  • Combining ARToolKit and OpenSG
      – Then we have the ARToolKit processing methods:
          – In the captureFrame() method the image is retrieved from the camera
          – The detectMarkers() function triggers marker detection and returns the number of detected markers
          – applyMarkerTrans() applies the transformation from a given marker to an OpenSG transformation core
      – ARToolKit loop:
          – These steps represent steps 2-4 of our ARToolKit loop
          – Step 1, the initialisation, was given by the previous set of functions
          – Step 5, the rendering, is performed by OpenSG
          – They have to be processed frame by frame, so they are called inside the display loop of the application (see the sketch below)
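A minimal sketch of how these per-frame calls might be driven from the GLUT display callback; the signatures of captureFrame(), detectMarkers() and applyMarkerTrans() are assumptions and only illustrate the ordering of steps 2-5:

```cpp
// Hedged sketch of the per-frame ARToolKit loop inside the display callback.
#include <AR/ar.h>

void captureFrame();                                           // step 2: grab a video frame
void detectMarkers(ARMarkerInfo *&markers, int &numMarkers);   // step 3: find markers
void applyMarkerTrans(ARMarkerInfo *markers, int numMarkers);  // step 4: update transform cores

void display()
{
    captureFrame();                           // step 2
    ARMarkerInfo *markers    = 0;
    int           numMarkers = 0;
    detectMarkers(markers, numMarkers);       // step 3
    applyMarkerTrans(markers, numMarkers);    // step 4
    // step 5: rendering is done by OpenSG, e.g. mgr->redraw() and swapping buffers
}
```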
  • Slide 17
  • Combining ARToolKit and OpenSG: captureFrame()
      – This method retrieves an image which was captured by ARToolKit from the incoming video stream
      – The image data is set in an OpenSG image and an update notification is issued
      – This frame will later on be rendered as a background image
      – It is also used by ARToolKit for the image processing performed in the next step
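A hedged sketch of what such a captureFrame() might look like on the ARToolKit side; the handover to the OpenSG image is only indicated in a comment, since it depends on how that image object was created:

```cpp
// Hedged sketch of a captureFrame()-style function.
#include <AR/ar.h>
#include <AR/video.h>

ARUint8 *gCurrentFrame = 0;               // most recent frame from the video stream

void captureFrame()
{
    ARUint8 *dataPtr = arVideoGetImage(); // ask ARToolKit for a new frame
    if (dataPtr == 0)                     // no new frame available yet
    {
        arUtilSleep(2);                   // back off briefly, keep the previous frame
        return;
    }
    gCurrentFrame = dataPtr;

    // Here the pixel data would be set in the OpenSG image used by the background
    // and an update notification issued, so the frame is rendered behind the scene
    // and can be analysed by detectMarkers(). arVideoCapNext() is typically called
    // only once the frame has been fully processed (i.e. after marker detection).
}
```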
  • Slide 18
  • Combining ARToolKit and OpenSG: detectMarkers()
      – In this method the markers visible in the image are detected
      – The number of detected markers as well as the markers themselves are returned by reference
      – A threshold parameter determines the binarisation of the image
      – Output describing the number of detected markers as well as the ids of the markers is written to the console
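A hedged sketch of a detectMarkers()-style wrapper around arDetectMarker(); the frame pointer and threshold are assumptions carried over from the previous sketch:

```cpp
// Hedged sketch of a detectMarkers()-style function.
#include <AR/ar.h>
#include <cstdio>

extern ARUint8 *gCurrentFrame;        // frame grabbed in captureFrame()
static const int kThreshold = 100;    // binarisation threshold (0-255)

void detectMarkers(ARMarkerInfo *&markers, int &numMarkers)
{
    markers    = 0;
    numMarkers = 0;
    if (gCurrentFrame == 0)
        return;

    // arDetectMarker() binarises the frame with the given threshold and returns
    // the detected markers and their count by reference.
    if (arDetectMarker(gCurrentFrame, kThreshold, &markers, &numMarkers) < 0)
        return;

    std::printf("detected %d marker(s)\n", numMarkers);
    for (int i = 0; i < numMarkers; ++i)
        std::printf("  marker %d: id = %d, confidence = %.2f\n", i, markers[i].id, markers[i].cf);
}
```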
  • Slide 19
  • Combining ARToolKit and OpenSG: applyMarkerTrans()
      – Extracts the transformation information from a marker and applies it to a given OpenSG model
      – An STL map maintains a binding between markers and their OpenSG objects
      – The marker transformation is requested from ARToolKit via arGetMarkerTrans
      – It is then converted into an OpenSG matrix
      – If a marker is found in the map it becomes activated again and the transformation matrix just retrieved is applied to an OpenSG object
      – The changes are finally committed
  • Slide 20
  • Combining ARToolKit and OpenSG: applyMarkerTrans()
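The listing is not transcribed here; the following sketch only illustrates the idea with plain ARToolKit and OpenSG 1.x calls. The STL map, the pattern geometry (centre and width) and the commit handling are assumptions, not the example's actual code:

```cpp
// Hedged sketch of an applyMarkerTrans()-style function (OpenSG 1.x).
#include <AR/ar.h>
#include <OpenSG/OSGConfig.h>
#include <OpenSG/OSGTransform.h>
#include <map>

static std::map<int, OSG::TransformPtr> gPatternTransforms;  // pattern id -> transform core
static double gPattCentre[2] = { 0.0, 0.0 };                  // marker centre offset
static double gPattWidth     = 80.0;                          // marker size in mm

void applyMarkerTrans(ARMarkerInfo *markers, int numMarkers)
{
    for (int i = 0; i < numMarkers; ++i)
    {
        std::map<int, OSG::TransformPtr>::iterator it = gPatternTransforms.find(markers[i].id);
        if (it == gPatternTransforms.end())
            continue;                                   // no model bound to this marker

        // marker pose in camera coordinates as a 3x4 matrix
        double pattTrans[3][4];
        arGetTransMat(&markers[i], gPattCentre, gPattWidth, pattTrans);

        // convert the row-major 3x4 ARToolKit matrix into an OpenSG matrix
        OSG::Matrix m;
        m.setIdentity();
        for (int r = 0; r < 3; ++r)
            for (int c = 0; c < 4; ++c)
                m[c][r] = static_cast<OSG::Real32>(pattTrans[r][c]);

        beginEditCP(it->second, OSG::Transform::MatrixFieldMask);
        it->second->setMatrix(m);                       // place the model on the marker
        endEditCP(it->second, OSG::Transform::MatrixFieldMask);
    }
}
```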
  • Slide 21
  • Combining ARToolKit and OpenSG: helper functions
      – Additional methods which are used as helper functions and appear in the code:
          – arMatrixToOSGMatrix() performs a data conversion from ARToolKit to OpenSG
          – argConvGLcpara() is used for the conversion of ARToolKit camera parameters to OpenGL parameters
          – getTranslation() returns the translation vector of a transformation matrix
  • Slide 22
  • Combining ARToolKit and OpenSG: arMatrixToOSGMatrix()
      – Some very basic reformatting of an ARToolKit matrix into an OpenSG matrix is performed in this method
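A hedged sketch of what such a conversion might look like; the exact signature in the example is not shown on the slide, so the double[3][4] input (matching what arGetTransMat() produces) is an assumption:

```cpp
// Hedged sketch of an arMatrixToOSGMatrix()-style conversion helper.
#include <OpenSG/OSGConfig.h>
#include <OpenSG/OSGMatrix.h>

OSG::Matrix arMatrixToOSGMatrix(double arMat[3][4])
{
    OSG::Matrix m;
    m.setIdentity();                       // keep the last row as (0, 0, 0, 1)
    for (int r = 0; r < 3; ++r)            // ARToolKit: row-major 3x4 matrix
        for (int c = 0; c < 4; ++c)
            m[c][r] = static_cast<OSG::Real32>(arMat[r][c]);   // OpenSG: column-major 4x4
    return m;
}
```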
  • Slide 23
  • Combining ARToolKit and OpenSG: argConvGLcpara()
      – argConvGLcpara() is used to transform the ARToolKit intrinsic camera parameter matrix into an OpenGL matrix format
      – More details on camera calibration and the parameters are given in the computer vision class in the winter semester
  • Slide 24
  • Combining ARToolKit and OpenSG: further helper functions
      – Additional methods which are used as helper functions and appear in the code:
          – createPattern() connects a marker with an OpenSG sub scene graph
          – createBackground() creates an image background based on the image gathered from the ARToolKit video stream
          – createModel() loads a su