
Realty and Reality: Where Location Matters
John Miller, Niranjini Rajagopal, Krishna Kumar, Anh Luong, Anthony Rowe

{jmiller4, niranjir, kreghuku, aluong2, agr}@andrew.cmu.edu
Electrical and Computer Engineering Department

Carnegie Mellon University, Pittsburgh, PA

Abstract—We demonstrate an indoor location solution for mobile devices based on fusion of ultra-wideband (UWB) time-of-flight (TOF) ranges and visual-inertial odometry (VIO) provided by Apple’s ARKit. VIO provides accurate position and orientation tracking relative to device start-up, and UWB provides global location estimates. We use a battery-powered Raspberry Pi (RPi) with a UWB module as a beacon. We localize a consumer smartphone with an attached RPi and UWB module (tag).

I. INTRODUCTION

We demonstrate a system that utilizes UWB-based ranging and VIO-based tracking. UWB-based TOF systems have been shown to be promising for accurate indoor localization. However, they tend to have low update rates, require a high beacon density, and suffer from errors due to non-line-of-sight (NLOS) signals. Over the past year, we have seen the emergence of Apple’s ARKit and Android’s ARCore to support AR applications on mobile devices. These APIs provide accurate relative location and orientation with respect to device start-up, using VIO. However, VIO can drift over time and can lose tracking precision with poor vision features, low lighting, or abrupt movements of the camera. Our system leverages the complementary features of these two technologies, fusing the VIO information with TOF ranges to estimate the location with respect to the beacons and obtain high-rate, high-precision, and globally accurate location estimates.

We envision that future commodity mobile devices will be capable of TOF ranging with WiFi access points (using 802.11ax) or beacons deployed for indoor localization (using Bluetooth 5). AR systems can leverage this localization capability without incurring any additional cost. The larger goal of building this system is to enable persistent multi-user AR applications on mobile devices [1].

II. SYSTEM DESCRIPTION

The components of the localization system are as follows:

A. Ultra-Wideband

We use a DecaWave DWM1000 module for obtaining TOF ranges between the beacons and the tracked device [2]. The module is capable of approximately 30 centimeter ranging accuracy and a 30 meter line-of-sight range. The ranges are produced using a double-sided two-way ranging (DS-TWR) protocol, in which two round-trip TOFs are measured. This helps mitigate range errors due to clock drift between the beacons and the tag. A Raspberry Pi is used as the interface to the DWM1000 module on both the beacons and the tag.
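As a concrete illustration of the range computation, the sketch below shows the standard asymmetric DS-TWR formula that converts the two round-trip times and two reply times into a distance. This is a minimal sketch written for this abstract; the function and variable names, and the use of Python, are our own and are not taken from the DecaWave DWM1000 source code.

```python
# Minimal sketch of asymmetric double-sided two-way ranging (DS-TWR).
# Timestamps are in seconds; names are illustrative, not from the DWM1000 API.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def dstwr_distance(t_round1: float, t_reply1: float,
                   t_round2: float, t_reply2: float) -> float:
    """Estimate distance from the two round-trip / reply time pairs.

    t_round1: tag's measured round trip for its initial poll
    t_reply1: beacon's turnaround time for that poll
    t_round2: beacon's measured round trip for its response
    t_reply2: tag's turnaround time before the final message
    """
    # The product-over-sum form cancels first-order clock-frequency offsets
    # between the two unsynchronized radios.
    tof = (t_round1 * t_round2 - t_reply1 * t_reply2) / \
          (t_round1 + t_round2 + t_reply1 + t_reply2)
    return tof * SPEED_OF_LIGHT
```

This cancellation of first-order clock offsets is why DS-TWR is robust to clock drift between unsynchronized beacons and tags.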

Fig. 1. Hardware setup: (left) DWM1000 UWB module on top of an RPi Zero W, with a tablet for VIO; (right) UWB beacon with a DWM1000 module connected to an RPi 2, powered by a PoE injector and splitter for ease of deployment.

Fig. 2. Architecture: the user carries an iOS device with an attached UWB tag to localize with respect to UWB beacons deployed in the environment.

B. Visual-Inertial Odometry

The advent of VIO on consumer smartphones has greatly advanced tracking capabilities over what was possible with only inertial sensors (gyroscope and accelerometer). ARKit provides relative location updates by fusing motion estimates from the camera with the inertial sensors in the smartphone [3]. Because it provides full 6-degree-of-freedom (position and orientation) information at an update rate of 60 Hz, the system can be used for 3D localization. We characterize the ARKit VIO error model, which is integrated into our fusion engine.
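The error model itself is characterized empirically; purely for exposition, the sketch below shows one simple way such a drift model could be represented in a fusion engine, with position uncertainty growing with the distance traveled since the last global correction. The covariance form and the drift rate are illustrative assumptions, not the measured ARKit values.

```python
# Illustrative VIO drift model (an assumption for exposition, not the
# paper's characterized ARKit error model): position uncertainty that
# grows with distance traveled since the last global (UWB) correction.
import numpy as np

def vio_position_cov(distance_traveled_m: float,
                     drift_per_meter: float = 0.01) -> np.ndarray:
    """Return a 3x3 position covariance for accumulated VIO drift.

    drift_per_meter: assumed 1-sigma drift rate (here 1 cm of error per
    meter walked); this value is hypothetical, not measured.
    """
    sigma = drift_per_meter * distance_traveled_m
    return (sigma ** 2) * np.eye(3)
```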


C. Architecture

Ten beacons will be placed throughout the tracking area, powered by external batteries. The beacons’ locations will be mapped, and the beacons will provide range information to a mobile tag affixed to the tracked smartphone. The location estimates will be computed on the phone using a Bayesian estimation algorithm, such as a particle filter or extended Kalman filter. Figure 1 shows the tag and beacon hardware for the proposed system, and Figure 2 shows the system architecture.
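As an illustration of the proposed fusion, the sketch below implements a minimal 2D particle filter that propagates particles with VIO-reported displacements and reweights them using UWB ranges to known beacon positions. It is not the authors' implementation; the particle count, noise parameters, and workspace bounds are assumed values chosen for the example.

```python
# Minimal particle-filter sketch fusing VIO odometry with UWB ranges.
# Illustrative only: parameters and beacon positions are assumed values.
import numpy as np

N_PARTICLES = 500
RANGE_SIGMA = 0.30   # assumed UWB range noise (m), per the ~30 cm accuracy
ODOM_SIGMA = 0.02    # assumed per-step VIO displacement noise (m)

rng = np.random.default_rng(0)
particles = rng.uniform(-5.0, 5.0, size=(N_PARTICLES, 2))  # x, y hypotheses
weights = np.full(N_PARTICLES, 1.0 / N_PARTICLES)

def predict(delta_xy: np.ndarray) -> None:
    """Propagate particles by the VIO-reported displacement plus noise."""
    global particles
    noise = rng.normal(0.0, ODOM_SIGMA, size=particles.shape)
    particles = particles + delta_xy + noise

def update(beacon_xy: np.ndarray, measured_range: float) -> None:
    """Reweight particles by the likelihood of a UWB range to one beacon."""
    global particles, weights
    predicted = np.linalg.norm(particles - beacon_xy, axis=1)
    weights *= np.exp(-0.5 * ((measured_range - predicted) / RANGE_SIGMA) ** 2)
    weights /= weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < N_PARTICLES / 2:
        idx = rng.choice(N_PARTICLES, size=N_PARTICLES, p=weights)
        particles = particles[idx]
        weights = np.full(N_PARTICLES, 1.0 / N_PARTICLES)

def estimate() -> np.ndarray:
    """Weighted mean of the particles as the fused location estimate."""
    return np.average(particles, axis=0, weights=weights)
```

A typical loop would call predict() with each VIO pose delta (at up to 60 Hz), call update() whenever a new UWB range arrives at its lower rate, and read estimate() for the fused position.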

REFERENCES

[1] N. Rajagopal, J. Miller, K. Kumar, A. Luong, and A. Rowe, “Demo Abstract: Welcome to My World: Demystifying Multi-user AR with the Cloud,” in Information Processing in Sensor Networks (IPSN), 2018.

[2] Decawave, “DecaRangeRTLS ARM Source Code Guide,” 2017. [Online]. Available: https://www.decawave.com/content/trek-source-code [Accessed 10 Feb. 2018].

[3] Apple, “ARKit.” [Online]. Available: https://developer.apple.com/arkit/ [Accessed 10 Feb. 2018].