Google Project Tango
Transcript of Google Project Tango
Hello!
Google unveiled Project Tango, an experimental project from the company’s Advanced Technology and Projects (ATAP) group.
What is Project Tango?
Help everything and everyone understand where they are.
Project Tango technology gives a mobile device the ability to navigate the physical world similar to how we do as humans. Project Tango brings a new kind of spatial perception to the Android device platform by adding advanced computer vision, image processing, and special vision sensors.
Why Project Tango?
What if you could capture the dimensions of your home simply by walking around with your smartphone before you went furniture shopping? What if you never again found yourself lost in a new building? What if you could search for a product and see where the exact shelf is located in a store?
Imagine playing hide-and-seek in your house with your favourite game character, or competing against a friend for control over territories in your home with your own miniature army.
Hardware
Front View
Back View
Smartphone Specifications
Screen: 7” 1920x1200 HD IPS display (323 ppi)
Memory: 128 GB internal storage, 4 GB RAM
Camera: 4 MP 2-micrometer RGB-IR pixel sensor, 1 MP front facing
Processor: NVIDIA Tegra K1 (w/ 192 CUDA cores)
Battery: 4960 mAh cell
Software
Android 4.4 (KitKat)
Store everything in Drive.
Everything you are about to see exists today.
Project Tango Concepts
Motion Tracking
Area Learning
Depth Perception
Overview
Motion Tracking
Project Tango’s core functionality is measuring movement through space and understanding the area moved through. Google APIs provide the position and orientation of the user’s device in full six degrees of freedom, referred to as its pose.
Pose
Project Tango devices combine the camera, gyroscope and accelerometer to estimate six degrees of freedom motion tracking, providing developers the ability to track 3D motion of a device while simultaneously creating a map of the environment.
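The six-degrees-of-freedom pose described above can be pictured as a small data structure: three translational degrees (position) plus three rotational degrees (orientation, commonly stored as a unit quaternion). The sketch below is purely illustrative; the class and field names are invented here and are not the actual Tango API.

```python
from dataclasses import dataclass
import math

@dataclass
class Pose:
    """Illustrative 6-DoF pose: 3 translational + 3 rotational degrees of freedom."""
    translation: tuple  # (x, y, z) position in meters
    orientation: tuple  # unit quaternion (x, y, z, w) encoding rotation

    def distance_to(self, other):
        """Straight-line distance between two device positions."""
        return math.dist(self.translation, other.translation)

# Two pose samples as the device moves through a room:
start = Pose((0.0, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0))
later = Pose((3.0, 4.0, 0.0), (0.0, 0.0, 0.0, 1.0))
print(later.distance_to(start))  # 5.0 (meters travelled in a straight line)
```

Tracking a stream of such poses over time is what lets the device build a map of the environment as it moves.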
Area Learning
Using area learning, a Project Tango device can remember the visual features of the area it is moving through and recognize when it sees those features again. These features can be saved in an Area Description File (ADF) for use again later. With an ADF loaded, Project Tango devices gain a new capability called drift correction, also known as improved motion tracking.
What is Drift Correction?
Using area learning, a Project Tango device can remember the visual features of the area it has visited and use them to correct errors in its understanding of its position, orientation and movement. This differs from motion tracking alone, which has no memory of the environment. This memory allows the system to perform drift correction, also called loop closures.
When the device comes back to a place it has already visited, it realizes it has travelled in a loop and adjusts its path to be consistent with its previous observations, thereby improving the accuracy of the device’s estimated position and trajectory.
Depth Perception
Project Tango devices are equipped with integrated 3D sensors that measure the distance from a device to objects in the real world. This configuration gives good depth at a distance while balancing power requirements for infrared illumination and depth processing.
The depth data allows an application to understand the distance of visible objects to the device. By combining depth perception with motion tracking, you can also measure distance between points in an area that aren’t in the same frame.
Project Tango APIs provide a function to get depth information in the form of a point cloud. This format gives (x, y, z) coordinates for as many points in the scene as are possible to calculate. Each dimension is a floating point value recording the position of each point in meters in the coordinate frame of the depth sensing camera.
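A point cloud in the format described above is just a list of (x, y, z) metric coordinates, which an application can query directly. The toy data and helper below are invented for illustration; they show the shape of the data, not the actual Tango API call.

```python
import math

# A toy point cloud: (x, y, z) in meters, expressed in the depth camera's
# coordinate frame as described above.  Real Tango clouds hold thousands
# of points per frame.
cloud = [(0.1, 0.0, 1.2), (-0.3, 0.2, 2.5), (0.0, -0.1, 0.8)]

def nearest_point(cloud):
    """Return the point closest to the device (the depth camera sits at
    the origin of its own coordinate frame)."""
    return min(cloud, key=lambda p: math.hypot(*p))

print(nearest_point(cloud))  # (0.0, -0.1, 0.8)
```

Because every coordinate is already in meters, measuring the distance to an object, or between any two points in the cloud, is simple arithmetic.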
$512
A truly open project
The future is awesome.
Akhil - A - Nair