Mixed Reality from demo to product


Transcript of Mixed Reality from demo to product

Page 1: Mixed Reality from demo to product

[email protected]

@matteovaloriani

www.fifthingenium.com

Mixed Reality: From Demos to a Product
Matteo Valoriani

Torino, 23 September 2016

Page 2: Mixed Reality from demo to product

Sponsor

Page 3: Mixed Reality from demo to product

Nice to Meet You

Matteo Valoriani
CEO of FifthIngenium
PhD at Politecnico di Milano
Speaker and Consultant

Microsoft MVP – Emerging Experiences

Intel Software Innovator

mvaloriani at gmail.com
@MatteoValoriani
Slideshare: www.slideshare.net/MatteoValoriani
LinkedIn: https://it.linkedin.com/in/matteovaloriani
Blog: http://fifthingenium.com/blog
GitHub: https://github.com/mvaloriani

Page 4: Mixed Reality from demo to product

Agenda

Inside the HoloLens

Hardware Review

HoloLens Optics

Holographic Processing Unit v 1.0

HoloLens UX

HoloLens UX overview

HoloLens UX Tips

Fast Developer Tips

What NEXT?

Page 5: Mixed Reality from demo to product

Inside the HoloLens


Page 6: Mixed Reality from demo to product

HoloLens Prototype

Page 7: Mixed Reality from demo to product

HoloLens Device – Comfort

Page 8: Mixed Reality from demo to product

HoloLens Device – Comfort

Page 9: Mixed Reality from demo to product

HoloLens Device – Sensor Bar

Page 10: Mixed Reality from demo to product

HoloLens Device – Sensor Bar

Ambient light sensor

2 MP photo camera (2048×1152) / HD video camera (1408×792)

Depth camera: IR camera based on time-of-flight; used for hand tracking, surface reconstruction, and object position

4 environment-understanding cameras (grayscale) that build a map of the room


Page 11: Mixed Reality from demo to product

HoloLens Device – Optics

Page 12: Mixed Reality from demo to product

HoloLens Device – Optics

IMU: gyroscope + magnetometer + accelerometer; fast position updates (<10 ms)

2 HD 16:9 light engines that project images onto the lenses

Holographic resolution: 2.3M total light points

Holographic density: >2.5k radiants (light points per radian)

Automatic pupillary distance calibration

See-through holographic lenses (waveguides) with separate R, G, and B layers


Page 13: Mixed Reality from demo to product

HoloLens Device – Internal Hardware

Page 14: Mixed Reality from demo to product

HoloLens Device – Internal Hardware

Memory: 64 GB flash / 2 GB RAM

Processor: Intel Atom x5-Z8100, 1.04 GHz, 64-bit, Intel Airmont (14 nm), 4 logical cores

GPU: Intel 8086h; dedicated video memory 114 MB, shared system memory 980 MB

Custom-built Microsoft Holographic Processing Unit (HPU 1.0)

Battery: 16,500 mWh

Weight: 579 g


Page 15: Mixed Reality from demo to product

HoloLens Device – 3D Spatial Sound

Page 16: Mixed Reality from demo to product

HoloLens Device – 3D Spatial Sound

Built-in speakers.

A precise audio experience without headphones that is immersive, yet won’t block out the real world.

Spatial sound.

Using a scientific model that characterizes how the human ear receives sound from a specific location, Microsoft HoloLens synthesizes sound so that you can hear holograms from anywhere in the room.
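In standard audio terms, the "scientific model" described above is a head-related transfer function (HRTF). As an illustrative sketch (the slide does not give the formula), binaural spatialization convolves a mono source with a pair of direction-dependent HRTF impulse responses:

$$ y_L(t) = (h_{L,\theta,\phi} * x)(t), \qquad y_R(t) = (h_{R,\theta,\phi} * x)(t) $$

where $x(t)$ is the hologram's sound, $(\theta,\phi)$ its direction relative to the listener's head, and $h_L, h_R$ the left- and right-ear responses measured for that direction.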


Page 17: Mixed Reality from demo to product

HoloLens Optics


Page 18: Mixed Reality from demo to product

Optical vs Video See-Through

Page 19: Mixed Reality from demo to product

Device modules

Imaging Optics

Combiner Optics

Head Tracking

Gesture Sensing

EPE

Display

Page 20: Mixed Reality from demo to product

TIR Prism Combiners

Prisms and beam splitters: Prisms are transparent optical elements that bend and redirect light. Beam splitters use similar technology to split light and send it in two directions simultaneously. The reflected light is projected directly onto the user’s retina. Google Glass uses a prism to redirect the image into the eye.

Mirrors: The basis for many optical instruments, mirrors can be used to redirect and focus light. Depending on how they are designed and manufactured, they can transmit light from one direction and reflect light from another. Osterhout Design Group (ODG) uses a mirror with a special coating in its R-7 smartglasses.
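Both prism and waveguide combiners keep light bouncing inside the optic via total internal reflection (TIR). As a standard optics aside (not from the slides): TIR occurs when light in a medium of index $n_1$ hits the boundary with a medium of index $n_2 < n_1$ at an angle beyond the critical angle,

$$ \theta_c = \arcsin\!\left(\frac{n_2}{n_1}\right). $$

For a glass guide ($n_1 \approx 1.5$) in air ($n_2 = 1$), $\theta_c \approx 41.8^\circ$, so any ray steeper than that stays trapped inside until a grating or mirror extracts it.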

Page 21: Mixed Reality from demo to product

General concept of a waveguide

Waveguides: These devices channel light along a path as in an optical fiber, and they are used widely in telecommunications and electronics. In smartglasses, waveguides direct light from tiny displays housed in the temples of the glasses toward the lenses in front of the eye. Vuzix was the first to use waveguides in 2013.

Page 22: Mixed Reality from demo to product

Half-Tone Reflective Waveguide Combiners

Page 23: Mixed Reality from demo to product

Diffractive Extraction – Exit Pupil Expansion

Nanometer-wide structures, or gratings, are placed on the surface of the waveguide at the location where we want to extract an image. The grating effectively creates an interference pattern that diffracts the light out and can even enlarge the image. This is known as a surface relief grating (SRG).
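As a hedged sketch of the underlying physics (a standard relation, not spelled out on the slide), the out-coupling angle of such a grating follows the diffraction grating equation:

$$ d\,(\sin\theta_m - \sin\theta_i) = m\,\lambda, \qquad m = 0, \pm 1, \pm 2, \dots $$

where $d$ is the grating period, $\theta_i$ the incidence angle inside the guide, $\theta_m$ the angle of the $m$-th diffracted order, and $\lambda$ the wavelength in the medium; choosing $d$ on the order of the wavelength (hence the nanometer-scale structures) steers a chosen order out of the waveguide toward the eye.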

EPE literally means making the image bigger (expanding it) so that it covers as much of the exit pupil as possible: your eye, plus every area your pupil might move to as you rotate your eyeball to take in the field of view (roughly a 10 mm × 8 mm rectangle, or eye box).

Original work by Nokia on 1D EPE waveguide grating combiners (1995)

Page 24: Mixed Reality from demo to product

Diffractive-Holographic Waveguide Combiners with EPE


Page 25: Mixed Reality from demo to product

Light field

This term is defined as the amount of light flowing in every direction through every point in space. It is emerging as an alternative method for displaying 3-D objects that appear more realistic than those created by providing different left and right images in a stereoscopic display.
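Formally (an aside not on the slide), this is the plenoptic function: radiance as a function of position and viewing direction,

$$ L = L(x, y, z, \theta, \phi), $$

a 5D quantity that reduces to a 4D light field in free space, where radiance stays constant along each ray.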

Magic Leap states it is using light field technology in its smartglasses.

Page 26: Mixed Reality from demo to product

Holographic Processing Unit v 1.0


Page 27: Mixed Reality from demo to product

HoloLens Hardware Blocks

Page 28: Mixed Reality from demo to product

Holographic Processing Unit v 1.0

TSMC-fabricated 28 nm co-processor.

24 Tensilica DSP cores (12 clusters)

65 million logic gates (about 50% used)

8 MB of SRAM

1GB DRAM

1 trillion operations per second

Sensor aggregator with gesture and environment processing

200× speed-up over a software implementation

Low Power (<10 Watts)

Package size: 12 mm × 12 mm
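A back-of-envelope check using only the figures above (an illustration, not an official spec):

$$ \frac{10^{12}\ \text{ops/s}}{24\ \text{cores}} \approx 4.2 \times 10^{10}\ \text{ops/s per DSP core}, \qquad \frac{10^{12}\ \text{ops/s}}{10\ \text{W}} = 100\ \text{GOPS/W} $$

so each Tensilica core sustains on the order of 40 giga-operations per second, and with the power draw below 10 W the chip delivers more than 100 giga-operations per second per watt.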

Page 29: Mixed Reality from demo to product

Field of View

Page 30: Mixed Reality from demo to product

Remember HW Limits

Goals

• Frame rate: 60 fps (see the frame-budget arithmetic below)

• Memory < 900 MB Total Commit
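As a quick sanity check on the 60 fps goal (simple arithmetic, not from the slide):

$$ t_{\text{frame}} = \frac{1}{60\ \text{fps}} \approx 16.7\ \text{ms} $$

so all CPU-side updates and GPU rendering for a frame must together fit in roughly 16.7 ms, or the display starts dropping frames and holograms visibly judder.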

The biggest factors for CPU performance are:

• Too many objects being rendered (try to keep this under 100 unique Renderers or UI elements)

• Expensive updates or too many object updates

• Hitches due to garbage collection

• Expensive graphics settings and shaders (shadows, reflection probes, etc.)

https://developer.microsoft.com/en-us/windows/holographic/performance_recommendations_for_unity

Page 31: Mixed Reality from demo to product

Optimize Player

Go to the player settings by navigating to the "Edit > Project Settings > Player" page and click on the "Windows Store" tab.

Use shader preloading: preloading means you won't see any hitches due to runtime shader compilation.

Make sure "Rendering > Rendering Path" is set to Forward (this is the default).

The "Use 16-bit Depth Buffers" setting allows you to enable 16-bit depth buffers, which drastically reduces the bandwidth (and thus power) associated with depth buffer traffic.

Page 32: Mixed Reality from demo to product

HoloLens UX


Page 33: Mixed Reality from demo to product

Object Interaction

Page 34: Mixed Reality from demo to product

Move Objects

Page 35: Mixed Reality from demo to product

Rotate Objects

Page 36: Mixed Reality from demo to product

Small and Large Interaction

Page 37: Mixed Reality from demo to product

Object size / Interaction space

Page 38: Mixed Reality from demo to product

Menu

Page 39: Mixed Reality from demo to product

Menu

Page 40: Mixed Reality from demo to product

Maximize 3D Use

Page 41: Mixed Reality from demo to product

Feedback

Page 42: Mixed Reality from demo to product

Learnability

Page 43: Mixed Reality from demo to product

Looking Indicator / Target

Page 44: Mixed Reality from demo to product

Looking Indicator / Target

Page 45: Mixed Reality from demo to product

Incidental Interaction

Page 46: Mixed Reality from demo to product

Space Limits

Page 47: Mixed Reality from demo to product

Security

Page 48: Mixed Reality from demo to product

Takeaways

Page 49: Mixed Reality from demo to product

Minimize Fatigue

Gestural interaction involves more muscles than keyboard interaction or speech.

Gestural interactions must therefore be concise and quick, and must minimize the user's effort and physical stress.

Two types of muscular stress are known:

• static, the effort required to maintain a posture for a fixed amount of time;

• dynamic, related to the effort required to move a portion of the body through a trajectory.

Page 50: Mixed Reality from demo to product

Favor ease of learning (Learnability) 1/2

It must be easy for the user to learn how to perform and remember interactions, minimizing the mental load of recalling the associated actions.

The learning rate depends on the tasks, the user's experience and skills, and the size of the gesture language (more gestures decrease the learnability rate).

Page 51: Mixed Reality from demo to product

Favor ease of learning (Learnability) 2/2

The interactions that are most natural, easy to learn, and immediately assimilated by the user are those that belong to everyday life or involve the least physical effort.

Complex interactions can be more expressive and give more control, but carry a higher learnability burden.

Hence there is clearly a tension between design requirements, among which a compromise must be made: naturalness of interaction, minimum size of the gesture language, expressiveness, and completeness of the interaction.

Page 52: Mixed Reality from demo to product

Intentionality (Immersion Syndrome)

Users can perform unintended gestures, i.e., movements that are not meant to communicate with the system they are interacting with.

The "immersion syndrome" occurs when every movement is interpreted by the system, whether or not it was intended, and can trigger interaction effects against the user's will.

You therefore need to design how the system reacts to unpredicted user interaction.

Page 53: Mixed Reality from demo to product

Not-self-revealing

Appropriate feedback indicating the effects and correctness of the interaction performed is necessary for successful interaction and to improve the user's confidence in the system.

Page 54: Mixed Reality from demo to product

Developer Tips


Page 55: Mixed Reality from demo to product

What next?


Page 56: Mixed Reality from demo to product


Page 57: Mixed Reality from demo to product

Project Alloy

Wireless VR/MR device

2 RealSense cameras for spatial and hand tracking

Page 58: Mixed Reality from demo to product

Commercial options

Kiosk mode. With HoloLens kiosk mode, you can limit which apps can run, to enable demo or showcase experiences.

Mobile Device Management (MDM) for HoloLens. Your IT department can manage multiple HoloLens devices simultaneously using solutions like Microsoft Intune. You will be able to manage settings, select apps to install, and set security configurations tailored to your organization's needs.

Identity. Azure Active Directory and next-generation credentials with PIN unlock.

Windows Update for Business. Controlled operating system updates to devices and support for the long-term servicing branch.

Data security. BitLocker data encryption and secure boot are enabled on HoloLens to provide the same level of security protection as any other Windows device.

Work access. Anyone in your organization can remotely connect to the corporate network through a virtual private network on a HoloLens. HoloLens can also access Wi-Fi networks that require credentials.

Windows Store for Business. Your IT department can also set up an enterprise private store, containing only your company's apps for your specific HoloLens usage. Securely distribute your enterprise software to a selected group of enterprise users.

Page 59: Mixed Reality from demo to product

Meta 2

Page 60: Mixed Reality from demo to product

Magic Leap

Page 61: Mixed Reality from demo to product

Daqri

Page 62: Mixed Reality from demo to product

HTC Vive

Page 63: Mixed Reality from demo to product

Oculus Rift 2

Page 64: Mixed Reality from demo to product

Vuzix

Page 65: Mixed Reality from demo to product

PlayStation VR

Page 66: Mixed Reality from demo to product

Thanks

Page 67: Mixed Reality from demo to product

References

• https://www.youtube.com/watch?v=zuzK3amWFzg

• https://developer.microsoft.com/en-us/windows/holographic/hardware_details

• https://www.youtube.com/watch?v=u0eBd2m_wEs&app=desktop

• http://www.techtimes.com/articles/175034/20160826/microsoft-reveals-more-about-hololens-hardware.htm

• http://www.techtimes.com/articles/174764/20160823/microsoft-reveals-specs-of-hololens-holographic-processing-unit.htm

• http://www.tomshardware.com/news/microsoft-hololens-components-hpu-28nm,32546.html

• https://books.google.it/books?id=qPU2DAAAQBAJ&pg=PT136&lpg=PT136&dq=hololens+spatial+resolution&source=bl&ots=rZxOQzXlj7&sig=by_ssS7gRYaL_viWpmPBACMQffU&hl=en&sa=X&ved=0ahUKEwj0ueLXiZnPAhUBthoKHVpfCL44ChDoAQhUMAk#v=onepage&q=hololens%20spatial%20resolution&f=false

• http://doc-ok.org/?p=1329

• http://www.pwc.com/us/en/technology-forecast/augmented-reality/optic-breakthroughs-reshaping-augmented-reality.html

• http://www.slideshare.net/marknb00/a-survey-of-augmented-reality

• https://www.microsoft.com/microsoft-hololens/en-us/hololens-commercial
