
A

Seminar Report

On

FEMTO PHOTOGRAPHY

In partial fulfillment of requirements for the degree of

Bachelor of Technology

SUBMITTED BY

GAURAV CHAUHAN

1101731020

Under the Guidance of

Mr. ASHISH KUMAR, HEAD OF

DEPARTMENT OF ELECTRONICS AND COMMUNICATION

ENGINEERING, BIJNOR

APRIL 2014

KUNWAR SATYAVIRA COLLEGE OF

ENGINEERING BIJNOR

CERTIFICATE

This is to certify that GAURAV CHAUHAN, Roll No. 1101731020, has successfully completed his report entitled "FEMTO PHOTOGRAPHY" in the seminar presentation of Third Year B.Tech in Electronics and Communication Engineering of Maha Maya Technical University, session 2013-14.

The matter in this report is a record of work carried out by him under our supervision and guidance.

We wish him all success in his future endeavours.

(ASHISH KUMAR) (NISHI CHANDRA)

Head of Department Seminar Incharge

E&C Engineering E&C Engineering

Acknowledgement

While preparing my seminar report, I was helped and assisted by many individuals. This seminar report could not have been completed without help from numerous sources and people, and I gratefully acknowledge them for their generous help in providing the relevant data along with their valuable guidance.

Firstly, I am very thankful to my parents for their blessings, continuous support, and encouragement during the completion of this seminar report. I am also thankful to Mr. ASHISH KUMAR, HOD, Electronics and Communication Department, for granting me permission to take up this topic, and I would also like to express my sincere gratitude for his guidance and kind support at every step I needed.

At last, I would like to thank each and every person who directly or indirectly helped me during the completion of this assignment.

GAURAV CHAUHAN

1101731020

ECE Third Year

TABLE OF CONTENTS

I. Certificate

II. Acknowledgement

Chapter 1

Femto Photography: An Introduction

1.1 Introduction

1.2 Femto Photography

1.3 Challenges

1.4 Contributions

1.5 Limitations

Chapter 2

RELATED WORK

2.1 Time-Resolved Imaging

Chapter 3

Capturing Space-Time Planes

3.1 System

3.2 Capturing Space-Time Planes

3.3 Performance Validation

Chapter 4

Capturing Space-Time Volumes

Chapter 5

Depicting Ultrafast Videos in 2D

5.1 Peak Time Images

5.2 Integral Photo Fusion

Chapter 6

Time Unwarping

Chapter 7

Captured Scenes

7.1 Bottle

7.2 Crystal

Chapter 8

CONCLUSION

Chapter 9

REFERENCES

ABSTRACT

We present femto-photography, a novel imaging technique to capture and visualize the propagation of light. With an effective exposure time of 1.85 picoseconds (ps) per frame, we reconstruct movies of ultrafast events at an equivalent resolution of about one half trillion frames per second. Because cameras with this shutter speed do not exist, we re-purpose modern imaging hardware to record an ensemble average of repeatable events that are synchronized to a streak sensor, in which the time of arrival of light from the scene is coded in one of the sensor's spatial dimensions. We introduce reconstruction methods that allow us to visualize the propagation of femtosecond light pulses through macroscopic scenes; at such fast resolution, we must consider the notion of time-unwarping between the camera's and the world's space-time coordinate systems to take into account effects associated with the finite speed of light. We apply our femto-photography technique to visualizations of very different scenes, which allow us to observe the rich dynamics of time-resolved light transport effects, including scattering, specular reflections, diffuse interreflections, diffraction, caustics, and subsurface scattering. Our work has potential applications in artistic, educational, and scientific visualizations; industrial imaging to analyze material properties; and medical imaging to reconstruct subsurface elements. In addition, our time-resolved technique may motivate new forms of computational photography.

Chapter 1: Introduction

Forward and inverse analysis of light transport plays an important role in diverse fields, such as computer graphics, computer vision, and scientific imaging. Because conventional imaging hardware is slow compared to the speed of light, traditional computer graphics and computer vision algorithms typically analyze transport using low time-resolution photos. Consequently, any information that is encoded in the time delays of light propagation is lost. Whereas the joint design of novel optical hardware and smart computation, i.e., computational photography, has expanded the way we capture, analyze, and understand visual information, speed-of-light propagation has been largely unexplored. In this paper, we present a novel ultrafast imaging technique, which we term femto-photography, consisting of femtosecond laser illumination, picosecond-accurate detectors, and mathematical reconstruction techniques, allowing us to visualize movies of light in motion as it travels through a scene, with an effective framerate of about one half trillion frames per second. This allows us to see, for instance, a light pulse scattering inside a plastic bottle, or image formation in a mirror, as a function of time.

Challenges: Developing such a time-resolved system is a challenging problem, for several reasons that are under-appreciated in conventional methods: (a) brute-force time exposures under 2 ps yield an impractical signal-to-noise ratio (SNR); (b) suitable cameras to record 2D image sequences at this time resolution do not exist, due to sensor bandwidth limitations; (c) comprehensible visualization of the captured time-resolved data is non-trivial; and (d) direct measurements of events appear warped in space-time, because the finite speed of light implies that the recorded light propagation delay depends on camera position relative to the scene.

Contributions: Our main contribution is in addressing these challenges and creating a first prototype, as follows:

- We exploit the statistical similarity of periodic light transport events to record multiple ultrashort exposure times of one-dimensional views (Section 3).
- We introduce a novel hardware implementation to sweep the exposures across a vertical field of view, to build 3D space-time data volumes (Section 4).
- We create techniques for comprehensible visualization, including movies showing the dynamics of real-world light transport phenomena (including reflections, scattering, diffuse inter-reflections, or beam diffraction), and the notion of peak-time, which partially overcomes the low-frequency appearance of integrated global light transport (Section 5).
- We introduce a time-unwarping technique to correct the distortions in captured time-resolved information due to the finite speed of light (Section 6).

Limitations: Although not conceptual, our setup has several practical limitations, primarily due to the limited SNR of scattered light. Since the hardware elements in our system were originally designed for different purposes, the system is not optimized for efficiency and suffers from low optical throughput (e.g., the detector is optimized for 500 nm visible light, while the infrared laser wavelength we use is 795 nm) and from dynamic range limitations. This lengthens the total recording time to approximately one hour. Furthermore, the scanning mirror, rotating continuously, introduces some blurring in the data along the scanned (vertical) dimension. Future optimized systems can overcome these limitations.

Chapter 2: Related Work

Ultrafast Devices: The fastest 2D continuous, real-time monochromatic camera operates at hundreds of nanoseconds per frame [Goda et al. 2009] (about 6 × 10^6 frames per second), with a spatial resolution of 200 × 200 pixels, less than one third of what we achieve. Avalanche photodetector (APD) arrays can reach temporal resolutions of several tens of picoseconds if they are used in a photon-starved regime, where only a single photon hits a detector within a time window of tens of nanoseconds [Charbon 2007]. Repetitive illumination techniques used in incoherent LiDAR [Tou 1995; Gelbart et al. 2002] use cameras with typical exposure times on the order of hundreds of picoseconds [Busck and Heiselberg 2004; Colaço et al. 2012], two orders of magnitude slower than our system. Liquid nonlinear shutters actuated with powerful laser pulses have been used to capture single analog frames imaging light pulses at picosecond time resolution [Duguay and Mattick 1971]. Other sensors that use a coherent phase relation between the illumination and the detected light, such as optical coherence tomography (OCT) [Huang et al. 1991], coherent LiDAR [Xia and Zhang 2009], light-in-flight holography [Abramson 1978], or white light interferometry [Wyant 2002], achieve femtosecond resolutions; however, they require light to maintain coherence (i.e., wave interference effects) during light transport, and are therefore unsuitable for indirect illumination, in which diffuse reflections remove coherence from the light. Simple streak sensors capture incoherent light at picosecond to nanosecond speeds, but are limited to a line or low-resolution (20 × 20) square field of view [Campillo and Shapiro 1987; Itatani et al. 2002; Shiraga et al. 1995; Gelbart et al. 2002; Kodama et al. 1999; Qu et al. 2006]. They have also been used as line scanning devices for image transmission through highly scattering turbid media, by recording the ballistic photons, which travel a straight path through the scatterer and thus arrive first on the sensor [Hebden 1993].

Figure 3. Left: Photograph of our ultrafast imaging system setup. The DSLR camera takes a conventional photo for comparison. Right: Time sequence illustrating the arrival of the pulse striking a diffuser, its transformation into a spherical energy front, and its propagation through the scene. The corresponding captured scene is shown in Figure 10 (top row).

The principles that we develop in this paper for the purpose of transient imaging were first demonstrated by Velten et al. [2012c]. Recently, photonic mixer devices along with nonlinear optimization have also been used in this context [Heide et al. 2013]. Our system can record and reconstruct space-time world information of incoherent light propagation in free-space, table-top scenes, at a resolution of up to 672 × 1000 pixels and under 2 picoseconds per frame. The varied range and complexity of the scenes we capture allow us to visualize the dynamics of global illumination effects, such as scattering, specular reflections, interreflections, subsurface scattering, caustics, and diffraction.

Time-Resolved Imaging: Recent advances in time-resolved imaging have been exploited to recover geometry and motion around corners [Raskar and Davis 2008; Kirmani et al. 2011; Velten et al. 2012b; Velten et al. 2012a; Gupta et al. 2012; Pandharkar et al. 2011] and albedo from a single viewpoint [Naik et al. 2011]. But none of them explored the idea of capturing videos of light in motion in direct view, and they have some fundamental limitations (such as capturing only third-bounce light) that make them unsuitable for the present purpose. Wu et al. [2012a] separate direct and global illumination components from time-resolved data captured with the system we describe in this paper, by analyzing the time profile of each pixel. In a recent publication [Wu et al. 2012b], the authors present an analysis of transient light transport in frequency space, and show how it can be applied to bare-sensor imaging.

Chapter 3: Capturing Space-Time Planes

We capture time scales orders of magnitude faster than the exposure times of conventional cameras, in which photons reaching the sensor at different times are integrated into a single value, making it impossible to observe ultrafast optical phenomena. The system described in this paper has an effective exposure time down to 1.85 ps; since light travels at 0.3 mm/ps, light travels approximately 0.5 mm between frames in our reconstructed movies.

System: An ultrafast setup must overcome several difficulties in order to accurately measure a high-resolution (both in space and time) image. First, for an unamplified laser pulse, a single exposure time of less than 2 ps would not collect enough light, so the SNR would be unworkably low. As an example, for a table-top scene illuminated by a 100 W bulb, only about 1 photon on average would reach the sensor during a 2 ps open-shutter period. Second, because of the time scales involved, synchronization of the sensor and the illumination must be executed within picosecond precision. Third, standalone streak sensors sacrifice the vertical spatial dimension in order to code the time dimension, thus producing x-t images. As a consequence, their field of view is reduced to a single horizontal line of view of the scene.

We solve these problems with our ultrafast imaging system, outlined in Figure 2. (A photograph of the actual setup is shown in Figure 3 (left).) The light source is a femtosecond (fs) Kerr lens mode-locked Ti:Sapphire laser, which emits 50-fs pulses with a center wavelength of 795 nm, at a repetition rate of 75 MHz and average power of 500 mW. In order to see ultrafast events in a scene with macro-scaled objects, we focus the light with a lens onto a Lambertian diffuser, which then acts as a point light source and illuminates the entire scene with a spherically-shaped pulse (see Figure 3 (right)). Alternatively, if we want to observe pulse propagation itself, rather than the interactions with large objects, we direct the laser beam across the field of view of the camera through a scattering medium (see the bottle scene in Figure 1).

Because all the pulses are statistically identical, we can record the scattered light from many of them and integrate the measurements to average out any noise. The result is a signal with a high SNR. To synchronize this illumination with the streak sensor (Hamamatsu C5680 [Hamamatsu 2012]), we split off a portion of the beam with a glass slide and direct it onto a fast photodetector connected to the sensor, so that now both detector and illumination operate synchronously (see Figure 2 (a)).

Capturing space-time planes: The streak sensor then captures an x-t image of a certain scanline (i.e., a line of pixels in the horizontal dimension) of the scene, with a space-time resolution of 672 × 512. The exact time resolution depends on the amplification of an internal sweep voltage signal applied to the streak sensor. With our hardware, it can be adjusted from 0.30 ps to 5.07 ps. Practically, we choose the fastest resolution that still allows for capture of the entire duration of the event. In the streak sensor, a photocathode converts incoming photons, arriving from each spatial location in the scanline, into electrons. The streak sensor generates the x-t image by deflecting these electrons, according to the time of their arrival, to different positions along the t-dimension of the sensor (see Figures 2(b) and 2(c)). This is achieved by means of rapidly changing the sweep voltage between the electrodes in the sensor. For each horizontal scanline, the camera records a scene illuminated by the pulse, and averages the light scattered by 4.5 × 10^8 pulses (see Figures 2(d) and 2(e)).
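Because all recorded pulses are statistically identical, noise averages out across exposures; for shot-noise-like fluctuations, the residual noise falls roughly as the square root of the number of averaged pulses. The following is a minimal Python/NumPy sketch of this averaging idea, with a toy signal and an arbitrary noise level (the grid size and pulse counts are illustrative only; the real system averages 4.5 × 10^8 pulses per scanline).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ground-truth x-t streak signal for one scanline (small grid so the
# demo runs quickly; the real streak images are 672 x 512).
truth = np.zeros((48, 64))
truth[20:28, 30:38] = 1.0  # a synthetic "pulse" feature

def capture_once(noise_sigma=5.0):
    """One simulated ~2 ps exposure: the signal is buried in noise."""
    return truth + rng.normal(0.0, noise_sigma, truth.shape)

def capture_averaged(n_pulses):
    """Average n statistically identical exposures, as the streak sensor
    does over repeated, synchronized laser pulses."""
    acc = np.zeros_like(truth)
    for _ in range(n_pulses):
        acc += capture_once()
    return acc / n_pulses

for n in (1, 100, 10000):
    residual = np.std(capture_averaged(n) - truth)
    # Residual noise shrinks roughly as sigma / sqrt(n).
    print(f"{n:>6} pulses: residual ~{residual:.3f} (expect ~{5.0/np.sqrt(n):.3f})")
```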

Performance Validation: To characterize the streak sensor, we compare sensor measurements with known geometry, and verify the linearity, reproducibility, and calibration of the time measurements. To do this, we first capture a streak image of a scanline of a simple scene: a plane being illuminated by the laser after hitting the diffuser (see Figure 4 (left)). Then, by using a Faro digitizer arm [Faro 2012], we obtain the ground-truth geometry of the points along that plane and of the point of the diffuser hit by the laser; this allows us to compute the total travel time per path (diffuser-plane-streak sensor) for each pixel in the scanline. We then compare the travel time captured by our streak sensor with the real travel time computed from the known geometry.
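This check amounts to simple geometry: the expected arrival time at each scanline pixel is the total path length (diffuser to plane point to sensor) divided by the speed of light. Below is a sketch under assumed coordinates; all positions, shapes, and names are illustrative, not taken from the actual setup.

```python
import numpy as np

C_MM_PER_PS = 0.3  # speed of light, ~0.3 mm per picosecond

def expected_arrival_times(diffuser_pt, plane_pts, sensor_pt):
    """Total travel time (ps) along each path
    diffuser -> plane point -> streak sensor."""
    d1 = np.linalg.norm(plane_pts - diffuser_pt, axis=1)  # diffuser to plane
    d2 = np.linalg.norm(plane_pts - sensor_pt, axis=1)    # plane to sensor
    return (d1 + d2) / C_MM_PER_PS

# Illustrative geometry (mm): laser spot on the diffuser, sensor position,
# and ground-truth points along one 672-pixel scanline of the plane.
diffuser = np.array([0.0, 0.0, 0.0])
sensor = np.array([0.0, 50.0, 1000.0])
scanline = np.stack([np.linspace(-125.0, 125.0, 672),
                     np.zeros(672),
                     np.full(672, 800.0)], axis=1)

t_expected = expected_arrival_times(diffuser, scanline, sensor)
# Compare against times read off the captured x-t streak image, e.g.:
# assert np.allclose(t_expected, t_measured, atol=2.0)  # within ~2 ps
print(t_expected.min(), t_expected.max())
```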

Chapter 4: Capturing Space-Time Volumes

Although the synchronized, pulsed measurements overcome SNR issues, the streak sensor still provides only a one-dimensional movie. Extension to two dimensions requires unfeasible bandwidths: a typical dimension is roughly 10^3 pixels, so a three-dimensional data cube has 10^9 elements. Recording such a large quantity in a 10^-9 second (1 ns) time window requires a bandwidth of 10^18 bytes per second, far beyond typical available bandwidths. We solve this acquisition problem by again utilizing the synchronized repeatability of the hardware. A mirror-scanning system (two 9 cm × 13 cm mirrors, see Figure 3 (left)) rotates the camera's center of projection, so that it records horizontal slices of a scene sequentially. We use a computer-controlled, one-rpm servo motor to rotate one of the mirrors, and consequently scan the field of view vertically. The scenes are about 25 cm wide and placed about 1 meter from the camera. With high gear ratios (up to 1:1000), the continuous rotation of the mirror is slow enough to allow the camera to record each line for about six seconds, requiring about one hour for 600 lines (our video resolution). We generally capture extra lines above and below the scene (up to 1000 lines), and then crop them to match the aspect ratio of the physical scenes before the movie is reconstructed.

These resulting images are combined into one matrix Mijk, where i = 1...672 and k = 1...512 are the dimensions of the individual x-t streak images, and j = 1...1000 addresses the second spatial dimension y. For a given time instant k, the submatrix Nij contains a two-dimensional image of the scene, with a resolution of 672 × 1000 pixels, exposed for as short as 1.85 ps. Combining the x-t slices of the scene for each scanline yields a 3D x-y-t data volume, as shown in Figure 5 (left). An x-y slice represents one frame of the final movie, as shown in Figure 5 (right).
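Conceptually, assembling the volume is just stacking the per-scanline x-t images along the scanned y dimension and slicing along t to obtain frames. A minimal NumPy sketch follows, with `load_streak_image` a hypothetical placeholder for one scanline capture (note the full float32 volume is on the order of 1.4 GB):

```python
import numpy as np

N_X, N_Y, N_T = 672, 1000, 512  # streak width, scanned lines, time bins

def load_streak_image(j):
    """Hypothetical placeholder: return the 672 x 512 x-t streak image
    captured for scanline j (one per mirror position in the real rig)."""
    return np.zeros((N_X, N_T), dtype=np.float32)

# Stack the x-t slices along the scanned y dimension to build Mijk.
M = np.stack([load_streak_image(j) for j in range(N_Y)], axis=1)
assert M.shape == (N_X, N_Y, N_T)  # the x-y-t data volume

# Fixing the time index k yields the submatrix Nij: one x-y frame of
# the final movie, exposed for as short as 1.85 ps.
def movie_frame(M, k):
    return M[:, :, k]  # shape (672, 1000)
```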

Figure 5. Left: Reconstructed x-y-t data volume, built by stacking individual x-t images (captured with the scanning mirrors). Right: An x-y slice of the data cube represents one frame of the final movie.

Chapter 5: Depicting Ultrafast Videos in 2D

We have explored several ways to visualize the information contained in the captured x-y-t data cube in an intuitive way. First, contiguous Nij slices can be played as the frames of a movie. Figure 1 (bottom row) shows a captured scene (bottle) along with several representative Nij frames. (Effects are described for various scenes in Section 7.) However, understanding all the phenomena shown in a video is not a trivial task, and movies composed of x-y frames, such as the ones shown in Figure 10, may be hard to interpret. Merging a static photograph of the scene from approximately the same point of view with the Nij slices aids in the understanding of light transport in the scenes (see movies within the supplementary video). Although straightforward to implement, the high dynamic range of the streak data requires a nonlinear intensity transformation to extract subtle optical effects in the presence of high-intensity reflections. We employ a logarithmic transformation to this end. We have also explored single-image methods for intuitive visualization of full space-time propagation, such as the color-coding in Figure 1 (right), which we describe in the following paragraphs.

Integral Photo Fusion: By integrating all the frames in novel ways, we can visualize and highlight different aspects of the light flow in one photo. Our photo fusion results are calculated as Nij = Σk wk Mijk, k = 1...512, where wk is a weighting factor determined by the particular fusion method. We have tested several different methods, of which two were found to yield the most intuitive results. The first one is full fusion, where wk = 1 for all k. Summing all frames of the movie provides something resembling a black and white photograph of the scene illuminated by the laser, while showing time-resolved light transport effects. An example is shown in Figure 6 (left) for the alien scene. (More information about the scene is given in Section 7.) A second technique, rainbow fusion, takes the fusion result and assigns a different RGB color to each frame, effectively color-coding the temporal dimension. An example is shown in Figure 6 (middle).

Peak Time Images: The inherent integration in fusion methods, though often useful, can fail to reveal the most complex or subtle behavior of light. As an alternative, we propose peak time images, which illustrate the time evolution of the maximum intensity in each frame. For each spatial position (i, j) in the x-y-t volume, we find the peak intensity along the time dimension, and keep information within two time units to each side of the peak. All other values in the streak image are set to zero, yielding a more sparse space-time volume. We then color-code time and sum up the x-y frames in this new sparse volume, in the same manner as in the rainbow fusion case, but use only every 20th frame in the sum, to create black lines between the equi-time paths, or isochrones. This results in a map of the propagation of maximum-intensity contours, which we term the peak time image. These color-coded isochronous lines can be thought of, intuitively, as propagating energy fronts. Figure 6 (right) shows the peak time image for the alien scene, and Figure 1 (top middle) shows the captured data for the bottle scene depicted using this visualization method. As explained in the next section, this visualization of the bottle scene reveals significant light transport phenomena that could not be seen with the rainbow fusion visualization.
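All three single-image depictions reduce to weighted sums over the time axis of the x-y-t volume. The sketch below illustrates full fusion, rainbow fusion, and peak time images in NumPy, assuming a volume M shaped (x, y, t) as in Chapter 4; the RGB ramp used for the time coding is an arbitrary stand-in for whatever colormap is preferred.

```python
import numpy as np

def full_fusion(M):
    """Full fusion: w_k = 1 for all k. Summing every frame resembles a
    black-and-white photograph of the laser-lit scene."""
    return M.sum(axis=2)

def rainbow_fusion(M):
    """Rainbow fusion: give each frame k its own RGB color before the
    sum, color-coding the temporal dimension. The ramp below is an
    arbitrary early-red / late-blue stand-in for a real colormap."""
    n_t = M.shape[2]
    ramp = np.linspace(0.0, 1.0, n_t)
    colors = np.stack([1.0 - ramp,                # R fades over time
                       1.0 - np.abs(2*ramp - 1),  # G peaks mid-sequence
                       ramp], axis=1)             # B grows over time
    return np.einsum('xyt,tc->xyc', M, colors)

def peak_time_image(M, half_width=2, stride=20):
    """Peak time image: keep intensity only within `half_width` time
    units of each pixel's temporal peak, zero the rest, then rainbow-
    fuse every `stride`-th frame so black gaps separate isochrones."""
    k_peak = M.argmax(axis=2)
    t = np.arange(M.shape[2])[None, None, :]
    sparse = np.where(np.abs(t - k_peak[:, :, None]) <= half_width, M, 0.0)
    return rainbow_fusion(sparse[:, :, ::stride])
```

With M shaped (672, 1000, 512) as above, each function returns a single 2D image: grayscale for full fusion, RGB for the other two.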

Figure 7. Understanding reversal of events in captured videos. Left: Pulsed light scatters from a source, strikes a surface (e.g., at P1 and P2), and is then recorded by a sensor. The time taken by light to travel distances z1 + d1 and z2 + d2 is responsible for the existence of two different time frames, and the need for computational correction to visualize the captured data in the world time frame. Right: Light appears to be propagating from P2 to P1 in camera time (before unwarping), and from P1 to P2 in world time, once time-unwarped. Extended planar surfaces will intersect constant-time paths to produce either elliptical or circular fronts.

Chapter 6: Time Unwarping

Visualization of the captured movies (Sections 5 and 7) reveals results that are counter-intuitive to theoretical and established knowledge of light transport. Figure 1 (top middle) shows a peak time visualization of the bottle scene, where several abnormal light transport effects can be observed: (1) the caustics on the floor, which propagate towards the bottle, instead of away from it; (2) the curved spherical energy fronts in the label area, which should be rectilinear as seen from the camera; and (3) the pulse itself being located behind these energy fronts, when it would need to precede them. These are due to the fact that, usually, light propagation is assumed to be infinitely fast, so that events in world space are assumed to be detected simultaneously in camera space. In our ultrafast photography setup, however, this assumption no longer holds, and the finite speed of light becomes a factor: we must now take into account the time delay between the occurrence of an event and its detection by the camera sensor.

We therefore need to consider two different time frames, namely world time (when events happen) and camera time (when events are detected). This duality of time frames is explained in Figure 7: light from a source hits a surface first at point P1 = (i1, j1) (with (i, j) being the x-y pixel coordinates of a scene point in the x-y-t data cube), then at the farther point P2 = (i2, j2), but the reflected light is captured in the reverse order by the sensor, due to different total path lengths (z1 + d1 > z2 + d2). Generally, this is due to the fact that, for light to arrive at a given time instant t0, all the rays from the source, to the wall, to the camera, must satisfy zi + di = ct0, so that isochrones are elliptical. Therefore, although objects closer to the source receive light earlier, they can still lie on a higher (later-time) isochrone than farther ones.

In order to visualize all light transport events as they have occurred (not as the camera captured them), we transform the captured data from camera time to world time, a transformation which we term time unwarping. Mathematically, for a scene point P = (i, j), we apply the following transformation:

t'ij = tij + zij / (c/η)    (1)

where t'ij and tij represent camera and world times respectively, c is the speed of light in vacuum, η the index of refraction of the medium, and zij is the distance from point P to the camera. For our table-top scenes, we measure this distance with a Faro digitizer arm, although it could be obtained from the data and the known position of the diffuser, as the problem is analogous to that of bistatic LiDAR. We can thus define the light travel time from each point (i, j) in the scene to the camera as Δtij = t'ij − tij = zij / (c/η). Then, time unwarping effectively corresponds to offsetting data in the x-y-t volume along the time dimension, according to the value of Δtij for each of the (i, j) points, as shown in Figure 8.

In most of the scenes, we only have propagation of light through air, for which we take η ≈ 1. For the bottle scene, we assume that the laser pulse travels along its longitudinal axis at the speed of light, and that only a single scattering event occurs in the liquid inside. We take η = 1.33 as the index of refraction of the liquid, and ignore refraction at the bottle's surface. A step-by-step unwarping process is shown in Figure 9 for a frame (i.e., x-y image) of the bottle scene. Our unoptimized Matlab code runs at about 0.1 seconds per frame. A time-unwarped peak-time visualization of the whole of this scene is shown in Figure 1 (right). Notice how now the caustics originate from the bottle and propagate outward, energy fronts along the label are correctly depicted as straight lines, and the pulse precedes related phenomena, as expected.
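In code, time unwarping is a per-pixel shift of the captured volume along its time axis by Δtij = zij/(c/η), rounded to the frame spacing. Below is a minimal sketch, assuming a per-pixel depth map z in millimeters (measured with the digitizer arm, as above) and the 1.85 ps frame spacing; it illustrates the offsetting described here, not the authors' actual Matlab implementation.

```python
import numpy as np

C_MM_PER_PS = 0.3  # speed of light in vacuum, mm/ps

def time_unwarp(M, z, eta=1.0, frame_dt_ps=1.85):
    """Shift each pixel's time profile from camera time to world time:
    t_world = t_camera - z/(c/eta). M has shape (nx, ny, nt); z holds
    per-pixel distances to the camera, in mm."""
    nx, ny, nt = M.shape
    # Per-pixel delay in frames, from equation (1).
    shift = np.rint(z / (C_MM_PER_PS / eta) / frame_dt_ps).astype(int)
    out = np.zeros_like(M)
    for i in range(nx):
        for j in range(ny):
            s = shift[i, j]
            if s < nt:
                # The event detected at camera frame k happened at
                # world frame k - s, so advance the profile by s frames.
                out[i, j, :nt - s] = M[i, j, s:]
    return out
```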

Chapter 7: Captured Scenes

We have used our ultrafast photography setup to capture interesting light transport effects in different scenes. Figure 10 summarizes them, showing representative frames and peak time visualizations. The exposure time for our scenes is between 1.85 ps, for the crystal scene, and 5.07 ps, for the bottle and tank scenes, which required imaging a longer time span for better visualization. Please refer to the video in the supplementary material to watch the reconstructed movies. Overall, observing light in such slow motion reveals both subtle and key aspects of light transport. We provide here brief descriptions of the light transport effects captured in the different scenes.

Bottle: This scene is shown in Figure 1 (bottom row), and has been used to introduce time-unwarping. A plastic bottle, filled with water diluted with milk, is directly illuminated by the laser pulse, entering through the bottom of the bottle along its longitudinal axis. The pulse scatters inside the liquid; we can see the propagation of the wavefronts. The geometry of the bottle neck creates some interesting lens effects, making light look almost like a fluid. Most of the light is reflected back from the cap, while some is transmitted or trapped in subsurface scattering phenomena. Caustics are generated on the table.

Tomato-tape: This scene shows a tomato and a tape roll, with a wall behind them. The propagation of the spherical wavefront, after the laser pulse hits the diffuser, can be seen clearly as it intersects the floor and the back wall (A, B). The inside of the tape roll is out of the line of sight of the light source and is not directly illuminated. It is illuminated later, as indirect light scattered from the first wave reaches it (C). Shadows become visible only after the object has been illuminated. The more opaque tape darkens quickly after the light front has passed, while the tomato continues glowing for a longer time, indicative of stronger subsurface scattering (D).

Alien: A toy alien is positioned in front of a mirror and wall. Light interactions in this scene are extremely rich, due to the mirror, the multiple interreflections, and the subsurface scattering in the toy. The video shows how the reflection in the mirror is actually formed: direct light first reaches the toy, but the mirror is still completely dark (E); eventually, light leaving the toy reaches the mirror, and the reflection is dynamically formed (F). Subsurface scattering is clearly present in the toy (G), while multiple direct and indirect interactions between the wall and the mirror can also be seen (H).

Crystal: A group of sugar crystals is directly illuminated by the laser from the left, acting as multiple lenses and creating caustics on the table (I). Part of the light refracted on the table is reflected back to the candy, creating secondary caustics on the table (J). Additionally, scattering events are visible within the crystals (K).

Tank: A reflective grating is placed at the right side of a tank filled with milk diluted in water. The grating is taken from a commercial spectrometer, and consists of an array of small, equally spaced rectangular mirrors. The grating is blazed: mirrors are tilted to concentrate maximum optical power in the first-order diffraction for one wavelength. The pulse enters the scene from the left, travels through the tank (L), and strikes the grating. The grating reflects and diffracts the beam pulse (M). The different orders of the diffraction are visible, traveling back through the tank (N). As the figure (and the supplementary movie) shows, most of the light reflected from the grating propagates at the blaze angle.

Chapter 8: Conclusions and Future Work

Our research fosters new computational imaging and image processing opportunities by providing incoherent, time-resolved information at ultrafast temporal resolutions. We hope our work will inspire new research in computer graphics and computational photography, by enabling forward and inverse analysis of light transport, allowing for full scene capture of hidden geometry and materials, or for relighting photographs. To this end, captured movies and data of the scenes shown in this paper are available at femtocamera.info. This, in turn, may influence the rapidly emerging field of ultrafast imaging hardware.

The system could be extended to image in color, by adding additional pulsed laser sources at different colors, or by using one continuously tunable optical parametric oscillator (OPO). A second color of about 400 nm could easily be added to the existing system by doubling the laser frequency with a nonlinear crystal (about $1000). The streak tube is sensitive across the entire visible spectrum, with a peak sensitivity at about 450 nm (about five times the sensitivity at 800 nm). Scaling to bigger scenes would require less time resolution, and could therefore simplify the imaging setup. Scaling should be possible without signal degradation, as long as the camera aperture and lens are scaled with the rest of the setup. If the aperture stays the same, the light intensity needs to be increased quadratically to obtain similar results.

Beyond the ability of the commercially available streak sensor, advances in optics, material science, and compressive sensing may bring further optimization of the system, which could yield increased resolution of the captured x-t streak images. Nonlinear shutters may provide an alternate path to femto-photography capture systems. However, nonlinear optical methods require exotic materials and strong light intensities that can damage the objects of interest (and must be provided by laser light). Further, they often suffer from physical instabilities.

We believe that mass production of streak sensors can lead to affordable systems. Also, future designs may overcome the current limitations of our prototype regarding optical efficiency. Future research can investigate other ultrafast phenomena, such as propagation of light in anisotropic media and photonic crystals, or may be used in applications such as scientific visualization (to understand ultra-fast processes), medicine (to reconstruct subsurface elements), material engineering (to analyze material properties), or quality control (to detect faults in structures). This could provide radically new challenges in the realm of computer graphics. Graphics research can enable new insights via comprehensible simulations and new data structures to render light in motion. For instance, relativistic rendering techniques have been developed using our data, where the common assumption of constant irradiance over the surfaces no longer holds [Jarabo et al. 2013]. It may also allow a better understanding of scattering, and may lead to new physically valid models, as well as spawn new art forms.

References

ABRAMSON, N. 1978. Light-in-flight recording by holography. Optics Letters 3, 4, 121-123.

BUSCK, J., AND HEISELBERG, H. 2004. Gated viewing and high-accuracy three-dimensional laser radar. Applied Optics 43, 24, 4705-4710.

CAMPILLO, A., AND SHAPIRO, S. 1987. Picosecond streak camera fluorometry: a review. IEEE Journal of Quantum Electronics 19, 4, 585-603.

CHARBON, E. 2007. Will avalanche photodiode arrays ever reach 1 megapixel? In International Image Sensor Workshop, 246-249.

COLAÇO, A., KIRMANI, A., HOWLAND, G. A., HOWELL, J. C., AND GOYAL, V. K. 2012. Compressive depth map acquisition using a single photon-counting detector: Parametric signal processing meets sparsity. In IEEE Computer Vision and Pattern Recognition, CVPR 2012, 96-102.

DUGUAY, M. A., AND MATTICK, A. T. 1971. Pulsed-image generation and detection. Applied Optics 10, 2162-2170.

FARO, 2012. Faro Technologies Inc.: Measuring Arms. http://www.faro.com.

GBUR, G., 2012. A camera fast enough to watch light move? http://skullsinthestars.com/2012/01/04/a-camera-fast-enough-to-watch-light-move/.

GELBART, A., REDMAN, B. C., LIGHT, R. S., SCHWARTZLOW, C. A., AND GRIFFIS, A. J. 2002. Flash lidar based on multiple-slit streak tube imaging lidar. SPIE, vol. 4723, 9-18.

GODA, K., TSIA, K. K., AND JALALI, B. 2009. Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena. Nature 458, 1145-1149.

GUPTA, O., WILLWACHER, T., VELTEN, A., VEERARAGHAVAN, A., AND RASKAR, R. 2012. Reconstruction of hidden 3D shapes using diffuse reflections. Optics Express 20, 19096-19108.

HAMAMATSU, 2012. Guide to Streak Cameras. http://sales.hamamatsu.com/assets/pdf/catsandguides/e_streakh.pdf.

HEBDEN, J. C. 1993. Line scan acquisition for time-resolved imaging through scattering media. Opt. Eng. 32, 3, 626-633.

HEIDE, F., HULLIN, M., GREGSON, J., AND HEIDRICH, W. 2013. Low-budget transient imaging using photonic mixer devices. ACM Trans. Graph. 32, 4.

HUANG, D., SWANSON, E., LIN, C., SCHUMAN, J., STINSON, W., CHANG, W., HEE, M., FLOTTE, T., GREGORY, K., AND PULIAFITO, C. 1991. Optical coherence tomography. Science 254, 5035, 1178-1181.

ITATANI, J., QUÉRÉ, F., YUDIN, G. L., IVANOV, M. Y., KRAUSZ, F., AND CORKUM, P. B. 2002. Attosecond streak camera. Phys. Rev. Lett. 88, 173903.

JARABO, A., MASIA, B., AND GUTIERREZ, D. 2013. Transient rendering and relativistic visualization. Tech. Rep. TR-01-2013, Universidad de Zaragoza, April.

Page 2: Gaurav.report on femto photography

KUNWAR SATYAVIRA COLLEGE OF

ENGINEERING BIJNOR

CERTIFICATE

This is to certify that GAURAV CHAUHAN Roll No1101731020 has

successfully completed his report entitled ldquoFEMTO PHOTOGRAPHYrdquo in the

seminar presentation of Third Year BTech in Electronics and Communication

Engineering of Maha Maya Technical University session 2013-14

The matter in this report is a record of his work carried out by him under our

Supervision and Guidance

We wish him all the success in future endeavours

(ASHISH KUMAR) (NISHI CHANDRA)

Head of Department Seminar Incharge

EampC Engineering EampC Engineering

Acknowledgement

While preparing my seminar report I was helped and assisted by many individuals

This Seminar report could not have been completed without taking help from

numerous sources and people I have to gratefully acknowledge them for their

generous help in providing the relevant data with their valuable guidelines

Firstly I am very thankful to my parents for their blessings continuous support

and encouragement during the completion of this seminar report I am also

thankful to Mr ASHISH KUMAR HOD Electronics And Communication

Department for granting me the permission to take this topic and also like to

express my sincere gratitude for his guidance and kind support at every step I

needed

At last I would like to thank each and every person who directly or indirectly

helped me during the completion of this assignment

GAURAV CHAUHAN

1101731020

ECE Third Year

TABLE OF CONTENTS

I Certificate i

II Acknowledgement ii

Chapter 1

Femto photography An Introduction

11 Introduction

12 Femto photography

13 Challenges

14 Contribution

15 Limitation

Chapter 2

RELATED WORKS

II1Time resolved Imaging

Chapter 3

Capturing Space-Time Planes

31System

32 Capturing space-time planes

33Performance Validation

Chapter 4

Capturing Space-Time Volumes

Chapter 5

Depicting Ultrafast Videos in 2D

51IPeak Time Images

52 Integral Photo Fusion

Chapter 6

Time Unwarping

Chapter 7

Captured Scenes

71Bottle

72Crystal

Chapter 8

CONCLUSION

Chapter 9

REFERENCES

ABSTRACT

We present femto-photography a novel imaging technique to captureand visualize the propagation of light With an effective exposuretime of 185 picoseconds (ps) per frame we reconstruct moviesof ultrafast events at an equivalent resolution of about one half trillionframes per second Because cameras with this shutter speeddo not exist we re-purpose modern imaging hardware to record anensemble average of repeatable events that are synchronized to astreak sensor in which the time of arrival of light from the scene iscoded in one of the sensorrsquos spatial dimensions We introduce reconstructionmethods that allow us to visualize the propagation offemtosecond light pulses through macroscopic scenes at such fastresolution we must consider the notion of time-unwarping betweenthe camerarsquos and the worldrsquos space-time coordinate systems to take

into account effects associated with the finite speed of light Weapply our femto-photography technique to visualizations of verydifferent scenes which allow us to observe the rich dynamics oftime-resolved light transport effects including scattering specularreflections diffuse interreflections diffraction caustics and subsurfacescattering Our work has potential applications in artisticeducational and scientific visualizations industrial imaging to analyzematerial properties and medical imaging to reconstruct subsurfaceelements In addition our time-resolved technique may motivatenew forms of computational photography

Chapter 1 Introduction

Forward and inverse analysis of light transport plays an important

role in diverse fields such as computer graphics computer visionand scientific imaging Because conventional imaging hardware isslow compared to the speed of light traditional computer graphicsand computer vision algorithms typically analyze transport usinglow time-resolution photos Consequently any information that isencoded in the time delays of light propagation is lost Whereas thejoint design of novel optical hardware and smart computation iecomputational photography has expanded the way we capture analyze and understand visual information speed-of-light propagationhas been largely unexplored In this paper we present a novel ultrafastimaging technique which we term femto-photography consistingof femtosecond laser illumination picosecond-accurate detectorsand mathematical reconstruction techniques to allow us to visualizemovies of light in motion as it travels through a scene withan effective framerate of about one half trillion frames per secondThis allows us to see for instance a light pulse scattering inside aplastic bottle or image formation in a mirror as a function of time

Challenges Developing such time-resolved system is a challengingproblem for several reasons that are under-appreciated in conventionalmethods (a) brute-force time exposures under 2 ps yieldan impractical signal-to-noise (SNR) ratio (b) suitable cameras torecord 2D image sequences at this time resolution do not exist dueto sensor bandwidth limitations (c) comprehensible visualizationof the captured time-resolved data is non-trivial and (d) direct measurementsof events appear warped in space-time because the finitespeed of light implies that the recorded light propagation delay dependson camera position relative to the sceneContributions Our main contribution is in addressing these challengesand creating a first prototype as follows_ We exploit the statistical similarity of periodic light transportevents to record multiple ultrashort exposure times of onedimensionalviews (Section 3)_ We introduce a novel hardware implementation to sweep theexposures across a vertical field of view to build 3D spacetimedata volumes (Section 4)_ We create techniques for comprehensible visualization includingmovies showing the dynamics of real-world lighttransport phenomena (including reflections scattering diffuseinter-reflections or beam diffraction) and the notion ofpeak-time which partially overcomes the low-frequency appearanceof integrated global light transport (Section 5)_ We introduce a time-unwarping technique to correct the distortionsin captured time-resolved information due to the finitespeed of light (Section 6)

Limitations Although not conceptual our setup has several practicallimitations primarily due to the limited SNR of scattered light Since the hardware elements in our system were originally designedfor different purposes it is not optimized for efficiency and suffersfrom low optical throughput (eg the detector is optimized for 500nm visible light while the infrared laser wavelength we use is 795nm) and from dynamic range limitations This lengthens the totalrecording time to approximately one hour Furthermore the scanningmirror rotating continuously introduces some blurring in thedata along the scanned (vertical) dimension Future optimized systemscan overcome these limitations

Chapter 2 Related WorkUltrafast Devices The fastest 2D continuous real-timemonochromatic camera operates at hundreds of nanoseconds perframe [Goda et al 2009] (about 6_ 106 frames per second) with aspatial resolution of 200_200 pixels less than one third of what weachieve Avalanche photodetector (APD) arrays can reach temporalresolutions of several tens of picoseconds if they are used in aphoton starved regime where only a single photon hits a detectorwithin a time window of tens of nanoseconds [Charbon 2007]Repetitive illumination techniques used in incoherent LiDAR [Tou1995 Gelbart et al 2002] use cameras with typical exposure timeson the order of hundreds of picoseconds [Busck and Heiselberg2004 Colaccedilo et al 2012] two orders of magnitude slower thanour system Liquid nonlinear shutters actuated with powerful laserpulses have been used to capture single analog frames imaginglight pulses at picosecond time resolution [Duguay and Mattick1971] Other sensors that use a coherent phase relation betweenthe illumination and the detected light such as optical coherencetomography (OCT) [Huang et al 1991] coherent LiDAR [Xia

and Zhang 2009] light-in-flight holography [Abramson 1978]or white light interferometry [Wyant 2002] achieve femtosecondresolutions however they require light to maintain coherence(ie wave interference effects) during light transport and aretherefore unsuitable for indirect illumination in which diffusereflections remove coherence from the light Simple streak sensorscapture incoherent light at picosecond to nanosecond speeds butare limited to a line or low resolution (20 _ 20) square field ofview [Campillo and Shapiro 1987 Itatani et al 2002 Shiragaet al 1995 Gelbart et al 2002 Kodama et al 1999 Qu et al2006] They have also been used as line scanning devices forimage transmission through highly scattering turbid media byrecording the ballistic photons which travel a straight path throughthe scatterer and thus arrive first on the sensor [Hebden 1993]

Figure 3 Left Photograph of our ultrafast imaging system setup The DSLR camera takes a conventional photo for comparison RightTime sequence illustrating the arrival of the pulse striking a diffuser its transformation into a spherical energy front and its propagationthrough the scene The corresponding captured scene is shown in Figure 10 (top row)

principles that we develop in this paper for the purpose of transientimaging were first demonstrated by Velten et al [2012c] Recentlyphotonic mixer devices along with nonlinear optimization havealso been used in this context [Heide et al 2013]Our system can record and reconstruct space-time world informationof incoherent light propagation in free-space table-top scenesat a resolution of up to 672 _ 1000 pixels and under 2 picosecondsper frame The varied range and complexity of the scenes

we capture allow us to visualize the dynamics of global illuminationeffects such as scattering specular reflections interreflectionssubsurface scattering caustics and diffractionTime-Resolved Imaging Recent advances in time-resolvedimaging have been exploited to recover geometry and motionaround corners [Raskar and Davis 2008 Kirmani et al 2011 Veltenet al 2012b Velten et al 2012a Gupta et al 2012 Pandharkaret al 2011] and albedo of from single view point [Naik et al 2011]But none of them explored the idea of capturing videos of light inmotion in direct view and have some fundamental limitations (suchas capturing only third-bounce light) that make them unsuitable forthe present purpose Wu et al [2012a] separate direct and global illuminationcomponents from time-resolved data captured with thesystem we describe in this paper by analyzing the time profile ofeach pixel In a recent publication [Wu et al 2012b] the authorspresent an analysis on transient light transport in frequency spaceand show how it can be applied to bare-sensor imaging Chapter 3 Capturing Space-Time PlanesWe capture time scales orders of magnitude faster than the exposuretimes of conventional cameras in which photons reaching thesensor at different times are integrated into a single value makingit impossible to observe ultrafast optical phenomena The systemdescribed in this paper has an effective exposure time down to 185ps since light travels at 03 mmps light travels approximately 05mm between frames in our reconstructed moviesSystem An ultrafast setup must overcome several difficulties inorder to accurately measure a high-resolution (both in space and

time) image First for an unamplified laser pulse a single exposuretime of less than 2 ps would not collect enough light so the SNRwould be unworkably low As an example for a table-top sceneilluminated by a 100Wbulb only about 1 photon on average wouldreach the sensor during a 2 ps open-shutter period Second becauseof the time scales involved synchronization of the sensor and theillumination must be executed within picosecond precision Thirdstandalone streak sensors sacrifice the vertical spatial dimension inorder to code the time dimension thus producing x-t images Asa consequence their field of view is reduced to a single horizontalline of view of the scene We solve these problems with our ultrafast imaging system outlinedin Figure 2 (A photograph of the actual setup is shown inFigure 3 (left)) The light source is a femtosecond (fs) Kerr lensmode-locked TiSapphire laser which emits 50-fs with a centerwavelength of 795 nm at a repetition rate of 75 MHz and averagepower of 500 mW In order to see ultrafast events in a scene withmacro-scaled objects we focus the light with a lens onto a Lambertiandiffuser which then acts as a point light source and illuminatesthe entire scene with a spherically-shaped pulse (see Figure3 (right)) Alternatively if we want to observe pulse propagationitself rather than the interactions with large objects we direct thelaser beam across the field of view of the camera through a scatteringmedium (see the bottle scene in Figure 1)Because all the pulses are statistically identical we can record thescattered light from many of them and integrate the measurementsto average out any noise The result is a signal with a high SNR Tosynchronize this illumination with the streak sensor (HamamatsuC5680 [Hamamatsu 2012]) we split off a portion of the beam with

a glass slide and direct it onto a fast photodetector connected to thesensor so that now both detector and illumination operate synchronously(see Figure 2 (a))Capturing space-time planes The streak sensor then capturesan x-t image of a certain scanline (ie a line of pixels in the horizontaldimension) of the scene with a space-time resolution of672 _ 512 The exact time resolution depends on the amplificaamplificationof an internal sweep voltage signal applied to the streak sensorWith our hardware it can be adjusted from 030 ps to 507 ps Practicallywe choose the fastest resolution that still allows for captureof the entire duration of the event In the streak sensor a photocathodeconverts incoming photons arriving from each spatial locationin the scanline into electrons The streak sensor generates the x-timage by deflecting these electrons according to the time of theirarrival to different positions along the t-dimension of the sensor(see Figure 2(b) and 2(c)) This is achieved by means of rapidlychanging the sweep voltage between the electrodes in the sensorFor each horizontal scanline the camera records a scene illuminatedby the pulse and averages the light scattered by 45 _ 108

pulses (see Figure 2(d) and 2(e))Performance Validation To characterize the streak sensor wecompare sensor measurements with known geometry and verify thelinearity reproducibility and calibration of the time measurementsTo do this we first capture a streak image of a scanline of a simplescene a plane being illuminated by the laser after hitting the diffuser

(see Figure 4 (left)) Then by using a Faro digitizer arm [Faro2012] we obtain the ground truth geometry of the points along that

plane and of the point of the diffuser hit by the laser this allows usto compute the total travel time per path (diffuser-plane-streak sensor)for each pixel in the scanline We then compare the travel timecaptured by our streak sensor with the real travel time computed from the known geometry

Chapter 4 Capturing Space-Time VolumesAlthough the synchronized pulsed measurements overcome SNRissues the streak sensor still provides only a one-dimensionalmovie Extension to two dimensions requires unfeasible bandwidthsa typical dimension is roughly 103 pixels so a threedimensionaldata cube has 109 elements Recording such a largequantity in a 1010485769 second (1 ns) time widow requires a bandwidthof 1018 bytes far beyond typical available bandwidthsWe solve this acquisition problem by again utilizing the synchronizedrepeatability of the hardware A mirror-scanning system (two9 cm _ 13 cm mirrors see Figure 3 (left)) rotates the camerarsquos centerof projection so that it records horizontal slices of a scene sequentiallyWe use a computer-controlled one-rpm servo motor torotate one of the mirrors and consequently scan the field of viewvertically The scenes are about 25 cm wide and placed about 1

meter from the camera With high gear ratios (up to 11000) thecontinuous rotation of the mirror is slow enough to allow the camerato record each line for about six seconds requiring about onehour for 600 lines (our video resolution) We generally capture extralines above and below the scene (up to 1000 lines) and thencrop them to match the aspect ratio of the physical scenes beforethe movie was reconstructedThese resulting images are combined into one matrixMijk wherei = 1672 and k = 1512 are the dimensions of the individualx-t streak images and j = 11000 addresses the second spatialdimension y For a given time instant k the submatrix Nij containsa two-dimensional image of the scene with a resolution of 672 _1000 pixels exposed for as short to 185 ps Combining the x-tslices of the scene for each scanline yields a 3D x-y-t data volumeas shown in Figure 5 (left) An x-y slice represents one frame of the

final movie as shown in Figure 5 (right)

Figure 5 Left Reconstructed x-y-t data volume by stacking individualx-t images (captured with the scanning mirrors) Right Anx-y slice of the data cube represents one frame of the final movie

Chapter 5 Depicting Ultrafast Videos in 2DWe have explored several ways to visualize the information containedin the captured x-y-t data cube in an intuitive way Firstcontiguous Nij slices can be played as the frames of a movie Figure1 (bottom row) shows a captured scene (bottle) along with severalrepresentative Nij frames (Effects are described for variousscenes in Section 7) However understanding all the phenomenashown in a video is not a trivial task and movies composed of x-yframes such as the ones shown in Figure 10 may be hard to interpretMerging a static photograph of the scene from approximately thesame point of view with the Nij slices aids in the understanding oflight transport in the scenes (see movies within the supplementaryvideo) Although straightforward to implement the high dynamicrange of the streak data requires a nonlinear intensity transformationto extract subtle optical effects in the presence of high intensityreflections We employ a logarithmic transformation to this endWe have also explored single-image methods for intuitive visualization

of full space-time propagation such as the color-coding inFigure 1 (right) which we describe in the following paragraphsIntegral Photo Fusion By integrating all the frames in novelways we can visualize and highlight different aspects of the lightflow in one photo Our photo fusion results are calculated asNij =PwkMijk fk = 1512g where wk is a weighting factordetermined by the particular fusion method We have tested severaldifferent methods of which two were found to yield the most intuitiveresults the first one is full fusion where wk = 1 for all kSumming all frames of the movie provides something resemblinga black and white photograph of the scene illuminated by the laserwhile showing time-resolved light transport effects An exampleis shown in Figure 6 (left) for the alien scene (More informationabout the scene is given in Section 7) A second technique rainbowfusion takes the fusion result and assigns a different RGB color toeach frame effectively color-coding the temporal dimension Anexample is shown in Figure 6 (middle)Peak Time Images The inherent integration in fusion methodsthough often useful can fail to reveal the most complex or subtlebehavior of light As an alternative we propose peak time imageswhich illustrate the time evolution of the maximum intensity in eachframe For each spatial position (i j) in the x-y-t volume we findthe peak intensity along the time dimension and keep informationwithin two time units to each side of the peak All other values inthe streak image are set to zero yielding a more sparse space-timevolume We then color-code time and sum up the x-y frames inthis new sparse volume in the same manner as in the rainbow fusioncase but use only every 20th frame in the sum to create blacklines between the equi-time paths or isochrones This results in amap of the propagation of maximum intensity contours which we

term peak time image These color-coded isochronous lines can bethought of intuitively as propagating energy fronts Figure 6 (right)shows the peak time image for the alien scene and Figure 1 (topmiddle) shows the captured data for the bottle scene depicted usingthis visualization method As explained in the next section thisvisualization of the bottle scene reveals significant light transportphenomena that could not be seen with the rainbow fusion visualization

Figure 7 Understanding reversal of events in captured videosLeft Pulsed light scatters from a source strikes a surface (egat P1 and P2) and is then recorded by a sensor Time taken bylight to travel distances z1 + d1 and z2 + d2 is responsible for theexistence of two different time frames and the need of computationalcorrection to visualize the captured data in the world time frameRight Light appears to be propagating from P2 to P1 in cameratime (before unwarping) and from P1 to P2 in world time oncetime-unwarped Extended planar surfaces will intersect constanttimepaths to produce either elliptical or circular fronts

Chapter 6: Time Unwarping

Visualization of the captured movies (Sections 5 and 7) reveals results that are counter-intuitive to theoretical and established knowledge of light transport. Figure 1 (top middle) shows a peak time visualization of the bottle scene, where several abnormal light transport effects can be observed: (1) the caustics on the floor, which propagate towards the bottle instead of away from it; (2) the curved spherical energy fronts in the label area, which should be rectilinear as seen from the camera; and (3) the pulse itself being located behind these energy fronts, when it would need to precede them. These effects arise because light propagation is usually assumed to be infinitely fast, so that events in world space are assumed to be detected simultaneously in camera space. In our ultrafast photography setup, however, this assumption no longer holds, and the finite speed of light becomes a factor: we must now take into account the time delay between the occurrence of an event and its detection by the camera sensor.

We therefore need to consider two different time frames, namely world time (when events happen) and camera time (when events are detected). This duality of time frames is explained in Figure 7: light from a source hits a surface first at point P1 = (i1, j1) (with (i, j) being the x-y pixel coordinates of a scene point in the x-y-t data cube), then at the farther point P2 = (i2, j2), but the reflected light is captured in the reverse order by the sensor, due to different total path lengths (z1 + d1 > z2 + d2). In general, this is due to the fact that, for light to arrive at a given time instant t0, all the rays from the source, to the wall, to the camera, must satisfy zi + di = c·t0, so that isochrones are elliptical. Therefore, although objects closer to the source receive light earlier, they can still lie on a later-time isochrone than farther ones.

In order to visualize all light transport events as they have occurred (not as the camera captured them), we transform the captured data from camera time to world time, a transformation which we term time unwarping. Mathematically, for a scene point P = (i, j), we apply the following transformation:

t'_ij = t_ij + z_ij / (c/η)    (1)

where t'_ij and t_ij represent camera and world times respectively, c is the speed of light in vacuum, η the index of refraction of the medium, and z_ij is the distance from point P to the camera. For our table-top scenes we measure this distance with a Faro digitizer arm, although it could be obtained from the data and the known position of the diffuser, as the problem is analogous to that of bistatic LiDAR. We can thus define the light travel time from each point (i, j) in the scene to the camera as Δt_ij = t'_ij − t_ij = z_ij / (c/η). Time unwarping then effectively corresponds to offsetting data in the x-y-t volume along the time dimension, according to the value of Δt_ij for each of the (i, j) points, as shown in Figure 8.

In most of the scenes we only have propagation of light through air, for which we take η ≈ 1. For the bottle scene, we assume that the laser pulse travels along its longitudinal axis at the speed of light, and that only a single scattering event occurs in the liquid inside. We take η = 1.33 as the index of refraction of the liquid and ignore refraction at the bottle's surface. A step-by-step unwarping process is shown in Figure 9 for a frame (i.e., an x-y image) of the bottle scene. Our unoptimized MATLAB code runs at about 0.1 seconds per frame. A time-unwarped peak-time visualization of the whole of this scene is shown in Figure 1 (right). Notice how now the caustics originate from the bottle and propagate outward, energy fronts along the label are correctly depicted as straight lines, and the pulse precedes related phenomena, as expected.
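Since the unwarp is a per-pixel shift along the time axis, it is straightforward to express on the data volume. Below is a minimal sketch, assuming a volume M of shape (nx, ny, nt) in camera time, a per-pixel depth map z in meters, and a known time-bin width; the function name, the bin width, and the scalar-η simplification are our own assumptions, not the paper's implementation.

import numpy as np

C = 3.0e8  # speed of light in vacuum, m/s

def time_unwarp(M, z, dt=2e-12, eta=1.0):
    # M:   x-y-t data volume in camera time, shape (nx, ny, nt)
    # z:   distance from each scene point to the camera, shape (nx, ny), in m
    # dt:  width of one time bin (~2 ps per frame here; an assumed value)
    # eta: index of refraction of the medium (~1 for air)
    nx, ny, nt = M.shape
    # Delta t_ij = z_ij / (c / eta), rounded to an integer number of bins.
    shift = np.rint(z / (C / eta) / dt).astype(int)   # (nx, ny)
    out = np.zeros_like(M)
    t = np.arange(nt)
    for i in range(nx):
        for j in range(ny):
            # World time = camera time - Delta t: move this pixel's time
            # profile towards earlier bins by shift[i, j].
            src = t + shift[i, j]
            ok = src < nt
            out[i, j, t[ok]] = M[i, j, src[ok]]
    return out

For the bottle scene, a per-pixel η (1.33 along the path inside the liquid, 1 elsewhere) would replace the scalar eta; the double loop is easy to vectorize with fancy indexing if speed matters.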

Chapter 7: Captured Scenes

We have used our ultrafast photography setup to capture interesting light transport effects in different scenes. Figure 10 summarizes them, showing representative frames and peak time visualizations. The exposure time for our scenes is between 1.85 ps for the crystal scene and 5.07 ps for the bottle and tank scenes, which required imaging a longer time span for better visualization. Please refer to the video in the supplementary material to watch the reconstructed movies. Overall, observing light in such slow motion reveals both subtle and key aspects of light transport. We provide here brief descriptions of the light transport effects captured in the different scenes.

Bottle: This scene is shown in Figure 1 (bottom row) and has been used to introduce time unwarping. A plastic bottle, filled with water diluted with milk, is directly illuminated by the laser pulse, entering through the bottom of the bottle along its longitudinal axis. The pulse scatters inside the liquid; we can see the propagation of the wavefronts. The geometry of the bottle neck creates some interesting lens effects, making light look almost like a fluid. Most of the light is reflected back from the cap, while some is transmitted or trapped in subsurface scattering phenomena. Caustics are generated on the table.

Tomato-tape: This scene shows a tomato and a tape roll, with a wall behind them. The propagation of the spherical wavefront, after the laser pulse hits the diffuser, can be seen clearly as it intersects the floor and the back wall (A, B). The inside of the tape roll is out of the line of sight of the light source and is not directly illuminated. It is illuminated later, as indirect light scattered from the first wave reaches it (C). Shadows become visible only after the object has been illuminated. The more opaque tape darkens quickly after the light front has passed, while the tomato continues glowing for a longer time, indicative of stronger subsurface scattering (D).

Alien: A toy alien is positioned in front of a mirror and wall. Light interactions in this scene are extremely rich, due to the mirror, the multiple interreflections, and the subsurface scattering in the toy. The video shows how the reflection in the mirror is actually formed: direct light first reaches the toy, but the mirror is still completely dark (E); eventually, light leaving the toy reaches the mirror, and the reflection is dynamically formed (F). Subsurface scattering is clearly present in the toy (G), while multiple direct and indirect interactions between the wall and the mirror can also be seen (H).

Crystal: A group of sugar crystals is directly illuminated by the laser from the left, acting as multiple lenses and creating caustics on the table (I). Part of the light refracted on the table is reflected back to the candy, creating secondary caustics on the table (J). Additionally, scattering events are visible within the crystals (K).

Tank: A reflective grating is placed at the right side of a tank filled with milk diluted in water. The grating is taken from a commercial spectrometer and consists of an array of small, equally spaced rectangular mirrors. The grating is blazed: mirrors are tilted to concentrate maximum optical power in the first-order diffraction for one wavelength. The pulse enters the scene from the left, travels through the tank (L), and strikes the grating. The grating reflects and diffracts the beam pulse (M). The different orders of the diffraction are visible traveling back through the tank (N). As the figure (and the supplementary movie) shows, most of the light reflected from the grating propagates at the blaze angle.

Chapter 8: Conclusions and Future Work

Our research fosters new computational imaging and image-processing opportunities by providing incoherent, time-resolved information at ultrafast temporal resolutions. We hope our work will inspire new research in computer graphics and computational photography, by enabling forward and inverse analysis of light transport, allowing for full scene capture of hidden geometry and materials, or for relighting photographs. To this end, captured movies and data of the scenes shown in this paper are available at femtocamera.info. This exploitation, in turn, may influence the rapidly emerging field of ultrafast imaging hardware.

The system could be extended to image in color by adding additional pulsed laser sources at different colors, or by using one continuously tunable optical parametric oscillator (OPO). A second color of about 400 nm could easily be added to the existing system by doubling the laser frequency with a nonlinear crystal (about $1000). The streak tube is sensitive across the entire visible spectrum, with a peak sensitivity at about 450 nm (about five times the sensitivity at 800 nm). Scaling to bigger scenes would require less time resolution and could therefore simplify the imaging setup. Scaling should be possible without signal degradation, as long as the camera aperture and lens are scaled with the rest of the setup. If the aperture stays the same, the light intensity needs to be increased quadratically to obtain similar results.

Beyond the ability of the commercially available streak sensor, advances in optics, material science, and compressive sensing may bring further optimization of the system, which could yield increased resolution of the captured x-t streak images. Nonlinear shutters may provide an alternate path to femto-photography capture systems. However, nonlinear optical methods require exotic materials and strong light intensities that can damage the objects of interest (and must be provided by laser light); further, they often suffer from physical instabilities.
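The quadratic-intensity claim above can be checked with a back-of-the-envelope radiometric argument (our own sketch, under simplifying assumptions: a point-source diffuser at distance r1 from the scene, camera at distance r2, fixed aperture area A, and every scene dimension scaled by a factor s, so each pixel's footprint grows with s as well):

\[
\Phi_{\mathrm{pixel}} \;\propto\; \frac{P}{(s\,r_1)^2}\cdot(s\,\delta)^2\cdot\frac{A}{(s\,r_2)^2} \;=\; \frac{P\,\delta^{2}A}{s^{2}\,r_1^{2}\,r_2^{2}}
\]

where P is the laser power and δ the pixel footprint at scale s = 1; the three factors are the scene irradiance, the patch area imaged by one pixel, and the solid angle subtended by the fixed aperture. Keeping Φ_pixel constant therefore requires P ∝ s², matching the quadratic increase stated above.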

We believe that mass production of streak sensors can lead to affordable systems. Also, future designs may overcome the current limitations of our prototype regarding optical efficiency. Future research can investigate other ultrafast phenomena, such as propagation of light in anisotropic media and photonic crystals, or may be used in applications such as scientific visualization (to understand ultrafast processes), medicine (to reconstruct subsurface elements), material engineering (to analyze material properties), or quality control (to detect faults in structures). This could provide radically new challenges in the realm of computer graphics. Graphics research can enable new insights via comprehensible simulations and new data structures to render light in motion; for instance, relativistic rendering techniques have been developed using our data, where the common assumption of constant irradiance over surfaces no longer holds [Jarabo et al. 2013]. It may also allow a better understanding of scattering, and may lead to new physically valid models, as well as spawn new art forms.

References

ABRAMSON, N. 1978. Light-in-flight recording by holography. Optics Letters 3, 4, 121–123.
BUSCK, J., AND HEISELBERG, H. 2004. Gated viewing and high-accuracy three-dimensional laser radar. Applied Optics 43, 24, 4705–4710.
CAMPILLO, A., AND SHAPIRO, S. 1987. Picosecond streak camera fluorometry: a review. IEEE Journal of Quantum Electronics 19, 4, 585–603.
CHARBON, E. 2007. Will avalanche photodiode arrays ever reach 1 megapixel? In International Image Sensor Workshop, 246–249.
COLAÇO, A., KIRMANI, A., HOWLAND, G. A., HOWELL, J. C., AND GOYAL, V. K. 2012. Compressive depth map acquisition using a single photon-counting detector: Parametric signal processing meets sparsity. In IEEE Computer Vision and Pattern Recognition, CVPR 2012, 96–102.
DUGUAY, M. A., AND MATTICK, A. T. 1971. Pulsed-image generation and detection. Applied Optics 10, 2162–2170.
FARO, 2012. Faro Technologies, Inc.: Measuring Arms. http://www.faro.com
GBUR, G. 2012. A camera fast enough to watch light move? http://skullsinthestars.com/2012/01/04/a-camera-fast-enough-to-watch-light-move/
GELBART, A., REDMAN, B. C., LIGHT, R. S., SCHWARTZLOW, C. A., AND GRIFFIS, A. J. 2002. Flash lidar based on multiple-slit streak tube imaging lidar. In SPIE, vol. 4723, 9–18.
GODA, K., TSIA, K. K., AND JALALI, B. 2009. Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena. Nature 458, 1145–1149.
GUPTA, O., WILLWACHER, T., VELTEN, A., VEERARAGHAVAN, A., AND RASKAR, R. 2012. Reconstruction of hidden 3D shapes using diffuse reflections. Optics Express 20, 19096–19108.
HAMAMATSU, 2012. Guide to Streak Cameras. http://sales.hamamatsu.com/assets/pdf/catsandguides/e_streakh.pdf
HEBDEN, J. C. 1993. Line scan acquisition for time-resolved imaging through scattering media. Opt. Eng. 32, 3, 626–633.
HEIDE, F., HULLIN, M., GREGSON, J., AND HEIDRICH, W. 2013. Low-budget transient imaging using photonic mixer devices. ACM Trans. Graph. 32, 4.
HUANG, D., SWANSON, E., LIN, C., SCHUMAN, J., STINSON, W., CHANG, W., HEE, M., FLOTTE, T., GREGORY, K., AND PULIAFITO, C. 1991. Optical coherence tomography. Science 254, 5035, 1178–1181.
ITATANI, J., QUÉRÉ, F., YUDIN, G. L., IVANOV, M. Y., KRAUSZ, F., AND CORKUM, P. B. 2002. Attosecond streak camera. Phys. Rev. Lett. 88, 173903.
JARABO, A., MASIA, B., AND GUTIERREZ, D. 2013. Transient rendering and relativistic visualization. Tech. Rep. TR-01-2013, Universidad de Zaragoza, April.

Page 3: Gaurav.report on femto photography

Acknowledgement

While preparing my seminar report I was helped and assisted by many individuals

This Seminar report could not have been completed without taking help from

numerous sources and people I have to gratefully acknowledge them for their

generous help in providing the relevant data with their valuable guidelines

Firstly I am very thankful to my parents for their blessings continuous support

and encouragement during the completion of this seminar report I am also

thankful to Mr ASHISH KUMAR HOD Electronics And Communication

Department for granting me the permission to take this topic and also like to

express my sincere gratitude for his guidance and kind support at every step I

needed

At last I would like to thank each and every person who directly or indirectly

helped me during the completion of this assignment

GAURAV CHAUHAN

1101731020

ECE Third Year

TABLE OF CONTENTS

I Certificate i

II Acknowledgement ii

Chapter 1

Femto photography An Introduction

11 Introduction

12 Femto photography

13 Challenges

14 Contribution

15 Limitation

Chapter 2

RELATED WORKS

II1Time resolved Imaging

Chapter 3

Capturing Space-Time Planes

31System

32 Capturing space-time planes

33Performance Validation

Chapter 4

Capturing Space-Time Volumes

Chapter 5

Depicting Ultrafast Videos in 2D

51IPeak Time Images

52 Integral Photo Fusion

Chapter 6

Time Unwarping

Chapter 7

Captured Scenes

71Bottle

72Crystal

Chapter 8

CONCLUSION

Chapter 9

REFERENCES

ABSTRACT

We present femto-photography a novel imaging technique to captureand visualize the propagation of light With an effective exposuretime of 185 picoseconds (ps) per frame we reconstruct moviesof ultrafast events at an equivalent resolution of about one half trillionframes per second Because cameras with this shutter speeddo not exist we re-purpose modern imaging hardware to record anensemble average of repeatable events that are synchronized to astreak sensor in which the time of arrival of light from the scene iscoded in one of the sensorrsquos spatial dimensions We introduce reconstructionmethods that allow us to visualize the propagation offemtosecond light pulses through macroscopic scenes at such fastresolution we must consider the notion of time-unwarping betweenthe camerarsquos and the worldrsquos space-time coordinate systems to take

into account effects associated with the finite speed of light Weapply our femto-photography technique to visualizations of verydifferent scenes which allow us to observe the rich dynamics oftime-resolved light transport effects including scattering specularreflections diffuse interreflections diffraction caustics and subsurfacescattering Our work has potential applications in artisticeducational and scientific visualizations industrial imaging to analyzematerial properties and medical imaging to reconstruct subsurfaceelements In addition our time-resolved technique may motivatenew forms of computational photography

Chapter 1 Introduction

Forward and inverse analysis of light transport plays an important

role in diverse fields such as computer graphics computer visionand scientific imaging Because conventional imaging hardware isslow compared to the speed of light traditional computer graphicsand computer vision algorithms typically analyze transport usinglow time-resolution photos Consequently any information that isencoded in the time delays of light propagation is lost Whereas thejoint design of novel optical hardware and smart computation iecomputational photography has expanded the way we capture analyze and understand visual information speed-of-light propagationhas been largely unexplored In this paper we present a novel ultrafastimaging technique which we term femto-photography consistingof femtosecond laser illumination picosecond-accurate detectorsand mathematical reconstruction techniques to allow us to visualizemovies of light in motion as it travels through a scene withan effective framerate of about one half trillion frames per secondThis allows us to see for instance a light pulse scattering inside aplastic bottle or image formation in a mirror as a function of time

Challenges Developing such time-resolved system is a challengingproblem for several reasons that are under-appreciated in conventionalmethods (a) brute-force time exposures under 2 ps yieldan impractical signal-to-noise (SNR) ratio (b) suitable cameras torecord 2D image sequences at this time resolution do not exist dueto sensor bandwidth limitations (c) comprehensible visualizationof the captured time-resolved data is non-trivial and (d) direct measurementsof events appear warped in space-time because the finitespeed of light implies that the recorded light propagation delay dependson camera position relative to the sceneContributions Our main contribution is in addressing these challengesand creating a first prototype as follows_ We exploit the statistical similarity of periodic light transportevents to record multiple ultrashort exposure times of onedimensionalviews (Section 3)_ We introduce a novel hardware implementation to sweep theexposures across a vertical field of view to build 3D spacetimedata volumes (Section 4)_ We create techniques for comprehensible visualization includingmovies showing the dynamics of real-world lighttransport phenomena (including reflections scattering diffuseinter-reflections or beam diffraction) and the notion ofpeak-time which partially overcomes the low-frequency appearanceof integrated global light transport (Section 5)_ We introduce a time-unwarping technique to correct the distortionsin captured time-resolved information due to the finitespeed of light (Section 6)

Limitations Although not conceptual our setup has several practicallimitations primarily due to the limited SNR of scattered light Since the hardware elements in our system were originally designedfor different purposes it is not optimized for efficiency and suffersfrom low optical throughput (eg the detector is optimized for 500nm visible light while the infrared laser wavelength we use is 795nm) and from dynamic range limitations This lengthens the totalrecording time to approximately one hour Furthermore the scanningmirror rotating continuously introduces some blurring in thedata along the scanned (vertical) dimension Future optimized systemscan overcome these limitations

Chapter 2 Related WorkUltrafast Devices The fastest 2D continuous real-timemonochromatic camera operates at hundreds of nanoseconds perframe [Goda et al 2009] (about 6_ 106 frames per second) with aspatial resolution of 200_200 pixels less than one third of what weachieve Avalanche photodetector (APD) arrays can reach temporalresolutions of several tens of picoseconds if they are used in aphoton starved regime where only a single photon hits a detectorwithin a time window of tens of nanoseconds [Charbon 2007]Repetitive illumination techniques used in incoherent LiDAR [Tou1995 Gelbart et al 2002] use cameras with typical exposure timeson the order of hundreds of picoseconds [Busck and Heiselberg2004 Colaccedilo et al 2012] two orders of magnitude slower thanour system Liquid nonlinear shutters actuated with powerful laserpulses have been used to capture single analog frames imaginglight pulses at picosecond time resolution [Duguay and Mattick1971] Other sensors that use a coherent phase relation betweenthe illumination and the detected light such as optical coherencetomography (OCT) [Huang et al 1991] coherent LiDAR [Xia

and Zhang 2009] light-in-flight holography [Abramson 1978]or white light interferometry [Wyant 2002] achieve femtosecondresolutions however they require light to maintain coherence(ie wave interference effects) during light transport and aretherefore unsuitable for indirect illumination in which diffusereflections remove coherence from the light Simple streak sensorscapture incoherent light at picosecond to nanosecond speeds butare limited to a line or low resolution (20 _ 20) square field ofview [Campillo and Shapiro 1987 Itatani et al 2002 Shiragaet al 1995 Gelbart et al 2002 Kodama et al 1999 Qu et al2006] They have also been used as line scanning devices forimage transmission through highly scattering turbid media byrecording the ballistic photons which travel a straight path throughthe scatterer and thus arrive first on the sensor [Hebden 1993]

Figure 3 Left Photograph of our ultrafast imaging system setup The DSLR camera takes a conventional photo for comparison RightTime sequence illustrating the arrival of the pulse striking a diffuser its transformation into a spherical energy front and its propagationthrough the scene The corresponding captured scene is shown in Figure 10 (top row)

principles that we develop in this paper for the purpose of transientimaging were first demonstrated by Velten et al [2012c] Recentlyphotonic mixer devices along with nonlinear optimization havealso been used in this context [Heide et al 2013]Our system can record and reconstruct space-time world informationof incoherent light propagation in free-space table-top scenesat a resolution of up to 672 _ 1000 pixels and under 2 picosecondsper frame The varied range and complexity of the scenes

we capture allow us to visualize the dynamics of global illuminationeffects such as scattering specular reflections interreflectionssubsurface scattering caustics and diffractionTime-Resolved Imaging Recent advances in time-resolvedimaging have been exploited to recover geometry and motionaround corners [Raskar and Davis 2008 Kirmani et al 2011 Veltenet al 2012b Velten et al 2012a Gupta et al 2012 Pandharkaret al 2011] and albedo of from single view point [Naik et al 2011]But none of them explored the idea of capturing videos of light inmotion in direct view and have some fundamental limitations (suchas capturing only third-bounce light) that make them unsuitable forthe present purpose Wu et al [2012a] separate direct and global illuminationcomponents from time-resolved data captured with thesystem we describe in this paper by analyzing the time profile ofeach pixel In a recent publication [Wu et al 2012b] the authorspresent an analysis on transient light transport in frequency spaceand show how it can be applied to bare-sensor imaging Chapter 3 Capturing Space-Time PlanesWe capture time scales orders of magnitude faster than the exposuretimes of conventional cameras in which photons reaching thesensor at different times are integrated into a single value makingit impossible to observe ultrafast optical phenomena The systemdescribed in this paper has an effective exposure time down to 185ps since light travels at 03 mmps light travels approximately 05mm between frames in our reconstructed moviesSystem An ultrafast setup must overcome several difficulties inorder to accurately measure a high-resolution (both in space and

time) image First for an unamplified laser pulse a single exposuretime of less than 2 ps would not collect enough light so the SNRwould be unworkably low As an example for a table-top sceneilluminated by a 100Wbulb only about 1 photon on average wouldreach the sensor during a 2 ps open-shutter period Second becauseof the time scales involved synchronization of the sensor and theillumination must be executed within picosecond precision Thirdstandalone streak sensors sacrifice the vertical spatial dimension inorder to code the time dimension thus producing x-t images Asa consequence their field of view is reduced to a single horizontalline of view of the scene We solve these problems with our ultrafast imaging system outlinedin Figure 2 (A photograph of the actual setup is shown inFigure 3 (left)) The light source is a femtosecond (fs) Kerr lensmode-locked TiSapphire laser which emits 50-fs with a centerwavelength of 795 nm at a repetition rate of 75 MHz and averagepower of 500 mW In order to see ultrafast events in a scene withmacro-scaled objects we focus the light with a lens onto a Lambertiandiffuser which then acts as a point light source and illuminatesthe entire scene with a spherically-shaped pulse (see Figure3 (right)) Alternatively if we want to observe pulse propagationitself rather than the interactions with large objects we direct thelaser beam across the field of view of the camera through a scatteringmedium (see the bottle scene in Figure 1)Because all the pulses are statistically identical we can record thescattered light from many of them and integrate the measurementsto average out any noise The result is a signal with a high SNR Tosynchronize this illumination with the streak sensor (HamamatsuC5680 [Hamamatsu 2012]) we split off a portion of the beam with

a glass slide and direct it onto a fast photodetector connected to thesensor so that now both detector and illumination operate synchronously(see Figure 2 (a))Capturing space-time planes The streak sensor then capturesan x-t image of a certain scanline (ie a line of pixels in the horizontaldimension) of the scene with a space-time resolution of672 _ 512 The exact time resolution depends on the amplificaamplificationof an internal sweep voltage signal applied to the streak sensorWith our hardware it can be adjusted from 030 ps to 507 ps Practicallywe choose the fastest resolution that still allows for captureof the entire duration of the event In the streak sensor a photocathodeconverts incoming photons arriving from each spatial locationin the scanline into electrons The streak sensor generates the x-timage by deflecting these electrons according to the time of theirarrival to different positions along the t-dimension of the sensor(see Figure 2(b) and 2(c)) This is achieved by means of rapidlychanging the sweep voltage between the electrodes in the sensorFor each horizontal scanline the camera records a scene illuminatedby the pulse and averages the light scattered by 45 _ 108

pulses (see Figure 2(d) and 2(e))Performance Validation To characterize the streak sensor wecompare sensor measurements with known geometry and verify thelinearity reproducibility and calibration of the time measurementsTo do this we first capture a streak image of a scanline of a simplescene a plane being illuminated by the laser after hitting the diffuser

(see Figure 4 (left)) Then by using a Faro digitizer arm [Faro2012] we obtain the ground truth geometry of the points along that

plane and of the point of the diffuser hit by the laser this allows usto compute the total travel time per path (diffuser-plane-streak sensor)for each pixel in the scanline We then compare the travel timecaptured by our streak sensor with the real travel time computed from the known geometry

Chapter 4 Capturing Space-Time VolumesAlthough the synchronized pulsed measurements overcome SNRissues the streak sensor still provides only a one-dimensionalmovie Extension to two dimensions requires unfeasible bandwidthsa typical dimension is roughly 103 pixels so a threedimensionaldata cube has 109 elements Recording such a largequantity in a 1010485769 second (1 ns) time widow requires a bandwidthof 1018 bytes far beyond typical available bandwidthsWe solve this acquisition problem by again utilizing the synchronizedrepeatability of the hardware A mirror-scanning system (two9 cm _ 13 cm mirrors see Figure 3 (left)) rotates the camerarsquos centerof projection so that it records horizontal slices of a scene sequentiallyWe use a computer-controlled one-rpm servo motor torotate one of the mirrors and consequently scan the field of viewvertically The scenes are about 25 cm wide and placed about 1

meter from the camera With high gear ratios (up to 11000) thecontinuous rotation of the mirror is slow enough to allow the camerato record each line for about six seconds requiring about onehour for 600 lines (our video resolution) We generally capture extralines above and below the scene (up to 1000 lines) and thencrop them to match the aspect ratio of the physical scenes beforethe movie was reconstructedThese resulting images are combined into one matrixMijk wherei = 1672 and k = 1512 are the dimensions of the individualx-t streak images and j = 11000 addresses the second spatialdimension y For a given time instant k the submatrix Nij containsa two-dimensional image of the scene with a resolution of 672 _1000 pixels exposed for as short to 185 ps Combining the x-tslices of the scene for each scanline yields a 3D x-y-t data volumeas shown in Figure 5 (left) An x-y slice represents one frame of the

final movie as shown in Figure 5 (right)

Figure 5 Left Reconstructed x-y-t data volume by stacking individualx-t images (captured with the scanning mirrors) Right Anx-y slice of the data cube represents one frame of the final movie

Chapter 5 Depicting Ultrafast Videos in 2DWe have explored several ways to visualize the information containedin the captured x-y-t data cube in an intuitive way Firstcontiguous Nij slices can be played as the frames of a movie Figure1 (bottom row) shows a captured scene (bottle) along with severalrepresentative Nij frames (Effects are described for variousscenes in Section 7) However understanding all the phenomenashown in a video is not a trivial task and movies composed of x-yframes such as the ones shown in Figure 10 may be hard to interpretMerging a static photograph of the scene from approximately thesame point of view with the Nij slices aids in the understanding oflight transport in the scenes (see movies within the supplementaryvideo) Although straightforward to implement the high dynamicrange of the streak data requires a nonlinear intensity transformationto extract subtle optical effects in the presence of high intensityreflections We employ a logarithmic transformation to this endWe have also explored single-image methods for intuitive visualization

of full space-time propagation such as the color-coding inFigure 1 (right) which we describe in the following paragraphsIntegral Photo Fusion By integrating all the frames in novelways we can visualize and highlight different aspects of the lightflow in one photo Our photo fusion results are calculated asNij =PwkMijk fk = 1512g where wk is a weighting factordetermined by the particular fusion method We have tested severaldifferent methods of which two were found to yield the most intuitiveresults the first one is full fusion where wk = 1 for all kSumming all frames of the movie provides something resemblinga black and white photograph of the scene illuminated by the laserwhile showing time-resolved light transport effects An exampleis shown in Figure 6 (left) for the alien scene (More informationabout the scene is given in Section 7) A second technique rainbowfusion takes the fusion result and assigns a different RGB color toeach frame effectively color-coding the temporal dimension Anexample is shown in Figure 6 (middle)Peak Time Images The inherent integration in fusion methodsthough often useful can fail to reveal the most complex or subtlebehavior of light As an alternative we propose peak time imageswhich illustrate the time evolution of the maximum intensity in eachframe For each spatial position (i j) in the x-y-t volume we findthe peak intensity along the time dimension and keep informationwithin two time units to each side of the peak All other values inthe streak image are set to zero yielding a more sparse space-timevolume We then color-code time and sum up the x-y frames inthis new sparse volume in the same manner as in the rainbow fusioncase but use only every 20th frame in the sum to create blacklines between the equi-time paths or isochrones This results in amap of the propagation of maximum intensity contours which we

term peak time image These color-coded isochronous lines can bethought of intuitively as propagating energy fronts Figure 6 (right)shows the peak time image for the alien scene and Figure 1 (topmiddle) shows the captured data for the bottle scene depicted usingthis visualization method As explained in the next section thisvisualization of the bottle scene reveals significant light transportphenomena that could not be seen with the rainbow fusion visualization

Figure 7 Understanding reversal of events in captured videosLeft Pulsed light scatters from a source strikes a surface (egat P1 and P2) and is then recorded by a sensor Time taken bylight to travel distances z1 + d1 and z2 + d2 is responsible for theexistence of two different time frames and the need of computationalcorrection to visualize the captured data in the world time frameRight Light appears to be propagating from P2 to P1 in cameratime (before unwarping) and from P1 to P2 in world time oncetime-unwarped Extended planar surfaces will intersect constanttimepaths to produce either elliptical or circular fronts

Chapter 6 Time UnwarpingVisualization of the captured movies (Sections 5 and 7) reveals resultsthat are counter-intuitive to theoretical and established knowledgeof light transport Figure 1 (top middle) shows a peak timevisualization of the bottle scene where several abnormal light transporteffects can be observed (1) the caustics on the floor whichpropagate towards the bottle instead of away from it (2) the curvedspherical energy fronts in the label area which should be rectilinearas seen from the camera and (3) the pulse itself being locatedbehind these energy fronts when it would need to precede themThese are due to the fact that usually light propagation is assumed

to be infinitely fast so that events in world space are assumed to bedetected simultaneously in camera space In our ultrafast photographysetup however this assumption no longer holds and the finitespeed of light becomes a factor we must now take into account thetime delay between the occurrence of an event and its detection bythe camera sensorWe therefore need to consider two different time frames namelyworld time (when events happen) and camera time (when events aredetected) This duality of time frames is explained in Figure 7 lightfrom a source hits a surface first at point P1 = (i1 j1) (with (i j)being the x-y pixel coordinates of a scene point in the x-y-t datacube) then at the farther point P2 = (i2 j2) but the reflected lightis captured in the reverse order by the sensor due to different totalpath lengths (z1 + d1 gt z2 + d2) Generally this is due to the factthat for light to arrive at a given time instant t0 all the ray(later-time) isochrone than farther oness fromthe source to the wall to the camera must satisfy zi+di = ct0 sothat isochrones are elliptical Therefore although objects closer tothe source receive light earlier they can still lie on a higher In order to visualize all light transport events as they have occurred(not as the camera captured them) we transform the captured datafrom camera time to world time a transformation which we termtime unwarping Mathematically for a scene point P = (i j) weapply the following transformationt0

ij = tij +zij

c=_(1)where t0

ij and tij represent camera and world times respectively

c is the speed of light in vacuum _ the index of refraction of themedium and zij is the distance from point P to the camera Forour table-top scenes we measure this distance with a Faro digitizerarm although it could be obtained from the data and the knownposition of the diffuser as the problem is analogous to that of bistaticLiDAR We can thus define light travel time from each point(i j) in the scene to the camera as _tij = t0

ij 1048576 tij = zij=(c=_)Then time unwarping effectively corresponds to offsetting data inthe x-y-t volume along the time dimension according to the valueof _tij for each of the (i j) points as shown in Figure 8In most of the scenes we only have propagation of light through airfor which we take _ _ 1 For the bottle scene we assume that thelaser pulse travels along its longitudinal axis at the speed of lightand that only a single scattering event occurs in the liquid insideWe take _ = 133 as the index of refraction of the liquid and ignorerefraction at the bottlersquos surface A step-by-step unwarping processis shown in Figure 9 for a frame (ie x-y image) of the bottle sceneOur unoptimized Matlab code runs at about 01 seconds per frameA time-unwarped peak-time visualization of the whole of this sceneis shown in Figure 1 (right) Notice how now the caustics originatefrom the bottle and propagate outward energy fronts along the labelare correctly depicted as straight lines and the pulse precedesrelated phenomena as expected

Chapter 7 Captured ScenesWe have used our ultrafast photography setup to capture interestinglight transport effects in different scenes Figure 10 summarizes

them showing representative frames and peak time visualizationsThe exposure time for our scenes is between 185 ps for the crystalscene and 507 ps for the bottle and tank scenes which requiredimaging a longer time span for better visualization Please refer tothe video in the supplementary material to watch the reconstructedmovies Overall observing light in such slow motion reveals bothsubtle and key aspects of light transport We provide here briefdescriptions of the light transport effects captured in the differentscenesBottle This scene is shown in Figure 1 (bottom row) and hasbeen used to introduce time-unwarping A plastic bottle filled withwater diluted with milk is directly illuminated by the laser pulseentering through the bottom of the bottle along its longitudinal axisThe pulse scatters inside the liquid we can see the propagation ofthe wavefronts The geometry of the bottle neck creates some interestinglens effects making light look almost like a fluid Most ofthe light is reflected back from the cap while some is transmitted ortrapped in subsurface scattering phenomena Caustics are generatedon the tableTomato-tape This scene shows a tomato and a tape roll with awall behind them The propagation of the spherical wavefront afterthe laser pulse hits the diffuser can be seen clearly as it intersectsthe floor and the back wall (A B) The inside of the tape roll is outof the line of sight of the light source and is not directly illuminated

It is illuminated later as indirect light scattered from the first wavereaches it (C) Shadows become visible only after the object hasbeen illuminated The more opaque tape darkens quickly after thelight front has passed while the tomato continues glowing for alonger time indicative of stronger subsurface scattering (D)Alien A toy alien is positioned in front of a mirror and wall Lightinteractions in this scene are extremely rich due to the mirror themultiple interreflections and the subsurface scattering in the toyThe video shows how the reflection in the mirror is actually formeddirect light first reaches the toy but the mirror is still completelydark (E) eventually light leaving the toy reaches the mirror andthe reflection is dynamically formed (F) Subsurface scattering isclearly present in the toy (G) while multiple direct and indirectinteractions between the wall and the mirror can also be seen (H)Crystal A group of sugar crystals is directly illuminated by thelaser from the left acting as multiple lenses and creating causticson the table (I) Part of the light refracted on the table is reflectedback to the candy creating secondary caustics on the table (J) Additionallyscattering events are visible within the crystals (K)

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8 Conclusions and Future WorkOur research fosters new computational imaging and image processingopportunities by providing incoherent time-resolved informationat ultrafast temporal resolutions We hope our workwill inspire new research in computer graphics and computationalphotography by enabling forward and inverse analysis of lighttransport allowing for full scene capture of hidden geometry andmaterials or for relighting photographs To this end capturedmovies and data of the scenes shown in this paper are available atfemtocamerainfo This exploitation in turn may influencethe rapidly emerging field of ultrafast imaging hardware

The system could be extended to image in color by adding additionalpulsed laser sources at different colors or by using one continuouslytunable optical parametric oscillator (OPO) A second colorof about 400 nm could easily be added to the existing system bydoubling the laser frequency with a nonlinear crystal (about $1000)The streak tube is sensitive across the entire visible spectrum witha peak sensitivity at about 450 nm (about five times the sensitivityat 800 nm) Scaling to bigger scenes would require less timeresolution and could therefore simplify the imaging setup Scalingshould be possible without signal degradation as long as the cameraaperture and lens are scaled with the rest of the setup If theaperture stays the same the light intensity needs to be increasedquadratically to obtain similar resultsBeyond the ability of the commercially available streak sensor advancesin optics material science and compressive sensing maybring further optimization of the system which could yield increasedresolution of the captured x-t streak images Nonlinearshutters may provide an alternate path to femto-photography capturesystems However nonlinear optical methods require exoticmaterials and strong light intensities that can damage the objects ofinterest (and must be provided by laser light) Further they oftensuffer from physical instabilities

We believe that mass production of streak sensors can lead to affordablesystems Also future designs may overcome the currentlimitations of our prototype regarding optical efficiency Futureresearch can investigate other ultrafast phenomena such as propagationof light in anisotropic media and photonic crystals or maybe used in applications such as scientific visualization (to understandultra-fast processes) medicine (to reconstruct subsurface elements)material engineering (to analyze material properties) orquality control (to detect faults in structures) This could provideradically new challenges in the realm of computer graphics Graphicsresearch can enable new insights via comprehensible simulationsand new data structures to render light in motion For instancerelativistic rendering techniques have been developed usingour data where the common assumption of constant irradiance overthe surfaces does no longer hold [Jarabo et al 2013] It may alsoallow a better understanding of scattering and may lead to new physically valid models as well as spawn new art forms

ReferencesABRAMSON N 1978 Light-in-flight recording by holographyOptics Letters 3 4 121ndash123BUSCK J AND HEISELBERG H 2004 Gated viewing and highaccuracythree-dimensional laser radar Applied optics 43 244705ndash4710CAMPILLO A AND SHAPIRO S 1987 Picosecond streak camerafluorometry a review IEEE Journal of Quantum Electronics19 4 585ndash603CHARBON E 2007 Will avalanche photodiode arrays ever reach 1megapixel In International Image Sensor Workshop 246ndash249COLACcedil O A KIRMANI A HOWLAND G A HOWELL J CAND GOYAL V K 2012 Compressive depth map acquisitionusing a single photon-counting detector Parametric signal processingmeets sparsity In IEEE Computer Vision and PatternRecognition CVPR 2012 96ndash102DUGUAY M A AND MATTICK A T 1971 Pulsed-image generationand detection Applied Optics 10 2162ndash2170FARO 2012 Faro Technologies Inc Measuring Arms httpwwwfarocomGBUR G 2012 A camera fast enough to watch lightmove httpskullsinthestarscom20120104a-camera-fast-enough-to-watch-light-moveGELBART A REDMAN B C LIGHT R S SCHWARTZLOWC A AND GRIFFIS A J 2002 Flash lidar based on multipleslitstreak tube imaging lidar SPIE vol 4723 9ndash18GODA K TSIA K K AND JALALI B 2009 Serial timeencodedamplified imaging for real-time observation of fast dynamicphenomena Nature 458 1145ndash1149GUPTA O WILLWACHER T VELTEN A VEERARAGHAVANA AND RASKAR R 2012 Reconstruction of hidden3D shapes using diffuse reflections Optics Express 20 19096ndash19108HAMAMATSU 2012 Guide to Streak Cameras httpsaleshamamatsucomassetspdfcatsandguidese_streakhpdfHEBDEN J C 1993 Line scan acquisition for time-resolved

imaging through scattering media Opt Eng 32 3 626ndash633HEIDE F HULLIN M GREGSON J AND HEIDRICH W2013 Low-budget transient imaging using photonic mixer devicesACM Trans Graph 32 4HUANG D SWANSON E LIN C SCHUMAN J STINSONW CHANG W HEE M FLOTTE T GREGORY K ANDPULIAFITO C 1991 Optical coherence tomography Science254 5035 1178ndash1181ITATANI J QUacuteE RacuteE F YUDIN G L IVANOV M Y KRAUSZF AND CORKUM P B 2002 Attosecond streak camera PhysRev Lett 88 173903JARABO A MASIA B AND GUTIERREZ D 2013 Transientrendering and relativistic visualization Tech Rep TR-01-2013Universidad de Zaragoza April

Page 4: Gaurav.report on femto photography

TABLE OF CONTENTS

I Certificate i

II Acknowledgement ii

Chapter 1

Femto photography An Introduction

11 Introduction

12 Femto photography

13 Challenges

14 Contribution

15 Limitation

Chapter 2

RELATED WORKS

II1Time resolved Imaging

Chapter 3

Capturing Space-Time Planes

31System

32 Capturing space-time planes

33Performance Validation

Chapter 4

Capturing Space-Time Volumes

Chapter 5

Depicting Ultrafast Videos in 2D

51IPeak Time Images

52 Integral Photo Fusion

Chapter 6

Time Unwarping

Chapter 7

Captured Scenes

71Bottle

72Crystal

Chapter 8

CONCLUSION

Chapter 9

REFERENCES

ABSTRACT

We present femto-photography a novel imaging technique to captureand visualize the propagation of light With an effective exposuretime of 185 picoseconds (ps) per frame we reconstruct moviesof ultrafast events at an equivalent resolution of about one half trillionframes per second Because cameras with this shutter speeddo not exist we re-purpose modern imaging hardware to record anensemble average of repeatable events that are synchronized to astreak sensor in which the time of arrival of light from the scene iscoded in one of the sensorrsquos spatial dimensions We introduce reconstructionmethods that allow us to visualize the propagation offemtosecond light pulses through macroscopic scenes at such fastresolution we must consider the notion of time-unwarping betweenthe camerarsquos and the worldrsquos space-time coordinate systems to take

into account effects associated with the finite speed of light Weapply our femto-photography technique to visualizations of verydifferent scenes which allow us to observe the rich dynamics oftime-resolved light transport effects including scattering specularreflections diffuse interreflections diffraction caustics and subsurfacescattering Our work has potential applications in artisticeducational and scientific visualizations industrial imaging to analyzematerial properties and medical imaging to reconstruct subsurfaceelements In addition our time-resolved technique may motivatenew forms of computational photography

Chapter 1 Introduction

Forward and inverse analysis of light transport plays an important

role in diverse fields such as computer graphics computer visionand scientific imaging Because conventional imaging hardware isslow compared to the speed of light traditional computer graphicsand computer vision algorithms typically analyze transport usinglow time-resolution photos Consequently any information that isencoded in the time delays of light propagation is lost Whereas thejoint design of novel optical hardware and smart computation iecomputational photography has expanded the way we capture analyze and understand visual information speed-of-light propagationhas been largely unexplored In this paper we present a novel ultrafastimaging technique which we term femto-photography consistingof femtosecond laser illumination picosecond-accurate detectorsand mathematical reconstruction techniques to allow us to visualizemovies of light in motion as it travels through a scene withan effective framerate of about one half trillion frames per secondThis allows us to see for instance a light pulse scattering inside aplastic bottle or image formation in a mirror as a function of time

Challenges Developing such time-resolved system is a challengingproblem for several reasons that are under-appreciated in conventionalmethods (a) brute-force time exposures under 2 ps yieldan impractical signal-to-noise (SNR) ratio (b) suitable cameras torecord 2D image sequences at this time resolution do not exist dueto sensor bandwidth limitations (c) comprehensible visualizationof the captured time-resolved data is non-trivial and (d) direct measurementsof events appear warped in space-time because the finitespeed of light implies that the recorded light propagation delay dependson camera position relative to the sceneContributions Our main contribution is in addressing these challengesand creating a first prototype as follows_ We exploit the statistical similarity of periodic light transportevents to record multiple ultrashort exposure times of onedimensionalviews (Section 3)_ We introduce a novel hardware implementation to sweep theexposures across a vertical field of view to build 3D spacetimedata volumes (Section 4)_ We create techniques for comprehensible visualization includingmovies showing the dynamics of real-world lighttransport phenomena (including reflections scattering diffuseinter-reflections or beam diffraction) and the notion ofpeak-time which partially overcomes the low-frequency appearanceof integrated global light transport (Section 5)_ We introduce a time-unwarping technique to correct the distortionsin captured time-resolved information due to the finitespeed of light (Section 6)

Limitations Although not conceptual our setup has several practicallimitations primarily due to the limited SNR of scattered light Since the hardware elements in our system were originally designedfor different purposes it is not optimized for efficiency and suffersfrom low optical throughput (eg the detector is optimized for 500nm visible light while the infrared laser wavelength we use is 795nm) and from dynamic range limitations This lengthens the totalrecording time to approximately one hour Furthermore the scanningmirror rotating continuously introduces some blurring in thedata along the scanned (vertical) dimension Future optimized systemscan overcome these limitations

Chapter 2 Related WorkUltrafast Devices The fastest 2D continuous real-timemonochromatic camera operates at hundreds of nanoseconds perframe [Goda et al 2009] (about 6_ 106 frames per second) with aspatial resolution of 200_200 pixels less than one third of what weachieve Avalanche photodetector (APD) arrays can reach temporalresolutions of several tens of picoseconds if they are used in aphoton starved regime where only a single photon hits a detectorwithin a time window of tens of nanoseconds [Charbon 2007]Repetitive illumination techniques used in incoherent LiDAR [Tou1995 Gelbart et al 2002] use cameras with typical exposure timeson the order of hundreds of picoseconds [Busck and Heiselberg2004 Colaccedilo et al 2012] two orders of magnitude slower thanour system Liquid nonlinear shutters actuated with powerful laserpulses have been used to capture single analog frames imaginglight pulses at picosecond time resolution [Duguay and Mattick1971] Other sensors that use a coherent phase relation betweenthe illumination and the detected light such as optical coherencetomography (OCT) [Huang et al 1991] coherent LiDAR [Xia

and Zhang 2009] light-in-flight holography [Abramson 1978]or white light interferometry [Wyant 2002] achieve femtosecondresolutions however they require light to maintain coherence(ie wave interference effects) during light transport and aretherefore unsuitable for indirect illumination in which diffusereflections remove coherence from the light Simple streak sensorscapture incoherent light at picosecond to nanosecond speeds butare limited to a line or low resolution (20 _ 20) square field ofview [Campillo and Shapiro 1987 Itatani et al 2002 Shiragaet al 1995 Gelbart et al 2002 Kodama et al 1999 Qu et al2006] They have also been used as line scanning devices forimage transmission through highly scattering turbid media byrecording the ballistic photons which travel a straight path throughthe scatterer and thus arrive first on the sensor [Hebden 1993]

Figure 3 Left Photograph of our ultrafast imaging system setup The DSLR camera takes a conventional photo for comparison RightTime sequence illustrating the arrival of the pulse striking a diffuser its transformation into a spherical energy front and its propagationthrough the scene The corresponding captured scene is shown in Figure 10 (top row)

principles that we develop in this paper for the purpose of transientimaging were first demonstrated by Velten et al [2012c] Recentlyphotonic mixer devices along with nonlinear optimization havealso been used in this context [Heide et al 2013]Our system can record and reconstruct space-time world informationof incoherent light propagation in free-space table-top scenesat a resolution of up to 672 _ 1000 pixels and under 2 picosecondsper frame The varied range and complexity of the scenes

we capture allow us to visualize the dynamics of global illumination effects, such as scattering, specular reflections, interreflections, subsurface scattering, caustics, and diffraction.

Time-resolved imaging. Recent advances in time-resolved imaging have been exploited to recover geometry and motion around corners [Raskar and Davis 2008; Kirmani et al. 2011; Velten et al. 2012b; Velten et al. 2012a; Gupta et al. 2012; Pandharkar et al. 2011] and albedo from a single viewpoint [Naik et al. 2011]. None of these works, however, explored the idea of capturing videos of light in motion in direct view, and they have some fundamental limitations (such as capturing only third-bounce light) that make them unsuitable for the present purpose. Wu et al. [2012a] separate direct and global illumination components from time-resolved data captured with the system we describe in this paper, by analyzing the time profile of each pixel. In a recent publication [Wu et al. 2012b], the authors present an analysis of transient light transport in frequency space, and show how it can be applied to bare-sensor imaging.

Chapter 3 Capturing Space-Time Planes

We capture time scales orders of magnitude faster than the exposure times of conventional cameras, in which photons reaching the sensor at different times are integrated into a single value, making it impossible to observe ultrafast optical phenomena. The system described in this paper has an effective exposure time down to 1.85 ps; since light travels at 0.3 mm/ps, it advances approximately 0.5 mm between frames in our reconstructed movies.

System. An ultrafast setup must overcome several difficulties in order to accurately measure a high-resolution (both in space and time) image. First, for an unamplified laser pulse, a single exposure time of less than 2 ps would not collect enough light, so the SNR would be unworkably low. As an example, for a table-top scene illuminated by a 100 W bulb, only about 1 photon on average would reach the sensor during a 2 ps open-shutter period. Second, because of the time scales involved, synchronization of the sensor and the illumination must be executed within picosecond precision. Third, standalone streak sensors sacrifice the vertical spatial dimension in order to code the time dimension, thus producing x-t images; as a consequence, their field of view is reduced to a single horizontal line of view of the scene.

We solve these problems with our ultrafast imaging system, outlined in Figure 2. (A photograph of the actual setup is shown in Figure 3 (left).) The light source is a femtosecond (fs) Kerr-lens mode-locked Ti:Sapphire laser, which emits 50-fs pulses with a center wavelength of 795 nm, at a repetition rate of 75 MHz and an average power of 500 mW. In order to see ultrafast events in a scene with macro-scaled objects, we focus the light with a lens onto a Lambertian diffuser, which then acts as a point light source and illuminates the entire scene with a spherically-shaped pulse (see Figure 3 (right)). Alternatively, if we want to observe pulse propagation itself, rather than its interactions with large objects, we direct the laser beam across the field of view of the camera through a scattering medium (see the bottle scene in Figure 1).

Because all the pulses are statistically identical, we can record the scattered light from many of them and integrate the measurements to average out any noise; the result is a signal with a high SNR. To synchronize this illumination with the streak sensor (Hamamatsu C5680 [Hamamatsu 2012]), we split off a portion of the beam with a glass slide and direct it onto a fast photodetector connected to the sensor, so that both detector and illumination operate synchronously (see Figure 2 (a)).

Capturing space-time planes. The streak sensor captures an x-t image of a certain scanline (i.e., a line of pixels in the horizontal dimension) of the scene, with a space-time resolution of 672 x 512. The exact time resolution depends on the amplification of an internal sweep voltage signal applied to the streak sensor. With our hardware, it can be adjusted from 0.30 ps to 5.07 ps. In practice, we choose the fastest resolution that still allows capture of the entire duration of the event. In the streak sensor, a photocathode converts incoming photons, arriving from each spatial location in the scanline, into electrons. The streak sensor generates the x-t image by deflecting these electrons, according to the time of their arrival, to different positions along the t-dimension of the sensor (see Figures 2(b) and 2(c)). This is achieved by means of rapidly changing the sweep voltage between the electrodes in the sensor. For each horizontal scanline, the camera records the scene illuminated by the pulse and averages the light scattered by 4.5 x 10^8 pulses (see Figures 2(d) and 2(e)).

Performance validation. To characterize the streak sensor, we compare sensor measurements with known geometry, and verify the linearity, reproducibility, and calibration of the time measurements. To do this, we first capture a streak image of a scanline of a simple scene: a plane being illuminated by the laser after hitting the diffuser (see Figure 4 (left)). Then, using a Faro digitizer arm [Faro 2012], we obtain the ground-truth geometry of the points along that plane and of the point of the diffuser hit by the laser; this allows us to compute the total travel time per path (diffuser to plane to streak sensor) for each pixel in the scanline. We then compare the travel time captured by our streak sensor with the real travel time computed from the known geometry.
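As an illustration of this validation step, the following Python sketch computes the expected diffuser-to-plane-to-sensor travel time for each pixel of a scanline, which can then be compared against the time axis of the measured streak image. The function name and the scene coordinates are hypothetical, chosen only to mirror the description above:

    import numpy as np

    C = 0.3  # speed of light in mm/ps, as used in the text

    def expected_travel_times(diffuser, scanline_points, camera):
        # Total path time diffuser -> scene point -> camera, in picoseconds.
        # scanline_points: (N, 3) ground-truth positions (mm) along the imaged
        # scanline, e.g. obtained with a digitizer arm.
        d_illum = np.linalg.norm(scanline_points - diffuser, axis=1)
        d_return = np.linalg.norm(scanline_points - camera, axis=1)
        return (d_illum + d_return) / C

    # Hypothetical example: a 672-pixel scanline on a plane ~1 m from the camera.
    xs = np.linspace(-125.0, 125.0, 672)           # scene is about 25 cm wide
    plane = np.stack([xs, np.zeros(672), np.full(672, 1000.0)], axis=1)
    t_expected = expected_travel_times(np.array([0.0, -200.0, 500.0]),
                                       plane, np.zeros(3))
    # Plotting t_expected against the measured x-t streak image verifies the
    # linearity and calibration of the sweep.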

Chapter 4 Capturing Space-Time Volumes

Although the synchronized, pulsed measurements overcome SNR issues, the streak sensor still provides only a one-dimensional movie. Extension to two dimensions requires unfeasible bandwidths: a typical dimension is roughly 10^3 pixels, so a three-dimensional data cube has 10^9 elements. Recording such a large quantity in a 10^-9 second (1 ns) time window requires a bandwidth of 10^18 bytes per second, far beyond typical available bandwidths.

We solve this acquisition problem by again utilizing the synchronized repeatability of the hardware. A mirror-scanning system (two 9 cm x 13 cm mirrors, see Figure 3 (left)) rotates the camera's center of projection, so that it records horizontal slices of the scene sequentially. We use a computer-controlled, one-rpm servo motor to rotate one of the mirrors and consequently scan the field of view vertically. The scenes are about 25 cm wide and placed about 1 meter from the camera. With high gear ratios (up to 1:1000), the continuous rotation of the mirror is slow enough to allow the camera to record each line for about six seconds, requiring about one hour for 600 lines (our video resolution). We generally capture extra lines above and below the scene (up to 1000 lines), and then crop them to match the aspect ratio of the physical scene before the movie is reconstructed.

The resulting images are combined into one matrix M_ijk, where i = 1..672 and k = 1..512 are the dimensions of the individual x-t streak images, and j = 1..1000 addresses the second spatial dimension y. For a given time instant k, the submatrix N_ij contains a two-dimensional image of the scene with a resolution of 672 x 1000 pixels, exposed for as short as 1.85 ps. Combining the x-t slices of the scene for each scanline yields a 3D x-y-t data volume, as shown in Figure 5 (left). An x-y slice represents one frame of the final movie, as shown in Figure 5 (right).

Figure 5. Left: Reconstructed x-y-t data volume, formed by stacking individual x-t images (captured with the scanning mirrors). Right: An x-y slice of the data cube represents one frame of the final movie.
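The construction of the data volume is simple once the per-scanline captures exist. A minimal sketch, assuming the 1000 x-t streak images have already been loaded as NumPy arrays (the loader itself is not shown, and its interface is an assumption):

    import numpy as np

    NX, NY, NT = 672, 1000, 512  # streak width, scanned lines, time bins

    def assemble_volume(streak_images):
        # Stack per-scanline x-t images (each of shape NX x NT, one per mirror
        # position) into the x-y-t data cube M_ijk described in the text.
        M = np.zeros((NX, NY, NT), dtype=np.float32)
        for j, xt in enumerate(streak_images):   # j indexes the scanned y axis
            M[:, j, :] = xt
        return M

    # A frame of the final movie is an x-y slice at a fixed time index k:
    #   frame_k = M[:, :, k]   # a 672 x 1000 image, ~1.85 ps effective exposure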

Chapter 5 Depicting Ultrafast Videos in 2D

We have explored several ways to visualize the information contained in the captured x-y-t data cube in an intuitive way. First, contiguous N_ij slices can be played as the frames of a movie. Figure 1 (bottom row) shows a captured scene (bottle) along with several representative N_ij frames. (Effects are described for various scenes in Section 7.) However, understanding all the phenomena shown in a video is not a trivial task, and movies composed of x-y frames, such as the ones shown in Figure 10, may be hard to interpret. Merging a static photograph of the scene, taken from approximately the same point of view, with the N_ij slices aids the understanding of light transport in the scenes (see the movies within the supplementary video). Although straightforward to implement, the high dynamic range of the streak data requires a nonlinear intensity transformation to extract subtle optical effects in the presence of high-intensity reflections; we employ a logarithmic transformation to this end. We have also explored single-image methods for intuitive visualization of full space-time propagation, such as the color-coding in Figure 1 (right), which we describe in the following paragraphs.

Integral photo fusion. By integrating all the frames in novel ways, we can visualize and highlight different aspects of the light flow in one photo. Our photo fusion results are calculated as N_ij = sum_k w_k M_ijk, k = 1..512, where w_k is a weighting factor determined by the particular fusion method. We have tested several different methods, of which two were found to yield the most intuitive results. The first one is full fusion, where w_k = 1 for all k; summing all frames of the movie provides something resembling a black-and-white photograph of the scene illuminated by the laser, while showing time-resolved light transport effects. An example is shown in Figure 6 (left) for the alien scene. (More information about the scene is given in Section 7.) A second technique, rainbow fusion, takes the fusion result and assigns a different RGB color to each frame, effectively color-coding the temporal dimension; an example is shown in Figure 6 (middle).

Peak time images. The inherent integration in fusion methods, though often useful, can fail to reveal the most complex or subtle behavior of light. As an alternative, we propose peak time images, which illustrate the time evolution of the maximum intensity in each frame. For each spatial position (i, j) in the x-y-t volume, we find the peak intensity along the time dimension, and keep information within two time units to each side of the peak; all other values in the streak image are set to zero, yielding a sparser space-time volume. We then color-code time and sum up the x-y frames of this new sparse volume in the same manner as in the rainbow fusion case, but use only every 20th frame in the sum, to create black lines between the equi-time paths, or isochrones. This results in a map of the propagation of maximum-intensity contours, which we term a peak time image. These color-coded isochronous lines can be thought of intuitively as propagating energy fronts. Figure 6 (right) shows the peak time image for the alien scene, and Figure 1 (top middle) shows the captured data for the bottle scene depicted using this visualization method. As explained in the next section, this visualization of the bottle scene reveals significant light transport phenomena that could not be seen with the rainbow fusion visualization.
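Both visualizations reduce to simple operations over the x-y-t volume. The sketch below shows one plausible Python implementation; the volume M, the choice of colormap, and the final normalization are assumptions, since the report does not specify them:

    import numpy as np
    from matplotlib import cm

    def integral_fusion(M, weights=None):
        # N_ij = sum_k w_k * M_ijk; full fusion uses w_k = 1 for all k.
        w = np.ones(M.shape[2]) if weights is None else weights
        return np.tensordot(M, w, axes=([2], [0]))

    def peak_time_image(M, halfwidth=2, step=20):
        # Keep intensity within `halfwidth` frames of each pixel's temporal
        # peak, color-code time, and sum every `step`-th frame so that black
        # gaps separate the isochrones.
        NX, NY, NT = M.shape
        k_peak = M.argmax(axis=2)                     # per-pixel peak time
        k = np.arange(NT)[None, None, :]
        sparse = np.where(np.abs(k - k_peak[:, :, None]) <= halfwidth, M, 0.0)
        image = np.zeros((NX, NY, 3))
        for f in range(0, NT, step):
            rgb = np.array(cm.jet(f / (NT - 1))[:3])  # color encodes time
            image += sparse[:, :, f, None] * rgb[None, None, :]
        return image / image.max()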

Figure 7. Understanding the reversal of events in captured videos. Left: Pulsed light scatters from a source, strikes a surface (e.g., at P1 and P2), and is then recorded by a sensor. The time taken by light to travel the distances z1 + d1 and z2 + d2 is responsible for the existence of two different time frames, and for the need of a computational correction to visualize the captured data in the world time frame. Right: Light appears to be propagating from P2 to P1 in camera time (before unwarping), and from P1 to P2 in world time, once time-unwarped. Extended planar surfaces will intersect constant-time paths to produce either elliptical or circular fronts.

Chapter 6 Time Unwarping

Visualization of the captured movies (Sections 5 and 7) reveals results that are counter-intuitive to theoretical and established knowledge of light transport. Figure 1 (top middle) shows a peak time visualization of the bottle scene, where several abnormal light transport effects can be observed: (1) the caustics on the floor, which propagate towards the bottle instead of away from it; (2) the curved spherical energy fronts in the label area, which should be rectilinear as seen from the camera; and (3) the pulse itself being located behind these energy fronts, when it would need to precede them. These occur because light propagation is usually assumed to be infinitely fast, so that events in world space are assumed to be detected simultaneously in camera space. In our ultrafast photography setup, however, this assumption no longer holds, and the finite speed of light becomes a factor: we must now take into account the time delay between the occurrence of an event and its detection by the camera sensor.

We therefore need to consider two different time frames, namely world time (when events happen) and camera time (when events are detected). This duality of time frames is explained in Figure 7: light from a source hits a surface first at point P1 = (i1, j1) (with (i, j) being the x-y pixel coordinates of a scene point in the x-y-t data cube), then at the farther point P2 = (i2, j2), but the reflected light is captured in the reverse order by the sensor, due to different total path lengths (z1 + d1 > z2 + d2). Generally, this is due to the fact that, for light to arrive at a given time instant t0, all the rays from the source, to the wall, to the camera, must satisfy zi + di = c t0, so that isochrones are elliptical. Therefore, although objects closer to the source receive light earlier, they can still lie on a higher (later-time) isochrone than farther ones.

In order to visualize all light transport events as they occurred (not as the camera captured them), we transform the captured data from camera time to world time, a transformation which we term time unwarping. Mathematically, for a scene point P = (i, j), we apply the following transformation:

t'_ij = t_ij + z_ij / (c/η)    (1)

where t'_ij and t_ij represent camera and world times respectively, c is the speed of light in vacuum, η the index of refraction of the medium, and z_ij is the distance from point P to the camera. For our table-top scenes, we measure this distance with a Faro digitizer arm, although it could be obtained from the data and the known position of the diffuser, as the problem is analogous to that of bistatic LiDAR. We can thus define the light travel time from each point (i, j) in the scene to the camera as Δt_ij = t'_ij − t_ij = z_ij / (c/η). Time unwarping then effectively corresponds to offsetting data in the x-y-t volume along the time dimension, according to the value of Δt_ij for each of the (i, j) points, as shown in Figure 8.

In most of the scenes, we only have propagation of light through air, for which we take η ≈ 1. For the bottle scene, we assume that the laser pulse travels along its longitudinal axis at the speed of light, and that only a single scattering event occurs in the liquid inside. We take η = 1.33 as the index of refraction of the liquid, and ignore refraction at the bottle's surface. A step-by-step unwarping process is shown in Figure 9 for a frame (i.e., x-y image) of the bottle scene. Our unoptimized Matlab code runs at about 0.1 seconds per frame. A time-unwarped peak-time visualization of the whole of this scene is shown in Figure 1 (right). Notice how the caustics now originate from the bottle and propagate outward, how energy fronts along the label are correctly depicted as straight lines, and how the pulse precedes related phenomena, as expected.
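In code, the unwarp is a per-pixel shift of each time profile by Δt_ij. The sketch below assumes a per-pixel depth map z (in mm) and the 1.85 ps frame duration from earlier sections; rounding the shift to whole frames is our simplification, not a detail given in the report:

    import numpy as np

    def time_unwarp(M, z, eta=1.0, c=0.3, frame_ps=1.85):
        # Shift each pixel's time profile from camera time to world time:
        # Delta t_ij = z_ij / (c / eta), converted to an integer frame offset.
        NX, NY, NT = M.shape
        shift = np.rint(z / (c / eta) / frame_ps).astype(int)
        out = np.zeros_like(M)
        for i in range(NX):
            for j in range(NY):
                s = min(max(shift[i, j], 0), NT)
                # An event detected s frames late is moved s frames earlier.
                out[i, j, :NT - s] = M[i, j, s:]
        return out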

Chapter 7 Captured Scenes

We have used our ultrafast photography setup to capture interesting light transport effects in different scenes. Figure 10 summarizes them, showing representative frames and peak time visualizations. The exposure time for our scenes is between 1.85 ps, for the crystal scene, and 5.07 ps, for the bottle and tank scenes, which required imaging a longer time span for better visualization. Please refer to the video in the supplementary material to watch the reconstructed movies. Overall, observing light in such slow motion reveals both subtle and key aspects of light transport. We provide here brief descriptions of the light transport effects captured in the different scenes.

Bottle. This scene is shown in Figure 1 (bottom row), and has been used to introduce time unwarping. A plastic bottle, filled with water diluted with milk, is directly illuminated by the laser pulse, entering through the bottom of the bottle along its longitudinal axis. The pulse scatters inside the liquid, and we can see the propagation of the wavefronts. The geometry of the bottle neck creates some interesting lens effects, making light look almost like a fluid. Most of the light is reflected back from the cap, while some is transmitted or trapped in subsurface scattering phenomena. Caustics are generated on the table.

Tomato-tape. This scene shows a tomato and a tape roll, with a wall behind them. The propagation of the spherical wavefront, after the laser pulse hits the diffuser, can be seen clearly as it intersects the floor and the back wall (A, B). The inside of the tape roll is out of the line of sight of the light source and is not directly illuminated; it is illuminated later, as indirect light scattered from the first wave reaches it (C). Shadows become visible only after the object has been illuminated. The more opaque tape darkens quickly after the light front has passed, while the tomato continues glowing for a longer time, indicative of stronger subsurface scattering (D).

Alien. A toy alien is positioned in front of a mirror and a wall. Light interactions in this scene are extremely rich, due to the mirror, the multiple interreflections, and the subsurface scattering in the toy. The video shows how the reflection in the mirror is actually formed: direct light first reaches the toy, but the mirror is still completely dark (E); eventually, light leaving the toy reaches the mirror, and the reflection is dynamically formed (F). Subsurface scattering is clearly present in the toy (G), while multiple direct and indirect interactions between the wall and the mirror can also be seen (H).

Crystal. A group of sugar crystals is directly illuminated by the laser from the left, acting as multiple lenses and creating caustics on the table (I). Part of the light refracted on the table is reflected back to the candy, creating secondary caustics on the table (J). Additionally, scattering events are visible within the crystals (K).

Tank. A reflective grating is placed at the right side of a tank filled with milk diluted in water. The grating is taken from a commercial spectrometer, and consists of an array of small, equally spaced rectangular mirrors. The grating is blazed: the mirrors are tilted to concentrate maximum optical power in the first-order diffraction for one wavelength. The pulse enters the scene from the left, travels through the tank (L), and strikes the grating. The grating reflects and diffracts the beam pulse (M). The different orders of the diffraction are visible traveling back through the tank (N). As the figure (and the supplementary movie) shows, most of the light reflected from the grating propagates at the blaze angle.

Chapter 8 Conclusions and Future Work

Our research fosters new computational imaging and image-processing opportunities by providing incoherent time-resolved information at ultrafast temporal resolutions. We hope our work will inspire new research in computer graphics and computational photography, by enabling forward and inverse analysis of light transport, allowing for full scene capture of hidden geometry and materials, or for relighting photographs. To this end, captured movies and data of the scenes shown in this paper are available at femtocamera.info. This exploitation, in turn, may influence the rapidly emerging field of ultrafast imaging hardware.

The system could be extended to image in color by adding additional pulsed laser sources at different colors, or by using one continuously tunable optical parametric oscillator (OPO). A second color of about 400 nm could easily be added to the existing system by doubling the laser frequency with a nonlinear crystal (about $1000). The streak tube is sensitive across the entire visible spectrum, with a peak sensitivity at about 450 nm (about five times the sensitivity at 800 nm). Scaling to bigger scenes would require less time resolution, and could therefore simplify the imaging setup. Scaling should be possible without signal degradation, as long as the camera aperture and lens are scaled with the rest of the setup; if the aperture stays the same, the light intensity needs to be increased quadratically to obtain similar results.

Beyond the capabilities of the commercially available streak sensor, advances in optics, material science, and compressive sensing may bring further optimization of the system, which could yield increased resolution of the captured x-t streak images. Nonlinear shutters may provide an alternate path to femto-photography capture systems. However, nonlinear optical methods require exotic materials and strong light intensities that can damage the objects of interest (and must be provided by laser light); further, they often suffer from physical instabilities.

We believe that mass production of streak sensors can lead to affordable systems. Also, future designs may overcome the current limitations of our prototype regarding optical efficiency. Future research can investigate other ultrafast phenomena, such as the propagation of light in anisotropic media and photonic crystals, or may be used in applications such as scientific visualization (to understand ultrafast processes), medicine (to reconstruct subsurface elements), material engineering (to analyze material properties), or quality control (to detect faults in structures). This could provide radically new challenges in the realm of computer graphics. Graphics research can enable new insights via comprehensible simulations and new data structures to render light in motion; for instance, relativistic rendering techniques have been developed using our data, where the common assumption of constant irradiance over surfaces no longer holds [Jarabo et al. 2013]. It may also allow a better understanding of scattering, and may lead to new physically valid models, as well as spawn new art forms.

References

ABRAMSON, N. 1978. Light-in-flight recording by holography. Optics Letters 3, 4, 121-123.
BUSCK, J., AND HEISELBERG, H. 2004. Gated viewing and high-accuracy three-dimensional laser radar. Applied Optics 43, 24, 4705-4710.
CAMPILLO, A., AND SHAPIRO, S. 1987. Picosecond streak camera fluorometry: a review. IEEE Journal of Quantum Electronics 19, 4, 585-603.
CHARBON, E. 2007. Will avalanche photodiode arrays ever reach 1 megapixel? In International Image Sensor Workshop, 246-249.
COLAÇO, A., KIRMANI, A., HOWLAND, G. A., HOWELL, J. C., AND GOYAL, V. K. 2012. Compressive depth map acquisition using a single photon-counting detector: Parametric signal processing meets sparsity. In IEEE Computer Vision and Pattern Recognition, CVPR 2012, 96-102.
DUGUAY, M. A., AND MATTICK, A. T. 1971. Pulsed-image generation and detection. Applied Optics 10, 2162-2170.
FARO, 2012. Faro Technologies Inc.: Measuring Arms. http://www.faro.com.
GBUR, G., 2012. A camera fast enough to watch light move? http://skullsinthestars.com/2012/01/04/a-camera-fast-enough-to-watch-light-move/.
GELBART, A., REDMAN, B. C., LIGHT, R. S., SCHWARTZLOW, C. A., AND GRIFFIS, A. J. 2002. Flash lidar based on multiple-slit streak tube imaging lidar. SPIE, vol. 4723, 9-18.
GODA, K., TSIA, K. K., AND JALALI, B. 2009. Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena. Nature 458, 1145-1149.
GUPTA, O., WILLWACHER, T., VELTEN, A., VEERARAGHAVAN, A., AND RASKAR, R. 2012. Reconstruction of hidden 3D shapes using diffuse reflections. Optics Express 20, 19096-19108.
HAMAMATSU, 2012. Guide to Streak Cameras. http://sales.hamamatsu.com/assets/pdf/catsandguides/e_streakh.pdf.
HEBDEN, J. C. 1993. Line scan acquisition for time-resolved imaging through scattering media. Opt. Eng. 32, 3, 626-633.
HEIDE, F., HULLIN, M., GREGSON, J., AND HEIDRICH, W. 2013. Low-budget transient imaging using photonic mixer devices. ACM Trans. Graph. 32, 4.
HUANG, D., SWANSON, E., LIN, C., SCHUMAN, J., STINSON, W., CHANG, W., HEE, M., FLOTTE, T., GREGORY, K., AND PULIAFITO, C. 1991. Optical coherence tomography. Science 254, 5035, 1178-1181.
ITATANI, J., QUÉRÉ, F., YUDIN, G. L., IVANOV, M. Y., KRAUSZ, F., AND CORKUM, P. B. 2002. Attosecond streak camera. Phys. Rev. Lett. 88, 173903.
JARABO, A., MASIA, B., AND GUTIERREZ, D. 2013. Transient rendering and relativistic visualization. Tech. Rep. TR-01-2013, Universidad de Zaragoza, April.

Page 5: Gaurav.report on femto photography

Chapter 4

Capturing Space-Time Volumes

Chapter 5

Depicting Ultrafast Videos in 2D

51IPeak Time Images

52 Integral Photo Fusion

Chapter 6

Time Unwarping

Chapter 7

Captured Scenes

71Bottle

72Crystal

Chapter 8

CONCLUSION

Chapter 9

REFERENCES

ABSTRACT

We present femto-photography a novel imaging technique to captureand visualize the propagation of light With an effective exposuretime of 185 picoseconds (ps) per frame we reconstruct moviesof ultrafast events at an equivalent resolution of about one half trillionframes per second Because cameras with this shutter speeddo not exist we re-purpose modern imaging hardware to record anensemble average of repeatable events that are synchronized to astreak sensor in which the time of arrival of light from the scene iscoded in one of the sensorrsquos spatial dimensions We introduce reconstructionmethods that allow us to visualize the propagation offemtosecond light pulses through macroscopic scenes at such fastresolution we must consider the notion of time-unwarping betweenthe camerarsquos and the worldrsquos space-time coordinate systems to take

into account effects associated with the finite speed of light Weapply our femto-photography technique to visualizations of verydifferent scenes which allow us to observe the rich dynamics oftime-resolved light transport effects including scattering specularreflections diffuse interreflections diffraction caustics and subsurfacescattering Our work has potential applications in artisticeducational and scientific visualizations industrial imaging to analyzematerial properties and medical imaging to reconstruct subsurfaceelements In addition our time-resolved technique may motivatenew forms of computational photography

Chapter 1 Introduction

Forward and inverse analysis of light transport plays an important

role in diverse fields such as computer graphics computer visionand scientific imaging Because conventional imaging hardware isslow compared to the speed of light traditional computer graphicsand computer vision algorithms typically analyze transport usinglow time-resolution photos Consequently any information that isencoded in the time delays of light propagation is lost Whereas thejoint design of novel optical hardware and smart computation iecomputational photography has expanded the way we capture analyze and understand visual information speed-of-light propagationhas been largely unexplored In this paper we present a novel ultrafastimaging technique which we term femto-photography consistingof femtosecond laser illumination picosecond-accurate detectorsand mathematical reconstruction techniques to allow us to visualizemovies of light in motion as it travels through a scene withan effective framerate of about one half trillion frames per secondThis allows us to see for instance a light pulse scattering inside aplastic bottle or image formation in a mirror as a function of time

Challenges Developing such time-resolved system is a challengingproblem for several reasons that are under-appreciated in conventionalmethods (a) brute-force time exposures under 2 ps yieldan impractical signal-to-noise (SNR) ratio (b) suitable cameras torecord 2D image sequences at this time resolution do not exist dueto sensor bandwidth limitations (c) comprehensible visualizationof the captured time-resolved data is non-trivial and (d) direct measurementsof events appear warped in space-time because the finitespeed of light implies that the recorded light propagation delay dependson camera position relative to the sceneContributions Our main contribution is in addressing these challengesand creating a first prototype as follows_ We exploit the statistical similarity of periodic light transportevents to record multiple ultrashort exposure times of onedimensionalviews (Section 3)_ We introduce a novel hardware implementation to sweep theexposures across a vertical field of view to build 3D spacetimedata volumes (Section 4)_ We create techniques for comprehensible visualization includingmovies showing the dynamics of real-world lighttransport phenomena (including reflections scattering diffuseinter-reflections or beam diffraction) and the notion ofpeak-time which partially overcomes the low-frequency appearanceof integrated global light transport (Section 5)_ We introduce a time-unwarping technique to correct the distortionsin captured time-resolved information due to the finitespeed of light (Section 6)

Limitations Although not conceptual our setup has several practicallimitations primarily due to the limited SNR of scattered light Since the hardware elements in our system were originally designedfor different purposes it is not optimized for efficiency and suffersfrom low optical throughput (eg the detector is optimized for 500nm visible light while the infrared laser wavelength we use is 795nm) and from dynamic range limitations This lengthens the totalrecording time to approximately one hour Furthermore the scanningmirror rotating continuously introduces some blurring in thedata along the scanned (vertical) dimension Future optimized systemscan overcome these limitations

Chapter 2 Related WorkUltrafast Devices The fastest 2D continuous real-timemonochromatic camera operates at hundreds of nanoseconds perframe [Goda et al 2009] (about 6_ 106 frames per second) with aspatial resolution of 200_200 pixels less than one third of what weachieve Avalanche photodetector (APD) arrays can reach temporalresolutions of several tens of picoseconds if they are used in aphoton starved regime where only a single photon hits a detectorwithin a time window of tens of nanoseconds [Charbon 2007]Repetitive illumination techniques used in incoherent LiDAR [Tou1995 Gelbart et al 2002] use cameras with typical exposure timeson the order of hundreds of picoseconds [Busck and Heiselberg2004 Colaccedilo et al 2012] two orders of magnitude slower thanour system Liquid nonlinear shutters actuated with powerful laserpulses have been used to capture single analog frames imaginglight pulses at picosecond time resolution [Duguay and Mattick1971] Other sensors that use a coherent phase relation betweenthe illumination and the detected light such as optical coherencetomography (OCT) [Huang et al 1991] coherent LiDAR [Xia

and Zhang 2009] light-in-flight holography [Abramson 1978]or white light interferometry [Wyant 2002] achieve femtosecondresolutions however they require light to maintain coherence(ie wave interference effects) during light transport and aretherefore unsuitable for indirect illumination in which diffusereflections remove coherence from the light Simple streak sensorscapture incoherent light at picosecond to nanosecond speeds butare limited to a line or low resolution (20 _ 20) square field ofview [Campillo and Shapiro 1987 Itatani et al 2002 Shiragaet al 1995 Gelbart et al 2002 Kodama et al 1999 Qu et al2006] They have also been used as line scanning devices forimage transmission through highly scattering turbid media byrecording the ballistic photons which travel a straight path throughthe scatterer and thus arrive first on the sensor [Hebden 1993]

Figure 3 Left Photograph of our ultrafast imaging system setup The DSLR camera takes a conventional photo for comparison RightTime sequence illustrating the arrival of the pulse striking a diffuser its transformation into a spherical energy front and its propagationthrough the scene The corresponding captured scene is shown in Figure 10 (top row)

principles that we develop in this paper for the purpose of transientimaging were first demonstrated by Velten et al [2012c] Recentlyphotonic mixer devices along with nonlinear optimization havealso been used in this context [Heide et al 2013]Our system can record and reconstruct space-time world informationof incoherent light propagation in free-space table-top scenesat a resolution of up to 672 _ 1000 pixels and under 2 picosecondsper frame The varied range and complexity of the scenes

we capture allow us to visualize the dynamics of global illuminationeffects such as scattering specular reflections interreflectionssubsurface scattering caustics and diffractionTime-Resolved Imaging Recent advances in time-resolvedimaging have been exploited to recover geometry and motionaround corners [Raskar and Davis 2008 Kirmani et al 2011 Veltenet al 2012b Velten et al 2012a Gupta et al 2012 Pandharkaret al 2011] and albedo of from single view point [Naik et al 2011]But none of them explored the idea of capturing videos of light inmotion in direct view and have some fundamental limitations (suchas capturing only third-bounce light) that make them unsuitable forthe present purpose Wu et al [2012a] separate direct and global illuminationcomponents from time-resolved data captured with thesystem we describe in this paper by analyzing the time profile ofeach pixel In a recent publication [Wu et al 2012b] the authorspresent an analysis on transient light transport in frequency spaceand show how it can be applied to bare-sensor imaging Chapter 3 Capturing Space-Time PlanesWe capture time scales orders of magnitude faster than the exposuretimes of conventional cameras in which photons reaching thesensor at different times are integrated into a single value makingit impossible to observe ultrafast optical phenomena The systemdescribed in this paper has an effective exposure time down to 185ps since light travels at 03 mmps light travels approximately 05mm between frames in our reconstructed moviesSystem An ultrafast setup must overcome several difficulties inorder to accurately measure a high-resolution (both in space and

time) image First for an unamplified laser pulse a single exposuretime of less than 2 ps would not collect enough light so the SNRwould be unworkably low As an example for a table-top sceneilluminated by a 100Wbulb only about 1 photon on average wouldreach the sensor during a 2 ps open-shutter period Second becauseof the time scales involved synchronization of the sensor and theillumination must be executed within picosecond precision Thirdstandalone streak sensors sacrifice the vertical spatial dimension inorder to code the time dimension thus producing x-t images Asa consequence their field of view is reduced to a single horizontalline of view of the scene We solve these problems with our ultrafast imaging system outlinedin Figure 2 (A photograph of the actual setup is shown inFigure 3 (left)) The light source is a femtosecond (fs) Kerr lensmode-locked TiSapphire laser which emits 50-fs with a centerwavelength of 795 nm at a repetition rate of 75 MHz and averagepower of 500 mW In order to see ultrafast events in a scene withmacro-scaled objects we focus the light with a lens onto a Lambertiandiffuser which then acts as a point light source and illuminatesthe entire scene with a spherically-shaped pulse (see Figure3 (right)) Alternatively if we want to observe pulse propagationitself rather than the interactions with large objects we direct thelaser beam across the field of view of the camera through a scatteringmedium (see the bottle scene in Figure 1)Because all the pulses are statistically identical we can record thescattered light from many of them and integrate the measurementsto average out any noise The result is a signal with a high SNR Tosynchronize this illumination with the streak sensor (HamamatsuC5680 [Hamamatsu 2012]) we split off a portion of the beam with

a glass slide and direct it onto a fast photodetector connected to thesensor so that now both detector and illumination operate synchronously(see Figure 2 (a))Capturing space-time planes The streak sensor then capturesan x-t image of a certain scanline (ie a line of pixels in the horizontaldimension) of the scene with a space-time resolution of672 _ 512 The exact time resolution depends on the amplificaamplificationof an internal sweep voltage signal applied to the streak sensorWith our hardware it can be adjusted from 030 ps to 507 ps Practicallywe choose the fastest resolution that still allows for captureof the entire duration of the event In the streak sensor a photocathodeconverts incoming photons arriving from each spatial locationin the scanline into electrons The streak sensor generates the x-timage by deflecting these electrons according to the time of theirarrival to different positions along the t-dimension of the sensor(see Figure 2(b) and 2(c)) This is achieved by means of rapidlychanging the sweep voltage between the electrodes in the sensorFor each horizontal scanline the camera records a scene illuminatedby the pulse and averages the light scattered by 45 _ 108

pulses (see Figure 2(d) and 2(e))Performance Validation To characterize the streak sensor wecompare sensor measurements with known geometry and verify thelinearity reproducibility and calibration of the time measurementsTo do this we first capture a streak image of a scanline of a simplescene a plane being illuminated by the laser after hitting the diffuser

(see Figure 4 (left)) Then by using a Faro digitizer arm [Faro2012] we obtain the ground truth geometry of the points along that

plane and of the point of the diffuser hit by the laser this allows usto compute the total travel time per path (diffuser-plane-streak sensor)for each pixel in the scanline We then compare the travel timecaptured by our streak sensor with the real travel time computed from the known geometry

Chapter 4 Capturing Space-Time VolumesAlthough the synchronized pulsed measurements overcome SNRissues the streak sensor still provides only a one-dimensionalmovie Extension to two dimensions requires unfeasible bandwidthsa typical dimension is roughly 103 pixels so a threedimensionaldata cube has 109 elements Recording such a largequantity in a 1010485769 second (1 ns) time widow requires a bandwidthof 1018 bytes far beyond typical available bandwidthsWe solve this acquisition problem by again utilizing the synchronizedrepeatability of the hardware A mirror-scanning system (two9 cm _ 13 cm mirrors see Figure 3 (left)) rotates the camerarsquos centerof projection so that it records horizontal slices of a scene sequentiallyWe use a computer-controlled one-rpm servo motor torotate one of the mirrors and consequently scan the field of viewvertically The scenes are about 25 cm wide and placed about 1

meter from the camera With high gear ratios (up to 11000) thecontinuous rotation of the mirror is slow enough to allow the camerato record each line for about six seconds requiring about onehour for 600 lines (our video resolution) We generally capture extralines above and below the scene (up to 1000 lines) and thencrop them to match the aspect ratio of the physical scenes beforethe movie was reconstructedThese resulting images are combined into one matrixMijk wherei = 1672 and k = 1512 are the dimensions of the individualx-t streak images and j = 11000 addresses the second spatialdimension y For a given time instant k the submatrix Nij containsa two-dimensional image of the scene with a resolution of 672 _1000 pixels exposed for as short to 185 ps Combining the x-tslices of the scene for each scanline yields a 3D x-y-t data volumeas shown in Figure 5 (left) An x-y slice represents one frame of the

final movie as shown in Figure 5 (right)

Figure 5 Left Reconstructed x-y-t data volume by stacking individualx-t images (captured with the scanning mirrors) Right Anx-y slice of the data cube represents one frame of the final movie

Chapter 5 Depicting Ultrafast Videos in 2DWe have explored several ways to visualize the information containedin the captured x-y-t data cube in an intuitive way Firstcontiguous Nij slices can be played as the frames of a movie Figure1 (bottom row) shows a captured scene (bottle) along with severalrepresentative Nij frames (Effects are described for variousscenes in Section 7) However understanding all the phenomenashown in a video is not a trivial task and movies composed of x-yframes such as the ones shown in Figure 10 may be hard to interpretMerging a static photograph of the scene from approximately thesame point of view with the Nij slices aids in the understanding oflight transport in the scenes (see movies within the supplementaryvideo) Although straightforward to implement the high dynamicrange of the streak data requires a nonlinear intensity transformationto extract subtle optical effects in the presence of high intensityreflections We employ a logarithmic transformation to this endWe have also explored single-image methods for intuitive visualization

of full space-time propagation such as the color-coding inFigure 1 (right) which we describe in the following paragraphsIntegral Photo Fusion By integrating all the frames in novelways we can visualize and highlight different aspects of the lightflow in one photo Our photo fusion results are calculated asNij =PwkMijk fk = 1512g where wk is a weighting factordetermined by the particular fusion method We have tested severaldifferent methods of which two were found to yield the most intuitiveresults the first one is full fusion where wk = 1 for all kSumming all frames of the movie provides something resemblinga black and white photograph of the scene illuminated by the laserwhile showing time-resolved light transport effects An exampleis shown in Figure 6 (left) for the alien scene (More informationabout the scene is given in Section 7) A second technique rainbowfusion takes the fusion result and assigns a different RGB color toeach frame effectively color-coding the temporal dimension Anexample is shown in Figure 6 (middle)Peak Time Images The inherent integration in fusion methodsthough often useful can fail to reveal the most complex or subtlebehavior of light As an alternative we propose peak time imageswhich illustrate the time evolution of the maximum intensity in eachframe For each spatial position (i j) in the x-y-t volume we findthe peak intensity along the time dimension and keep informationwithin two time units to each side of the peak All other values inthe streak image are set to zero yielding a more sparse space-timevolume We then color-code time and sum up the x-y frames inthis new sparse volume in the same manner as in the rainbow fusioncase but use only every 20th frame in the sum to create blacklines between the equi-time paths or isochrones This results in amap of the propagation of maximum intensity contours which we

term peak time image These color-coded isochronous lines can bethought of intuitively as propagating energy fronts Figure 6 (right)shows the peak time image for the alien scene and Figure 1 (topmiddle) shows the captured data for the bottle scene depicted usingthis visualization method As explained in the next section thisvisualization of the bottle scene reveals significant light transportphenomena that could not be seen with the rainbow fusion visualization

Figure 7 Understanding reversal of events in captured videosLeft Pulsed light scatters from a source strikes a surface (egat P1 and P2) and is then recorded by a sensor Time taken bylight to travel distances z1 + d1 and z2 + d2 is responsible for theexistence of two different time frames and the need of computationalcorrection to visualize the captured data in the world time frameRight Light appears to be propagating from P2 to P1 in cameratime (before unwarping) and from P1 to P2 in world time oncetime-unwarped Extended planar surfaces will intersect constanttimepaths to produce either elliptical or circular fronts

Chapter 6 Time UnwarpingVisualization of the captured movies (Sections 5 and 7) reveals resultsthat are counter-intuitive to theoretical and established knowledgeof light transport Figure 1 (top middle) shows a peak timevisualization of the bottle scene where several abnormal light transporteffects can be observed (1) the caustics on the floor whichpropagate towards the bottle instead of away from it (2) the curvedspherical energy fronts in the label area which should be rectilinearas seen from the camera and (3) the pulse itself being locatedbehind these energy fronts when it would need to precede themThese are due to the fact that usually light propagation is assumed

to be infinitely fast so that events in world space are assumed to bedetected simultaneously in camera space In our ultrafast photographysetup however this assumption no longer holds and the finitespeed of light becomes a factor we must now take into account thetime delay between the occurrence of an event and its detection bythe camera sensorWe therefore need to consider two different time frames namelyworld time (when events happen) and camera time (when events aredetected) This duality of time frames is explained in Figure 7 lightfrom a source hits a surface first at point P1 = (i1 j1) (with (i j)being the x-y pixel coordinates of a scene point in the x-y-t datacube) then at the farther point P2 = (i2 j2) but the reflected lightis captured in the reverse order by the sensor due to different totalpath lengths (z1 + d1 gt z2 + d2) Generally this is due to the factthat for light to arrive at a given time instant t0 all the ray(later-time) isochrone than farther oness fromthe source to the wall to the camera must satisfy zi+di = ct0 sothat isochrones are elliptical Therefore although objects closer tothe source receive light earlier they can still lie on a higher In order to visualize all light transport events as they have occurred(not as the camera captured them) we transform the captured datafrom camera time to world time a transformation which we termtime unwarping Mathematically for a scene point P = (i j) weapply the following transformationt0

ij = tij +zij

c=_(1)where t0

ij and tij represent camera and world times respectively

c is the speed of light in vacuum _ the index of refraction of themedium and zij is the distance from point P to the camera Forour table-top scenes we measure this distance with a Faro digitizerarm although it could be obtained from the data and the knownposition of the diffuser as the problem is analogous to that of bistaticLiDAR We can thus define light travel time from each point(i j) in the scene to the camera as _tij = t0

ij 1048576 tij = zij=(c=_)Then time unwarping effectively corresponds to offsetting data inthe x-y-t volume along the time dimension according to the valueof _tij for each of the (i j) points as shown in Figure 8In most of the scenes we only have propagation of light through airfor which we take _ _ 1 For the bottle scene we assume that thelaser pulse travels along its longitudinal axis at the speed of lightand that only a single scattering event occurs in the liquid insideWe take _ = 133 as the index of refraction of the liquid and ignorerefraction at the bottlersquos surface A step-by-step unwarping processis shown in Figure 9 for a frame (ie x-y image) of the bottle sceneOur unoptimized Matlab code runs at about 01 seconds per frameA time-unwarped peak-time visualization of the whole of this sceneis shown in Figure 1 (right) Notice how now the caustics originatefrom the bottle and propagate outward energy fronts along the labelare correctly depicted as straight lines and the pulse precedesrelated phenomena as expected

Chapter 7 Captured ScenesWe have used our ultrafast photography setup to capture interestinglight transport effects in different scenes Figure 10 summarizes

them showing representative frames and peak time visualizationsThe exposure time for our scenes is between 185 ps for the crystalscene and 507 ps for the bottle and tank scenes which requiredimaging a longer time span for better visualization Please refer tothe video in the supplementary material to watch the reconstructedmovies Overall observing light in such slow motion reveals bothsubtle and key aspects of light transport We provide here briefdescriptions of the light transport effects captured in the differentscenesBottle This scene is shown in Figure 1 (bottom row) and hasbeen used to introduce time-unwarping A plastic bottle filled withwater diluted with milk is directly illuminated by the laser pulseentering through the bottom of the bottle along its longitudinal axisThe pulse scatters inside the liquid we can see the propagation ofthe wavefronts The geometry of the bottle neck creates some interestinglens effects making light look almost like a fluid Most ofthe light is reflected back from the cap while some is transmitted ortrapped in subsurface scattering phenomena Caustics are generatedon the tableTomato-tape This scene shows a tomato and a tape roll with awall behind them The propagation of the spherical wavefront afterthe laser pulse hits the diffuser can be seen clearly as it intersectsthe floor and the back wall (A B) The inside of the tape roll is outof the line of sight of the light source and is not directly illuminated

It is illuminated later as indirect light scattered from the first wavereaches it (C) Shadows become visible only after the object hasbeen illuminated The more opaque tape darkens quickly after thelight front has passed while the tomato continues glowing for alonger time indicative of stronger subsurface scattering (D)Alien A toy alien is positioned in front of a mirror and wall Lightinteractions in this scene are extremely rich due to the mirror themultiple interreflections and the subsurface scattering in the toyThe video shows how the reflection in the mirror is actually formeddirect light first reaches the toy but the mirror is still completelydark (E) eventually light leaving the toy reaches the mirror andthe reflection is dynamically formed (F) Subsurface scattering isclearly present in the toy (G) while multiple direct and indirectinteractions between the wall and the mirror can also be seen (H)Crystal A group of sugar crystals is directly illuminated by thelaser from the left acting as multiple lenses and creating causticson the table (I) Part of the light refracted on the table is reflectedback to the candy creating secondary caustics on the table (J) Additionallyscattering events are visible within the crystals (K)

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8 Conclusions and Future WorkOur research fosters new computational imaging and image processingopportunities by providing incoherent time-resolved informationat ultrafast temporal resolutions We hope our workwill inspire new research in computer graphics and computationalphotography by enabling forward and inverse analysis of lighttransport allowing for full scene capture of hidden geometry andmaterials or for relighting photographs To this end capturedmovies and data of the scenes shown in this paper are available atfemtocamerainfo This exploitation in turn may influencethe rapidly emerging field of ultrafast imaging hardware

The system could be extended to image in color by adding additionalpulsed laser sources at different colors or by using one continuouslytunable optical parametric oscillator (OPO) A second colorof about 400 nm could easily be added to the existing system bydoubling the laser frequency with a nonlinear crystal (about $1000)The streak tube is sensitive across the entire visible spectrum witha peak sensitivity at about 450 nm (about five times the sensitivityat 800 nm) Scaling to bigger scenes would require less timeresolution and could therefore simplify the imaging setup Scalingshould be possible without signal degradation as long as the cameraaperture and lens are scaled with the rest of the setup If theaperture stays the same the light intensity needs to be increasedquadratically to obtain similar resultsBeyond the ability of the commercially available streak sensor advancesin optics material science and compressive sensing maybring further optimization of the system which could yield increasedresolution of the captured x-t streak images Nonlinearshutters may provide an alternate path to femto-photography capturesystems However nonlinear optical methods require exoticmaterials and strong light intensities that can damage the objects ofinterest (and must be provided by laser light) Further they oftensuffer from physical instabilities

We believe that mass production of streak sensors can lead to affordablesystems Also future designs may overcome the currentlimitations of our prototype regarding optical efficiency Futureresearch can investigate other ultrafast phenomena such as propagationof light in anisotropic media and photonic crystals or maybe used in applications such as scientific visualization (to understandultra-fast processes) medicine (to reconstruct subsurface elements)material engineering (to analyze material properties) orquality control (to detect faults in structures) This could provideradically new challenges in the realm of computer graphics Graphicsresearch can enable new insights via comprehensible simulationsand new data structures to render light in motion For instancerelativistic rendering techniques have been developed usingour data where the common assumption of constant irradiance overthe surfaces does no longer hold [Jarabo et al 2013] It may alsoallow a better understanding of scattering and may lead to new physically valid models as well as spawn new art forms

ReferencesABRAMSON N 1978 Light-in-flight recording by holographyOptics Letters 3 4 121ndash123BUSCK J AND HEISELBERG H 2004 Gated viewing and highaccuracythree-dimensional laser radar Applied optics 43 244705ndash4710CAMPILLO A AND SHAPIRO S 1987 Picosecond streak camerafluorometry a review IEEE Journal of Quantum Electronics19 4 585ndash603CHARBON E 2007 Will avalanche photodiode arrays ever reach 1megapixel In International Image Sensor Workshop 246ndash249COLACcedil O A KIRMANI A HOWLAND G A HOWELL J CAND GOYAL V K 2012 Compressive depth map acquisitionusing a single photon-counting detector Parametric signal processingmeets sparsity In IEEE Computer Vision and PatternRecognition CVPR 2012 96ndash102DUGUAY M A AND MATTICK A T 1971 Pulsed-image generationand detection Applied Optics 10 2162ndash2170FARO 2012 Faro Technologies Inc Measuring Arms httpwwwfarocomGBUR G 2012 A camera fast enough to watch lightmove httpskullsinthestarscom20120104a-camera-fast-enough-to-watch-light-moveGELBART A REDMAN B C LIGHT R S SCHWARTZLOWC A AND GRIFFIS A J 2002 Flash lidar based on multipleslitstreak tube imaging lidar SPIE vol 4723 9ndash18GODA K TSIA K K AND JALALI B 2009 Serial timeencodedamplified imaging for real-time observation of fast dynamicphenomena Nature 458 1145ndash1149GUPTA O WILLWACHER T VELTEN A VEERARAGHAVANA AND RASKAR R 2012 Reconstruction of hidden3D shapes using diffuse reflections Optics Express 20 19096ndash19108HAMAMATSU 2012 Guide to Streak Cameras httpsaleshamamatsucomassetspdfcatsandguidese_streakhpdfHEBDEN J C 1993 Line scan acquisition for time-resolved

imaging through scattering media Opt Eng 32 3 626ndash633HEIDE F HULLIN M GREGSON J AND HEIDRICH W2013 Low-budget transient imaging using photonic mixer devicesACM Trans Graph 32 4HUANG D SWANSON E LIN C SCHUMAN J STINSONW CHANG W HEE M FLOTTE T GREGORY K ANDPULIAFITO C 1991 Optical coherence tomography Science254 5035 1178ndash1181ITATANI J QUacuteE RacuteE F YUDIN G L IVANOV M Y KRAUSZF AND CORKUM P B 2002 Attosecond streak camera PhysRev Lett 88 173903JARABO A MASIA B AND GUTIERREZ D 2013 Transientrendering and relativistic visualization Tech Rep TR-01-2013Universidad de Zaragoza April

Page 6: Gaurav.report on femto photography

ABSTRACT

We present femto-photography a novel imaging technique to captureand visualize the propagation of light With an effective exposuretime of 185 picoseconds (ps) per frame we reconstruct moviesof ultrafast events at an equivalent resolution of about one half trillionframes per second Because cameras with this shutter speeddo not exist we re-purpose modern imaging hardware to record anensemble average of repeatable events that are synchronized to astreak sensor in which the time of arrival of light from the scene iscoded in one of the sensorrsquos spatial dimensions We introduce reconstructionmethods that allow us to visualize the propagation offemtosecond light pulses through macroscopic scenes at such fastresolution we must consider the notion of time-unwarping betweenthe camerarsquos and the worldrsquos space-time coordinate systems to take

into account effects associated with the finite speed of light Weapply our femto-photography technique to visualizations of verydifferent scenes which allow us to observe the rich dynamics oftime-resolved light transport effects including scattering specularreflections diffuse interreflections diffraction caustics and subsurfacescattering Our work has potential applications in artisticeducational and scientific visualizations industrial imaging to analyzematerial properties and medical imaging to reconstruct subsurfaceelements In addition our time-resolved technique may motivatenew forms of computational photography

Chapter 1 Introduction

Forward and inverse analysis of light transport plays an important

role in diverse fields such as computer graphics computer visionand scientific imaging Because conventional imaging hardware isslow compared to the speed of light traditional computer graphicsand computer vision algorithms typically analyze transport usinglow time-resolution photos Consequently any information that isencoded in the time delays of light propagation is lost Whereas thejoint design of novel optical hardware and smart computation iecomputational photography has expanded the way we capture analyze and understand visual information speed-of-light propagationhas been largely unexplored In this paper we present a novel ultrafastimaging technique which we term femto-photography consistingof femtosecond laser illumination picosecond-accurate detectorsand mathematical reconstruction techniques to allow us to visualizemovies of light in motion as it travels through a scene withan effective framerate of about one half trillion frames per secondThis allows us to see for instance a light pulse scattering inside aplastic bottle or image formation in a mirror as a function of time

Challenges Developing such time-resolved system is a challengingproblem for several reasons that are under-appreciated in conventionalmethods (a) brute-force time exposures under 2 ps yieldan impractical signal-to-noise (SNR) ratio (b) suitable cameras torecord 2D image sequences at this time resolution do not exist dueto sensor bandwidth limitations (c) comprehensible visualizationof the captured time-resolved data is non-trivial and (d) direct measurementsof events appear warped in space-time because the finitespeed of light implies that the recorded light propagation delay dependson camera position relative to the sceneContributions Our main contribution is in addressing these challengesand creating a first prototype as follows_ We exploit the statistical similarity of periodic light transportevents to record multiple ultrashort exposure times of onedimensionalviews (Section 3)_ We introduce a novel hardware implementation to sweep theexposures across a vertical field of view to build 3D spacetimedata volumes (Section 4)_ We create techniques for comprehensible visualization includingmovies showing the dynamics of real-world lighttransport phenomena (including reflections scattering diffuseinter-reflections or beam diffraction) and the notion ofpeak-time which partially overcomes the low-frequency appearanceof integrated global light transport (Section 5)_ We introduce a time-unwarping technique to correct the distortionsin captured time-resolved information due to the finitespeed of light (Section 6)

Limitations Although not conceptual our setup has several practicallimitations primarily due to the limited SNR of scattered light Since the hardware elements in our system were originally designedfor different purposes it is not optimized for efficiency and suffersfrom low optical throughput (eg the detector is optimized for 500nm visible light while the infrared laser wavelength we use is 795nm) and from dynamic range limitations This lengthens the totalrecording time to approximately one hour Furthermore the scanningmirror rotating continuously introduces some blurring in thedata along the scanned (vertical) dimension Future optimized systemscan overcome these limitations

Chapter 2 Related Work

Ultrafast Devices. The fastest 2D continuous, real-time, monochromatic camera operates at hundreds of nanoseconds per frame [Goda et al. 2009] (about 6 × 10^6 frames per second), with a spatial resolution of 200 × 200 pixels, less than one third of what we achieve. Avalanche photodetector (APD) arrays can reach temporal resolutions of several tens of picoseconds if they are used in a photon-starved regime, where only a single photon hits a detector within a time window of tens of nanoseconds [Charbon 2007]. Repetitive illumination techniques used in incoherent LiDAR [Tou 1995; Gelbart et al. 2002] use cameras with typical exposure times on the order of hundreds of picoseconds [Busck and Heiselberg 2004; Colaço et al. 2012], two orders of magnitude slower than our system. Liquid nonlinear shutters, actuated with powerful laser pulses, have been used to capture single analog frames imaging light pulses at picosecond time resolution [Duguay and Mattick 1971]. Other sensors that use a coherent phase relation between the illumination and the detected light, such as optical coherence tomography (OCT) [Huang et al. 1991], coherent LiDAR [Xia and Zhang 2009], light-in-flight holography [Abramson 1978], or white light interferometry [Wyant 2002], achieve femtosecond resolutions; however, they require light to maintain coherence (i.e., wave interference effects) during light transport, and are therefore unsuitable for indirect illumination, in which diffuse reflections remove coherence from the light. Simple streak sensors capture incoherent light at picosecond to nanosecond speeds, but are limited to a line or a low-resolution (20 × 20) square field of view [Campillo and Shapiro 1987; Itatani et al. 2002; Shiraga et al. 1995; Gelbart et al. 2002; Kodama et al. 1999; Qu et al. 2006]. They have also been used as line scanning devices for image transmission through highly scattering turbid media, by recording the ballistic photons, which travel a straight path through the scatterer and thus arrive first on the sensor [Hebden 1993].

Figure 3. Left: Photograph of our ultrafast imaging system setup. The DSLR camera takes a conventional photo for comparison. Right: Time sequence illustrating the arrival of the pulse striking a diffuser, its transformation into a spherical energy front, and its propagation through the scene. The corresponding captured scene is shown in Figure 10 (top row).

The principles that we develop in this paper for the purpose of transient imaging were first demonstrated by Velten et al. [2012c]. Recently, photonic mixer devices, along with nonlinear optimization, have also been used in this context [Heide et al. 2013]. Our system can record and reconstruct space-time world information of incoherent light propagation in free-space, table-top scenes, at a resolution of up to 672 × 1000 pixels and under 2 picoseconds per frame. The varied range and complexity of the scenes we capture allow us to visualize the dynamics of global illumination effects, such as scattering, specular reflections, interreflections, subsurface scattering, caustics, and diffraction.

Time-Resolved Imaging. Recent advances in time-resolved imaging have been exploited to recover geometry and motion around corners [Raskar and Davis 2008; Kirmani et al. 2011; Velten et al. 2012b; Velten et al. 2012a; Gupta et al. 2012; Pandharkar et al. 2011] and albedo from a single viewpoint [Naik et al. 2011]. But none of them explored the idea of capturing videos of light in motion in direct view, and they have some fundamental limitations (such as capturing only third-bounce light) that make them unsuitable for the present purpose. Wu et al. [2012a] separate direct and global illumination components from time-resolved data, captured with the system we describe in this paper, by analyzing the time profile of each pixel. In a recent publication [Wu et al. 2012b], the authors present an analysis of transient light transport in frequency space, and show how it can be applied to bare-sensor imaging.

Chapter 3 Capturing Space-Time Planes

We capture time scales orders of magnitude faster than the exposure times of conventional cameras, in which photons reaching the sensor at different times are integrated into a single value, making it impossible to observe ultrafast optical phenomena. The system described in this paper has an effective exposure time down to 1.85 ps; since light travels at 0.3 mm/ps, light travels approximately 0.5 mm between frames in our reconstructed movies.

System. An ultrafast setup must overcome several difficulties in order to accurately measure a high-resolution (both in space and time) image. First, for an unamplified laser pulse, a single exposure time of less than 2 ps would not collect enough light, so the SNR would be unworkably low. As an example, for a table-top scene illuminated by a 100 W bulb, only about 1 photon on average would reach the sensor during a 2 ps open-shutter period. Second, because of the time scales involved, synchronization of the sensor and the illumination must be executed within picosecond precision. Third, standalone streak sensors sacrifice the vertical spatial dimension in order to code the time dimension, thus producing x-t images. As a consequence, their field of view is reduced to a single horizontal line of view of the scene.

We solve these problems with our ultrafast imaging system, outlined in Figure 2. (A photograph of the actual setup is shown in Figure 3 (left).) The light source is a femtosecond (fs) Kerr lens mode-locked Ti:Sapphire laser, which emits 50-fs pulses with a center wavelength of 795 nm, at a repetition rate of 75 MHz and an average power of 500 mW. In order to see ultrafast events in a scene with macro-scaled objects, we focus the light with a lens onto a Lambertian diffuser, which then acts as a point light source and illuminates the entire scene with a spherically-shaped pulse (see Figure 3 (right)). Alternatively, if we want to observe pulse propagation itself, rather than the interactions with large objects, we direct the laser beam across the field of view of the camera through a scattering medium (see the bottle scene in Figure 1).

Because all the pulses are statistically identical, we can record the scattered light from many of them and integrate the measurements to average out any noise. The result is a signal with a high SNR. To synchronize this illumination with the streak sensor (Hamamatsu C5680 [Hamamatsu 2012]), we split off a portion of the beam with a glass slide and direct it onto a fast photodetector connected to the sensor, so that now both detector and illumination operate synchronously (see Figure 2 (a)).

Capturing space-time planes. The streak sensor then captures an x-t image of a certain scanline (i.e., a line of pixels in the horizontal dimension) of the scene, with a space-time resolution of 672 × 512. The exact time resolution depends on the amplification of an internal sweep voltage signal applied to the streak sensor. With our hardware, it can be adjusted from 0.30 ps to 5.07 ps. Practically, we choose the fastest resolution that still allows for capture of the entire duration of the event. In the streak sensor, a photocathode converts incoming photons, arriving from each spatial location in the scanline, into electrons. The streak sensor generates the x-t image by deflecting these electrons, according to the time of their arrival, to different positions along the t-dimension of the sensor (see Figures 2(b) and 2(c)). This is achieved by means of rapidly changing the sweep voltage between the electrodes in the sensor. For each horizontal scanline, the camera records a scene illuminated by the pulse and averages the light scattered by 4.5 × 10^8 pulses (see Figures 2(d) and 2(e)).

Performance Validation. To characterize the streak sensor, we compare sensor measurements with known geometry, and verify the linearity, reproducibility, and calibration of the time measurements. To do this, we first capture a streak image of a scanline of a simple scene: a plane being illuminated by the laser after hitting the diffuser (see Figure 4 (left)). Then, by using a Faro digitizer arm [Faro 2012], we obtain the ground truth geometry of the points along that plane and of the point of the diffuser hit by the laser; this allows us to compute the total travel time per path (diffuser-plane-streak sensor) for each pixel in the scanline. We then compare the travel time captured by our streak sensor with the real travel time computed from the known geometry.
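
As an illustration of this validation step, the comparison reduces to a per-pixel path-length computation. The sketch below is a minimal numpy version, with made-up geometry standing in for the Faro-arm measurements; all names and values are hypothetical:

    import numpy as np

    C = 0.2998  # speed of light in mm/ps

    def predicted_travel_time(diffuser, plane_pts, camera):
        # Total path diffuser -> plane point -> streak camera, in ps.
        d1 = np.linalg.norm(plane_pts - diffuser, axis=1)
        d2 = np.linalg.norm(plane_pts - camera, axis=1)
        return (d1 + d2) / C

    # Hypothetical digitized geometry (mm), standing in for the arm data.
    diffuser = np.array([0.0, 100.0, 0.0])
    camera = np.array([0.0, 0.0, 1000.0])
    plane_pts = np.stack([np.linspace(-125.0, 125.0, 672),
                          np.zeros(672),
                          np.full(672, 500.0)], axis=1)

    t_pred = predicted_travel_time(diffuser, plane_pts, camera)
    # In practice t_meas comes from the captured x-t streak image, e.g. the
    # peak time bin of each pixel column times the ps-per-bin calibration;
    # here a constant trigger offset stands in for it.
    t_meas = t_pred + 5.0
    slope, offset = np.polyfit(t_pred, t_meas, 1)  # linearity: slope ~ 1

A slope near one confirms linearity of the sweep, while a stable offset corresponds to a fixed trigger delay.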

Chapter 4 Capturing Space-Time Volumes

Although the synchronized, pulsed measurements overcome SNR issues, the streak sensor still provides only a one-dimensional movie. Extension to two dimensions requires unfeasible bandwidths: a typical dimension is roughly 10^3 pixels, so a three-dimensional data cube has 10^9 elements. Recording such a large quantity in a 10^-9 second (1 ns) time window requires a bandwidth of 10^18 bytes per second, far beyond typical available bandwidths.

We solve this acquisition problem by again utilizing the synchronized repeatability of the hardware. A mirror-scanning system (two 9 cm × 13 cm mirrors, see Figure 3 (left)) rotates the camera's center of projection, so that it records horizontal slices of a scene sequentially. We use a computer-controlled, one-rpm servo motor to rotate one of the mirrors and consequently scan the field of view vertically. The scenes are about 25 cm wide and placed about 1 meter from the camera. With high gear ratios (up to 1:1000), the continuous rotation of the mirror is slow enough to allow the camera to record each line for about six seconds, requiring about one hour for 600 lines (our video resolution). We generally capture extra lines, above and below the scene (up to 1000 lines), and then crop them to match the aspect ratio of the physical scenes before the movie is reconstructed.

The resulting images are combined into one matrix, M_ijk, where i = 1...672 and k = 1...512 are the dimensions of the individual x-t streak images, and j = 1...1000 addresses the second spatial dimension y. For a given time instant k, the submatrix N_ij contains a two-dimensional image of the scene, with a resolution of 672 × 1000 pixels, exposed for as short as 1.85 ps. Combining the x-t slices of the scene for each scanline yields a 3D x-y-t data volume, as shown in Figure 5 (left). An x-y slice represents one frame of the final movie, as shown in Figure 5 (right).
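
The assembly of the volume amounts to stacking the scanline captures and reslicing. A minimal sketch follows; the array names and the zero-filled placeholder data are ours, not from the original code:

    import numpy as np

    # Hypothetical stack of captured x-t streak images, one per scanline:
    # streak[j] has shape (672, 512) = (x, t); j = 0..999 indexes y.
    streak = np.zeros((1000, 672, 512), dtype=np.float32)

    # Rearrange into the x-y-t data volume M[i, j, k].
    M = np.transpose(streak, (1, 0, 2))   # shape (672, 1000, 512)

    # One frame of the final movie is an x-y slice at a fixed time bin k;
    # at 1.85 ps per bin, light covers about 0.3 mm/ps * 1.85 ps ~ 0.55 mm
    # of optical path between consecutive frames.
    frame = M[:, :, 256]                  # shape (672, 1000)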

Figure 5. Left: Reconstructed x-y-t data volume, built by stacking individual x-t images (captured with the scanning mirrors). Right: An x-y slice of the data cube represents one frame of the final movie.

Chapter 5 Depicting Ultrafast Videos in 2D

We have explored several ways to visualize the information contained in the captured x-y-t data cube in an intuitive way. First, contiguous N_ij slices can be played as the frames of a movie. Figure 1 (bottom row) shows a captured scene (bottle) along with several representative N_ij frames. (Effects are described for various scenes in Section 7.) However, understanding all the phenomena shown in a video is not a trivial task, and movies composed of x-y frames, such as the ones shown in Figure 10, may be hard to interpret. Merging a static photograph of the scene, from approximately the same point of view, with the N_ij slices aids in the understanding of light transport in the scenes (see movies within the supplementary video). Although straightforward to implement, the high dynamic range of the streak data requires a nonlinear intensity transformation to extract subtle optical effects in the presence of high-intensity reflections. We employ a logarithmic transformation to this end. We have also explored single-image methods for intuitive visualization of full space-time propagation, such as the color-coding in Figure 1 (right), which we describe in the following paragraphs.

Integral Photo Fusion. By integrating all the frames in novel ways, we can visualize and highlight different aspects of the light flow in one photo. Our photo fusion results are calculated as N_ij = Σ_k w_k M_ijk, k = 1...512, where w_k is a weighting factor determined by the particular fusion method. We have tested several different methods, of which two were found to yield the most intuitive results. The first one is full fusion, where w_k = 1 for all k. Summing all frames of the movie provides something resembling a black and white photograph of the scene illuminated by the laser, while showing time-resolved light transport effects. An example is shown in Figure 6 (left) for the alien scene. (More information about the scene is given in Section 7.) A second technique, rainbow fusion, takes the fusion result and assigns a different RGB color to each frame, effectively color-coding the temporal dimension. An example is shown in Figure 6 (middle).

Peak Time Images. The inherent integration in fusion methods, though often useful, can fail to reveal the most complex or subtle behavior of light. As an alternative, we propose peak time images, which illustrate the time evolution of the maximum intensity in each frame. For each spatial position (i, j) in the x-y-t volume, we find the peak intensity along the time dimension, and keep information within two time units to each side of the peak. All other values in the streak image are set to zero, yielding a more sparse space-time volume. We then color-code time and sum up the x-y frames in this new sparse volume, in the same manner as in the rainbow fusion case, but use only every 20th frame in the sum, to create black lines between the equi-time paths, or isochrones. This results in a map of the propagation of maximum intensity contours, which we term a peak time image. These color-coded isochronous lines can be thought of, intuitively, as propagating energy fronts. Figure 6 (right) shows the peak time image for the alien scene, and Figure 1 (top middle) shows the captured data for the bottle scene depicted using this visualization method. As explained in the next section, this visualization of the bottle scene reveals significant light transport phenomena that could not be seen with the rainbow fusion visualization.
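
Both fusion variants and the peak-time construction are simple reductions over the time axis of M_ijk. The following numpy sketch is our reading of the description above; the blue-to-red color ramp is a stand-in for whatever color map was actually used:

    import numpy as np

    def time_colors(K):
        # Simple blue-to-red ramp color-coding the K time bins.
        t = np.linspace(0.0, 1.0, K)
        return np.stack([t, 1.0 - np.abs(2.0 * t - 1.0), 1.0 - t], axis=1)

    def full_fusion(M):
        # w_k = 1 for all k: a photo-like image of the laser-lit scene.
        return M.sum(axis=2)

    def rainbow_fusion(M):
        # Color-code the temporal dimension, then integrate.
        return np.einsum('ijk,kc->ijc', M, time_colors(M.shape[2]))

    def peak_time_image(M, half_width=2, step=20):
        K = M.shape[2]
        k = np.arange(K)
        peak = M.argmax(axis=2)[..., None]           # per-pixel peak bin
        # Keep only values within two time units of each pixel's peak.
        sparse = np.where(np.abs(k - peak) <= half_width, M, 0.0)
        # Sum only every 20th frame, leaving black gaps between isochrones.
        sparse = sparse * (k % step == 0)
        return np.einsum('ijk,kc->ijc', sparse, time_colors(K))

    # Display typically follows a log transform, e.g. np.log1p(image),
    # to compress the high dynamic range of the streak data.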

Figure 7. Understanding reversal of events in captured videos. Left: Pulsed light scatters from a source, strikes a surface (e.g., at P1 and P2), and is then recorded by a sensor. The time taken by light to travel distances z1 + d1 and z2 + d2 is responsible for the existence of two different time frames, and the need for computational correction to visualize the captured data in the world time frame. Right: Light appears to be propagating from P2 to P1 in camera time (before unwarping), and from P1 to P2 in world time, once time-unwarped. Extended planar surfaces will intersect constant-time paths to produce either elliptical or circular fronts.

Chapter 6 Time Unwarping

Visualization of the captured movies (Sections 5 and 7) reveals results that are counter-intuitive to theoretical and established knowledge of light transport. Figure 1 (top middle) shows a peak time visualization of the bottle scene, where several abnormal light transport effects can be observed: (1) the caustics on the floor, which propagate towards the bottle, instead of away from it; (2) the curved spherical energy fronts in the label area, which should be rectilinear as seen from the camera; and (3) the pulse itself being located behind these energy fronts, when it would need to precede them. These are due to the fact that, usually, light propagation is assumed to be infinitely fast, so that events in world space are assumed to be detected simultaneously in camera space. In our ultrafast photography setup, however, this assumption no longer holds, and the finite speed of light becomes a factor: we must now take into account the time delay between the occurrence of an event and its detection by the camera sensor.

We therefore need to consider two different time frames, namely world time (when events happen) and camera time (when events are detected). This duality of time frames is explained in Figure 7: light from a source hits a surface first at point P1 = (i1, j1) (with (i, j) being the x-y pixel coordinates of a scene point in the x-y-t data cube), then at the farther point P2 = (i2, j2), but the reflected light is captured in the reverse order by the sensor, due to the different total path lengths (z1 + d1 > z2 + d2). Generally, this is due to the fact that, for light to arrive at a given time instant t0, all the rays from the source, to the wall, to the camera, must satisfy z_i + d_i = c t0, so that isochrones are elliptical. Therefore, although objects closer to the source receive light earlier, they can still lie on a higher (later-time) isochrone than farther ones.

In order to visualize all light transport events as they have occurred (not as the camera captured them), we transform the captured data from camera time to world time, a transformation which we term time unwarping. Mathematically, for a scene point P = (i, j), we apply the following transformation:

    t'_ij = t_ij + z_ij / (c/η)    (1)

where t'_ij and t_ij represent camera and world times respectively, c is the speed of light in vacuum, η the index of refraction of the medium, and z_ij is the distance from point P to the camera. For our table-top scenes, we measure this distance with a Faro digitizer arm, although it could be obtained from the data and the known position of the diffuser, as the problem is analogous to that of bistatic LiDAR. We can thus define the light travel time from each point (i, j) in the scene to the camera as Δt_ij = t'_ij − t_ij = z_ij / (c/η). Then, time unwarping effectively corresponds to offsetting data in the x-y-t volume along the time dimension, according to the value of Δt_ij for each of the (i, j) points, as shown in Figure 8.

In most of the scenes, we only have propagation of light through air, for which we take η ≈ 1. For the bottle scene, we assume that the laser pulse travels along its longitudinal axis at the speed of light, and that only a single scattering event occurs in the liquid inside. We take η = 1.33 as the index of refraction of the liquid, and ignore refraction at the bottle's surface. A step-by-step unwarping process is shown in Figure 9 for a frame (i.e., x-y image) of the bottle scene. Our unoptimized Matlab code runs at about 0.1 seconds per frame. A time-unwarped peak-time visualization of the whole of this scene is shown in Figure 1 (right). Notice how now the caustics originate from the bottle and propagate outward, energy fronts along the label are correctly depicted as straight lines, and the pulse precedes related phenomena, as expected.
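
In code, time unwarping is a per-pixel shift of the time profile by Δt_ij. Below is a minimal sketch of Equation (1), assuming a per-pixel depth map z (in mm) and a scalar index of refraction; the variable names are ours, not the authors' Matlab code:

    import numpy as np

    C = 0.2998  # speed of light in mm/ps

    def time_unwarp(M, z, eta=1.0, dt=1.85):
        # M: x-y-t volume in camera time; z: distance to camera per pixel
        # (mm); dt: ps per time bin. Camera time t' = t + z / (c / eta),
        # so world time is recovered by shifting each profile back by
        # delta_t = z * eta / c, rounded to whole bins.
        shift = np.rint((z * eta / C) / dt).astype(int)
        out = np.zeros_like(M)
        for i in range(M.shape[0]):
            for j in range(M.shape[1]):
                # np.roll wraps around; fine as long as the volume has
                # empty padding at both ends of the time axis.
                out[i, j] = np.roll(M[i, j], -shift[i, j])
        return out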

Chapter 7 Captured Scenes

We have used our ultrafast photography setup to capture interesting light transport effects in different scenes. Figure 10 summarizes them, showing representative frames and peak time visualizations. The exposure time for our scenes is between 1.85 ps, for the crystal scene, and 5.07 ps, for the bottle and tank scenes, which required imaging a longer time span for better visualization. Please refer to the video in the supplementary material to watch the reconstructed movies. Overall, observing light in such slow motion reveals both subtle and key aspects of light transport. We provide here brief descriptions of the light transport effects captured in the different scenes.

Bottle. This scene is shown in Figure 1 (bottom row), and has been used to introduce time-unwarping. A plastic bottle, filled with water diluted with milk, is directly illuminated by the laser pulse, entering through the bottom of the bottle along its longitudinal axis. The pulse scatters inside the liquid; we can see the propagation of the wavefronts. The geometry of the bottle neck creates some interesting lens effects, making light look almost like a fluid. Most of the light is reflected back from the cap, while some is transmitted or trapped in subsurface scattering phenomena. Caustics are generated on the table.

Tomato-tape. This scene shows a tomato and a tape roll, with a wall behind them. The propagation of the spherical wavefront, after the laser pulse hits the diffuser, can be seen clearly as it intersects the floor and the back wall (A, B). The inside of the tape roll is out of the line of sight of the light source and is not directly illuminated. It is illuminated later, as indirect light scattered from the first wave reaches it (C). Shadows become visible only after the object has been illuminated. The more opaque tape darkens quickly after the light front has passed, while the tomato continues glowing for a longer time, indicative of stronger subsurface scattering (D).

Alien. A toy alien is positioned in front of a mirror and wall. Light interactions in this scene are extremely rich, due to the mirror, the multiple interreflections, and the subsurface scattering in the toy. The video shows how the reflection in the mirror is actually formed: direct light first reaches the toy, but the mirror is still completely dark (E); eventually, light leaving the toy reaches the mirror, and the reflection is dynamically formed (F). Subsurface scattering is clearly present in the toy (G), while multiple direct and indirect interactions between the wall and the mirror can also be seen (H).

Crystal. A group of sugar crystals is directly illuminated by the laser from the left, acting as multiple lenses and creating caustics on the table (I). Part of the light refracted on the table is reflected back to the candy, creating secondary caustics on the table (J). Additionally, scattering events are visible within the crystals (K).

Tank. A reflective grating is placed at the right side of a tank filled with milk diluted in water. The grating is taken from a commercial spectrometer, and consists of an array of small, equally spaced rectangular mirrors. The grating is blazed: mirrors are tilted to concentrate maximum optical power in the first-order diffraction for one wavelength. The pulse enters the scene from the left, travels through the tank (L), and strikes the grating. The grating reflects and diffracts the beam pulse (M). The different orders of the diffraction are visible traveling back through the tank (N). As the figure (and the supplementary movie) shows, most of the light reflected from the grating propagates at the blaze angle.
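
For intuition about where those orders travel, the standard grating equation d (sin θ_m − sin θ_i) = m λ can be evaluated. The sketch below uses the 795 nm laser wavelength from Section 3 but a hypothetical groove density, since the text does not give the grating's parameters:

    import numpy as np

    lam = 795e-9               # laser wavelength (m)
    d = 1.0 / 600e3            # hypothetical pitch: 600 grooves/mm
    theta_i = np.deg2rad(0.0)  # normal incidence, for illustration

    for m in range(-2, 3):
        s = m * lam / d + np.sin(theta_i)
        if abs(s) <= 1.0:      # propagating (non-evanescent) orders only
            print(m, np.rad2deg(np.arcsin(s)))

A blazed grating tilts each mirror facet so that specular reflection off the facet coincides with one of these order directions, which is why most of the captured light leaves at the blaze angle.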

Chapter 8 Conclusions and Future Work

Our research fosters new computational imaging and image processing opportunities by providing incoherent time-resolved information at ultrafast temporal resolutions. We hope our work will inspire new research in computer graphics and computational photography, by enabling forward and inverse analysis of light transport, allowing for full scene capture of hidden geometry and materials, or for relighting photographs. To this end, captured movies and data of the scenes shown in this paper are available at femtocamera.info. This exploitation, in turn, may influence the rapidly emerging field of ultrafast imaging hardware.

The system could be extended to image in color by adding additional pulsed laser sources at different colors, or by using one continuously tunable optical parametric oscillator (OPO). A second color of about 400 nm could easily be added to the existing system by doubling the laser frequency with a nonlinear crystal (about $1000). The streak tube is sensitive across the entire visible spectrum, with a peak sensitivity at about 450 nm (about five times the sensitivity at 800 nm). Scaling to bigger scenes would require less time resolution, and could therefore simplify the imaging setup. Scaling should be possible without signal degradation, as long as the camera aperture and lens are scaled with the rest of the setup. If the aperture stays the same, the light intensity needs to be increased quadratically to obtain similar results.

Beyond the ability of the commercially available streak sensor, advances in optics, material science, and compressive sensing may bring further optimization of the system, which could yield increased resolution of the captured x-t streak images. Nonlinear shutters may provide an alternate path to femto-photography capture systems. However, nonlinear optical methods require exotic materials and strong light intensities that can damage the objects of interest (and must be provided by laser light). Further, they often suffer from physical instabilities.
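
One way to see the quadratic requirement, assuming a simple inverse-square argument not spelled out above: the flux collected from a scene point scales with the solid angle subtended by the aperture, Ω ≈ A/d², for aperture area A at distance d. Scaling the scene by a factor s with A fixed gives Ω' = A/(s·d)² = Ω/s², so keeping the photon count per pixel constant requires increasing the source intensity by s²; a scene twice as large would need roughly four times the laser power.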

We believe that mass production of streak sensors can lead to affordable systems. Also, future designs may overcome the current limitations of our prototype regarding optical efficiency. Future research can investigate other ultrafast phenomena, such as propagation of light in anisotropic media and photonic crystals, or may be used in applications such as scientific visualization (to understand ultrafast processes), medicine (to reconstruct subsurface elements), material engineering (to analyze material properties), or quality control (to detect faults in structures). This could provide radically new challenges in the realm of computer graphics. Graphics research can enable new insights via comprehensible simulations and new data structures to render light in motion. For instance, relativistic rendering techniques have been developed using our data, where the common assumption of constant irradiance over the surfaces no longer holds [Jarabo et al. 2013]. It may also allow a better understanding of scattering, and may lead to new physically valid models, as well as spawn new art forms.

References

ABRAMSON, N. 1978. Light-in-flight recording by holography. Optics Letters 3, 4, 121–123.

BUSCK, J., AND HEISELBERG, H. 2004. Gated viewing and high-accuracy three-dimensional laser radar. Applied Optics 43, 24, 4705–4710.

CAMPILLO, A., AND SHAPIRO, S. 1987. Picosecond streak camera fluorometry: a review. IEEE Journal of Quantum Electronics 19, 4, 585–603.

CHARBON, E. 2007. Will avalanche photodiode arrays ever reach 1 megapixel? In International Image Sensor Workshop, 246–249.

COLAÇO, A., KIRMANI, A., HOWLAND, G. A., HOWELL, J. C., AND GOYAL, V. K. 2012. Compressive depth map acquisition using a single photon-counting detector: Parametric signal processing meets sparsity. In IEEE Computer Vision and Pattern Recognition, CVPR 2012, 96–102.

DUGUAY, M. A., AND MATTICK, A. T. 1971. Pulsed-image generation and detection. Applied Optics 10, 2162–2170.

FARO, 2012. Faro Technologies Inc.: Measuring Arms. http://www.faro.com.

GBUR, G., 2012. A camera fast enough to watch light move? http://skullsinthestars.com/2012/01/04/a-camera-fast-enough-to-watch-light-move/.

GELBART, A., REDMAN, B. C., LIGHT, R. S., SCHWARTZLOW, C. A., AND GRIFFIS, A. J. 2002. Flash lidar based on multiple-slit streak tube imaging lidar. In SPIE, vol. 4723, 9–18.

GODA, K., TSIA, K. K., AND JALALI, B. 2009. Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena. Nature 458, 1145–1149.

GUPTA, O., WILLWACHER, T., VELTEN, A., VEERARAGHAVAN, A., AND RASKAR, R. 2012. Reconstruction of hidden 3D shapes using diffuse reflections. Optics Express 20, 19096–19108.

HAMAMATSU, 2012. Guide to Streak Cameras. http://sales.hamamatsu.com/assets/pdf/catsandguides/e_streakh.pdf.

HEBDEN, J. C. 1993. Line scan acquisition for time-resolved imaging through scattering media. Opt. Eng. 32, 3, 626–633.

HEIDE, F., HULLIN, M., GREGSON, J., AND HEIDRICH, W. 2013. Low-budget transient imaging using photonic mixer devices. ACM Trans. Graph. 32, 4.

HUANG, D., SWANSON, E., LIN, C., SCHUMAN, J., STINSON, W., CHANG, W., HEE, M., FLOTTE, T., GREGORY, K., AND PULIAFITO, C. 1991. Optical coherence tomography. Science 254, 5035, 1178–1181.

ITATANI, J., QUÉRÉ, F., YUDIN, G. L., IVANOV, M. Y., KRAUSZ, F., AND CORKUM, P. B. 2002. Attosecond streak camera. Phys. Rev. Lett. 88, 173903.

JARABO, A., MASIA, B., AND GUTIERREZ, D. 2013. Transient rendering and relativistic visualization. Tech. Rep. TR-01-2013, Universidad de Zaragoza, April.

Page 7: Gaurav.report on femto photography

into account effects associated with the finite speed of light Weapply our femto-photography technique to visualizations of verydifferent scenes which allow us to observe the rich dynamics oftime-resolved light transport effects including scattering specularreflections diffuse interreflections diffraction caustics and subsurfacescattering Our work has potential applications in artisticeducational and scientific visualizations industrial imaging to analyzematerial properties and medical imaging to reconstruct subsurfaceelements In addition our time-resolved technique may motivatenew forms of computational photography

Chapter 1 Introduction

Forward and inverse analysis of light transport plays an important

role in diverse fields such as computer graphics computer visionand scientific imaging Because conventional imaging hardware isslow compared to the speed of light traditional computer graphicsand computer vision algorithms typically analyze transport usinglow time-resolution photos Consequently any information that isencoded in the time delays of light propagation is lost Whereas thejoint design of novel optical hardware and smart computation iecomputational photography has expanded the way we capture analyze and understand visual information speed-of-light propagationhas been largely unexplored In this paper we present a novel ultrafastimaging technique which we term femto-photography consistingof femtosecond laser illumination picosecond-accurate detectorsand mathematical reconstruction techniques to allow us to visualizemovies of light in motion as it travels through a scene withan effective framerate of about one half trillion frames per secondThis allows us to see for instance a light pulse scattering inside aplastic bottle or image formation in a mirror as a function of time

Challenges Developing such time-resolved system is a challengingproblem for several reasons that are under-appreciated in conventionalmethods (a) brute-force time exposures under 2 ps yieldan impractical signal-to-noise (SNR) ratio (b) suitable cameras torecord 2D image sequences at this time resolution do not exist dueto sensor bandwidth limitations (c) comprehensible visualizationof the captured time-resolved data is non-trivial and (d) direct measurementsof events appear warped in space-time because the finitespeed of light implies that the recorded light propagation delay dependson camera position relative to the sceneContributions Our main contribution is in addressing these challengesand creating a first prototype as follows_ We exploit the statistical similarity of periodic light transportevents to record multiple ultrashort exposure times of onedimensionalviews (Section 3)_ We introduce a novel hardware implementation to sweep theexposures across a vertical field of view to build 3D spacetimedata volumes (Section 4)_ We create techniques for comprehensible visualization includingmovies showing the dynamics of real-world lighttransport phenomena (including reflections scattering diffuseinter-reflections or beam diffraction) and the notion ofpeak-time which partially overcomes the low-frequency appearanceof integrated global light transport (Section 5)_ We introduce a time-unwarping technique to correct the distortionsin captured time-resolved information due to the finitespeed of light (Section 6)

Limitations Although not conceptual our setup has several practicallimitations primarily due to the limited SNR of scattered light Since the hardware elements in our system were originally designedfor different purposes it is not optimized for efficiency and suffersfrom low optical throughput (eg the detector is optimized for 500nm visible light while the infrared laser wavelength we use is 795nm) and from dynamic range limitations This lengthens the totalrecording time to approximately one hour Furthermore the scanningmirror rotating continuously introduces some blurring in thedata along the scanned (vertical) dimension Future optimized systemscan overcome these limitations

Chapter 2 Related WorkUltrafast Devices The fastest 2D continuous real-timemonochromatic camera operates at hundreds of nanoseconds perframe [Goda et al 2009] (about 6_ 106 frames per second) with aspatial resolution of 200_200 pixels less than one third of what weachieve Avalanche photodetector (APD) arrays can reach temporalresolutions of several tens of picoseconds if they are used in aphoton starved regime where only a single photon hits a detectorwithin a time window of tens of nanoseconds [Charbon 2007]Repetitive illumination techniques used in incoherent LiDAR [Tou1995 Gelbart et al 2002] use cameras with typical exposure timeson the order of hundreds of picoseconds [Busck and Heiselberg2004 Colaccedilo et al 2012] two orders of magnitude slower thanour system Liquid nonlinear shutters actuated with powerful laserpulses have been used to capture single analog frames imaginglight pulses at picosecond time resolution [Duguay and Mattick1971] Other sensors that use a coherent phase relation betweenthe illumination and the detected light such as optical coherencetomography (OCT) [Huang et al 1991] coherent LiDAR [Xia

and Zhang 2009] light-in-flight holography [Abramson 1978]or white light interferometry [Wyant 2002] achieve femtosecondresolutions however they require light to maintain coherence(ie wave interference effects) during light transport and aretherefore unsuitable for indirect illumination in which diffusereflections remove coherence from the light Simple streak sensorscapture incoherent light at picosecond to nanosecond speeds butare limited to a line or low resolution (20 _ 20) square field ofview [Campillo and Shapiro 1987 Itatani et al 2002 Shiragaet al 1995 Gelbart et al 2002 Kodama et al 1999 Qu et al2006] They have also been used as line scanning devices forimage transmission through highly scattering turbid media byrecording the ballistic photons which travel a straight path throughthe scatterer and thus arrive first on the sensor [Hebden 1993]

Figure 3 Left Photograph of our ultrafast imaging system setup The DSLR camera takes a conventional photo for comparison RightTime sequence illustrating the arrival of the pulse striking a diffuser its transformation into a spherical energy front and its propagationthrough the scene The corresponding captured scene is shown in Figure 10 (top row)

principles that we develop in this paper for the purpose of transientimaging were first demonstrated by Velten et al [2012c] Recentlyphotonic mixer devices along with nonlinear optimization havealso been used in this context [Heide et al 2013]Our system can record and reconstruct space-time world informationof incoherent light propagation in free-space table-top scenesat a resolution of up to 672 _ 1000 pixels and under 2 picosecondsper frame The varied range and complexity of the scenes

we capture allow us to visualize the dynamics of global illuminationeffects such as scattering specular reflections interreflectionssubsurface scattering caustics and diffractionTime-Resolved Imaging Recent advances in time-resolvedimaging have been exploited to recover geometry and motionaround corners [Raskar and Davis 2008 Kirmani et al 2011 Veltenet al 2012b Velten et al 2012a Gupta et al 2012 Pandharkaret al 2011] and albedo of from single view point [Naik et al 2011]But none of them explored the idea of capturing videos of light inmotion in direct view and have some fundamental limitations (suchas capturing only third-bounce light) that make them unsuitable forthe present purpose Wu et al [2012a] separate direct and global illuminationcomponents from time-resolved data captured with thesystem we describe in this paper by analyzing the time profile ofeach pixel In a recent publication [Wu et al 2012b] the authorspresent an analysis on transient light transport in frequency spaceand show how it can be applied to bare-sensor imaging Chapter 3 Capturing Space-Time PlanesWe capture time scales orders of magnitude faster than the exposuretimes of conventional cameras in which photons reaching thesensor at different times are integrated into a single value makingit impossible to observe ultrafast optical phenomena The systemdescribed in this paper has an effective exposure time down to 185ps since light travels at 03 mmps light travels approximately 05mm between frames in our reconstructed moviesSystem An ultrafast setup must overcome several difficulties inorder to accurately measure a high-resolution (both in space and

time) image First for an unamplified laser pulse a single exposuretime of less than 2 ps would not collect enough light so the SNRwould be unworkably low As an example for a table-top sceneilluminated by a 100Wbulb only about 1 photon on average wouldreach the sensor during a 2 ps open-shutter period Second becauseof the time scales involved synchronization of the sensor and theillumination must be executed within picosecond precision Thirdstandalone streak sensors sacrifice the vertical spatial dimension inorder to code the time dimension thus producing x-t images Asa consequence their field of view is reduced to a single horizontalline of view of the scene We solve these problems with our ultrafast imaging system outlinedin Figure 2 (A photograph of the actual setup is shown inFigure 3 (left)) The light source is a femtosecond (fs) Kerr lensmode-locked TiSapphire laser which emits 50-fs with a centerwavelength of 795 nm at a repetition rate of 75 MHz and averagepower of 500 mW In order to see ultrafast events in a scene withmacro-scaled objects we focus the light with a lens onto a Lambertiandiffuser which then acts as a point light source and illuminatesthe entire scene with a spherically-shaped pulse (see Figure3 (right)) Alternatively if we want to observe pulse propagationitself rather than the interactions with large objects we direct thelaser beam across the field of view of the camera through a scatteringmedium (see the bottle scene in Figure 1)Because all the pulses are statistically identical we can record thescattered light from many of them and integrate the measurementsto average out any noise The result is a signal with a high SNR Tosynchronize this illumination with the streak sensor (HamamatsuC5680 [Hamamatsu 2012]) we split off a portion of the beam with

a glass slide and direct it onto a fast photodetector connected to thesensor so that now both detector and illumination operate synchronously(see Figure 2 (a))Capturing space-time planes The streak sensor then capturesan x-t image of a certain scanline (ie a line of pixels in the horizontaldimension) of the scene with a space-time resolution of672 _ 512 The exact time resolution depends on the amplificaamplificationof an internal sweep voltage signal applied to the streak sensorWith our hardware it can be adjusted from 030 ps to 507 ps Practicallywe choose the fastest resolution that still allows for captureof the entire duration of the event In the streak sensor a photocathodeconverts incoming photons arriving from each spatial locationin the scanline into electrons The streak sensor generates the x-timage by deflecting these electrons according to the time of theirarrival to different positions along the t-dimension of the sensor(see Figure 2(b) and 2(c)) This is achieved by means of rapidlychanging the sweep voltage between the electrodes in the sensorFor each horizontal scanline the camera records a scene illuminatedby the pulse and averages the light scattered by 45 _ 108

pulses (see Figure 2(d) and 2(e))Performance Validation To characterize the streak sensor wecompare sensor measurements with known geometry and verify thelinearity reproducibility and calibration of the time measurementsTo do this we first capture a streak image of a scanline of a simplescene a plane being illuminated by the laser after hitting the diffuser

(see Figure 4 (left)) Then by using a Faro digitizer arm [Faro2012] we obtain the ground truth geometry of the points along that

plane and of the point of the diffuser hit by the laser this allows usto compute the total travel time per path (diffuser-plane-streak sensor)for each pixel in the scanline We then compare the travel timecaptured by our streak sensor with the real travel time computed from the known geometry

Chapter 4 Capturing Space-Time VolumesAlthough the synchronized pulsed measurements overcome SNRissues the streak sensor still provides only a one-dimensionalmovie Extension to two dimensions requires unfeasible bandwidthsa typical dimension is roughly 103 pixels so a threedimensionaldata cube has 109 elements Recording such a largequantity in a 1010485769 second (1 ns) time widow requires a bandwidthof 1018 bytes far beyond typical available bandwidthsWe solve this acquisition problem by again utilizing the synchronizedrepeatability of the hardware A mirror-scanning system (two9 cm _ 13 cm mirrors see Figure 3 (left)) rotates the camerarsquos centerof projection so that it records horizontal slices of a scene sequentiallyWe use a computer-controlled one-rpm servo motor torotate one of the mirrors and consequently scan the field of viewvertically The scenes are about 25 cm wide and placed about 1

meter from the camera With high gear ratios (up to 11000) thecontinuous rotation of the mirror is slow enough to allow the camerato record each line for about six seconds requiring about onehour for 600 lines (our video resolution) We generally capture extralines above and below the scene (up to 1000 lines) and thencrop them to match the aspect ratio of the physical scenes beforethe movie was reconstructedThese resulting images are combined into one matrixMijk wherei = 1672 and k = 1512 are the dimensions of the individualx-t streak images and j = 11000 addresses the second spatialdimension y For a given time instant k the submatrix Nij containsa two-dimensional image of the scene with a resolution of 672 _1000 pixels exposed for as short to 185 ps Combining the x-tslices of the scene for each scanline yields a 3D x-y-t data volumeas shown in Figure 5 (left) An x-y slice represents one frame of the

final movie as shown in Figure 5 (right)

Figure 5 Left Reconstructed x-y-t data volume by stacking individualx-t images (captured with the scanning mirrors) Right Anx-y slice of the data cube represents one frame of the final movie

Chapter 5 Depicting Ultrafast Videos in 2DWe have explored several ways to visualize the information containedin the captured x-y-t data cube in an intuitive way Firstcontiguous Nij slices can be played as the frames of a movie Figure1 (bottom row) shows a captured scene (bottle) along with severalrepresentative Nij frames (Effects are described for variousscenes in Section 7) However understanding all the phenomenashown in a video is not a trivial task and movies composed of x-yframes such as the ones shown in Figure 10 may be hard to interpretMerging a static photograph of the scene from approximately thesame point of view with the Nij slices aids in the understanding oflight transport in the scenes (see movies within the supplementaryvideo) Although straightforward to implement the high dynamicrange of the streak data requires a nonlinear intensity transformationto extract subtle optical effects in the presence of high intensityreflections We employ a logarithmic transformation to this endWe have also explored single-image methods for intuitive visualization

of full space-time propagation such as the color-coding inFigure 1 (right) which we describe in the following paragraphsIntegral Photo Fusion By integrating all the frames in novelways we can visualize and highlight different aspects of the lightflow in one photo Our photo fusion results are calculated asNij =PwkMijk fk = 1512g where wk is a weighting factordetermined by the particular fusion method We have tested severaldifferent methods of which two were found to yield the most intuitiveresults the first one is full fusion where wk = 1 for all kSumming all frames of the movie provides something resemblinga black and white photograph of the scene illuminated by the laserwhile showing time-resolved light transport effects An exampleis shown in Figure 6 (left) for the alien scene (More informationabout the scene is given in Section 7) A second technique rainbowfusion takes the fusion result and assigns a different RGB color toeach frame effectively color-coding the temporal dimension Anexample is shown in Figure 6 (middle)Peak Time Images The inherent integration in fusion methodsthough often useful can fail to reveal the most complex or subtlebehavior of light As an alternative we propose peak time imageswhich illustrate the time evolution of the maximum intensity in eachframe For each spatial position (i j) in the x-y-t volume we findthe peak intensity along the time dimension and keep informationwithin two time units to each side of the peak All other values inthe streak image are set to zero yielding a more sparse space-timevolume We then color-code time and sum up the x-y frames inthis new sparse volume in the same manner as in the rainbow fusioncase but use only every 20th frame in the sum to create blacklines between the equi-time paths or isochrones This results in amap of the propagation of maximum intensity contours which we

term peak time image These color-coded isochronous lines can bethought of intuitively as propagating energy fronts Figure 6 (right)shows the peak time image for the alien scene and Figure 1 (topmiddle) shows the captured data for the bottle scene depicted usingthis visualization method As explained in the next section thisvisualization of the bottle scene reveals significant light transportphenomena that could not be seen with the rainbow fusion visualization

Figure 7 Understanding reversal of events in captured videosLeft Pulsed light scatters from a source strikes a surface (egat P1 and P2) and is then recorded by a sensor Time taken bylight to travel distances z1 + d1 and z2 + d2 is responsible for theexistence of two different time frames and the need of computationalcorrection to visualize the captured data in the world time frameRight Light appears to be propagating from P2 to P1 in cameratime (before unwarping) and from P1 to P2 in world time oncetime-unwarped Extended planar surfaces will intersect constanttimepaths to produce either elliptical or circular fronts

Chapter 6 Time UnwarpingVisualization of the captured movies (Sections 5 and 7) reveals resultsthat are counter-intuitive to theoretical and established knowledgeof light transport Figure 1 (top middle) shows a peak timevisualization of the bottle scene where several abnormal light transporteffects can be observed (1) the caustics on the floor whichpropagate towards the bottle instead of away from it (2) the curvedspherical energy fronts in the label area which should be rectilinearas seen from the camera and (3) the pulse itself being locatedbehind these energy fronts when it would need to precede themThese are due to the fact that usually light propagation is assumed

to be infinitely fast so that events in world space are assumed to bedetected simultaneously in camera space In our ultrafast photographysetup however this assumption no longer holds and the finitespeed of light becomes a factor we must now take into account thetime delay between the occurrence of an event and its detection bythe camera sensorWe therefore need to consider two different time frames namelyworld time (when events happen) and camera time (when events aredetected) This duality of time frames is explained in Figure 7 lightfrom a source hits a surface first at point P1 = (i1 j1) (with (i j)being the x-y pixel coordinates of a scene point in the x-y-t datacube) then at the farther point P2 = (i2 j2) but the reflected lightis captured in the reverse order by the sensor due to different totalpath lengths (z1 + d1 gt z2 + d2) Generally this is due to the factthat for light to arrive at a given time instant t0 all the ray(later-time) isochrone than farther oness fromthe source to the wall to the camera must satisfy zi+di = ct0 sothat isochrones are elliptical Therefore although objects closer tothe source receive light earlier they can still lie on a higher In order to visualize all light transport events as they have occurred(not as the camera captured them) we transform the captured datafrom camera time to world time a transformation which we termtime unwarping Mathematically for a scene point P = (i j) weapply the following transformationt0

ij = tij +zij

c=_(1)where t0

ij and tij represent camera and world times respectively

c is the speed of light in vacuum _ the index of refraction of themedium and zij is the distance from point P to the camera Forour table-top scenes we measure this distance with a Faro digitizerarm although it could be obtained from the data and the knownposition of the diffuser as the problem is analogous to that of bistaticLiDAR We can thus define light travel time from each point(i j) in the scene to the camera as _tij = t0

ij 1048576 tij = zij=(c=_)Then time unwarping effectively corresponds to offsetting data inthe x-y-t volume along the time dimension according to the valueof _tij for each of the (i j) points as shown in Figure 8In most of the scenes we only have propagation of light through airfor which we take _ _ 1 For the bottle scene we assume that thelaser pulse travels along its longitudinal axis at the speed of lightand that only a single scattering event occurs in the liquid insideWe take _ = 133 as the index of refraction of the liquid and ignorerefraction at the bottlersquos surface A step-by-step unwarping processis shown in Figure 9 for a frame (ie x-y image) of the bottle sceneOur unoptimized Matlab code runs at about 01 seconds per frameA time-unwarped peak-time visualization of the whole of this sceneis shown in Figure 1 (right) Notice how now the caustics originatefrom the bottle and propagate outward energy fronts along the labelare correctly depicted as straight lines and the pulse precedesrelated phenomena as expected

Chapter 7 Captured ScenesWe have used our ultrafast photography setup to capture interestinglight transport effects in different scenes Figure 10 summarizes

them showing representative frames and peak time visualizationsThe exposure time for our scenes is between 185 ps for the crystalscene and 507 ps for the bottle and tank scenes which requiredimaging a longer time span for better visualization Please refer tothe video in the supplementary material to watch the reconstructedmovies Overall observing light in such slow motion reveals bothsubtle and key aspects of light transport We provide here briefdescriptions of the light transport effects captured in the differentscenesBottle This scene is shown in Figure 1 (bottom row) and hasbeen used to introduce time-unwarping A plastic bottle filled withwater diluted with milk is directly illuminated by the laser pulseentering through the bottom of the bottle along its longitudinal axisThe pulse scatters inside the liquid we can see the propagation ofthe wavefronts The geometry of the bottle neck creates some interestinglens effects making light look almost like a fluid Most ofthe light is reflected back from the cap while some is transmitted ortrapped in subsurface scattering phenomena Caustics are generatedon the tableTomato-tape This scene shows a tomato and a tape roll with awall behind them The propagation of the spherical wavefront afterthe laser pulse hits the diffuser can be seen clearly as it intersectsthe floor and the back wall (A B) The inside of the tape roll is outof the line of sight of the light source and is not directly illuminated

It is illuminated later as indirect light scattered from the first wavereaches it (C) Shadows become visible only after the object hasbeen illuminated The more opaque tape darkens quickly after thelight front has passed while the tomato continues glowing for alonger time indicative of stronger subsurface scattering (D)Alien A toy alien is positioned in front of a mirror and wall Lightinteractions in this scene are extremely rich due to the mirror themultiple interreflections and the subsurface scattering in the toyThe video shows how the reflection in the mirror is actually formeddirect light first reaches the toy but the mirror is still completelydark (E) eventually light leaving the toy reaches the mirror andthe reflection is dynamically formed (F) Subsurface scattering isclearly present in the toy (G) while multiple direct and indirectinteractions between the wall and the mirror can also be seen (H)Crystal A group of sugar crystals is directly illuminated by thelaser from the left acting as multiple lenses and creating causticson the table (I) Part of the light refracted on the table is reflectedback to the candy creating secondary caustics on the table (J) Additionallyscattering events are visible within the crystals (K)

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8 Conclusions and Future WorkOur research fosters new computational imaging and image processingopportunities by providing incoherent time-resolved informationat ultrafast temporal resolutions We hope our workwill inspire new research in computer graphics and computationalphotography by enabling forward and inverse analysis of lighttransport allowing for full scene capture of hidden geometry andmaterials or for relighting photographs To this end capturedmovies and data of the scenes shown in this paper are available atfemtocamerainfo This exploitation in turn may influencethe rapidly emerging field of ultrafast imaging hardware

The system could be extended to image in color by adding additionalpulsed laser sources at different colors or by using one continuouslytunable optical parametric oscillator (OPO) A second colorof about 400 nm could easily be added to the existing system bydoubling the laser frequency with a nonlinear crystal (about $1000)The streak tube is sensitive across the entire visible spectrum witha peak sensitivity at about 450 nm (about five times the sensitivityat 800 nm) Scaling to bigger scenes would require less timeresolution and could therefore simplify the imaging setup Scalingshould be possible without signal degradation as long as the cameraaperture and lens are scaled with the rest of the setup If theaperture stays the same the light intensity needs to be increasedquadratically to obtain similar resultsBeyond the ability of the commercially available streak sensor advancesin optics material science and compressive sensing maybring further optimization of the system which could yield increasedresolution of the captured x-t streak images Nonlinearshutters may provide an alternate path to femto-photography capturesystems However nonlinear optical methods require exoticmaterials and strong light intensities that can damage the objects ofinterest (and must be provided by laser light) Further they oftensuffer from physical instabilities

We believe that mass production of streak sensors can lead to affordablesystems Also future designs may overcome the currentlimitations of our prototype regarding optical efficiency Futureresearch can investigate other ultrafast phenomena such as propagationof light in anisotropic media and photonic crystals or maybe used in applications such as scientific visualization (to understandultra-fast processes) medicine (to reconstruct subsurface elements)material engineering (to analyze material properties) orquality control (to detect faults in structures) This could provideradically new challenges in the realm of computer graphics Graphicsresearch can enable new insights via comprehensible simulationsand new data structures to render light in motion For instancerelativistic rendering techniques have been developed usingour data where the common assumption of constant irradiance overthe surfaces does no longer hold [Jarabo et al 2013] It may alsoallow a better understanding of scattering and may lead to new physically valid models as well as spawn new art forms

ReferencesABRAMSON N 1978 Light-in-flight recording by holographyOptics Letters 3 4 121ndash123BUSCK J AND HEISELBERG H 2004 Gated viewing and highaccuracythree-dimensional laser radar Applied optics 43 244705ndash4710CAMPILLO A AND SHAPIRO S 1987 Picosecond streak camerafluorometry a review IEEE Journal of Quantum Electronics19 4 585ndash603CHARBON E 2007 Will avalanche photodiode arrays ever reach 1megapixel In International Image Sensor Workshop 246ndash249COLACcedil O A KIRMANI A HOWLAND G A HOWELL J CAND GOYAL V K 2012 Compressive depth map acquisitionusing a single photon-counting detector Parametric signal processingmeets sparsity In IEEE Computer Vision and PatternRecognition CVPR 2012 96ndash102DUGUAY M A AND MATTICK A T 1971 Pulsed-image generationand detection Applied Optics 10 2162ndash2170FARO 2012 Faro Technologies Inc Measuring Arms httpwwwfarocomGBUR G 2012 A camera fast enough to watch lightmove httpskullsinthestarscom20120104a-camera-fast-enough-to-watch-light-moveGELBART A REDMAN B C LIGHT R S SCHWARTZLOWC A AND GRIFFIS A J 2002 Flash lidar based on multipleslitstreak tube imaging lidar SPIE vol 4723 9ndash18GODA K TSIA K K AND JALALI B 2009 Serial timeencodedamplified imaging for real-time observation of fast dynamicphenomena Nature 458 1145ndash1149GUPTA O WILLWACHER T VELTEN A VEERARAGHAVANA AND RASKAR R 2012 Reconstruction of hidden3D shapes using diffuse reflections Optics Express 20 19096ndash19108HAMAMATSU 2012 Guide to Streak Cameras httpsaleshamamatsucomassetspdfcatsandguidese_streakhpdfHEBDEN J C 1993 Line scan acquisition for time-resolved

imaging through scattering media Opt Eng 32 3 626ndash633HEIDE F HULLIN M GREGSON J AND HEIDRICH W2013 Low-budget transient imaging using photonic mixer devicesACM Trans Graph 32 4HUANG D SWANSON E LIN C SCHUMAN J STINSONW CHANG W HEE M FLOTTE T GREGORY K ANDPULIAFITO C 1991 Optical coherence tomography Science254 5035 1178ndash1181ITATANI J QUacuteE RacuteE F YUDIN G L IVANOV M Y KRAUSZF AND CORKUM P B 2002 Attosecond streak camera PhysRev Lett 88 173903JARABO A MASIA B AND GUTIERREZ D 2013 Transientrendering and relativistic visualization Tech Rep TR-01-2013Universidad de Zaragoza April

Page 8: Gaurav.report on femto photography

role in diverse fields such as computer graphics computer visionand scientific imaging Because conventional imaging hardware isslow compared to the speed of light traditional computer graphicsand computer vision algorithms typically analyze transport usinglow time-resolution photos Consequently any information that isencoded in the time delays of light propagation is lost Whereas thejoint design of novel optical hardware and smart computation iecomputational photography has expanded the way we capture analyze and understand visual information speed-of-light propagationhas been largely unexplored In this paper we present a novel ultrafastimaging technique which we term femto-photography consistingof femtosecond laser illumination picosecond-accurate detectorsand mathematical reconstruction techniques to allow us to visualizemovies of light in motion as it travels through a scene withan effective framerate of about one half trillion frames per secondThis allows us to see for instance a light pulse scattering inside aplastic bottle or image formation in a mirror as a function of time

Challenges. Developing such a time-resolved system is a challenging problem for several reasons that are under-appreciated in conventional methods: (a) brute-force time exposures under 2 ps yield an impractical signal-to-noise ratio (SNR); (b) suitable cameras to record 2D image sequences at this time resolution do not exist, due to sensor bandwidth limitations; (c) comprehensible visualization of the captured time-resolved data is non-trivial; and (d) direct measurements of events appear warped in space-time, because the finite speed of light implies that the recorded light propagation delay depends on camera position relative to the scene.

Contributions. Our main contribution is in addressing these challenges and creating a first prototype as follows:

- We exploit the statistical similarity of periodic light transport events to record multiple ultrashort exposure times of one-dimensional views (Section 3).
- We introduce a novel hardware implementation to sweep the exposures across a vertical field of view, to build 3D space-time data volumes (Section 4).
- We create techniques for comprehensible visualization, including movies showing the dynamics of real-world light transport phenomena (including reflections, scattering, diffuse inter-reflections, or beam diffraction), and the notion of peak-time, which partially overcomes the low-frequency appearance of integrated global light transport (Section 5).
- We introduce a time-unwarping technique to correct the distortions in captured time-resolved information due to the finite speed of light (Section 6).

Limitations. Although not conceptual, our setup has several practical limitations, primarily due to the limited SNR of scattered light. Since the hardware elements in our system were originally designed for different purposes, it is not optimized for efficiency and suffers from low optical throughput (e.g., the detector is optimized for 500 nm visible light, while the infrared laser wavelength we use is 795 nm) and from dynamic range limitations. This lengthens the total recording time to approximately one hour. Furthermore, the scanning mirror, rotating continuously, introduces some blurring in the data along the scanned (vertical) dimension. Future optimized systems can overcome these limitations.

Chapter 2: Related Work

Ultrafast Devices. The fastest 2D continuous, real-time monochromatic camera operates at hundreds of nanoseconds per frame [Goda et al. 2009] (about 6 × 10^6 frames per second), with a spatial resolution of 200 × 200 pixels, less than one third of what we achieve. Avalanche photodetector (APD) arrays can reach temporal resolutions of several tens of picoseconds if they are used in a photon-starved regime, where only a single photon hits a detector within a time window of tens of nanoseconds [Charbon 2007]. Repetitive illumination techniques used in incoherent LiDAR [Tou 1995; Gelbart et al. 2002] use cameras with typical exposure times on the order of hundreds of picoseconds [Busck and Heiselberg 2004; Colaço et al. 2012], two orders of magnitude slower than our system. Liquid nonlinear shutters actuated with powerful laser pulses have been used to capture single analog frames imaging light pulses at picosecond time resolution [Duguay and Mattick 1971]. Other sensors that use a coherent phase relation between the illumination and the detected light, such as optical coherence tomography (OCT) [Huang et al. 1991], coherent LiDAR [Xia and Zhang 2009], light-in-flight holography [Abramson 1978], or white light interferometry [Wyant 2002], achieve femtosecond resolutions; however, they require light to maintain coherence (i.e., wave interference effects) during light transport, and are therefore unsuitable for indirect illumination, in which diffuse reflections remove coherence from the light. Simple streak sensors capture incoherent light at picosecond to nanosecond speeds, but are limited to a line or low-resolution (20 × 20) square field of view [Campillo and Shapiro 1987; Itatani et al. 2002; Shiraga et al. 1995; Gelbart et al. 2002; Kodama et al. 1999; Qu et al. 2006]. They have also been used as line scanning devices for image transmission through highly scattering turbid media, by recording the ballistic photons, which travel a straight path through the scatterer and thus arrive first on the sensor [Hebden 1993].

Figure 3. Left: Photograph of our ultrafast imaging system setup. The DSLR camera takes a conventional photo for comparison. Right: Time sequence illustrating the arrival of the pulse striking a diffuser, its transformation into a spherical energy front, and its propagation through the scene. The corresponding captured scene is shown in Figure 10 (top row).

The principles that we develop in this paper for the purpose of transient imaging were first demonstrated by Velten et al. [2012c]. Recently, photonic mixer devices along with nonlinear optimization have also been used in this context [Heide et al. 2013]. Our system can record and reconstruct space-time world information of incoherent light propagation in free-space, table-top scenes, at a resolution of up to 672 × 1000 pixels and under 2 picoseconds per frame. The varied range and complexity of the scenes we capture allow us to visualize the dynamics of global illumination effects, such as scattering, specular reflections, interreflections, subsurface scattering, caustics, and diffraction.

Time-Resolved Imaging. Recent advances in time-resolved imaging have been exploited to recover geometry and motion around corners [Raskar and Davis 2008; Kirmani et al. 2011; Velten et al. 2012b; Velten et al. 2012a; Gupta et al. 2012; Pandharkar et al. 2011] and albedo from a single viewpoint [Naik et al. 2011]. But none of them explored the idea of capturing videos of light in motion in direct view, and they have some fundamental limitations (such as capturing only third-bounce light) that make them unsuitable for the present purpose. Wu et al. [2012a] separate direct and global illumination components from time-resolved data captured with the system we describe in this paper, by analyzing the time profile of each pixel. In a recent publication [Wu et al. 2012b], the authors present an analysis on transient light transport in frequency space, and show how it can be applied to bare-sensor imaging.

Chapter 3: Capturing Space-Time Planes

We capture time scales orders of magnitude faster than the exposure times of conventional cameras, in which photons reaching the sensor at different times are integrated into a single value, making it impossible to observe ultrafast optical phenomena. The system described in this paper has an effective exposure time down to 1.85 ps; since light travels at 0.3 mm/ps, light travels approximately 0.5 mm between frames in our reconstructed movies.

System. An ultrafast setup must overcome several difficulties in order to accurately measure a high-resolution (both in space and time) image. First, for an unamplified laser pulse, a single exposure time of less than 2 ps would not collect enough light, so the SNR would be unworkably low. As an example, for a table-top scene illuminated by a 100 W bulb, only about 1 photon on average would reach the sensor during a 2 ps open-shutter period. Second, because of the time scales involved, synchronization of the sensor and the illumination must be executed within picosecond precision. Third, standalone streak sensors sacrifice the vertical spatial dimension in order to code the time dimension, thus producing x-t images. As a consequence, their field of view is reduced to a single horizontal line of view of the scene.

We solve these problems with our ultrafast imaging system, outlined in Figure 2. (A photograph of the actual setup is shown in Figure 3 (left).) The light source is a femtosecond (fs) Kerr lens mode-locked Ti:Sapphire laser, which emits 50 fs pulses with a center wavelength of 795 nm, at a repetition rate of 75 MHz and average power of 500 mW. In order to see ultrafast events in a scene with macro-scaled objects, we focus the light with a lens onto a Lambertian diffuser, which then acts as a point light source and illuminates the entire scene with a spherically-shaped pulse (see Figure 3 (right)). Alternatively, if we want to observe pulse propagation itself, rather than the interactions with large objects, we direct the laser beam across the field of view of the camera through a scattering medium (see the bottle scene in Figure 1).

Because all the pulses are statistically identical, we can record the scattered light from many of them and integrate the measurements to average out any noise. The result is a signal with a high SNR. To synchronize this illumination with the streak sensor (Hamamatsu C5680 [Hamamatsu 2012]), we split off a portion of the beam with a glass slide and direct it onto a fast photodetector connected to the sensor, so that both detector and illumination operate synchronously (see Figure 2(a)).
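To make the averaging argument concrete, the following Python sketch (illustrative only; the array sizes and noise level are hypothetical, not taken from the actual system) shows how summing many statistically identical exposures raises the SNR roughly as the square root of the number of pulses:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical time profile of one pixel: 512 time bins, with a faint
# pulse (about one photon per 2 ps exposure) arriving in bin 200.
truth = np.zeros(512)
truth[200] = 1.0

def capture_once():
    """One ultrashort exposure: the pulse is buried in zero-mean noise."""
    return truth + rng.normal(scale=5.0, size=truth.shape)

# Averaging N statistically identical exposures shrinks the noise
# standard deviation by about sqrt(N).
for n in (1, 100, 10_000):
    avg = np.mean([capture_once() for _ in range(n)], axis=0)
    noise = avg[:100].std()  # noise-only region before the pulse arrives
    print(f"N = {n:>6}: peak/noise ratio ~ {avg[200] / noise:.1f}")
```

In the actual system, the equivalent of roughly 4.5 × 10^8 pulses is integrated per scanline, which is what makes single-photon-level signals usable.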

Capturing space-time planes. The streak sensor then captures an x-t image of a certain scanline (i.e., a line of pixels in the horizontal dimension) of the scene, with a space-time resolution of 672 × 512. The exact time resolution depends on the amplification of an internal sweep voltage signal applied to the streak sensor. With our hardware, it can be adjusted from 0.30 ps to 5.07 ps. Practically, we choose the fastest resolution that still allows for capture of the entire duration of the event. In the streak sensor, a photocathode converts incoming photons, arriving from each spatial location in the scanline, into electrons. The streak sensor generates the x-t image by deflecting these electrons, according to the time of their arrival, to different positions along the t-dimension of the sensor (see Figures 2(b) and 2(c)). This is achieved by means of rapidly changing the sweep voltage between the electrodes in the sensor. For each horizontal scanline, the camera records a scene illuminated by the pulse and averages the light scattered by 4.5 × 10^8 pulses (see Figures 2(d) and 2(e)).

Performance Validation. To characterize the streak sensor, we compare sensor measurements with known geometry and verify the linearity, reproducibility, and calibration of the time measurements. To do this, we first capture a streak image of a scanline of a simple scene: a plane being illuminated by the laser after hitting the diffuser (see Figure 4 (left)). Then, by using a Faro digitizer arm [Faro 2012], we obtain the ground truth geometry of the points along that plane and of the point of the diffuser hit by the laser; this allows us to compute the total travel time per path (diffuser-plane-streak sensor) for each pixel in the scanline. We then compare the travel time captured by our streak sensor with the real travel time computed from the known geometry.
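The geometric side of this validation is simple to express in code. A sketch, with entirely hypothetical coordinates standing in for the digitized geometry:

```python
import numpy as np

C = 0.2998  # speed of light, in mm/ps

# Hypothetical ground-truth geometry (in mm), as a digitizer arm would
# provide it: the laser spot on the diffuser, the camera, and the points
# on the illuminated plane seen by one 672-pixel scanline.
diffuser = np.array([0.0, 0.0, 0.0])
camera = np.array([0.0, 0.0, 1000.0])
xs = np.linspace(-125.0, 125.0, 672)
points = np.stack([xs, np.full(672, 50.0), np.full(672, 500.0)], axis=1)

# Predicted travel time per pixel along the path diffuser -> point -> camera.
t_pred = (np.linalg.norm(points - diffuser, axis=1) +
          np.linalg.norm(points - camera, axis=1)) / C  # in ps

# Comparing t_pred with the arrival times read off the captured x-t streak
# image checks the linearity and calibration of the sensor's time axis.
print(t_pred.min(), t_pred.max(), "ps")
```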

Chapter 4: Capturing Space-Time Volumes

Although the synchronized, pulsed measurements overcome SNR issues, the streak sensor still provides only a one-dimensional movie. Extension to two dimensions requires unfeasible bandwidths: a typical dimension is roughly 10^3 pixels, so a three-dimensional data cube has 10^9 elements. Recording such a large quantity in a 10^-9 second (1 ns) time window requires a bandwidth of 10^18 bytes per second, far beyond typical available bandwidths.

We solve this acquisition problem by again utilizing the synchronized repeatability of the hardware. A mirror-scanning system (two 9 cm × 13 cm mirrors, see Figure 3 (left)) rotates the camera's center of projection, so that it records horizontal slices of a scene sequentially. We use a computer-controlled, one-rpm servo motor to rotate one of the mirrors and consequently scan the field of view vertically. The scenes are about 25 cm wide and placed about 1 meter from the camera. With high gear ratios (up to 1:1000), the continuous rotation of the mirror is slow enough to allow the camera to record each line for about six seconds, requiring about one hour for 600 lines (our video resolution). We generally capture extra lines, above and below the scene (up to 1000 lines), and then crop them to match the aspect ratio of the physical scenes before the movie is reconstructed.

These resulting images are combined into one matrix M_ijk, where i = 1…672 and k = 1…512 are the dimensions of the individual x-t streak images, and j = 1…1000 addresses the second spatial dimension y. For a given time instant k, the submatrix N_ij contains a two-dimensional image of the scene, with a resolution of 672 × 1000 pixels, exposed for as short as 1.85 ps. Combining the x-t slices of the scene for each scanline yields a 3D x-y-t data volume, as shown in Figure 5 (left). An x-y slice represents one frame of the final movie, as shown in Figure 5 (right).

Figure 5. Left: Reconstructed x-y-t data volume by stacking individual x-t images (captured with the scanning mirrors). Right: An x-y slice of the data cube represents one frame of the final movie.
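In code, the volume assembly is just a stack along the scanned dimension. A minimal NumPy sketch, with placeholder data standing in for the captured x-t streak images:

```python
import numpy as np

# Stand-in for the captured data: one x-t streak image per mirror position.
# Real dimensions are 672 x 512 over up to 1000 scanlines; a reduced
# number of lines is used here so the sketch runs in a few MB of memory.
n_lines, nx, nt = 100, 672, 512
streak_images = [np.zeros((nx, nt), dtype=np.float32) for _ in range(n_lines)]

# Stack along the scanned vertical dimension j to build M[i, j, k]:
# i and k index within each streak image, j indexes the scanline.
M = np.stack(streak_images, axis=1)  # shape (672, n_lines, 512)

# An x-y slice at a fixed time index k is one frame of the final movie,
# with an effective exposure as short as 1.85 ps.
frame_k = M[:, :, 256]
print(M.shape, frame_k.shape)
```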

Chapter 5: Depicting Ultrafast Videos in 2D

We have explored several ways to visualize the information contained in the captured x-y-t data cube in an intuitive way. First, contiguous N_ij slices can be played as the frames of a movie. Figure 1 (bottom row) shows a captured scene (bottle) along with several representative N_ij frames. (Effects are described for various scenes in Section 7.) However, understanding all the phenomena shown in a video is not a trivial task, and movies composed of x-y frames, such as the ones shown in Figure 10, may be hard to interpret. Merging a static photograph of the scene, from approximately the same point of view, with the N_ij slices aids in the understanding of light transport in the scenes (see movies within the supplementary video). Although straightforward to implement, the high dynamic range of the streak data requires a nonlinear intensity transformation to extract subtle optical effects in the presence of high-intensity reflections. We employ a logarithmic transformation to this end. We have also explored single-image methods for intuitive visualization of full space-time propagation, such as the color-coding in Figure 1 (right), which we describe in the following paragraphs.

Integral Photo Fusion. By integrating all the frames in novel ways, we can visualize and highlight different aspects of the light flow in one photo. Our photo fusion results are calculated as N_ij = Σ_k w_k M_ijk, {k = 1…512}, where w_k is a weighting factor determined by the particular fusion method. We have tested several different methods, of which two were found to yield the most intuitive results. The first one is full fusion, where w_k = 1 for all k. Summing all frames of the movie provides something resembling a black and white photograph of the scene illuminated by the laser, while showing time-resolved light transport effects. An example is shown in Figure 6 (left) for the alien scene. (More information about the scene is given in Section 7.) A second technique, rainbow fusion, takes the fusion result and assigns a different RGB color to each frame, effectively color-coding the temporal dimension. An example is shown in Figure 6 (middle).

Peak Time Images. The inherent integration in fusion methods, though often useful, can fail to reveal the most complex or subtle behavior of light. As an alternative, we propose peak time images, which illustrate the time evolution of the maximum intensity in each frame. For each spatial position (i, j) in the x-y-t volume, we find the peak intensity along the time dimension, and keep information within two time units to each side of the peak. All other values in the streak image are set to zero, yielding a more sparse space-time volume. We then color-code time and sum up the x-y frames in this new sparse volume, in the same manner as in the rainbow fusion case, but use only every 20th frame in the sum, to create black lines between the equi-time paths, or isochrones. This results in a map of the propagation of maximum intensity contours, which we term a peak time image. These color-coded isochronous lines can be thought of, intuitively, as propagating energy fronts. Figure 6 (right) shows the peak time image for the alien scene, and Figure 1 (top middle) shows the captured data for the bottle scene depicted using this visualization method. As explained in the next section, this visualization of the bottle scene reveals significant light transport phenomena that could not be seen with the rainbow fusion visualization.
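These visualizations reduce to weighted sums over the time axis of the data volume M. The sketch below (illustrative, not the authors' code; the colormap choice is an assumption) implements full fusion, rainbow fusion, and a basic peak-time image:

```python
import numpy as np
from matplotlib import cm

def full_fusion(M):
    # w_k = 1 for all k: summing all frames gives a photograph-like image.
    return M.sum(axis=2)

def rainbow_fusion(M):
    # Color-code the temporal dimension: each time bin k gets its own RGB
    # color, so the result shows when light arrived at each pixel.
    K = M.shape[2]
    colors = cm.jet(np.linspace(0.0, 1.0, K))[:, :3]  # (K, 3)
    return np.einsum('ijk,kc->ijc', M, colors)

def peak_time_image(M, window=2, stride=20):
    # Keep intensity only within `window` time bins of each pixel's peak,
    # then rainbow-fuse every `stride`-th frame so that black gaps appear
    # between the colored isochrones (the propagating energy fronts).
    k_peak = M.argmax(axis=2)[:, :, None]
    ks = np.arange(M.shape[2])[None, None, :]
    sparse = np.where(np.abs(ks - k_peak) <= window, M, 0.0)
    return rainbow_fusion(sparse[:, :, ::stride])

M = np.random.rand(67, 100, 512)  # small stand-in x-y-t volume
print(full_fusion(M).shape, rainbow_fusion(M).shape, peak_time_image(M).shape)
```

For display, the logarithmic intensity transformation mentioned above would correspond to something like applying np.log1p to the fused result before normalization.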

Figure 7. Understanding reversal of events in captured videos. Left: Pulsed light scatters from a source, strikes a surface (e.g., at P1 and P2), and is then recorded by a sensor. The time taken by light to travel distances z1 + d1 and z2 + d2 is responsible for the existence of two different time frames, and the need for computational correction to visualize the captured data in the world time frame. Right: Light appears to be propagating from P2 to P1 in camera time (before unwarping), and from P1 to P2 in world time, once time-unwarped. Extended planar surfaces will intersect constant-time paths to produce either elliptical or circular fronts.

Chapter 6: Time Unwarping

Visualization of the captured movies (Sections 5 and 7) reveals results that are counter-intuitive to theoretical and established knowledge of light transport. Figure 1 (top middle) shows a peak time visualization of the bottle scene, where several abnormal light transport effects can be observed: (1) the caustics on the floor, which propagate towards the bottle, instead of away from it; (2) the curved spherical energy fronts in the label area, which should be rectilinear as seen from the camera; and (3) the pulse itself being located behind these energy fronts, when it would need to precede them. These are due to the fact that, usually, light propagation is assumed to be infinitely fast, so that events in world space are assumed to be detected simultaneously in camera space. In our ultrafast photography setup, however, this assumption no longer holds, and the finite speed of light becomes a factor: we must now take into account the time delay between the occurrence of an event and its detection by the camera sensor.

We therefore need to consider two different time frames, namely world time (when events happen) and camera time (when events are detected). This duality of time frames is explained in Figure 7: light from a source hits a surface first at point P1 = (i1, j1) (with (i, j) being the x-y pixel coordinates of a scene point in the x-y-t data cube), then at the farther point P2 = (i2, j2), but the reflected light is captured in the reverse order by the sensor, due to different total path lengths (z1 + d1 > z2 + d2). In general, this is due to the fact that, for light to arrive at a given time instant t0, all the rays from the source, to the wall, to the camera, must satisfy zi + di = c t0, so that isochrones are elliptical. Therefore, although objects closer to the source receive light earlier, they can still lie on a later-time isochrone than farther ones.

In order to visualize all light transport events as they have occurred (not as the camera captured them), we transform the captured data from camera time to world time, a transformation which we term time unwarping. Mathematically, for a scene point P = (i, j), we apply the following transformation:

    t'_ij = t_ij + z_ij / (c/η)    (1)

where t'_ij and t_ij represent camera and world times respectively, c is the speed of light in vacuum, η the index of refraction of the medium, and z_ij is the distance from point P to the camera. For our table-top scenes, we measure this distance with a Faro digitizer arm, although it could be obtained from the data and the known position of the diffuser, as the problem is analogous to that of bistatic LiDAR. We can thus define the light travel time from each point (i, j) in the scene to the camera as Δt_ij = t'_ij − t_ij = z_ij / (c/η). Time unwarping then effectively corresponds to offsetting data in the x-y-t volume along the time dimension, according to the value of Δt_ij for each of the (i, j) points, as shown in Figure 8.

In most of the scenes, we only have propagation of light through air, for which we take η ≈ 1. For the bottle scene, we assume that the laser pulse travels along its longitudinal axis at the speed of light, and that only a single scattering event occurs in the liquid inside. We take η = 1.33 as the index of refraction of the liquid, and ignore refraction at the bottle's surface. A step-by-step unwarping process is shown in Figure 9 for a frame (i.e., x-y image) of the bottle scene. Our unoptimized Matlab code runs at about 0.1 seconds per frame. A time-unwarped peak-time visualization of the whole of this scene is shown in Figure 1 (right). Notice how now the caustics originate from the bottle and propagate outward, energy fronts along the label are correctly depicted as straight lines, and the pulse precedes related phenomena, as expected.
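Equation (1) translates into a per-pixel shift of the volume along its time axis. A minimal sketch of this offsetting operation (assuming a per-pixel depth map z in millimeters and uniform time bins; the real pipeline measured z with a digitizer arm and ran in Matlab):

```python
import numpy as np

C = 0.2998  # speed of light in vacuum, mm/ps

def time_unwarp(M, z, eta=1.0, dt=1.85):
    """Shift each pixel's time profile from camera time to world time.

    M   x-y-t volume in camera time, shape (X, Y, K)
    z   per-pixel distance to the camera in mm, shape (X, Y)
    eta index of refraction of the medium (1.33 for the bottle's liquid)
    dt  width of one time bin, in ps
    """
    # Delta_t_ij = z_ij / (c / eta), converted to whole time bins; only
    # relative delays matter within the captured volume, so subtract the
    # minimum shift.
    shift = np.rint(z / (C / eta) / dt).astype(int)
    shift -= shift.min()
    out = np.zeros_like(M)
    X, Y, K = M.shape
    for i in range(X):
        for j in range(Y):
            s = shift[i, j]
            if s < K:
                # World time t = camera time t' - Delta_t: pull data earlier.
                out[i, j, : K - s] = M[i, j, s:]
    return out

M = np.random.rand(8, 10, 512)             # stand-in volume
z = 1000.0 + 20.0 * np.random.rand(8, 10)  # depths around 1 m
print(time_unwarp(M, z).shape)
```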

Chapter 7: Captured Scenes

We have used our ultrafast photography setup to capture interesting light transport effects in different scenes. Figure 10 summarizes them, showing representative frames and peak time visualizations. The exposure time for our scenes is between 1.85 ps, for the crystal scene, and 5.07 ps, for the bottle and tank scenes, which required imaging a longer time span for better visualization. Please refer to the video in the supplementary material to watch the reconstructed movies. Overall, observing light in such slow motion reveals both subtle and key aspects of light transport. We provide here brief descriptions of the light transport effects captured in the different scenes.

Bottle. This scene is shown in Figure 1 (bottom row), and has been used to introduce time-unwarping. A plastic bottle, filled with water diluted with milk, is directly illuminated by the laser pulse, entering through the bottom of the bottle along its longitudinal axis. The pulse scatters inside the liquid; we can see the propagation of the wavefronts. The geometry of the bottle neck creates some interesting lens effects, making light look almost like a fluid. Most of the light is reflected back from the cap, while some is transmitted or trapped in subsurface scattering phenomena. Caustics are generated on the table.

Tomato-tape. This scene shows a tomato and a tape roll, with a wall behind them. The propagation of the spherical wavefront, after the laser pulse hits the diffuser, can be seen clearly as it intersects the floor and the back wall (A, B). The inside of the tape roll is out of the line of sight of the light source and is not directly illuminated. It is illuminated later, as indirect light scattered from the first wave reaches it (C). Shadows become visible only after the object has been illuminated. The more opaque tape darkens quickly after the light front has passed, while the tomato continues glowing for a longer time, indicative of stronger subsurface scattering (D).

Alien. A toy alien is positioned in front of a mirror and wall. Light interactions in this scene are extremely rich, due to the mirror, the multiple interreflections, and the subsurface scattering in the toy. The video shows how the reflection in the mirror is actually formed: direct light first reaches the toy, but the mirror is still completely dark (E); eventually, light leaving the toy reaches the mirror, and the reflection is dynamically formed (F). Subsurface scattering is clearly present in the toy (G), while multiple direct and indirect interactions between the wall and the mirror can also be seen (H).

Crystal. A group of sugar crystals is directly illuminated by the laser from the left, acting as multiple lenses and creating caustics on the table (I). Part of the light refracted on the table is reflected back to the candy, creating secondary caustics on the table (J). Additionally, scattering events are visible within the crystals (K).

Tank. A reflective grating is placed at the right side of a tank filled with milk diluted in water. The grating is taken from a commercial spectrometer, and consists of an array of small, equally spaced rectangular mirrors. The grating is blazed: mirrors are tilted to concentrate maximum optical power in the first order of diffraction for one wavelength. The pulse enters the scene from the left, travels through the tank (L), and strikes the grating. The grating reflects and diffracts the beam pulse (M). The different orders of the diffraction are visible traveling back through the tank (N). As the figure (and the supplementary movie) shows, most of the light reflected from the grating propagates at the blaze angle.

Chapter 8: Conclusions and Future Work

Our research fosters new computational imaging and image processing opportunities by providing incoherent time-resolved information at ultrafast temporal resolutions. We hope our work will inspire new research in computer graphics and computational photography, by enabling forward and inverse analysis of light transport, allowing for full scene capture of hidden geometry and materials, or for relighting photographs. To this end, captured movies and data of the scenes shown in this paper are available at femtocamera.info. This exploitation, in turn, may influence the rapidly emerging field of ultrafast imaging hardware.

The system could be extended to image in color by adding additional pulsed laser sources at different colors, or by using one continuously tunable optical parametric oscillator (OPO). A second color of about 400 nm could easily be added to the existing system by doubling the laser frequency with a nonlinear crystal (about $1000). The streak tube is sensitive across the entire visible spectrum, with a peak sensitivity at about 450 nm (about five times the sensitivity at 800 nm). Scaling to bigger scenes would require less time resolution, and could therefore simplify the imaging setup. Scaling should be possible without signal degradation, as long as the camera aperture and lens are scaled with the rest of the setup; if the aperture stays the same, the light intensity needs to be increased quadratically to obtain similar results.

Beyond the ability of the commercially available streak sensor, advances in optics, material science, and compressive sensing may bring further optimization of the system, which could yield increased resolution of the captured x-t streak images. Nonlinear shutters may provide an alternate path to femto-photography capture systems. However, nonlinear optical methods require exotic materials and strong light intensities that can damage the objects of interest (and must be provided by laser light); further, they often suffer from physical instabilities.

We believe that mass production of streak sensors can lead to affordable systems. Also, future designs may overcome the current limitations of our prototype regarding optical efficiency. Future research can investigate other ultrafast phenomena, such as propagation of light in anisotropic media and photonic crystals, or may be used in applications such as scientific visualization (to understand ultra-fast processes), medicine (to reconstruct subsurface elements), material engineering (to analyze material properties), or quality control (to detect faults in structures). This could provide radically new challenges in the realm of computer graphics. Graphics research can enable new insights via comprehensible simulations and new data structures to render light in motion. For instance, relativistic rendering techniques have been developed using our data, where the common assumption of constant irradiance over surfaces no longer holds [Jarabo et al. 2013]. It may also allow a better understanding of scattering, and may lead to new physically valid models, as well as spawn new art forms.

References

ABRAMSON, N. 1978. Light-in-flight recording by holography. Optics Letters 3, 4, 121–123.
BUSCK, J., AND HEISELBERG, H. 2004. Gated viewing and high-accuracy three-dimensional laser radar. Applied Optics 43, 24, 4705–4710.
CAMPILLO, A., AND SHAPIRO, S. 1987. Picosecond streak camera fluorometry: a review. IEEE Journal of Quantum Electronics 19, 4, 585–603.
CHARBON, E. 2007. Will avalanche photodiode arrays ever reach 1 megapixel? In International Image Sensor Workshop, 246–249.
COLAÇO, A., KIRMANI, A., HOWLAND, G. A., HOWELL, J. C., AND GOYAL, V. K. 2012. Compressive depth map acquisition using a single photon-counting detector: Parametric signal processing meets sparsity. In IEEE Computer Vision and Pattern Recognition, CVPR 2012, 96–102.
DUGUAY, M. A., AND MATTICK, A. T. 1971. Pulsed-image generation and detection. Applied Optics 10, 2162–2170.
FARO. 2012. Faro Technologies Inc.: Measuring Arms. http://www.faro.com
GBUR, G. 2012. A camera fast enough to watch light move? http://skullsinthestars.com/2012/01/04/a-camera-fast-enough-to-watch-light-move/
GELBART, A., REDMAN, B. C., LIGHT, R. S., SCHWARTZLOW, C. A., AND GRIFFIS, A. J. 2002. Flash lidar based on multiple-slit streak tube imaging lidar. SPIE, vol. 4723, 9–18.
GODA, K., TSIA, K. K., AND JALALI, B. 2009. Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena. Nature 458, 1145–1149.
GUPTA, O., WILLWACHER, T., VELTEN, A., VEERARAGHAVAN, A., AND RASKAR, R. 2012. Reconstruction of hidden 3D shapes using diffuse reflections. Optics Express 20, 19096–19108.
HAMAMATSU. 2012. Guide to Streak Cameras. http://sales.hamamatsu.com/assets/pdf/catsandguides/e_streakh.pdf
HEBDEN, J. C. 1993. Line scan acquisition for time-resolved imaging through scattering media. Opt. Eng. 32, 3, 626–633.
HEIDE, F., HULLIN, M., GREGSON, J., AND HEIDRICH, W. 2013. Low-budget transient imaging using photonic mixer devices. ACM Trans. Graph. 32, 4.
HUANG, D., SWANSON, E., LIN, C., SCHUMAN, J., STINSON, W., CHANG, W., HEE, M., FLOTTE, T., GREGORY, K., AND PULIAFITO, C. 1991. Optical coherence tomography. Science 254, 5035, 1178–1181.
ITATANI, J., QUÉRÉ, F., YUDIN, G. L., IVANOV, M. Y., KRAUSZ, F., AND CORKUM, P. B. 2002. Attosecond streak camera. Phys. Rev. Lett. 88, 173903.
JARABO, A., MASIA, B., AND GUTIERREZ, D. 2013. Transient rendering and relativistic visualization. Tech. Rep. TR-01-2013, Universidad de Zaragoza, April.

Page 9: Gaurav.report on femto photography

Challenges Developing such time-resolved system is a challengingproblem for several reasons that are under-appreciated in conventionalmethods (a) brute-force time exposures under 2 ps yieldan impractical signal-to-noise (SNR) ratio (b) suitable cameras torecord 2D image sequences at this time resolution do not exist dueto sensor bandwidth limitations (c) comprehensible visualizationof the captured time-resolved data is non-trivial and (d) direct measurementsof events appear warped in space-time because the finitespeed of light implies that the recorded light propagation delay dependson camera position relative to the sceneContributions Our main contribution is in addressing these challengesand creating a first prototype as follows_ We exploit the statistical similarity of periodic light transportevents to record multiple ultrashort exposure times of onedimensionalviews (Section 3)_ We introduce a novel hardware implementation to sweep theexposures across a vertical field of view to build 3D spacetimedata volumes (Section 4)_ We create techniques for comprehensible visualization includingmovies showing the dynamics of real-world lighttransport phenomena (including reflections scattering diffuseinter-reflections or beam diffraction) and the notion ofpeak-time which partially overcomes the low-frequency appearanceof integrated global light transport (Section 5)_ We introduce a time-unwarping technique to correct the distortionsin captured time-resolved information due to the finitespeed of light (Section 6)

Limitations Although not conceptual our setup has several practicallimitations primarily due to the limited SNR of scattered light Since the hardware elements in our system were originally designedfor different purposes it is not optimized for efficiency and suffersfrom low optical throughput (eg the detector is optimized for 500nm visible light while the infrared laser wavelength we use is 795nm) and from dynamic range limitations This lengthens the totalrecording time to approximately one hour Furthermore the scanningmirror rotating continuously introduces some blurring in thedata along the scanned (vertical) dimension Future optimized systemscan overcome these limitations

Chapter 2 Related WorkUltrafast Devices The fastest 2D continuous real-timemonochromatic camera operates at hundreds of nanoseconds perframe [Goda et al 2009] (about 6_ 106 frames per second) with aspatial resolution of 200_200 pixels less than one third of what weachieve Avalanche photodetector (APD) arrays can reach temporalresolutions of several tens of picoseconds if they are used in aphoton starved regime where only a single photon hits a detectorwithin a time window of tens of nanoseconds [Charbon 2007]Repetitive illumination techniques used in incoherent LiDAR [Tou1995 Gelbart et al 2002] use cameras with typical exposure timeson the order of hundreds of picoseconds [Busck and Heiselberg2004 Colaccedilo et al 2012] two orders of magnitude slower thanour system Liquid nonlinear shutters actuated with powerful laserpulses have been used to capture single analog frames imaginglight pulses at picosecond time resolution [Duguay and Mattick1971] Other sensors that use a coherent phase relation betweenthe illumination and the detected light such as optical coherencetomography (OCT) [Huang et al 1991] coherent LiDAR [Xia

and Zhang 2009] light-in-flight holography [Abramson 1978]or white light interferometry [Wyant 2002] achieve femtosecondresolutions however they require light to maintain coherence(ie wave interference effects) during light transport and aretherefore unsuitable for indirect illumination in which diffusereflections remove coherence from the light Simple streak sensorscapture incoherent light at picosecond to nanosecond speeds butare limited to a line or low resolution (20 _ 20) square field ofview [Campillo and Shapiro 1987 Itatani et al 2002 Shiragaet al 1995 Gelbart et al 2002 Kodama et al 1999 Qu et al2006] They have also been used as line scanning devices forimage transmission through highly scattering turbid media byrecording the ballistic photons which travel a straight path throughthe scatterer and thus arrive first on the sensor [Hebden 1993]

Figure 3 Left Photograph of our ultrafast imaging system setup The DSLR camera takes a conventional photo for comparison RightTime sequence illustrating the arrival of the pulse striking a diffuser its transformation into a spherical energy front and its propagationthrough the scene The corresponding captured scene is shown in Figure 10 (top row)

principles that we develop in this paper for the purpose of transientimaging were first demonstrated by Velten et al [2012c] Recentlyphotonic mixer devices along with nonlinear optimization havealso been used in this context [Heide et al 2013]Our system can record and reconstruct space-time world informationof incoherent light propagation in free-space table-top scenesat a resolution of up to 672 _ 1000 pixels and under 2 picosecondsper frame The varied range and complexity of the scenes

we capture allow us to visualize the dynamics of global illuminationeffects such as scattering specular reflections interreflectionssubsurface scattering caustics and diffractionTime-Resolved Imaging Recent advances in time-resolvedimaging have been exploited to recover geometry and motionaround corners [Raskar and Davis 2008 Kirmani et al 2011 Veltenet al 2012b Velten et al 2012a Gupta et al 2012 Pandharkaret al 2011] and albedo of from single view point [Naik et al 2011]But none of them explored the idea of capturing videos of light inmotion in direct view and have some fundamental limitations (suchas capturing only third-bounce light) that make them unsuitable forthe present purpose Wu et al [2012a] separate direct and global illuminationcomponents from time-resolved data captured with thesystem we describe in this paper by analyzing the time profile ofeach pixel In a recent publication [Wu et al 2012b] the authorspresent an analysis on transient light transport in frequency spaceand show how it can be applied to bare-sensor imaging Chapter 3 Capturing Space-Time PlanesWe capture time scales orders of magnitude faster than the exposuretimes of conventional cameras in which photons reaching thesensor at different times are integrated into a single value makingit impossible to observe ultrafast optical phenomena The systemdescribed in this paper has an effective exposure time down to 185ps since light travels at 03 mmps light travels approximately 05mm between frames in our reconstructed moviesSystem An ultrafast setup must overcome several difficulties inorder to accurately measure a high-resolution (both in space and

time) image First for an unamplified laser pulse a single exposuretime of less than 2 ps would not collect enough light so the SNRwould be unworkably low As an example for a table-top sceneilluminated by a 100Wbulb only about 1 photon on average wouldreach the sensor during a 2 ps open-shutter period Second becauseof the time scales involved synchronization of the sensor and theillumination must be executed within picosecond precision Thirdstandalone streak sensors sacrifice the vertical spatial dimension inorder to code the time dimension thus producing x-t images Asa consequence their field of view is reduced to a single horizontalline of view of the scene We solve these problems with our ultrafast imaging system outlinedin Figure 2 (A photograph of the actual setup is shown inFigure 3 (left)) The light source is a femtosecond (fs) Kerr lensmode-locked TiSapphire laser which emits 50-fs with a centerwavelength of 795 nm at a repetition rate of 75 MHz and averagepower of 500 mW In order to see ultrafast events in a scene withmacro-scaled objects we focus the light with a lens onto a Lambertiandiffuser which then acts as a point light source and illuminatesthe entire scene with a spherically-shaped pulse (see Figure3 (right)) Alternatively if we want to observe pulse propagationitself rather than the interactions with large objects we direct thelaser beam across the field of view of the camera through a scatteringmedium (see the bottle scene in Figure 1)Because all the pulses are statistically identical we can record thescattered light from many of them and integrate the measurementsto average out any noise The result is a signal with a high SNR Tosynchronize this illumination with the streak sensor (HamamatsuC5680 [Hamamatsu 2012]) we split off a portion of the beam with

a glass slide and direct it onto a fast photodetector connected to thesensor so that now both detector and illumination operate synchronously(see Figure 2 (a))Capturing space-time planes The streak sensor then capturesan x-t image of a certain scanline (ie a line of pixels in the horizontaldimension) of the scene with a space-time resolution of672 _ 512 The exact time resolution depends on the amplificaamplificationof an internal sweep voltage signal applied to the streak sensorWith our hardware it can be adjusted from 030 ps to 507 ps Practicallywe choose the fastest resolution that still allows for captureof the entire duration of the event In the streak sensor a photocathodeconverts incoming photons arriving from each spatial locationin the scanline into electrons The streak sensor generates the x-timage by deflecting these electrons according to the time of theirarrival to different positions along the t-dimension of the sensor(see Figure 2(b) and 2(c)) This is achieved by means of rapidlychanging the sweep voltage between the electrodes in the sensorFor each horizontal scanline the camera records a scene illuminatedby the pulse and averages the light scattered by 45 _ 108

pulses (see Figure 2(d) and 2(e))Performance Validation To characterize the streak sensor wecompare sensor measurements with known geometry and verify thelinearity reproducibility and calibration of the time measurementsTo do this we first capture a streak image of a scanline of a simplescene a plane being illuminated by the laser after hitting the diffuser

(see Figure 4 (left)) Then by using a Faro digitizer arm [Faro2012] we obtain the ground truth geometry of the points along that

plane and of the point of the diffuser hit by the laser this allows usto compute the total travel time per path (diffuser-plane-streak sensor)for each pixel in the scanline We then compare the travel timecaptured by our streak sensor with the real travel time computed from the known geometry

Chapter 4 Capturing Space-Time VolumesAlthough the synchronized pulsed measurements overcome SNRissues the streak sensor still provides only a one-dimensionalmovie Extension to two dimensions requires unfeasible bandwidthsa typical dimension is roughly 103 pixels so a threedimensionaldata cube has 109 elements Recording such a largequantity in a 1010485769 second (1 ns) time widow requires a bandwidthof 1018 bytes far beyond typical available bandwidthsWe solve this acquisition problem by again utilizing the synchronizedrepeatability of the hardware A mirror-scanning system (two9 cm _ 13 cm mirrors see Figure 3 (left)) rotates the camerarsquos centerof projection so that it records horizontal slices of a scene sequentiallyWe use a computer-controlled one-rpm servo motor torotate one of the mirrors and consequently scan the field of viewvertically The scenes are about 25 cm wide and placed about 1

meter from the camera With high gear ratios (up to 11000) thecontinuous rotation of the mirror is slow enough to allow the camerato record each line for about six seconds requiring about onehour for 600 lines (our video resolution) We generally capture extralines above and below the scene (up to 1000 lines) and thencrop them to match the aspect ratio of the physical scenes beforethe movie was reconstructedThese resulting images are combined into one matrixMijk wherei = 1672 and k = 1512 are the dimensions of the individualx-t streak images and j = 11000 addresses the second spatialdimension y For a given time instant k the submatrix Nij containsa two-dimensional image of the scene with a resolution of 672 _1000 pixels exposed for as short to 185 ps Combining the x-tslices of the scene for each scanline yields a 3D x-y-t data volumeas shown in Figure 5 (left) An x-y slice represents one frame of the

final movie as shown in Figure 5 (right)

Figure 5 Left Reconstructed x-y-t data volume by stacking individualx-t images (captured with the scanning mirrors) Right Anx-y slice of the data cube represents one frame of the final movie

Chapter 5 Depicting Ultrafast Videos in 2DWe have explored several ways to visualize the information containedin the captured x-y-t data cube in an intuitive way Firstcontiguous Nij slices can be played as the frames of a movie Figure1 (bottom row) shows a captured scene (bottle) along with severalrepresentative Nij frames (Effects are described for variousscenes in Section 7) However understanding all the phenomenashown in a video is not a trivial task and movies composed of x-yframes such as the ones shown in Figure 10 may be hard to interpretMerging a static photograph of the scene from approximately thesame point of view with the Nij slices aids in the understanding oflight transport in the scenes (see movies within the supplementaryvideo) Although straightforward to implement the high dynamicrange of the streak data requires a nonlinear intensity transformationto extract subtle optical effects in the presence of high intensityreflections We employ a logarithmic transformation to this endWe have also explored single-image methods for intuitive visualization

of full space-time propagation such as the color-coding inFigure 1 (right) which we describe in the following paragraphsIntegral Photo Fusion By integrating all the frames in novelways we can visualize and highlight different aspects of the lightflow in one photo Our photo fusion results are calculated asNij =PwkMijk fk = 1512g where wk is a weighting factordetermined by the particular fusion method We have tested severaldifferent methods of which two were found to yield the most intuitiveresults the first one is full fusion where wk = 1 for all kSumming all frames of the movie provides something resemblinga black and white photograph of the scene illuminated by the laserwhile showing time-resolved light transport effects An exampleis shown in Figure 6 (left) for the alien scene (More informationabout the scene is given in Section 7) A second technique rainbowfusion takes the fusion result and assigns a different RGB color toeach frame effectively color-coding the temporal dimension Anexample is shown in Figure 6 (middle)Peak Time Images The inherent integration in fusion methodsthough often useful can fail to reveal the most complex or subtlebehavior of light As an alternative we propose peak time imageswhich illustrate the time evolution of the maximum intensity in eachframe For each spatial position (i j) in the x-y-t volume we findthe peak intensity along the time dimension and keep informationwithin two time units to each side of the peak All other values inthe streak image are set to zero yielding a more sparse space-timevolume We then color-code time and sum up the x-y frames inthis new sparse volume in the same manner as in the rainbow fusioncase but use only every 20th frame in the sum to create blacklines between the equi-time paths or isochrones This results in amap of the propagation of maximum intensity contours which we

term peak time image These color-coded isochronous lines can bethought of intuitively as propagating energy fronts Figure 6 (right)shows the peak time image for the alien scene and Figure 1 (topmiddle) shows the captured data for the bottle scene depicted usingthis visualization method As explained in the next section thisvisualization of the bottle scene reveals significant light transportphenomena that could not be seen with the rainbow fusion visualization

Figure 7 Understanding reversal of events in captured videosLeft Pulsed light scatters from a source strikes a surface (egat P1 and P2) and is then recorded by a sensor Time taken bylight to travel distances z1 + d1 and z2 + d2 is responsible for theexistence of two different time frames and the need of computationalcorrection to visualize the captured data in the world time frameRight Light appears to be propagating from P2 to P1 in cameratime (before unwarping) and from P1 to P2 in world time oncetime-unwarped Extended planar surfaces will intersect constanttimepaths to produce either elliptical or circular fronts

Chapter 6 Time UnwarpingVisualization of the captured movies (Sections 5 and 7) reveals resultsthat are counter-intuitive to theoretical and established knowledgeof light transport Figure 1 (top middle) shows a peak timevisualization of the bottle scene where several abnormal light transporteffects can be observed (1) the caustics on the floor whichpropagate towards the bottle instead of away from it (2) the curvedspherical energy fronts in the label area which should be rectilinearas seen from the camera and (3) the pulse itself being locatedbehind these energy fronts when it would need to precede themThese are due to the fact that usually light propagation is assumed

to be infinitely fast so that events in world space are assumed to bedetected simultaneously in camera space In our ultrafast photographysetup however this assumption no longer holds and the finitespeed of light becomes a factor we must now take into account thetime delay between the occurrence of an event and its detection bythe camera sensorWe therefore need to consider two different time frames namelyworld time (when events happen) and camera time (when events aredetected) This duality of time frames is explained in Figure 7 lightfrom a source hits a surface first at point P1 = (i1 j1) (with (i j)being the x-y pixel coordinates of a scene point in the x-y-t datacube) then at the farther point P2 = (i2 j2) but the reflected lightis captured in the reverse order by the sensor due to different totalpath lengths (z1 + d1 gt z2 + d2) Generally this is due to the factthat for light to arrive at a given time instant t0 all the ray(later-time) isochrone than farther oness fromthe source to the wall to the camera must satisfy zi+di = ct0 sothat isochrones are elliptical Therefore although objects closer tothe source receive light earlier they can still lie on a higher In order to visualize all light transport events as they have occurred(not as the camera captured them) we transform the captured datafrom camera time to world time a transformation which we termtime unwarping Mathematically for a scene point P = (i j) weapply the following transformationt0

ij = tij +zij

c=_(1)where t0

ij and tij represent camera and world times respectively

c is the speed of light in vacuum _ the index of refraction of themedium and zij is the distance from point P to the camera Forour table-top scenes we measure this distance with a Faro digitizerarm although it could be obtained from the data and the knownposition of the diffuser as the problem is analogous to that of bistaticLiDAR We can thus define light travel time from each point(i j) in the scene to the camera as _tij = t0

ij 1048576 tij = zij=(c=_)Then time unwarping effectively corresponds to offsetting data inthe x-y-t volume along the time dimension according to the valueof _tij for each of the (i j) points as shown in Figure 8In most of the scenes we only have propagation of light through airfor which we take _ _ 1 For the bottle scene we assume that thelaser pulse travels along its longitudinal axis at the speed of lightand that only a single scattering event occurs in the liquid insideWe take _ = 133 as the index of refraction of the liquid and ignorerefraction at the bottlersquos surface A step-by-step unwarping processis shown in Figure 9 for a frame (ie x-y image) of the bottle sceneOur unoptimized Matlab code runs at about 01 seconds per frameA time-unwarped peak-time visualization of the whole of this sceneis shown in Figure 1 (right) Notice how now the caustics originatefrom the bottle and propagate outward energy fronts along the labelare correctly depicted as straight lines and the pulse precedesrelated phenomena as expected

Chapter 7 Captured ScenesWe have used our ultrafast photography setup to capture interestinglight transport effects in different scenes Figure 10 summarizes

them showing representative frames and peak time visualizationsThe exposure time for our scenes is between 185 ps for the crystalscene and 507 ps for the bottle and tank scenes which requiredimaging a longer time span for better visualization Please refer tothe video in the supplementary material to watch the reconstructedmovies Overall observing light in such slow motion reveals bothsubtle and key aspects of light transport We provide here briefdescriptions of the light transport effects captured in the differentscenesBottle This scene is shown in Figure 1 (bottom row) and hasbeen used to introduce time-unwarping A plastic bottle filled withwater diluted with milk is directly illuminated by the laser pulseentering through the bottom of the bottle along its longitudinal axisThe pulse scatters inside the liquid we can see the propagation ofthe wavefronts The geometry of the bottle neck creates some interestinglens effects making light look almost like a fluid Most ofthe light is reflected back from the cap while some is transmitted ortrapped in subsurface scattering phenomena Caustics are generatedon the tableTomato-tape This scene shows a tomato and a tape roll with awall behind them The propagation of the spherical wavefront afterthe laser pulse hits the diffuser can be seen clearly as it intersectsthe floor and the back wall (A B) The inside of the tape roll is outof the line of sight of the light source and is not directly illuminated

It is illuminated later as indirect light scattered from the first wavereaches it (C) Shadows become visible only after the object hasbeen illuminated The more opaque tape darkens quickly after thelight front has passed while the tomato continues glowing for alonger time indicative of stronger subsurface scattering (D)Alien A toy alien is positioned in front of a mirror and wall Lightinteractions in this scene are extremely rich due to the mirror themultiple interreflections and the subsurface scattering in the toyThe video shows how the reflection in the mirror is actually formeddirect light first reaches the toy but the mirror is still completelydark (E) eventually light leaving the toy reaches the mirror andthe reflection is dynamically formed (F) Subsurface scattering isclearly present in the toy (G) while multiple direct and indirectinteractions between the wall and the mirror can also be seen (H)Crystal A group of sugar crystals is directly illuminated by thelaser from the left acting as multiple lenses and creating causticson the table (I) Part of the light refracted on the table is reflectedback to the candy creating secondary caustics on the table (J) Additionallyscattering events are visible within the crystals (K)

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8 Conclusions and Future WorkOur research fosters new computational imaging and image processingopportunities by providing incoherent time-resolved informationat ultrafast temporal resolutions We hope our workwill inspire new research in computer graphics and computationalphotography by enabling forward and inverse analysis of lighttransport allowing for full scene capture of hidden geometry andmaterials or for relighting photographs To this end capturedmovies and data of the scenes shown in this paper are available atfemtocamerainfo This exploitation in turn may influencethe rapidly emerging field of ultrafast imaging hardware

Limitations. Although not conceptual, our setup has several practical limitations, primarily due to the limited SNR of scattered light. Since the hardware elements in our system were originally designed for different purposes, the setup is not optimized for efficiency and suffers from low optical throughput (e.g., the detector is optimized for 500 nm visible light, while the infrared laser wavelength we use is 795 nm) and from dynamic range limitations. This lengthens the total recording time to approximately one hour. Furthermore, the continuously rotating scanning mirror introduces some blurring in the data along the scanned (vertical) dimension. Future optimized systems can overcome these limitations.

Chapter 2 Related Work

Ultrafast Devices. The fastest 2D continuous real-time monochromatic camera operates at hundreds of nanoseconds per frame [Goda et al. 2009] (about 6 x 10^6 frames per second), with a spatial resolution of 200 x 200 pixels, less than one third of what we achieve. Avalanche photodetector (APD) arrays can reach temporal resolutions of several tens of picoseconds if they are used in a photon-starved regime, where only a single photon hits a detector within a time window of tens of nanoseconds [Charbon 2007]. Repetitive illumination techniques used in incoherent LiDAR [Tou 1995; Gelbart et al. 2002] use cameras with typical exposure times on the order of hundreds of picoseconds [Busck and Heiselberg 2004; Colaço et al. 2012], two orders of magnitude slower than our system. Liquid nonlinear shutters actuated with powerful laser pulses have been used to capture single analog frames imaging light pulses at picosecond time resolution [Duguay and Mattick 1971]. Other sensors that use a coherent phase relation between the illumination and the detected light, such as optical coherence tomography (OCT) [Huang et al. 1991], coherent LiDAR [Xia and Zhang 2009], light-in-flight holography [Abramson 1978], or white light interferometry [Wyant 2002], achieve femtosecond resolutions; however, they require light to maintain coherence (i.e., wave interference effects) during light transport, and are therefore unsuitable for indirect illumination, in which diffuse reflections remove coherence from the light. Simple streak sensors capture incoherent light at picosecond to nanosecond speeds, but are limited to a line or to a low-resolution (20 x 20) square field of view [Campillo and Shapiro 1987; Itatani et al. 2002; Shiraga et al. 1995; Gelbart et al. 2002; Kodama et al. 1999; Qu et al. 2006]. They have also been used as line scanning devices for image transmission through highly scattering turbid media, by recording the ballistic photons, which travel a straight path through the scatterer and thus arrive first at the sensor [Hebden 1993].

Figure 3. Left: Photograph of our ultrafast imaging system setup. The DSLR camera takes a conventional photo for comparison. Right: Time sequence illustrating the arrival of the pulse striking a diffuser, its transformation into a spherical energy front, and its propagation through the scene. The corresponding captured scene is shown in Figure 10 (top row).

The principles that we develop in this paper for the purpose of transient imaging were first demonstrated by Velten et al. [2012c]. Recently, photonic mixer devices along with nonlinear optimization have also been used in this context [Heide et al. 2013]. Our system can record and reconstruct space-time world information of incoherent light propagation in free-space, table-top scenes, at a resolution of up to 672 x 1000 pixels and under 2 picoseconds per frame. The varied range and complexity of the scenes we capture allow us to visualize the dynamics of global illumination effects, such as scattering, specular reflections, interreflections, subsurface scattering, caustics, and diffraction.

Time-Resolved Imaging. Recent advances in time-resolved imaging have been exploited to recover geometry and motion around corners [Raskar and Davis 2008; Kirmani et al. 2011; Velten et al. 2012b; Velten et al. 2012a; Gupta et al. 2012; Pandharkar et al. 2011] and albedo from a single viewpoint [Naik et al. 2011]. None of these works, however, explored the idea of capturing videos of light in motion in direct view, and they have some fundamental limitations (such as capturing only third-bounce light) that make them unsuitable for the present purpose. Wu et al. [2012a] separate direct and global illumination components from time-resolved data captured with the system we describe in this paper, by analyzing the time profile of each pixel. In a recent publication [Wu et al. 2012b], the authors present an analysis of transient light transport in frequency space, and show how it can be applied to bare-sensor imaging.

Chapter 3 Capturing Space-Time Planes

We capture time scales orders of magnitude faster than the exposure times of conventional cameras, in which photons reaching the sensor at different times are integrated into a single value, making it impossible to observe ultrafast optical phenomena. The system described in this paper has an effective exposure time down to 1.85 ps; since light travels at 0.3 mm/ps, it advances approximately 0.5 mm between frames in our reconstructed movies.

System. An ultrafast setup must overcome several difficulties in order to accurately measure a high-resolution (both in space and time) image. First, for an unamplified laser pulse, a single exposure time of less than 2 ps would not collect enough light, so the SNR would be unworkably low. As an example, for a table-top scene illuminated by a 100 W bulb, only about 1 photon on average would reach the sensor during a 2 ps open-shutter period. Second, because of the time scales involved, synchronization of the sensor and the illumination must be executed within picosecond precision. Third, standalone streak sensors sacrifice the vertical spatial dimension in order to code the time dimension, thus producing x-t images. As a consequence, their field of view is reduced to a single horizontal line of view of the scene.

We solve these problems with our ultrafast imaging system, outlined in Figure 2. (A photograph of the actual setup is shown in Figure 3 (left).) The light source is a femtosecond (fs) Kerr-lens mode-locked Ti:Sapphire laser, which emits 50 fs pulses with a center wavelength of 795 nm, at a repetition rate of 75 MHz and an average power of 500 mW. In order to see ultrafast events in a scene with macro-scaled objects, we focus the light with a lens onto a Lambertian diffuser, which then acts as a point light source and illuminates the entire scene with a spherically-shaped pulse (see Figure 3 (right)). Alternatively, if we want to observe pulse propagation itself, rather than the interactions with large objects, we direct the laser beam across the field of view of the camera through a scattering medium (see the bottle scene in Figure 1).

Because all the pulses are statistically identical, we can record the scattered light from many of them and integrate the measurements to average out any noise. The result is a signal with a high SNR. To synchronize this illumination with the streak sensor (Hamamatsu C5680 [Hamamatsu 2012]), we split off a portion of the beam with a glass slide and direct it onto a fast photodetector connected to the sensor, so that both detector and illumination now operate synchronously (see Figure 2 (a)).

Capturing space-time planes. The streak sensor then captures an x-t image of a certain scanline (i.e., a line of pixels in the horizontal dimension) of the scene, with a space-time resolution of 672 x 512. The exact time resolution depends on the amplification of an internal sweep voltage signal applied to the streak sensor. With our hardware, it can be adjusted from 0.30 ps to 5.07 ps. Practically, we choose the fastest resolution that still allows for capture of the entire duration of the event. In the streak sensor, a photocathode converts incoming photons, arriving from each spatial location in the scanline, into electrons. The streak sensor generates the x-t image by deflecting these electrons, according to the time of their arrival, to different positions along the t-dimension of the sensor (see Figures 2(b) and 2(c)). This is achieved by means of rapidly changing the sweep voltage between the electrodes in the sensor. For each horizontal scanline, the camera records the scene illuminated by the pulse and averages the light scattered by 4.5 x 10^8 pulses (see Figures 2(d) and 2(e)).

Performance Validation. To characterize the streak sensor, we compare sensor measurements with known geometry and verify the linearity, reproducibility, and calibration of the time measurements. To do this, we first capture a streak image of a scanline of a simple scene: a plane being illuminated by the laser after hitting the diffuser (see Figure 4 (left)). Then, by using a Faro digitizer arm [Faro 2012], we obtain the ground truth geometry of the points along that plane and of the point of the diffuser hit by the laser; this allows us to compute the total travel time per path (diffuser-plane-streak sensor) for each pixel in the scanline. We then compare the travel time captured by our streak sensor with the real travel time computed from the known geometry.
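
To make this validation step concrete, the following short Python sketch (not the authors' code; the helper name and inputs are illustrative, and it assumes the diffuser position, the plane points, and the camera position have already been measured with the digitizer arm, in millimeters) computes the expected per-pixel travel time along the diffuser-plane-sensor path:

    import numpy as np

    C_MM_PER_PS = 0.3  # speed of light: ~0.3 mm per picosecond

    def expected_travel_time(diffuser, points, camera):
        """Travel time (ps) along diffuser -> scene point -> camera.

        diffuser, camera: (3,) arrays in mm; points: (N, 3) array in mm.
        """
        d1 = np.linalg.norm(points - diffuser, axis=1)  # diffuser to scene point
        d2 = np.linalg.norm(camera - points, axis=1)    # scene point to sensor
        return (d1 + d2) / C_MM_PER_PS

    # Comparing these times against the arrival times read off the captured
    # x-t streak image should yield a constant residual (the trigger offset)
    # if the sweep, and hence the sensor's time axis, is linear.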

Chapter 4 Capturing Space-Time Volumes

Although the synchronized, pulsed measurements overcome SNR issues, the streak sensor still provides only a one-dimensional movie. Extension to two dimensions requires unfeasible bandwidths: a typical dimension is roughly 10^3 pixels, so a three-dimensional data cube has about 10^9 elements. Recording such a large quantity in a 10^-9 second (1 ns) time window requires a bandwidth on the order of 10^18 bytes per second (10^9 elements every 10^-9 seconds), far beyond typical available bandwidths.

We solve this acquisition problem by again utilizing the synchronized repeatability of the hardware. A mirror-scanning system (two 9 cm x 13 cm mirrors, see Figure 3 (left)) rotates the camera's center of projection, so that it records horizontal slices of the scene sequentially. We use a computer-controlled, one-rpm servo motor to rotate one of the mirrors and consequently scan the field of view vertically. The scenes are about 25 cm wide and placed about 1 meter from the camera. With high gear ratios (up to 1:1000), the continuous rotation of the mirror is slow enough to allow the camera to record each line for about six seconds, requiring about one hour for 600 lines (our video resolution). We generally capture extra lines above and below the scene (up to 1000 lines), and then crop them to match the aspect ratio of the physical scenes before the movie is reconstructed.

The resulting images are combined into one matrix M_ijk, where i = 1...672 and k = 1...512 are the dimensions of the individual x-t streak images, and j = 1...1000 addresses the second spatial dimension y. For a given time instant k, the submatrix N_ij contains a two-dimensional image of the scene, with a resolution of 672 x 1000 pixels, exposed for as short as 1.85 ps. Combining the x-t slices of the scene for each scanline yields a 3D x-y-t data volume, as shown in Figure 5 (left). An x-y slice represents one frame of the final movie, as shown in Figure 5 (right).

Figure 5. Left: Reconstructed x-y-t data volume, obtained by stacking individual x-t images (captured with the scanning mirrors). Right: An x-y slice of the data cube represents one frame of the final movie.
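
Assembling the data volume is, conceptually, a simple stacking operation. A minimal Python sketch (function names are illustrative assumptions; the streak images are assumed to be already cropped and ordered by scanline) could look as follows:

    import numpy as np

    def build_volume(streak_images):
        """Stack x-t streak images (one 672 x 512 array per scanline j)
        into an x-y-t data volume M[i, j, k]."""
        return np.stack(streak_images, axis=1)  # shape: (672, num_lines, 512)

    def movie_frame(volume, k):
        """Extract the x-y image N_ij for time instant k (one movie frame)."""
        return volume[:, :, k]

Played back for consecutive values of k, movie_frame(volume, k) reproduces the reconstructed movie frame by frame.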

Chapter 5 Depicting Ultrafast Videos in 2D

We have explored several ways to visualize the information contained in the captured x-y-t data cube in an intuitive way. First, contiguous N_ij slices can be played as the frames of a movie. Figure 1 (bottom row) shows a captured scene (bottle) along with several representative N_ij frames. (Effects are described for various scenes in Section 7.) However, understanding all the phenomena shown in a video is not a trivial task, and movies composed of x-y frames, such as the ones shown in Figure 10, may be hard to interpret. Merging a static photograph of the scene, taken from approximately the same point of view, with the N_ij slices aids in the understanding of light transport in the scenes (see movies within the supplementary video). Although straightforward to implement, the high dynamic range of the streak data requires a nonlinear intensity transformation to extract subtle optical effects in the presence of high-intensity reflections; we employ a logarithmic transformation to this end. We have also explored single-image methods for intuitive visualization of full space-time propagation, such as the color-coding in Figure 1 (right), which we describe in the following paragraphs.

Integral Photo Fusion. By integrating all the frames in novel ways, we can visualize and highlight different aspects of the light flow in one photo. Our photo fusion results are calculated as N_ij = Σ_k w_k M_ijk, k = 1...512, where w_k is a weighting factor determined by the particular fusion method. We have tested several different methods, of which two were found to yield the most intuitive results. The first one is full fusion, where w_k = 1 for all k. Summing all frames of the movie provides something resembling a black-and-white photograph of the scene illuminated by the laser, while showing time-resolved light transport effects. An example is shown in Figure 6 (left) for the alien scene. (More information about the scene is given in Section 7.) A second technique, rainbow fusion, takes the fusion result and assigns a different RGB color to each frame, effectively color-coding the temporal dimension. An example is shown in Figure 6 (middle).

Peak Time Images. The inherent integration in fusion methods, though often useful, can fail to reveal the most complex or subtle behavior of light. As an alternative, we propose peak time images, which illustrate the time evolution of the maximum intensity in each frame. For each spatial position (i, j) in the x-y-t volume, we find the peak intensity along the time dimension, and keep information within two time units to each side of the peak. All other values in the streak image are set to zero, yielding a sparser space-time volume. We then color-code time and sum up the x-y frames in this new sparse volume, in the same manner as in the rainbow fusion case, but use only every 20th frame in the sum, to create black lines between the equi-time paths, or isochrones. This results in a map of the propagation of maximum intensity contours, which we
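
All three visualizations are weighted sums over the time axis of the data cube. The sketch below illustrates one possible reading of them in Python (assumptions: a numpy x-y-t volume as built in Chapter 4, matplotlib's jet colormap standing in for the unspecified color-coding, and plain frame subsampling for the "every 20th frame" rule):

    import numpy as np
    import matplotlib.pyplot as plt

    def full_fusion(volume):
        # w_k = 1 for all k: summing all frames resembles a B/W photograph
        return volume.sum(axis=2)

    def rainbow_fusion(volume):
        # assign one RGB color per frame, color-coding the temporal dimension
        colors = plt.get_cmap("jet")(np.linspace(0, 1, volume.shape[2]))[:, :3]
        return np.tensordot(volume, colors, axes=([2], [0]))  # (X, Y, 3) image

    def peak_time_image(volume, half_width=2, step=20):
        # keep intensity only within +/- half_width frames of each pixel's
        # peak, then rainbow-fuse every `step`-th frame to draw isochrones
        peak = volume.argmax(axis=2)                  # (X, Y) peak frame index
        t = np.arange(volume.shape[2])
        mask = np.abs(t[None, None, :] - peak[:, :, None]) <= half_width
        sparse = np.where(mask, volume, 0.0)
        return rainbow_fusion(sparse[:, :, ::step])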

Figure 7. Understanding reversal of events in captured videos. Left: Pulsed light scatters from a source, strikes a surface (e.g., at P1 and P2), and is then recorded by a sensor. The time taken by light to travel distances z1 + d1 and z2 + d2 is responsible for the existence of two different time frames, and for the need of computational correction to visualize the captured data in the world time frame. Right: Light appears to be propagating from P2 to P1 in camera time (before unwarping), and from P1 to P2 in world time, once time-unwarped. Extended planar surfaces will intersect constant-time paths to produce either elliptical or circular fronts.

Chapter 6 Time Unwarping

Visualization of the captured movies (Sections 5 and 7) reveals results that are counter-intuitive to theoretical and established knowledge of light transport. Figure 1 (top middle) shows a peak time visualization of the bottle scene, where several abnormal light transport effects can be observed: (1) the caustics on the floor, which propagate towards the bottle, instead of away from it; (2) the curved spherical energy fronts in the label area, which should be rectilinear as seen from the camera; and (3) the pulse itself being located behind these energy fronts, when it would need to precede them. These are due to the fact that light propagation is usually assumed to be infinitely fast, so that events in world space are assumed to be detected simultaneously in camera space. In our ultrafast photography setup, however, this assumption no longer holds, and the finite speed of light becomes a factor: we must now take into account the time delay between the occurrence of an event and its detection by the camera sensor.

We therefore need to consider two different time frames, namely world time (when events happen) and camera time (when events are detected). This duality of time frames is explained in Figure 7: light from a source hits a surface first at point P1 = (i1, j1) (with (i, j) being the x-y pixel coordinates of a scene point in the x-y-t data cube), then at the farther point P2 = (i2, j2), but the reflected light is captured in the reverse order by the sensor, due to different total path lengths (z1 + d1 > z2 + d2). Generally, this is due to the fact that, for light to arrive at a given time instant t0, all the rays from the source, to the wall, to the camera, must satisfy zi + di = c t0, so that isochrones are elliptical. Therefore, although objects closer to the source receive light earlier, they can still lie on a later-time isochrone than farther ones.

In order to visualize all light transport events as they occurred (not as the camera captured them), we transform the captured data from camera time to world time, a transformation which we term time unwarping. Mathematically, for a scene point P = (i, j), we apply the following transformation:

    t'_ij = t_ij + z_ij / (c/η)        (1)

where t'_ij and t_ij represent camera and world times respectively, c is the speed of light in vacuum, η the index of refraction of the medium, and z_ij is the distance from point P to the camera. For our table-top scenes, we measure this distance with a Faro digitizer arm, although it could be obtained from the data and the known position of the diffuser, as the problem is analogous to that of bistatic LiDAR. We can thus define the light travel time from each point (i, j) in the scene to the camera as Δt_ij = t'_ij − t_ij = z_ij / (c/η). Time unwarping then effectively corresponds to offsetting data in the x-y-t volume along the time dimension, according to the value of Δt_ij for each of the (i, j) points, as shown in Figure 8.

In most of the scenes, we only have propagation of light through air, for which we take η ≈ 1. For the bottle scene, we assume that the laser pulse travels along its longitudinal axis at the speed of light, and that only a single scattering event occurs in the liquid inside. We take η = 1.33 as the index of refraction of the liquid, and ignore refraction at the bottle's surface. A step-by-step unwarping process is shown in Figure 9 for a frame (i.e., x-y image) of the bottle scene. Our unoptimized Matlab code runs at about 0.1 seconds per frame. A time-unwarped peak-time visualization of the whole of this scene is shown in Figure 1 (right). Notice how the caustics now originate from the bottle and propagate outward, energy fronts along the label are correctly depicted as straight lines, and the pulse precedes related phenomena, as expected.
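
As a sketch of what this per-pixel offsetting looks like in code (assumptions: distances z_ij in millimeters, a uniform index of refraction along the return path, a 1.85 ps frame interval, and rounding of the shift to whole frames, whereas real data may call for interpolation):

    import numpy as np

    C_MM_PER_PS = 0.3  # speed of light in vacuum, mm per picosecond

    def time_unwarp(volume, z, eta=1.0, frame_ps=1.85):
        """Shift each pixel's time profile from camera time back to world time.

        volume: x-y-t data cube; z: per-pixel distance to the camera (mm);
        eta: index of refraction of the medium along the return path.
        """
        X, Y, T = volume.shape
        # delta_t_ij = z_ij / (c / eta), expressed in whole frames
        shift = np.rint(z / (C_MM_PER_PS / eta) / frame_ps).astype(int)
        out = np.zeros_like(volume)
        for i in range(X):
            for j in range(Y):
                s = min(int(shift[i, j]), T)
                # events detected late (camera time) move earlier (world time)
                out[i, j, : T - s] = volume[i, j, s:]
        return out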

Chapter 7 Captured Scenes

We have used our ultrafast photography setup to capture interesting light transport effects in different scenes. Figure 10 summarizes them, showing representative frames and peak time visualizations. The exposure time for our scenes is between 1.85 ps, for the crystal scene, and 5.07 ps, for the bottle and tank scenes, which required imaging a longer time span for better visualization. Please refer to the video in the supplementary material to watch the reconstructed movies. Overall, observing light in such slow motion reveals both subtle and key aspects of light transport. We provide here brief descriptions of the light transport effects captured in the different scenes.

Bottle. This scene is shown in Figure 1 (bottom row), and has been used to introduce time unwarping. A plastic bottle, filled with water diluted with milk, is directly illuminated by the laser pulse, entering through the bottom of the bottle along its longitudinal axis. The pulse scatters inside the liquid; we can see the propagation of the wavefronts. The geometry of the bottle neck creates some interesting lens effects, making light look almost like a fluid. Most of the light is reflected back from the cap, while some is transmitted or trapped in subsurface scattering phenomena. Caustics are generated on the table.

Tomato-tape. This scene shows a tomato and a tape roll, with a wall behind them. The propagation of the spherical wavefront, after the laser pulse hits the diffuser, can be seen clearly as it intersects the floor and the back wall (A, B). The inside of the tape roll is out of the line of sight of the light source and is not directly illuminated. It is illuminated later, as indirect light scattered from the first wave reaches it (C). Shadows become visible only after the object has been illuminated. The more opaque tape darkens quickly after the light front has passed, while the tomato continues glowing for a longer time, indicative of stronger subsurface scattering (D).

Alien. A toy alien is positioned in front of a mirror and wall. Light interactions in this scene are extremely rich, due to the mirror, the multiple interreflections, and the subsurface scattering in the toy. The video shows how the reflection in the mirror is actually formed: direct light first reaches the toy, but the mirror is still completely dark (E); eventually, light leaving the toy reaches the mirror, and the reflection is dynamically formed (F). Subsurface scattering is clearly present in the toy (G), while multiple direct and indirect interactions between the wall and the mirror can also be seen (H).

Crystal. A group of sugar crystals is directly illuminated by the laser from the left, acting as multiple lenses and creating caustics on the table (I). Part of the light refracted on the table is reflected back to the candy, creating secondary caustics on the table (J). Additionally, scattering events are visible within the crystals (K).

Tank. A reflective grating is placed at the right side of a tank filled with milk diluted in water. The grating is taken from a commercial spectrometer, and consists of an array of small, equally spaced rectangular mirrors. The grating is blazed: mirrors are tilted to concentrate maximum optical power in the first-order diffraction for one wavelength. The pulse enters the scene from the left, travels through the tank (L), and strikes the grating. The grating reflects and diffracts the beam pulse (M). The different orders of the diffraction are visible traveling back through the tank (N). As the figure (and the supplementary movie) shows, most of the light reflected from the grating propagates at the blaze angle.

Chapter 8 Conclusions and Future Work

Our research fosters new computational imaging and image processing opportunities by providing incoherent time-resolved information at ultrafast temporal resolutions. We hope our work will inspire new research in computer graphics and computational photography, by enabling forward and inverse analysis of light transport, allowing for full scene capture of hidden geometry and materials, or for relighting photographs. To this end, captured movies and data of the scenes shown in this paper are available at femtocamera.info. This exploitation, in turn, may influence the rapidly emerging field of ultrafast imaging hardware.

The system could be extended to image in color by adding additional pulsed laser sources at different colors, or by using one continuously tunable optical parametric oscillator (OPO). A second color of about 400 nm could easily be added to the existing system by doubling the laser frequency with a nonlinear crystal (about $1000). The streak tube is sensitive across the entire visible spectrum, with a peak sensitivity at about 450 nm (about five times the sensitivity at 800 nm). Scaling to bigger scenes would require less time resolution, and could therefore simplify the imaging setup. Scaling should be possible without signal degradation, as long as the camera aperture and lens are scaled with the rest of the setup; if the aperture stays the same, the light intensity needs to be increased quadratically to obtain similar results.

Beyond the capabilities of the commercially available streak sensor, advances in optics, material science, and compressive sensing may bring further optimization of the system, which could yield increased resolution of the captured x-t streak images. Nonlinear shutters may provide an alternate path to femto-photography capture systems. However, nonlinear optical methods require exotic materials and strong light intensities that can damage the objects of interest (and must be provided by laser light); further, they often suffer from physical instabilities.

We believe that mass production of streak sensors can lead to affordable systems. Also, future designs may overcome the current limitations of our prototype regarding optical efficiency. Future research can investigate other ultrafast phenomena, such as propagation of light in anisotropic media and photonic crystals, or may be applied in fields such as scientific visualization (to understand ultra-fast processes), medicine (to reconstruct subsurface elements), material engineering (to analyze material properties), or quality control (to detect faults in structures). This could provide radically new challenges in the realm of computer graphics. Graphics research can enable new insights via comprehensible simulations and new data structures to render light in motion. For instance, relativistic rendering techniques have been developed using our data, where the common assumption of constant irradiance over the surfaces no longer holds [Jarabo et al. 2013]. It may also allow a better understanding of scattering, and may lead to new physically valid models, as well as spawn new art forms.

References

ABRAMSON, N. 1978. Light-in-flight recording by holography. Optics Letters 3, 4, 121–123.

BUSCK, J., AND HEISELBERG, H. 2004. Gated viewing and high-accuracy three-dimensional laser radar. Applied Optics 43, 24, 4705–4710.

CAMPILLO, A., AND SHAPIRO, S. 1987. Picosecond streak camera fluorometry: a review. IEEE Journal of Quantum Electronics 19, 4, 585–603.

CHARBON, E. 2007. Will avalanche photodiode arrays ever reach 1 megapixel? In International Image Sensor Workshop, 246–249.

COLAÇO, A., KIRMANI, A., HOWLAND, G. A., HOWELL, J. C., AND GOYAL, V. K. 2012. Compressive depth map acquisition using a single photon-counting detector: Parametric signal processing meets sparsity. In IEEE Computer Vision and Pattern Recognition, CVPR 2012, 96–102.

DUGUAY, M. A., AND MATTICK, A. T. 1971. Pulsed-image generation and detection. Applied Optics 10, 2162–2170.

FARO, 2012. Faro Technologies, Inc.: Measuring Arms. http://www.faro.com.

GBUR, G., 2012. A camera fast enough to watch light move. http://skullsinthestars.com/2012/01/04/a-camera-fast-enough-to-watch-light-move/.

GELBART, A., REDMAN, B. C., LIGHT, R. S., SCHWARTZLOW, C. A., AND GRIFFIS, A. J. 2002. Flash lidar based on multiple-slit streak tube imaging lidar. SPIE, vol. 4723, 9–18.

GODA, K., TSIA, K. K., AND JALALI, B. 2009. Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena. Nature 458, 1145–1149.

GUPTA, O., WILLWACHER, T., VELTEN, A., VEERARAGHAVAN, A., AND RASKAR, R. 2012. Reconstruction of hidden 3D shapes using diffuse reflections. Optics Express 20, 19096–19108.

HAMAMATSU, 2012. Guide to Streak Cameras. http://sales.hamamatsu.com/assets/pdf/catsandguides/e_streakh.pdf.

HEBDEN, J. C. 1993. Line scan acquisition for time-resolved imaging through scattering media. Opt. Eng. 32, 3, 626–633.

HEIDE, F., HULLIN, M., GREGSON, J., AND HEIDRICH, W. 2013. Low-budget transient imaging using photonic mixer devices. ACM Trans. Graph. 32, 4.

HUANG, D., SWANSON, E., LIN, C., SCHUMAN, J., STINSON, W., CHANG, W., HEE, M., FLOTTE, T., GREGORY, K., AND PULIAFITO, C. 1991. Optical coherence tomography. Science 254, 5035, 1178–1181.

ITATANI, J., QUÉRÉ, F., YUDIN, G. L., IVANOV, M. Y., KRAUSZ, F., AND CORKUM, P. B. 2002. Attosecond streak camera. Phys. Rev. Lett. 88, 173903.

JARABO, A., MASIA, B., AND GUTIERREZ, D. 2013. Transient rendering and relativistic visualization. Tech. Rep. TR-01-2013, Universidad de Zaragoza, April.

Page 11: Gaurav.report on femto photography

and Zhang 2009] light-in-flight holography [Abramson 1978]or white light interferometry [Wyant 2002] achieve femtosecondresolutions however they require light to maintain coherence(ie wave interference effects) during light transport and aretherefore unsuitable for indirect illumination in which diffusereflections remove coherence from the light Simple streak sensorscapture incoherent light at picosecond to nanosecond speeds butare limited to a line or low resolution (20 _ 20) square field ofview [Campillo and Shapiro 1987 Itatani et al 2002 Shiragaet al 1995 Gelbart et al 2002 Kodama et al 1999 Qu et al2006] They have also been used as line scanning devices forimage transmission through highly scattering turbid media byrecording the ballistic photons which travel a straight path throughthe scatterer and thus arrive first on the sensor [Hebden 1993]

Figure 3 Left Photograph of our ultrafast imaging system setup The DSLR camera takes a conventional photo for comparison RightTime sequence illustrating the arrival of the pulse striking a diffuser its transformation into a spherical energy front and its propagationthrough the scene The corresponding captured scene is shown in Figure 10 (top row)

principles that we develop in this paper for the purpose of transientimaging were first demonstrated by Velten et al [2012c] Recentlyphotonic mixer devices along with nonlinear optimization havealso been used in this context [Heide et al 2013]Our system can record and reconstruct space-time world informationof incoherent light propagation in free-space table-top scenesat a resolution of up to 672 _ 1000 pixels and under 2 picosecondsper frame The varied range and complexity of the scenes

we capture allow us to visualize the dynamics of global illuminationeffects such as scattering specular reflections interreflectionssubsurface scattering caustics and diffractionTime-Resolved Imaging Recent advances in time-resolvedimaging have been exploited to recover geometry and motionaround corners [Raskar and Davis 2008 Kirmani et al 2011 Veltenet al 2012b Velten et al 2012a Gupta et al 2012 Pandharkaret al 2011] and albedo of from single view point [Naik et al 2011]But none of them explored the idea of capturing videos of light inmotion in direct view and have some fundamental limitations (suchas capturing only third-bounce light) that make them unsuitable forthe present purpose Wu et al [2012a] separate direct and global illuminationcomponents from time-resolved data captured with thesystem we describe in this paper by analyzing the time profile ofeach pixel In a recent publication [Wu et al 2012b] the authorspresent an analysis on transient light transport in frequency spaceand show how it can be applied to bare-sensor imaging Chapter 3 Capturing Space-Time PlanesWe capture time scales orders of magnitude faster than the exposuretimes of conventional cameras in which photons reaching thesensor at different times are integrated into a single value makingit impossible to observe ultrafast optical phenomena The systemdescribed in this paper has an effective exposure time down to 185ps since light travels at 03 mmps light travels approximately 05mm between frames in our reconstructed moviesSystem An ultrafast setup must overcome several difficulties inorder to accurately measure a high-resolution (both in space and

time) image First for an unamplified laser pulse a single exposuretime of less than 2 ps would not collect enough light so the SNRwould be unworkably low As an example for a table-top sceneilluminated by a 100Wbulb only about 1 photon on average wouldreach the sensor during a 2 ps open-shutter period Second becauseof the time scales involved synchronization of the sensor and theillumination must be executed within picosecond precision Thirdstandalone streak sensors sacrifice the vertical spatial dimension inorder to code the time dimension thus producing x-t images Asa consequence their field of view is reduced to a single horizontalline of view of the scene We solve these problems with our ultrafast imaging system outlinedin Figure 2 (A photograph of the actual setup is shown inFigure 3 (left)) The light source is a femtosecond (fs) Kerr lensmode-locked TiSapphire laser which emits 50-fs with a centerwavelength of 795 nm at a repetition rate of 75 MHz and averagepower of 500 mW In order to see ultrafast events in a scene withmacro-scaled objects we focus the light with a lens onto a Lambertiandiffuser which then acts as a point light source and illuminatesthe entire scene with a spherically-shaped pulse (see Figure3 (right)) Alternatively if we want to observe pulse propagationitself rather than the interactions with large objects we direct thelaser beam across the field of view of the camera through a scatteringmedium (see the bottle scene in Figure 1)Because all the pulses are statistically identical we can record thescattered light from many of them and integrate the measurementsto average out any noise The result is a signal with a high SNR Tosynchronize this illumination with the streak sensor (HamamatsuC5680 [Hamamatsu 2012]) we split off a portion of the beam with

a glass slide and direct it onto a fast photodetector connected to thesensor so that now both detector and illumination operate synchronously(see Figure 2 (a))Capturing space-time planes The streak sensor then capturesan x-t image of a certain scanline (ie a line of pixels in the horizontaldimension) of the scene with a space-time resolution of672 _ 512 The exact time resolution depends on the amplificaamplificationof an internal sweep voltage signal applied to the streak sensorWith our hardware it can be adjusted from 030 ps to 507 ps Practicallywe choose the fastest resolution that still allows for captureof the entire duration of the event In the streak sensor a photocathodeconverts incoming photons arriving from each spatial locationin the scanline into electrons The streak sensor generates the x-timage by deflecting these electrons according to the time of theirarrival to different positions along the t-dimension of the sensor(see Figure 2(b) and 2(c)) This is achieved by means of rapidlychanging the sweep voltage between the electrodes in the sensorFor each horizontal scanline the camera records a scene illuminatedby the pulse and averages the light scattered by 45 _ 108

pulses (see Figure 2(d) and 2(e))Performance Validation To characterize the streak sensor wecompare sensor measurements with known geometry and verify thelinearity reproducibility and calibration of the time measurementsTo do this we first capture a streak image of a scanline of a simplescene a plane being illuminated by the laser after hitting the diffuser

(see Figure 4 (left)) Then by using a Faro digitizer arm [Faro2012] we obtain the ground truth geometry of the points along that

plane and of the point of the diffuser hit by the laser this allows usto compute the total travel time per path (diffuser-plane-streak sensor)for each pixel in the scanline We then compare the travel timecaptured by our streak sensor with the real travel time computed from the known geometry

Chapter 4 Capturing Space-Time VolumesAlthough the synchronized pulsed measurements overcome SNRissues the streak sensor still provides only a one-dimensionalmovie Extension to two dimensions requires unfeasible bandwidthsa typical dimension is roughly 103 pixels so a threedimensionaldata cube has 109 elements Recording such a largequantity in a 1010485769 second (1 ns) time widow requires a bandwidthof 1018 bytes far beyond typical available bandwidthsWe solve this acquisition problem by again utilizing the synchronizedrepeatability of the hardware A mirror-scanning system (two9 cm _ 13 cm mirrors see Figure 3 (left)) rotates the camerarsquos centerof projection so that it records horizontal slices of a scene sequentiallyWe use a computer-controlled one-rpm servo motor torotate one of the mirrors and consequently scan the field of viewvertically The scenes are about 25 cm wide and placed about 1

meter from the camera With high gear ratios (up to 11000) thecontinuous rotation of the mirror is slow enough to allow the camerato record each line for about six seconds requiring about onehour for 600 lines (our video resolution) We generally capture extralines above and below the scene (up to 1000 lines) and thencrop them to match the aspect ratio of the physical scenes beforethe movie was reconstructedThese resulting images are combined into one matrixMijk wherei = 1672 and k = 1512 are the dimensions of the individualx-t streak images and j = 11000 addresses the second spatialdimension y For a given time instant k the submatrix Nij containsa two-dimensional image of the scene with a resolution of 672 _1000 pixels exposed for as short to 185 ps Combining the x-tslices of the scene for each scanline yields a 3D x-y-t data volumeas shown in Figure 5 (left) An x-y slice represents one frame of the

final movie as shown in Figure 5 (right)

Figure 5 Left Reconstructed x-y-t data volume by stacking individualx-t images (captured with the scanning mirrors) Right Anx-y slice of the data cube represents one frame of the final movie

Chapter 5 Depicting Ultrafast Videos in 2DWe have explored several ways to visualize the information containedin the captured x-y-t data cube in an intuitive way Firstcontiguous Nij slices can be played as the frames of a movie Figure1 (bottom row) shows a captured scene (bottle) along with severalrepresentative Nij frames (Effects are described for variousscenes in Section 7) However understanding all the phenomenashown in a video is not a trivial task and movies composed of x-yframes such as the ones shown in Figure 10 may be hard to interpretMerging a static photograph of the scene from approximately thesame point of view with the Nij slices aids in the understanding oflight transport in the scenes (see movies within the supplementaryvideo) Although straightforward to implement the high dynamicrange of the streak data requires a nonlinear intensity transformationto extract subtle optical effects in the presence of high intensityreflections We employ a logarithmic transformation to this endWe have also explored single-image methods for intuitive visualization

of full space-time propagation such as the color-coding inFigure 1 (right) which we describe in the following paragraphsIntegral Photo Fusion By integrating all the frames in novelways we can visualize and highlight different aspects of the lightflow in one photo Our photo fusion results are calculated asNij =PwkMijk fk = 1512g where wk is a weighting factordetermined by the particular fusion method We have tested severaldifferent methods of which two were found to yield the most intuitiveresults the first one is full fusion where wk = 1 for all kSumming all frames of the movie provides something resemblinga black and white photograph of the scene illuminated by the laserwhile showing time-resolved light transport effects An exampleis shown in Figure 6 (left) for the alien scene (More informationabout the scene is given in Section 7) A second technique rainbowfusion takes the fusion result and assigns a different RGB color toeach frame effectively color-coding the temporal dimension Anexample is shown in Figure 6 (middle)Peak Time Images The inherent integration in fusion methodsthough often useful can fail to reveal the most complex or subtlebehavior of light As an alternative we propose peak time imageswhich illustrate the time evolution of the maximum intensity in eachframe For each spatial position (i j) in the x-y-t volume we findthe peak intensity along the time dimension and keep informationwithin two time units to each side of the peak All other values inthe streak image are set to zero yielding a more sparse space-timevolume We then color-code time and sum up the x-y frames inthis new sparse volume in the same manner as in the rainbow fusioncase but use only every 20th frame in the sum to create blacklines between the equi-time paths or isochrones This results in amap of the propagation of maximum intensity contours which we

term peak time image These color-coded isochronous lines can bethought of intuitively as propagating energy fronts Figure 6 (right)shows the peak time image for the alien scene and Figure 1 (topmiddle) shows the captured data for the bottle scene depicted usingthis visualization method As explained in the next section thisvisualization of the bottle scene reveals significant light transportphenomena that could not be seen with the rainbow fusion visualization

Figure 7 Understanding reversal of events in captured videosLeft Pulsed light scatters from a source strikes a surface (egat P1 and P2) and is then recorded by a sensor Time taken bylight to travel distances z1 + d1 and z2 + d2 is responsible for theexistence of two different time frames and the need of computationalcorrection to visualize the captured data in the world time frameRight Light appears to be propagating from P2 to P1 in cameratime (before unwarping) and from P1 to P2 in world time oncetime-unwarped Extended planar surfaces will intersect constanttimepaths to produce either elliptical or circular fronts

Chapter 6 Time UnwarpingVisualization of the captured movies (Sections 5 and 7) reveals resultsthat are counter-intuitive to theoretical and established knowledgeof light transport Figure 1 (top middle) shows a peak timevisualization of the bottle scene where several abnormal light transporteffects can be observed (1) the caustics on the floor whichpropagate towards the bottle instead of away from it (2) the curvedspherical energy fronts in the label area which should be rectilinearas seen from the camera and (3) the pulse itself being locatedbehind these energy fronts when it would need to precede themThese are due to the fact that usually light propagation is assumed

to be infinitely fast so that events in world space are assumed to bedetected simultaneously in camera space In our ultrafast photographysetup however this assumption no longer holds and the finitespeed of light becomes a factor we must now take into account thetime delay between the occurrence of an event and its detection bythe camera sensorWe therefore need to consider two different time frames namelyworld time (when events happen) and camera time (when events aredetected) This duality of time frames is explained in Figure 7 lightfrom a source hits a surface first at point P1 = (i1 j1) (with (i j)being the x-y pixel coordinates of a scene point in the x-y-t datacube) then at the farther point P2 = (i2 j2) but the reflected lightis captured in the reverse order by the sensor due to different totalpath lengths (z1 + d1 gt z2 + d2) Generally this is due to the factthat for light to arrive at a given time instant t0 all the ray(later-time) isochrone than farther oness fromthe source to the wall to the camera must satisfy zi+di = ct0 sothat isochrones are elliptical Therefore although objects closer tothe source receive light earlier they can still lie on a higher In order to visualize all light transport events as they have occurred(not as the camera captured them) we transform the captured datafrom camera time to world time a transformation which we termtime unwarping Mathematically for a scene point P = (i j) weapply the following transformationt0

ij = tij +zij

c=_(1)where t0

ij and tij represent camera and world times respectively

c is the speed of light in vacuum _ the index of refraction of themedium and zij is the distance from point P to the camera Forour table-top scenes we measure this distance with a Faro digitizerarm although it could be obtained from the data and the knownposition of the diffuser as the problem is analogous to that of bistaticLiDAR We can thus define light travel time from each point(i j) in the scene to the camera as _tij = t0

ij 1048576 tij = zij=(c=_)Then time unwarping effectively corresponds to offsetting data inthe x-y-t volume along the time dimension according to the valueof _tij for each of the (i j) points as shown in Figure 8In most of the scenes we only have propagation of light through airfor which we take _ _ 1 For the bottle scene we assume that thelaser pulse travels along its longitudinal axis at the speed of lightand that only a single scattering event occurs in the liquid insideWe take _ = 133 as the index of refraction of the liquid and ignorerefraction at the bottlersquos surface A step-by-step unwarping processis shown in Figure 9 for a frame (ie x-y image) of the bottle sceneOur unoptimized Matlab code runs at about 01 seconds per frameA time-unwarped peak-time visualization of the whole of this sceneis shown in Figure 1 (right) Notice how now the caustics originatefrom the bottle and propagate outward energy fronts along the labelare correctly depicted as straight lines and the pulse precedesrelated phenomena as expected

Chapter 7 Captured ScenesWe have used our ultrafast photography setup to capture interestinglight transport effects in different scenes Figure 10 summarizes

them showing representative frames and peak time visualizationsThe exposure time for our scenes is between 185 ps for the crystalscene and 507 ps for the bottle and tank scenes which requiredimaging a longer time span for better visualization Please refer tothe video in the supplementary material to watch the reconstructedmovies Overall observing light in such slow motion reveals bothsubtle and key aspects of light transport We provide here briefdescriptions of the light transport effects captured in the differentscenesBottle This scene is shown in Figure 1 (bottom row) and hasbeen used to introduce time-unwarping A plastic bottle filled withwater diluted with milk is directly illuminated by the laser pulseentering through the bottom of the bottle along its longitudinal axisThe pulse scatters inside the liquid we can see the propagation ofthe wavefronts The geometry of the bottle neck creates some interestinglens effects making light look almost like a fluid Most ofthe light is reflected back from the cap while some is transmitted ortrapped in subsurface scattering phenomena Caustics are generatedon the tableTomato-tape This scene shows a tomato and a tape roll with awall behind them The propagation of the spherical wavefront afterthe laser pulse hits the diffuser can be seen clearly as it intersectsthe floor and the back wall (A B) The inside of the tape roll is outof the line of sight of the light source and is not directly illuminated

It is illuminated later as indirect light scattered from the first wavereaches it (C) Shadows become visible only after the object hasbeen illuminated The more opaque tape darkens quickly after thelight front has passed while the tomato continues glowing for alonger time indicative of stronger subsurface scattering (D)Alien A toy alien is positioned in front of a mirror and wall Lightinteractions in this scene are extremely rich due to the mirror themultiple interreflections and the subsurface scattering in the toyThe video shows how the reflection in the mirror is actually formeddirect light first reaches the toy but the mirror is still completelydark (E) eventually light leaving the toy reaches the mirror andthe reflection is dynamically formed (F) Subsurface scattering isclearly present in the toy (G) while multiple direct and indirectinteractions between the wall and the mirror can also be seen (H)Crystal A group of sugar crystals is directly illuminated by thelaser from the left acting as multiple lenses and creating causticson the table (I) Part of the light refracted on the table is reflectedback to the candy creating secondary caustics on the table (J) Additionallyscattering events are visible within the crystals (K)

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8 Conclusions and Future WorkOur research fosters new computational imaging and image processingopportunities by providing incoherent time-resolved informationat ultrafast temporal resolutions We hope our workwill inspire new research in computer graphics and computationalphotography by enabling forward and inverse analysis of lighttransport allowing for full scene capture of hidden geometry andmaterials or for relighting photographs To this end capturedmovies and data of the scenes shown in this paper are available atfemtocamerainfo This exploitation in turn may influencethe rapidly emerging field of ultrafast imaging hardware

The system could be extended to image in color by adding additionalpulsed laser sources at different colors or by using one continuouslytunable optical parametric oscillator (OPO) A second colorof about 400 nm could easily be added to the existing system bydoubling the laser frequency with a nonlinear crystal (about $1000)The streak tube is sensitive across the entire visible spectrum witha peak sensitivity at about 450 nm (about five times the sensitivityat 800 nm) Scaling to bigger scenes would require less timeresolution and could therefore simplify the imaging setup Scalingshould be possible without signal degradation as long as the cameraaperture and lens are scaled with the rest of the setup If theaperture stays the same the light intensity needs to be increasedquadratically to obtain similar resultsBeyond the ability of the commercially available streak sensor advancesin optics material science and compressive sensing maybring further optimization of the system which could yield increasedresolution of the captured x-t streak images Nonlinearshutters may provide an alternate path to femto-photography capturesystems However nonlinear optical methods require exoticmaterials and strong light intensities that can damage the objects ofinterest (and must be provided by laser light) Further they oftensuffer from physical instabilities

We believe that mass production of streak sensors can lead to affordablesystems Also future designs may overcome the currentlimitations of our prototype regarding optical efficiency Futureresearch can investigate other ultrafast phenomena such as propagationof light in anisotropic media and photonic crystals or maybe used in applications such as scientific visualization (to understandultra-fast processes) medicine (to reconstruct subsurface elements)material engineering (to analyze material properties) orquality control (to detect faults in structures) This could provideradically new challenges in the realm of computer graphics Graphicsresearch can enable new insights via comprehensible simulationsand new data structures to render light in motion For instancerelativistic rendering techniques have been developed usingour data where the common assumption of constant irradiance overthe surfaces does no longer hold [Jarabo et al 2013] It may alsoallow a better understanding of scattering and may lead to new physically valid models as well as spawn new art forms

ReferencesABRAMSON N 1978 Light-in-flight recording by holographyOptics Letters 3 4 121ndash123BUSCK J AND HEISELBERG H 2004 Gated viewing and highaccuracythree-dimensional laser radar Applied optics 43 244705ndash4710CAMPILLO A AND SHAPIRO S 1987 Picosecond streak camerafluorometry a review IEEE Journal of Quantum Electronics19 4 585ndash603CHARBON E 2007 Will avalanche photodiode arrays ever reach 1megapixel In International Image Sensor Workshop 246ndash249COLACcedil O A KIRMANI A HOWLAND G A HOWELL J CAND GOYAL V K 2012 Compressive depth map acquisitionusing a single photon-counting detector Parametric signal processingmeets sparsity In IEEE Computer Vision and PatternRecognition CVPR 2012 96ndash102DUGUAY M A AND MATTICK A T 1971 Pulsed-image generationand detection Applied Optics 10 2162ndash2170FARO 2012 Faro Technologies Inc Measuring Arms httpwwwfarocomGBUR G 2012 A camera fast enough to watch lightmove httpskullsinthestarscom20120104a-camera-fast-enough-to-watch-light-moveGELBART A REDMAN B C LIGHT R S SCHWARTZLOWC A AND GRIFFIS A J 2002 Flash lidar based on multipleslitstreak tube imaging lidar SPIE vol 4723 9ndash18GODA K TSIA K K AND JALALI B 2009 Serial timeencodedamplified imaging for real-time observation of fast dynamicphenomena Nature 458 1145ndash1149GUPTA O WILLWACHER T VELTEN A VEERARAGHAVANA AND RASKAR R 2012 Reconstruction of hidden3D shapes using diffuse reflections Optics Express 20 19096ndash19108HAMAMATSU 2012 Guide to Streak Cameras httpsaleshamamatsucomassetspdfcatsandguidese_streakhpdfHEBDEN J C 1993 Line scan acquisition for time-resolved

imaging through scattering media Opt Eng 32 3 626ndash633HEIDE F HULLIN M GREGSON J AND HEIDRICH W2013 Low-budget transient imaging using photonic mixer devicesACM Trans Graph 32 4HUANG D SWANSON E LIN C SCHUMAN J STINSONW CHANG W HEE M FLOTTE T GREGORY K ANDPULIAFITO C 1991 Optical coherence tomography Science254 5035 1178ndash1181ITATANI J QUacuteE RacuteE F YUDIN G L IVANOV M Y KRAUSZF AND CORKUM P B 2002 Attosecond streak camera PhysRev Lett 88 173903JARABO A MASIA B AND GUTIERREZ D 2013 Transientrendering and relativistic visualization Tech Rep TR-01-2013Universidad de Zaragoza April

Page 12: Gaurav.report on femto photography

we capture allow us to visualize the dynamics of global illuminationeffects such as scattering specular reflections interreflectionssubsurface scattering caustics and diffractionTime-Resolved Imaging Recent advances in time-resolvedimaging have been exploited to recover geometry and motionaround corners [Raskar and Davis 2008 Kirmani et al 2011 Veltenet al 2012b Velten et al 2012a Gupta et al 2012 Pandharkaret al 2011] and albedo of from single view point [Naik et al 2011]But none of them explored the idea of capturing videos of light inmotion in direct view and have some fundamental limitations (suchas capturing only third-bounce light) that make them unsuitable forthe present purpose Wu et al [2012a] separate direct and global illuminationcomponents from time-resolved data captured with thesystem we describe in this paper by analyzing the time profile ofeach pixel In a recent publication [Wu et al 2012b] the authorspresent an analysis on transient light transport in frequency spaceand show how it can be applied to bare-sensor imaging Chapter 3 Capturing Space-Time PlanesWe capture time scales orders of magnitude faster than the exposuretimes of conventional cameras in which photons reaching thesensor at different times are integrated into a single value makingit impossible to observe ultrafast optical phenomena The systemdescribed in this paper has an effective exposure time down to 185ps since light travels at 03 mmps light travels approximately 05mm between frames in our reconstructed moviesSystem An ultrafast setup must overcome several difficulties inorder to accurately measure a high-resolution (both in space and

time) image First for an unamplified laser pulse a single exposuretime of less than 2 ps would not collect enough light so the SNRwould be unworkably low As an example for a table-top sceneilluminated by a 100Wbulb only about 1 photon on average wouldreach the sensor during a 2 ps open-shutter period Second becauseof the time scales involved synchronization of the sensor and theillumination must be executed within picosecond precision Thirdstandalone streak sensors sacrifice the vertical spatial dimension inorder to code the time dimension thus producing x-t images Asa consequence their field of view is reduced to a single horizontalline of view of the scene We solve these problems with our ultrafast imaging system outlinedin Figure 2 (A photograph of the actual setup is shown inFigure 3 (left)) The light source is a femtosecond (fs) Kerr lensmode-locked TiSapphire laser which emits 50-fs with a centerwavelength of 795 nm at a repetition rate of 75 MHz and averagepower of 500 mW In order to see ultrafast events in a scene withmacro-scaled objects we focus the light with a lens onto a Lambertiandiffuser which then acts as a point light source and illuminatesthe entire scene with a spherically-shaped pulse (see Figure3 (right)) Alternatively if we want to observe pulse propagationitself rather than the interactions with large objects we direct thelaser beam across the field of view of the camera through a scatteringmedium (see the bottle scene in Figure 1)Because all the pulses are statistically identical we can record thescattered light from many of them and integrate the measurementsto average out any noise The result is a signal with a high SNR Tosynchronize this illumination with the streak sensor (HamamatsuC5680 [Hamamatsu 2012]) we split off a portion of the beam with

We solve these problems with our ultrafast imaging system, outlined in Figure 2 (a photograph of the actual setup is shown in Figure 3 (left)). The light source is a femtosecond (fs) Kerr-lens mode-locked Ti:Sapphire laser, which emits 50 fs pulses with a center wavelength of 795 nm, at a repetition rate of 75 MHz and an average power of 500 mW. In order to see ultrafast events in a scene with macro-scaled objects, we focus the light with a lens onto a Lambertian diffuser, which then acts as a point light source and illuminates the entire scene with a spherically-shaped pulse (see Figure 3 (right)). Alternatively, if we want to observe pulse propagation itself, rather than its interactions with large objects, we direct the laser beam across the field of view of the camera through a scattering medium (see the bottle scene in Figure 1).

Because all the pulses are statistically identical, we can record the scattered light from many of them and integrate the measurements to average out any noise; the result is a signal with a high SNR. To synchronize this illumination with the streak sensor (Hamamatsu C5680 [Hamamatsu 2012]), we split off a portion of the beam with a glass slide and direct it onto a fast photodetector connected to the sensor, so that detector and illumination operate synchronously (see Figure 2 (a)).

3.2 Capturing space-time planes

The streak sensor captures an x-t image of a certain scanline (i.e., a line of pixels in the horizontal dimension) of the scene, with a space-time resolution of 672 × 512. The exact time resolution depends on the amplification of an internal sweep voltage signal applied to the streak sensor; with our hardware, it can be adjusted from 0.30 ps to 5.07 ps. In practice, we choose the fastest resolution that still allows capture of the entire duration of the event. In the streak sensor, a photocathode converts the incoming photons arriving from each spatial location in the scanline into electrons. The streak sensor generates the x-t image by deflecting these electrons, according to their time of arrival, to different positions along the t-dimension of the sensor (see Figures 2(b) and 2(c)); this is achieved by rapidly changing the sweep voltage between the electrodes in the sensor. For each horizontal scanline, the camera records the scene illuminated by the pulse and averages the light scattered by 4.5 × 10^8 pulses (see Figures 2(d) and 2(e)).
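To illustrate how the t-dimension of an x-t streak image is read out, the short sketch below maps time-bin indices to arrival times for a chosen sweep resolution; the linear-sweep model and the function name are our own simplification:

import numpy as np

def time_axis(n_bins=512, bin_resolution_ps=1.85, t0_ps=0.0):
    # Arrival time (ps) encoded by each row of an x-t streak image,
    # assuming a linear sweep: row k corresponds to t0 + k * resolution.
    return t0_ps + np.arange(n_bins) * bin_resolution_ps

t = time_axis()
print(t[0], t[-1])  # 0.0 ... ~945 ps: one image spans roughly a nanosecond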

3.3 Performance Validation

To characterize the streak sensor, we compare sensor measurements against known geometry, verifying the linearity, reproducibility, and calibration of the time measurements. To do this, we first capture a streak image of a scanline of a simple scene: a plane illuminated by the laser after the beam hits the diffuser (see Figure 4 (left)). Then, using a Faro digitizer arm [Faro 2012], we obtain the ground-truth geometry of the points along that plane and of the point of the diffuser hit by the laser; this allows us to compute the total travel time per path (diffuser to plane to streak sensor) for each pixel in the scanline. We then compare the travel time captured by our streak sensor with the real travel time computed from the known geometry.
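A minimal sketch of this validation computation, assuming the digitized 3D positions from the Faro arm are available as NumPy arrays (the names diffuser_pt, plane_pts, and camera_pt are hypothetical):

import numpy as np

C_MM_PER_PS = 0.3  # speed of light, ~0.3 mm/ps

def predicted_travel_time_ps(diffuser_pt, plane_pts, camera_pt):
    # Geometric travel time (ps) for each scanline pixel:
    # diffuser -> scene point on the plane -> streak camera.
    d1 = np.linalg.norm(plane_pts - diffuser_pt, axis=1)  # mm
    d2 = np.linalg.norm(plane_pts - camera_pt, axis=1)    # mm
    return (d1 + d2) / C_MM_PER_PS

# Validation: residual = measured_arrival_ps - predicted_travel_time_ps(...).
# A constant offset (trigger delay) is expected; any non-linearity would
# indicate a mis-calibrated sweep.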

Chapter 4: Capturing Space-Time Volumes

Although the synchronized, pulsed measurements overcome the SNR issues, the streak sensor still provides only a one-dimensional movie. Extending the capture to two spatial dimensions requires unfeasible bandwidths: a typical dimension is roughly 10^3 pixels, so a three-dimensional data cube has 10^9 elements. Recording such a large quantity in a 10^-9 second (1 ns) time window requires a bandwidth of 10^18 bytes per second, far beyond what is typically available.

We solve this acquisition problem by again exploiting the synchronized repeatability of the hardware. A mirror-scanning system (two 9 cm × 13 cm mirrors, see Figure 3 (left)) rotates the camera's center of projection, so that it records horizontal slices of the scene sequentially. We use a computer-controlled, one-rpm servo motor to rotate one of the mirrors and thus scan the field of view vertically. The scenes are about 25 cm wide and placed about one meter from the camera. With high gear ratios (up to 1:1000), the continuous rotation of the mirror is slow enough to let the camera record each line for about six seconds, requiring about one hour for 600 lines (our video resolution). We generally capture extra lines above and below the scene (up to 1000 lines) and then crop them to match the aspect ratio of the physical scene before the movie is reconstructed.

The resulting images are combined into one matrix M_ijk, where i = 1...672 and k = 1...512 are the dimensions of the individual x-t streak images, and j = 1...1000 addresses the second spatial dimension y. For a given time instant k, the submatrix N_ij contains a two-dimensional image of the scene with a resolution of 672 × 1000 pixels, exposed for as little as 1.85 ps. Combining the x-t slices of the scene for each scanline yields a 3D x-y-t data volume, as shown in Figure 5 (left); an x-y slice represents one frame of the final movie, as shown in Figure 5 (right).

Figure 5: Left: the reconstructed x-y-t data volume, built by stacking individual x-t images (captured with the scanning mirrors). Right: an x-y slice of the data cube represents one frame of the final movie.
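Assembling the data volume is straightforward to express in code. Below is a minimal NumPy sketch, assuming the 1000 captured x-t streak images are available as 672 × 512 arrays; the load_streak_image helper and its file naming are hypothetical:

import numpy as np

N_X, N_Y, N_T = 672, 1000, 512  # scanline pixels, scanlines, time bins

def load_streak_image(j):
    # Hypothetical loader: returns the 672 x 512 x-t image of scanline j.
    return np.load(f"streak_{j:04d}.npy")

# Stack the x-t slices along the y-axis into the x-y-t volume M[i, j, k].
M = np.stack([load_streak_image(j) for j in range(N_Y)], axis=1)
assert M.shape == (N_X, N_Y, N_T)

# One frame of the final movie is the x-y slice at a fixed time bin k,
# effectively a 1.85 ps exposure of the whole scene.
frame = M[:, :, 256]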

Chapter 5: Depicting Ultrafast Videos in 2D

We have explored several ways to visualize the information contained in the captured x-y-t data cube in an intuitive way. First, contiguous N_ij slices can be played as the frames of a movie; Figure 1 (bottom row) shows a captured scene (bottle) along with several representative N_ij frames. (The effects are described for the various scenes in Section 7.) However, understanding all the phenomena shown in a video is not a trivial task, and movies composed of x-y frames, such as the ones shown in Figure 10, may be hard to interpret. Merging a static photograph of the scene, taken from approximately the same point of view, with the N_ij slices aids the understanding of light transport in the scenes (see the movies within the supplementary video). Although straightforward to implement, the high dynamic range of the streak data requires a nonlinear intensity transformation to extract subtle optical effects in the presence of high-intensity reflections; we employ a logarithmic transformation to this end. We have also explored single-image methods for intuitive visualization of the full space-time propagation, such as the color coding in Figure 1 (right), which we describe in the following paragraphs.

Integral Photo Fusion. By integrating all the frames in novel ways, we can visualize and highlight different aspects of the light flow in one photo. Our photo fusion results are calculated as N_ij = Σ_k w_k M_ijk, with k = 1...512, where w_k is a weighting factor determined by the particular fusion method. We have tested several different methods, of which two were found to yield the most intuitive results. The first one is full fusion, where w_k = 1 for all k: summing all frames of the movie provides something resembling a black-and-white photograph of the scene illuminated by the laser, while still showing time-resolved light transport effects. An example is shown in Figure 6 (left) for the alien scene (more information about the scene is given in Section 7). A second technique, rainbow fusion, takes the fusion result and assigns a different RGB color to each frame, effectively color-coding the temporal dimension; an example is shown in Figure 6 (middle).

Peak Time Images. The inherent integration in fusion methods, though often useful, can fail to reveal the most complex or subtle behavior of light. As an alternative, we propose peak time images, which illustrate the time evolution of the maximum intensity in each frame. For each spatial position (i, j) in the x-y-t volume, we find the peak intensity along the time dimension and keep only the information within two time units to each side of the peak; all other values in the streak image are set to zero, yielding a sparser space-time volume. We then color-code time and sum up the x-y frames of this new sparse volume, in the same manner as in the rainbow fusion case, but using only every 20th frame in the sum, to create black lines between the equi-time paths, or isochrones. The result is a map of the propagation of maximum-intensity contours, which we term a peak time image. These color-coded isochronous lines can be thought of, intuitively, as propagating energy fronts. Figure 6 (right) shows the peak time image for the alien scene, and Figure 1 (top middle) shows the captured data for the bottle scene depicted using this visualization method. As explained in the next section, this visualization of the bottle scene reveals significant light transport phenomena that could not be seen with the rainbow fusion visualization.
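The fusion and peak-time visualizations reduce the x-y-t cube M of the previous chapter to single 2D images. The sketch below follows the definitions above, assuming M is a NumPy array of shape (672, 1000, 512); the particular colormap is our choice, while the two-bin window and 20-frame stride are taken from the text:

import numpy as np
from matplotlib import cm

def full_fusion(M):
    # w_k = 1 for all k: summing over time resembles a B/W photograph.
    return M.sum(axis=2)

def rainbow_fusion(M):
    # Weight each time bin by an RGB color before summing, so the
    # temporal dimension is color-coded in the resulting image.
    colors = cm.jet(np.linspace(0, 1, M.shape[2]))[:, :3]   # (t, 3)
    return np.tensordot(M, colors, axes=([2], [0]))         # (x, y, 3)

def peak_time_image(M, window=2, stride=20):
    # Keep only +/- `window` time bins around each pixel's intensity peak,
    # zero the rest, then rainbow-fuse every `stride`-th frame so that
    # black gaps appear between the isochrones.
    k_peak = M.argmax(axis=2)[..., None]          # (x, y, 1)
    k = np.arange(M.shape[2])[None, None, :]      # (1, 1, t)
    sparse = np.where(np.abs(k - k_peak) <= window, M, 0.0)
    return rainbow_fusion(sparse[:, :, ::stride])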

Figure 7: Understanding the reversal of events in captured videos. Left: pulsed light scatters from a source, strikes a surface (e.g., at P1 and P2), and is then recorded by a sensor. The time taken by light to travel the distances z1 + d1 and z2 + d2 is responsible for the existence of two different time frames, and for the need of a computational correction to visualize the captured data in the world time frame. Right: light appears to propagate from P2 to P1 in camera time (before unwarping), and from P1 to P2 in world time, once time-unwarped. Extended planar surfaces will intersect constant-time paths to produce either elliptical or circular fronts.

Chapter 6: Time Unwarping

Visualization of the captured movies (Sections 5 and 7) reveals results that seem counter-intuitive with respect to established knowledge of light transport. Figure 1 (top middle) shows a peak time visualization of the bottle scene, where several abnormal light transport effects can be observed: (1) the caustics on the floor propagate towards the bottle, instead of away from it; (2) the spherical energy fronts in the label area are curved, although they should be rectilinear as seen from the camera; and (3) the pulse itself is located behind these energy fronts, when it would need to precede them. These effects arise because light propagation is usually assumed to be infinitely fast, so that events in world space are taken to be detected simultaneously in camera space. In our ultrafast photography setup, however, this assumption no longer holds, and the finite speed of light becomes a factor: we must take into account the time delay between the occurrence of an event and its detection by the camera sensor.

We therefore need to consider two different time frames, namely world time (when events happen) and camera time (when events are detected). This duality of time frames is explained in Figure 7: light from a source hits a surface first at point P1 = (i1, j1) (with (i, j) being the x-y pixel coordinates of a scene point in the x-y-t data cube), then at the farther point P2 = (i2, j2); but the reflected light is captured in the reverse order by the sensor, due to the different total path lengths (z1 + d1 > z2 + d2). In general, this is because, for light to arrive at a given time instant t0, all rays from the source, to the wall, to the camera, must satisfy z_i + d_i = c t0, so that isochrones are elliptical. Therefore, although objects closer to the source receive light earlier, they can still lie on a later-time isochrone than farther ones.

In order to visualize all light transport events as they occurred (not as the camera captured them), we transform the captured data from camera time to world time, a transformation which we term time unwarping. Mathematically, for a scene point P = (i, j), we apply the following transformation:

t'_ij = t_ij + z_ij / (c/η)        (1)

where t'_ij and t_ij represent camera and world times, respectively, c is the speed of light in vacuum, η is the index of refraction of the medium, and z_ij is the distance from point P to the camera. For our table-top scenes, we measure this distance with a Faro digitizer arm, although it could also be obtained from the data and the known position of the diffuser, since the problem is analogous to that of bistatic LiDAR. We can thus define the light travel time from each point (i, j) in the scene to the camera as Δt_ij = t'_ij − t_ij = z_ij / (c/η). Time unwarping then effectively corresponds to offsetting the data in the x-y-t volume along the time dimension, according to the value of Δt_ij for each of the (i, j) points, as shown in Figure 8.

In most of the scenes we only have propagation of light through air, for which we take η ≈ 1. For the bottle scene, we assume that the laser pulse travels along the bottle's longitudinal axis at the speed of light, and that only a single scattering event occurs in the liquid inside; we take η = 1.33 as the index of refraction of the liquid and ignore refraction at the bottle's surface. A step-by-step unwarping process is shown in Figure 9 for one frame (i.e., one x-y image) of the bottle scene. Our unoptimized Matlab code runs at about 0.1 seconds per frame. A time-unwarped peak-time visualization of the whole scene is shown in Figure 1 (right): notice how the caustics now originate from the bottle and propagate outward, the energy fronts along the label are correctly depicted as straight lines, and the pulse precedes the related phenomena, as expected.
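A minimal sketch of the unwarping step of Equation (1), assuming the per-pixel distances z_ij (in mm) are available; we round the shift to whole 1.85 ps bins for simplicity, whereas a real implementation might interpolate:

import numpy as np

C_MM_PER_PS = 0.3  # speed of light in vacuum, ~0.3 mm/ps

def time_unwarp(M, z_mm, eta=1.0, bin_ps=1.85):
    # Shift each pixel's time profile from camera time to world time:
    # a sample recorded at camera bin k belongs to world bin k - shift,
    # with shift = (z / (c/eta)) / bin width, following Eq. (1).
    delta_t_ps = z_mm / (C_MM_PER_PS / eta)
    shift = np.rint(delta_t_ps / bin_ps).astype(int)   # (x, y) bins
    out = np.zeros_like(M)
    nx, ny, nt = M.shape
    for i in range(nx):
        for j in range(ny):
            s = shift[i, j]
            if 0 < s < nt:
                out[i, j, :nt - s] = M[i, j, s:]
            elif s <= 0:
                out[i, j] = M[i, j]
    return out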

Chapter 7: Captured Scenes

We have used our ultrafast photography setup to capture interesting light transport effects in different scenes; Figure 10 summarizes them, showing representative frames and peak time visualizations. The exposure time for our scenes ranges between 1.85 ps, for the crystal scene, and 5.07 ps, for the bottle and tank scenes, which required imaging a longer time span for proper visualization. Please refer to the video in the supplementary material to watch the reconstructed movies. Overall, observing light in such slow motion reveals both subtle and key aspects of light transport; we provide here brief descriptions of the effects captured in the different scenes.

Bottle. This scene is shown in Figure 1 (bottom row) and has been used to introduce time unwarping. A plastic bottle, filled with water diluted with milk, is directly illuminated by the laser pulse, which enters through the bottom of the bottle along its longitudinal axis. The pulse scatters inside the liquid, and we can see the propagation of the wavefronts. The geometry of the bottle neck creates some interesting lens effects, making light look almost like a fluid. Most of the light is reflected back from the cap, while some is transmitted or trapped in subsurface scattering phenomena. Caustics are generated on the table.

Tomato-tape. This scene shows a tomato and a tape roll, with a wall behind them. The propagation of the spherical wavefront after the laser pulse hits the diffuser can be seen clearly as it intersects the floor and the back wall (A, B). The inside of the tape roll is out of the line of sight of the light source and is not directly illuminated; it is lit later, as indirect light scattered from the first wave reaches it (C). Shadows become visible only after the object has been illuminated. The more opaque tape darkens quickly after the light front has passed, while the tomato keeps glowing for a longer time, indicative of stronger subsurface scattering (D).

Alien. A toy alien is positioned in front of a mirror and a wall. Light interactions in this scene are extremely rich, due to the mirror, the multiple interreflections, and the subsurface scattering in the toy. The video shows how the reflection in the mirror is actually formed: direct light first reaches the toy while the mirror is still completely dark (E); eventually, light leaving the toy reaches the mirror, and the reflection is dynamically formed (F). Subsurface scattering is clearly present in the toy (G), while multiple direct and indirect interactions between the wall and the mirror can also be seen (H).

Crystal. A group of sugar crystals is directly illuminated by the laser from the left; they act as multiple lenses and create caustics on the table (I). Part of the light refracted at the table is reflected back to the candy, creating secondary caustics on the table (J). Additionally, scattering events are visible within the crystals (K).

Tank. A reflective grating is placed at the right side of a tank filled with milk diluted in water. The grating, taken from a commercial spectrometer, consists of an array of small, equally spaced rectangular mirrors. The grating is blazed: the mirrors are tilted to concentrate maximum optical power in the first-order diffraction for one wavelength. The pulse enters the scene from the left, travels through the tank (L), and strikes the grating. The grating reflects and diffracts the pulse (M); the different diffraction orders are visible traveling back through the tank (N). As the figure (and the supplementary movie) shows, most of the light reflected from the grating propagates at the blaze angle.

Chapter 8: Conclusions and Future Work

Our research fosters new computational imaging and image-processing opportunities by providing incoherent time-resolved information at ultrafast temporal resolutions. We hope our work will inspire new research in computer graphics and computational photography, by enabling forward and inverse analysis of light transport, allowing for full scene capture of hidden geometry and materials, or for relighting photographs. To this end, the captured movies and data of the scenes shown in this paper are available at femtocamera.info. This, in turn, may influence the rapidly emerging field of ultrafast imaging hardware.

The system could be extended to image in color by adding additional pulsed laser sources at different colors, or by using one continuously tunable optical parametric oscillator (OPO). A second color of about 400 nm could easily be added to the existing system by doubling the laser frequency with a nonlinear crystal (about $1000). The streak tube is sensitive across the entire visible spectrum, with a peak sensitivity at about 450 nm (about five times the sensitivity at 800 nm). Scaling to bigger scenes would require less time resolution and could therefore simplify the imaging setup. Scaling should be possible without signal degradation, as long as the camera aperture and lens are scaled with the rest of the setup; if the aperture stays the same, the light intensity needs to be increased quadratically with scale to obtain similar results.

Beyond the capabilities of the commercially available streak sensor, advances in optics, material science, and compressive sensing may bring further optimization of the system, which could yield increased resolution of the captured x-t streak images. Nonlinear shutters may provide an alternative path to femto-photography capture systems. However, nonlinear optical methods require exotic materials and strong light intensities that can damage the objects of interest (and must be provided by laser light); further, they often suffer from physical instabilities.

We believe that mass production of streak sensors can lead to affordable systems, and future designs may overcome the current limitations of our prototype regarding optical efficiency. Future research can investigate other ultrafast phenomena, such as the propagation of light in anisotropic media and photonic crystals, or may be applied in areas such as scientific visualization (to understand ultrafast processes), medicine (to reconstruct subsurface elements), material engineering (to analyze material properties), or quality control (to detect faults in structures). This could provide radically new challenges in the realm of computer graphics. Graphics research can enable new insights via comprehensible simulations and new data structures to render light in motion; for instance, relativistic rendering techniques have been developed using our data, where the common assumption of constant irradiance over the surfaces no longer holds [Jarabo et al. 2013]. It may also allow a better understanding of scattering, and may lead to new physically valid models, as well as spawn new art forms.

Chapter 9: References

ABRAMSON, N. 1978. Light-in-flight recording by holography. Optics Letters 3, 4, 121–123.
BUSCK, J., AND HEISELBERG, H. 2004. Gated viewing and high-accuracy three-dimensional laser radar. Applied Optics 43, 24, 4705–4710.
CAMPILLO, A., AND SHAPIRO, S. 1983. Picosecond streak camera fluorometry: a review. IEEE Journal of Quantum Electronics 19, 4, 585–603.
CHARBON, E. 2007. Will avalanche photodiode arrays ever reach 1 megapixel? In International Image Sensor Workshop, 246–249.
COLAÇO, A., KIRMANI, A., HOWLAND, G. A., HOWELL, J. C., AND GOYAL, V. K. 2012. Compressive depth map acquisition using a single photon-counting detector: Parametric signal processing meets sparsity. In IEEE Computer Vision and Pattern Recognition (CVPR 2012), 96–102.
DUGUAY, M. A., AND MATTICK, A. T. 1971. Pulsed-image generation and detection. Applied Optics 10, 2162–2170.
FARO, 2012. Faro Technologies Inc.: Measuring Arms. http://www.faro.com.
GBUR, G., 2012. A camera fast enough to watch light move? http://skullsinthestars.com/2012/01/04/a-camera-fast-enough-to-watch-light-move/.
GELBART, A., REDMAN, B. C., LIGHT, R. S., SCHWARTZLOW, C. A., AND GRIFFIS, A. J. 2002. Flash lidar based on multiple-slit streak tube imaging lidar. In SPIE, vol. 4723, 9–18.
GODA, K., TSIA, K. K., AND JALALI, B. 2009. Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena. Nature 458, 1145–1149.
GUPTA, O., WILLWACHER, T., VELTEN, A., VEERARAGHAVAN, A., AND RASKAR, R. 2012. Reconstruction of hidden 3D shapes using diffuse reflections. Optics Express 20, 19096–19108.
HAMAMATSU, 2012. Guide to Streak Cameras. http://sales.hamamatsu.com/assets/pdf/catsandguides/e_streakh.pdf.
HEBDEN, J. C. 1993. Line scan acquisition for time-resolved imaging through scattering media. Opt. Eng. 32, 3, 626–633.
HEIDE, F., HULLIN, M., GREGSON, J., AND HEIDRICH, W. 2013. Low-budget transient imaging using photonic mixer devices. ACM Trans. Graph. 32, 4.
HUANG, D., SWANSON, E., LIN, C., SCHUMAN, J., STINSON, W., CHANG, W., HEE, M., FLOTTE, T., GREGORY, K., AND PULIAFITO, C. 1991. Optical coherence tomography. Science 254, 5035, 1178–1181.
ITATANI, J., QUÉRÉ, F., YUDIN, G. L., IVANOV, M. Y., KRAUSZ, F., AND CORKUM, P. B. 2002. Attosecond streak camera. Phys. Rev. Lett. 88, 173903.
JARABO, A., MASIA, B., AND GUTIERREZ, D. 2013. Transient rendering and relativistic visualization. Tech. Rep. TR-01-2013, Universidad de Zaragoza, April.

Page 13: Gaurav.report on femto photography

time) image First for an unamplified laser pulse a single exposuretime of less than 2 ps would not collect enough light so the SNRwould be unworkably low As an example for a table-top sceneilluminated by a 100Wbulb only about 1 photon on average wouldreach the sensor during a 2 ps open-shutter period Second becauseof the time scales involved synchronization of the sensor and theillumination must be executed within picosecond precision Thirdstandalone streak sensors sacrifice the vertical spatial dimension inorder to code the time dimension thus producing x-t images Asa consequence their field of view is reduced to a single horizontalline of view of the scene We solve these problems with our ultrafast imaging system outlinedin Figure 2 (A photograph of the actual setup is shown inFigure 3 (left)) The light source is a femtosecond (fs) Kerr lensmode-locked TiSapphire laser which emits 50-fs with a centerwavelength of 795 nm at a repetition rate of 75 MHz and averagepower of 500 mW In order to see ultrafast events in a scene withmacro-scaled objects we focus the light with a lens onto a Lambertiandiffuser which then acts as a point light source and illuminatesthe entire scene with a spherically-shaped pulse (see Figure3 (right)) Alternatively if we want to observe pulse propagationitself rather than the interactions with large objects we direct thelaser beam across the field of view of the camera through a scatteringmedium (see the bottle scene in Figure 1)Because all the pulses are statistically identical we can record thescattered light from many of them and integrate the measurementsto average out any noise The result is a signal with a high SNR Tosynchronize this illumination with the streak sensor (HamamatsuC5680 [Hamamatsu 2012]) we split off a portion of the beam with

a glass slide and direct it onto a fast photodetector connected to thesensor so that now both detector and illumination operate synchronously(see Figure 2 (a))Capturing space-time planes The streak sensor then capturesan x-t image of a certain scanline (ie a line of pixels in the horizontaldimension) of the scene with a space-time resolution of672 _ 512 The exact time resolution depends on the amplificaamplificationof an internal sweep voltage signal applied to the streak sensorWith our hardware it can be adjusted from 030 ps to 507 ps Practicallywe choose the fastest resolution that still allows for captureof the entire duration of the event In the streak sensor a photocathodeconverts incoming photons arriving from each spatial locationin the scanline into electrons The streak sensor generates the x-timage by deflecting these electrons according to the time of theirarrival to different positions along the t-dimension of the sensor(see Figure 2(b) and 2(c)) This is achieved by means of rapidlychanging the sweep voltage between the electrodes in the sensorFor each horizontal scanline the camera records a scene illuminatedby the pulse and averages the light scattered by 45 _ 108

pulses (see Figure 2(d) and 2(e))Performance Validation To characterize the streak sensor wecompare sensor measurements with known geometry and verify thelinearity reproducibility and calibration of the time measurementsTo do this we first capture a streak image of a scanline of a simplescene a plane being illuminated by the laser after hitting the diffuser

(see Figure 4 (left)) Then by using a Faro digitizer arm [Faro2012] we obtain the ground truth geometry of the points along that

plane and of the point of the diffuser hit by the laser this allows usto compute the total travel time per path (diffuser-plane-streak sensor)for each pixel in the scanline We then compare the travel timecaptured by our streak sensor with the real travel time computed from the known geometry

Chapter 4 Capturing Space-Time VolumesAlthough the synchronized pulsed measurements overcome SNRissues the streak sensor still provides only a one-dimensionalmovie Extension to two dimensions requires unfeasible bandwidthsa typical dimension is roughly 103 pixels so a threedimensionaldata cube has 109 elements Recording such a largequantity in a 1010485769 second (1 ns) time widow requires a bandwidthof 1018 bytes far beyond typical available bandwidthsWe solve this acquisition problem by again utilizing the synchronizedrepeatability of the hardware A mirror-scanning system (two9 cm _ 13 cm mirrors see Figure 3 (left)) rotates the camerarsquos centerof projection so that it records horizontal slices of a scene sequentiallyWe use a computer-controlled one-rpm servo motor torotate one of the mirrors and consequently scan the field of viewvertically The scenes are about 25 cm wide and placed about 1

meter from the camera With high gear ratios (up to 11000) thecontinuous rotation of the mirror is slow enough to allow the camerato record each line for about six seconds requiring about onehour for 600 lines (our video resolution) We generally capture extralines above and below the scene (up to 1000 lines) and thencrop them to match the aspect ratio of the physical scenes beforethe movie was reconstructedThese resulting images are combined into one matrixMijk wherei = 1672 and k = 1512 are the dimensions of the individualx-t streak images and j = 11000 addresses the second spatialdimension y For a given time instant k the submatrix Nij containsa two-dimensional image of the scene with a resolution of 672 _1000 pixels exposed for as short to 185 ps Combining the x-tslices of the scene for each scanline yields a 3D x-y-t data volumeas shown in Figure 5 (left) An x-y slice represents one frame of the

final movie as shown in Figure 5 (right)

Figure 5 Left Reconstructed x-y-t data volume by stacking individualx-t images (captured with the scanning mirrors) Right Anx-y slice of the data cube represents one frame of the final movie

Chapter 5 Depicting Ultrafast Videos in 2DWe have explored several ways to visualize the information containedin the captured x-y-t data cube in an intuitive way Firstcontiguous Nij slices can be played as the frames of a movie Figure1 (bottom row) shows a captured scene (bottle) along with severalrepresentative Nij frames (Effects are described for variousscenes in Section 7) However understanding all the phenomenashown in a video is not a trivial task and movies composed of x-yframes such as the ones shown in Figure 10 may be hard to interpretMerging a static photograph of the scene from approximately thesame point of view with the Nij slices aids in the understanding oflight transport in the scenes (see movies within the supplementaryvideo) Although straightforward to implement the high dynamicrange of the streak data requires a nonlinear intensity transformationto extract subtle optical effects in the presence of high intensityreflections We employ a logarithmic transformation to this endWe have also explored single-image methods for intuitive visualization

of full space-time propagation such as the color-coding inFigure 1 (right) which we describe in the following paragraphsIntegral Photo Fusion By integrating all the frames in novelways we can visualize and highlight different aspects of the lightflow in one photo Our photo fusion results are calculated asNij =PwkMijk fk = 1512g where wk is a weighting factordetermined by the particular fusion method We have tested severaldifferent methods of which two were found to yield the most intuitiveresults the first one is full fusion where wk = 1 for all kSumming all frames of the movie provides something resemblinga black and white photograph of the scene illuminated by the laserwhile showing time-resolved light transport effects An exampleis shown in Figure 6 (left) for the alien scene (More informationabout the scene is given in Section 7) A second technique rainbowfusion takes the fusion result and assigns a different RGB color toeach frame effectively color-coding the temporal dimension Anexample is shown in Figure 6 (middle)Peak Time Images The inherent integration in fusion methodsthough often useful can fail to reveal the most complex or subtlebehavior of light As an alternative we propose peak time imageswhich illustrate the time evolution of the maximum intensity in eachframe For each spatial position (i j) in the x-y-t volume we findthe peak intensity along the time dimension and keep informationwithin two time units to each side of the peak All other values inthe streak image are set to zero yielding a more sparse space-timevolume We then color-code time and sum up the x-y frames inthis new sparse volume in the same manner as in the rainbow fusioncase but use only every 20th frame in the sum to create blacklines between the equi-time paths or isochrones This results in amap of the propagation of maximum intensity contours which we

term peak time image These color-coded isochronous lines can bethought of intuitively as propagating energy fronts Figure 6 (right)shows the peak time image for the alien scene and Figure 1 (topmiddle) shows the captured data for the bottle scene depicted usingthis visualization method As explained in the next section thisvisualization of the bottle scene reveals significant light transportphenomena that could not be seen with the rainbow fusion visualization

Figure 7 Understanding reversal of events in captured videosLeft Pulsed light scatters from a source strikes a surface (egat P1 and P2) and is then recorded by a sensor Time taken bylight to travel distances z1 + d1 and z2 + d2 is responsible for theexistence of two different time frames and the need of computationalcorrection to visualize the captured data in the world time frameRight Light appears to be propagating from P2 to P1 in cameratime (before unwarping) and from P1 to P2 in world time oncetime-unwarped Extended planar surfaces will intersect constanttimepaths to produce either elliptical or circular fronts

Chapter 6 Time UnwarpingVisualization of the captured movies (Sections 5 and 7) reveals resultsthat are counter-intuitive to theoretical and established knowledgeof light transport Figure 1 (top middle) shows a peak timevisualization of the bottle scene where several abnormal light transporteffects can be observed (1) the caustics on the floor whichpropagate towards the bottle instead of away from it (2) the curvedspherical energy fronts in the label area which should be rectilinearas seen from the camera and (3) the pulse itself being locatedbehind these energy fronts when it would need to precede themThese are due to the fact that usually light propagation is assumed

to be infinitely fast so that events in world space are assumed to bedetected simultaneously in camera space In our ultrafast photographysetup however this assumption no longer holds and the finitespeed of light becomes a factor we must now take into account thetime delay between the occurrence of an event and its detection bythe camera sensorWe therefore need to consider two different time frames namelyworld time (when events happen) and camera time (when events aredetected) This duality of time frames is explained in Figure 7 lightfrom a source hits a surface first at point P1 = (i1 j1) (with (i j)being the x-y pixel coordinates of a scene point in the x-y-t datacube) then at the farther point P2 = (i2 j2) but the reflected lightis captured in the reverse order by the sensor due to different totalpath lengths (z1 + d1 gt z2 + d2) Generally this is due to the factthat for light to arrive at a given time instant t0 all the ray(later-time) isochrone than farther oness fromthe source to the wall to the camera must satisfy zi+di = ct0 sothat isochrones are elliptical Therefore although objects closer tothe source receive light earlier they can still lie on a higher In order to visualize all light transport events as they have occurred(not as the camera captured them) we transform the captured datafrom camera time to world time a transformation which we termtime unwarping Mathematically for a scene point P = (i j) weapply the following transformationt0

ij = tij +zij

c=_(1)where t0

ij and tij represent camera and world times respectively

c is the speed of light in vacuum _ the index of refraction of themedium and zij is the distance from point P to the camera Forour table-top scenes we measure this distance with a Faro digitizerarm although it could be obtained from the data and the knownposition of the diffuser as the problem is analogous to that of bistaticLiDAR We can thus define light travel time from each point(i j) in the scene to the camera as _tij = t0

ij 1048576 tij = zij=(c=_)Then time unwarping effectively corresponds to offsetting data inthe x-y-t volume along the time dimension according to the valueof _tij for each of the (i j) points as shown in Figure 8In most of the scenes we only have propagation of light through airfor which we take _ _ 1 For the bottle scene we assume that thelaser pulse travels along its longitudinal axis at the speed of lightand that only a single scattering event occurs in the liquid insideWe take _ = 133 as the index of refraction of the liquid and ignorerefraction at the bottlersquos surface A step-by-step unwarping processis shown in Figure 9 for a frame (ie x-y image) of the bottle sceneOur unoptimized Matlab code runs at about 01 seconds per frameA time-unwarped peak-time visualization of the whole of this sceneis shown in Figure 1 (right) Notice how now the caustics originatefrom the bottle and propagate outward energy fronts along the labelare correctly depicted as straight lines and the pulse precedesrelated phenomena as expected

Chapter 7 Captured ScenesWe have used our ultrafast photography setup to capture interestinglight transport effects in different scenes Figure 10 summarizes

them showing representative frames and peak time visualizationsThe exposure time for our scenes is between 185 ps for the crystalscene and 507 ps for the bottle and tank scenes which requiredimaging a longer time span for better visualization Please refer tothe video in the supplementary material to watch the reconstructedmovies Overall observing light in such slow motion reveals bothsubtle and key aspects of light transport We provide here briefdescriptions of the light transport effects captured in the differentscenesBottle This scene is shown in Figure 1 (bottom row) and hasbeen used to introduce time-unwarping A plastic bottle filled withwater diluted with milk is directly illuminated by the laser pulseentering through the bottom of the bottle along its longitudinal axisThe pulse scatters inside the liquid we can see the propagation ofthe wavefronts The geometry of the bottle neck creates some interestinglens effects making light look almost like a fluid Most ofthe light is reflected back from the cap while some is transmitted ortrapped in subsurface scattering phenomena Caustics are generatedon the tableTomato-tape This scene shows a tomato and a tape roll with awall behind them The propagation of the spherical wavefront afterthe laser pulse hits the diffuser can be seen clearly as it intersectsthe floor and the back wall (A B) The inside of the tape roll is outof the line of sight of the light source and is not directly illuminated

It is illuminated later as indirect light scattered from the first wavereaches it (C) Shadows become visible only after the object hasbeen illuminated The more opaque tape darkens quickly after thelight front has passed while the tomato continues glowing for alonger time indicative of stronger subsurface scattering (D)Alien A toy alien is positioned in front of a mirror and wall Lightinteractions in this scene are extremely rich due to the mirror themultiple interreflections and the subsurface scattering in the toyThe video shows how the reflection in the mirror is actually formeddirect light first reaches the toy but the mirror is still completelydark (E) eventually light leaving the toy reaches the mirror andthe reflection is dynamically formed (F) Subsurface scattering isclearly present in the toy (G) while multiple direct and indirectinteractions between the wall and the mirror can also be seen (H)Crystal A group of sugar crystals is directly illuminated by thelaser from the left acting as multiple lenses and creating causticson the table (I) Part of the light refracted on the table is reflectedback to the candy creating secondary caustics on the table (J) Additionallyscattering events are visible within the crystals (K)

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8 Conclusions and Future WorkOur research fosters new computational imaging and image processingopportunities by providing incoherent time-resolved informationat ultrafast temporal resolutions We hope our workwill inspire new research in computer graphics and computationalphotography by enabling forward and inverse analysis of lighttransport allowing for full scene capture of hidden geometry andmaterials or for relighting photographs To this end capturedmovies and data of the scenes shown in this paper are available atfemtocamerainfo This exploitation in turn may influencethe rapidly emerging field of ultrafast imaging hardware

The system could be extended to image in color by adding additionalpulsed laser sources at different colors or by using one continuouslytunable optical parametric oscillator (OPO) A second colorof about 400 nm could easily be added to the existing system bydoubling the laser frequency with a nonlinear crystal (about $1000)The streak tube is sensitive across the entire visible spectrum witha peak sensitivity at about 450 nm (about five times the sensitivityat 800 nm) Scaling to bigger scenes would require less timeresolution and could therefore simplify the imaging setup Scalingshould be possible without signal degradation as long as the cameraaperture and lens are scaled with the rest of the setup If theaperture stays the same the light intensity needs to be increasedquadratically to obtain similar resultsBeyond the ability of the commercially available streak sensor advancesin optics material science and compressive sensing maybring further optimization of the system which could yield increasedresolution of the captured x-t streak images Nonlinearshutters may provide an alternate path to femto-photography capturesystems However nonlinear optical methods require exoticmaterials and strong light intensities that can damage the objects ofinterest (and must be provided by laser light) Further they oftensuffer from physical instabilities

We believe that mass production of streak sensors can lead to affordablesystems Also future designs may overcome the currentlimitations of our prototype regarding optical efficiency Futureresearch can investigate other ultrafast phenomena such as propagationof light in anisotropic media and photonic crystals or maybe used in applications such as scientific visualization (to understandultra-fast processes) medicine (to reconstruct subsurface elements)material engineering (to analyze material properties) orquality control (to detect faults in structures) This could provideradically new challenges in the realm of computer graphics Graphicsresearch can enable new insights via comprehensible simulationsand new data structures to render light in motion For instancerelativistic rendering techniques have been developed usingour data where the common assumption of constant irradiance overthe surfaces does no longer hold [Jarabo et al 2013] It may alsoallow a better understanding of scattering and may lead to new physically valid models as well as spawn new art forms

ReferencesABRAMSON N 1978 Light-in-flight recording by holographyOptics Letters 3 4 121ndash123BUSCK J AND HEISELBERG H 2004 Gated viewing and highaccuracythree-dimensional laser radar Applied optics 43 244705ndash4710CAMPILLO A AND SHAPIRO S 1987 Picosecond streak camerafluorometry a review IEEE Journal of Quantum Electronics19 4 585ndash603CHARBON E 2007 Will avalanche photodiode arrays ever reach 1megapixel In International Image Sensor Workshop 246ndash249COLACcedil O A KIRMANI A HOWLAND G A HOWELL J CAND GOYAL V K 2012 Compressive depth map acquisitionusing a single photon-counting detector Parametric signal processingmeets sparsity In IEEE Computer Vision and PatternRecognition CVPR 2012 96ndash102DUGUAY M A AND MATTICK A T 1971 Pulsed-image generationand detection Applied Optics 10 2162ndash2170FARO 2012 Faro Technologies Inc Measuring Arms httpwwwfarocomGBUR G 2012 A camera fast enough to watch lightmove httpskullsinthestarscom20120104a-camera-fast-enough-to-watch-light-moveGELBART A REDMAN B C LIGHT R S SCHWARTZLOWC A AND GRIFFIS A J 2002 Flash lidar based on multipleslitstreak tube imaging lidar SPIE vol 4723 9ndash18GODA K TSIA K K AND JALALI B 2009 Serial timeencodedamplified imaging for real-time observation of fast dynamicphenomena Nature 458 1145ndash1149GUPTA O WILLWACHER T VELTEN A VEERARAGHAVANA AND RASKAR R 2012 Reconstruction of hidden3D shapes using diffuse reflections Optics Express 20 19096ndash19108HAMAMATSU 2012 Guide to Streak Cameras httpsaleshamamatsucomassetspdfcatsandguidese_streakhpdfHEBDEN J C 1993 Line scan acquisition for time-resolved

imaging through scattering media Opt Eng 32 3 626ndash633HEIDE F HULLIN M GREGSON J AND HEIDRICH W2013 Low-budget transient imaging using photonic mixer devicesACM Trans Graph 32 4HUANG D SWANSON E LIN C SCHUMAN J STINSONW CHANG W HEE M FLOTTE T GREGORY K ANDPULIAFITO C 1991 Optical coherence tomography Science254 5035 1178ndash1181ITATANI J QUacuteE RacuteE F YUDIN G L IVANOV M Y KRAUSZF AND CORKUM P B 2002 Attosecond streak camera PhysRev Lett 88 173903JARABO A MASIA B AND GUTIERREZ D 2013 Transientrendering and relativistic visualization Tech Rep TR-01-2013Universidad de Zaragoza April

Page 14: Gaurav.report on femto photography

a glass slide and direct it onto a fast photodetector connected to thesensor so that now both detector and illumination operate synchronously(see Figure 2 (a))Capturing space-time planes The streak sensor then capturesan x-t image of a certain scanline (ie a line of pixels in the horizontaldimension) of the scene with a space-time resolution of672 _ 512 The exact time resolution depends on the amplificaamplificationof an internal sweep voltage signal applied to the streak sensorWith our hardware it can be adjusted from 030 ps to 507 ps Practicallywe choose the fastest resolution that still allows for captureof the entire duration of the event In the streak sensor a photocathodeconverts incoming photons arriving from each spatial locationin the scanline into electrons The streak sensor generates the x-timage by deflecting these electrons according to the time of theirarrival to different positions along the t-dimension of the sensor(see Figure 2(b) and 2(c)) This is achieved by means of rapidlychanging the sweep voltage between the electrodes in the sensorFor each horizontal scanline the camera records a scene illuminatedby the pulse and averages the light scattered by 45 _ 108

pulses (see Figure 2(d) and 2(e))Performance Validation To characterize the streak sensor wecompare sensor measurements with known geometry and verify thelinearity reproducibility and calibration of the time measurementsTo do this we first capture a streak image of a scanline of a simplescene a plane being illuminated by the laser after hitting the diffuser

(see Figure 4 (left)) Then by using a Faro digitizer arm [Faro2012] we obtain the ground truth geometry of the points along that

plane and of the point of the diffuser hit by the laser this allows usto compute the total travel time per path (diffuser-plane-streak sensor)for each pixel in the scanline We then compare the travel timecaptured by our streak sensor with the real travel time computed from the known geometry

Chapter 4 Capturing Space-Time VolumesAlthough the synchronized pulsed measurements overcome SNRissues the streak sensor still provides only a one-dimensionalmovie Extension to two dimensions requires unfeasible bandwidthsa typical dimension is roughly 103 pixels so a threedimensionaldata cube has 109 elements Recording such a largequantity in a 1010485769 second (1 ns) time widow requires a bandwidthof 1018 bytes far beyond typical available bandwidthsWe solve this acquisition problem by again utilizing the synchronizedrepeatability of the hardware A mirror-scanning system (two9 cm _ 13 cm mirrors see Figure 3 (left)) rotates the camerarsquos centerof projection so that it records horizontal slices of a scene sequentiallyWe use a computer-controlled one-rpm servo motor torotate one of the mirrors and consequently scan the field of viewvertically The scenes are about 25 cm wide and placed about 1

meter from the camera With high gear ratios (up to 11000) thecontinuous rotation of the mirror is slow enough to allow the camerato record each line for about six seconds requiring about onehour for 600 lines (our video resolution) We generally capture extralines above and below the scene (up to 1000 lines) and thencrop them to match the aspect ratio of the physical scenes beforethe movie was reconstructedThese resulting images are combined into one matrixMijk wherei = 1672 and k = 1512 are the dimensions of the individualx-t streak images and j = 11000 addresses the second spatialdimension y For a given time instant k the submatrix Nij containsa two-dimensional image of the scene with a resolution of 672 _1000 pixels exposed for as short to 185 ps Combining the x-tslices of the scene for each scanline yields a 3D x-y-t data volumeas shown in Figure 5 (left) An x-y slice represents one frame of the

final movie as shown in Figure 5 (right)

Figure 5 Left Reconstructed x-y-t data volume by stacking individualx-t images (captured with the scanning mirrors) Right Anx-y slice of the data cube represents one frame of the final movie

Chapter 5 Depicting Ultrafast Videos in 2DWe have explored several ways to visualize the information containedin the captured x-y-t data cube in an intuitive way Firstcontiguous Nij slices can be played as the frames of a movie Figure1 (bottom row) shows a captured scene (bottle) along with severalrepresentative Nij frames (Effects are described for variousscenes in Section 7) However understanding all the phenomenashown in a video is not a trivial task and movies composed of x-yframes such as the ones shown in Figure 10 may be hard to interpretMerging a static photograph of the scene from approximately thesame point of view with the Nij slices aids in the understanding oflight transport in the scenes (see movies within the supplementaryvideo) Although straightforward to implement the high dynamicrange of the streak data requires a nonlinear intensity transformationto extract subtle optical effects in the presence of high intensityreflections We employ a logarithmic transformation to this endWe have also explored single-image methods for intuitive visualization

of full space-time propagation such as the color-coding inFigure 1 (right) which we describe in the following paragraphsIntegral Photo Fusion By integrating all the frames in novelways we can visualize and highlight different aspects of the lightflow in one photo Our photo fusion results are calculated asNij =PwkMijk fk = 1512g where wk is a weighting factordetermined by the particular fusion method We have tested severaldifferent methods of which two were found to yield the most intuitiveresults the first one is full fusion where wk = 1 for all kSumming all frames of the movie provides something resemblinga black and white photograph of the scene illuminated by the laserwhile showing time-resolved light transport effects An exampleis shown in Figure 6 (left) for the alien scene (More informationabout the scene is given in Section 7) A second technique rainbowfusion takes the fusion result and assigns a different RGB color toeach frame effectively color-coding the temporal dimension Anexample is shown in Figure 6 (middle)Peak Time Images The inherent integration in fusion methodsthough often useful can fail to reveal the most complex or subtlebehavior of light As an alternative we propose peak time imageswhich illustrate the time evolution of the maximum intensity in eachframe For each spatial position (i j) in the x-y-t volume we findthe peak intensity along the time dimension and keep informationwithin two time units to each side of the peak All other values inthe streak image are set to zero yielding a more sparse space-timevolume We then color-code time and sum up the x-y frames inthis new sparse volume in the same manner as in the rainbow fusioncase but use only every 20th frame in the sum to create blacklines between the equi-time paths or isochrones This results in amap of the propagation of maximum intensity contours which we

term peak time image These color-coded isochronous lines can bethought of intuitively as propagating energy fronts Figure 6 (right)shows the peak time image for the alien scene and Figure 1 (topmiddle) shows the captured data for the bottle scene depicted usingthis visualization method As explained in the next section thisvisualization of the bottle scene reveals significant light transportphenomena that could not be seen with the rainbow fusion visualization

Figure 7 Understanding reversal of events in captured videosLeft Pulsed light scatters from a source strikes a surface (egat P1 and P2) and is then recorded by a sensor Time taken bylight to travel distances z1 + d1 and z2 + d2 is responsible for theexistence of two different time frames and the need of computationalcorrection to visualize the captured data in the world time frameRight Light appears to be propagating from P2 to P1 in cameratime (before unwarping) and from P1 to P2 in world time oncetime-unwarped Extended planar surfaces will intersect constanttimepaths to produce either elliptical or circular fronts

Chapter 6 Time UnwarpingVisualization of the captured movies (Sections 5 and 7) reveals resultsthat are counter-intuitive to theoretical and established knowledgeof light transport Figure 1 (top middle) shows a peak timevisualization of the bottle scene where several abnormal light transporteffects can be observed (1) the caustics on the floor whichpropagate towards the bottle instead of away from it (2) the curvedspherical energy fronts in the label area which should be rectilinearas seen from the camera and (3) the pulse itself being locatedbehind these energy fronts when it would need to precede themThese are due to the fact that usually light propagation is assumed

to be infinitely fast so that events in world space are assumed to bedetected simultaneously in camera space In our ultrafast photographysetup however this assumption no longer holds and the finitespeed of light becomes a factor we must now take into account thetime delay between the occurrence of an event and its detection bythe camera sensorWe therefore need to consider two different time frames namelyworld time (when events happen) and camera time (when events aredetected) This duality of time frames is explained in Figure 7 lightfrom a source hits a surface first at point P1 = (i1 j1) (with (i j)being the x-y pixel coordinates of a scene point in the x-y-t datacube) then at the farther point P2 = (i2 j2) but the reflected lightis captured in the reverse order by the sensor due to different totalpath lengths (z1 + d1 gt z2 + d2) Generally this is due to the factthat for light to arrive at a given time instant t0 all the ray(later-time) isochrone than farther oness fromthe source to the wall to the camera must satisfy zi+di = ct0 sothat isochrones are elliptical Therefore although objects closer tothe source receive light earlier they can still lie on a higher In order to visualize all light transport events as they have occurred(not as the camera captured them) we transform the captured datafrom camera time to world time a transformation which we termtime unwarping Mathematically for a scene point P = (i j) weapply the following transformationt0

ij = tij +zij

c=_(1)where t0

ij and tij represent camera and world times respectively

c is the speed of light in vacuum _ the index of refraction of themedium and zij is the distance from point P to the camera Forour table-top scenes we measure this distance with a Faro digitizerarm although it could be obtained from the data and the knownposition of the diffuser as the problem is analogous to that of bistaticLiDAR We can thus define light travel time from each point(i j) in the scene to the camera as _tij = t0

ij 1048576 tij = zij=(c=_)Then time unwarping effectively corresponds to offsetting data inthe x-y-t volume along the time dimension according to the valueof _tij for each of the (i j) points as shown in Figure 8In most of the scenes we only have propagation of light through airfor which we take _ _ 1 For the bottle scene we assume that thelaser pulse travels along its longitudinal axis at the speed of lightand that only a single scattering event occurs in the liquid insideWe take _ = 133 as the index of refraction of the liquid and ignorerefraction at the bottlersquos surface A step-by-step unwarping processis shown in Figure 9 for a frame (ie x-y image) of the bottle sceneOur unoptimized Matlab code runs at about 01 seconds per frameA time-unwarped peak-time visualization of the whole of this sceneis shown in Figure 1 (right) Notice how now the caustics originatefrom the bottle and propagate outward energy fronts along the labelare correctly depicted as straight lines and the pulse precedesrelated phenomena as expected

Chapter 7: Captured Scenes

We have used our ultrafast photography setup to capture interesting light transport effects in different scenes. Figure 10 summarizes them, showing representative frames and peak time visualizations. The exposure time for our scenes is between 1.85 ps for the crystal scene and 5.07 ps for the bottle and tank scenes, which required imaging a longer time span for better visualization. Please refer to the video in the supplementary material to watch the reconstructed movies. Overall, observing light in such slow motion reveals both subtle and key aspects of light transport. We provide here brief descriptions of the light transport effects captured in the different scenes.

Bottle: This scene is shown in Figure 1 (bottom row) and has been used to introduce time unwarping. A plastic bottle, filled with water diluted with milk, is directly illuminated by the laser pulse entering through the bottom of the bottle along its longitudinal axis. The pulse scatters inside the liquid, and we can see the propagation of the wavefronts. The geometry of the bottle neck creates some interesting lens effects, making light look almost like a fluid. Most of the light is reflected back from the cap, while some is transmitted or trapped in subsurface scattering phenomena. Caustics are generated on the table.

Tomato-tape: This scene shows a tomato and a tape roll, with a wall behind them. The propagation of the spherical wavefront after the laser pulse hits the diffuser can be seen clearly as it intersects the floor and the back wall (A, B). The inside of the tape roll is out of the line of sight of the light source and is not directly illuminated. It is illuminated later, as indirect light scattered from the first wave reaches it (C). Shadows become visible only after the object has been illuminated. The more opaque tape darkens quickly after the light front has passed, while the tomato continues glowing for a longer time, indicative of stronger subsurface scattering (D).

Alien: A toy alien is positioned in front of a mirror and wall. Light interactions in this scene are extremely rich, due to the mirror, the multiple interreflections, and the subsurface scattering in the toy. The video shows how the reflection in the mirror is actually formed: direct light first reaches the toy while the mirror is still completely dark (E); eventually, light leaving the toy reaches the mirror, and the reflection is dynamically formed (F). Subsurface scattering is clearly present in the toy (G), while multiple direct and indirect interactions between the wall and the mirror can also be seen (H).

Crystal: A group of sugar crystals is directly illuminated by the laser from the left, acting as multiple lenses and creating caustics on the table (I). Part of the light refracted onto the table is reflected back to the candy, creating secondary caustics on the table (J). Additionally, scattering events are visible within the crystals (K).

Tank: A reflective grating is placed at the right side of a tank filled with milk diluted in water. The grating is taken from a commercial spectrometer and consists of an array of small, equally spaced rectangular mirrors. The grating is blazed: the mirrors are tilted to concentrate maximum optical power in the first-order diffraction for one wavelength. The pulse enters the scene from the left, travels through the tank (L), and strikes the grating. The grating reflects and diffracts the beam pulse (M). The different diffraction orders are visible traveling back through the tank (N). As the figure (and the supplementary movie) shows, most of the light reflected from the grating propagates at the blaze angle.
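The direction of each diffraction order in the tank scene follows the standard reflection-grating equation. Below is a small illustrative sketch; the groove pitch, wavelength, and incidence angle are assumed example values, not the specifications of the grating we imaged.

```python
import numpy as np

def diffraction_angles(wavelength, pitch, theta_i_deg, orders=range(-2, 3)):
    """Angles (degrees) of the propagating orders of a reflection grating.

    Uses the grating equation d * (sin(theta_m) - sin(theta_i)) = m * lambda,
    with angles measured from the grating normal.
    """
    theta_i = np.deg2rad(theta_i_deg)
    angles = {}
    for m in orders:
        s = m * wavelength / pitch + np.sin(theta_i)
        if abs(s) <= 1.0:  # orders with |sin| > 1 are evanescent
            angles[m] = float(np.degrees(np.arcsin(s)))
    return angles

# Purely illustrative numbers (not the tank grating's specs): an 800 nm
# pulse on a 1200 grooves/mm grating at normal incidence.
print(diffraction_angles(800e-9, 1e-3 / 1200, 0.0))
# -> {-1: -73.7, 0: 0.0, 1: 73.7}; orders +/-2 are evanescent.
```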

Chapter 8: Conclusions and Future Work

Our research fosters new computational imaging and image-processing opportunities by providing incoherent, time-resolved information at ultrafast temporal resolutions. We hope our work will inspire new research in computer graphics and computational photography by enabling forward and inverse analysis of light transport, allowing for full scene capture of hidden geometry and materials, or for relighting photographs. To this end, captured movies and data of the scenes shown in this paper are available at femtocamera.info. This, in turn, may influence the rapidly emerging field of ultrafast imaging hardware.

The system could be extended to image in color by adding additional pulsed laser sources at different colors, or by using one continuously tunable optical parametric oscillator (OPO). A second color of about 400 nm could easily be added to the existing system by doubling the laser frequency with a nonlinear crystal (about $1,000). The streak tube is sensitive across the entire visible spectrum, with a peak sensitivity at about 450 nm (about five times the sensitivity at 800 nm). Scaling to bigger scenes would require less time resolution and could therefore simplify the imaging setup. Scaling should be possible without signal degradation as long as the camera aperture and lens are scaled with the rest of the setup. If the aperture stays the same, the light intensity needs to be increased quadratically with the scene scale to obtain similar results, since the irradiance reaching a fixed aperture falls off with the square of distance.

Beyond the ability of the commercially available streak sensor, advances in optics, material science, and compressive sensing may bring further optimization of the system, which could yield increased resolution of the captured x-t streak images. Nonlinear shutters may provide an alternate path to femto-photography capture systems. However, nonlinear optical methods require exotic materials and strong light intensities that can damage the objects of interest (and must be provided by laser light); further, they often suffer from physical instabilities.
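As a quick numeric check of the quadratic scaling argument above, the required source power for a fixed-aperture setup can be computed for a given scale factor. This is a minimal illustrative sketch; the function name and the 1 W baseline are assumptions, not measurements of our prototype.

```python
def required_power(base_power_w, scale, scale_aperture=True):
    """Source power needed when all scene distances are scaled by `scale`.

    If the aperture and lens scale with the setup, the collected light is
    preserved and the power is unchanged; with a fixed aperture, the
    irradiance reaching it falls off as 1/scale**2, so the source power
    must grow quadratically to obtain similar results.
    """
    return base_power_w if scale_aperture else base_power_w * scale ** 2

# Illustrative only: tripling the scene size with the original aperture
# calls for roughly nine times the laser power.
print(required_power(1.0, 3.0, scale_aperture=False))  # 9.0
```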

We believe that mass production of streak sensors can lead to affordable systems, and future designs may overcome the current limitations of our prototype regarding optical efficiency. Future research can investigate other ultrafast phenomena, such as the propagation of light in anisotropic media and photonic crystals, or may be applied in fields such as scientific visualization (to understand ultrafast processes), medicine (to reconstruct subsurface elements), material engineering (to analyze material properties), or quality control (to detect faults in structures). This could provide radically new challenges in the realm of computer graphics. Graphics research can enable new insights via comprehensible simulations and new data structures to render light in motion. For instance, relativistic rendering techniques have been developed using our data, where the common assumption of constant irradiance over surfaces no longer holds [Jarabo et al. 2013]. It may also allow a better understanding of scattering, and may lead to new physically valid models as well as spawn new art forms.

References

ABRAMSON, N. 1978. Light-in-flight recording by holography. Optics Letters 3, 4, 121–123.

BUSCK, J., AND HEISELBERG, H. 2004. Gated viewing and high-accuracy three-dimensional laser radar. Applied Optics 43, 24, 4705–4710.

CAMPILLO, A., AND SHAPIRO, S. 1987. Picosecond streak camera fluorometry: a review. IEEE Journal of Quantum Electronics 19, 4, 585–603.

CHARBON, E. 2007. Will avalanche photodiode arrays ever reach 1 megapixel? In International Image Sensor Workshop, 246–249.

COLAÇO, A., KIRMANI, A., HOWLAND, G. A., HOWELL, J. C., AND GOYAL, V. K. 2012. Compressive depth map acquisition using a single photon-counting detector: Parametric signal processing meets sparsity. In IEEE Computer Vision and Pattern Recognition, CVPR 2012, 96–102.

DUGUAY, M. A., AND MATTICK, A. T. 1971. Pulsed-image generation and detection. Applied Optics 10, 2162–2170.

FARO, 2012. Faro Technologies Inc.: Measuring Arms. http://www.faro.com.

GBUR, G., 2012. A camera fast enough to watch light move? http://skullsinthestars.com/2012/01/04/a-camera-fast-enough-to-watch-light-move.

GELBART, A., REDMAN, B. C., LIGHT, R. S., SCHWARTZLOW, C. A., AND GRIFFIS, A. J. 2002. Flash lidar based on multiple-slit streak tube imaging lidar. In SPIE, vol. 4723, 9–18.

GODA, K., TSIA, K. K., AND JALALI, B. 2009. Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena. Nature 458, 1145–1149.

GUPTA, O., WILLWACHER, T., VELTEN, A., VEERARAGHAVAN, A., AND RASKAR, R. 2012. Reconstruction of hidden 3D shapes using diffuse reflections. Optics Express 20, 19096–19108.

HAMAMATSU, 2012. Guide to Streak Cameras. http://sales.hamamatsu.com/assets/pdf/catsandguides/e_streakh.pdf.

HEBDEN, J. C. 1993. Line scan acquisition for time-resolved imaging through scattering media. Opt. Eng. 32, 3, 626–633.

HEIDE, F., HULLIN, M., GREGSON, J., AND HEIDRICH, W. 2013. Low-budget transient imaging using photonic mixer devices. ACM Trans. Graph. 32, 4.

HUANG, D., SWANSON, E., LIN, C., SCHUMAN, J., STINSON, W., CHANG, W., HEE, M., FLOTTE, T., GREGORY, K., AND PULIAFITO, C. 1991. Optical coherence tomography. Science 254, 5035, 1178–1181.

ITATANI, J., QUÉRÉ, F., YUDIN, G. L., IVANOV, M. Y., KRAUSZ, F., AND CORKUM, P. B. 2002. Attosecond streak camera. Phys. Rev. Lett. 88, 173903.

JARABO, A., MASIA, B., AND GUTIERREZ, D. 2013. Transient rendering and relativistic visualization. Tech. Rep. TR-01-2013, Universidad de Zaragoza, April.

meter from the camera With high gear ratios (up to 11000) thecontinuous rotation of the mirror is slow enough to allow the camerato record each line for about six seconds requiring about onehour for 600 lines (our video resolution) We generally capture extralines above and below the scene (up to 1000 lines) and thencrop them to match the aspect ratio of the physical scenes beforethe movie was reconstructedThese resulting images are combined into one matrixMijk wherei = 1672 and k = 1512 are the dimensions of the individualx-t streak images and j = 11000 addresses the second spatialdimension y For a given time instant k the submatrix Nij containsa two-dimensional image of the scene with a resolution of 672 _1000 pixels exposed for as short to 185 ps Combining the x-tslices of the scene for each scanline yields a 3D x-y-t data volumeas shown in Figure 5 (left) An x-y slice represents one frame of the

final movie as shown in Figure 5 (right)

Figure 5 Left Reconstructed x-y-t data volume by stacking individualx-t images (captured with the scanning mirrors) Right Anx-y slice of the data cube represents one frame of the final movie

Chapter 5 Depicting Ultrafast Videos in 2DWe have explored several ways to visualize the information containedin the captured x-y-t data cube in an intuitive way Firstcontiguous Nij slices can be played as the frames of a movie Figure1 (bottom row) shows a captured scene (bottle) along with severalrepresentative Nij frames (Effects are described for variousscenes in Section 7) However understanding all the phenomenashown in a video is not a trivial task and movies composed of x-yframes such as the ones shown in Figure 10 may be hard to interpretMerging a static photograph of the scene from approximately thesame point of view with the Nij slices aids in the understanding oflight transport in the scenes (see movies within the supplementaryvideo) Although straightforward to implement the high dynamicrange of the streak data requires a nonlinear intensity transformationto extract subtle optical effects in the presence of high intensityreflections We employ a logarithmic transformation to this endWe have also explored single-image methods for intuitive visualization

of full space-time propagation such as the color-coding inFigure 1 (right) which we describe in the following paragraphsIntegral Photo Fusion By integrating all the frames in novelways we can visualize and highlight different aspects of the lightflow in one photo Our photo fusion results are calculated asNij =PwkMijk fk = 1512g where wk is a weighting factordetermined by the particular fusion method We have tested severaldifferent methods of which two were found to yield the most intuitiveresults the first one is full fusion where wk = 1 for all kSumming all frames of the movie provides something resemblinga black and white photograph of the scene illuminated by the laserwhile showing time-resolved light transport effects An exampleis shown in Figure 6 (left) for the alien scene (More informationabout the scene is given in Section 7) A second technique rainbowfusion takes the fusion result and assigns a different RGB color toeach frame effectively color-coding the temporal dimension Anexample is shown in Figure 6 (middle)Peak Time Images The inherent integration in fusion methodsthough often useful can fail to reveal the most complex or subtlebehavior of light As an alternative we propose peak time imageswhich illustrate the time evolution of the maximum intensity in eachframe For each spatial position (i j) in the x-y-t volume we findthe peak intensity along the time dimension and keep informationwithin two time units to each side of the peak All other values inthe streak image are set to zero yielding a more sparse space-timevolume We then color-code time and sum up the x-y frames inthis new sparse volume in the same manner as in the rainbow fusioncase but use only every 20th frame in the sum to create blacklines between the equi-time paths or isochrones This results in amap of the propagation of maximum intensity contours which we

term peak time image These color-coded isochronous lines can bethought of intuitively as propagating energy fronts Figure 6 (right)shows the peak time image for the alien scene and Figure 1 (topmiddle) shows the captured data for the bottle scene depicted usingthis visualization method As explained in the next section thisvisualization of the bottle scene reveals significant light transportphenomena that could not be seen with the rainbow fusion visualization

Figure 7 Understanding reversal of events in captured videosLeft Pulsed light scatters from a source strikes a surface (egat P1 and P2) and is then recorded by a sensor Time taken bylight to travel distances z1 + d1 and z2 + d2 is responsible for theexistence of two different time frames and the need of computationalcorrection to visualize the captured data in the world time frameRight Light appears to be propagating from P2 to P1 in cameratime (before unwarping) and from P1 to P2 in world time oncetime-unwarped Extended planar surfaces will intersect constanttimepaths to produce either elliptical or circular fronts

Chapter 6 Time UnwarpingVisualization of the captured movies (Sections 5 and 7) reveals resultsthat are counter-intuitive to theoretical and established knowledgeof light transport Figure 1 (top middle) shows a peak timevisualization of the bottle scene where several abnormal light transporteffects can be observed (1) the caustics on the floor whichpropagate towards the bottle instead of away from it (2) the curvedspherical energy fronts in the label area which should be rectilinearas seen from the camera and (3) the pulse itself being locatedbehind these energy fronts when it would need to precede themThese are due to the fact that usually light propagation is assumed

to be infinitely fast so that events in world space are assumed to bedetected simultaneously in camera space In our ultrafast photographysetup however this assumption no longer holds and the finitespeed of light becomes a factor we must now take into account thetime delay between the occurrence of an event and its detection bythe camera sensorWe therefore need to consider two different time frames namelyworld time (when events happen) and camera time (when events aredetected) This duality of time frames is explained in Figure 7 lightfrom a source hits a surface first at point P1 = (i1 j1) (with (i j)being the x-y pixel coordinates of a scene point in the x-y-t datacube) then at the farther point P2 = (i2 j2) but the reflected lightis captured in the reverse order by the sensor due to different totalpath lengths (z1 + d1 gt z2 + d2) Generally this is due to the factthat for light to arrive at a given time instant t0 all the ray(later-time) isochrone than farther oness fromthe source to the wall to the camera must satisfy zi+di = ct0 sothat isochrones are elliptical Therefore although objects closer tothe source receive light earlier they can still lie on a higher In order to visualize all light transport events as they have occurred(not as the camera captured them) we transform the captured datafrom camera time to world time a transformation which we termtime unwarping Mathematically for a scene point P = (i j) weapply the following transformationt0

ij = tij +zij

c=_(1)where t0

ij and tij represent camera and world times respectively

c is the speed of light in vacuum _ the index of refraction of themedium and zij is the distance from point P to the camera Forour table-top scenes we measure this distance with a Faro digitizerarm although it could be obtained from the data and the knownposition of the diffuser as the problem is analogous to that of bistaticLiDAR We can thus define light travel time from each point(i j) in the scene to the camera as _tij = t0

ij 1048576 tij = zij=(c=_)Then time unwarping effectively corresponds to offsetting data inthe x-y-t volume along the time dimension according to the valueof _tij for each of the (i j) points as shown in Figure 8In most of the scenes we only have propagation of light through airfor which we take _ _ 1 For the bottle scene we assume that thelaser pulse travels along its longitudinal axis at the speed of lightand that only a single scattering event occurs in the liquid insideWe take _ = 133 as the index of refraction of the liquid and ignorerefraction at the bottlersquos surface A step-by-step unwarping processis shown in Figure 9 for a frame (ie x-y image) of the bottle sceneOur unoptimized Matlab code runs at about 01 seconds per frameA time-unwarped peak-time visualization of the whole of this sceneis shown in Figure 1 (right) Notice how now the caustics originatefrom the bottle and propagate outward energy fronts along the labelare correctly depicted as straight lines and the pulse precedesrelated phenomena as expected

Chapter 7 Captured ScenesWe have used our ultrafast photography setup to capture interestinglight transport effects in different scenes Figure 10 summarizes

them showing representative frames and peak time visualizationsThe exposure time for our scenes is between 185 ps for the crystalscene and 507 ps for the bottle and tank scenes which requiredimaging a longer time span for better visualization Please refer tothe video in the supplementary material to watch the reconstructedmovies Overall observing light in such slow motion reveals bothsubtle and key aspects of light transport We provide here briefdescriptions of the light transport effects captured in the differentscenesBottle This scene is shown in Figure 1 (bottom row) and hasbeen used to introduce time-unwarping A plastic bottle filled withwater diluted with milk is directly illuminated by the laser pulseentering through the bottom of the bottle along its longitudinal axisThe pulse scatters inside the liquid we can see the propagation ofthe wavefronts The geometry of the bottle neck creates some interestinglens effects making light look almost like a fluid Most ofthe light is reflected back from the cap while some is transmitted ortrapped in subsurface scattering phenomena Caustics are generatedon the tableTomato-tape This scene shows a tomato and a tape roll with awall behind them The propagation of the spherical wavefront afterthe laser pulse hits the diffuser can be seen clearly as it intersectsthe floor and the back wall (A B) The inside of the tape roll is outof the line of sight of the light source and is not directly illuminated

It is illuminated later as indirect light scattered from the first wavereaches it (C) Shadows become visible only after the object hasbeen illuminated The more opaque tape darkens quickly after thelight front has passed while the tomato continues glowing for alonger time indicative of stronger subsurface scattering (D)Alien A toy alien is positioned in front of a mirror and wall Lightinteractions in this scene are extremely rich due to the mirror themultiple interreflections and the subsurface scattering in the toyThe video shows how the reflection in the mirror is actually formeddirect light first reaches the toy but the mirror is still completelydark (E) eventually light leaving the toy reaches the mirror andthe reflection is dynamically formed (F) Subsurface scattering isclearly present in the toy (G) while multiple direct and indirectinteractions between the wall and the mirror can also be seen (H)Crystal A group of sugar crystals is directly illuminated by thelaser from the left acting as multiple lenses and creating causticson the table (I) Part of the light refracted on the table is reflectedback to the candy creating secondary caustics on the table (J) Additionallyscattering events are visible within the crystals (K)

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8 Conclusions and Future WorkOur research fosters new computational imaging and image processingopportunities by providing incoherent time-resolved informationat ultrafast temporal resolutions We hope our workwill inspire new research in computer graphics and computationalphotography by enabling forward and inverse analysis of lighttransport allowing for full scene capture of hidden geometry andmaterials or for relighting photographs To this end capturedmovies and data of the scenes shown in this paper are available atfemtocamerainfo This exploitation in turn may influencethe rapidly emerging field of ultrafast imaging hardware

The system could be extended to image in color by adding additionalpulsed laser sources at different colors or by using one continuouslytunable optical parametric oscillator (OPO) A second colorof about 400 nm could easily be added to the existing system bydoubling the laser frequency with a nonlinear crystal (about $1000)The streak tube is sensitive across the entire visible spectrum witha peak sensitivity at about 450 nm (about five times the sensitivityat 800 nm) Scaling to bigger scenes would require less timeresolution and could therefore simplify the imaging setup Scalingshould be possible without signal degradation as long as the cameraaperture and lens are scaled with the rest of the setup If theaperture stays the same the light intensity needs to be increasedquadratically to obtain similar resultsBeyond the ability of the commercially available streak sensor advancesin optics material science and compressive sensing maybring further optimization of the system which could yield increasedresolution of the captured x-t streak images Nonlinearshutters may provide an alternate path to femto-photography capturesystems However nonlinear optical methods require exoticmaterials and strong light intensities that can damage the objects ofinterest (and must be provided by laser light) Further they oftensuffer from physical instabilities

We believe that mass production of streak sensors can lead to affordablesystems Also future designs may overcome the currentlimitations of our prototype regarding optical efficiency Futureresearch can investigate other ultrafast phenomena such as propagationof light in anisotropic media and photonic crystals or maybe used in applications such as scientific visualization (to understandultra-fast processes) medicine (to reconstruct subsurface elements)material engineering (to analyze material properties) orquality control (to detect faults in structures) This could provideradically new challenges in the realm of computer graphics Graphicsresearch can enable new insights via comprehensible simulationsand new data structures to render light in motion For instancerelativistic rendering techniques have been developed usingour data where the common assumption of constant irradiance overthe surfaces does no longer hold [Jarabo et al 2013] It may alsoallow a better understanding of scattering and may lead to new physically valid models as well as spawn new art forms

ReferencesABRAMSON N 1978 Light-in-flight recording by holographyOptics Letters 3 4 121ndash123BUSCK J AND HEISELBERG H 2004 Gated viewing and highaccuracythree-dimensional laser radar Applied optics 43 244705ndash4710CAMPILLO A AND SHAPIRO S 1987 Picosecond streak camerafluorometry a review IEEE Journal of Quantum Electronics19 4 585ndash603CHARBON E 2007 Will avalanche photodiode arrays ever reach 1megapixel In International Image Sensor Workshop 246ndash249COLACcedil O A KIRMANI A HOWLAND G A HOWELL J CAND GOYAL V K 2012 Compressive depth map acquisitionusing a single photon-counting detector Parametric signal processingmeets sparsity In IEEE Computer Vision and PatternRecognition CVPR 2012 96ndash102DUGUAY M A AND MATTICK A T 1971 Pulsed-image generationand detection Applied Optics 10 2162ndash2170FARO 2012 Faro Technologies Inc Measuring Arms httpwwwfarocomGBUR G 2012 A camera fast enough to watch lightmove httpskullsinthestarscom20120104a-camera-fast-enough-to-watch-light-moveGELBART A REDMAN B C LIGHT R S SCHWARTZLOWC A AND GRIFFIS A J 2002 Flash lidar based on multipleslitstreak tube imaging lidar SPIE vol 4723 9ndash18GODA K TSIA K K AND JALALI B 2009 Serial timeencodedamplified imaging for real-time observation of fast dynamicphenomena Nature 458 1145ndash1149GUPTA O WILLWACHER T VELTEN A VEERARAGHAVANA AND RASKAR R 2012 Reconstruction of hidden3D shapes using diffuse reflections Optics Express 20 19096ndash19108HAMAMATSU 2012 Guide to Streak Cameras httpsaleshamamatsucomassetspdfcatsandguidese_streakhpdfHEBDEN J C 1993 Line scan acquisition for time-resolved

imaging through scattering media Opt Eng 32 3 626ndash633HEIDE F HULLIN M GREGSON J AND HEIDRICH W2013 Low-budget transient imaging using photonic mixer devicesACM Trans Graph 32 4HUANG D SWANSON E LIN C SCHUMAN J STINSONW CHANG W HEE M FLOTTE T GREGORY K ANDPULIAFITO C 1991 Optical coherence tomography Science254 5035 1178ndash1181ITATANI J QUacuteE RacuteE F YUDIN G L IVANOV M Y KRAUSZF AND CORKUM P B 2002 Attosecond streak camera PhysRev Lett 88 173903JARABO A MASIA B AND GUTIERREZ D 2013 Transientrendering and relativistic visualization Tech Rep TR-01-2013Universidad de Zaragoza April

Page 16: Gaurav.report on femto photography

meter from the camera With high gear ratios (up to 11000) thecontinuous rotation of the mirror is slow enough to allow the camerato record each line for about six seconds requiring about onehour for 600 lines (our video resolution) We generally capture extralines above and below the scene (up to 1000 lines) and thencrop them to match the aspect ratio of the physical scenes beforethe movie was reconstructedThese resulting images are combined into one matrixMijk wherei = 1672 and k = 1512 are the dimensions of the individualx-t streak images and j = 11000 addresses the second spatialdimension y For a given time instant k the submatrix Nij containsa two-dimensional image of the scene with a resolution of 672 _1000 pixels exposed for as short to 185 ps Combining the x-tslices of the scene for each scanline yields a 3D x-y-t data volumeas shown in Figure 5 (left) An x-y slice represents one frame of the

final movie as shown in Figure 5 (right)

Figure 5 Left Reconstructed x-y-t data volume by stacking individualx-t images (captured with the scanning mirrors) Right Anx-y slice of the data cube represents one frame of the final movie

Chapter 5 Depicting Ultrafast Videos in 2DWe have explored several ways to visualize the information containedin the captured x-y-t data cube in an intuitive way Firstcontiguous Nij slices can be played as the frames of a movie Figure1 (bottom row) shows a captured scene (bottle) along with severalrepresentative Nij frames (Effects are described for variousscenes in Section 7) However understanding all the phenomenashown in a video is not a trivial task and movies composed of x-yframes such as the ones shown in Figure 10 may be hard to interpretMerging a static photograph of the scene from approximately thesame point of view with the Nij slices aids in the understanding oflight transport in the scenes (see movies within the supplementaryvideo) Although straightforward to implement the high dynamicrange of the streak data requires a nonlinear intensity transformationto extract subtle optical effects in the presence of high intensityreflections We employ a logarithmic transformation to this endWe have also explored single-image methods for intuitive visualization

of full space-time propagation such as the color-coding inFigure 1 (right) which we describe in the following paragraphsIntegral Photo Fusion By integrating all the frames in novelways we can visualize and highlight different aspects of the lightflow in one photo Our photo fusion results are calculated asNij =PwkMijk fk = 1512g where wk is a weighting factordetermined by the particular fusion method We have tested severaldifferent methods of which two were found to yield the most intuitiveresults the first one is full fusion where wk = 1 for all kSumming all frames of the movie provides something resemblinga black and white photograph of the scene illuminated by the laserwhile showing time-resolved light transport effects An exampleis shown in Figure 6 (left) for the alien scene (More informationabout the scene is given in Section 7) A second technique rainbowfusion takes the fusion result and assigns a different RGB color toeach frame effectively color-coding the temporal dimension Anexample is shown in Figure 6 (middle)Peak Time Images The inherent integration in fusion methodsthough often useful can fail to reveal the most complex or subtlebehavior of light As an alternative we propose peak time imageswhich illustrate the time evolution of the maximum intensity in eachframe For each spatial position (i j) in the x-y-t volume we findthe peak intensity along the time dimension and keep informationwithin two time units to each side of the peak All other values inthe streak image are set to zero yielding a more sparse space-timevolume We then color-code time and sum up the x-y frames inthis new sparse volume in the same manner as in the rainbow fusioncase but use only every 20th frame in the sum to create blacklines between the equi-time paths or isochrones This results in amap of the propagation of maximum intensity contours which we

term peak time image These color-coded isochronous lines can bethought of intuitively as propagating energy fronts Figure 6 (right)shows the peak time image for the alien scene and Figure 1 (topmiddle) shows the captured data for the bottle scene depicted usingthis visualization method As explained in the next section thisvisualization of the bottle scene reveals significant light transportphenomena that could not be seen with the rainbow fusion visualization

Figure 7 Understanding reversal of events in captured videosLeft Pulsed light scatters from a source strikes a surface (egat P1 and P2) and is then recorded by a sensor Time taken bylight to travel distances z1 + d1 and z2 + d2 is responsible for theexistence of two different time frames and the need of computationalcorrection to visualize the captured data in the world time frameRight Light appears to be propagating from P2 to P1 in cameratime (before unwarping) and from P1 to P2 in world time oncetime-unwarped Extended planar surfaces will intersect constanttimepaths to produce either elliptical or circular fronts

Chapter 6 Time UnwarpingVisualization of the captured movies (Sections 5 and 7) reveals resultsthat are counter-intuitive to theoretical and established knowledgeof light transport Figure 1 (top middle) shows a peak timevisualization of the bottle scene where several abnormal light transporteffects can be observed (1) the caustics on the floor whichpropagate towards the bottle instead of away from it (2) the curvedspherical energy fronts in the label area which should be rectilinearas seen from the camera and (3) the pulse itself being locatedbehind these energy fronts when it would need to precede themThese are due to the fact that usually light propagation is assumed

to be infinitely fast so that events in world space are assumed to bedetected simultaneously in camera space In our ultrafast photographysetup however this assumption no longer holds and the finitespeed of light becomes a factor we must now take into account thetime delay between the occurrence of an event and its detection bythe camera sensorWe therefore need to consider two different time frames namelyworld time (when events happen) and camera time (when events aredetected) This duality of time frames is explained in Figure 7 lightfrom a source hits a surface first at point P1 = (i1 j1) (with (i j)being the x-y pixel coordinates of a scene point in the x-y-t datacube) then at the farther point P2 = (i2 j2) but the reflected lightis captured in the reverse order by the sensor due to different totalpath lengths (z1 + d1 gt z2 + d2) Generally this is due to the factthat for light to arrive at a given time instant t0 all the ray(later-time) isochrone than farther oness fromthe source to the wall to the camera must satisfy zi+di = ct0 sothat isochrones are elliptical Therefore although objects closer tothe source receive light earlier they can still lie on a higher In order to visualize all light transport events as they have occurred(not as the camera captured them) we transform the captured datafrom camera time to world time a transformation which we termtime unwarping Mathematically for a scene point P = (i j) weapply the following transformationt0

ij = tij +zij

c=_(1)where t0

ij and tij represent camera and world times respectively

c is the speed of light in vacuum _ the index of refraction of themedium and zij is the distance from point P to the camera Forour table-top scenes we measure this distance with a Faro digitizerarm although it could be obtained from the data and the knownposition of the diffuser as the problem is analogous to that of bistaticLiDAR We can thus define light travel time from each point(i j) in the scene to the camera as _tij = t0

ij 1048576 tij = zij=(c=_)Then time unwarping effectively corresponds to offsetting data inthe x-y-t volume along the time dimension according to the valueof _tij for each of the (i j) points as shown in Figure 8In most of the scenes we only have propagation of light through airfor which we take _ _ 1 For the bottle scene we assume that thelaser pulse travels along its longitudinal axis at the speed of lightand that only a single scattering event occurs in the liquid insideWe take _ = 133 as the index of refraction of the liquid and ignorerefraction at the bottlersquos surface A step-by-step unwarping processis shown in Figure 9 for a frame (ie x-y image) of the bottle sceneOur unoptimized Matlab code runs at about 01 seconds per frameA time-unwarped peak-time visualization of the whole of this sceneis shown in Figure 1 (right) Notice how now the caustics originatefrom the bottle and propagate outward energy fronts along the labelare correctly depicted as straight lines and the pulse precedesrelated phenomena as expected

Chapter 7 Captured ScenesWe have used our ultrafast photography setup to capture interestinglight transport effects in different scenes Figure 10 summarizes

them showing representative frames and peak time visualizationsThe exposure time for our scenes is between 185 ps for the crystalscene and 507 ps for the bottle and tank scenes which requiredimaging a longer time span for better visualization Please refer tothe video in the supplementary material to watch the reconstructedmovies Overall observing light in such slow motion reveals bothsubtle and key aspects of light transport We provide here briefdescriptions of the light transport effects captured in the differentscenesBottle This scene is shown in Figure 1 (bottom row) and hasbeen used to introduce time-unwarping A plastic bottle filled withwater diluted with milk is directly illuminated by the laser pulseentering through the bottom of the bottle along its longitudinal axisThe pulse scatters inside the liquid we can see the propagation ofthe wavefronts The geometry of the bottle neck creates some interestinglens effects making light look almost like a fluid Most ofthe light is reflected back from the cap while some is transmitted ortrapped in subsurface scattering phenomena Caustics are generatedon the tableTomato-tape This scene shows a tomato and a tape roll with awall behind them The propagation of the spherical wavefront afterthe laser pulse hits the diffuser can be seen clearly as it intersectsthe floor and the back wall (A B) The inside of the tape roll is outof the line of sight of the light source and is not directly illuminated

It is illuminated later as indirect light scattered from the first wavereaches it (C) Shadows become visible only after the object hasbeen illuminated The more opaque tape darkens quickly after thelight front has passed while the tomato continues glowing for alonger time indicative of stronger subsurface scattering (D)Alien A toy alien is positioned in front of a mirror and wall Lightinteractions in this scene are extremely rich due to the mirror themultiple interreflections and the subsurface scattering in the toyThe video shows how the reflection in the mirror is actually formeddirect light first reaches the toy but the mirror is still completelydark (E) eventually light leaving the toy reaches the mirror andthe reflection is dynamically formed (F) Subsurface scattering isclearly present in the toy (G) while multiple direct and indirectinteractions between the wall and the mirror can also be seen (H)Crystal A group of sugar crystals is directly illuminated by thelaser from the left acting as multiple lenses and creating causticson the table (I) Part of the light refracted on the table is reflectedback to the candy creating secondary caustics on the table (J) Additionallyscattering events are visible within the crystals (K)

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8 Conclusions and Future WorkOur research fosters new computational imaging and image processingopportunities by providing incoherent time-resolved informationat ultrafast temporal resolutions We hope our workwill inspire new research in computer graphics and computationalphotography by enabling forward and inverse analysis of lighttransport allowing for full scene capture of hidden geometry andmaterials or for relighting photographs To this end capturedmovies and data of the scenes shown in this paper are available atfemtocamerainfo This exploitation in turn may influencethe rapidly emerging field of ultrafast imaging hardware

The system could be extended to image in color by adding additionalpulsed laser sources at different colors or by using one continuouslytunable optical parametric oscillator (OPO) A second colorof about 400 nm could easily be added to the existing system bydoubling the laser frequency with a nonlinear crystal (about $1000)The streak tube is sensitive across the entire visible spectrum witha peak sensitivity at about 450 nm (about five times the sensitivityat 800 nm) Scaling to bigger scenes would require less timeresolution and could therefore simplify the imaging setup Scalingshould be possible without signal degradation as long as the cameraaperture and lens are scaled with the rest of the setup If theaperture stays the same the light intensity needs to be increasedquadratically to obtain similar resultsBeyond the ability of the commercially available streak sensor advancesin optics material science and compressive sensing maybring further optimization of the system which could yield increasedresolution of the captured x-t streak images Nonlinearshutters may provide an alternate path to femto-photography capturesystems However nonlinear optical methods require exoticmaterials and strong light intensities that can damage the objects ofinterest (and must be provided by laser light) Further they oftensuffer from physical instabilities

We believe that mass production of streak sensors can lead to affordablesystems Also future designs may overcome the currentlimitations of our prototype regarding optical efficiency Futureresearch can investigate other ultrafast phenomena such as propagationof light in anisotropic media and photonic crystals or maybe used in applications such as scientific visualization (to understandultra-fast processes) medicine (to reconstruct subsurface elements)material engineering (to analyze material properties) orquality control (to detect faults in structures) This could provideradically new challenges in the realm of computer graphics Graphicsresearch can enable new insights via comprehensible simulationsand new data structures to render light in motion For instancerelativistic rendering techniques have been developed usingour data where the common assumption of constant irradiance overthe surfaces does no longer hold [Jarabo et al 2013] It may alsoallow a better understanding of scattering and may lead to new physically valid models as well as spawn new art forms

ReferencesABRAMSON N 1978 Light-in-flight recording by holographyOptics Letters 3 4 121ndash123BUSCK J AND HEISELBERG H 2004 Gated viewing and highaccuracythree-dimensional laser radar Applied optics 43 244705ndash4710CAMPILLO A AND SHAPIRO S 1987 Picosecond streak camerafluorometry a review IEEE Journal of Quantum Electronics19 4 585ndash603CHARBON E 2007 Will avalanche photodiode arrays ever reach 1megapixel In International Image Sensor Workshop 246ndash249COLACcedil O A KIRMANI A HOWLAND G A HOWELL J CAND GOYAL V K 2012 Compressive depth map acquisitionusing a single photon-counting detector Parametric signal processingmeets sparsity In IEEE Computer Vision and PatternRecognition CVPR 2012 96ndash102DUGUAY M A AND MATTICK A T 1971 Pulsed-image generationand detection Applied Optics 10 2162ndash2170FARO 2012 Faro Technologies Inc Measuring Arms httpwwwfarocomGBUR G 2012 A camera fast enough to watch lightmove httpskullsinthestarscom20120104a-camera-fast-enough-to-watch-light-moveGELBART A REDMAN B C LIGHT R S SCHWARTZLOWC A AND GRIFFIS A J 2002 Flash lidar based on multipleslitstreak tube imaging lidar SPIE vol 4723 9ndash18GODA K TSIA K K AND JALALI B 2009 Serial timeencodedamplified imaging for real-time observation of fast dynamicphenomena Nature 458 1145ndash1149GUPTA O WILLWACHER T VELTEN A VEERARAGHAVANA AND RASKAR R 2012 Reconstruction of hidden3D shapes using diffuse reflections Optics Express 20 19096ndash19108HAMAMATSU 2012 Guide to Streak Cameras httpsaleshamamatsucomassetspdfcatsandguidese_streakhpdfHEBDEN J C 1993 Line scan acquisition for time-resolved

imaging through scattering media Opt Eng 32 3 626ndash633HEIDE F HULLIN M GREGSON J AND HEIDRICH W2013 Low-budget transient imaging using photonic mixer devicesACM Trans Graph 32 4HUANG D SWANSON E LIN C SCHUMAN J STINSONW CHANG W HEE M FLOTTE T GREGORY K ANDPULIAFITO C 1991 Optical coherence tomography Science254 5035 1178ndash1181ITATANI J QUacuteE RacuteE F YUDIN G L IVANOV M Y KRAUSZF AND CORKUM P B 2002 Attosecond streak camera PhysRev Lett 88 173903JARABO A MASIA B AND GUTIERREZ D 2013 Transientrendering and relativistic visualization Tech Rep TR-01-2013Universidad de Zaragoza April

Page 17: Gaurav.report on femto photography

final movie as shown in Figure 5 (right)

Figure 5 Left Reconstructed x-y-t data volume by stacking individualx-t images (captured with the scanning mirrors) Right Anx-y slice of the data cube represents one frame of the final movie

Chapter 5 Depicting Ultrafast Videos in 2DWe have explored several ways to visualize the information containedin the captured x-y-t data cube in an intuitive way Firstcontiguous Nij slices can be played as the frames of a movie Figure1 (bottom row) shows a captured scene (bottle) along with severalrepresentative Nij frames (Effects are described for variousscenes in Section 7) However understanding all the phenomenashown in a video is not a trivial task and movies composed of x-yframes such as the ones shown in Figure 10 may be hard to interpretMerging a static photograph of the scene from approximately thesame point of view with the Nij slices aids in the understanding oflight transport in the scenes (see movies within the supplementaryvideo) Although straightforward to implement the high dynamicrange of the streak data requires a nonlinear intensity transformationto extract subtle optical effects in the presence of high intensityreflections We employ a logarithmic transformation to this endWe have also explored single-image methods for intuitive visualization

of full space-time propagation such as the color-coding inFigure 1 (right) which we describe in the following paragraphsIntegral Photo Fusion By integrating all the frames in novelways we can visualize and highlight different aspects of the lightflow in one photo Our photo fusion results are calculated asNij =PwkMijk fk = 1512g where wk is a weighting factordetermined by the particular fusion method We have tested severaldifferent methods of which two were found to yield the most intuitiveresults the first one is full fusion where wk = 1 for all kSumming all frames of the movie provides something resemblinga black and white photograph of the scene illuminated by the laserwhile showing time-resolved light transport effects An exampleis shown in Figure 6 (left) for the alien scene (More informationabout the scene is given in Section 7) A second technique rainbowfusion takes the fusion result and assigns a different RGB color toeach frame effectively color-coding the temporal dimension Anexample is shown in Figure 6 (middle)Peak Time Images The inherent integration in fusion methodsthough often useful can fail to reveal the most complex or subtlebehavior of light As an alternative we propose peak time imageswhich illustrate the time evolution of the maximum intensity in eachframe For each spatial position (i j) in the x-y-t volume we findthe peak intensity along the time dimension and keep informationwithin two time units to each side of the peak All other values inthe streak image are set to zero yielding a more sparse space-timevolume We then color-code time and sum up the x-y frames inthis new sparse volume in the same manner as in the rainbow fusioncase but use only every 20th frame in the sum to create blacklines between the equi-time paths or isochrones This results in amap of the propagation of maximum intensity contours which we

term peak time image These color-coded isochronous lines can bethought of intuitively as propagating energy fronts Figure 6 (right)shows the peak time image for the alien scene and Figure 1 (topmiddle) shows the captured data for the bottle scene depicted usingthis visualization method As explained in the next section thisvisualization of the bottle scene reveals significant light transportphenomena that could not be seen with the rainbow fusion visualization

Figure 7 Understanding reversal of events in captured videosLeft Pulsed light scatters from a source strikes a surface (egat P1 and P2) and is then recorded by a sensor Time taken bylight to travel distances z1 + d1 and z2 + d2 is responsible for theexistence of two different time frames and the need of computationalcorrection to visualize the captured data in the world time frameRight Light appears to be propagating from P2 to P1 in cameratime (before unwarping) and from P1 to P2 in world time oncetime-unwarped Extended planar surfaces will intersect constanttimepaths to produce either elliptical or circular fronts

Chapter 6 Time UnwarpingVisualization of the captured movies (Sections 5 and 7) reveals resultsthat are counter-intuitive to theoretical and established knowledgeof light transport Figure 1 (top middle) shows a peak timevisualization of the bottle scene where several abnormal light transporteffects can be observed (1) the caustics on the floor whichpropagate towards the bottle instead of away from it (2) the curvedspherical energy fronts in the label area which should be rectilinearas seen from the camera and (3) the pulse itself being locatedbehind these energy fronts when it would need to precede themThese are due to the fact that usually light propagation is assumed

to be infinitely fast so that events in world space are assumed to bedetected simultaneously in camera space In our ultrafast photographysetup however this assumption no longer holds and the finitespeed of light becomes a factor we must now take into account thetime delay between the occurrence of an event and its detection bythe camera sensorWe therefore need to consider two different time frames namelyworld time (when events happen) and camera time (when events aredetected) This duality of time frames is explained in Figure 7 lightfrom a source hits a surface first at point P1 = (i1 j1) (with (i j)being the x-y pixel coordinates of a scene point in the x-y-t datacube) then at the farther point P2 = (i2 j2) but the reflected lightis captured in the reverse order by the sensor due to different totalpath lengths (z1 + d1 gt z2 + d2) Generally this is due to the factthat for light to arrive at a given time instant t0 all the ray(later-time) isochrone than farther oness fromthe source to the wall to the camera must satisfy zi+di = ct0 sothat isochrones are elliptical Therefore although objects closer tothe source receive light earlier they can still lie on a higher In order to visualize all light transport events as they have occurred(not as the camera captured them) we transform the captured datafrom camera time to world time a transformation which we termtime unwarping Mathematically for a scene point P = (i j) weapply the following transformationt0

ij = tij +zij

c=_(1)where t0

ij and tij represent camera and world times respectively

c is the speed of light in vacuum _ the index of refraction of themedium and zij is the distance from point P to the camera Forour table-top scenes we measure this distance with a Faro digitizerarm although it could be obtained from the data and the knownposition of the diffuser as the problem is analogous to that of bistaticLiDAR We can thus define light travel time from each point(i j) in the scene to the camera as _tij = t0

ij 1048576 tij = zij=(c=_)Then time unwarping effectively corresponds to offsetting data inthe x-y-t volume along the time dimension according to the valueof _tij for each of the (i j) points as shown in Figure 8In most of the scenes we only have propagation of light through airfor which we take _ _ 1 For the bottle scene we assume that thelaser pulse travels along its longitudinal axis at the speed of lightand that only a single scattering event occurs in the liquid insideWe take _ = 133 as the index of refraction of the liquid and ignorerefraction at the bottlersquos surface A step-by-step unwarping processis shown in Figure 9 for a frame (ie x-y image) of the bottle sceneOur unoptimized Matlab code runs at about 01 seconds per frameA time-unwarped peak-time visualization of the whole of this sceneis shown in Figure 1 (right) Notice how now the caustics originatefrom the bottle and propagate outward energy fronts along the labelare correctly depicted as straight lines and the pulse precedesrelated phenomena as expected

Chapter 7 Captured ScenesWe have used our ultrafast photography setup to capture interestinglight transport effects in different scenes Figure 10 summarizes

them showing representative frames and peak time visualizationsThe exposure time for our scenes is between 185 ps for the crystalscene and 507 ps for the bottle and tank scenes which requiredimaging a longer time span for better visualization Please refer tothe video in the supplementary material to watch the reconstructedmovies Overall observing light in such slow motion reveals bothsubtle and key aspects of light transport We provide here briefdescriptions of the light transport effects captured in the differentscenesBottle This scene is shown in Figure 1 (bottom row) and hasbeen used to introduce time-unwarping A plastic bottle filled withwater diluted with milk is directly illuminated by the laser pulseentering through the bottom of the bottle along its longitudinal axisThe pulse scatters inside the liquid we can see the propagation ofthe wavefronts The geometry of the bottle neck creates some interestinglens effects making light look almost like a fluid Most ofthe light is reflected back from the cap while some is transmitted ortrapped in subsurface scattering phenomena Caustics are generatedon the tableTomato-tape This scene shows a tomato and a tape roll with awall behind them The propagation of the spherical wavefront afterthe laser pulse hits the diffuser can be seen clearly as it intersectsthe floor and the back wall (A B) The inside of the tape roll is outof the line of sight of the light source and is not directly illuminated

It is illuminated later as indirect light scattered from the first wavereaches it (C) Shadows become visible only after the object hasbeen illuminated The more opaque tape darkens quickly after thelight front has passed while the tomato continues glowing for alonger time indicative of stronger subsurface scattering (D)Alien A toy alien is positioned in front of a mirror and wall Lightinteractions in this scene are extremely rich due to the mirror themultiple interreflections and the subsurface scattering in the toyThe video shows how the reflection in the mirror is actually formeddirect light first reaches the toy but the mirror is still completelydark (E) eventually light leaving the toy reaches the mirror andthe reflection is dynamically formed (F) Subsurface scattering isclearly present in the toy (G) while multiple direct and indirectinteractions between the wall and the mirror can also be seen (H)Crystal A group of sugar crystals is directly illuminated by thelaser from the left acting as multiple lenses and creating causticson the table (I) Part of the light refracted on the table is reflectedback to the candy creating secondary caustics on the table (J) Additionallyscattering events are visible within the crystals (K)

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8 Conclusions and Future WorkOur research fosters new computational imaging and image processingopportunities by providing incoherent time-resolved informationat ultrafast temporal resolutions We hope our workwill inspire new research in computer graphics and computationalphotography by enabling forward and inverse analysis of lighttransport allowing for full scene capture of hidden geometry andmaterials or for relighting photographs To this end capturedmovies and data of the scenes shown in this paper are available atfemtocamerainfo This exploitation in turn may influencethe rapidly emerging field of ultrafast imaging hardware

The system could be extended to image in color by adding additionalpulsed laser sources at different colors or by using one continuouslytunable optical parametric oscillator (OPO) A second colorof about 400 nm could easily be added to the existing system bydoubling the laser frequency with a nonlinear crystal (about $1000)The streak tube is sensitive across the entire visible spectrum witha peak sensitivity at about 450 nm (about five times the sensitivityat 800 nm) Scaling to bigger scenes would require less timeresolution and could therefore simplify the imaging setup Scalingshould be possible without signal degradation as long as the cameraaperture and lens are scaled with the rest of the setup If theaperture stays the same the light intensity needs to be increasedquadratically to obtain similar resultsBeyond the ability of the commercially available streak sensor advancesin optics material science and compressive sensing maybring further optimization of the system which could yield increasedresolution of the captured x-t streak images Nonlinearshutters may provide an alternate path to femto-photography capturesystems However nonlinear optical methods require exoticmaterials and strong light intensities that can damage the objects ofinterest (and must be provided by laser light) Further they oftensuffer from physical instabilities

We believe that mass production of streak sensors can lead to affordablesystems Also future designs may overcome the currentlimitations of our prototype regarding optical efficiency Futureresearch can investigate other ultrafast phenomena such as propagationof light in anisotropic media and photonic crystals or maybe used in applications such as scientific visualization (to understandultra-fast processes) medicine (to reconstruct subsurface elements)material engineering (to analyze material properties) orquality control (to detect faults in structures) This could provideradically new challenges in the realm of computer graphics Graphicsresearch can enable new insights via comprehensible simulationsand new data structures to render light in motion For instancerelativistic rendering techniques have been developed usingour data where the common assumption of constant irradiance overthe surfaces does no longer hold [Jarabo et al 2013] It may alsoallow a better understanding of scattering and may lead to new physically valid models as well as spawn new art forms

References

ABRAMSON, N. 1978. Light-in-flight recording by holography. Optics Letters 3, 4, 121–123.

BUSCK, J., AND HEISELBERG, H. 2004. Gated viewing and high-accuracy three-dimensional laser radar. Applied Optics 43, 24, 4705–4710.

CAMPILLO, A., AND SHAPIRO, S. 1983. Picosecond streak camera fluorometry: a review. IEEE Journal of Quantum Electronics 19, 4, 585–603.

CHARBON, E. 2007. Will avalanche photodiode arrays ever reach 1 megapixel? In International Image Sensor Workshop, 246–249.

COLAÇO, A., KIRMANI, A., HOWLAND, G. A., HOWELL, J. C., AND GOYAL, V. K. 2012. Compressive depth map acquisition using a single photon-counting detector: Parametric signal processing meets sparsity. In IEEE Computer Vision and Pattern Recognition (CVPR 2012), 96–102.

DUGUAY, M. A., AND MATTICK, A. T. 1971. Pulsed-image generation and detection. Applied Optics 10, 2162–2170.

FARO, 2012. Faro Technologies Inc.: Measuring Arms. http://www.faro.com.

GBUR, G., 2012. A camera fast enough to watch light move? http://skullsinthestars.com/2012/01/04/a-camera-fast-enough-to-watch-light-move/.

GELBART, A., REDMAN, B. C., LIGHT, R. S., SCHWARTZLOW, C. A., AND GRIFFIS, A. J. 2002. Flash lidar based on multiple-slit streak tube imaging lidar. In SPIE, vol. 4723, 9–18.

GODA, K., TSIA, K. K., AND JALALI, B. 2009. Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena. Nature 458, 1145–1149.

GUPTA, O., WILLWACHER, T., VELTEN, A., VEERARAGHAVAN, A., AND RASKAR, R. 2012. Reconstruction of hidden 3D shapes using diffuse reflections. Optics Express 20, 19096–19108.

HAMAMATSU, 2012. Guide to Streak Cameras. http://sales.hamamatsu.com/assets/pdf/catsandguides/e_streakh.pdf.

HEBDEN, J. C. 1993. Line scan acquisition for time-resolved imaging through scattering media. Opt. Eng. 32, 3, 626–633.

HEIDE, F., HULLIN, M., GREGSON, J., AND HEIDRICH, W. 2013. Low-budget transient imaging using photonic mixer devices. ACM Trans. Graph. 32, 4.

HUANG, D., SWANSON, E., LIN, C., SCHUMAN, J., STINSON, W., CHANG, W., HEE, M., FLOTTE, T., GREGORY, K., AND PULIAFITO, C. 1991. Optical coherence tomography. Science 254, 5035, 1178–1181.

ITATANI, J., QUÉRÉ, F., YUDIN, G. L., IVANOV, M. Y., KRAUSZ, F., AND CORKUM, P. B. 2002. Attosecond streak camera. Phys. Rev. Lett. 88, 173903.

JARABO, A., MASIA, B., AND GUTIERREZ, D. 2013. Transient rendering and relativistic visualization. Tech. Rep. TR-01-2013, Universidad de Zaragoza, April.

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8 Conclusions and Future WorkOur research fosters new computational imaging and image processingopportunities by providing incoherent time-resolved informationat ultrafast temporal resolutions We hope our workwill inspire new research in computer graphics and computationalphotography by enabling forward and inverse analysis of lighttransport allowing for full scene capture of hidden geometry andmaterials or for relighting photographs To this end capturedmovies and data of the scenes shown in this paper are available atfemtocamerainfo This exploitation in turn may influencethe rapidly emerging field of ultrafast imaging hardware

The system could be extended to image in color by adding additionalpulsed laser sources at different colors or by using one continuouslytunable optical parametric oscillator (OPO) A second colorof about 400 nm could easily be added to the existing system bydoubling the laser frequency with a nonlinear crystal (about $1000)The streak tube is sensitive across the entire visible spectrum witha peak sensitivity at about 450 nm (about five times the sensitivityat 800 nm) Scaling to bigger scenes would require less timeresolution and could therefore simplify the imaging setup Scalingshould be possible without signal degradation as long as the cameraaperture and lens are scaled with the rest of the setup If theaperture stays the same the light intensity needs to be increasedquadratically to obtain similar resultsBeyond the ability of the commercially available streak sensor advancesin optics material science and compressive sensing maybring further optimization of the system which could yield increasedresolution of the captured x-t streak images Nonlinearshutters may provide an alternate path to femto-photography capturesystems However nonlinear optical methods require exoticmaterials and strong light intensities that can damage the objects ofinterest (and must be provided by laser light) Further they oftensuffer from physical instabilities

We believe that mass production of streak sensors can lead to affordablesystems Also future designs may overcome the currentlimitations of our prototype regarding optical efficiency Futureresearch can investigate other ultrafast phenomena such as propagationof light in anisotropic media and photonic crystals or maybe used in applications such as scientific visualization (to understandultra-fast processes) medicine (to reconstruct subsurface elements)material engineering (to analyze material properties) orquality control (to detect faults in structures) This could provideradically new challenges in the realm of computer graphics Graphicsresearch can enable new insights via comprehensible simulationsand new data structures to render light in motion For instancerelativistic rendering techniques have been developed usingour data where the common assumption of constant irradiance overthe surfaces does no longer hold [Jarabo et al 2013] It may alsoallow a better understanding of scattering and may lead to new physically valid models as well as spawn new art forms

ReferencesABRAMSON N 1978 Light-in-flight recording by holographyOptics Letters 3 4 121ndash123BUSCK J AND HEISELBERG H 2004 Gated viewing and highaccuracythree-dimensional laser radar Applied optics 43 244705ndash4710CAMPILLO A AND SHAPIRO S 1987 Picosecond streak camerafluorometry a review IEEE Journal of Quantum Electronics19 4 585ndash603CHARBON E 2007 Will avalanche photodiode arrays ever reach 1megapixel In International Image Sensor Workshop 246ndash249COLACcedil O A KIRMANI A HOWLAND G A HOWELL J CAND GOYAL V K 2012 Compressive depth map acquisitionusing a single photon-counting detector Parametric signal processingmeets sparsity In IEEE Computer Vision and PatternRecognition CVPR 2012 96ndash102DUGUAY M A AND MATTICK A T 1971 Pulsed-image generationand detection Applied Optics 10 2162ndash2170FARO 2012 Faro Technologies Inc Measuring Arms httpwwwfarocomGBUR G 2012 A camera fast enough to watch lightmove httpskullsinthestarscom20120104a-camera-fast-enough-to-watch-light-moveGELBART A REDMAN B C LIGHT R S SCHWARTZLOWC A AND GRIFFIS A J 2002 Flash lidar based on multipleslitstreak tube imaging lidar SPIE vol 4723 9ndash18GODA K TSIA K K AND JALALI B 2009 Serial timeencodedamplified imaging for real-time observation of fast dynamicphenomena Nature 458 1145ndash1149GUPTA O WILLWACHER T VELTEN A VEERARAGHAVANA AND RASKAR R 2012 Reconstruction of hidden3D shapes using diffuse reflections Optics Express 20 19096ndash19108HAMAMATSU 2012 Guide to Streak Cameras httpsaleshamamatsucomassetspdfcatsandguidese_streakhpdfHEBDEN J C 1993 Line scan acquisition for time-resolved

imaging through scattering media Opt Eng 32 3 626ndash633HEIDE F HULLIN M GREGSON J AND HEIDRICH W2013 Low-budget transient imaging using photonic mixer devicesACM Trans Graph 32 4HUANG D SWANSON E LIN C SCHUMAN J STINSONW CHANG W HEE M FLOTTE T GREGORY K ANDPULIAFITO C 1991 Optical coherence tomography Science254 5035 1178ndash1181ITATANI J QUacuteE RacuteE F YUDIN G L IVANOV M Y KRAUSZF AND CORKUM P B 2002 Attosecond streak camera PhysRev Lett 88 173903JARABO A MASIA B AND GUTIERREZ D 2013 Transientrendering and relativistic visualization Tech Rep TR-01-2013Universidad de Zaragoza April

Page 19: Gaurav.report on femto photography

term peak time image These color-coded isochronous lines can bethought of intuitively as propagating energy fronts Figure 6 (right)shows the peak time image for the alien scene and Figure 1 (topmiddle) shows the captured data for the bottle scene depicted usingthis visualization method As explained in the next section thisvisualization of the bottle scene reveals significant light transportphenomena that could not be seen with the rainbow fusion visualization

Figure 7 Understanding reversal of events in captured videosLeft Pulsed light scatters from a source strikes a surface (egat P1 and P2) and is then recorded by a sensor Time taken bylight to travel distances z1 + d1 and z2 + d2 is responsible for theexistence of two different time frames and the need of computationalcorrection to visualize the captured data in the world time frameRight Light appears to be propagating from P2 to P1 in cameratime (before unwarping) and from P1 to P2 in world time oncetime-unwarped Extended planar surfaces will intersect constanttimepaths to produce either elliptical or circular fronts

Chapter 6 Time UnwarpingVisualization of the captured movies (Sections 5 and 7) reveals resultsthat are counter-intuitive to theoretical and established knowledgeof light transport Figure 1 (top middle) shows a peak timevisualization of the bottle scene where several abnormal light transporteffects can be observed (1) the caustics on the floor whichpropagate towards the bottle instead of away from it (2) the curvedspherical energy fronts in the label area which should be rectilinearas seen from the camera and (3) the pulse itself being locatedbehind these energy fronts when it would need to precede themThese are due to the fact that usually light propagation is assumed

to be infinitely fast so that events in world space are assumed to bedetected simultaneously in camera space In our ultrafast photographysetup however this assumption no longer holds and the finitespeed of light becomes a factor we must now take into account thetime delay between the occurrence of an event and its detection bythe camera sensorWe therefore need to consider two different time frames namelyworld time (when events happen) and camera time (when events aredetected) This duality of time frames is explained in Figure 7 lightfrom a source hits a surface first at point P1 = (i1 j1) (with (i j)being the x-y pixel coordinates of a scene point in the x-y-t datacube) then at the farther point P2 = (i2 j2) but the reflected lightis captured in the reverse order by the sensor due to different totalpath lengths (z1 + d1 gt z2 + d2) Generally this is due to the factthat for light to arrive at a given time instant t0 all the ray(later-time) isochrone than farther oness fromthe source to the wall to the camera must satisfy zi+di = ct0 sothat isochrones are elliptical Therefore although objects closer tothe source receive light earlier they can still lie on a higher In order to visualize all light transport events as they have occurred(not as the camera captured them) we transform the captured datafrom camera time to world time a transformation which we termtime unwarping Mathematically for a scene point P = (i j) weapply the following transformationt0

ij = tij +zij

c=_(1)where t0

ij and tij represent camera and world times respectively

c is the speed of light in vacuum _ the index of refraction of themedium and zij is the distance from point P to the camera Forour table-top scenes we measure this distance with a Faro digitizerarm although it could be obtained from the data and the knownposition of the diffuser as the problem is analogous to that of bistaticLiDAR We can thus define light travel time from each point(i j) in the scene to the camera as _tij = t0

ij 1048576 tij = zij=(c=_)Then time unwarping effectively corresponds to offsetting data inthe x-y-t volume along the time dimension according to the valueof _tij for each of the (i j) points as shown in Figure 8In most of the scenes we only have propagation of light through airfor which we take _ _ 1 For the bottle scene we assume that thelaser pulse travels along its longitudinal axis at the speed of lightand that only a single scattering event occurs in the liquid insideWe take _ = 133 as the index of refraction of the liquid and ignorerefraction at the bottlersquos surface A step-by-step unwarping processis shown in Figure 9 for a frame (ie x-y image) of the bottle sceneOur unoptimized Matlab code runs at about 01 seconds per frameA time-unwarped peak-time visualization of the whole of this sceneis shown in Figure 1 (right) Notice how now the caustics originatefrom the bottle and propagate outward energy fronts along the labelare correctly depicted as straight lines and the pulse precedesrelated phenomena as expected

Chapter 7 Captured ScenesWe have used our ultrafast photography setup to capture interestinglight transport effects in different scenes Figure 10 summarizes

them showing representative frames and peak time visualizationsThe exposure time for our scenes is between 185 ps for the crystalscene and 507 ps for the bottle and tank scenes which requiredimaging a longer time span for better visualization Please refer tothe video in the supplementary material to watch the reconstructedmovies Overall observing light in such slow motion reveals bothsubtle and key aspects of light transport We provide here briefdescriptions of the light transport effects captured in the differentscenesBottle This scene is shown in Figure 1 (bottom row) and hasbeen used to introduce time-unwarping A plastic bottle filled withwater diluted with milk is directly illuminated by the laser pulseentering through the bottom of the bottle along its longitudinal axisThe pulse scatters inside the liquid we can see the propagation ofthe wavefronts The geometry of the bottle neck creates some interestinglens effects making light look almost like a fluid Most ofthe light is reflected back from the cap while some is transmitted ortrapped in subsurface scattering phenomena Caustics are generatedon the tableTomato-tape This scene shows a tomato and a tape roll with awall behind them The propagation of the spherical wavefront afterthe laser pulse hits the diffuser can be seen clearly as it intersectsthe floor and the back wall (A B) The inside of the tape roll is outof the line of sight of the light source and is not directly illuminated

It is illuminated later as indirect light scattered from the first wavereaches it (C) Shadows become visible only after the object hasbeen illuminated The more opaque tape darkens quickly after thelight front has passed while the tomato continues glowing for alonger time indicative of stronger subsurface scattering (D)Alien A toy alien is positioned in front of a mirror and wall Lightinteractions in this scene are extremely rich due to the mirror themultiple interreflections and the subsurface scattering in the toyThe video shows how the reflection in the mirror is actually formeddirect light first reaches the toy but the mirror is still completelydark (E) eventually light leaving the toy reaches the mirror andthe reflection is dynamically formed (F) Subsurface scattering isclearly present in the toy (G) while multiple direct and indirectinteractions between the wall and the mirror can also be seen (H)Crystal A group of sugar crystals is directly illuminated by thelaser from the left acting as multiple lenses and creating causticson the table (I) Part of the light refracted on the table is reflectedback to the candy creating secondary caustics on the table (J) Additionallyscattering events are visible within the crystals (K)

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8 Conclusions and Future WorkOur research fosters new computational imaging and image processingopportunities by providing incoherent time-resolved informationat ultrafast temporal resolutions We hope our workwill inspire new research in computer graphics and computationalphotography by enabling forward and inverse analysis of lighttransport allowing for full scene capture of hidden geometry andmaterials or for relighting photographs To this end capturedmovies and data of the scenes shown in this paper are available atfemtocamerainfo This exploitation in turn may influencethe rapidly emerging field of ultrafast imaging hardware

The system could be extended to image in color by adding additionalpulsed laser sources at different colors or by using one continuouslytunable optical parametric oscillator (OPO) A second colorof about 400 nm could easily be added to the existing system bydoubling the laser frequency with a nonlinear crystal (about $1000)The streak tube is sensitive across the entire visible spectrum witha peak sensitivity at about 450 nm (about five times the sensitivityat 800 nm) Scaling to bigger scenes would require less timeresolution and could therefore simplify the imaging setup Scalingshould be possible without signal degradation as long as the cameraaperture and lens are scaled with the rest of the setup If theaperture stays the same the light intensity needs to be increasedquadratically to obtain similar resultsBeyond the ability of the commercially available streak sensor advancesin optics material science and compressive sensing maybring further optimization of the system which could yield increasedresolution of the captured x-t streak images Nonlinearshutters may provide an alternate path to femto-photography capturesystems However nonlinear optical methods require exoticmaterials and strong light intensities that can damage the objects ofinterest (and must be provided by laser light) Further they oftensuffer from physical instabilities

We believe that mass production of streak sensors can lead to affordablesystems Also future designs may overcome the currentlimitations of our prototype regarding optical efficiency Futureresearch can investigate other ultrafast phenomena such as propagationof light in anisotropic media and photonic crystals or maybe used in applications such as scientific visualization (to understandultra-fast processes) medicine (to reconstruct subsurface elements)material engineering (to analyze material properties) orquality control (to detect faults in structures) This could provideradically new challenges in the realm of computer graphics Graphicsresearch can enable new insights via comprehensible simulationsand new data structures to render light in motion For instancerelativistic rendering techniques have been developed usingour data where the common assumption of constant irradiance overthe surfaces does no longer hold [Jarabo et al 2013] It may alsoallow a better understanding of scattering and may lead to new physically valid models as well as spawn new art forms

ReferencesABRAMSON N 1978 Light-in-flight recording by holographyOptics Letters 3 4 121ndash123BUSCK J AND HEISELBERG H 2004 Gated viewing and highaccuracythree-dimensional laser radar Applied optics 43 244705ndash4710CAMPILLO A AND SHAPIRO S 1987 Picosecond streak camerafluorometry a review IEEE Journal of Quantum Electronics19 4 585ndash603CHARBON E 2007 Will avalanche photodiode arrays ever reach 1megapixel In International Image Sensor Workshop 246ndash249COLACcedil O A KIRMANI A HOWLAND G A HOWELL J CAND GOYAL V K 2012 Compressive depth map acquisitionusing a single photon-counting detector Parametric signal processingmeets sparsity In IEEE Computer Vision and PatternRecognition CVPR 2012 96ndash102DUGUAY M A AND MATTICK A T 1971 Pulsed-image generationand detection Applied Optics 10 2162ndash2170FARO 2012 Faro Technologies Inc Measuring Arms httpwwwfarocomGBUR G 2012 A camera fast enough to watch lightmove httpskullsinthestarscom20120104a-camera-fast-enough-to-watch-light-moveGELBART A REDMAN B C LIGHT R S SCHWARTZLOWC A AND GRIFFIS A J 2002 Flash lidar based on multipleslitstreak tube imaging lidar SPIE vol 4723 9ndash18GODA K TSIA K K AND JALALI B 2009 Serial timeencodedamplified imaging for real-time observation of fast dynamicphenomena Nature 458 1145ndash1149GUPTA O WILLWACHER T VELTEN A VEERARAGHAVANA AND RASKAR R 2012 Reconstruction of hidden3D shapes using diffuse reflections Optics Express 20 19096ndash19108HAMAMATSU 2012 Guide to Streak Cameras httpsaleshamamatsucomassetspdfcatsandguidese_streakhpdfHEBDEN J C 1993 Line scan acquisition for time-resolved

imaging through scattering media Opt Eng 32 3 626ndash633HEIDE F HULLIN M GREGSON J AND HEIDRICH W2013 Low-budget transient imaging using photonic mixer devicesACM Trans Graph 32 4HUANG D SWANSON E LIN C SCHUMAN J STINSONW CHANG W HEE M FLOTTE T GREGORY K ANDPULIAFITO C 1991 Optical coherence tomography Science254 5035 1178ndash1181ITATANI J QUacuteE RacuteE F YUDIN G L IVANOV M Y KRAUSZF AND CORKUM P B 2002 Attosecond streak camera PhysRev Lett 88 173903JARABO A MASIA B AND GUTIERREZ D 2013 Transientrendering and relativistic visualization Tech Rep TR-01-2013Universidad de Zaragoza April

Page 20: Gaurav.report on femto photography

to be infinitely fast so that events in world space are assumed to bedetected simultaneously in camera space In our ultrafast photographysetup however this assumption no longer holds and the finitespeed of light becomes a factor we must now take into account thetime delay between the occurrence of an event and its detection bythe camera sensorWe therefore need to consider two different time frames namelyworld time (when events happen) and camera time (when events aredetected) This duality of time frames is explained in Figure 7 lightfrom a source hits a surface first at point P1 = (i1 j1) (with (i j)being the x-y pixel coordinates of a scene point in the x-y-t datacube) then at the farther point P2 = (i2 j2) but the reflected lightis captured in the reverse order by the sensor due to different totalpath lengths (z1 + d1 gt z2 + d2) Generally this is due to the factthat for light to arrive at a given time instant t0 all the ray(later-time) isochrone than farther oness fromthe source to the wall to the camera must satisfy zi+di = ct0 sothat isochrones are elliptical Therefore although objects closer tothe source receive light earlier they can still lie on a higher In order to visualize all light transport events as they have occurred(not as the camera captured them) we transform the captured datafrom camera time to world time a transformation which we termtime unwarping Mathematically for a scene point P = (i j) weapply the following transformationt0

ij = tij +zij

c=_(1)where t0

ij and tij represent camera and world times respectively

c is the speed of light in vacuum _ the index of refraction of themedium and zij is the distance from point P to the camera Forour table-top scenes we measure this distance with a Faro digitizerarm although it could be obtained from the data and the knownposition of the diffuser as the problem is analogous to that of bistaticLiDAR We can thus define light travel time from each point(i j) in the scene to the camera as _tij = t0

ij 1048576 tij = zij=(c=_)Then time unwarping effectively corresponds to offsetting data inthe x-y-t volume along the time dimension according to the valueof _tij for each of the (i j) points as shown in Figure 8In most of the scenes we only have propagation of light through airfor which we take _ _ 1 For the bottle scene we assume that thelaser pulse travels along its longitudinal axis at the speed of lightand that only a single scattering event occurs in the liquid insideWe take _ = 133 as the index of refraction of the liquid and ignorerefraction at the bottlersquos surface A step-by-step unwarping processis shown in Figure 9 for a frame (ie x-y image) of the bottle sceneOur unoptimized Matlab code runs at about 01 seconds per frameA time-unwarped peak-time visualization of the whole of this sceneis shown in Figure 1 (right) Notice how now the caustics originatefrom the bottle and propagate outward energy fronts along the labelare correctly depicted as straight lines and the pulse precedesrelated phenomena as expected

Chapter 7 Captured ScenesWe have used our ultrafast photography setup to capture interestinglight transport effects in different scenes Figure 10 summarizes

them showing representative frames and peak time visualizationsThe exposure time for our scenes is between 185 ps for the crystalscene and 507 ps for the bottle and tank scenes which requiredimaging a longer time span for better visualization Please refer tothe video in the supplementary material to watch the reconstructedmovies Overall observing light in such slow motion reveals bothsubtle and key aspects of light transport We provide here briefdescriptions of the light transport effects captured in the differentscenesBottle This scene is shown in Figure 1 (bottom row) and hasbeen used to introduce time-unwarping A plastic bottle filled withwater diluted with milk is directly illuminated by the laser pulseentering through the bottom of the bottle along its longitudinal axisThe pulse scatters inside the liquid we can see the propagation ofthe wavefronts The geometry of the bottle neck creates some interestinglens effects making light look almost like a fluid Most ofthe light is reflected back from the cap while some is transmitted ortrapped in subsurface scattering phenomena Caustics are generatedon the tableTomato-tape This scene shows a tomato and a tape roll with awall behind them The propagation of the spherical wavefront afterthe laser pulse hits the diffuser can be seen clearly as it intersectsthe floor and the back wall (A B) The inside of the tape roll is outof the line of sight of the light source and is not directly illuminated

It is illuminated later as indirect light scattered from the first wavereaches it (C) Shadows become visible only after the object hasbeen illuminated The more opaque tape darkens quickly after thelight front has passed while the tomato continues glowing for alonger time indicative of stronger subsurface scattering (D)Alien A toy alien is positioned in front of a mirror and wall Lightinteractions in this scene are extremely rich due to the mirror themultiple interreflections and the subsurface scattering in the toyThe video shows how the reflection in the mirror is actually formeddirect light first reaches the toy but the mirror is still completelydark (E) eventually light leaving the toy reaches the mirror andthe reflection is dynamically formed (F) Subsurface scattering isclearly present in the toy (G) while multiple direct and indirectinteractions between the wall and the mirror can also be seen (H)Crystal A group of sugar crystals is directly illuminated by thelaser from the left acting as multiple lenses and creating causticson the table (I) Part of the light refracted on the table is reflectedback to the candy creating secondary caustics on the table (J) Additionallyscattering events are visible within the crystals (K)

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8 Conclusions and Future WorkOur research fosters new computational imaging and image processingopportunities by providing incoherent time-resolved informationat ultrafast temporal resolutions We hope our workwill inspire new research in computer graphics and computationalphotography by enabling forward and inverse analysis of lighttransport allowing for full scene capture of hidden geometry andmaterials or for relighting photographs To this end capturedmovies and data of the scenes shown in this paper are available atfemtocamerainfo This exploitation in turn may influencethe rapidly emerging field of ultrafast imaging hardware

The system could be extended to image in color by adding additionalpulsed laser sources at different colors or by using one continuouslytunable optical parametric oscillator (OPO) A second colorof about 400 nm could easily be added to the existing system bydoubling the laser frequency with a nonlinear crystal (about $1000)The streak tube is sensitive across the entire visible spectrum witha peak sensitivity at about 450 nm (about five times the sensitivityat 800 nm) Scaling to bigger scenes would require less timeresolution and could therefore simplify the imaging setup Scalingshould be possible without signal degradation as long as the cameraaperture and lens are scaled with the rest of the setup If theaperture stays the same the light intensity needs to be increasedquadratically to obtain similar resultsBeyond the ability of the commercially available streak sensor advancesin optics material science and compressive sensing maybring further optimization of the system which could yield increasedresolution of the captured x-t streak images Nonlinearshutters may provide an alternate path to femto-photography capturesystems However nonlinear optical methods require exoticmaterials and strong light intensities that can damage the objects ofinterest (and must be provided by laser light) Further they oftensuffer from physical instabilities

We believe that mass production of streak sensors can lead to affordablesystems Also future designs may overcome the currentlimitations of our prototype regarding optical efficiency Futureresearch can investigate other ultrafast phenomena such as propagationof light in anisotropic media and photonic crystals or maybe used in applications such as scientific visualization (to understandultra-fast processes) medicine (to reconstruct subsurface elements)material engineering (to analyze material properties) orquality control (to detect faults in structures) This could provideradically new challenges in the realm of computer graphics Graphicsresearch can enable new insights via comprehensible simulationsand new data structures to render light in motion For instancerelativistic rendering techniques have been developed usingour data where the common assumption of constant irradiance overthe surfaces does no longer hold [Jarabo et al 2013] It may alsoallow a better understanding of scattering and may lead to new physically valid models as well as spawn new art forms

ReferencesABRAMSON N 1978 Light-in-flight recording by holographyOptics Letters 3 4 121ndash123BUSCK J AND HEISELBERG H 2004 Gated viewing and highaccuracythree-dimensional laser radar Applied optics 43 244705ndash4710CAMPILLO A AND SHAPIRO S 1987 Picosecond streak camerafluorometry a review IEEE Journal of Quantum Electronics19 4 585ndash603CHARBON E 2007 Will avalanche photodiode arrays ever reach 1megapixel In International Image Sensor Workshop 246ndash249COLACcedil O A KIRMANI A HOWLAND G A HOWELL J CAND GOYAL V K 2012 Compressive depth map acquisitionusing a single photon-counting detector Parametric signal processingmeets sparsity In IEEE Computer Vision and PatternRecognition CVPR 2012 96ndash102DUGUAY M A AND MATTICK A T 1971 Pulsed-image generationand detection Applied Optics 10 2162ndash2170FARO 2012 Faro Technologies Inc Measuring Arms httpwwwfarocomGBUR G 2012 A camera fast enough to watch lightmove httpskullsinthestarscom20120104a-camera-fast-enough-to-watch-light-moveGELBART A REDMAN B C LIGHT R S SCHWARTZLOWC A AND GRIFFIS A J 2002 Flash lidar based on multipleslitstreak tube imaging lidar SPIE vol 4723 9ndash18GODA K TSIA K K AND JALALI B 2009 Serial timeencodedamplified imaging for real-time observation of fast dynamicphenomena Nature 458 1145ndash1149GUPTA O WILLWACHER T VELTEN A VEERARAGHAVANA AND RASKAR R 2012 Reconstruction of hidden3D shapes using diffuse reflections Optics Express 20 19096ndash19108HAMAMATSU 2012 Guide to Streak Cameras httpsaleshamamatsucomassetspdfcatsandguidese_streakhpdfHEBDEN J C 1993 Line scan acquisition for time-resolved

imaging through scattering media Opt Eng 32 3 626ndash633HEIDE F HULLIN M GREGSON J AND HEIDRICH W2013 Low-budget transient imaging using photonic mixer devicesACM Trans Graph 32 4HUANG D SWANSON E LIN C SCHUMAN J STINSONW CHANG W HEE M FLOTTE T GREGORY K ANDPULIAFITO C 1991 Optical coherence tomography Science254 5035 1178ndash1181ITATANI J QUacuteE RacuteE F YUDIN G L IVANOV M Y KRAUSZF AND CORKUM P B 2002 Attosecond streak camera PhysRev Lett 88 173903JARABO A MASIA B AND GUTIERREZ D 2013 Transientrendering and relativistic visualization Tech Rep TR-01-2013Universidad de Zaragoza April

Page 21: Gaurav.report on femto photography

c is the speed of light in vacuum _ the index of refraction of themedium and zij is the distance from point P to the camera Forour table-top scenes we measure this distance with a Faro digitizerarm although it could be obtained from the data and the knownposition of the diffuser as the problem is analogous to that of bistaticLiDAR We can thus define light travel time from each point(i j) in the scene to the camera as _tij = t0

ij 1048576 tij = zij=(c=_)Then time unwarping effectively corresponds to offsetting data inthe x-y-t volume along the time dimension according to the valueof _tij for each of the (i j) points as shown in Figure 8In most of the scenes we only have propagation of light through airfor which we take _ _ 1 For the bottle scene we assume that thelaser pulse travels along its longitudinal axis at the speed of lightand that only a single scattering event occurs in the liquid insideWe take _ = 133 as the index of refraction of the liquid and ignorerefraction at the bottlersquos surface A step-by-step unwarping processis shown in Figure 9 for a frame (ie x-y image) of the bottle sceneOur unoptimized Matlab code runs at about 01 seconds per frameA time-unwarped peak-time visualization of the whole of this sceneis shown in Figure 1 (right) Notice how now the caustics originatefrom the bottle and propagate outward energy fronts along the labelare correctly depicted as straight lines and the pulse precedesrelated phenomena as expected

Chapter 7 Captured ScenesWe have used our ultrafast photography setup to capture interestinglight transport effects in different scenes Figure 10 summarizes

them showing representative frames and peak time visualizationsThe exposure time for our scenes is between 185 ps for the crystalscene and 507 ps for the bottle and tank scenes which requiredimaging a longer time span for better visualization Please refer tothe video in the supplementary material to watch the reconstructedmovies Overall observing light in such slow motion reveals bothsubtle and key aspects of light transport We provide here briefdescriptions of the light transport effects captured in the differentscenesBottle This scene is shown in Figure 1 (bottom row) and hasbeen used to introduce time-unwarping A plastic bottle filled withwater diluted with milk is directly illuminated by the laser pulseentering through the bottom of the bottle along its longitudinal axisThe pulse scatters inside the liquid we can see the propagation ofthe wavefronts The geometry of the bottle neck creates some interestinglens effects making light look almost like a fluid Most ofthe light is reflected back from the cap while some is transmitted ortrapped in subsurface scattering phenomena Caustics are generatedon the tableTomato-tape This scene shows a tomato and a tape roll with awall behind them The propagation of the spherical wavefront afterthe laser pulse hits the diffuser can be seen clearly as it intersectsthe floor and the back wall (A B) The inside of the tape roll is outof the line of sight of the light source and is not directly illuminated

It is illuminated later as indirect light scattered from the first wavereaches it (C) Shadows become visible only after the object hasbeen illuminated The more opaque tape darkens quickly after thelight front has passed while the tomato continues glowing for alonger time indicative of stronger subsurface scattering (D)Alien A toy alien is positioned in front of a mirror and wall Lightinteractions in this scene are extremely rich due to the mirror themultiple interreflections and the subsurface scattering in the toyThe video shows how the reflection in the mirror is actually formeddirect light first reaches the toy but the mirror is still completelydark (E) eventually light leaving the toy reaches the mirror andthe reflection is dynamically formed (F) Subsurface scattering isclearly present in the toy (G) while multiple direct and indirectinteractions between the wall and the mirror can also be seen (H)Crystal A group of sugar crystals is directly illuminated by thelaser from the left acting as multiple lenses and creating causticson the table (I) Part of the light refracted on the table is reflectedback to the candy creating secondary caustics on the table (J) Additionallyscattering events are visible within the crystals (K)

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8 Conclusions and Future WorkOur research fosters new computational imaging and image processingopportunities by providing incoherent time-resolved informationat ultrafast temporal resolutions We hope our workwill inspire new research in computer graphics and computationalphotography by enabling forward and inverse analysis of lighttransport allowing for full scene capture of hidden geometry andmaterials or for relighting photographs To this end capturedmovies and data of the scenes shown in this paper are available atfemtocamerainfo This exploitation in turn may influencethe rapidly emerging field of ultrafast imaging hardware

The system could be extended to image in color by adding additionalpulsed laser sources at different colors or by using one continuouslytunable optical parametric oscillator (OPO) A second colorof about 400 nm could easily be added to the existing system bydoubling the laser frequency with a nonlinear crystal (about $1000)The streak tube is sensitive across the entire visible spectrum witha peak sensitivity at about 450 nm (about five times the sensitivityat 800 nm) Scaling to bigger scenes would require less timeresolution and could therefore simplify the imaging setup Scalingshould be possible without signal degradation as long as the cameraaperture and lens are scaled with the rest of the setup If theaperture stays the same the light intensity needs to be increasedquadratically to obtain similar resultsBeyond the ability of the commercially available streak sensor advancesin optics material science and compressive sensing maybring further optimization of the system which could yield increasedresolution of the captured x-t streak images Nonlinearshutters may provide an alternate path to femto-photography capturesystems However nonlinear optical methods require exoticmaterials and strong light intensities that can damage the objects ofinterest (and must be provided by laser light) Further they oftensuffer from physical instabilities

We believe that mass production of streak sensors can lead to affordablesystems Also future designs may overcome the currentlimitations of our prototype regarding optical efficiency Futureresearch can investigate other ultrafast phenomena such as propagationof light in anisotropic media and photonic crystals or maybe used in applications such as scientific visualization (to understandultra-fast processes) medicine (to reconstruct subsurface elements)material engineering (to analyze material properties) orquality control (to detect faults in structures) This could provideradically new challenges in the realm of computer graphics Graphicsresearch can enable new insights via comprehensible simulationsand new data structures to render light in motion For instancerelativistic rendering techniques have been developed usingour data where the common assumption of constant irradiance overthe surfaces does no longer hold [Jarabo et al 2013] It may alsoallow a better understanding of scattering and may lead to new physically valid models as well as spawn new art forms

ReferencesABRAMSON N 1978 Light-in-flight recording by holographyOptics Letters 3 4 121ndash123BUSCK J AND HEISELBERG H 2004 Gated viewing and highaccuracythree-dimensional laser radar Applied optics 43 244705ndash4710CAMPILLO A AND SHAPIRO S 1987 Picosecond streak camerafluorometry a review IEEE Journal of Quantum Electronics19 4 585ndash603CHARBON E 2007 Will avalanche photodiode arrays ever reach 1megapixel In International Image Sensor Workshop 246ndash249COLACcedil O A KIRMANI A HOWLAND G A HOWELL J CAND GOYAL V K 2012 Compressive depth map acquisitionusing a single photon-counting detector Parametric signal processingmeets sparsity In IEEE Computer Vision and PatternRecognition CVPR 2012 96ndash102DUGUAY M A AND MATTICK A T 1971 Pulsed-image generationand detection Applied Optics 10 2162ndash2170FARO 2012 Faro Technologies Inc Measuring Arms httpwwwfarocomGBUR G 2012 A camera fast enough to watch lightmove httpskullsinthestarscom20120104a-camera-fast-enough-to-watch-light-moveGELBART A REDMAN B C LIGHT R S SCHWARTZLOWC A AND GRIFFIS A J 2002 Flash lidar based on multipleslitstreak tube imaging lidar SPIE vol 4723 9ndash18GODA K TSIA K K AND JALALI B 2009 Serial timeencodedamplified imaging for real-time observation of fast dynamicphenomena Nature 458 1145ndash1149GUPTA O WILLWACHER T VELTEN A VEERARAGHAVANA AND RASKAR R 2012 Reconstruction of hidden3D shapes using diffuse reflections Optics Express 20 19096ndash19108HAMAMATSU 2012 Guide to Streak Cameras httpsaleshamamatsucomassetspdfcatsandguidese_streakhpdfHEBDEN J C 1993 Line scan acquisition for time-resolved

imaging through scattering media Opt Eng 32 3 626ndash633HEIDE F HULLIN M GREGSON J AND HEIDRICH W2013 Low-budget transient imaging using photonic mixer devicesACM Trans Graph 32 4HUANG D SWANSON E LIN C SCHUMAN J STINSONW CHANG W HEE M FLOTTE T GREGORY K ANDPULIAFITO C 1991 Optical coherence tomography Science254 5035 1178ndash1181ITATANI J QUacuteE RacuteE F YUDIN G L IVANOV M Y KRAUSZF AND CORKUM P B 2002 Attosecond streak camera PhysRev Lett 88 173903JARABO A MASIA B AND GUTIERREZ D 2013 Transientrendering and relativistic visualization Tech Rep TR-01-2013Universidad de Zaragoza April

Page 22: Gaurav.report on femto photography

them showing representative frames and peak time visualizationsThe exposure time for our scenes is between 185 ps for the crystalscene and 507 ps for the bottle and tank scenes which requiredimaging a longer time span for better visualization Please refer tothe video in the supplementary material to watch the reconstructedmovies Overall observing light in such slow motion reveals bothsubtle and key aspects of light transport We provide here briefdescriptions of the light transport effects captured in the differentscenesBottle This scene is shown in Figure 1 (bottom row) and hasbeen used to introduce time-unwarping A plastic bottle filled withwater diluted with milk is directly illuminated by the laser pulseentering through the bottom of the bottle along its longitudinal axisThe pulse scatters inside the liquid we can see the propagation ofthe wavefronts The geometry of the bottle neck creates some interestinglens effects making light look almost like a fluid Most ofthe light is reflected back from the cap while some is transmitted ortrapped in subsurface scattering phenomena Caustics are generatedon the tableTomato-tape This scene shows a tomato and a tape roll with awall behind them The propagation of the spherical wavefront afterthe laser pulse hits the diffuser can be seen clearly as it intersectsthe floor and the back wall (A B) The inside of the tape roll is outof the line of sight of the light source and is not directly illuminated

It is illuminated later as indirect light scattered from the first wavereaches it (C) Shadows become visible only after the object hasbeen illuminated The more opaque tape darkens quickly after thelight front has passed while the tomato continues glowing for alonger time indicative of stronger subsurface scattering (D)Alien A toy alien is positioned in front of a mirror and wall Lightinteractions in this scene are extremely rich due to the mirror themultiple interreflections and the subsurface scattering in the toyThe video shows how the reflection in the mirror is actually formeddirect light first reaches the toy but the mirror is still completelydark (E) eventually light leaving the toy reaches the mirror andthe reflection is dynamically formed (F) Subsurface scattering isclearly present in the toy (G) while multiple direct and indirectinteractions between the wall and the mirror can also be seen (H)Crystal A group of sugar crystals is directly illuminated by thelaser from the left acting as multiple lenses and creating causticson the table (I) Part of the light refracted on the table is reflectedback to the candy creating secondary caustics on the table (J) Additionallyscattering events are visible within the crystals (K)

Tank A reflective grating is placed at the right side of a tank filledwith milk diluted in water The grating is taken from a commercialspectrometer and consists of an array of small equally spacedrectangular mirrors The grating is blazed mirrors are tilted to concentratemaximum optical power in the first order diffraction forone wavelength The pulse enters the scene from the left travelsthrough the tank (L) and strikes the grating The grating reflectsand diffracts the beam pulse (M) The different orders of the diffractionare visible traveling back through the tank (N) As the figure(and the supplementary movie) shows most of the light reflectedfrom the grating propagates at the blaze angle

Chapter 8: Conclusions and Future Work

Our research fosters new computational imaging and image-processing opportunities by providing incoherent, time-resolved information at ultrafast temporal resolutions. We hope our work will inspire new research in computer graphics and computational photography by enabling forward and inverse analysis of light transport, allowing for full scene capture of hidden geometry and materials, or for relighting photographs. To this end, captured movies and data of the scenes shown in this paper are available at femtocamera.info. This exploitation, in turn, may influence the rapidly emerging field of ultrafast imaging hardware.

The system could be extended to image in color by adding additional pulsed laser sources at different colors, or by using one continuously tunable optical parametric oscillator (OPO). A second color of about 400 nm could easily be added to the existing system by doubling the laser frequency with a nonlinear crystal (about $1,000). The streak tube is sensitive across the entire visible spectrum, with a peak sensitivity at about 450 nm (about five times the sensitivity at 800 nm). Scaling to bigger scenes would require less time resolution and could therefore simplify the imaging setup. Scaling should be possible without signal degradation as long as the camera aperture and lens are scaled with the rest of the setup. If the aperture stays the same, the light intensity needs to be increased quadratically to obtain similar results.

Beyond the ability of the commercially available streak sensor, advances in optics, material science, and compressive sensing may bring further optimization of the system, which could yield increased resolution of the captured x-t streak images. Nonlinear shutters may provide an alternate path to femto-photography capture systems. However, nonlinear optical methods require exotic materials and strong light intensities that can damage the objects of interest (and must be provided by laser light). Further, they often suffer from physical instabilities.
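The quadratic requirement stated above is simply inverse-square falloff: if every propagation distance in the setup grows by a factor s while the collecting aperture stays fixed, the light gathered per pixel drops as 1/s^2, so the source power must rise as s^2 to keep the recorded signal level unchanged. A trivial sanity check (illustrative names and values only):

def required_power_scale(scene_scale):
    # With a fixed aperture, collected irradiance falls as 1/s**2 when
    # all propagation distances scale by s, so the laser power must be
    # raised by s**2 to keep the recorded signal level unchanged.
    return scene_scale ** 2

for s in (1, 2, 10):
    print(f"scene scaled x{s}: laser power x{required_power_scale(s)}")
# scene scaled x1: laser power x1
# scene scaled x2: laser power x4
# scene scaled x10: laser power x100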

We believe that mass production of streak sensors can lead to affordable systems. Future designs may also overcome the current limitations of our prototype regarding optical efficiency. Future research can investigate other ultrafast phenomena, such as the propagation of light in anisotropic media and photonic crystals, or may be applied to scientific visualization (to understand ultrafast processes), medicine (to reconstruct subsurface elements), material engineering (to analyze material properties), or quality control (to detect faults in structures). This could provide radically new challenges in the realm of computer graphics. Graphics research can enable new insights via comprehensible simulations and new data structures to render light in motion. For instance, relativistic rendering techniques have been developed using our data, where the common assumption of constant irradiance over the surfaces no longer holds [Jarabo et al. 2013]. It may also allow a better understanding of scattering, and may lead to new physically valid models as well as spawn new art forms.

Chapter 9: References

ABRAMSON, N. 1978. Light-in-flight recording by holography. Optics Letters 3, 4, 121-123.
BUSCK, J., AND HEISELBERG, H. 2004. Gated viewing and high-accuracy three-dimensional laser radar. Applied Optics 43, 24, 4705-4710.
CAMPILLO, A., AND SHAPIRO, S. 1983. Picosecond streak camera fluorometry: a review. IEEE Journal of Quantum Electronics 19, 4, 585-603.
CHARBON, E. 2007. Will avalanche photodiode arrays ever reach 1 megapixel? In International Image Sensor Workshop, 246-249.
COLAÇO, A., KIRMANI, A., HOWLAND, G. A., HOWELL, J. C., AND GOYAL, V. K. 2012. Compressive depth map acquisition using a single photon-counting detector: Parametric signal processing meets sparsity. In IEEE Computer Vision and Pattern Recognition, CVPR 2012, 96-102.
DUGUAY, M. A., AND MATTICK, A. T. 1971. Pulsed-image generation and detection. Applied Optics 10, 2162-2170.
FARO. 2012. Faro Technologies Inc.: Measuring Arms. http://www.faro.com
GBUR, G. 2012. A camera fast enough to watch light move? http://skullsinthestars.com/2012/01/04/a-camera-fast-enough-to-watch-light-move
GELBART, A., REDMAN, B. C., LIGHT, R. S., SCHWARTZLOW, C. A., AND GRIFFIS, A. J. 2002. Flash lidar based on multiple-slit streak tube imaging lidar. SPIE, vol. 4723, 9-18.
GODA, K., TSIA, K. K., AND JALALI, B. 2009. Serial time-encoded amplified imaging for real-time observation of fast dynamic phenomena. Nature 458, 1145-1149.
GUPTA, O., WILLWACHER, T., VELTEN, A., VEERARAGHAVAN, A., AND RASKAR, R. 2012. Reconstruction of hidden 3D shapes using diffuse reflections. Optics Express 20, 19096-19108.
HAMAMATSU. 2012. Guide to Streak Cameras. http://sales.hamamatsu.com/assets/pdf/catsandguides/e_streakh.pdf
HEBDEN, J. C. 1993. Line scan acquisition for time-resolved imaging through scattering media. Opt. Eng. 32, 3, 626-633.
HEIDE, F., HULLIN, M., GREGSON, J., AND HEIDRICH, W. 2013. Low-budget transient imaging using photonic mixer devices. ACM Trans. Graph. 32, 4.
HUANG, D., SWANSON, E., LIN, C., SCHUMAN, J., STINSON, W., CHANG, W., HEE, M., FLOTTE, T., GREGORY, K., AND PULIAFITO, C. 1991. Optical coherence tomography. Science 254, 5035, 1178-1181.
ITATANI, J., QUÉRÉ, F., YUDIN, G. L., IVANOV, M. Y., KRAUSZ, F., AND CORKUM, P. B. 2002. Attosecond streak camera. Phys. Rev. Lett. 88, 173903.
JARABO, A., MASIA, B., AND GUTIERREZ, D. 2013. Transient rendering and relativistic visualization. Tech. Rep. TR-01-2013, Universidad de Zaragoza, April.
