Fast Depth-of-Field Rendering with Surface Splatting
Fast Depth-of-Field Rendering with Surface Splatting
Jaroslav Křivánek, CTU Prague / IRISA, INRIA Rennes
Jiří Žára, CTU Prague
Kadi Bouatouch, IRISA, INRIA Rennes
Computer Graphics Group
Goal
Depth-of-field rendering with point-based objects
Why point-based? Efficient for complex objects
Why depth-of-field? Natural-looking images
Overview
Introduction: point-based rendering, depth-of-field
Depth-of-field techniques
Our contribution: point-based depth-of-field rendering
Basic approach
Extended method: depth-of-field with level of detail
Results
Discussion
Conclusions
Point-based rendering
Object represented by points without connectivity
Point (surfel): position, normal, radius, material
Rendering = screen-space surface reconstruction
Efficient for very complex objects
Depth-of-Field
More natural-looking images
Important depth cue for perceiving the scene configuration
Draws attention to the focused objects
Thin Lens Camera Model
[Diagram: image plane, focal plane, lens with aperture F/n, viewpoint VP, scene point P]
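For completeness, the circle-of-confusion size implied by this thin-lens geometry (standard optics, not spelled out on the slide; f is the focal length, n the f-number so the aperture diameter is F/n, z_f the focal-plane distance, z the distance of the point P, both measured from the lens) is:

```latex
% Circle-of-confusion diameter for a thin lens (standard relation, assumed notation):
% aperture diameter A = f/n, focal-plane distance z_f, point distance z.
C(z) \;=\; \frac{f}{n}\,\cdot\,\frac{f}{z_f - f}\,\cdot\,\frac{|z - z_f|}{z}
```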
Depth-of-Field Techniques in CG
Supersampling
Distributed ray tracing [Cook et al. 1984]: sample the light paths through the lens
Multisampling [Haeberli & Akeley 1990]: render several images from different viewpoints on the lens and average them using the accumulation buffer (see the sketch below)
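A minimal sketch of the accumulation-buffer idea, assuming a hypothetical render_from(offset, focal_dist) callback that renders the scene from a lens position shifted by offset while keeping the focal plane fixed (none of these names come from the paper):

```python
import numpy as np

def multisample_dof(render_from, lens_radius, focal_dist, n_samples=32):
    """Accumulation-buffer DOF in the spirit of [Haeberli & Akeley 1990].

    render_from(offset, focal_dist) is an assumed callback that renders the
    scene from the lens position shifted by `offset`, sheared so the focal
    plane at `focal_dist` stays fixed, and returns an HxWx3 float image.
    """
    accum = None
    for _ in range(n_samples):
        # Uniform sample on the lens disk (a simple polar mapping for brevity).
        r = lens_radius * np.sqrt(np.random.rand())
        phi = 2.0 * np.pi * np.random.rand()
        offset = (r * np.cos(phi), r * np.sin(phi))
        img = render_from(offset, focal_dist)
        accum = img if accum is None else accum + img
    return accum / n_samples
```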
Depth-of-Field Techniques in CG (cont.)
Post-filtering [Potmesil & Chakravarty 1981]: out-of-focus pixels are spread into their circle of confusion (CoC)
Problems: intensity leakage, hypo-intensities, slow for larger kernels
[Pipeline: image synthesizer -> focus processor (filtering)]
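For illustration, a naive gather-style post-filter in this spirit; the kernel size comes from the center pixel's CoC and depth is ignored inside the kernel, which is exactly why intensity leaks across focus boundaries. The coc_of_depth mapping is an assumed helper, not from the paper:

```python
import numpy as np

def postfilter_dof(color, depth, coc_of_depth):
    """Naive post-filter DOF sketch (not the authors' method).

    color: HxWx3 image, depth: HxW depth buffer,
    coc_of_depth: function mapping a depth value to a CoC radius in pixels.
    Gathers with a box kernel sized by the *center* pixel's CoC; because
    depth is ignored inside the kernel, colors leak across depth edges.
    """
    h, w, _ = color.shape
    out = np.empty_like(color)
    for y in range(h):
        for x in range(w):
            r = int(np.ceil(coc_of_depth(depth[y, x])))
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = color[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
    return out
```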
Point-based rendering: splatting
Draw each point as a fuzzy splat (an ellipse)
Image = Σ_i SPLAT_i
Our Basic Approach
Post-filtering: [Pipeline: image + depth -> focus processor (filtering) -> image with DOF]
Our approach: apply the focus filter to each splat separately, using that splat's depth, before compositing (spelled out below)
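Reconstructing the formula from the slide, with the notation used above (SPLAT_i is splat i's screen-space footprint, z_i its depth, C_{z_i} the CoC filter for that depth), the difference between classical post-filtering and the per-splat approach can be written as:

```latex
% Classical post-filtering: blur the finished image, guided by the depth buffer.
I_{\mathrm{DOF}} \;\approx\; \mathrm{DOF}\!\Big[\textstyle\sum_i \mathrm{SPLAT}_i,\ \mathrm{depth}\Big]
% Per-splat blurring (basic approach): blur each splat with its own
% depth-dependent CoC filter, then composite, so visibility is resolved
% after blurring.
I_{\mathrm{DOF}} \;=\; \textstyle\sum_i \big(\mathrm{SPLAT}_i \otimes C_{z_i}\big)
```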
Our Basic Approach
Properties of our basic approach
Pros:
+ Avoids leakage: reconstruction takes the splat depth into account
+ No hypo-intensities: visibility is resolved after blurring
+ Handles transparency, in the same way as the EWA splatting A-buffer
Cons:
- Very slow, especially for large apertures: many large overlapping splats
- High number of fragments, e.g. Lion: no blur 2.3 M fragments; blur 90.2 M (40x more)
Our Extended Method
Use Level of Detail (LOD) to attack the complexity: more blur = less detail needed
Select a lower (coarser) LOD for blurred parts
The number of fragments grows much more slowly, e.g. Lion: no blur 2.3 M fragments; blur 5.3 M (2.3x more)
[Images: blurred image, selected LOD visualization]
Observation
Selecting a lower LOD for rendering is equivalent to 1) selecting the fine LOD and 2) low-pass filtering in screen space
Use LOD as a means for blurring, not only to reduce complexity
[Images: fine LOD vs. lower LOD]
Effect of LOD Selection
How do we quantify the effect of LOD selection in terms of blur in the resulting image?
We use a bounding sphere hierarchy: QSplat [Rusinkiewicz & Levoy 2000]
Bounding Sphere Hierarchy
The finest level: L = 0; lower levels: L = 1, ...
Building the hierarchy levels = low-pass filtering + subsampling (a sketch follows)
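A rough sketch of how one hierarchy level could be built from the finer one, QSplat-style: cluster nearby surfels, average their attributes (low-pass filtering), and keep one representative per cluster (subsampling), with a radius that bounds the children. The Surfel structure and the clustering helper are illustrative assumptions, not the authors' data structure:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Surfel:
    position: np.ndarray  # (3,)
    normal: np.ndarray    # (3,)
    color: np.ndarray     # (3,)
    radius: float

def build_coarser_level(surfels, cluster):
    """Build level L+1 from level L.

    `cluster(surfels)` is an assumed helper returning a list of surfel groups
    (e.g. from an octree cell partition). Each group is replaced by one parent
    surfel: attributes are averaged (low-pass filtering) and the radius is
    chosen so the parent sphere bounds all child spheres.
    """
    coarser = []
    for group in cluster(surfels):
        center = np.mean([s.position for s in group], axis=0)
        normal = np.mean([s.normal for s in group], axis=0)
        normal /= np.linalg.norm(normal)
        color = np.mean([s.color for s in group], axis=0)
        radius = max(np.linalg.norm(s.position - center) + s.radius for s in group)
        coarser.append(Surfel(center, normal, color, radius))
    return coarser
```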
LOD Filter in Screen Space
G_{Q_L} is defined in local coordinates in object space
G_{Q_L} is related to screen space by the local affine approximation J of the object-to-screen transform
Selecting level L = filtering in screen space by G_{J Q_L J^T}
[Diagram: G_{Q_L} in object space maps to G_{J Q_L J^T} in screen space]
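The two standard Gaussian identities this relies on (not derived on the slide): an affine map x = Ju turns a Gaussian with variance matrix Q in object space into one with variance J Q J^T in screen space, and convolving two Gaussians adds their variance matrices:

```latex
% Affine image of a Gaussian (up to normalization) and convolution of Gaussians:
\mathcal{G}_{Q}\!\big(J^{-1}\mathbf{x}\big) \;\propto\; \mathcal{G}_{J Q J^{T}}(\mathbf{x}),
\qquad
\mathcal{G}_{Q_1} \otimes \mathcal{G}_{Q_2} \;=\; \mathcal{G}_{Q_1 + Q_2}
```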
DOF with LOD: Algorithm
Given the required screen-space filter G_{Q_DOF}:
1) Select the LOD level L such that support(G_{J Q_L J^T}) < support(r ⊗ G_{Q_DOF})
2) Apply an additional screen-space filter G_{Q_diff} to obtain G_{Q_DOF}
Resulting resampling kernel: r_DOF = [r ⊗ G_{J Q_L J^T}] ⊗ G_{Q_diff}
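A minimal sketch of the selection step under the identities above, with 2x2 variance matrices as numpy arrays. It replaces the slide's support-based test with an equivalent positive-semidefiniteness test on Q_DOF - J Q_L J^T (so that the leftover filter G_{Q_diff} is a valid Gaussian); all names are illustrative, not from the paper:

```python
import numpy as np

def select_lod_and_diff_filter(J, Q_levels, Q_dof):
    """Pick the coarsest admissible LOD level and the extra screen-space filter.

    J        : 2x2 local affine approximation of the object-to-screen transform.
    Q_levels : per-level 2x2 variance matrices Q_L of the object-space
               low-pass filters, ordered fine -> coarse.
    Q_dof    : 2x2 variance matrix of the required screen-space DOF filter.

    Since G_{Q1} convolved with G_{Q2} equals G_{Q1+Q2}, the missing blur is a
    Gaussian with variance Q_diff = Q_dof - J Q_L J^T; a level is admissible
    as long as Q_diff stays positive semi-definite.
    """
    # Fall back to the finest level if even that one already overshoots Q_dof.
    best_level = 0
    best_diff = Q_dof - J @ Q_levels[0] @ J.T
    for level, Q_L in enumerate(Q_levels):
        Q_diff = Q_dof - J @ Q_L @ J.T
        if np.all(np.linalg.eigvalsh(Q_diff) >= 0.0):
            best_level, best_diff = level, Q_diff
        else:
            break  # coarser levels only blur more, so stop at the first failure
    return best_level, best_diff
```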
Results
No depth of field: everything in focus
Results
Transparent mask in focus, male figure out of focus
Results
Male figure in focus, transparent mask out of focus
Results
Our algorithm vs. reference solution (multisampling): our blur looks too smooth because of the Gaussian filter
Results
Our algorithm vs. reference solution (multisampling): artifacts due to incorrect surface reconstruction
Discussion
Simplifying assumptions and limitations:
Gaussian distribution of light within the CoC (mostly OK)
We blur the texture before lighting; we should blur after lighting
Possibly incorrect image reconstruction from blurred splats
Conclusion
A novel algorithm for depth-of-field rendering: LOD as a means for depth blurring
+ Handles transparency
+ Avoids intensity leakage
+ Running time independent of the amount of DOF blur
- Only for point-based rendering
- A number of artifacts can appear
Ideal tool for interactive DOF previewing: trial-and-error camera parameter settings
Acknowledgement: Grant 2159/2002 MSMT Czech Republic