Paris Master Class 2011 - 07 Dynamic Global Illumination
Dynamic Global Illumination
Wolfgang Engel, Confetti Special Effects Inc., Carlsbad
Paris Master Class
Agenda
• Requirement for Real-Time GI
• Ambient Cubes
• Diffuse Cube Mapping
• Screen-Space Ambient Occlusion
• Screen-Space Global Illumination
• Reflective Shadow Maps
• Splatting Indirect Illumination (SII)
Requirement for Real-Time GI
• Pre-calculated lighting disadvantages
– hard to mimic a 24-hour cycle
– storing light or radiosity maps on disk or even the DVD / Blu-ray requires a lot of memory
– streaming the light or radiosity maps from disk or hard drive through hardware to the GPU consumes valuable memory bandwidth
– geometry with light maps or radiosity maps is no longer destructible (this is a reason to avoid any solution with pre-computed maps)
– while the environment is lit nicely, it is hard to light characters in a way consistent with this environment
Requirement for Real-Time GI
• Next-gen game requirements:
– as little as possible pre-calculated
– as little memory consumption as possible
-> as much as possible is calculated on the fly
– 24-hour sun movement
– destructible geometry
Ambient Cube
• Ambient Cube (as used in Half-Life 2) (http://www2.ati.com/developer/gdc/D3DTutorial10_Half-Life2_Shading.pdf)
-> interpolate between six color values
• Six colors are stored spatially in the level data:
– they represent the ambient light flowing through that volume in space
– the application looks this up for each model to determine the six ambient colors to use for a given model
• HL2: local lights that aren’t important enough to go directly into the vertex shader are also added to the ambient cube
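The six-color interpolation can be sketched in a few lines. This is a minimal Python sketch of the lookup, assuming the face ordering +X, -X, +Y, -Y, +Z, -Z and the squared-normal blend from the Half-Life 2 presentation linked above:

```python
import numpy as np

def eval_ambient_cube(cube, n):
    """cube: six RGB colors ordered +X, -X, +Y, -Y, +Z, -Z; n: unit normal."""
    n2 = n * n                    # squared components sum to 1 for a unit normal
    neg = (n < 0.0).astype(int)   # 0 -> positive face, 1 -> negative face per axis
    return (n2[0] * cube[0 + neg[0]] +
            n2[1] * cube[2 + neg[1]] +
            n2[2] * cube[4 + neg[2]])
```

A normal pointing straight up returns the +Y color unchanged; a tilted normal blends the three faces it points toward.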
Diffuse Cube Mapping
• [Brennan]
• Steps:
– render the environment into a cube map, or use a static map
– blur it
– fetch it by indexing with the vertex normal
• The vertex normal is used to index the cube map with the intention that it returns the total amount of light scattered from this direction
Diffuse Cube Mapping
[Figure: the same cube map, original (left) and blurred (right)]
Screen-Space Ambient Occlusion
• See [Kajalin] for the original approach
• Requires only the depth buffer in the original approach
• Uses a sphere-like sampling kernel
Screen-Space Ambient Occlusion
• Computes the amount of solid geometry around a point
• The ratio between solid and empty space approximates the amount of occlusion
[Figure: three sample points with spherical kernels]
P1 is located on a flat plane -> 0.5
P2 is located in a corner -> 0.75
P3 is located on an edge -> 0.25
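The ratios above can be reproduced with a brute-force Monte Carlo estimate. This Python sketch counts the solid fraction of a sampling sphere; the scene predicates (half-space, corner) are illustrative assumptions, not from the slides:

```python
import random

def occlusion_ratio(is_solid, center, radius, n=2000, seed=1):
    """Fraction of the sampling sphere around 'center' that lies in solid geometry."""
    rng = random.Random(seed)
    solid = 0
    for _ in range(n):
        # uniform point inside the unit sphere via rejection sampling
        while True:
            o = (rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1))
            if o[0] * o[0] + o[1] * o[1] + o[2] * o[2] <= 1.0:
                break
        p = tuple(center[i] + radius * o[i] for i in range(3))
        solid += is_solid(p)
    return solid / n

# P1 on a flat plane: everything below z = 0 is solid -> ratio near 0.5
flat = occlusion_ratio(lambda p: p[2] < 0, (0.0, 0.0, 0.0), 1.0)
# P2 in a concave corner: solid below z = 0 or behind x = 0 -> ratio near 0.75
corner = occlusion_ratio(lambda p: p[2] < 0 or p[0] < 0, (0.0, 0.0, 0.0), 1.0)
```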
Screen-Space Ambient Occlusion
• A very simplified version of the source code comes down to:

// very simplified version …
float fPixelDepth = Depth.Sample( DepthSampler, IN.tex.xy ).r;
float fOcclusion = 0.0f;
for ( int i = 0; i < 8; i++ )
{
    // vRotatedKernel[ i ] = per-pixel randomly rotated kernel offset
    float fSampleDepth = Depth.Sample( DepthSampler, IN.tex.xy + vRotatedKernel[ i ].xy ).r;
    float fDelta = max( fSampleDepth - fPixelDepth, 0 );
    fOcclusion += fDelta;
}
Screen-Space Ambient Occlusion
• Designing an SSAO kernel
– distance attenuation -> not necessary if the density of the samples is higher closer to the center of the kernel
– noise is necessary because even 16 taps spread out are not enough
– running on a quarter-sized render target makes large kernels like 16 taps possible
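One way to get the "denser toward the center" property mentioned above is to scale random unit directions by a curve that grows with the sample index. A Python sketch; the 0.1 minimum offset and the quadratic falloff are arbitrary illustrative choices, not from the slides:

```python
import math, random

def make_ssao_kernel(n=16, seed=2):
    rng = random.Random(seed)
    kernel = []
    for i in range(n):
        # random direction on the unit sphere via rejection sampling
        while True:
            v = (rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1))
            l = math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2])
            if 0.0 < l <= 1.0:
                break
        d = (v[0] / l, v[1] / l, v[2] / l)
        # quadratic scale packs more samples near the kernel center,
        # which stands in for explicit distance attenuation
        scale = 0.1 + 0.9 * (i / n) ** 2
        kernel.append((d[0] * scale, d[1] * scale, d[2] * scale))
    return kernel
```

In a shader this kernel would additionally be rotated per pixel by a noise texture, which is the noise requirement the slide mentions.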
Screen-Space Ambient Occlusion
Screen-Space Global Illumination
• Every pixel in the G-Buffer is considered a secondary light source -> point lights
• The normal is considered the light direction
• Radiant intensity emitted at point p in direction ω:
Ip(ω) = Φp · max{0, 〈np | ω〉}
Φp – the flux, defines the brightness of the pixel light
np – the normal at point p, defines the spatial emission
Screen-Space Global Illumination
• Irradiance at a surface point x with normal n due to pixel light p is thus
Ep(x, n) = Φp · max{0, 〈np | x − xp〉} · max{0, 〈n | xp − x〉} / ‖x − xp‖⁴
Φp – the flux, defines the brightness of the pixel light
np – the normal at point p, defines the spatial emission
• Evaluating indirect irradiance at a surface point x with normal n:
E(x, n) = Σp Ep(x, n) (sum over all pixel lights)
[Figure: PSampling with NormalSampling, PTarget with NormalTarget, connected by vector V]
Screen-Space Global Illumination
• PSampling contribution to PTarget (outgoing cosine): ISampling = NSampling · -V
• PTarget contribution (incoming cosine): ITarget = NTarget · V
• Attenuation: A = 1.0 / D²
• Result = ISampling * ITarget * A * ColorScreenSpaceRT
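Putting the four bullets together, a minimal Python sketch of one sample's contribution; vector names follow the slide, with V taken as the unit vector from the target pixel to the sampling pixel:

```python
import math

def ssgi_sample(p_target, n_target, p_sampling, n_sampling, color):
    d = [s - t for s, t in zip(p_sampling, p_target)]
    d2 = sum(c * c for c in d)               # squared distance D^2
    v = [c / math.sqrt(d2) for c in d]       # V: target -> sampling
    # outgoing cosine at the pixel light: NSampling . -V
    i_sampling = max(0.0, -sum(a * b for a, b in zip(n_sampling, v)))
    # incoming cosine at the target: NTarget . V
    i_target = max(0.0, sum(a * b for a, b in zip(n_target, v)))
    a = 1.0 / d2                             # attenuation A = 1 / D^2
    return [i_sampling * i_target * a * c for c in color]
```

Two surfaces facing each other one unit apart transfer the sampled color with both cosines at 1 and no attenuation.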
Screen-Space Global Illumination
• Issues
– all the issues from SSAO
– does not consider occlusion for the indirect light sources
-> color changes based on the viewing angle
-> different with reflective shadow maps …
Reflective Shadow Maps
• Extended shadow map [Dachsbacher] – a Multiple-Render Target (MRT) that holds
– depth buffer
– normal
– world-space position
– flux
• Each pixel in the MRT is considered a secondary light source, treated as a local point light -> called a pixel light
• Radiant intensity emitted at point p in direction ω:
Ip(ω) = Φp · max{0, 〈np | ω〉}
Φp – the flux, defines the brightness of the pixel light
np – the normal at point p, defines the spatial emission
• Irradiance at a surface point x with normal n due to pixel light p is thus
Ep(x, n) = Φp · max{0, 〈np | x − xp〉} · max{0, 〈n | xp − x〉} / ‖x − xp‖⁴
• Evaluating indirect irradiance at a surface point x with normal n:
E(x, n) = Σp Ep(x, n) (sum over all pixel lights)
• Does not consider occlusion for the indirect light sources -> color changes based on viewing angle
Reflective Shadow Maps
Reflective Shadow Maps
• Challenge: a typical RSM is 512x512 -> evaluating all of those pixel lights is too expensive
• Importance sampling -> evaluate about 400 samples
How to pick the sampling points:
1. Uniformly distributed random numbers are applied to polar coordinates
2. Then the varying density is compensated for by weighting the resulting samples
ξ1 – uniformly distributed random number
ξ2 – uniformly distributed random number
• Precompute sampling pattern and re-use it for all indirect light computations
• Still too expensive -> Screen-Space interpolation
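The two steps above can be sketched directly. This Python sketch follows the pattern in [Dachsbacher]: the offset is (ξ1 sin 2πξ2, ξ1 cos 2πξ2) with weight ξ1², generated once and re-used for all indirect light computations:

```python
import math, random

def rsm_sampling_pattern(n=400, seed=3):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        xi1, xi2 = rng.random(), rng.random()
        # step 1: uniform random numbers mapped through polar coordinates
        x = xi1 * math.sin(2.0 * math.pi * xi2)
        y = xi1 * math.cos(2.0 * math.pi * xi2)
        # step 2: the weight compensates for the higher sample density
        # near the center of the disk
        w = xi1 * xi1
        samples.append((x, y, w))
    return samples
```

Because ξ1 is the radius, the squared-radius weight exactly cancels the 1/r density of polar sampling.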
Reflective Shadow Maps
• Screen-Space Interpolation
1. pass – compute the indirect illumination for a low-resolution image in screen space
2. pass – render full-res + check if we can interpolate from the four surrounding low-res samples
• A low-res sample is regarded as suitable for interpolation if
– the sample’s normal is similar to the pixel’s normal,
– and if its world-space location is close to the pixel’s location
• Each sample’s contribution is weighted by the factors used for bilinear interpolation
– if not all four samples are used -> normalization
– if three or four samples are considered suitable -> interpolate between those
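A Python sketch of the suitability test and the normalization step; the thresholds and the four-tuple sample layout are illustrative assumptions, not from the slides:

```python
def interpolate_indirect(pixel_normal, pixel_pos, low_res,
                         cos_thresh=0.9, dist_thresh=0.5):
    """low_res: four (normal, position, color, bilinear_weight) tuples."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def dist2(a, b): return sum((x - y) ** 2 for x, y in zip(a, b))
    # keep only samples with a similar normal and a nearby world position
    used = [(color, w) for normal, pos, color, w in low_res
            if dot(normal, pixel_normal) > cos_thresh
            and dist2(pos, pixel_pos) < dist_thresh ** 2]
    if len(used) < 3:
        return None  # not enough suitable samples -> evaluate at full resolution
    total = sum(w for _, w in used)  # renormalize when a sample was rejected
    return [sum(w * c[i] for c, w in used) / total for i in range(3)]
```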
Reflective Shadow Maps
• How to pick important pixel lights
1. Not visible in shadow map: x is not directly illuminated -> not visible in shadow map
2. Normals point away: x-1 and x-2 are relatively close, but since their normals point away from x they do not contribute
3. Same floor: x1 is very close but lies on the same floor -> won’t contribute
4. 1–3 not applicable: x2 will contribute
5. What is important: the distance between x and a pixel light xp in the shadow map is a reasonable approximation of their distance in world space
-> if the depth values with respect to the light source differ significantly, the world-space distance is much bigger and we thus overestimate the influence
-> important indirect lights will always be close, and these must also be close in the shadow map
Splatting Indirect Illumination
• Built on RSM
• Instead of iterating over all image pixels and gathering indirect light from the RSM:
-> select a subset of RSM pixels == pixel lights and distribute their light to the image pixels, using a splat in screen space
-> the deferred lighting idea re-visited
Splatting Indirect Illumination
• Stores in the Reflective Shadow Map:
– flux – N.L * color
– normal – just the world-space normal
– position – offset in the negative direction of the normal
– gloss value – a gloss value instead of a specular component
Splatting Indirect Illumination
• How to create a Reflective Shadow Map:

RESULT_RSM psCreateRSM( FRAGMENT_RSM fragment )
{
    RESULT_RSM result;

    float3 worldSpacePos = fragment.wsPos.xyz;
    float3 lightVector = normalize( lightPosition - worldSpacePos );
    float3 normal = normalize( fragment.normal );

    // compute flux
    float4 flux = materialColor * saturate( dot( normal.xyz, lightVector ) );

    // we squeeze everything into two float quadruples, as vertex texture fetches are expensive!
    float squeezedFlux = encodeFlux( flux.xyz );

    float phongExponent = 1.0f;
    float3 mainLightDirection = normal;

    if ( dot( mainLightDirection, lightVector ) < 0.0f )
        normal *= -1.0f;

    result.color[ 0 ] = float4( mainLightDirection, squeezedFlux );

    // we move the position back a little in the direction of the normal;
    // this is done to avoid rendering problems along common boundaries of
    // walls -> the illumination integral has a singularity there
    result.color[ 1 ] = float4( worldSpacePos - normal * 0.2f, phongExponent );

    return result;
}
Splatting Indirect Illumination
• How do we fetch the RSM?
• Splat position: pre-calculated distribution of sample points in light space
-> 3D Poisson disk distribution
[Figure: left – sample points in light space / right – splats in the main view]
Splatting Indirect Illumination
Pseudo code
1. Fetch the pre-calculated distribution of sample points in light space from a texture
2. Calculate the size of the ellipsoid bounding volume in screen space (see the article for code)
Splatting Indirect Illumination
3. Reduce the size of the splat based on
– the size of the bounding volume
– the distance camera <-> pixel light
Splatting Indirect Illumination
• Going back to the original Reflective Shadow Map algorithm:
• Irradiance at a surface point x with normal n due to pixel light p is thus
Ep(x, n) = Φp · max{0, 〈np | x − xp〉} · max{0, 〈n | xp − x〉} / ‖x − xp‖⁴
Φp – the flux, defines the brightness of the pixel light
np – the normal at point p, defines the spatial emission
• Evaluating indirect irradiance at a surface point x with normal n:
E(x, n) = Σp Ep(x, n) (sum over all pixel lights)
[Figure: PSampling with NormalSampling, PTarget with NormalTarget, connected by vector V]
Splatting Indirect Illumination
• PSampling contribution to PTarget (outgoing cosine): ISampling = NSampling · -V
• PTarget contribution (incoming cosine): ITarget = NTarget · V
• Attenuation: A = 1.0 / (D² * Scale + Bias)
• Result = ISampling * ITarget * A * ColorScreenSpaceRT
Splatting Indirect Illumination
• Glossiness based on the outgoing cosine
float Gloss = pow( saturate( dot( NSampling, -V ) ), phongExp );
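As a plain-math sketch in Python, taking V as the unit vector from the target pixel to the pixel light, so NSampling · -V is the outgoing cosine from the slide:

```python
import math

def gloss_term(n_sampling, v, phong_exp):
    # outgoing cosine: NSampling . -V, clamped to zero like saturate()
    cos_out = max(0.0, -sum(a * b for a, b in zip(n_sampling, v)))
    return cos_out ** phong_exp  # higher exponents tighten the glossy lobe
```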
Splatting Indirect Illumination
• Let’s single-step through the splatting pixel shader

float4 psSI( FRAGMENT_SI fragment ) : COLOR
{
    float4 result;

    // get surface info for this pixel
    float4 wsPos = tex2Dproj( DSSample0, fragment.pos2D );
    float4 normal = tex2Dproj( DSSample1, fragment.pos2D ) * 2.0f - 1.0f;

    // decode world-space position (8-bit render target!)
    wsPos = ( wsPos - 0.5f ) * WSSIZE;

    // compute lighting
    float l2, lR, cosThetaI, cosThetaJ, Fij, phongExp;

    // lightPos comes from the RSM read in the vertex shader
    // lightPos – pixel light, wsPos – fragment position
    float3 R = fragment.lightPos - wsPos.xyz; // R = vector from fragment to pixel light

    l2 = dot( R, R );   // squared length of R (needed again later)
    R *= rsqrt( l2 );   // normalize R
    // distance attenuation (there is a global scene scaling factor "...WS_SIZE...")
    lR = 1.0f / ( distBias + l2 * INV_WS_SIZE2_PI * 2 );

    // lightDir comes from the RSM read in the vertex shader;
    // this is the world-space normal stored in the RSM of the pixel light
    // == direction of the pixel light
    cosThetaI = saturate( dot( fragment.lightDir, -R ) ); // outgoing cosine

    phongExp = fragment.lightDir.w;

    // with a Phong-like energy attenuation and widening of the high-intensity region
    if ( phongExp > 1.0f )
        cosThetaI = pow( cosThetaI, phongExp * l2 );

    // compare the world-space normal at the fragment that gets rendered with R
    cosThetaJ = saturate( dot( normal.xyz, R ) ); // incoming cosine
    Fij = cosThetaI * cosThetaJ * lR;             // putting everything together

#ifdef SMOOTH_FADEOUT
    // screen-space position of the center of the splat
    float3 t1 = fragment.center2D.xyz / fragment.center2D.w;
    // screen-space position of the pixel we are currently shading
    float3 t2 = fragment.pos2D.xyz / fragment.pos2D.w;
    // lightFlux.w holds a fade-out value based on the distance between the camera
    // and the pixel light position; xy is based on screen space and z is based on
    // the distance between the camera and the pixel position
    float fadeOutFactor = saturate( 2 - 6.667 * length( t1.xy - t2.xy ) / fragment.lightFlux.w );
    Fij *= fadeOutFactor;
#endif

    result = fragment.lightFlux * Fij; // transfer energy!
    return result;
}
[Figure: green arrow – R, orange arrow – fragment normal, blue arrow – pixel light direction]
Splatting Indirect Illumination
• SII importance sampling
-> doesn’t seem to work on modern hardware
-> ignore it for now
Conclusion
• Screen-Space Global Illumination
– creates the impression of color bleeding with the lowest possible amount of memory
– does not require any transform
– “fits” into a PostFX pipeline
– has all the issues of SSAO + false colors
• Reflective Shadow Maps + SII
– higher quality than SSGI
– “fits” into a Deferred Lighting engine and any Cascaded Shadow Map approach
– requires one or more additional G-Buffers from the point of view of the light
– more expensive
References
• [Brennan] Chris Brennan, “Diffuse Cube Mapping”, ShaderX, pp. 287–289
• [Dachsbacher] Carsten Dachsbacher, Marc Stamminger, “Reflective Shadow Maps”, http://www.vis.uni-stuttgart.de/~dachsbcn/download/rsm.pdf
• [DachsbacherSii] Carsten Dachsbacher, Marc Stamminger, “Splatting Indirect Illumination”, http://www.vis.uni-stuttgart.de/~dachsbcn/download/sii.pdf
• [Kajalin] Vladimir Kajalin, “Screen-Space Ambient Occlusion”, ShaderX7, pp. 413–424
• [Loos] Bradford James Loos, Peter-Pike Sloan, “Volumetric Obscurance”, http://www.cs.utah.edu/~loos/publications/vo/vo.pdf
• [Nichols] Greg Nichols, Chris Wyman, “Multiresolution Splatting for Indirect Illumination”, http://www.cs.uiowa.edu/~cwyman/publications/files/techreports/UICS-TR-08-04.pdf
• [Ritschel] Tobias Ritschel, “Imperfect Shadow Maps for Efficient Computation of Indirect Illumination”, http://www.uni-koblenz.de/~ritschel/
• [Zink] Jason Zink, “Screen Space Ambient Occlusion”, http://wiki.gamedev.net/index.php/D3DBook:Screen_Space_Ambient_Occlusion