CS361 Week 8 - Friday

Last time

What did we talk about last time?
  Radiometry
  Photometry
  Colorimetry
  Lighting with shader code
    Ambient
    Directional (diffuse and specular)
    Point


Questions?


Project 3


Implementing Point Lights


Back to specular lighting

Adding a specular component to the diffuse shader requires incorporating the view vector

The camera position will be declared in the shader file and set as a parameter from the C# code, as sketched below
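As a rough sketch (mine, not from the slides), pushing the camera position into the shader might look like this, assuming the same effect.Parameters[...] API that the texture example later in these notes uses; cameraPosition is a hypothetical Vector3 field:

    // Hypothetical sketch: set the camera position each frame so the pixel
    // shader can build the view vector. The Shininess, SpecularColor, and
    // SpecularIntensity values on the next slide would be set the same way.
    effect.Parameters["Camera"].SetValue(cameraPosition);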


Specular light declarations

The camera location is added to the declarations, as are the specular colors and a shininess parameter

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 Camera;

static const float PI = 3.14159265f;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

float3 DiffuseLightDirection = float3(1, 1, 0);
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

float Shininess;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5;


Specular light structures

The output adds a normal so that the reflection vector can be computed in the pixel shader

A world position lets us compute the view vector to the camera

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR;
    float3 Normal : NORMAL;
    float4 WorldPosition : POSITIONT;
};


Specular vertex shader

The same computations as the diffuse shader, but we store the normal and the transformed world position in the output

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    output.Normal = normal;
    return output;
}


Specular pixel shader

Here we finally have a real computation because we need to use the pixel normal (interpolated from the vertex normals) in combination with the view vector

The technique is the same:

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    // Reflect the light direction about the interpolated normal
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    float3 view = normalize((float3)input.WorldPosition - Camera);
    float dotProduct = dot(reflect, view);
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * max(pow(dotProduct, Shininess), 0) * length(input.Color);

    return saturate(input.Color + AmbientColor * AmbientIntensity + specular);
}
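Reading the pixel shader above as math (notation mine: m is Shininess, l the light direction, n the normal, and v the view vector), the specular term it computes is

\mathbf{r} = 2(\mathbf{l} \cdot \mathbf{n})\,\mathbf{n} - \mathbf{l}, \qquad
\text{specular} = \frac{m + 8}{8\pi}\; s_{\text{int}}\; \mathbf{s}_{\text{col}}\; \max\!\big((\mathbf{r} \cdot \mathbf{v})^{m},\, 0\big)\, \lVert \text{input.Color} \rVert

where the final factor, the length of the interpolated diffuse color, scales the highlight by how strongly lit the fragment already is.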


Point lights in SharpDX

Point lights model omni lights at a specific position
They generally attenuate (get dimmer) over a distance and have a maximum range
DirectX has a constant attenuation, a linear attenuation, and a quadratic attenuation
You can choose attenuation levels through shaders (the formulas appear after this list)

They are more computationally expensive than directional lights because a light vector has to be computed for every pixel

It is possible to implement point lights in a deferred shader, lighting only those pixels that actually get used
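As a side note (standard formulas, not spelled out in the slides): the classic fixed-function DirectX attenuation for a light at distance d uses constant, linear, and quadratic coefficients, while the shader on the following slides uses a simpler squared falloff out to LightRadius r:

\text{atten}_{\text{D3D}}(d) = \frac{1}{k_c + k_l\, d + k_q\, d^2}, \qquad
\text{atten}_{\text{shader}}(d) = \left(1 - \operatorname{saturate}\!\left(\frac{d}{r}\right)\right)^{2}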


Point light declarations

We add a light position:

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 LightPosition;
float3 Camera;

static const float PI = 3.14159265f;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1f;
float LightRadius = 50;

float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

float Shininess;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5f;


Point light structures

We no longer need a color in the output
We do need the vector from the fragment's location to the camera, so we keep the world position at that fragment

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 WorldPosition : POSITIONT;
    float3 Normal : NORMAL;
};


Point light vertex shader

We compute the normal and the world position

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    output.Normal = normal;
    return output;
}


Point light pixel shader

Lots of junk in here:

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 normal = normalize(input.Normal);
    float3 lightDirection = LightPosition - (float3)input.WorldPosition;
    // Attenuation: squared falloff out to LightRadius
    float intensity = pow(1.0f - saturate(length(lightDirection) / LightRadius), 2);
    lightDirection = normalize(lightDirection); // normalize after measuring the distance
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float diffuseColor = dot(normal, lightDirection); // the N dot L factor
    float3 reflect = normalize(2 * diffuseColor * normal - lightDirection);
    float dotProduct = dot(reflect, view);
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * max(pow(dotProduct, Shininess), 0) * length(diffuseColor);

    // Attenuated diffuse term plus ambient and specular
    return saturate(DiffuseColor * DiffuseIntensity * diffuseColor * intensity +
        AmbientColor * AmbientIntensity + specular);
}
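A hypothetical sketch of the matching C# side (again assuming the effect.Parameters[...] API used later for the texture; lightPosition is a made-up Vector3 field):

    // Hypothetical sketch: point-light parameters set from the C# side.
    effect.Parameters["LightPosition"].SetValue(lightPosition); // world-space light position
    effect.Parameters["LightRadius"].SetValue(50.0f);           // matches the default in the declarations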


Student Lecture: BRDFs


BRDFs


BRDF theory

The bidirectional reflectance distribution function is a function that describes the ratio of outgoing radiance to incoming irradiance

This function changes based on:
  Wavelength
  Angle of the light to the surface
  Angle of the viewer from the surface

For point or directional lights, we do not need differentials and can write the BRDF:

L_o(\mathbf{v}) = f(\mathbf{l}, \mathbf{v}) \otimes E_L \cos\theta_i

(here \otimes is component-wise multiplication of the color values)


How is this different?

We've been talking about lighting models: Lambertian, specular, etc.

A BRDF is an attempt to model physics slightly better

A big difference is that different wavelengths are absorbed and reflected differently by different materials

Rendering models in real time with (more) accurate BRDFs is still an open research problem


Spheres with different BRDFs

They also have global illumination effects (shadows and reflections)
Image taken from www.kevinbeason.com


Revenge of the BRDF

The BRDF is supposed to account for all the light interactions we discussed in Chapter 5 (reflection and refraction)

We can see the similarity to the lighting equation from Chapter 5, now with a BRDF:

L_o(\mathbf{v}) = \sum_{k=1}^{n} f(\mathbf{l}_k, \mathbf{v}) \otimes E_{L_k} \cos\theta_{i_k}


When a BRDF isn't enough…

If the subsurface scattering effects are significant, the size of the pixel may matter

Then, a bidirectional surface scattering reflectance distribution function (BSSRDF) is needed

Or if the surface characteristics change in different areas, you need a spatially varying BRDF

And so on…


Constraints on BRDFs

Helmholtz reciprocity: f(l,v) = f(v,l)

Conservation of energy: Outgoing energy cannot be greater than incoming energy

The simplest BRDF is Lambertian shading
  We assume that energy is scattered equally in all directions
  Integrating over the hemisphere gives a factor of π
  Dividing by π gives us exactly what we saw before:

L_o(\mathbf{v}) = \frac{\mathbf{c}_{\text{diff}}}{\pi} \otimes \sum_{k=1}^{n} E_{L_k} \cos\theta_{i_k}
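Filling in the step the slide refers to, the π is the value of the cosine-weighted integral over the hemisphere, which is why the Lambertian BRDF is the diffuse color divided by π:

\int_{\Omega} \cos\theta_i \, d\omega_i
  = \int_{0}^{2\pi}\!\!\int_{0}^{\pi/2} \cos\theta \, \sin\theta \, d\theta \, d\phi
  = \pi
\qquad\Rightarrow\qquad
f_{\text{Lambert}}(\mathbf{l}, \mathbf{v}) = \frac{\mathbf{c}_{\text{diff}}}{\pi}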


Texture Mapping and Bump Mapping in Shaders


Texture mapping in a shader

We'll start with our specular shader for directional light and add textures to it


Texture

The texture for the ship is below:
[Image: ship texture]


Texturing additions

We add a texture variable called ModelTexture

We also add a SamplerState structure that specifies how to filter the texture

Texture2D ModelTexture;

SamplerState ModelTextureSampler
{
    Filter = MIN_MAG_MIP_LINEAR;
    AddressU = Clamp;
    AddressV = Clamp;
};


Texturing structures

We add a texture coordinate to the input and the output of the vertex shader:

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
    float2 Texture : TEXCOORD;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR;
    float3 Normal : NORMAL;
    float4 WorldPosition : POSITIONT;
    float2 Texture : TEXCOORD;
};


Texturing vertex shader

Almost nothing changes here except that we copy the input texture coordinate into the output

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    output.Normal = normal;
    output.Texture = input.Texture;
    return output;
}


Texturing pixel shader

We have to pull the color from the texture and set its alpha to 1

Then we scale the components of the color by the texture color:

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    float3 view = normalize((float3)input.WorldPosition - Camera);
    float dotProduct = dot(reflect, view);
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * max(pow(dotProduct, Shininess), 0) * length(input.Color);
    // Sample the texture and force it to be fully opaque
    float4 textureColor = ModelTexture.Sample(ModelTextureSampler, input.Texture);
    textureColor.a = 1;

    return saturate(textureColor * input.Color + AmbientColor * AmbientIntensity + specular);
}


Updates to SharpDX

To use a texture, we naturally have to load a texture

We have to set the texture, but as a resource, not as a value

shipTexture = Content.Load<Texture2D>("ShipTexture");

effect.Parameters["ModelTexture"].SetResource<Texture2D>(shipTexture);
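Putting the pieces together, a rough per-frame sketch (my own, using SharpDX's Matrix helpers; world, view, and projection are hypothetical Matrix fields) of the parameters these shaders expect:

    // Hypothetical sketch of the per-frame parameter setup implied by the declarations.
    Matrix worldInverseTranspose = Matrix.Transpose(Matrix.Invert(world));
    effect.Parameters["World"].SetValue(world);
    effect.Parameters["View"].SetValue(view);
    effect.Parameters["Projection"].SetValue(projection);
    effect.Parameters["WorldInverseTranspose"].SetValue(worldInverseTranspose);
    effect.Parameters["ModelTexture"].SetResource<Texture2D>(shipTexture);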


Bump mapping in shaders

It's easiest to do bump mapping in SharpDX using a normal map

Of course, a normal map is hard to create by hand

What's more common is to create a height map and then use a tool for creating a normal map from it

xNormal is a free utility to do this http://www.xnormal.net/Downloads.aspx


Height map to normal map

The conversion from a grayscale height map to a normal map looks like this:
[Image: grayscale height map and the corresponding normal map]
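Conceptually, a tool like xNormal builds each normal from the height differences of neighboring texels (a standard construction; the exact scaling and axis conventions vary by tool, with h the height value and s a user-chosen strength factor):

\mathbf{n}(u, v) \;\propto\; \left(-s\,\frac{\partial h}{\partial u},\; -s\,\frac{\partial h}{\partial v},\; 1\right),
\qquad
\frac{\partial h}{\partial u} \approx \frac{h(u+1, v) - h(u-1, v)}{2},\quad
\frac{\partial h}{\partial v} \approx \frac{h(u, v+1) - h(u, v-1)}{2}

The result is normalized and packed into the RGB channels of the normal map.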


How does bump mapping work?

We have a normal to a surface, but there are also tangent directions

We call these the tangent and the binormal
  Apparently serious mathematicians think it should be called the bitangent
The binormal is tangent to the surface and orthogonal to the other tangent
We distort the normal with weighted sums of the tangent and binormal (stored in our normal map), as written out after the figure

[Figure: a surface point showing the normal, tangent, and binormal vectors]
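Written out (one common convention, where b_u and b_v are the two offsets read from the normal map at that texel, t is the tangent, and b is the binormal), the perturbed normal used for lighting is:

\mathbf{n}' = \operatorname{normalize}\!\left(\mathbf{n} + b_u\,\mathbf{t} + b_v\,\mathbf{b}\right)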


Upcoming


Next time…

Choosing BRDFs
Implementing BRDFs
Image-based approaches to sampling BRDFs


Reminders

Finish reading Chapter 7
Finish Project 2
  Due tonight by midnight