CS361 Week 8 - Wednesday

Last time

What did we talk about last time?
- Textures
- Volume textures
- Cube maps
- Texture caching and compression
- Procedural texturing
- Texture animation
- Material mapping
- Alpha mapping
- Bump mapping
- Normal maps
- Parallax mapping
- Relief mapping
- Heightfield texturing


Questions?


Project 2


Radiometry


Radiometry is the measurement of electromagnetic radiation (for us, specifically light).

Light is the flow of photons. We'll generally think of photons as particles, rather than waves.

Photon characteristics:
- Frequency ν = c/λ (hertz)
- Wavelength λ = c/ν (meters)
- Energy Q = hν (joules), where h is Planck's constant
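As a quick worked example (using rounded constants, not from the slides): for green light with λ = 550 nm,

ν = c/λ = (3.00 × 10⁸ m/s) / (550 × 10⁻⁹ m) ≈ 5.45 × 10¹⁴ Hz
Q = hν = (6.626 × 10⁻³⁴ J·s)(5.45 × 10¹⁴ Hz) ≈ 3.6 × 10⁻¹⁹ J

so an individual photon carries a minuscule amount of energy.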


Radiometric quantities

We'll be interested in the following radiometric quantities:

Quantity            Unit
Radiant energy      joule (J)
Radiant flux        watt (W)
Irradiance          W/m²
Radiant intensity   W/sr
Radiance            W/(m²·sr)


Concrete examples

Radiant flux: energy per unit time (power)
Irradiance: energy per unit time per unit area arriving at a surface
Intensity: energy per unit time per steradian
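For a concrete (idealized) example not on the slides: a point source radiating Φ = 100 W uniformly spreads that flux over a sphere of area 4πr², so at r = 2 m the irradiance on a surface facing the source is

E = Φ / (4πr²) = 100 W / (4π · 4 m²) ≈ 2.0 W/m²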


Radiance

The radiance L is what we care about, since that's what sensors detect.

We can think of radiance as the portion of irradiance within a given solid angle.

Or, we can think of radiance as the portion of a light's intensity that flows through a surface.

Radiance doesn't change with distance.
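Formally, radiance is flux per unit projected area per unit solid angle:

L = d²Φ / (dA cos θ dω)

where θ is the angle between the surface normal and the direction of interest. The invariance with distance follows because the solid angle a sensor subtends shrinks at exactly the rate that the area it sees grows.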


Photometry


Radiometry just deals with physics. Photometry takes everything from radiometry and weights it by the sensitivity of the human eye.

Photometry is just trying to account for the eye's differing sensitivity to different wavelengths.


Photometric units

Because they're just rescalings of radiometric units, every photometric unit is based on a radiometric one.

Luminance is often used to describe the brightness of surfaces, such as LCD screens.

Radiometric Quantity   Unit         Photometric Quantity   Unit
Radiant energy         joule (J)    Luminous energy        talbot
Radiant flux           watt (W)     Luminous flux          lumen
Irradiance             W/m²         Illuminance            lux
Radiant intensity      W/sr         Luminous intensity     candela
Radiance               W/(m²·sr)    Luminance              nit
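For example (standard values, not from the slides): the rescaling weights radiant power by the eye's luminosity function V(λ), which peaks at 555 nm. By definition, 1 W of radiant flux at 555 nm is 683 lumens; the same watt of deep red light at 650 nm, where V(λ) ≈ 0.11, is only about 73 lumens.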


Colorimetry

Colorimetry is the science of quantifying human color perception.

The CIE defined a system of three non-monochromatic colors X, Y, and Z for describing the human-perceivable color space.

RGB is a transform from these values into monochromatic red, green, and blue colors (a concrete transform is given below). RGB can only express the colors inside the triangle its three primaries form on the CIE chromaticity diagram.

As you know, there are other color models (HSV, HSL, etc.)
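As a concrete example (standard published values, not from the slides), the linear transform from XYZ into one widely used RGB space (linear sRGB, D65 white point) is:

R =  3.2406 X - 1.5372 Y - 0.4986 Z
G = -0.9689 X + 1.8758 Y + 0.0415 Z
B =  0.0557 X - 0.2040 Y + 1.0570 Z

XYZ values that map to a component outside [0, 1] lie outside the sRGB triangle and cannot be displayed without clamping or gamut mapping.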


Lighting in SharpDX


Types of lights

Real light behaves consistently (but in a complex way).

For rendering purposes, we often divide light into categories that are easy to model:
- Directional lights (like the sun)
- Omni lights (located at a point, but evenly illuminate in all directions)
- Spotlights (located at a point and have intensity that varies with direction)
- Textured lights (give light projections variety in shape or color)
  ▪ Similar to gobos, if you know anything about stage lighting


SharpDX lights

With a programmable pipeline, you can express lighting models of limitless complexity.

The old DirectX fixed-function pipeline provided a few stock lighting models:
- Ambient lights
- Omni lights
- Spotlights
- Directional lights

All lights have diffuse, specular, and ambient color.

Let's see how to implement these lighting models with shaders.


Ambient lights

Ambient lights are very simple to implement in shaders; we've already seen the code.

The vertex shader must simply transform the vertex into clip space (world x view x projection).

The pixel shader colors each fragment a constant color. We could modulate this by a texture if we were using one.


Ambient light declarations

float4x4 World;
float4x4 View;
float4x4 Projection;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity;

struct VertexShaderInput
{
    float4 Position : SV_Position;
};

struct VertexShaderOutput
{
    float4 Position : SV_Position;
};


Ambient light vertex shader

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    return output;
}


Ambient light pixel shader and technique

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    return AmbientColor * AmbientIntensity;
}

technique Ambient
{
    pass Pass1
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 PixelShaderFunction();
    }
}
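For context, here is a minimal sketch (not from the slides) of driving an effect like this from C# with SharpDX's Effects framework. It assumes the effect compiles under the fx_5_0 profile (Effects11 expects a technique11 block with SetVertexShader/CompileShader rather than the legacy compile vs_2_0 syntax shown above), and that device, context, world, view, projection, and vertexCount already exist; the file name Ambient.fx is hypothetical.

// Compile the effect file and wrap it in an Effect object
var bytecode = ShaderBytecode.CompileFromFile("Ambient.fx", "fx_5_0",
    ShaderFlags.None, EffectFlags.None);
var effect = new Effect(device, bytecode);

// Fill in the parameters declared at the top of the effect file
effect.GetVariableByName("World").AsMatrix().SetMatrix(world);
effect.GetVariableByName("View").AsMatrix().SetMatrix(view);
effect.GetVariableByName("Projection").AsMatrix().SetMatrix(projection);
effect.GetVariableByName("AmbientColor").AsVector().Set(new Vector4(1, 1, 1, 1));
effect.GetVariableByName("AmbientIntensity").AsScalar().Set(0.1f);

// Bind the shaders from the first pass of the Ambient technique, then draw
effect.GetTechniqueByName("Ambient").GetPassByIndex(0).Apply(context);
context.Draw(vertexCount, 0);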


Directional lights in SharpDX

Directional lights model lights from a very long distance with parallel rays, like the sun.

A directional light has only color (specular and diffuse) and direction.

They are virtually free from a computational perspective.

Directional lights are also the standard model for BasicEffect; you don't have to use a shader to do them.

Let's look at a diffuse shader first.


Diffuse light declarations

We add values for the diffuse light intensity and direction. We add a WorldInverseTranspose to transform the normals. We also add normals to our input and color to our output.

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

float3 DiffuseLightDirection = float3(1, 1, 0);
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR;
};


Diffuse light vertex shader

Color depends on the surface normal dotted with the light vector
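In symbols (a restatement of the code below, using the shader's names), this is Lambert's law:

output.Color = saturate(DiffuseColor * DiffuseIntensity * (n̂ · l̂))

where n̂ is the unit surface normal and l̂ is the unit vector toward the light; brightness falls off as the cosine of the angle between them.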

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float3 normal = mul(input.Normal, (float3x3)WorldInverseTranspose);
    float lightIntensity = dot(normalize(normal), normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);

    return output;
}


Diffuse light pixel shader

No real differences here. The diffuse color and ambient colors are added together. The technique is exactly the same.

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    return saturate(input.Color + AmbientColor * AmbientIntensity);
}


Specular lighting

Adding a specular component to the diffuse shader requires incorporating the view vector

It will be included in the shader file and be set as a parameter in the C# code
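For reference, the computation the specular pixel shader (a few slides ahead) performs is the normalized Phong specular model:

r = 2(n · l)n - l        (the light vector reflected about the normal)
specular = ((m + 8) / (8π)) · i_spec · c_spec · max(r · v, 0)^m

where v is the unit vector from the surface toward the camera and m is the Shininess exponent. The (m + 8)/(8π) term is an energy-normalization factor for the specular lobe, so raising the shininess narrows the highlight without changing the total reflected energy.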


Specular light declarations

The camera location is added to the declarations, as are specular colors and a shininess parameter.

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 Camera;

static const float PI = 3.14159265f;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

float3 DiffuseLightDirection = float3(1, 1, 0);
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

float Shininess;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5;


Specular light structures

The output adds a normal so that the reflection vector can be computed in the pixel shader.

A world position lets us compute the view vector to the camera.

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR;
    float3 Normal : NORMAL;
    float4 WorldPosition : POSITIONT;
};


Specular vertex shader

The same computations as the diffuse shader, but we store the normal and the transformed world position in the output.

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    output.Normal = normal;

    return output;
}


Specular pixel shader

Here we finally have a real computation, because we need to use the pixel normal (interpolated from the vertex normals) in combination with the view vector. The technique is the same.

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    // Reflect the light vector about the normal
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    // View vector points from the surface toward the camera
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float dotProduct = dot(reflect, view);
    // Clamp before pow: pow with a negative base is undefined in HLSL
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * pow(max(dotProduct, 0), Shininess) * length(input.Color);

    return saturate(input.Color + AmbientColor * AmbientIntensity + specular);
}


Point lights in SharpDX

Point lights model omni lights at a specific position.

They generally attenuate (get dimmer) over a distance and have a maximum range. The DirectX fixed-function pipeline had a constant attenuation, a linear attenuation, and a quadratic attenuation; with shaders, you can choose whatever attenuation you like (two common formulas are given below).

They are more computationally expensive than directional lights, because a light vector has to be computed for every pixel.

It is possible to implement point lights in a deferred shader, lighting only those pixels that actually get used.
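With d the distance from the fragment to the light, the fixed-function model and the simple radius-based falloff used in the point light pixel shader below are:

att = 1 / (a_c + a_l·d + a_q·d²)        (constant, linear, quadratic terms)
att = (1 - saturate(d / LightRadius))²  (full strength at the light, zero at LightRadius)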


Point light declarations

We add the light position and a light radius.

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 LightPosition;
float3 Camera;

static const float PI = 3.14159265f;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1f;
float LightRadius = 50;

float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

float Shininess;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5f;


Point light structures

We no longer need color in the output. We do need the vector from each fragment to the camera, so we keep the world location at that fragment.

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 WorldPosition : POSITIONT;
    float3 Normal : NORMAL;
};


Point light vertex shader

We compute the normal and the world position

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    output.Normal = normal;

    return output;
}


Point light pixel shader

Lots of junk in here:

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 normal = normalize(input.Normal);
    float3 lightDirection = LightPosition - (float3)input.WorldPosition;
    // Attenuation: full strength at the light, falling to zero at LightRadius
    float intensity = pow(1.0f - saturate(length(lightDirection) / LightRadius), 2);
    lightDirection = normalize(lightDirection); // normalize after taking the length
    float3 view = normalize(Camera - (float3)input.WorldPosition);

    float diffuse = saturate(dot(normal, lightDirection));
    float3 reflect = normalize(2 * diffuse * normal - lightDirection);
    float dotProduct = dot(reflect, view);
    // Clamp before pow: pow with a negative base is undefined in HLSL
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * pow(max(dotProduct, 0), Shininess) * diffuse;

    // Attenuate the diffuse term by the distance-based intensity
    return saturate(DiffuseColor * DiffuseIntensity * diffuse * intensity +
        AmbientColor * AmbientIntensity + specular);
}


Quiz


Upcoming


Next time…

- BRDFs
- Implementing BRDFs
- Texture mapping in shaders


Reminders

Finish reading Chapter 7

Summer REU opportunity: Machine learning at the Florida Institute of Technology
Deadline: March 31, 2015
http://www.amalthea-reu.org/