Alain Galvan · 6/5/2019 @ 7:30 AM · Updated 5 months ago
A review of material models used in real-time renderers and ray tracing engines.
Tags: blog, shader, glsl, hlsl, vulkan, directx, opengl, spir-v, metal
Materials in computer graphics can be described with a Material Model: a function describing the output luminance (light energy) of a given surface or volume in the direction of the viewer. Such a function uses the environment around a given point, the lights visible to that point, the shadows those lights cast at that point, and the observer's location to determine an output luminance. These interactions are modeled with Shading Models.
An Analytical Shading Model is a mathematical approximation of a surface or volume's interaction with light. [Eisemann et al. 2012]
Such models are expressed in terms of a surface normal $n$, a unit view vector $\omega$, and a unit light vector $l$. For a given surface point $p$, the view vector is the direction from $p$ to the observer, and $l$ is the direction from $p$ towards the light.
By using measured data it's possible to build a mathematical model of a given material, or a model that can simulate a variety of materials. [Matusik et al. 2003] [Belcour et al. 2013]
Shading models can describe surfaces, volumes, or a combination of the two. The term Bidirectional (Scattering, Surface, Reflectance, Transmission) Distribution Function is a formal term for describing a shading model on a given surface. The reflectance-only case is commonly abbreviated BRDF, with the more general BSDF sometimes used as well; BxDF has been used by pbrt and Unreal Engine 4 to describe models that can serve for both surfaces and volumes.
A BRDF gives the amount of incoming light energy from direction $\hat{\omega}_i$ that turns into outgoing light energy in direction $\hat{\omega}_o$. It is written with respect to a surface patch whose normal points upward.

$f_{X,\hat{x}}(\hat{\omega}_i, \hat{\omega}_o)$
This models the exitant radiance ($W / (m^2 \cdot sr)$) of a surface; essentially, the math involved at a given surface point that adds outgoing radiance $L_o$ to the image.
BRDFs obey the following rules:
Helmholtz reciprocity - the direction of the ray of light can be reversed (important for Bidirectional Path Tracing).
Positivity - an exit direction cannot have a negative probability: max(lighting, 0.0).
Energy Conservation - a BRDF does not create energy; it is impossible for more light to come out than came in (unless, of course, the surface is a light source).
All incident light must be reflected when surface reflectivity is 1 (the white furnace test).
It functions as the ratio of outgoing (reflected) radiance to incoming irradiance.
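To make the energy conservation rule concrete, here is a minimal CPU-side Python sketch (an illustration, not part of any renderer) that Monte Carlo integrates the Lambertian BRDF times the cosine term over the hemisphere. With reflectivity 1 the result should be 1, i.e. all incident light is reflected, which is exactly the white furnace test:

```python
import math
import random

def lambertian_brdf(albedo):
    # The Lambertian BRDF is constant: albedo / pi, independent of directions
    return albedo / math.pi

def furnace_test(albedo, num_samples=200_000, seed=1):
    # Uniformly sample the hemisphere (pdf = 1 / (2 * pi)) and
    # estimate the integral of f * cos(theta) over solid angle
    rng = random.Random(seed)
    pdf = 1.0 / (2.0 * math.pi)
    total = 0.0
    for _ in range(num_samples):
        cos_theta = rng.random()  # z uniform in [0, 1) is uniform on the hemisphere
        total += lambertian_brdf(albedo) * cos_theta / pdf
    return total / num_samples

reflected = furnace_test(albedo=1.0)
print(reflected)  # approximately 1.0: all incident light is reflected
```

Lowering the albedo scales the result down proportionally, matching the intuition that a darker surface absorbs more of the incident energy.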
Over the years many different types of BRDFs have been developed, along with interpolation methods and many more models described in the literature. While each is fascinating in its own right, each has trade-offs in performance, accuracy, and complexity.
We'll be focusing on the most popular models in this list, reviewing how to use each of them to describe metals (gold, steel, aluminum), dielectrics (plastics, rocks), anisotropic [Ashikhmin et al. 2000] (velvet), subsurface (skin, jello), refractive (glass, water), and translucent (stained glass, rough glass) materials. We'll conclude with more literature on the subject of material models. [Georgiev et al. 2019] [Langlands 2017]
Normal Distribution Function (NDF) - the likelihood of a given microfacet being oriented in a given direction.
Probability Density Function (PDF) - a model of the distribution of radiance in a given direction.
An analytical model approximates a path-traced model of a given BRDF, giving a valid output radiance with a single sample. This is useful for real-time rendering techniques where only one sample is expected per fragment.
One of the simplest BRDF models is Lambertian shading. Assuming that the surface is a completely diffuse reflector, meaning that light is scattered uniformly in all directions independent of the observer's viewpoint, the model can be described as:
$L_o^d (p, \omega) = \kappa_d max(0, n \cdot l)$
float lambertianPDF(vec3 normal, vec3 lightDirection)
{
    // Cosine-weighted lobe, normalized by 1 / PI so energy is conserved
    return max(0.0, dot(normal, lightDirection)) * (1.0 / PI);
}

vec3 lambertianLight(SurfaceData s, Light light)
{
    // Diffuse contribution: albedo scaled by the cosine lobe
    return s.albedo * lambertianPDF(s.normal, light.direction);
}
Where $\kappa_d$ is the reflectance (albedo), which encodes how much energy is absorbed by the surface.
Note the use of $\frac{1}{\pi}$; this normalization is sometimes omitted in real-time renderers. [Seblagarde 2012]
The dot product implicitly encodes that $L_o$ will be lower when the light arrives at a grazing angle, similar to real life. In real life, purely Lambertian interactions are rare, and most objects show specular interactions in some capacity.
Such specular reflections can even show more of the color of the light than of the material: this distinguishes dielectric materials such as water and plastic from metallic materials such as gold or steel.
Phong is based on the informal observation that highlights appear when the view vector aligns with the direction of the reflected light vector $r = 2(n \cdot l)n - l$. As soon as the view vector deviates from $r$, the highlight starts disappearing. Phong modeled the specular component of the material with the following function:
$L_o^s (p, \omega) = \kappa_a + \kappa_d L_o^d (p, \omega) + \kappa_s max(0, \omega \cdot r)^\alpha$
Where $\kappa_s$ is the specular-reflection constant that defines the ratio of reflection and $\kappa_d$ the ratio of the diffuse part $L_o^d$, usually with $\kappa_s + \kappa_d < 1$. The constant ambient term $\kappa_a$ can be used to emulate indirect illumination, but is considered a hack. $\alpha$ defines the shininess of a material; the larger the value, the smaller and more focused the highlights of the surface will be.
float phongSpecular(vec3 normal, vec3 lightDirection, vec3 viewDirection, float shininess)
{
    // GLSL reflect() expects the incident vector first: r = reflect(-l, n)
    vec3 r = reflect(-lightDirection, normal);
    return pow(max(0.0, dot(r, viewDirection)), shininess);
}
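A quick numeric illustration of the shininess exponent $\alpha$ (a hypothetical Python sketch mirroring the $max(0, \cos)^\alpha$ falloff above): at 10 degrees off the mirror direction, a low exponent still returns most of the peak intensity, while a high exponent has already fallen off sharply:

```python
import math

def phong_lobe(deviation_degrees, shininess):
    # Specular falloff max(0, cos(delta))^alpha, where delta is the angle
    # between the view vector and the reflected light vector r
    return max(0.0, math.cos(math.radians(deviation_degrees))) ** shininess

broad = phong_lobe(10.0, 8.0)     # wide, dull highlight
tight = phong_lobe(10.0, 128.0)   # small, focused highlight
print(broad, tight)
```

This is why low-shininess materials read as satin or plastic sheen while high-shininess ones read as polished.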
Blinn suggested a modification of the Phong term: instead of relying on a reflected light vector, define a unit halfway vector between the view and the light, $h = (\omega + l) / |\omega + l|$, which is cheaper to compute. This new model is also more accurate at reproducing measured BRDF surfaces. [Gotanda]
$L_o^s (p, \omega) = \kappa_a + \kappa_d L_o^d (p, \omega) + \kappa_s max(0, h \cdot n)^\alpha$
float blinnPhongSpecular(vec3 normal, vec3 lightDirection, vec3 viewDirection, float shininess)
{
    // Halfway vector between the view and light directions
    vec3 halfDirection = normalize(viewDirection + lightDirection);
    return pow(max(0.0, dot(halfDirection, normal)), shininess);
}
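The half-vector trick can be sanity-checked on the CPU. This hypothetical Python sketch shows that when the view vector lies exactly along the reflected light vector, the half vector coincides with the normal, so both formulations peak at 1:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

n = (0.0, 0.0, 1.0)
l = normalize((1.0, 0.0, 1.0))  # light 45 degrees off the normal
# Mirror reflection of the light about the normal: r = 2(n . l)n - l
r = tuple(2.0 * dot(n, l) * nc - lc for nc, lc in zip(n, l))
v = r  # view exactly along the reflection direction
h = normalize(tuple(vc + lc for vc, lc in zip(v, l)))

peak_phong = dot(r, v)  # cosine of the angle between r and v
peak_blinn = dot(h, n)  # cosine of the angle between h and n
print(peak_phong, peak_blinn)
```

Away from the peak the two lobes differ in shape, which is why the same shininess value produces slightly different highlights under Phong and Blinn-Phong.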
Schlick describes a Fresnel approximation [Schlick 1994] that's used throughout computer graphics, first described as follows:
$R_\lambda(t, u, v, v', w) = S_\lambda(u) D(t, v, v', w)$
Where $S_\lambda$ is the spectral factor:
$S_\lambda(u) = C_\lambda + (1 - C_\lambda)(1 - u)^5$
float SchlickFresnel(float u)
{
    float m = clamp(1.0 - u, 0.0, 1.0);
    float m2 = m * m;
    return m2 * m2 * m; // pow(m, 5.0)
}
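A scalar Python port of this approximation (hypothetical names; $C_\lambda$ becomes f0, the reflectance at normal incidence) makes its two limits easy to verify: head-on it returns f0, and at grazing angles it approaches full reflectance:

```python
def schlick_fresnel(f0, cos_theta):
    # F = F0 + (1 - F0) * (1 - cos_theta)^5
    m = min(max(1.0 - cos_theta, 0.0), 1.0)
    return f0 + (1.0 - f0) * m ** 5

f0 = 0.04  # a commonly used base reflectance for dielectrics
head_on = schlick_fresnel(f0, 1.0)  # equals f0
grazing = schlick_fresnel(f0, 0.0)  # equals 1.0
print(head_on, grazing)
```

This grazing-angle rise is why even dull dielectrics like asphalt look mirror-like near the horizon.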
Oren-Nayar extends the Lambertian model to account for microfacet roughness, better approximating rough diffuse surfaces and coming closer to passing the white furnace test. [Oren et al. 1994]
float orenNayarDiffuse(
    vec3 normal,
    vec3 lightDirection,
    vec3 viewDirection,
    float roughness,
    float albedo)
{
    float LdotV = dot(lightDirection, viewDirection);
    float NdotL = dot(lightDirection, normal);
    float NdotV = dot(normal, viewDirection);
    float s = LdotV - NdotL * NdotV;
    float t = mix(1.0, max(NdotL, NdotV), step(0.0, s));
    float sigma2 = roughness * roughness;
    float A = 1.0 + sigma2 * (albedo / (sigma2 + 0.13) + 0.5 / (sigma2 + 0.33));
    float B = 0.45 * sigma2 / (sigma2 + 0.09);
    return albedo * max(0.0, NdotL) * (A + B * s / t) / PI;
}
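One useful property to verify: when roughness is zero, Oren-Nayar should reduce exactly to Lambertian. A hypothetical line-by-line Python port of the shader above demonstrates this:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(c / length for c in v)

def oren_nayar_diffuse(normal, light_dir, view_dir, roughness, albedo):
    ldotv = dot(light_dir, view_dir)
    ndotl = dot(light_dir, normal)
    ndotv = dot(normal, view_dir)
    s = ldotv - ndotl * ndotv
    # mix(1.0, max(NdotL, NdotV), step(0.0, s)) in GLSL
    t = max(ndotl, ndotv) if s >= 0.0 else 1.0
    sigma2 = roughness * roughness
    a = 1.0 + sigma2 * (albedo / (sigma2 + 0.13) + 0.5 / (sigma2 + 0.33))
    b = 0.45 * sigma2 / (sigma2 + 0.09)
    return albedo * max(0.0, ndotl) * (a + b * s / t) / math.pi

normal = (0.0, 0.0, 1.0)
light = normalize((0.3, 0.4, 0.85))
view = normalize((-0.2, 0.1, 0.97))

rough_zero = oren_nayar_diffuse(normal, light, view, 0.0, 0.8)
lambertian = 0.8 * max(0.0, dot(light, normal)) / math.pi
print(rough_zero, lambertian)  # identical when roughness == 0
```

With roughness zero, sigma2 vanishes, so A becomes 1 and B becomes 0, leaving exactly the Lambertian term.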
GGX ("Ground Glass Unknown") is an analytic BSDF model first proposed by [Walter et al. 2007] that takes into account the microfacet distribution of the underlying material.
Unreal Engine 4 expanded on this definition with a split sum approximation to GGX that uses pre-convoluted cube map textures and a pre-integrated LUT that maps vec2(cosTheta, roughness) to the specular Fresnel contribution of a given material. [Karis 2013] [Hammon 2017] [Carpentier 2017] [Heitz 2017]
vec3 ImportanceSampleGGX(vec2 Xi, float Roughness, vec3 N)
{
    float a = Roughness * Roughness;
    float Phi = 2 * PI * Xi.x;
    float CosTheta = sqrt((1 - Xi.y) / (1 + (a * a - 1) * Xi.y));
    float SinTheta = sqrt(1 - CosTheta * CosTheta);
    vec3 H;
    H.x = SinTheta * cos(Phi);
    H.y = SinTheta * sin(Phi);
    H.z = CosTheta;
    vec3 UpVector = abs(N.z) < 0.999 ? vec3(0, 0, 1) : vec3(1, 0, 0);
    vec3 TangentX = normalize(cross(UpVector, N));
    vec3 TangentY = cross(N, TangentX);
    // Tangent to world space
    return TangentX * H.x + TangentY * H.y + N * H.z;
}
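A hypothetical Python port of ImportanceSampleGGX lets us verify two of its properties on the CPU: every returned half vector is unit length, and for low roughness the samples concentrate tightly around the normal:

```python
import math
import random

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def importance_sample_ggx(xi, roughness, n):
    a = roughness * roughness
    phi = 2.0 * math.pi * xi[0]
    cos_theta = math.sqrt((1.0 - xi[1]) / (1.0 + (a * a - 1.0) * xi[1]))
    sin_theta = math.sqrt(1.0 - cos_theta * cos_theta)
    h = (sin_theta * math.cos(phi), sin_theta * math.sin(phi), cos_theta)
    up = (0.0, 0.0, 1.0) if abs(n[2]) < 0.999 else (1.0, 0.0, 0.0)
    tangent_x = normalize(cross(up, n))
    tangent_y = cross(n, tangent_x)
    # Tangent to world space
    return tuple(tangent_x[i] * h[0] + tangent_y[i] * h[1] + n[i] * h[2]
                 for i in range(3))

rng = random.Random(3)
n = normalize((0.2, 0.3, 0.9))
samples = [importance_sample_ggx((rng.random(), rng.random()), 0.1, n)
           for _ in range(2000)]
mean_alignment = sum(dot(h, n) for h in samples) / len(samples)
lengths_ok = all(abs(dot(h, h) - 1.0) < 1e-6 for h in samples)
print(mean_alignment, lengths_ok)
```

Raising the roughness spreads the samples over a wider lobe, which is exactly what makes rough materials blur their reflections.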
vec3 SpecularIBL(vec3 SpecularColor, float Roughness, vec3 N, vec3 V)
{
    vec3 SpecularLighting = 0;
    const uint NumSamples = 1024;
    for (uint i = 0; i < NumSamples; i++) {
        vec2 Xi = Hammersley(i, NumSamples);
        vec3 H = ImportanceSampleGGX(Xi, Roughness, N);
        vec3 L = 2 * dot(V, H) * H - V;
        float NoV = saturate(dot(N, V));
        float NoL = saturate(dot(N, L));
        float NoH = saturate(dot(N, H));
        float VoH = saturate(dot(V, H));
        if (NoL > 0)
        {
            vec3 SampleColor = EnvMap.SampleLevel(EnvMapSampler, L, 0).rgb;
            float G = G_Smith(Roughness, NoV, NoL);
            float Fc = pow(1 - VoH, 5);
            vec3 F = (1 - Fc) * SpecularColor + Fc;
            // Incident light = SampleColor * NoL
            // Microfacet specular = D*G*F / (4*NoL*NoV)
            // pdf = D * NoH / (4 * VoH)
            SpecularLighting += SampleColor * F * G * VoH / (NoH * NoV);
        }
    }
    return SpecularLighting / NumSamples;
}
vec3 PrefilterEnvMap(float Roughness, vec3 R)
{
    vec3 N = R;
    vec3 V = R;
    vec3 PrefilteredColor = 0;
    float TotalWeight = 0; // accumulates NoL weights for normalization
    const uint NumSamples = 1024;
    for (uint i = 0; i < NumSamples; i++) {
        vec2 Xi = Hammersley(i, NumSamples);
        vec3 H = ImportanceSampleGGX(Xi, Roughness, N);
        vec3 L = 2 * dot(V, H) * H - V;
        float NoL = saturate(dot(N, L));
        if (NoL > 0)
        {
            PrefilteredColor += EnvMap.SampleLevel(EnvMapSampler, L, 0).rgb * NoL;
            TotalWeight += NoL;
        }
    }
    return PrefilteredColor / TotalWeight;
}
vec2 IntegrateBRDF(float Roughness, float NoV)
{
    vec3 V;
    V.x = sqrt(1.0f - NoV * NoV); // sin
    V.y = 0;
    V.z = NoV; // cos
    vec3 N = vec3(0, 0, 1); // integration happens in tangent space
    float A = 0;
    float B = 0;
    const uint NumSamples = 1024;
    for (uint i = 0; i < NumSamples; i++) {
        vec2 Xi = Hammersley(i, NumSamples);
        vec3 H = ImportanceSampleGGX(Xi, Roughness, N);
        vec3 L = 2 * dot(V, H) * H - V;
        float NoL = saturate(L.z);
        float NoH = saturate(H.z);
        float VoH = saturate(dot(V, H));
        if (NoL > 0)
        {
            float G = G_Smith(Roughness, NoV, NoL);
            float G_Vis = G * VoH / (NoH * NoV);
            float Fc = pow(1 - VoH, 5);
            A += (1 - Fc) * G_Vis;
            B += Fc * G_Vis;
        }
    }
    return vec2(A, B) / NumSamples;
}
vec3 ApproximateSpecularIBL(vec3 SpecularColor, float Roughness, vec3 N, vec3 V)
{
    float NoV = saturate(dot(N, V));
    vec3 R = 2 * dot(V, N) * N - V;
    vec3 PrefilteredColor = PrefilterEnvMap(Roughness, R);
    vec2 EnvBRDF = IntegrateBRDF(Roughness, NoV);
    return PrefilteredColor * (SpecularColor * EnvBRDF.x + EnvBRDF.y);
}
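The reason the split sum above works can be shown with an illustrative Python sketch (not UE4 code): the average of radiance times BRDF weight is replaced by the product of the two averages, which is exact for a constant environment and a good approximation for smooth ones:

```python
import random

rng = random.Random(7)
num_samples = 1024

# Hypothetical per-sample environment radiance and specular BRDF weights
radiance = [1.3] * num_samples            # a constant environment map
weights = [rng.random() for _ in range(num_samples)]

# Full Monte Carlo sum: average of products
full_sum = sum(l * w for l, w in zip(radiance, weights)) / num_samples
# Split sum: product of averages (prefiltered env map * integrated BRDF)
split_sum = (sum(radiance) / num_samples) * (sum(weights) / num_samples)
print(full_sum, split_sum)  # equal for a constant environment
```

The error only appears when the lighting varies across the lobe, which is why the prefiltered mip level is chosen per roughness to keep the lobe and the blur matched.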
Jakub Boksansky (@boksajak) wrote a Crash Course in BRDF Implementation here.
Brian Karis (@BrianKaris) wrote a great reference to different specular BRDFs here.
For code samples refer to the following:
Joey de Vries @JoeyDeVriez provides an amazing summary of image based lighting for PBR in his LearnOpenGL.com blog.
Disney BRDF Explorer features examples of every BRDF you could possibly want to know about here.
Unity provides the source code to their shaders here in their archive; simply click on the arrow, select Built In Shaders, and check out their UnityStandardBRDF.cginc file.
Unreal Engine 4 also provides the source code to their shaders if you link your Unreal account to your GitHub account. You'll find BRDF shaders in Engine/Shaders/Private/BRDF.ush.
Morgan McGuire's G3D Engine features a material models example in G3D10/samples/minimalOpenGL/min.pix.
If you install Marmoset Toolbag, you'll find its material model shaders in data/shader/mat, with examples of Lambertian, GGX, Phong, Anisotropic, etc.