shaders: how do they work?

evan todd | @etodd_


follow along at

i'd love to chat if you're into

  • event-driven servers
  • deployment automation
  • private clouds
  • deferred rendering
  • standing desks
  • oculus rift
  • graphic design
  • indie games
  • art in general
  • opengl es
  • python
  • html5
  • vim
  • c#
  • minimalist running
  • weird music
  • shaders... duh
  • or you know, anything else

what we will learn

  • no: how to write a shader that does x
  • yes: everything necessary to write shaders

pipeline overview

animation from simon schreibt's excellent render hell

frame buffer

the screen is a 2d array of 24-bit numbers
each pixel consists of three 8-bit values ranging 0-255
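the layout above can be sketched in plain javascript (helper names are made up for illustration): three 8-bit channels packed into one 24-bit number.

```javascript
// pack three 8-bit color channels (0-255) into one 24-bit number
function packPixel(r, g, b) {
	return (r << 16) | (g << 8) | b;
}

// unpack a 24-bit pixel back into its three channels
function unpackPixel(p) {
	return [(p >> 16) & 0xff, (p >> 8) & 0xff, p & 0xff];
}

var white = packPixel(255, 255, 255); // 0xffffff = 16777215
```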

normalized device coordinates

  • x and y range from -1 to 1 across the screen, regardless of resolution
  • the source of these samples is self-contained
  • just copy and paste it into an html file to start hacking

an entire working webgl sample


matrix * input = output

multiplying by the identity matrix (ones on the diagonal, zeroes everywhere else) leaves the input unchanged

combining matrices

  • you can multiply matrices together to combine them
  • model matrix: move the vertices in world space
  • view matrix: apply camera position and rotation
  • projection matrix: convert the 3d vector to a 2d screen-space coordinate
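the combination step can be sketched in plain javascript (no three.js; `mat4mul` is a made-up helper name): multiply the three matrices once, then apply the single combined matrix to every vertex.

```javascript
// multiply two 4x4 matrices stored as row-major arrays of 16 numbers
function mat4mul(a, b) {
	var out = new Array(16);
	for (var row = 0; row < 4; row++)
		for (var col = 0; col < 4; col++) {
			var sum = 0;
			for (var k = 0; k < 4; k++)
				sum += a[row * 4 + k] * b[k * 4 + col];
			out[row * 4 + col] = sum;
		}
	return out;
}

var identity = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  0, 0, 0, 1];
// projection * view * model collapses into one matrix:
// var mvp = mat4mul(projection, mat4mul(view, model));
```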

perspective projection

var camera = new THREE.PerspectiveCamera(
	45, // field of view (degrees)
	window.innerWidth / window.innerHeight, // aspect ratio
	1, // near plane
	1000 // far plane
);
camera.position.z = 500;

var geometry = new THREE.Geometry();

// cube!
geometry.vertices.push(new THREE.Vector3(-80, -80, -80));
geometry.vertices.push(new THREE.Vector3(-80, 80, -80));
geometry.vertices.push(new THREE.Vector3(80, -80, -80));
geometry.vertices.push(new THREE.Vector3(80, 80, -80));
geometry.vertices.push(new THREE.Vector3(-80, -80, 80));
geometry.vertices.push(new THREE.Vector3(-80, 80, 80));
geometry.vertices.push(new THREE.Vector3(80, -80, 80));
geometry.vertices.push(new THREE.Vector3(80, 80, 80));

what if we want to do something more complicated?

  • so far we have been using the "fixed function pipeline"
  • the gpu can only do matrix multiplication
  • if we want to move individual vertices, we have to send data from the cpu to gpu (expensive)
  • what if we instead run a program on the gpu itself?

baby's first vertex shader

input vertices, do math, output vertices

<script type="x-shader/x-vertex" id="vs">
	void main() {
		gl_PointSize = 2.0;
		gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
	}
</script>

things you should know about shaders

  • they are plain text
  • often included directly in binaries as char arrays
  • opengl compiles them at runtime
  • written in glsl (opengl shading language)
  • syntax is similar to c

three.js makes it crazy easy

var material = new THREE.ShaderMaterial({
	vertexShader: document.getElementById('vs').textContent
});

glsl data types

  • bool, bvec2, bvec3, bvec4
  • int, ivec2, ivec3, ivec4
  • uint, uvec2, uvec3, uvec4
  • float, vec2, vec3, vec4


  • matnxm | 2 <= n, m <= 4
  • matn | 2 <= n <= 4 (shorthand for square matrices)
  • you can multiply matrices together

mat4 world;
mat4 view;
mat4 projection;
mat4 final = projection * view * world;

you can also multiply vectors with them if they are the right size

mat4 world;
vec3 position;
position = world * position; // ERROR: can't multiply mat4 by vec3
position = (world * vec4(position, 1.0)).xyz; // okay
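the extra 1 matters: with w = 1, the fourth column of the matrix acts as a translation. a plain-javascript sketch (`transformPoint` is a made-up helper):

```javascript
// apply a 4x4 row-major matrix to a point, treating it as (x, y, z, 1)
function transformPoint(m, p) {
	return [
		m[0] * p[0] + m[1] * p[1] + m[2]  * p[2] + m[3],
		m[4] * p[0] + m[5] * p[1] + m[6]  * p[2] + m[7],
		m[8] * p[0] + m[9] * p[1] + m[10] * p[2] + m[11]
	];
}

// a matrix that translates by (10, 20, 30); the w = 1 picks up the last column
var translate = [
	1, 0, 0, 10,
	0, 1, 0, 20,
	0, 0, 1, 30,
	0, 0, 0, 1
];

transformPoint(translate, [1, 1, 1]); // → [11, 21, 31]
```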


you can access individual components of vectors

vec3 position;
float height = position.y;
// or:
height = position[1];

access multiple components simultaneously

vec4 position;
position.xy = vec2(0, 0);

mix and match

vec4 a, b;
a.zyx = b.yyy;

let's make an ocean

start with a flat plane in three.js

var geometry = new THREE.Geometry();
for (var x = -50; x < 50; x++)
	for (var z = -50; z < 50; z++)
		geometry.vertices.push(new THREE.Vector3(x, 0, z));

ocean animation

  • shaders are basically stateless
  • pass in the same input, you always get the same output
  • how can we make the output change over time?
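one common answer (a sketch of the idea, not the deck's exact formula): make the height a pure function of position and a time value, so the same stateless program produces motion as time advances.

```javascript
// stateless wave: the same (x, z, time) always gives the same height
function waveHeight(x, z, time) {
	return Math.sin(x * 0.5 + time) * Math.cos(z * 0.5 + time) * 2.0;
}

waveHeight(0, 0, 0); // → 0 at the origin at t = 0
```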


uniforms

  • so named because they remain constant for the entire draw call
  • we will pass one float into the shader each frame, representing time
  • every vertex will have access to this value

three.js is so great

var uniforms = {
	time: { type: 'f', value: 0 } // f for float
};

var material = new THREE.ShaderMaterial({
	vertexShader: document.getElementById('vs').textContent,
	uniforms: uniforms
});

// snip...
var clock = new THREE.Clock();
function render() {
	uniforms.time.value = clock.getElapsedTime();
	renderer.render(scene, camera);
}

and the vertex shader
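the slide's shader code is missing from this copy; a minimal sketch of what an ocean vertex shader using the time uniform might look like (the wave formula is made up):

```glsl
uniform float time;

void main() {
	vec3 p = position;
	// height is a pure function of position and time: stateless, yet animated
	p.y = sin(p.x * 0.5 + time) * cos(p.z * 0.5 + time) * 2.0;
	gl_PointSize = 2.0;
	gl_Position = projectionMatrix * modelViewMatrix * vec4(p, 1.0);
}
```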

how can we make each vertex behave differently?

  • why can't we keep track of anything between vertices?
  • gpu actually processes many vertices simultaneously

attributes to the rescue

vertex declaration specifies what data is attached to each vertex

	texture coordinate	vec2
	blend weights		vec4
	instance transform	vec4
	flux compression	float

three.js saves lives

var attributes = {
	offset: { type: 'f', value: [] }
};

var geometry = new THREE.Geometry();
for (var x = -50; x < 50; x++) {
	for (var z = -50; z < 50; z++) {
		geometry.vertices.push(new THREE.Vector3(x, 0, z));
		attributes.offset.value.push((x + z) * 0.1);
	}
}

var material = new THREE.ShaderMaterial({
	vertexShader: document.getElementById('vs').textContent,
	uniforms: uniforms,
	attributes: attributes
});

and the vertex shader
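again the shader itself is missing from this copy; a hedged sketch combining the time uniform with the per-vertex offset attribute:

```glsl
uniform float time;
attribute float offset;

void main() {
	vec3 p = position;
	// each vertex waves on its own schedule thanks to its offset
	p.y = sin(time + offset) * 2.0;
	gl_PointSize = 2.0;
	gl_Position = projectionMatrix * modelViewMatrix * vec4(p, 1.0);
}
```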

connecting the dots

  • so far we've sent individual vertices into a vertex buffer object (vbo) without connecting them
  • everything is made of triangles, even rectangles are constructed from two triangles
  • a triangle is basically three integers which point to vertices in the vbo
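the idea in plain javascript (illustrative only): four vertices and six indices describe a rectangle as two triangles, so the two shared corners are stored once instead of twice.

```javascript
// four corners of a rectangle, stored once each
var vertices = [
	[0, 0, 0], // 0: bottom left
	[1, 0, 0], // 1: bottom right
	[1, 1, 0], // 2: top right
	[0, 1, 0]  // 3: top left
];

// two triangles, each three indices pointing into the vertex list
var indices = [
	0, 1, 2, // first triangle
	0, 2, 3  // second triangle
];
```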

index buffer

normals

the most common vertex attribute


usually precalculated at design-time or during loading

of course three.js can do it for you, and even display them for debugging

let's do something fun with the normal


automatically handled by the gpu

fragment shader

  • gpu program, executed for each rasterized pixel in a triangle
  • output is four floats (rgba) ranging from 0 to 1

baby's first fragment shader
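the example shader isn't included in this copy; the smallest possible fragment shader might look like this (color chosen arbitrarily):

```glsl
void main() {
	// paint every pixel solid magenta: r, g, b, a
	gl_FragColor = vec4(1.0, 0.0, 1.0, 1.0);
}
```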

what inputs can we have?

  • uniforms
  • data passed from the vertex shader, called "varyings"


varyings

  • the vertex shader can output extra data to the pixel shader
  • but which vertex does the data come from?
  • let's find out

let's attach a color to each vertex

var attributes = {
	vertexColor: { type: 'v3', value: [] }
};

var geometry = new THREE.Geometry();

geometry.vertices.push(new THREE.Vector3(0, 2.0, 0));
geometry.vertices.push(new THREE.Vector3(-2.0, -2.0, 0));
geometry.vertices.push(new THREE.Vector3(2.0, -2.0, 0));

attributes.vertexColor.value.push(new THREE.Vector3(1, 0, 0));
attributes.vertexColor.value.push(new THREE.Vector3(0, 1, 0));
attributes.vertexColor.value.push(new THREE.Vector3(0, 0, 1));

geometry.faces.push(new THREE.Face3(0, 1, 2));

vertex shader:
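the shaders for this slide are missing from this copy; a hedged sketch. the vertex shader copies the attribute into a varying:

```glsl
attribute vec3 vertexColor;
varying vec3 color;

void main() {
	color = vertexColor; // handed off to the fragment shader
	gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
```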

fragment shader:
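a matching fragment-shader sketch: the gpu interpolates the varying between the three vertices before each pixel runs, which answers the "which vertex?" question: all of them, blended.

```glsl
varying vec3 color;

void main() {
	// color arrives already blended between the triangle's corners
	gl_FragColor = vec4(color, 1.0);
}
```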

what if we pass the normal as a varying?

we could display the xyz values as rgb. vertex shader:
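a sketch of such a vertex shader (three.js supplies the normal attribute; the varying name is my choice):

```glsl
varying vec3 vNormal;

void main() {
	vNormal = normal; // pass the normal along to the fragment shader
	gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
```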

fragment shader. in glsl we can also address vector components with rgba
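a matching fragment-shader sketch using the rgba swizzle:

```glsl
varying vec3 vNormal;

void main() {
	// .rgb addresses the same components as .xyz
	gl_FragColor.rgb = vNormal;
	gl_FragColor.a = 1.0;
}
```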


  • clearly it has something to do with the normal
  • we need to find out the angle between the normal and the light direction

dot product

dot(a, b) = a.x*b.x + a.y*b.y + a.z*b.z

if a and b are normalized, result = cosine of the angle between a and b
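the formula in plain javascript (illustrative):

```javascript
// dot product of two 3d vectors stored as arrays
function dot(a, b) {
	return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// perpendicular unit vectors: cos(90°) = 0
dot([1, 0, 0], [0, 1, 0]); // → 0

// identical unit vectors: cos(0°) = 1
dot([0, 0, 1], [0, 0, 1]); // → 1
```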

lighting fragment shader
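the slide's shader is missing here; a standard diffuse-lighting sketch (the light direction is chosen arbitrarily):

```glsl
varying vec3 vNormal;

void main() {
	vec3 lightDir = normalize(vec3(0.5, 1.0, 0.75)); // hypothetical light
	// brightness = cosine of the angle between normal and light direction
	float brightness = max(dot(normalize(vNormal), lightDir), 0.0);
	gl_FragColor = vec4(vec3(brightness), 1.0);
}
```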

why doesn't the light change when the bunny rotates?

look at the vertex shader

we need to transform the normal as well
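in three.js the fix is one line in the vertex shader: use the built-in normalMatrix uniform, which rotates the normal along with the model (a sketch):

```glsl
varying vec3 vNormal;

void main() {
	// normalMatrix: the inverse transpose of the upper 3x3 of modelViewMatrix
	vNormal = normalMatrix * normal;
	gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
```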

texture mapping

  • we can give each vertex a 2d texture coordinate as an attribute
  • we pass the coordinate to the fragment shader
  • the gpu automagically interpolates to get the coordinate for each pixel
  • fragment shader samples the texture at that coordinate to get final color

three.js does it again

geometry.faceVertexUvs[0] = [];
geometry.faceVertexUvs[0].push([
	new THREE.Vector2(1, 0),
	new THREE.Vector2(0.5, 1),
	new THREE.Vector2(0, 0)
]);

var uniforms = {
	texture1: { type: 't', value: THREE.ImageUtils.loadTexture('texture.jpg') }
};

animated uvs
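the deck ends here in this copy; a fragment-shader sketch of the idea, scrolling the texture over time (assumes the vertex shader passes the uv attribute along as a varying named vUv, and uniform names match the earlier slides):

```glsl
uniform float time;
uniform sampler2D texture1;
varying vec2 vUv;

void main() {
	// slide the texture coordinate horizontally as time passes
	gl_FragColor = texture2D(texture1, vUv + vec2(time * 0.1, 0.0));
}
```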