Game of Life implemented with a fragment shader
The Game of Life is a two-dimensional pixelated world. Each pixel of the world is either alive or dead (displayed as black or white). The world steps from one state to the next. Living pixels continue to live if they have two or three living neighbors. Dead pixels come alive if they previously had exactly three living neighbors. This style of simulation is called a “cellular automaton”.
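For intuition, here's that update rule as a plain JavaScript sketch (this step function is just for illustration and assumes the world is a 64x64 array of booleans; the rest of this post runs the same rule on the GPU):

function step(world) {
  const next = [];
  for (let y = 0; y < 64; y++) {
    const row = [];
    for (let x = 0; x < 64; x++) {
      let neighbors = 0;
      for (let dy = -1; dy <= 1; dy++) {
        for (let dx = -1; dx <= 1; dx++) {
          if (dx === 0 && dy === 0) continue;
          // Out-of-bounds neighbors are undefined, i.e. dead.
          if (world[y+dy] && world[y+dy][x+dx]) neighbors++;
        }
      }
      row.push(world[y][x] ? neighbors === 2 || neighbors === 3 : neighbors === 3);
    }
    next.push(row);
  }
  return next;
}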
Above, I’ve implemented this simulation as a WebGL fragment shader.
The key feature used is “rendering to texture”. Normally, when you use functions like drawArrays and drawElements, WebGL draws straight to the screen. But we can tell WebGL to instead render to a texture. With this feature, textures can be both read and written by fragment shaders, meaning that we can use textures to store state!
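Here's a sketch of how the two 64x64 state textures might be created (the makeStateTexture helper is my own name; the WebGL calls are standard WebGL 1):

function makeStateTexture(gl) {
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // Allocate a 64x64 RGBA texture with no initial data.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 64, 64, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
  // Disable filtering and wrapping: each texel is exactly one cell.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  return tex;
}
const texture0 = makeStateTexture(gl);
const texture1 = makeStateTexture(gl);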
Above, the entire state is stored as a 64x64 texture. I have one “stepper” fragment shader which reads this texture and generates the next state:
precision mediump float;

uniform sampler2D previousState;

// Returns 1 if the cell at the given pixel coordinate was alive
// in the previous state, else 0. Cells outside the 64x64 world
// count as dead.
int wasAlive(vec2 coord) {
  if (coord.x < 0.0 || 64.0 < coord.x || coord.y < 0.0 || 64.0 < coord.y) return 0;
  vec4 px = texture2D(previousState, coord/64.0);
  return px.r < 0.1 ? 1 : 0;  // black pixels are alive
}

void main(void) {
  vec2 coord = vec2(gl_FragCoord);
  int aliveNeighbors =
    wasAlive(coord+vec2(-1.,-1.)) +
    wasAlive(coord+vec2(-1.,0.)) +
    wasAlive(coord+vec2(-1.,1.)) +
    wasAlive(coord+vec2(0.,-1.)) +
    wasAlive(coord+vec2(0.,1.)) +
    wasAlive(coord+vec2(1.,-1.)) +
    wasAlive(coord+vec2(1.,0.)) +
    wasAlive(coord+vec2(1.,1.));
  bool nowAlive = wasAlive(coord) == 1 ? 2 <= aliveNeighbors && aliveNeighbors <= 3 : 3 == aliveNeighbors;
  gl_FragColor = nowAlive ? vec4(0.,0.,0.,1.) : vec4(1.,1.,1.,1.);
}
I have a second, simpler fragment shader which I use to display the texture:
precision mediump float;

uniform sampler2D state;

void main(void) {
  vec2 coord = vec2(gl_FragCoord)/64.0;
  gl_FragColor = texture2D(state, coord);
}
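Each of these fragment shaders needs to be paired with a trivial vertex shader that draws a full-screen quad. Here's a sketch of how the two programs might be compiled and linked (the compileProgram helper, the coord attribute name, and the stepperSource / displaySource variables are my own names, not necessarily what the demo uses):

const vertexShaderSource = `
  attribute vec2 coord;
  void main(void) {
    gl_Position = vec4(coord, 0.0, 1.0);
  }
`;

function compileProgram(gl, vertSrc, fragSrc) {
  function compile(type, src) {
    const shader = gl.createShader(type);
    gl.shaderSource(shader, src);
    gl.compileShader(shader);
    if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS))
      throw new Error(gl.getShaderInfoLog(shader));
    return shader;
  }
  const prog = gl.createProgram();
  gl.attachShader(prog, compile(gl.VERTEX_SHADER, vertSrc));
  gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fragSrc));
  gl.linkProgram(prog);
  if (!gl.getProgramParameter(prog, gl.LINK_STATUS))
    throw new Error(gl.getProgramInfoLog(prog));
  return prog;
}

const stepperProg = compileProgram(gl, vertexShaderSource, stepperSource);
const stepperProgCoordLoc = gl.getAttribLocation(stepperProg, "coord");
const stepperProgPreviousStateLoc = gl.getUniformLocation(stepperProg, "previousState");
const displayProg = compileProgram(gl, vertexShaderSource, displaySource);
const displayProgStateLoc = gl.getUniformLocation(displayProg, "state");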
WebGL does not allow you to render to the same texture you’re reading. Instead, I maintain two textures, and swap between them: step from texture 0 to texture 1, then step from texture 1 to texture 0, then repeat.
To tell WebGL to render to a texture, instead of to the screen, we bind a framebuffer object to gl.FRAMEBUFFER. To tell WebGL to render to the screen again, we unbind the framebuffer by binding null.
Here’s the core simulation loop:
const framebuffers = [gl.createFramebuffer(), gl.createFramebuffer()];

// Attach texture 0 to framebuffer 0, and texture 1 to framebuffer 1,
// so that rendering to framebuffer N writes into texture N.
gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffers[0]);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture0, 0);
gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffers[1]);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture1, 0);

let nextStateIndex = 0;
window.setInterval(function() {
  const previousStateIndex = 1 - nextStateIndex;

  // Step: render the next state into the "next" texture,
  // reading the previous state from the other texture.
  gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffers[nextStateIndex]);
  gl.useProgram(stepperProg);
  gl.enableVertexAttribArray(stepperProgCoordLoc);
  gl.uniform1i(stepperProgPreviousStateLoc, previousStateIndex);
  gl.drawElements(gl.TRIANGLE_FAN, 4, gl.UNSIGNED_BYTE, 0);

  // Display: unbind the framebuffer to render to the screen,
  // showing the state we just computed.
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  gl.useProgram(displayProg);
  gl.uniform1i(displayProgStateLoc, nextStateIndex);
  gl.drawElements(gl.TRIANGLE_FAN, 4, gl.UNSIGNED_BYTE, 0);

  // Swap roles for the next tick.
  nextStateIndex = previousStateIndex;
}, 100);
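One detail this loop relies on: passing previousStateIndex or nextStateIndex to uniform1i selects a texture unit, so during setup, texture 0 must have been bound to texture unit 0 and texture 1 to texture unit 1, with something like this sketch:

// Bind each state texture to the texture unit matching its index.
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture0);
gl.activeTexture(gl.TEXTURE1);
gl.bindTexture(gl.TEXTURE_2D, texture1);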
The initial state is bootstrapped with a tiny 64x64 image containing a “Gosper glider gun”.
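Here's a sketch of how that image might be uploaded into texture 0 to serve as the first “previous state” (the image element and its path are hypothetical):

const gliderGunImg = new Image();
gliderGunImg.src = "gosper_glider_gun.png";  // hypothetical path to the 64x64 image
gliderGunImg.onload = function() {
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(gl.TEXTURE_2D, texture0);
  // Upload the image; black pixels are live cells.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, gliderGunImg);
};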
In future posts, I’ll use texture rendering for more simulations. I was thinking of making a simulation of water erosion. The state would represent a height-mapped landscape. I would bootstrap it with some Perlin noise. Each step of the simulation would put some water on the landscape, then flow the water at each pixel, dragging some land along with it. I can imagine it producing lakes, rivers, and ravines.