SDF - Simple Demo Framework

A simple framework for creating demoscene style demos using JavaScript and WebGL.

Currently a work in progress, but you can read a little bit more about it at

The repo will occasionally be mirrored to my webspace so that you can look at the current state of things. The five examples currently available are:


SDF needs to be copied to a web server to work. It is being developed and tested on Apache (under WAMP) and Chrome. SDF has also been successfully run on IIS but required some extra setup to allow IIS to serve .frag and .vert files.
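For reference, IIS can be made to serve those extensions with MIME mappings in a web.config file. A minimal sketch (the choice of text/plain is an assumption; any text MIME type that IIS will serve works):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.webServer>
    <staticContent>
      <!-- Let IIS serve GLSL shader sources as plain text -->
      <mimeMap fileExtension=".frag" mimeType="text/plain" />
      <mimeMap fileExtension=".vert" mimeType="text/plain" />
    </staticContent>
  </system.webServer>
</configuration>
```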

Demo Config

Basic config for an SDF demo is done through the file "config.js", which should be placed in the root directory of your demo project. The format for the file is:

var myConfig = {
    scriptFilename: "script.json",
    audioFilename: "4mat - Surrender - 15 Bonus-Soft Touch.mp3",
    bpm: 160,
    startOffset: 0.207,
};

The fields have the following meanings:

  • scriptFilename: Name of the file to load the script from. Located in the user resource folder. Default value if omitted is "script.json"

  • audioFilename: Name of the audio file to load and play. Can be either an mp3 or ogg file, but both mp3 and ogg versions of the audio file must exist - SDF loads the appropriate audio format for your browser. Located in the user resource folder. There is no default value - you must specify an audio file.

  • bpm: Beats per minute of the audio track. Use this to sync your demo in bars, with a fixed 4/4 time signature. e.g. at a bpm of 120, the timeline advances by 1 for every four beats, i.e. every 2 seconds. Default value if omitted is 240 bpm, which lets you sequence your demo in seconds rather than beats.

  • startOffset: Added to the start of the audio track, this lets you adjust for any time that passes before the first beat. Default value if omitted is 0.
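Taken together, bpm and startOffset imply a simple seconds-to-timeline conversion. This is a sketch of the arithmetic described above, not SDF's actual internal code:

```javascript
// One timeline unit is one 4/4 bar: four beats at the configured bpm.
// startOffset shifts the track so that time 0 lands on the first beat.
function secondsToTimeline(seconds, bpm, startOffset = 0) {
    const secondsPerBar = (60 / bpm) * 4;
    return (seconds - startOffset) / secondsPerBar;
}
```

At 120 bpm a bar lasts 2 seconds, so secondsToTimeline(2, 120) is 1; at the default 240 bpm a bar lasts exactly 1 second, which is why the timeline then counts in seconds.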

Demo Sequencing

An SDF demo is scripted using a JSON5 formatted file. The script consists of a number of layers and groups, each with optional start points and lengths.

JSON5 is used as a much friendlier alternative to JSON. JSON itself is a pain to hand-edit due to its pedantic syntax and lack of support for comments. JSON5 is much more flexible and makes the whole process bearable.
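For example, everything in this snippet is valid JSON5 but would be rejected by a strict JSON parser:

```json5
{
    // comments are allowed
    unquotedKey: "keys don't need quotes",
    trailing: "commas are fine",
}
```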

A simple demo script looks something like this:

// Simple demo script
[
    {
        type: "Image",
        start: 0,
        length: 4,
        imageFilename: "logo1.png",
    },
    {
        type: "Image",
        start: 4,
        length: 4,
        imageFilename: "logo2.png",
    },
]

This would show an image ("logo1.png") for four bars, followed by a second image ("logo2.png") for the next four bars. More details about what the parameters do can be found below.

Layers and Effects

A demo script consists of one or more layers with optional (but highly useful) start times and lengths. To collect a number of layers together into scenes (where each scene also has a start time and length) you use the special "Group" layer.

Each layer type has a number of parameters. Some are specific to the layer type, some are generic. The generic ones are:

  • type: Type of layer to create. See below.

  • start: Floating point time value for when the layer is first visible.

  • length: Floating point time value for how long the layer is visible for.
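As an illustration of grouping, a scene built with a Group layer might look something like this. Note that the "layers" field name for a group's children and the child start times being relative to the group's own start are both assumptions made for this example, not confirmed SDF API:

```json5
{
    type: "Group",
    start: 8,
    length: 8,
    // assumption: a group's children live in a "layers" array and their
    // start times are relative to the group's own start
    layers: [
        { type: "Image", start: 0, length: 4, imageFilename: "scene1.png" },
        { type: "Image", start: 4, length: 4, imageFilename: "scene2.png" },
    ],
}
```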

Standard Layers and Effects

There are four types of built-in layers:

Image Layer


FullScreenShader Layer


FullScreenEffect Layer


Group Layer


Writing Custom Layers and Effects


Demo Resources

An SDF demo generally contains a number of resources of different types. These resources live inside the resources directory of your project. SDF's resource manager handles the loading of these resources and ensures that they all exist in memory when they are needed by the code. The resource manager spots duplicate requests and ignores them, meaning that you are free to reference an individual resource as many times as you want without causing it to be loaded more than once. The resource manager also handles conversion from raw files to JavaScript objects where it can. For example, a .png image is loaded as an HTML Image object, a .obj file as a Mesh object, and so on.
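The deduplication behaviour described above can be sketched as a cache keyed on filename. This is a minimal hypothetical illustration, not SDF's actual implementation:

```javascript
// Hypothetical sketch: duplicate requests for the same file return the
// cached result instead of triggering a second load.
const resourceCache = new Map();

function requestResource(filename, loader) {
    if (!resourceCache.has(filename)) {
        resourceCache.set(filename, loader(filename)); // first request loads
    }
    return resourceCache.get(filename); // later requests reuse the result
}
```

Requesting "logo1.png" twice with this scheme invokes the loader once; both callers get the same object.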

As well as user resources there are also system resources. System resources include some shaders that are used by the default layers as well as a few generic textures. The resource manager treats the system resource path as a fallback location when a specified resource cannot be found in the user path. This means that you don't need to copy or move the system resources into your resource directory - they can still be found and loaded. It also means that you can override system resources by putting a file with the same name in the user resource path. This enables you to, for example, easily replace "loadingScreen.png" with your own.

The following different resource types are supported natively:


SDF supports WebGL shaders written using GLSL. Shaders are split into two parts: vertex and fragment shaders. Vertex shaders should be saved with the ".vert" extension and fragment shaders with the ".frag" extension.
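For illustration, here is a minimal fragment shader of the kind that would be saved with the ".frag" extension. This is a generic GLSL example, not one of SDF's system shaders:

```glsl
// Minimal fragment shader: fill every pixel with a solid colour
precision mediump float;

void main() {
    gl_FragColor = vec4(1.0, 0.0, 0.5, 1.0);
}
```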


SDF supports PNG and JPG images - these are the two image formats with pretty much universal browser support. PNG is the preferred format as it is lossless and allows the use of an alpha channel. Images are loaded as an HTML Image object.

There are two types of textures in this world: power of two (POT) and non-power of two (NPOT). A POT texture is one where both dimensions are a "power of two" number of pixels, e.g. 1x1, 64x32, 256x256 or 512x1. In the dark old days of graphics hardware only POT textures were supported. In the modern world pretty much every graphics library supports both POT and NPOT. WebGL is more than a little backward in its support of NPOT textures - they work, but with some big limitations. The advice I will give is this: if you are displaying images using the Image layer then make the source image exactly the same size as your output canvas, otherwise always use POT textures. If you ever find that your textures are not displaying at all, or coming out as pure black, then check that you are using the correct dimensions.
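If in doubt, a texture dimension can be checked with the classic bit trick:

```javascript
// A positive integer is a power of two iff it has exactly one bit set.
function isPowerOfTwo(n) {
    return n > 0 && (n & (n - 1)) === 0;
}
```

isPowerOfTwo(256) and isPowerOfTwo(512) are true; isPowerOfTwo(300) is false, so a 300-pixel-wide texture would run into WebGL's NPOT restrictions.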

3D Meshes

SDF supports .obj files and loads them as a Mesh object. You can also create meshes in code from lists of vertices using Mesh.createFromLists(). Meshes can then be rendered using the Mesh object member functions renderColoured(), renderTextured(), and so on.

Obj file loading has limitations. Files that contain multiple meshes are loaded into SDF as a single mesh. Only triangle and quad faces are supported, so you should triangulate your model on export. Finally, materials are not supported. Texture UV co-ordinates are loaded from the file but you must load and apply the texture yourself, and the standard render functions only support one texture.


Some browsers support .ogg files only, some .mp3 only and some support both. This means that you will need to include your audio file in both formats. SDF's resource manager selects the appropriate file based on what the browser says it supports. Audio files are loaded as an HTML Audio object.
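Format selection along those lines can be sketched as follows. canPlayType() is the standard HTMLAudioElement probe, but the surrounding logic here is an assumption about SDF's internals, not its actual code:

```javascript
// canPlayType returns "", "maybe" or "probably"; the empty string is falsy.
function selectAudioExtension(canPlayType) {
    if (canPlayType("audio/mpeg")) return ".mp3";
    if (canPlayType("audio/ogg")) return ".ogg";
    return null; // neither format supported
}

// In a browser: selectAudioExtension(t => new Audio().canPlayType(t))
```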


The demo script counts as a resource and should live in the resource directory.

Running Your Demo With Arguments

When running an SDF demo you can pass some optional arguments in as part of the URL query string. The available options are:

  • stats: Adds a stats.js counter to the page

  • mute: Runs the demo with audio muted

  • transport: Enables transport commands - left and right arrows to seek, up arrow to jump back to the start and space bar to pause/resume

  • transportPersistent: Playback position and pause state will persist if you re-load the page. This is great for iteration as it allows you to edit resources and then reload the page without having the demo jump back to the start.
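Flags like these can be read straight from the query string with URLSearchParams. The option names below come from the list above, but the parsing logic is a sketch rather than SDF's actual code:

```javascript
// Presence-only flags: "?stats&mute" enables both options.
function parseOptions(search) {
    const params = new URLSearchParams(search);
    return {
        stats: params.has("stats"),
        mute: params.has("mute"),
        transport: params.has("transport"),
        transportPersistent: params.has("transportPersistent"),
    };
}

// In the demo page: parseOptions(window.location.search)
```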

As an example, this is what I generally use during development:

When actually sequencing a demo I would use:

Publishing Your Demo


External Libraries

SDF uses and is inspired by a number of libraries:


Do what you like with this code, but don't try to misrepresent it. Don't try to pass it off as your own. Do give credit. Demo or die.