This outline lists the tutorials, in order, that are planned. Below this is a list of material that needs to be covered in tutorials, but is not yet covered.
These are the simpler tutorials. Little actual code, as they're mostly theory.
On OpenGL and Graphics
No code for this one. It simply covers how OpenGL works and how graphics and 3D rasterization works.
Covers how the source code for the simplest possible triangle renderer works.
- Shows how data flows down the rendering pipeline.
- The bare-minimum discussion of vertex attributes.
- The bare-minimum discussion of vertex shaders.
- The bare-minimum discussion of fragment shaders.
- The bare-minimum discussion of viewports.
Playing with Colors
This tutorial involves creating a colored triangle. First using fragment coordinate positions, then using interpolated vertex colors.
- gl_FragCoord and its contents.
- Multiple vertex arrays and attributes. Go into more explicit detail on exactly how OpenGL passes data to the vertex shader from a list of arrays.
- Interleaved vertex data. How to interleave vertex attributes. Also, explain why one would want to.
- Inputs and outputs between stages. Explain how outputs of vertex shaders and inputs of fragment shaders are connected.
- Describe how the connected variables are interpolated.
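The interpolation of connected variables can be sketched without any OpenGL at all. In this illustrative C++ fragment (the names are assumptions of the sketch, not tutorial code), the barycentric weights stand in for what the rasterizer computes for each fragment:

```cpp
#include <cassert>
#include <cmath>

// Per-vertex outputs are blended with barycentric weights (a, b, c),
// where a + b + c == 1. The fragment shader's input is the weighted
// sum of the three vertex shader outputs.
struct Color { float r, g, b; };

Color InterpolateColor(Color v0, Color v1, Color v2,
                       float a, float b, float c)
{
    return { a * v0.r + b * v1.r + c * v2.r,
             a * v0.g + b * v1.g + c * v2.g,
             a * v0.b + b * v1.b + c * v2.b };
}
```

A fragment at the exact center of a red/green/blue triangle thus gets equal parts of all three colors.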
Positioning and Vertices
These tutorials revolve around vertex arrays, vertex transformations, and moving things around.
OpenGL's Moving Triangle
This tutorial has a triangle moving around the screen.
- Uniform variables. How to set them.
- Input vs. Uniform vs. Constant. Granularity of how these variables change.
- GLSL arithmetic and standard functions.
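The kind of arithmetic involved can be sketched in C++; the same math could drive the vertex shader via an elapsed-time uniform. The function and parameter names here are illustrative assumptions, not the tutorial's actual code:

```cpp
#include <cassert>
#include <cmath>

// Compute an x/y offset that carries the triangle around a circle once
// every loopDuration seconds.
void ComputeOffsets(float elapsedTime, float loopDuration,
                    float &xOffset, float &yOffset)
{
    const float pi = 3.14159265f;
    float scale = 2.0f * pi / loopDuration;          // radians per second
    float t = std::fmod(elapsedTime, loopDuration);  // time within this loop
    xOffset = 0.5f * std::cos(t * scale);
    yOffset = 0.5f * std::sin(t * scale);
}
```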
Objects at Rest
This tutorial is all about perspective. Drawing a scene with a perspective camera. There should be a recognizable ground plane and a number of objects with different colors, so that we can tell them apart. Each object is directly part of the world, automatically placed in its proper location, rather than having its own local coordinate system.
- Perspective projection. The math behind making objects look like they're in a 3D world.
- Matrix math primer, and how to do much of the projection transform with matrices.
- How to do the aforementioned matrix math in GLSL.
- Using the above to transform objects from pre-projection space to post-projection clip space. Also, the perspective divide.
- VAOs. Storing the binding data for different objects.
- Depth buffers and depth testing.
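The heart of the projection math above can be sketched in plain C++. This is a hedged sketch, not the tutorial's implementation: frustumScale and the plane values are placeholders, only the perspective terms of the full matrix are shown, and the camera is assumed to look down -Z:

```cpp
#include <cassert>
#include <cmath>

struct Vec4 { float x, y, z, w; };

// Transform a camera-space point into clip space.
Vec4 PerspectiveProject(Vec4 cam, float frustumScale,
                        float zNear, float zFar)
{
    Vec4 clip;
    clip.x = cam.x * frustumScale;
    clip.y = cam.y * frustumScale;
    clip.z = cam.z * (zNear + zFar) / (zNear - zFar)
           + (2.0f * zNear * zFar) / (zNear - zFar);
    clip.w = -cam.z;  // camera looks down -Z
    return clip;
}

// The perspective divide: clip space to normalized device coordinates.
Vec4 PerspectiveDivide(Vec4 clip)
{
    return { clip.x / clip.w, clip.y / clip.w, clip.z / clip.w, 1.0f };
}
```

Points on the near plane land at NDC z = -1, points on the far plane at z = +1; everything else falls between.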
Objects in Motion
This tutorial is about having objects move relative to their own local coordinate system. It deals with multiple matrix operations, and transforming objects into different coordinate spaces.
- Object-local coordinates. Having vertex position data in local coordinates, and how to transform from local coordinates to pre-projection space.
- The different kinds of transformations: rotation, translation, and scale. Deal with matrix concatenation through multiplication.
- Uniform Buffer Objects. Use these to store each object's per-instance transformation data (and possibly color). This keeps us from having to set these uniforms directly; we can just upload data to the buffer object. Use std140 layout.
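Why concatenation order matters can be shown with a minimal C++ sketch (illustrative names; the transforms are written as functions rather than matrices to keep it short). Applying T * R to a point means rotating first, about the object's own origin, then translating it into place:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 RotateZ(Vec3 p, float radians)
{
    float c = std::cos(radians), s = std::sin(radians);
    return { c * p.x - s * p.y, s * p.x + c * p.y, p.z };
}

Vec3 Translate(Vec3 p, Vec3 offset)
{
    return { p.x + offset.x, p.y + offset.y, p.z + offset.z };
}

// Equivalent of (T * R) * p: rotate first, then translate.
Vec3 LocalToWorld(Vec3 p, float radians, Vec3 offset)
{
    return Translate(RotateZ(p, radians), offset);
}
```

Swapping the order (translate, then rotate) would instead swing the object around the world origin, which is rarely what is wanted.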
World in Motion
This tutorial builds on the last, introducing a camera into the world. Thus, the world becomes its own distinct space, and a camera transform is necessary to get from world space to pre-projection space.
- Camera space, as distinct from world space and clip space. This essentially names the space we've been calling pre-projection. It also introduces the explicit local-to-world-to-camera-to-clip-space transform sequence.
- UBOs for shared data (camera and projection matrices). This may involve multiple program objects that use different data.
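The transform sequence above composes right-to-left: clip = Projection * Camera * World * local. A minimal C++ sketch with bare 4x4 matrices (row-major here for readability, and using only translations; the helper names and positions are assumptions):

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat4 = std::array<float, 16>;
struct Vec4 { float x, y, z, w; };

Mat4 TranslationMat(float x, float y, float z)
{
    return { 1, 0, 0, x,
             0, 1, 0, y,
             0, 0, 1, z,
             0, 0, 0, 1 };
}

Mat4 Mul(const Mat4 &a, const Mat4 &b)
{
    Mat4 r{};
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            for (int k = 0; k < 4; ++k)
                r[row * 4 + col] += a[row * 4 + k] * b[k * 4 + col];
    return r;
}

Vec4 Apply(const Mat4 &m, Vec4 v)
{
    return { m[0]*v.x + m[1]*v.y + m[2]*v.z + m[3]*v.w,
             m[4]*v.x + m[5]*v.y + m[6]*v.z + m[7]*v.w,
             m[8]*v.x + m[9]*v.y + m[10]*v.z + m[11]*v.w,
             m[12]*v.x + m[13]*v.y + m[14]*v.z + m[15]*v.w };
}
```

Concatenating worldToCamera with modelToWorld, in that order, carries a local-space point all the way into camera space in one matrix application.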
These tutorials show how to light objects in a scene.
This tutorial deals with lighting a number of objects in the world. There will be directional and point lights.
- The basics of how light works and how lighting equations work.
- The diffuse lighting equation.
- Normals in vertex attribute data. Needed to make lighting work; they define the direction of the surface.
- Implement vertex lighting.
- Implement multiple light sources, all combined in the vertex shader.
- Deal with perspective-correct interpolation of light colors.
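The diffuse (Lambertian) term at the center of this tutorial is compact enough to sketch directly; the same expression would appear in the vertex shader. Both vectors are assumed unit-length, and the clamp keeps back-facing surfaces black rather than negative:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

float Dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Lambertian diffuse: intensity falls off with the cosine of the angle
// between the surface normal and the direction to the light.
float DiffuseIntensity(Vec3 normal, Vec3 dirToLight)
{
    return std::max(0.0f, Dot(normal, dirToLight));
}
```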
This tutorial shows how to perform per-fragment lighting.
- Show the limitations of per-vertex lighting. Use a large plane with a moving light source.
- Implement lighting in the fragment shader, by transforming the light to the proper spaces.
- Use some precomputations in the vertex shader as an optimization of fragment lighting.
This tutorial introduces specular lighting and the Phong lighting model. It implements the model both per-vertex and per-fragment, just to show how bad per-vertex lighting can get.
- Introduce the concept of specular lighting.
- Show an implementation of Phong lighting, per-fragment and per-vertex. Allow the user to play with the Phong exponent.
- Implement Blinn-Phong. Talk about how it is different. This program should be able to switch between the two.
- Implement the Gaussian distribution. Allow switching between the three specular terms. Talk about how it is more physically accurate.
- Discuss a bit about the combination of diffuse surface color and specular surface color. Specifically about how to make an object look shiny, and how this really isn't good enough to get strong, mirrored reflections.
- Point out the restrictions in using the "pow" function in GLSL.
This tutorial has several objects illuminated by many light sources. It shows how to do HDR as well, since we're working with a fairly large scene.
- Light sources add values when combined.
- Formally introduce the concept of material properties. These are a broad class of inputs to a lighting model which are distinct from lighting properties.
- Show light clipping when using many lights together.
- HDR allows working with an arbitrary dynamic range. Manually clamp at the end.
- Looping in shaders.
- Uniform arrays and their limitations.
- UBOs storing uniform array data.
- Linear colorspaces, gamma correction. Add gamma correction (to sRGB) into our tone mapping logic.
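The end-of-pipe step described above can be sketched in two small functions (illustrative names; the 1/2.2 exponent is the common approximation of the sRGB curve, not the exact piecewise function):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Bring an HDR value into [0, 1] by dividing by the scene's chosen
// maximum intensity, clamping whatever still exceeds it.
float ToneMap(float hdrValue, float maxIntensity)
{
    return std::min(1.0f, hdrValue / maxIntensity);
}

// Gamma-correct the linear result for display.
float LinearToGamma(float linearValue, float gamma = 2.2f)
{
    return std::pow(linearValue, 1.0f / gamma);
}
```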
Lies and Impostors
This tutorial shows how to set up an impostor, complete with fully accurate depth.
- Introduce the idea of impostors.
- Show how to use discard to clip off the non-circle parts of a triangle.
- Use standard lighting techniques to color the pixels. The normal of the triangle is computed as needed.
- Show that fragment depth can be written, which gives the sphere its shape.
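The per-fragment impostor math can be sketched without OpenGL. Given a point on the square in a [-1, 1] coordinate system centered on the sphere (an assumption of this sketch), we decide whether the fragment survives the discard and, if so, recover the sphere-surface Z used for the normal and depth:

```cpp
#include <cassert>
#include <cmath>

// Returns false for fragments outside the unit circle (these would be
// discarded). Otherwise writes the Z of the unit-sphere surface facing
// the viewer, from which the normal and fragment depth follow.
bool SpherePoint(float x, float y, float &z)
{
    float distSq = x * x + y * y;
    if (distSq > 1.0f)
        return false;
    z = std::sqrt(1.0f - distSq);
    return true;
}
```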
Explain how other tutorials get to texturing almost immediately, and why this one doesn't: the emphasis here is on shaders and such, particularly the impostor in the last section.
Textures are not Pictures
This tutorial illustrates the basic use of texturing. It shows a single mesh, under the influence of a light source. The texture on this mesh represents the specular reflectance value (for the Gaussian distribution). It should include multiple different reflectance textures, to show off the differences between them.
- First, introduce perspective-correct interpolation.
- OpenGL texture objects. This includes a discussion of texture types and a very basic discussion of image formats. Include a bit on pixel transfers. Make a note about how OpenGL's texturing API is... weird.
- Texture mapping. Vertices are tagged with values called texture coordinates that represent locations in a texture to be used at that position. Introduce normalized texture coordinates: a way to talk about locations in a texture regardless of that texture's size.
- OpenGL sampler objects. These are used to hold some basic state about how to access the texture.
- GLSL texturing. Sampler types and texture functions.
- How to associate textures with programs. Similar to UBOs, this needs a diagram.
- Animating texture coordinates. Slide the texture across the surface by biasing the texture coordinate in the fragment shader.
- Textures replacing shader functions. Create a 2D texture that represents the Gaussian distribution. X is the dot(H,N), and Y is the Gaussian power.
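Building one row of that lookup texture can be sketched as follows (the resolution and roughness are illustrative; texel X covers dot(H,N) mapped across [0, 1], sampled at texel centers):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Precompute the Gaussian specular term into a table, replacing the
// per-fragment acos/exp with a single texture fetch.
std::vector<float> BuildGaussianRow(int width, float roughness)
{
    std::vector<float> row(width);
    for (int i = 0; i < width; ++i) {
        float cosAngle = (i + 0.5f) / width;  // dot(H,N) at texel center
        float angle = std::acos(cosAngle);
        float t = angle / roughness;
        row[i] = std::exp(-t * t);
    }
    return row;
}
```

The resulting row rises monotonically toward 1 as dot(H,N) approaches 1, exactly the shape the shader function would compute.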
More Images is Better
This tutorial shows off texture filtering and antialiasing. There are two sections: one that introduces magnification filtering with a standard diffuse color texture, and one that introduces texture aliasing with a checkerboard pattern mapped over a large square (no lighting).
Do note for the user that they may need to play with their graphics card's driver settings, setting all of the texture filtering modes to application-specific.
- Image formats. RGB vs. 1-channel (we used a 1-channel format last time).
- Pixel transfers. How to get RGB data to the texture correctly.
- Texture filtering. Changing how OpenGL accesses the texture, in magnification and minification modes.
- Texture aliasing. Abbreviated signal-processing lecture.
- Mipmap filtering and how it works. Also automatic mipmap generation.
- Texture isotropy. Show how mipmapping can over-filter the texture.
- Anisotropic filtering. Explain how this works.
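One small, concrete piece of the mipmap discussion is how many levels a full chain has: each level halves both dimensions (rounding down) until a 1x1 level is reached. A sketch:

```cpp
#include <algorithm>
#include <cassert>

// Number of mipmap levels in a complete chain for a texture of the
// given size, counting the base level.
int MipmapLevelCount(int width, int height)
{
    int levels = 1;
    int size = std::max(width, height);
    while (size > 1) {
        size /= 2;
        ++levels;
    }
    return levels;
}
```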
Gamma-Correct Texturing
Show off proper gamma-correct texturing.
- Gamma colorspace and sRGB colorspace. Explain the issues around this, including filtering and the like. Create a lit scene with a ground plane, and with two textures, one with sRGB and one without it.
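The exact sRGB transfer functions behind this tutorial can be sketched directly (this is the piecewise curve from the sRGB specification, distinct from the plain 1/2.2 power approximation). Decoding to linear is what the hardware does for an sRGB texture before filtering and lighting:

```cpp
#include <cassert>
#include <cmath>

// sRGB-encoded value to linear intensity.
float SrgbToLinear(float s)
{
    return (s <= 0.04045f) ? s / 12.92f
                           : std::pow((s + 0.055f) / 1.055f, 2.4f);
}

// Linear intensity back to sRGB encoding.
float LinearToSrgb(float lin)
{
    return (lin <= 0.0031308f) ? lin * 12.92f
                               : 1.055f * std::pow(lin, 1.0f / 2.4f) - 0.055f;
}
```

The two functions are inverses, so decoding and re-encoding round-trips a value; the difference between the two textures in the scene comes from filtering and lighting happening in linear space for one and gamma space for the other.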
Climbing the Mountain
This tutorial introduces the concept of a height map. It uses vertex texturing to offset the Z-values of vertices on a flat plane. It then uses bump mapping and tangent-space texturing to bias the normals for lighting purposes.
- Vertex texturing. The restrictions compared to fragment texture access.
- Scaling the data from the texture.
- Constructing normals from the height-field data. Use the offset texture functions.
- Texture/tangent-space lighting. This requires binormal and tangent vectors, in order to transform the light direction from model space to tangent-space.
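Constructing a normal from neighboring height samples can be sketched with central differences (an illustrative method; texelSpacing, the world-space distance between adjacent samples, is an assumption of this sketch, and Z is taken as "up" out of the plane):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Central differences give the slope along X and Y; the unnormalized
// normal is (-dZ/dX, -dZ/dY, 1), which we then normalize.
Vec3 HeightFieldNormal(float hLeft, float hRight,
                       float hDown, float hUp, float texelSpacing)
{
    Vec3 n = { (hLeft - hRight) / (2.0f * texelSpacing),
               (hDown - hUp) / (2.0f * texelSpacing),
               1.0f };
    float len = std::sqrt(n.x*n.x + n.y*n.y + n.z*n.z);
    return { n.x/len, n.y/len, n.z/len };
}
```

A flat region yields the straight-up normal; a slope tilts the normal away from the rising side.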
This tutorial uses texture-based spot lights. It demonstrates projective texturing.
- Projective texturing. Explain how this works, based on our prior projection knowledge.
- Texture values defining lighting intensity. This also means floating-point textures.
- Using a global texture over a scene, in addition to local textures that may be available.
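The coordinate math behind projective texturing can be sketched in a few lines: take the clip-space position produced by the light's projection, do the perspective divide, then remap [-1, 1] to the [0, 1] texture range. (In GLSL, textureProj performs the divide itself; the names below are illustrative.)

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { float s, t; };

// Clip-space XY and W (from the light's projection) to a 2D texture
// coordinate: perspective divide, then remap [-1, 1] to [0, 1].
Vec2 ProjectiveTexCoord(float clipX, float clipY, float clipW)
{
    float ndcX = clipX / clipW;
    float ndcY = clipY / clipW;
    return { ndcX * 0.5f + 0.5f, ndcY * 0.5f + 0.5f };
}
```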