Commits

Jason McKesson committed 00dd6eb

Finished most of Tutorial 2, except for images.

  • Parent commits 6cfb4c6

Files changed (3)

Documents/Basics/Tutorial 01.xml

             </glossdef>
         </glossentry>
         <glossentry>
+            <glossterm>Input Variable</glossterm>
+            <glossdef>
+                <para>A shader variable, declared at global scope, using the <literal>in</literal>
+                    keyword. Input variables receive their values from earlier stages in the OpenGL
+                    rendering pipeline.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>Output Variable</glossterm>
+            <glossdef>
+                <para>A shader variable, declared at global scope, using the <literal>out</literal>
+                    keyword. Output variables written to by a shader are passed to later stages in
+                    the OpenGL rendering pipeline for processing.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
             <glossterm>Vertex Attribute</glossterm>
             <glossdef>
-                <para>A single input to a vertex shader. Each vertex attribute is a vector of up to
-                    4 elements in length. Vertex attributes are drawn from buffer objects; the
-                    connection between buffer object data and vertex inputs is made with the
-                        <function>glVertexAttribPointer</function> and
+                <para>Input variables to vertex shaders are called vertex attributes. Each vertex
+                    attribute is a vector of up to 4 elements in length. Vertex attributes are drawn
+                    from buffer objects; the connection between buffer object data and vertex inputs
+                    is made with the <function>glVertexAttribPointer</function> and
                         <function>glEnableVertexAttribArray</function> functions. Each vertex
                     attribute in a particular program object has an index; this index can be queried
                     with <function>glGetAttribLocation</function>. The index is used by the various

Documents/Basics/Tutorial 02.xml

 {
     float lerpValue = gl_FragCoord.y / 500.0f;
     
-    outputColor = mix(vec4(1.0f, 1.0f, 1.0f, 1.0f), vec4(0.2f, 0.2f, 0.2f, 1.0f), lerpValue);
+    outputColor = mix(vec4(1.0f, 1.0f, 1.0f, 1.0f),
+        vec4(0.2f, 0.2f, 0.2f, 1.0f), lerpValue);
 }]]></programlisting>
         </example>
         <para><varname>gl_FragCoord</varname> is a built-in variable that is only available in a
             <note>
                 <para>If you're wondering why it is <literal>(void*)48</literal> and not just 48,
                     that is because of some legacy API cruft. The reason why the function name is
-                            <function>glVertexAttrib<emphasis>Pointer</emphasis></function> is
+                            glVertexAttrib<quote>Pointer</quote> is
                     because the last parameter is technically a pointer to client memory. Or at
                     least, it could be in the past.</para>
             </note>
             <para>After this, we use <function>glDrawArrays</function> to render, then disable the
                 arrays with <function>glDisableVertexAttribArray.</function></para>
             <section>
-                <title>A Look at Drawing</title>
-                <para>In the last tutorial, we skimmed past the details of what exactly
+                <title>Drawing in Detail</title>
+                <para>In the last tutorial, we skimmed over the details of what exactly
                         <function>glDrawArrays</function> does. Let us take a closer look
                     now.</para>
                 <para>The various attribute array functions set up arrays for OpenGL to read from
                     when rendering. In our case here, we have two arrays. Each array has a buffer
                     object and an offset into that buffer where the array begins, but the arrays do
-                    not have an explicit size.</para>
+                    not have an explicit size. If we look at everything as C++ pseudo-code, what we
+                    have is this:</para>
+                <example>
+                    <title>Vertex Arrays</title>
+                    <programlisting><![CDATA[GLbyte *bufferObject = (void*){0.0f, 0.5f, 0.0f, 1.0f, 0.5f, -0.366f, ...};
+float *positionAttribArray[4] = (float *[4])(&(bufferObject + 0));
+float *colorAttribArray[4] = (float *[4])(&(bufferObject + 48));]]></programlisting>
+                </example>
+                <para>Each element of the <varname>positionAttribArray</varname> contains 4
+                    components. This is the case because the second parameter of
+                        <function>glVertexAttribPointer</function> is 4. Each component is a
+                    floating-point number, again because the third parameter is
+                        <literal>GL_FLOAT</literal>. The array takes its data from
+                        <varname>bufferObject</varname> because this was the buffer object that was
+                    bound at the time that <function>glVertexAttribPointer</function> was called.
+                    And the offset from the beginning of the buffer object is 0 because that is the
+                    last parameter of <function>glVertexAttribPointer</function>.</para>
+                <para>The same goes for <varname>colorAttribArray</varname>.</para>
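+                <para>In terms of actual OpenGL calls, a setup along the following lines would
+                    produce those two arrays. This is only a sketch: the attribute indices
+                        <varname>positionAttrib</varname> and <varname>colorAttrib</varname> stand
+                    for whatever values <function>glGetAttribLocation</function> returned.</para>
+                <example>
+                    <title>Attribute Array Setup Sketch</title>
+                    <programlisting><![CDATA[glBindBuffer(GL_ARRAY_BUFFER, bufferObject);
+glEnableVertexAttribArray(positionAttrib);
+glEnableVertexAttribArray(colorAttrib);
+
+//4 floats per attribute, tightly packed; positions start at byte 0, colors at byte 48.
+glVertexAttribPointer(positionAttrib, 4, GL_FLOAT, GL_FALSE, 0, 0);
+glVertexAttribPointer(colorAttrib, 4, GL_FLOAT, GL_FALSE, 0, (void*)48);]]></programlisting>
+                </example>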
+                <para>Using the above pseudo-code, <function>glDrawArrays</function> is implemented
+                    as follows:</para>
+                <example>
+                    <title>Draw Arrays Implementation</title>
+                    <programlisting><![CDATA[void glDrawArrays(GLenum type, GLint start, GLint count)
+{
+    for(int element = start; element < start + count; element++)
+    {
+        VertexShader(positionAttribArray[element], colorAttribArray[element]);
+    }
+}
+]]></programlisting>
+                </example>
+                <para>This means that the vertex shader will be executed <varname>count</varname>
+                    times, and it will be given data beginning with the <varname>start</varname>-th
+                    element and continuing for <varname>count</varname> elements. So the first time
+                    the vertex shader gets run, it takes the position attribute from
+                        <literal>bufferObject[0 + (0 * 4 * sizeof(float))]</literal> and the color
+                    attribute from <literal>bufferObject[48 + (0 * 4 * sizeof(float))]</literal>.
+                    The second time pulls the position from <literal>bufferObject[0 + (1 * 4 *
+                        sizeof(float))]</literal> and color from <literal>bufferObject[48 + (1 * 4 *
+                        sizeof(float))]</literal>. And so on.</para>
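+                <para>For the single triangle in this tutorial, the draw call is therefore
+                    equivalent to the following, assuming it is issued with
+                        <literal>GL_TRIANGLES</literal> and starts at element 0:</para>
+                <example>
+                    <title>Triangle Draw Call</title>
+                    <programlisting><![CDATA[glDrawArrays(GL_TRIANGLES, 0, 3);
+//element 0: position from byte 0,  color from byte 48
+//element 1: position from byte 16, color from byte 64
+//element 2: position from byte 32, color from byte 80]]></programlisting>
+                </example>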
             </section>
         </section>
         <section>
             <title>Vertex Shader</title>
-            <para/>
+            <para>Our new vertex shader looks like this:</para>
+            <example>
+                <title>Multi-input Vertex Shader</title>
+                <programlisting><![CDATA[#version 150
+
+in vec4 position;
+in vec4 color;
+
+smooth out vec4 theColor;
+
+void main()
+{
+    gl_Position = position;
+    theColor = color;
+}]]></programlisting>
+            </example>
+            <para>There are three new lines here. Let us take them one at a time.</para>
+            <para>The declaration of the global <varname>color</varname> defines a new input for the
+                vertex shader. So this shader, in addition to taking an input named
+                    <varname>position</varname>, also takes a second input named
+                    <varname>color</varname>. As with the <varname>position</varname> input, the
+                tutorial queries the attribute location after the program is linked, using
+                    <function>glGetAttribLocation</function>:</para>
+            <example>
+                <title>Querying the Attributes</title>
+                <programlisting><![CDATA[positionAttrib = glGetAttribLocation(theProgram, "position");
+colorAttrib = glGetAttribLocation(theProgram, "color");]]></programlisting>
+            </example>
+            <para>The <varname>colorAttrib</varname> variable is used later when defining the
+                attribute arrays for rendering.</para>
+            <para>That much only gets the data into the vertex shader. We want to pass this data out
+                of the vertex shader. To do this, we must define an <glossterm>output
+                    variable</glossterm>. This is done using the <literal>out</literal> keyword. In
+                this case, the output variable is called <varname>theColor</varname> and is of type
+                    <type>vec4</type>.</para>
+            <para>The <literal>smooth</literal> keyword is an <glossterm>interpolation
+                    qualifier</glossterm>. We will see what this means a bit later.</para>
+            <para>Of course, this simply defines the output variable. In <function>main</function>,
+                we actually write to it, assigning to it the value of <varname>color</varname> that
+                was given as a vertex attribute. This being shader code, we could have used some
+                other heuristic or arbitrary algorithm to compute the color. But for the purpose of
+                this tutorial, it is simply passing the value of an attribute passed to the vertex
+                shader.</para>
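+            <para>As a purely hypothetical illustration, the assignment in
+                    <function>main</function> could instead compute the color from the position,
+                without using the <varname>color</varname> attribute at all:</para>
+            <example>
+                <title>Hypothetical Computed Color</title>
+                <programlisting><![CDATA[//Not what this tutorial does; just an example of computing an output value.
+theColor = vec4(position.x + 0.5, position.y + 0.5, 1.0, 1.0);]]></programlisting>
+            </example>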
+            <para>Technically, the built-in variable <varname>gl_Position</varname> is defined as
+                    <literal>out vec4 gl_Position</literal>. So it is an output variable as well. It
+                is a special output because this value is directly used by the system, rather than
+                used only by shaders. User-defined outputs, like <varname>theColor</varname> above,
+                have no intrinsic meaning to the system. They only have an effect in so far as other
+                shader stages use them, as we will see next.</para>
+        </section>
+        <section>
+            <title>Fragment Program</title>
+            <para>The new fragment shader looks like this:</para>
+            <example>
+                <title>Fragment Shader with Input</title>
+                <programlisting><![CDATA[#version 150
+
+smooth in vec4 theColor;
+
+out vec4 outputColor;
+
+void main()
+{
+    outputColor = theColor;
+}]]></programlisting>
+            </example>
+            <para>This fragment shader defines an input variable. It is no coincidence that this
+                input variable is named and typed the same as the output variable from the vertex
+                shader. We are trying to feed information from the vertex shader to the fragment
+                shader. To do this, OpenGL requires that the output from the previous stage have the
+                same name and type as the input to the next stage. It also must use the same
+                interpolation qualifier; if the vertex shader used <literal>smooth</literal>, the
+                fragment shader must do the same.</para>
+            <para>This is a good part of the reason why OpenGL requires vertex and fragment shaders
+                to be linked together; if the names, types, or interpolation qualifiers do not
+                match, then OpenGL can raise an error at link time.</para>
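+            <para>Though not shown in the tutorial text, such a mismatch can be caught with the
+                usual program-object queries. The following is a sketch of such a check, where
+                    <varname>theProgram</varname> is the linked program object:</para>
+            <example>
+                <title>Checking for Link Errors</title>
+                <programlisting><![CDATA[GLint status;
+glGetProgramiv(theProgram, GL_LINK_STATUS, &status);
+if (status == GL_FALSE)
+{
+    GLint infoLogLength;
+    glGetProgramiv(theProgram, GL_INFO_LOG_LENGTH, &infoLogLength);
+
+    GLchar *strInfoLog = new GLchar[infoLogLength + 1];
+    glGetProgramInfoLog(theProgram, infoLogLength, NULL, strInfoLog);
+    fprintf(stderr, "Linker failure: %s\n", strInfoLog);
+    delete[] strInfoLog;
+}]]></programlisting>
+            </example>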
+            <para>So the fragment shader receives the value output from the vertex shader. The
+                shader simply takes this value and copies it to the output. Thus, the color of each
+                fragment will simply be whatever the vertex shader passed along.</para>
         </section>
         <section>
             <title>Fragment Interpolation</title>
-            <para/>
+            <para>Now we come to the elephant in the room, so to speak. There is a communication
+                problem.</para>
+            <para>See, our vertex shader is run 3 times. This execution produces 3 output positions
+                    (<varname>gl_Position</varname>) and 3 output colors
+                    (<varname>theColor</varname>). The 3 positions are used to construct and
+                rasterize a triangle, producing a number of fragments.</para>
+            <para>The fragment shader is not run 3 times. It is run once for every fragment produced
+                by the rasterizer for this triangle. The number of fragments produced by a triangle
+                depends on the viewing resolution and how much area of the screen the triangle
+                covers. At a resolution of 500x500, an equilateral triangle whose sides have a
+                length of 1 has an area of ~0.433. The total screen area (on the range [-1, 1] in X
+                and Y) is 4, so our triangle covers approximately one-tenth of the screen. 500*500
+                is 250,000 pixels; one-tenth of this is 25,000. So our fragment shader gets executed
+                about 25,000 times.</para>
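+            <para>The arithmetic, written out as a rough estimate:</para>
+            <example>
+                <title>Fragment Count Estimate</title>
+                <programlisting><![CDATA[float triangleArea = 0.433f;                  //equilateral triangle, side length 1
+float screenArea = 2.0f * 2.0f;               //[-1, 1] in both X and Y
+float coverage = triangleArea / screenArea;   //~0.108, roughly one-tenth
+float fragments = coverage * 500.0f * 500.0f; //on the order of 25,000 fragment shader executions]]></programlisting>
+            </example>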
+            <para>There's a slight disparity here. If the vertex shader is directly communicating
+                with the fragment shader, and the vertex shader is outputting only 3 total color
+                values, where do the other 24,997 values come from?</para>
+            <para>The answer is <glossterm>fragment interpolation</glossterm>.</para>
+            <para>By using the interpolation qualifier <literal>smooth</literal> when defining the
+                vertex output and fragment input, we are telling OpenGL to do something special with
+                this value. Instead of each fragment receiving one of the values, what each fragment
+                gets is a <emphasis>blend</emphasis> between the three output values. The closer the
+                fragment is to one vertex, the more that vertex's output contributes to the value
+                that the fragment program receives.</para>
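+            <para>Conceptually, the value a fragment receives is a weighted sum of the three vertex
+                outputs, where the weights depend on where the fragment lies within the triangle and
+                always sum to 1. As a sketch (not literal GLSL you would write):</para>
+            <example>
+                <title>Interpolated Input Sketch</title>
+                <programlisting><![CDATA[//vertColor[0..2]: theColor written by the three vertex shader executions
+//weight[0..2]:    this fragment's blend weights, each in [0, 1], summing to 1
+vec4 fragmentColor = weight[0] * vertColor[0] +
+                     weight[1] * vertColor[1] +
+                     weight[2] * vertColor[2];]]></programlisting>
+            </example>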
+            <para>Because such interpolation is by far the most common mode for communicating between
+                the vertex shader and the fragment shader, if you do not provide an interpolation
+                keyword, <literal>smooth</literal> will be used by default. There are two other
+                alternatives: <literal>noperspective</literal> and <literal>flat</literal>.</para>
+            <para>If you were to modify the tutorial and change <literal>smooth</literal> to
+                    <literal>noperspective</literal>, you would see no change. That does not mean a
+                change did not happen; our example is too simple for there to actually be a change.
+                The difference between <literal>smooth</literal> and
+                    <literal>noperspective</literal> is subtle, and only matters once we start using
+                more concrete and real tutorials. We will discuss this difference much later.</para>
+            <para>The <literal>flat</literal> interpolation actually turns interpolation off. It
+                essentially says that, rather than interpolating between three values, each fragment
+                of the triangle will simply get the first of the three vertex shader output
+                variables. The fragment shader gets a flat value across the surface of the triangle,
+                hence the term <quote>flat.</quote></para>
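+            <para>Switching to it is just a matter of changing the qualifier on the matching output
+                and input declarations:</para>
+            <example>
+                <title>Flat Qualifier Declarations</title>
+                <programlisting><![CDATA[flat out vec4 theColor;    //in the vertex shader
+flat in vec4 theColor;     //in the fragment shader]]></programlisting>
+            </example>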
+            <para>Each triangle has its own </para>
+        </section>
+        <section>
+            <title>The Final Image</title>
+            <para>When you run the tutorial, you will get the following image.</para>
+            <!--TODO: Insert image of interpolated colors here.-->
+            <para>The colors at each tip of the triangle are the pure red, green, and blue colors.
+                They blend together towards the center of the triangle.</para>
         </section>
     </section>
     <section>
         <title>In Review</title>
-        <para/>
+        <para>In this tutorial, you learned how to use the <varname>gl_FragCoord</varname>
+            built-in variable in a fragment shader. You learned that this data is in screen space,
+            and how to convert screen-space values into more useful forms. You also learned how to pass
+            data directly from vertex shaders to fragment shaders. You learned about the
+            interpolation that happens to vertex shader outputs across the surface of a
+            triangle.</para>
         <section>
             <title>Further Study</title>
             <para>Here are some ideas to play around with.</para>
             <itemizedlist>
                 <listitem>
-                    <para/>
-                </listitem>
-                <listitem>
                     <para>Change the viewport in the FragPosition tutorial. Put the viewport in the
                         top half of the display, and then put it in the bottom half. See how this
                         affects the shading on the triangle.</para>
                 </listitem>
+                <listitem>
+                    <para>Combine the FragPosition tutorial with the Vertex Color tutorial. Use
+                        interpolated color from the vertex shader and combine that with a constant
+                        color, based on the screen-space position of the fragment.</para>
+                </listitem>
             </itemizedlist>
         </section>
     </section>
+    <glossary>
+        <title>Glossary</title>
+        <glossentry>
+            <glossterm>Fragment Interpolation</glossterm>
+            <glossdef>
+                <para>The process by which values for fragment shader inputs are generated by
+                    blending the corresponding vertex shader outputs across the surface of a
+                    triangle. With smooth interpolation, each fragment receives a blend of the three
+                    vertices' output values, weighted by how close the fragment is to each
+                    vertex.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>Interpolation Qualifier</glossterm>
+            <glossdef>
+                <para>A keyword applied to vertex shader outputs and fragment shader inputs that
+                    controls how values are interpolated across a triangle. The qualifiers are
+                        <literal>smooth</literal> (the default), <literal>noperspective</literal>,
+                    and <literal>flat</literal>; the output and input declarations must use the
+                    same qualifier.</para>
+            </glossdef>
+        </glossentry>
+    </glossary>
 </chapter>

Documents/Outline.xml

     <para>This tutorial outline will describe the relationship between the various tutorials, as
         well as the expected order in which they appear.</para>
     <section>
-        <title>Hello, Triangle!</title>
-        <para>The most basic of GL programs, this will draw a single solid triangle over a blank
-            background.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>That 3D graphics is made of triangles.</para>
-            </listitem>
-            <listitem>
-                <para>The process of scan conversion.</para>
-            </listitem>
-            <listitem>
-                <para>The data pathway of OpenGL, from input vertex attributes to output fragment
-                    data.</para>
-            </listitem>
-            <listitem>
-                <para>The absolute bare-minimum vertex buffer code.</para>
-            </listitem>
-            <listitem>
-                <para>The absolute bare-minimum vertex shader code.</para>
-            </listitem>
-            <listitem>
-                <para>The absolute bare-minimum fragment shader code.</para>
-            </listitem>
-        </itemizedlist>
+        <title>Introduction</title>
+        <section>
+            <title>Hello, Triangle!</title>
+            <para>The most basic of GL programs, this will draw a single solid triangle over a blank
+                background.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>That 3D graphics is made of triangles.</para>
+                </listitem>
+                <listitem>
+                    <para>The process of scan conversion.</para>
+                </listitem>
+                <listitem>
+                    <para>The data pathway of OpenGL, from input vertex attributes to output
+                        fragment data.</para>
+                </listitem>
+                <listitem>
+                    <para>The absolute bare-minimum vertex buffer code.</para>
+                </listitem>
+                <listitem>
+                    <para>The absolute bare-minimum vertex shader code.</para>
+                </listitem>
+                <listitem>
+                    <para>The absolute bare-minimum fragment shader code.</para>
+                </listitem>
+            </itemizedlist>
+        </section>
+        <section>
+            <title>Playing with Colors</title>
+            <para>This tutorial puts colors on our triangle.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>gl_FragCoord and its values.</para>
+                </listitem>
+                <listitem>
+                    <para>Vertex arrays/streams. A discussion of how vertex data gets passed
+                        around.</para>
+                </listitem>
+                <listitem>
+                    <para>Buffer objects. The containers for vertex data.</para>
+                </listitem>
+                <listitem>
+                    <para>Multiple vertex attributes. Matching vertex attributes between the vertex
+                        shader and the vertex array. Passing colors to OpenGL.</para>
+                </listitem>
+                <listitem>
+                    <para>Interleaving vertex arrays. The colors should be interleaved with the
+                        positions.</para>
+                </listitem>
+                <listitem>
+                    <para>Inputs and outputs between GLSL stages.</para>
+                </listitem>
+                <listitem>
+                    <para>Interpolation of the stage inputs/outputs.</para>
+                </listitem>
+            </itemizedlist>
+            <para>Tutorial sub-files:</para>
+            <orderedlist>
+                <listitem>
+                    <para>Use gl_FragCoord to calculate fragment colors based on the position of the
+                        fragments. Use the moving triangle as a base.</para>
+                </listitem>
+                <listitem>
+                    <para>Use multiple vertex arrays to send position and color data. Use a vertex
+                        shader output/fragment shader input to pass the per-vertex colors
+                        through.</para>
+                </listitem>
+            </orderedlist>
+        </section>
     </section>
     <section>
-        <title>Playing with Colors</title>
-        <para>This tutorial puts colors on our triangle.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>gl_FragCoord and its values.</para>
-            </listitem>
-            <listitem>
-                <para>Vertex arrays/streams. A discussion of how vertex data gets passed
-                    around.</para>
-            </listitem>
-            <listitem>
-                <para>Buffer objects. The containers for vertex data.</para>
-            </listitem>
-            <listitem>
-                <para>Multiple vertex attributes. Matching vertex attributes between the vertex
-                    shader and the vertex array. Passing colors to OpenGL.</para>
-            </listitem>
-            <listitem>
-                <para>Interleaving vertex arrays. The colors should be interleaved with the
-                    positions.</para>
-            </listitem>
-            <listitem>
-                <para>Inputs and outputs between GLSL stages.</para>
-            </listitem>
-            <listitem>
-                <para>Interpolation of the stage inputs/outputs.</para>
-            </listitem>
-        </itemizedlist>
-        <para>Tutorial sub-files:</para>
-        <orderedlist>
-            <listitem>
-                <para>Use gl_FragCoord to calculate fragment colors based on the position of the
-                    fragments. Use the moving triangle as a base.</para>
-            </listitem>
-            <listitem>
-                <para>Use multiple vertex arrays to send position and color data. Use a vertex
-                    shader output/fragment shader input to pass the per-vertex colors
-                    through.</para>
-            </listitem>
-        </orderedlist>
+        <title>Vertices and Positioning</title>
+        <section>
+            <title>OpenGL's Moving Triangle</title>
+            <para>This tutorial has a triangle moving around on the screen.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>OpenGL Objects. They hold state, and you bind them to change state and to
+                        render.</para>
+                </listitem>
+                <listitem>
+                    <para>Uniform variables in the OpenGL Shading Language. How to set them in the
+                        API and how to retrieve them in GLSL code.</para>
+                </listitem>
+                <listitem>
+                    <para>Granularity in GLSL: input vs. uniform vs. constant. How often each
+                        changes.</para>
+                </listitem>
+                <listitem>
+                    <para>Basic arithmetic in GLSL. Vector-on-vector arithmetic.</para>
+                </listitem>
+                <listitem>
+                    <para>The extent of the space output by a vertex shader (clip space).</para>
+                </listitem>
+                <listitem>
+                    <para>Clipping and the Viewport.</para>
+                </listitem>
+                <listitem>
+                    <para>Multi-buffering (SwapBuffers).</para>
+                </listitem>
+            </itemizedlist>
+            <para>Tutorial sub-files:</para>
+            <orderedlist>
+                <listitem>
+                    <para>Use BufferSubData to update the triangle's position manually.</para>
+                </listitem>
+                <listitem>
+                    <para>Use uniforms and a vertex shader to move the triangle's position. The
+                        position offset comes directly from the user.</para>
+                </listitem>
+                <listitem>
+                    <para>Use a vertex shader that generates the position offset based solely on a
+                        time from start.</para>
+                </listitem>
+                <listitem>
+                    <para>Use the time value from the last example in the fragment shader to do some
+                        color interpolation.</para>
+                </listitem>
+            </orderedlist>
+        </section>
+        <section>
+            <title>Objects at Rest</title>
+            <para>This tutorial shows a scene of objects, along with a recognizable ground plane.
+                This should all be rendered with a perspective projection.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Matrices and matrix math. A basic overview of matrix mathematics.</para>
+                </listitem>
+                <listitem>
+                    <para>Perspective projection. The math for making objects look like they're in a
+                        3D world.</para>
+                </listitem>
+                <listitem>
+                    <para>Matrices in GLSL, and vector/matrix operations thereupon.</para>
+                </listitem>
+                <listitem>
+                    <para>World to clip transform. How to convert from objects in world-space to
+                        clip-space.</para>
+                </listitem>
+                <listitem>
+                    <para>Perspective-correct interpolation.</para>
+                </listitem>
+                <listitem>
+                    <para>VAOs, multiple. Use these as an example of OpenGL objects storing
+                        state.</para>
+                </listitem>
+                <listitem>
+                    <para>Depth buffers. How depth buffers work to hide surfaces.</para>
+                </listitem>
+            </itemizedlist>
+        </section>
+        <section>
+            <title>Objects in Motion</title>
+            <para>This tutorial shows a scene with objects moving in their own coordinate system.
+                This will include using the same mesh in different locations.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Object-local coordinates. Each object can have its own natural coordinate
+                        space. Multiple instances of objects rendered using the same mesh.</para>
+                </listitem>
+                <listitem>
+                    <para>Object-to-world transform. How to compute the transformation from
+                        object-space to world-space.</para>
+                </listitem>
+                <listitem>
+                    <para>Uniform Buffer Objects. How to have per-instance data and change the
+                        instance data with a single setting, rather than multiple settings. Use
+                        std140 layout.</para>
+                </listitem>
+            </itemizedlist>
+        </section>
+        <section>
+            <title>World in Motion</title>
+            <para>This tutorial has an animated camera moving through a scene containing moving
+                objects and a recognizable floor plane.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Camera-space, as distinct from world and object-local. How to compute
+                        camera-space, and build a sequence of transformations from object to clip
+                        space.</para>
+                </listitem>
+                <listitem>
+                    <para>UBOs for shared uniform data (common matrices).</para>
+                </listitem>
+            </itemizedlist>
+        </section>
     </section>
     <section>
-        <title>OpenGL's Moving Triangle</title>
-        <para>This tutorial has a triangle moving around on the screen.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>OpenGL Objects. They hold state, and you bind them to change state and to
-                    render.</para>
-            </listitem>
-            <listitem>
-                <para>Uniform variables in the OpenGL Shading Language. How to set them in the API
-                    and how to retrieve them in GLSL code.</para>
-            </listitem>
-            <listitem>
-                <para>Granularity in GLSL: input vs. uniform vs. constant. How often each
-                    changes.</para>
-            </listitem>
-            <listitem>
-                <para>Basic arithmetic in GLSL. Vector-on-vector arithmetic.</para>
-            </listitem>
-            <listitem>
-                <para>The extent of the space output by a vertex shader (clip space).</para>
-            </listitem>
-            <listitem>
-                <para>Clipping and the Viewport.</para>
-            </listitem>
-            <listitem>
-                <para>Multi-buffering (SwapBuffers).</para>
-            </listitem>
-        </itemizedlist>
-        <para>Tutorial sub-files:</para>
-        <orderedlist>
-            <listitem>
-                <para>Use BufferSubData to update the triangle's position manually.</para>
-            </listitem>
-            <listitem>
-                <para>Use uniforms and a vertex shader to move the triangle's position. The position
-                    offset comes directly from the user.</para>
-            </listitem>
-            <listitem>
-                <para>Use a vertex shader that generates the position offset based solely on a time
-                    from start.</para>
-            </listitem>
-            <listitem>
-                <para>Use the time value from the last example in the fragment shader to do some
-                    color interpolation.</para>
-            </listitem>
-        </orderedlist>
+        <title>Basic Lighting</title>
+        <section>
+            <title>Lights on</title>
+            <para>This tutorial has a scene with several animated objects and a floor, all lit by a
+                directional and point light, with different colors.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Normals for vertices, and how these interact with faceted models.</para>
+                </listitem>
+                <listitem>
+                    <para>Vertex attribute compression: normalized attributes, and doing
+                        decompression in the shader.</para>
+                </listitem>
+                <listitem>
+                    <para>Lighting models. How to compute diffuse reflectance based on a light
+                        direction and normal. The importance of an ambient lighting term to model
+                        incidental reflectance.</para>
+                </listitem>
+                <listitem>
+                    <para>Directional lights vs. point lights.</para>
+                </listitem>
+                <listitem>
+                    <para>Implementing lighting in a vertex shader for both directional and point
+                        lights. Combining results from </para>
+                </listitem>
+            </itemizedlist>
+        </section>
+        <section>
+            <title>Plane Lights</title>
+            <para>This tutorial has a scene with a ground plane and an animated light moving over
+                it.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Limitations of per-vertex lighting.</para>
+                </listitem>
+                <listitem>
+                    <para>Implementing per-fragment lighting.</para>
+                </listitem>
+                <listitem>
+                    <para>Different transforms for lighting. Use different transforms to optimize
+                        fragment lighting by doing some of the computations in the vertex
+                        shader.</para>
+                </listitem>
+            </itemizedlist>
+        </section>
     </section>
     <section>
-        <title>Objects at Rest</title>
-        <para>This tutorial shows a scene of objects, along with a recognizable ground plane. This
-            should all be rendered with a perspective projection.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Matrices and matrix math. A basic overview of matrix mathematics.</para>
-            </listitem>
-            <listitem>
-                <para>Perspective projection. The math for making objects look like they're in a 3D
-                    world.</para>
-            </listitem>
-            <listitem>
-                <para>Matrices in GLSL, and vector/matrix operations thereupon.</para>
-            </listitem>
-            <listitem>
-                <para>World to clip transform. How to convert from objects in world-space to
-                    clip-space.</para>
-            </listitem>
-            <listitem>
-                <para>VAOs, multiple. Use these as an example of OpenGL objects storing
-                    state.</para>
-            </listitem>
-            <listitem>
-                <para>Depth buffers. How depth buffers work to hide surfaces.</para>
-            </listitem>
-        </itemizedlist>
+        <title>Texturing</title>
+        <section>
+            <title>Texturing the World</title>
+            <para>This tutorial involves putting a texture on a simple, lit object.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Texture objects. An OpenGL object that holds images.</para>
+                </listitem>
+                <listitem>
+                    <para>Normalized texture coordinates. Vertex attributes that are used to apply a
+                        texture to a surface.</para>
+                </listitem>
+                <listitem>
+                    <para>Texture filtering. How OpenGL computes in-between values for fragments when
+                        you sample a texture.</para>
+                </listitem>
+                <listitem>
+                    <para>The GLSL side of texturing. Samplers and texture functions in fragment
+                        shaders.</para>
+                </listitem>
+                <listitem>
+                    <para>Associating textures with programs. Sampler uniforms and texture image
+                        units.</para>
+                </listitem>
+                <listitem>
+                    <para>Combining texture colors with the results of lighting.</para>
+                </listitem>
+            </itemizedlist>
+        </section>
+        <section>
+            <title>More Images is Better</title>
+            <para>This tutorial shows a ground plane with a highly aliased texture. An animated
+                camera shows off the aliasing. Then we apply mipmapping and anisotropic filtering to
+                the surface to improve it.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Texture wrapping. How normalized texture coordinates outside of the [0, 1]
+                        range are interpreted.</para>
+                </listitem>
+                <listitem>
+                    <para>Texture aliasing. Where it comes from, and how to solve it.</para>
+                </listitem>
+                <listitem>
+                    <para>Mipmap generation.</para>
+                </listitem>
+                <listitem>
+                    <para>Mipmap filtering. How it works, and how to set it in OpenGL.</para>
+                </listitem>
+                <listitem>
+                    <para>Anisotropic filtering. How it works and how to set it in OpenGL.</para>
+                </listitem>
+            </itemizedlist>
+        </section>
+        <section>
+            <title>Climbing the Mountain</title>
+            <para>This tutorial uses a height map and adjusts vertex positions and normals to match
+                it. The height map is a texture. No lighting yet.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Internal image formats, particularly 1-channel textures.</para>
+                </listitem>
+                <listitem>
+                    <para>Vertex texture accessing. How it differs from fragment textures
+                        (mipmapping and such).</para>
+                </listitem>
+                <listitem>
+                    <para>Using textures for non-color information. Also, scaling of the data that
+                        comes out of the image.</para>
+                </listitem>
+            </itemizedlist>
+        </section>
+        <section>
+            <title>The Bumpy Mountain</title>
+            <para>Build a height field, but with finer details. This requires multiple textures:
+                a height texture and a more detailed bump map on top of it. Fill in the details with
+                a bump map. Obviously, this will need to have a light in the scene.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Constructing normals from the height field in the vertex shader.</para>
+                </listitem>
+                <listitem>
+                    <para>Offset textures. Constructing normals from the texture map, using offsets
+                        into the detailed bump map.</para>
+                </listitem>
+                <listitem>
+                    <para>Texture space-based lighting. Transforming the light into the space of the
+                        texture to do lighting. This requires binormal and tangent vectors.</para>
+                </listitem>
+            </itemizedlist>
+        </section>
     </section>
     <section>
-        <title>Objects in Motion</title>
-        <para>This tutorial shows a scene with objects moving in their own coordinate system. This
-            will include using the same mesh in different locations.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Object-local coordinates. Each object can have its own natural coordinate
-                    space. Multiple instances of objects rendered using the same mesh.</para>
-            </listitem>
-            <listitem>
-                <para>Object-to-world transform. How to compute the transformation from object-space
-                    to world-space.</para>
-            </listitem>
-            <listitem>
-                <para>Uniform Buffer Objects. How to have per-instance data and change the instance
-                    data with a single setting, rather than multiple settings. Use std140
-                    layout.</para>
-            </listitem>
-        </itemizedlist>
+        <title>Framebuffers</title>
+        <section>
+            <title>Ghostly Visage</title>
+            <para>This tutorial involves a scene with some opaque and some transparent
+                objects.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>What the alpha value of a color means. Specifically, that it means
+                        whatever you want.</para>
+                </listitem>
+                <listitem>
+                    <para>Framebuffer blending. The blend function, how it works, and how to change
+                        it in OpenGL.</para>
+                </listitem>
+                <listitem>
+                    <para>Backface culling. Making sure that the back faces of blended objects don't
+                        get rendered.</para>
+                </listitem>
+                <listitem>
+                    <para>How blending interacts with depth writing and testing. Namely, that you
+                        have to manually sort objects now: turn depth writes off and depth tests
+                        on.</para>
+                </listitem>
+            </itemizedlist>
+        </section>
+        <section>
+            <title>Video Camera</title>
+            <para>This tutorial involves rendering a view of one scene to a texture used in a
+                different location.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Framebuffer objects and renderbuffers. How to render to different
+                        targets.</para>
+                </listitem>
+                <listitem>
+                    <para>Viewport settings.</para>
+                </listitem>
+                <listitem>
+                    <para>Rendering the same scene from multiple camera angles. Managing the data
+                        for doing so.</para>
+                </listitem>
+                <listitem>
+                    <para>Render to texture. Rendering to a texture and then using that texture as a
+                        source for rendering elsewhere.</para>
+                </listitem>
+            </itemizedlist>
+        </section>
+        <section>
+            <title>Selecting the Masses</title>
+            <para>This tutorial creates a number of entities that all move around, on pre-defined
+                paths, over a surface of bumpy terrain. They don't interact with the terrain. We
+                then render projected selection circles onto the ground beneath them.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Projective texturing. How projecting a texture over a surface works
+                        mathematically, and what support there is in the language.</para>
+                </listitem>
+                <listitem>
+                    <para>Multi-pass rendering. Rendering geometry multiple times with a different
+                        program/texture set. Not strictly necessary in this example, but it shows
+                        how to do it if you need to.</para>
+                </listitem>
+            </itemizedlist>
+        </section>
+        <section>
+            <title>The Coming of Shadows</title>
+            <para>This tutorial creates mountainous terrain, and then applies shadow mapping against
+                a directional light. The light should animate to accentuate the effect.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Depth textures. Single-channel textures that take depth data from an
+                        OpenGL rendering. Can be used as direct render targets.</para>
+                </listitem>
+                <listitem>
+                    <para>Texture comparison modes. Changing how the filtering algorithm works so
+                        that texture accesses compare to a given value, rather than simply sampling from
+                        a point. This includes shadow sampler usage.</para>
+                </listitem>
+            </itemizedlist>
+        </section>
     </section>
     <section>
-        <title>World in Motion</title>
-        <para>This tutorial has an animated camera moving through a scene containing moving objects
-            and a recognizable floor plane.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Camera-space, as distinct from world and object-local. How to compute
-                    camera-space, and build a sequence of transformations from object to clip
-                    space.</para>
-            </listitem>
-            <listitem>
-                <para>UBOs for shared uniform data (common matrices).</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>Lights on</title>
-        <para>This tutorial has a scene with several animated objects and a floor, all lit by a
-            directional and point light, with different colors.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Normals for vertices, and how these interact with faceted models.</para>
-            </listitem>
-            <listitem>
-                <para>Vertex attribute compression: normalized attributes, and doing decompression
-                    in the shader.</para>
-            </listitem>
-            <listitem>
-                <para>Lighting models. How to compute diffuse reflectance based on a light direction
-                    and normal. The importance of an ambient lighting term to model incidental
-                    reflectance.</para>
-            </listitem>
-            <listitem>
-                <para>Directional lights vs. point lights.</para>
-            </listitem>
-            <listitem>
-                <para>Implementing lighting in a vertex shader for both directional and point
-                    lights. Combining results from </para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>Plane Lights</title>
-        <para>This tutorial has a scene with a ground plane and an animated light moving over
-            it.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Limitations of per-vertex lighting.</para>
-            </listitem>
-            <listitem>
-                <para>Implementing per-fragment lighting.</para>
-            </listitem>
-            <listitem>
-                <para>Different transforms for lighting. Use different transforms to optimize
-                    fragment lighting by doing some of the computations in the vertex shader.</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>Texturing the World</title>
-        <para>This tutorial involves putting a texture on a simple, lit object.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Texture objects. An OpenGL object that holds images.</para>
-            </listitem>
-            <listitem>
-                <para>Normalized texture coordinates. Vertex attributes that are used to apply a
-                    texture to a surface.</para>
-            </listitem>
-            <listitem>
-                <para>Texture filtering. How OpenGL computes inbetween values for fragments when you
-                    sample a texture.</para>
-            </listitem>
-            <listitem>
-                <para>The GLSL side of texturing. Samplers and texture functions in fragment
-                    shaders.</para>
-            </listitem>
-            <listitem>
-                <para>Associating textures with programs. Sampler uniforms and texture image
-                    units.</para>
-            </listitem>
-            <listitem>
-                <para>Combining texture colors with the results of lighting.</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>More Images is Better</title>
-        <para>This tutorial shows a ground plane with a highly aliased texture. An animated camera
-            shows off the aliasing. Then we apply mipmapping and anisotropic filtering to the
-            surface to improve it.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Texture wrapping. How normalized texture coordinates outside of the [0, 1]
-                    range are interpreted.</para>
-            </listitem>
-            <listitem>
-                <para>Texture aliasing. Where it comes from, and how to solve it.</para>
-            </listitem>
-            <listitem>
-                <para>Mipmap generation.</para>
-            </listitem>
-            <listitem>
-                <para>Mipmap filtering. How it works, and how to set it in OpenGL.</para>
-            </listitem>
-            <listitem>
-                <para>Anisotropic filtering. How it works and how to set it in OpenGL.</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>Climbing the Mountain</title>
-        <para>This tutorial uses a height map and adjust vertex positions and normals to match it.
-            The height map is a texture. No lighting yet.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Internal image formats, particularly 1-channel textures.</para>
-            </listitem>
-            <listitem>
-                <para>Vertex texture accessing. How it differs from fragment textures (mipmapping
-                    and such).</para>
-            </listitem>
-            <listitem>
-                <para>Using textures for non-color information. Also, scaling of the data that comes
-                    out of the image.</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>The Bumpy Mountain</title>
-        <para>Build a height field, but with more fine details. This requires multiple textures: a
-            height texture and a more detailed bump map on top of it. Fill in the details with a
-            bump map. Obviously, this will need to have a light in the scene.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Constructing normals from the height field in the vertex shader.</para>
-            </listitem>
-            <listitem>
-                <para>Offset textures. Constructing normals from the texture map, using offsets into
-                    the detailed bump map.</para>
-            </listitem>
-            <listitem>
-                <para>Texture space-based lighting. Transforming the light into the space of the
-                    texture to do lighting. This requires binormal and tangent vectors.</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>Ghostly Visage</title>
-        <para>This tutorial involves a scene with some opaque and some transparent objects.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>What the alpha value of a color means. Specifically, that it means whatever
-                    you want.</para>
-            </listitem>
-            <listitem>
-                <para>Framebuffer blending. The blend function, how it works, and how to change it
-                    in OpenGL.</para>
-            </listitem>
-            <listitem>
-                <para>Backface culling. Making sure that the back faces of blended objects don't get
-                    rendered.</para>
-            </listitem>
-            <listitem>
-                <para>How blending interacts with depth writing and testing. Namely, that you have
-                    to manually sort objects now: turn depth writes off and depth tests on.</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>Video Camera</title>
-        <para>This tutorial involves rendering a view of one scene to a texture used in a different
-            location.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Framebuffer objects and renderbuffers. How to render to different
-                    targets.</para>
-            </listitem>
-            <listitem>
-                <para>Viewport settings.</para>
-            </listitem>
-            <listitem>
-                <para>Rendering the same scene from multiple camera angles. Managing the data for
-                    doing so.</para>
-            </listitem>
-            <listitem>
-                <para>Render to texture. Rendering to a texture and then using that texture as a
-                    source for rendering elsewhere.</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>Selecting the Masses</title>
-        <para>This tutorial creates a number of entities that all move around, on pre-defined paths,
-            over a surface of bumpy terrain. They don't interact with the terrain. We then render
-            projected selection circles onto the ground beneath them.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Projective texturing. How projecting a texture over a surface works
-                    mathematically, and what support there is in the language.</para>
-            </listitem>
-            <listitem>
-                <para>Multi-pass rendering. Rendering geometry multiple times with a different
-                    program/texture set. Not strictly necessary in this example, but it shows how to
-                    do it if you need to.</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>The Coming of Shadows</title>
-        <para>This tutorial creates mountainous terrain, and then applies shadow mapping against a
-            directional light. The light should animate to accentuate the effect.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Depth textures. Single-channel textures that take depth data from an OpenGL
-                    rendering. Can be used as direct render targets.</para>
-            </listitem>
-            <listitem>
-                <para>Texture comparison modes. Changing how the filtering algorithm works so that
-                    texture access compare to a given value, rather than simply sample from a point.
-                    This includes shadow sampler usage.</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>Of Metal and Plastic</title>
-        <para>This tutorial involves creating a single mesh that has multiple lighting models: one
-            reflective and one very diffuse. There should be an animated light or two that shows
-            this off.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>BRDFs (bidirectional reflectance distribution functions): Lighting models
-                    that are a function of surface normal, angle to the light, and angle to the
-                    camera.</para>
-            </listitem>
-            <listitem>
-                <para>The Phong specular lighting model.</para>
-            </listitem>
-            <listitem>
-                <para>Using a texture's value to control the strength of the Phong curve. Introduce
-                    floating-point textures here.</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>Dynamic Lighting</title>
-        <para>This tutorial takes a scene with directional lighting and shadows, with specular
-            lighting on some of the objects (and an identifiable ground), and applies basic HDR to
-            it.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Non-clamped color spaces. Noting that the [0, 1] range is an approximation,
-                    and that light darkening is completely wrong.</para>
-            </listitem>
-            <listitem>
-                <para>16-bit floating-point values. Useful for blending, while taking up far less
-                    memory and bandwidth than 32-bit floats.</para>
-            </listitem>
-            <listitem>
-                <para>Floating-point render targets. Don't forget the hardware limitations.</para>
-            </listitem>
-            <listitem>
-                <para>HDR techniques. How to reduce a floating-point texture to a [0, 1] range
-                    color.</para>
-            </listitem>
-            <listitem>
-                <para>Why one should use HDR.</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>Blooming</title>
-        <para>This tutorial takes the previous scene and adds blooming of high-powered
-            lights.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Blooming. A multi-pass algorithm of operations over the same texture. All done
-                    before reduction to the integer colorspace.</para>
-            </listitem>
-            <listitem>
-                <para>Introduce R11_G11_B10 as an optimization.</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>Mirror Mirror</title>
-        <para>This tutorial has a skybox world with a shiny object and a light source in it. The
-            shiny object should reflect the world and have proper specular highlights from the
-            light source. It should still use HDR, but blooming is not required.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Skybox: Displays a static world around the object.</para>
-            </listitem>
-            <listitem>
-                <para>Cubemaps. Used to get the reflected color, as well as render the
-                    skybox.</para>
-            </listitem>
-            <listitem>
-                <para>RGB9_E5 Texture format. The skybox should use this format. Compact
-                    floating-point format with good precision and large range of values.</para>
-            </listitem>
-            <listitem>
-                <para>Properly combining lighting models. Adding lights from different sources to
-                    achieve the final result.</para>
-            </listitem>
-        </itemizedlist>
-    </section>
-    <section>
-        <title>Dark Shadows</title>
-        <para>This tutorial has an animated point-light source and a world of objects, some
-            animated. The point light should cast a shadow via cube-based shadow mapping.</para>
-        <para>Concepts:</para>
-        <itemizedlist>
-            <listitem>
-                <para>Depth-formatted cube maps. Using cubemaps as depth comparison textures.</para>
-            </listitem>
-            <listitem>
-                <para>Render to Cubemap, using 6 render targets (but one depth renderbuffer).</para>
-            </listitem>
-            <listitem>
-                <para>Geometry shaders and layered rendering. These act as optimizations
-                    (theoretically, at least).</para>
-            </listitem>
-        </itemizedlist>
+        <title>Advanced Lighting</title>
+        <section>
+            <title>Of Metal and Plastic</title>
+            <para>This tutorial involves creating a single mesh that has multiple lighting models:
+                one reflective and one very diffuse. There should be an animated light or two that
+                shows this off.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>BRDFs (bidirectional reflectance distribution functions): Lighting
+                        models that are a function of surface normal, angle to the light, and
+                        angle to the camera.</para>
+                </listitem>
+                <listitem>
+                    <para>The Phong specular lighting model.</para>
+                </listitem>
+                <listitem>
+                    <para>Using a texture's value to control the strength of the Phong curve.
+                        Introduce floating-point textures here.</para>
+                </listitem>
+            </itemizedlist>
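+            <para>As a very rough illustration of the Phong specular model listed above, the
+                specular term in a fragment shader might look something like the following. This
+                is only a sketch; the names (<varname>dirToLight</varname>,
+                    <varname>dirToCamera</varname>, <varname>shininess</varname>) are placeholders
+                rather than code from the tutorial.</para>
+            <programlisting><![CDATA[//Hypothetical Phong specular term; all names are placeholders.
+uniform vec3 dirToLight;    //Surface-to-light direction, normalized.
+uniform vec3 dirToCamera;   //Surface-to-camera direction, normalized.
+uniform float shininess;    //Phong exponent; larger values mean tighter highlights.
+
+vec3 PhongSpecular(vec3 surfaceNormal, vec3 specularColor)
+{
+    vec3 reflectDir = reflect(-dirToLight, surfaceNormal);
+    float phongTerm = clamp(dot(dirToCamera, reflectDir), 0.0, 1.0);
+    return specularColor * pow(phongTerm, shininess);
+}]]></programlisting>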
+        </section>
+        <section>
+            <title>Dynamic Lighting</title>
+            <para>This tutorial takes a scene with directional lighting and shadows, with specular
+                lighting on some of the objects (and an identifiable ground), and applies basic HDR
+                to it.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Non-clamped color spaces. Noting that the [0, 1] range is an
+                        approximation, and that light darkening is completely wrong.</para>
+                </listitem>
+                <listitem>
+                    <para>16-bit floating-point values. Useful for blending, while taking up far
+                        less memory and bandwidth than 32-bit floats.</para>
+                </listitem>
+                <listitem>
+                    <para>Floating-point render targets. Don't forget the hardware
+                        limitations.</para>
+                </listitem>
+                <listitem>
+                    <para>HDR techniques. How to reduce a floating-point texture to a [0, 1] range
+                        color.</para>
+                </listitem>
+                <listitem>
+                    <para>Why one should use HDR.</para>
+                </listitem>
+            </itemizedlist>
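+            <para>To make the last two points more concrete, the pass that reduces the
+                floating-point scene texture to the [0, 1] range could be a fragment shader along
+                these lines. This is only a sketch using a simple Reinhard operator, not
+                necessarily the technique the tutorial will use; <varname>hdrTexture</varname> and
+                    <varname>texCoord</varname> are placeholder names.</para>
+            <programlisting><![CDATA[#version 330
+
+//Hypothetical HDR resolve pass: compress floating-point color into [0, 1].
+uniform sampler2D hdrTexture;   //Floating-point render target from the scene pass.
+
+in vec2 texCoord;
+out vec4 outputColor;
+
+void main()
+{
+    vec3 hdrColor = texture(hdrTexture, texCoord).rgb;
+    vec3 mapped = hdrColor / (hdrColor + vec3(1.0));    //Simple Reinhard operator.
+    outputColor = vec4(mapped, 1.0);
+}]]></programlisting>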
+        </section>
+        <section>
+            <title>Blooming</title>
+            <para>This tutorial takes the previous scene and adds blooming of high-powered
+                lights.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Blooming. A multi-pass algorithm of operations over the same texture. All
+                        done before reduction to the integer colorspace.</para>
+                </listitem>
+                <listitem>
+                    <para>Introduce R11_G11_B10 as an optimization.</para>
+                </listitem>
+            </itemizedlist>
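+            <para>One pass of that multi-pass algorithm might be a separable blur over the bright
+                parts of the floating-point scene texture, applied before tone mapping. The sketch
+                below shows only a horizontal pass; the tap count, weights, and names
+                    (<varname>brightTexture</varname>, <varname>texelSize</varname>) are
+                illustrative rather than taken from the tutorial.</para>
+            <programlisting><![CDATA[#version 330
+
+//Hypothetical horizontal blur pass for bloom; a vertical pass would follow.
+uniform sampler2D brightTexture;    //Bright-pass output, still floating-point.
+uniform vec2 texelSize;             //1.0 / texture dimensions.
+
+in vec2 texCoord;
+out vec4 outputColor;
+
+const float weights[3] = float[3](0.227027, 0.316216, 0.070270);
+
+void main()
+{
+    vec3 sum = texture(brightTexture, texCoord).rgb * weights[0];
+    for(int i = 1; i < 3; i++)
+    {
+        vec2 offset = vec2(texelSize.x * float(i), 0.0);
+        sum += texture(brightTexture, texCoord + offset).rgb * weights[i];
+        sum += texture(brightTexture, texCoord - offset).rgb * weights[i];
+    }
+    outputColor = vec4(sum, 1.0);
+}]]></programlisting>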
+        </section>
+        <section>
+            <title>Mirror Mirror</title>
+            <para>This tutorial has a skybox world with a shiny object and a light source in it.
+                The shiny object should reflect the world and have proper specular highlights from
+                the light source. It should still use HDR, but blooming is not required.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Skybox: Displays a static world around the object.</para>
+                </listitem>
+                <listitem>
+                    <para>Cubemaps. Used to get the reflected color, as well as render the
+                        skybox.</para>
+                </listitem>
+                <listitem>
+                    <para>RGB9_E5 Texture format. The skybox should use this format. Compact
+                        floating-point format with good precision and large range of values.</para>
+                </listitem>
+                <listitem>
+                    <para>Properly combining lighting models. Adding lights from different sources
+                        to achieve the final result.</para>
+                </listitem>
+            </itemizedlist>
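+            <para>The reflection itself is a small amount of shader code: reflect the view
+                direction about the surface normal and use the result to sample the cube map. The
+                sketch below assumes world-space inputs; all of the names are placeholders rather
+                than tutorial code.</para>
+            <programlisting><![CDATA[#version 330
+
+//Hypothetical cube map reflection lookup, done in world space.
+uniform samplerCube environmentMap;
+uniform vec3 cameraWorldPos;
+
+in vec3 worldNormal;
+in vec3 worldPosition;
+out vec4 outputColor;
+
+void main()
+{
+    vec3 viewDir = normalize(worldPosition - cameraWorldPos);      //Camera to surface.
+    vec3 reflectDir = reflect(viewDir, normalize(worldNormal));    //Mirrored direction.
+    outputColor = texture(environmentMap, reflectDir);
+}]]></programlisting>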
+        </section>
+        <section>
+            <title>Dark Shadows</title>
+            <para>This tutorial has an animated point-light source and a world of objects, some
+                animated. The point light should cast a shadow via cube-based shadow mapping.</para>
+            <para>Concepts:</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Depth-formatted cube maps. Using cubemaps as depth comparison
+                        textures.</para>
+                </listitem>
+                <listitem>
+                    <para>Render to Cubemap, using 6 render targets (but one depth
+                        renderbuffer).</para>
+                </listitem>
+                <listitem>
+                    <para>Geometry shaders and layered rendering. These act as optimizations
+                        (theoretically, at least).</para>
+                </listitem>
+            </itemizedlist>
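+            <para>The geometry shader approach amounts to emitting each triangle once per cube
+                face and letting <varname>gl_Layer</varname> route it to the right face of a
+                layered framebuffer attachment. A minimal sketch, assuming the vertex shader
+                passes world-space positions through <varname>gl_Position</varname> and that
+                    <varname>faceMatrices</varname> (a placeholder name) holds one view-projection
+                matrix per face:</para>
+            <programlisting><![CDATA[#version 330
+
+//Hypothetical geometry shader for single-pass render-to-cubemap.
+layout(triangles) in;
+layout(triangle_strip, max_vertices = 18) out;
+
+uniform mat4 faceMatrices[6];   //One view-projection matrix per cube face.
+
+void main()
+{
+    for(int face = 0; face < 6; face++)
+    {
+        gl_Layer = face;    //Send this copy of the triangle to the given cube face.
+        for(int vert = 0; vert < 3; vert++)
+        {
+            gl_Position = faceMatrices[face] * gl_in[vert].gl_Position;
+            EmitVertex();
+        }
+        EndPrimitive();
+    }
+}]]></programlisting>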
+        </section>
     </section>
     <section>
         <title>Twisty Objects, All Alike</title>