Commits

Jason McKesson committed 6aaec5f

Changed the name of the document.


Files changed (18)

Documents/Building the Tutorials.xml

+<?xml version="1.0" encoding="UTF-8"?>
+<?oxygen RNGSchema="http://docbook.org/xml/5.0/rng/docbookxi.rng" type="xml"?>
+<?oxygen SCHSchema="http://docbook.org/xml/5.0/rng/docbookxi.rng"?>
+<article xmlns="http://docbook.org/ns/docbook" xmlns:xi="http://www.w3.org/2001/XInclude"
+    xmlns:xlink="http://www.w3.org/1999/xlink" version="5.0">
+    <title>Building the Tutorials</title>
+    <para>To build these tutorials, you will need to download the <link
+            xlink:href="http://industriousone.com/premake">Premake 4</link> utility for your
+        platform of choice. Currently, the tutorials only work on Windows. The tutorial distribution
+        comes with the external dependencies the tutorials need (FreeGLUT and FreeImage).</para>
+    <para>Premake is a utility like <link xlink:href="http://www.cmake.org/">CMake</link>: it
+        generates build files for a specific platform. Unlike CMake, Premake is strictly a
+        command-line utility. Premake's build scripts are written in the <link
+            xlink:href="http://www.lua.org/home.html">Lua language</link>, unlike CMake's build
+        scripts that use their own language.</para>
+    <para>Note that Premake only generates build files; once the build files are created, you can
+        use them as normal. It can generate project files for Visual Studio, <link
+            xlink:href="http://www.codeblocks.org/">Code::Blocks</link>, and Xcode, as well as GNU
+        Makefiles. Unless you want to modify one of the tutorials, you only need to run Premake
+        once for each tutorial.</para>
+    <para/>
+</article>
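For reference, a per-tutorial Premake build script of the kind described above is just a short Lua file. The following is only a hypothetical sketch — the project name, source file, and library names are illustrative, not taken from the actual tutorial scripts:

```lua
-- Hypothetical premake4.lua sketch, not an actual tutorial script.
solution "Tutorial1"
    configurations { "Debug", "Release" }

project "Tutorial1"
    kind "ConsoleApp"
    language "C++"
    files { "tut1.cpp" }               -- this tutorial's source
    links { "freeglut", "FreeImage" }  -- the bundled dependencies
```

Running `premake4 vs2008`, `premake4 gmake`, `premake4 codeblocks`, and so on in the script's directory generates the corresponding project files.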

Documents/Outline.xml

+<?xml version="1.0" encoding="UTF-8"?>
+<?oxygen RNGSchema="http://docbook.org/xml/5.0/rng/docbookxi.rng" type="xml"?>
+<?oxygen SCHSchema="http://docbook.org/xml/5.0/rng/docbookxi.rng"?>
+<article xmlns="http://docbook.org/ns/docbook" xmlns:xi="http://www.w3.org/2001/XInclude"
+    version="5.0">
+    <title>Outline</title>
+    <para>This outline describes the relationships between the various tutorials, as well as the
+        expected order in which they appear.</para>
+    <section>
+        <title>Hello, Triangle!</title>
+        <para>The most basic of GL programs, this will draw a single solid triangle over a blank
+            background.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>That 3D graphics is made of triangles.</para>
+            </listitem>
+            <listitem>
+                <para>The process of scan conversion.</para>
+            </listitem>
+            <listitem>
+                <para>The data pathway of OpenGL, from input vertex attributes to output fragment
+                    data.</para>
+            </listitem>
+            <listitem>
+                <para>The absolute bare-minimum vertex buffer code.</para>
+            </listitem>
+            <listitem>
+                <para>The absolute bare-minimum vertex shader code.</para>
+            </listitem>
+            <listitem>
+                <para>The absolute bare-minimum fragment shader code.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
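The scan conversion step in the list above can be illustrated entirely outside of OpenGL. The following Python sketch (illustrative only; the tutorials themselves are not written in Python, and real rasterizers follow stricter coverage rules) tests each pixel center against the triangle's three edge functions:

```python
def edge(ax, ay, bx, by, px, py):
    # Signed area term: positive when P lies to the left of edge A->B.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(tri, width, height):
    """Return the set of (x, y) pixels whose centers fall inside tri."""
    (x0, y0), (x1, y1), (x2, y2) = tri
    covered = set()
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5  # sample at the pixel center
            w0 = edge(x0, y0, x1, y1, px, py)
            w1 = edge(x1, y1, x2, y2, px, py)
            w2 = edge(x2, y2, x0, y0, px, py)
            # Inside if the center is on the same side of all three edges.
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                covered.add((x, y))
    return covered
```

The set of covered pixels is exactly what scan conversion produces: the fragments that the rest of the pipeline will shade.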
+    <section>
+        <title>OpenGL's Moving Triangle</title>
+        <para>This tutorial has a triangle moving around on the screen.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>OpenGL Objects. They hold state, and you bind them to change state and to
+                    render.</para>
+            </listitem>
+            <listitem>
+                <para>Uniform variables in the OpenGL Shading Language. How to set them in the API
+                    and how to retrieve them in GLSL code.</para>
+            </listitem>
+            <listitem>
+                <para>Granularity in GLSL: input vs. uniform vs. constant. How often each
+                    changes.</para>
+            </listitem>
+            <listitem>
+                <para>Basic arithmetic in GLSL. Vector-on-vector arithmetic.</para>
+            </listitem>
+            <listitem>
+                <para>The extent of the space output by a vertex shader (clip space).</para>
+            </listitem>
+            <listitem>
+                <para>Clipping and the Viewport.</para>
+            </listitem>
+            <listitem>
+                <para>Multi-buffering (SwapBuffers).</para>
+            </listitem>
+        </itemizedlist>
+        <para>Tutorial sub-files:</para>
+        <orderedlist>
+            <listitem>
+                <para>Use BufferSubData to update the triangle's position manually.</para>
+            </listitem>
+            <listitem>
+                <para>Use uniforms and a vertex shader to move the triangle's position. The position
+                    offset comes directly from the user.</para>
+            </listitem>
+            <listitem>
+                <para>Use a vertex shader that generates the position offset based solely on a time
+                    from start.</para>
+            </listitem>
+            <listitem>
+                <para>Use the time value from the last example in the fragment shader to do some
+                    color interpolation.</para>
+            </listitem>
+        </orderedlist>
+    </section>
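Sub-file 3's idea of deriving the offset purely from elapsed time might look like the following Python sketch of the math (the loop duration and radius are made-up values for illustration):

```python
import math

def offset_at(elapsed, loop_duration=5.0):
    """Compute an (x, y) offset that circles the origin once per loop,
    derived solely from the time elapsed since the program started."""
    scale = math.pi * 2.0 / loop_duration
    t = elapsed % loop_duration        # wrap time into one loop
    return (math.cos(t * scale) * 0.5, math.sin(t * scale) * 0.5)
```

In the tutorial itself this computation would live in the vertex shader, fed only by a time-from-start value.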
+    <section>
+        <title>Playing with Colors</title>
+        <para>This tutorial puts colors on our triangle.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>gl_FragCoord and its values.</para>
+            </listitem>
+            <listitem>
+                <para>Vertex arrays/streams. A discussion of how vertex data gets passed
+                    around.</para>
+            </listitem>
+            <listitem>
+                <para>Buffer objects. The containers for vertex data.</para>
+            </listitem>
+            <listitem>
+                <para>Multiple vertex attributes. Matching vertex attributes between the vertex
+                    shader and the vertex array. Passing colors to OpenGL.</para>
+            </listitem>
+            <listitem>
+                <para>Interleaving vertex arrays. The colors should be interleaved with the
+                    positions.</para>
+            </listitem>
+            <listitem>
+                <para>Inputs and outputs between GLSL stages.</para>
+            </listitem>
+            <listitem>
+                <para>Interpolation of the stage inputs/outputs.</para>
+            </listitem>
+        </itemizedlist>
+        <para>Tutorial sub-files:</para>
+        <orderedlist>
+            <listitem>
+                <para>Use gl_FragCoord to calculate fragment colors based on the position of the
+                    fragments. Use the moving triangle as a base.</para>
+            </listitem>
+            <listitem>
+                <para>Use multiple vertex arrays to send position and color data. Use a vertex
+                    shader output/fragment shader input to pass the per-vertex colors
+                    through.</para>
+            </listitem>
+        </orderedlist>
+    </section>
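The interleaving concept above is a memory-layout decision. As a sketch of that layout in plain Python via the standard struct module (purely illustrative; the tutorials' actual code is C++):

```python
import struct

# Three vertices: position (x, y, z) followed immediately by color (r, g, b, a).
vertices = [
    ((0.0,  0.5, 0.0), (1.0, 0.0, 0.0, 1.0)),
    ((0.5, -0.5, 0.0), (0.0, 1.0, 0.0, 1.0)),
    ((-0.5, -0.5, 0.0), (0.0, 0.0, 1.0, 1.0)),
]

def interleave(verts):
    """Pack position+color pairs into one tightly packed float buffer."""
    buf = bytearray()
    for pos, color in verts:
        buf += struct.pack("3f4f", *pos, *color)
    return bytes(buf)

data = interleave(vertices)
stride = struct.calcsize("3f4f")      # bytes from one vertex to the next: 28
color_offset = struct.calcsize("3f")  # color begins 12 bytes into a vertex
```

The stride and offset are exactly what the vertex-attribute setup would pass to OpenGL when pointing both attributes at the same buffer.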
+    <section>
+        <title>Objects at Rest</title>
+        <para>This tutorial shows a scene of objects, along with a recognizable ground plane. This
+            should all be rendered with a perspective projection.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Matrices and matrix math. A basic overview of matrix mathematics.</para>
+            </listitem>
+            <listitem>
+                <para>Perspective projection. The math for making objects look like they're in a 3D
+                    world.</para>
+            </listitem>
+            <listitem>
+                <para>Matrices in GLSL, and vector/matrix operations thereupon.</para>
+            </listitem>
+            <listitem>
+                <para>World to clip transform. How to convert from objects in world-space to
+                    clip-space.</para>
+            </listitem>
+            <listitem>
+                <para>VAOs, multiple. Use these as an example of OpenGL objects storing
+                    state.</para>
+            </listitem>
+            <listitem>
+                <para>Depth buffers. How depth buffers work to hide surfaces.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
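The perspective projection and world-to-clip concepts above reduce to a matrix multiply followed by a divide. A Python sketch of the math (simplified to a single frustum scale rather than a full field-of-view computation; an illustration of the shape of the matrix, not the tutorials' code):

```python
def perspective(fov_scale, z_near, z_far):
    """A camera-to-clip matrix (row-major, column-vector convention)."""
    a = (z_far + z_near) / (z_near - z_far)
    b = (2.0 * z_far * z_near) / (z_near - z_far)
    return [
        [fov_scale, 0.0,       0.0,  0.0],
        [0.0,       fov_scale, 0.0,  0.0],
        [0.0,       0.0,       a,    b],
        [0.0,       0.0,      -1.0,  0.0],
    ]

def transform(m, v):
    """Multiply a 4x4 matrix by a 4-component column vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

clip = transform(perspective(1.0, 1.0, 3.0), [0.5, 0.5, -2.0, 1.0])
ndc = [c / clip[3] for c in clip[:3]]   # perspective divide
```

Dividing by the clip-space W is what produces the foreshortening: points farther down -Z end up with a larger W, and thus smaller normalized coordinates.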
+    <section>
+        <title>Objects in Motion</title>
+        <para>This tutorial shows a scene with objects moving in their own coordinate system. This
+            will include using the same mesh in different locations.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Object-local coordinates. Each object can have its own natural coordinate
+                    space. Multiple instances of objects rendered using the same mesh.</para>
+            </listitem>
+            <listitem>
+                <para>Object-to-world transform. How to compute the transformation from object-space
+                    to world-space.</para>
+            </listitem>
+            <listitem>
+                <para>Uniform Buffer Objects. How to have per-instance data and change the instance
+                    data with a single setting, rather than multiple settings. Use std140
+                    layout.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>World in Motion</title>
+        <para>This tutorial has an animated camera moving through a scene containing moving objects
+            and a recognizable floor plane.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Camera-space, as distinct from world and object-local. How to compute
+                    camera-space, and build a sequence of transformations from object to clip
+                    space.</para>
+            </listitem>
+            <listitem>
+                <para>UBOs for shared uniform data (common matrices).</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>Lights on</title>
+        <para>This tutorial has a scene with several animated objects and a floor, all lit by a
+            directional and point light, with different colors.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Normals for vertices, and how these interact with faceted models.</para>
+            </listitem>
+            <listitem>
+                <para>Vertex attribute compression: normalized attributes, and doing decompression
+                    in the shader.</para>
+            </listitem>
+            <listitem>
+                <para>Lighting models. How to compute diffuse reflectance based on a light direction
+                    and normal. The importance of an ambient lighting term to model incidental
+                    reflectance.</para>
+            </listitem>
+            <listitem>
+                <para>Directional lights vs. point lights.</para>
+            </listitem>
+            <listitem>
+                <para>Implementing lighting in a vertex shader for both directional and point
+                    lights. Combining the results from multiple lights.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
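The diffuse-plus-ambient model named above comes down to a dot product and a clamp. A Python sketch of the per-vertex computation (illustrative values; the tutorials would implement this in GLSL):

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def diffuse(normal, dir_to_light, light_intensity, ambient):
    """Per-vertex diffuse term: clamp(N . L, 0, 1) * light + ambient."""
    cos_incidence = dot(normalize(normal), normalize(dir_to_light))
    cos_incidence = max(0.0, min(1.0, cos_incidence))
    return [cos_incidence * l + a for l, a in zip(light_intensity, ambient)]
```

The clamp is what keeps surfaces facing away from the light from being "negatively lit"; the ambient term models the incidental reflectance mentioned above.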
+    <section>
+        <title>Plane Lights</title>
+        <para>This tutorial has a scene with a ground plane and an animated light moving over
+            it.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Limitations of per-vertex lighting.</para>
+            </listitem>
+            <listitem>
+                <para>Implementing per-fragment lighting.</para>
+            </listitem>
+            <listitem>
+                <para>Different transforms for lighting. Use different transforms to optimize
+                    fragment lighting by doing some of the computations in the vertex shader.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>Texturing the World</title>
+        <para>This tutorial involves putting a texture on a simple, lit object.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Texture objects. An OpenGL object that holds images.</para>
+            </listitem>
+            <listitem>
+                <para>Normalized texture coordinates. Vertex attributes that are used to apply a
+                    texture to a surface.</para>
+            </listitem>
+            <listitem>
+                <para>Texture filtering. How OpenGL computes in-between values for fragments when you
+                    sample a texture.</para>
+            </listitem>
+            <listitem>
+                <para>The GLSL side of texturing. Samplers and texture functions in fragment
+                    shaders.</para>
+            </listitem>
+            <listitem>
+                <para>Associating textures with programs. Sampler uniforms and texture image
+                    units.</para>
+            </listitem>
+            <listitem>
+                <para>Combining texture colors with the results of lighting.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
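Linear texture filtering, as listed above, is weighted averaging of the nearest texels. A simplified Python sketch (grayscale texture, texel-space coordinates, no wrap modes; OpenGL's actual texel addressing is more involved):

```python
def bilinear(texels, u, v):
    """Sample a tiny 2D grayscale texture with linear filtering.
    texels is a list of rows; (u, v) are texel-space coordinates where
    integer values land exactly on texel centers (a simplification)."""
    h, w = len(texels), len(texels[0])
    x0, y0 = int(u), int(v)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = u - x0, v - y0
    top = texels[y0][x0] * (1 - fx) + texels[y0][x1] * fx
    bottom = texels[y1][x0] * (1 - fx) + texels[y1][x1] * fx
    return top * (1 - fy) + bottom * fy
```

Sampling halfway between texels yields the average of their values, which is exactly the smoothing GL_LINEAR provides.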
+    <section>
+        <title>More Images is Better</title>
+        <para>This tutorial shows a ground plane with a highly aliased texture. An animated camera
+            shows off the aliasing. Then we apply mipmapping and anisotropic filtering to the
+            surface to improve it.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Texture wrapping. How normalized texture coordinates outside of the [0, 1]
+                    range are interpreted.</para>
+            </listitem>
+            <listitem>
+                <para>Texture aliasing. Where it comes from, and how to solve it.</para>
+            </listitem>
+            <listitem>
+                <para>Mipmap generation.</para>
+            </listitem>
+            <listitem>
+                <para>Mipmap filtering. How it works, and how to set it in OpenGL.</para>
+            </listitem>
+            <listitem>
+                <para>Anisotropic filtering. How it works and how to set it in OpenGL.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>Climbing the Mountain</title>
+        <para>This tutorial uses a height map and adjusts vertex positions and normals to match it.
+            The height map is a texture. No lighting yet.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Internal image formats, particularly 1-channel textures.</para>
+            </listitem>
+            <listitem>
+                <para>Vertex texture accessing. How it differs from fragment textures (mipmapping
+                    and such).</para>
+            </listitem>
+            <listitem>
+                <para>Using textures for non-color information. Also, scaling of the data that comes
+                    out of the image.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>The Bumpy Mountain</title>
+        <para>Build a height field, but with finer details. This requires multiple textures: a
+            height texture and a more detailed bump map on top of it. Fill in the details with a
+            bump map. Obviously, this will need to have a light in the scene.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Constructing normals from the height field in the vertex shader.</para>
+            </listitem>
+            <listitem>
+                <para>Offset textures. Constructing normals from the texture map, using offsets into
+                    the detailed bump map.</para>
+            </listitem>
+            <listitem>
+                <para>Texture space-based lighting. Transforming the light into the space of the
+                    texture to do lighting. This requires binormal and tangent vectors.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>Ghostly Visage</title>
+        <para>This tutorial involves a scene with some opaque and some transparent objects.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>What the alpha value of a color means. Specifically, that it means whatever
+                    you want.</para>
+            </listitem>
+            <listitem>
+                <para>Framebuffer blending. The blend function, how it works, and how to change it
+                    in OpenGL.</para>
+            </listitem>
+            <listitem>
+                <para>Backface culling. Making sure that the back faces of blended objects don't get
+                    rendered.</para>
+            </listitem>
+            <listitem>
+                <para>How blending interacts with depth writing and testing. Namely, that you have
+                    to manually sort objects now: turn depth writes off and depth tests on.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
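The blend function discussed above, in its most common form (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), can be sketched per pixel in Python:

```python
def blend(src, dst):
    """GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA blending for one RGBA pixel:
    result = src * src.alpha + dst * (1 - src.alpha)."""
    sa = src[3]
    return tuple(s * sa + d * (1 - sa) for s, d in zip(src, dst))
```

Because the result depends on what is already in the destination, the order in which blended surfaces are drawn matters, which is why transparent objects must be sorted by hand.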
+    <section>
+        <title>Video Camera</title>
+        <para>This tutorial involves rendering a view of one scene to a texture used in a different
+            location.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Framebuffer objects and renderbuffers. How to render to different
+                    targets.</para>
+            </listitem>
+            <listitem>
+                <para>Viewport settings.</para>
+            </listitem>
+            <listitem>
+                <para>Rendering the same scene from multiple camera angles. Managing the data for
+                    doing so.</para>
+            </listitem>
+            <listitem>
+                <para>Render to texture. Rendering to a texture and then using that texture as a
+                    source for rendering elsewhere.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>Selecting the Masses</title>
+        <para>This tutorial creates a number of entities that all move around, on pre-defined paths,
+            over a surface of bumpy terrain. They don't interact with the terrain. We then render
+            projected selection circles onto the ground beneath them.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Projective texturing. How projecting a texture over a surface works
+                    mathematically, and what support there is in the language.</para>
+            </listitem>
+            <listitem>
+                <para>Multi-pass rendering. Rendering geometry multiple times with a different
+                    program/texture set. Not strictly necessary in this example, but it shows how to
+                    do it if you need to.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>The Coming of Shadows</title>
+        <para>This tutorial creates mountainous terrain, and then applies shadow mapping against a
+            directional light. The light should animate to accentuate the effect.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Depth textures. Single-channel textures that take depth data from an OpenGL
+                    rendering. Can be used as direct render targets.</para>
+            </listitem>
+            <listitem>
+                <para>Texture comparison modes. Changing how the filtering algorithm works so that
+                    texture accesses compare against a given value, rather than simply sampling from
+                    a point. This includes shadow sampler usage.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
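The comparison mode described above boils down to a depth test against the light's recorded depth. A Python sketch of a single sample (the bias value here is an illustrative assumption; hardware shadow samplers additionally filter the comparison results):

```python
def shadow_sample(depth_map, x, y, ref_depth, bias=0.005):
    """One shadow-map comparison, as a shadow sampler would perform it:
    returns 1.0 if the fragment (at ref_depth in the light's space) is
    lit, 0.0 if something nearer the light was recorded in the map."""
    stored = depth_map[y][x]           # nearest depth seen from the light
    return 1.0 if ref_depth - bias <= stored else 0.0
```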
+    <section>
+        <title>Of Metal and Plastic</title>
+        <para>This tutorial involves creating a single mesh that has multiple lighting models: one
+            reflective and one very diffuse. There should be an animated light or two that shows
+            this off.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>BRDFs: Lighting models that are a function of surface normal, angle to the
+                    light, and angle to the camera.</para>
+            </listitem>
+            <listitem>
+                <para>The Phong specular lighting model.</para>
+            </listitem>
+            <listitem>
+                <para>Using a texture's value to control the strength of the Phong curve. Introduce
+                    floating-point textures here.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>Dynamic Lighting</title>
+        <para>This tutorial takes a scene with directional lighting and shadows, with specular
+            lighting on some of the objects (and an identifiable ground), and applies basic HDR to
+            it.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Non-clamped color spaces. Noting that the [0, 1] range is an approximation,
+                    and that light darkening is completely wrong.</para>
+            </listitem>
+            <listitem>
+                <para>16-bit floating-point values. Useful for blending, without taking up as much
+                    room or costing as much performance.</para>
+            </listitem>
+            <listitem>
+                <para>Floating-point render targets. Don't forget the hardware limitations.</para>
+            </listitem>
+            <listitem>
+                <para>HDR techniques. How to reduce a floating-point texture to a [0, 1] range
+                    color.</para>
+            </listitem>
+            <listitem>
+                <para>Why one should use HDR.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>Blooming</title>
+        <para>This tutorial takes the previous scene and adds blooming of high-powered
+            lights.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Blooming. A multi-pass algorithm of operations over the same texture. All done
+                    before reduction to the integer colorspace.</para>
+            </listitem>
+            <listitem>
+                <para>Introduce R11_G11_B10 as an optimization.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>Mirror Mirror</title>
+        <para>This tutorial has a skybox world with a shiny object and light source in it. The shiny
+            object should reflect the world and have proper specular with the light source. It
+            should still use HDR, but blooming is not required.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Skybox: Displays a static world around the object.</para>
+            </listitem>
+            <listitem>
+                <para>Cubemaps. Used to get the reflected color, as well as render the
+                    skybox.</para>
+            </listitem>
+            <listitem>
+                <para>RGB9_E5 Texture format. The skybox should use this format. Compact
+                    floating-point format with good precision and large range of values.</para>
+            </listitem>
+            <listitem>
+                <para>Properly combining lighting models. Adding lights from different sources to
+                    achieve the final result.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>Dark Shadows</title>
+        <para>This tutorial has an animated point-light source and a world of objects, some
+            animated. The point light should cast a shadow via cube-based shadow mapping.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Depth-formatted cube maps. Using cubemaps as depth comparison textures.</para>
+            </listitem>
+            <listitem>
+                <para>Render to Cubemap, using 6 render targets (but one depth renderbuffer).</para>
+            </listitem>
+            <listitem>
+                <para>Geometry shaders and layered rendering. These act as optimizations
+                    (theoretically, at least).</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>Twisty Objects, All Alike</title>
+        <para>This tutorial involves rendering a lot of fairly simple animating objects. They all
+            share the same texture. There should be a basic light in the scene.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Instanced rendering. An optimization for rendering multiple objects of the
+                    same kind.</para>
+            </listitem>
+            <listitem>
+                <para>Buffer textures. Used for getting data up to the card.</para>
+            </listitem>
+            <listitem>
+                <para>Buffer object streaming. Used for transferring the data efficiently.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>Twisty Objects, All Different</title>
+        <para>As above, only the objects have more per-instance data. Different textures and
+            material parameters per-instance.</para>
+        <para>Concepts:</para>
+        <itemizedlist>
+            <listitem>
+                <para>Conditional logic in fragment shaders.</para>
+            </listitem>
+            <listitem>
+                <para>Non-uniform control flow and Grad textures. Used to select between multiple
+                    texture instances.</para>
+            </listitem>
+            <listitem>
+                <para>Array textures. Used as a better means for selecting which texture to
+                    use.</para>
+            </listitem>
+        </itemizedlist>
+    </section>
+    <section>
+        <title>Functionality that needs tutorials</title>
+        <glosslist>
+            <glossentry>
+                <glossterm>Hierarchical spaces.</glossterm>
+                <glossdef>
+                    <para>Matrix stacks and such.</para>
+                    <para>Might be shown off with a robot or something that moves.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>More vertex attributes</glossterm>
+                <glossdef>
+                    <para>Might be shown off with skinning.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>Non-Triangle Primitives</glossterm>
+                <glossdef>
+                    <para>This might also be a good place to show off primitive restart.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>Conditional discard</glossterm>
+                <glossdef>
+                    <para>AKA: Alpha test. Also use pre-multiplied alpha.</para>
+                    <para>This might be shown via rendering a tree or a chain-link fence.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>Pixel transfer</glossterm>
+                <glossdef>
+                    <para>Includes PBOs for asynchronous delivery.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>Compressed image formats</glossterm>
+                <glossdef>
+                    <para>Show these off.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>Transform feedback</glossterm>
+                <glossdef>
+                    <para>This should be justified by having a large vertex shader with a good
+                        quantity of vertices, as well as RTT. The feedback pass is an optimization:
+                        capture the transformed vertices once, then render later passes from the
+                        captured data with special shaders.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>2D in 3D</glossterm>
+                <glossdef>
+                    <para>Ortho projections and depth ranges.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>Near/far clipping</glossterm>
+                <glossdef>
+                    <para>Some kind of example that shows this stuff off and how to fix it.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>Z-fighting</glossterm>
+                <glossdef>
+                    <para>An example to show how it can happen.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>Stencil buffer</glossterm>
+                <glossdef>
+                    <para>Something to show their utility.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>3D Textures</glossterm>
+                <glossdef>
+                    <para>Something to show their utility.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>Multisampling</glossterm>
+                <glossdef>
+                    <para>Something to show the need for this.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>Exact window pixel alignment</glossterm>
+                <glossdef>
+                    <para>Show how to do this in OpenGL.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>Scissor box</glossterm>
+                <glossdef>
+                    <para>Something to show this feature off.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>Flat Shading</glossterm>
+                <glossdef>
+                    <para>Show this off.</para>
+                </glossdef>
+            </glossentry>
+            <glossentry>
+                <glossterm>Early Depth Test</glossterm>
+                <glossdef>
+                    <para>Optimization.</para>
+                </glossdef>
+            </glossentry>
+        </glosslist>
+    </section>
+</article>

Documents/Tutorial 00/Core Graphics.xml

+<?xml version="1.0" encoding="UTF-8"?>
+<?oxygen RNGSchema="http://docbook.org/xml/5.0/rng/docbookxi.rng" type="xml"?>
+<?oxygen SCHSchema="http://docbook.org/xml/5.0/rng/docbookxi.rng"?>
+<article xmlns="http://docbook.org/ns/docbook" xmlns:xi="http://www.w3.org/2001/XInclude"
+    xmlns:xlink="http://www.w3.org/1999/xlink" version="5.0" xml:id="core_graphics">
+</article>

Documents/Tutorial 00/Tutorial 0.xml

+<?xml version="1.0" encoding="UTF-8"?>
+<?oxygen RNGSchema="http://docbook.org/xml/5.0/rng/docbookxi.rng" type="xml"?>
+<?oxygen SCHSchema="http://docbook.org/xml/5.0/rng/docbookxi.rng"?>
+<chapter xml:id="tut_00" xmlns="http://docbook.org/ns/docbook" xmlns:xi="http://www.w3.org/2001/XInclude"
+    xmlns:xlink="http://www.w3.org/1999/xlink" version="5.0">
+    <title>Introduction</title>
+    <para>Unlike most of the tutorials, this introduction is purely text. There is no source code or
+        project associated with this tutorial.</para>
+    <para>Here, we will be discussing graphical rendering theory and OpenGL. This serves as a primer
+        to the rest of the tutorials.</para>
+    <section>
+        <title>Graphics and Rendering</title>
+        <para>These tutorials are for users with any knowledge of graphics or none. As such, there
+            is some basic background information you need to know before we can start looking at
+            actual OpenGL code.</para>
+        <para>Everything you see on your screen, even the text you are reading right now (assuming
+            you are reading this on an electronic display device, rather than a printout) is simply
+            a two-dimensional array of pixels. If you take a screenshot of something on your screen,
+            and blow it up, it will look very blocky.</para>
+        <!--TODO: Add an image of blocky pixels here.-->
+        <para>Each of these blocks is a <glossterm>pixel</glossterm>. The word <quote>pixel</quote>
+            is derived from the term <quote><acronym>Pic</acronym>ture
+                <acronym>El</acronym>ement</quote>. Every pixel on your screen has a particular
+            color. A two-dimensional array of pixels is called an
+            <glossterm>image</glossterm>.</para>
+        <para>The purpose of graphics of any kind is therefore to determine what color to put in
+            what pixels. This determination is what makes text look like text, windows look like
+            windows, and so forth.</para>
+        <para>If all graphics are just a two-dimensional array of pixels, how does 3D work? 3D
+            graphics is a system of producing colors for pixels that convinces you that the scene
+            you are looking at is a 3D world rather than a 2D image. The process of converting a 3D
+            world into a 2D image of that world is called
+            <glossterm>rendering.</glossterm></para>
+        <para>There are several methods of rendering a 3D world. The process used by real-time
+            graphics hardware, such as that found in your computer, involves a very great deal of
+            fakery. This process is called <glossterm>rasterization,</glossterm> and a rendering
+            system that uses rasterization is called a <glossterm>rasterizer.</glossterm></para>
+        <para>In rasterizers, all objects that you see are empty shells. There are techniques that
+            are used to allow you to cut open these empty shells, but this simply replaces part of
+            the shell with another shell that shows what the inside looks like. Everything is a
+            shell.</para>
+        <para>All of these shells are made of triangles. Even surfaces that appear to be round are
+            merely triangles if you look closely enough. There are techniques that generate more
+            triangles for objects that appear closer or larger, so that the viewer can almost never
+            see the faceted silhouette of the object. But they are always made of triangles.</para>
+        <note>
+            <para>Some rasterizers use planar quadrilaterals: four-sided objects, where all of the
+                lines lie in the same plane. One of the reasons that graphics hardware always uses
+                triangles is that all of the lines of triangles are guaranteed to be in the same
+                plane.</para>
+        </note>
+        <para>Objects made from triangles are often called <glossterm>geometry</glossterm>, a
+                <glossterm>model</glossterm> or a <glossterm>mesh</glossterm>; these terms are used
+            interchangeably.</para>
+        <para>The process of rasterization has several phases. These phases are ordered into a
+            pipeline, where triangles enter from the top and a 2D image is filled in at the bottom.
+            This is one of the reasons why rasterization is so amenable to hardware acceleration: it
+            operates on each triangle one at a time, in a specific order.</para>
+        <para>This also means that the order in which meshes are submitted to the rasterizer can
+            affect its output.</para>
+        <para>OpenGL is an API for accessing a hardware-based rasterizer. As such, it conforms to
+            the model for rasterizers. A rasterizer receives a sequence of triangles from the user,
+            performs operations on them, and writes pixels based on this triangle data. This is a
+            simplification of how rasterization works in OpenGL, but it is useful for our
+            purposes.</para>
+        <formalpara>
+            <title>Triangles and Vertices</title>
+            <para>Triangles consist of 3 vertices. A vertex consists of a collection of data. For
+                the sake of simplicity (we will expand upon this later), let us say that this data
+                must contain a point in three dimensional space. Any 3 points that are not on the
+                same line create a triangle, so the smallest information for a triangle consists of
+                3 three-dimensional points.</para>
+        </formalpara>
+        <para>A point in 3D space is defined by 3 numbers or coordinates. An X coordinate, a Y
+            coordinate, and a Z coordinate. These are commonly written with parenthesis, as in (X,
+            Y, Z).</para>
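+        <para>As a hypothetical C++ sketch (purely illustrative, not code from the tutorial's
+            framework), the minimum information for a triangle is simply three such
+            points:</para>
+        <programlisting>//A position in 3D space: three coordinates.
+struct Vec3
+{
+    float x;
+    float y;
+    float z;
+};
+
+//The smallest information for a triangle: 3 three-dimensional points.
+struct Triangle
+{
+    Vec3 vertices[3];
+};</programlisting>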
+        <section>
+            <title>Rasterization Overview</title>
+            <para>The rasterization pipeline, particularly for modern hardware, is very complex.
+                This is a very simplified overview of this pipeline. It is necessary to have a
+                simple understanding of the pipeline before we look at the details of rendering
+                things with OpenGL. Those details can be overwhelming without a high level
+                overview.</para>
+            <formalpara>
+                <title>Clip Space Transformation</title>
+                <para>The first phase of rasterization is to transform the vertices of each triangle
+                    into a certain volume of space. Everything within this volume will be rendered
+                    to the output image, and everything that falls outside of this region will not
+                    be. This region corresponds to the view of the world that the user wants to
+                    render, to some degree.</para>
+            </formalpara>
+            <para>The volume that the triangle is transformed into is called, in OpenGL parlance,
+                    <glossterm>clip space</glossterm>. The positions of a vertex of a triangle in
+                clip space are called <glossterm>clip coordinates.</glossterm></para>
+            <para>Clip coordinates are a little different from regular positions. A position in 3D
+                space has 3 coordinates. A position in clip space has <emphasis>four</emphasis>
+                coordinates. The first three are the usual X, Y, Z positions; the fourth is called
+                W. This last coordinate actually defines what the extents of clip space are for
+                this vertex.</para>
+            <para>Clip space can actually be different for different vertices. It is a region of 3D
+                space on the range [-W, W] in each of the X, Y, and Z directions. So vertices with a
+                different W coordinate are in a different clip space cube from other vertices. Since
+                each vertex can have an independent W component, each vertex of a triangle exists in
+                its own clip space.</para>
+            <para>In clip space, the positive X direction is to the right, the positive Y direction
+                is up, and the positive Z direction is away from the viewer.</para>
+            <!--TODO: Add an image of clip space here.-->
+            <para>The process of transforming vertices into clip space is quite arbitrary. OpenGL
+                provides a lot of flexibility in this step. We will cover this step in detail
+                throughout the tutorials.</para>
+            <para>Because clip space is the visible transformed version of the world, any triangles
+                that fall outside of this region are discarded. Any triangles that are partially
+                outside of this region undergo a process called <glossterm>clipping.</glossterm>
+                This breaks the triangle apart into a number of smaller triangles, such that the
+                smaller triangles cover the area within clip space. Hence the name <quote>clip
+                    space.</quote></para>
+            <formalpara>
+                <title>Normalized Coordinates</title>
+                <para>Clip space is interesting, but inconvenient. The extent of this space is
+                    different for each vertex, which makes visualizing a triangle rather difficult.
+                    Therefore, clip space is transformed into a more reasonable coordinate space:
+                        <glossterm>normalized device coordinates</glossterm>.</para>
+            </formalpara>
+            <para>This process is very simple. The X, Y, and Z of each vertex's position is divided
+                by W to get normalized device coordinates. That is all.</para>
+            <para>Therefore, the space of normalized device coordinates is essentially just clip
+                space, except that the range of X, Y and Z are [-1, 1]. The directions are all the
+                same. The division by W is an important part of projecting 3D triangles onto 2D
+                images, but we will cover that in a future tutorial.</para>
+            <formalpara>
+                <title>Window Transformation</title>
+                <para>The next phase of rasterization is to transform the vertices of each triangle
+                    again. This time, they are converted from normalized device coordinates to
+                        <glossterm>window coordinates</glossterm>. As the name suggests, window
+                    coordinates are relative to the window that OpenGL is running within.</para>
+            </formalpara>
+            <para>Even though they refer to the window, they are still three dimensional
+                coordinates. The X goes to the right, Y goes up, and Z goes away, just as for clip
+                space. The only difference is that the bounds for these coordinates depend on the
+                viewable window. It should also be noted that while these are in window coordinates,
+                none of the precision is lost. These are not integer coordinates; they are still
+                floating-point values, and thus they have precision beyond that of a single
+                pixel.</para>
+            <para>The bounds for Z are [0, 1], with 0 being the closest and 1 being the farthest.
+                Vertex positions outside of this range are not visible.</para>
+            <para>Note that window coordinates have the bottom-left position as the (0, 0) origin
+                point. This is counter to what most users are accustomed to, where the top-left
+                position is the origin. There are transform tricks you can play to allow you to
+                work in a top-left coordinate space.</para>
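+            <para>One such trick is simply flipping the Y axis yourself; a hypothetical helper
+                (not an OpenGL function) might look like this:</para>
+            <programlisting>//Convert a Y coordinate measured from the top-left (common in
+//window systems) to one measured from OpenGL's bottom-left origin.
+float BottomLeftY(float yFromTop, float windowHeight)
+{
+    return windowHeight - yFromTop;
+}</programlisting>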
+            <para>The full details of this process will be discussed at length as the tutorials
+                progress.</para>
+            <formalpara>
+                <title>Scan Conversion</title>
+                <para>After converting the coordinates of a triangle to window coordinates, the
+                    triangle undergoes a process called <glossterm>scan conversion.</glossterm> This
+                    process takes the triangle and breaks it up based on the arrangement of window
+                    pixels over the output image that the triangle covers.</para>
+            </formalpara>
+            <!--TODO: Show a series of images, starting with a triangle, then overlying it with a pixel grid, followed by one showing
+which pixels get filled in.-->
+            <para>The specifics of which pixels a triangle does and does not cover are not
+                important. What matters is that if two triangles are perfectly adjacent, such that
+                they share the same input vertex positions, the output rasterization will never
+                have holes or double coverage: along the shared edge, there will be no overlap or
+                gaps between the two triangles.</para>
+            <para>The result of scan converting a triangle is a sequence of boxes along the area of
+                the triangle. These boxes are called <glossterm>fragments.</glossterm></para>
+            <para>Each fragment has certain data associated with it. This data contains the 2D
+                location of the fragment in window coordinates, and the Z value of the fragment.
+                This Z value is known as the depth of the fragment. There may be other information
+                that is part of a fragment, and we will expand on that in later tutorials.</para>
+            <formalpara>
+                <title>Fragment Processing</title>
+                <para>This phase takes a fragment from a scan converted triangle and transforms it
+                    into one or more color values and a single depth value. The order that fragments
+                    from a single triangle are processed in is irrelevant; since a single triangle
+                    lies in a single plane, fragments generated from it cannot possibly overlap.
+                    However, the fragments from another triangle can possibly overlap. Since order
+                    is important in a rasterizer, the fragments from one triangle must be processed
+                    before the fragments from another triangle.</para>
+            </formalpara>
+            <para>This phase is quite arbitrary. The user of OpenGL has a lot of options for how to
+                decide what color to assign a fragment. We will cover this step in detail throughout
+                the tutorials.</para>
+            <note>
+                <title>Direct3D Note</title>
+                <para>Direct3D prefers to call this stage <quote>pixel processing</quote> or
+                        <quote>pixel shading</quote>. This is a misnomer, because as we will see in
+                    tutorials on antialiasing, multiple fragments from a single triangle can be
+                    combined together to form an output pixel. Also, the fragment has not been
+                    written to the image as of yet. Indeed, this step can conditionally prevent
+                    rendering of a fragment based on arbitrary computations. Thus a
+                        <quote>pixel</quote> in D3D parlance may never actually become a pixel at
+                    all.</para>
+            </note>
+            <formalpara>
+                <title>Fragment Writing</title>
+                <para>After generating one or more colors and a depth value, the fragment is written
+                    to the destination image. This step involves more than simply writing to the
+                    destination image. Combining the color and depth with the colors that are
+                    currently in the image can involve a number of computations. These will be
+                    covered in detail in various tutorials.</para>
+            </formalpara>
+        </section>
+        <section>
+            <title>Colors</title>
+            <para>Previously, a pixel was stated to be an element in a 2D image that has a
+                particular color. A color can be described in many ways.</para>
+            <para>In computer graphics, the usual description of a color is as a series of numbers
+                on the range [0, 1]. Each of the numbers corresponds to the intensity of a
+                particular reference color; thus the final color represented by the series of
+                numbers is a mix of these reference colors.</para>
+            <para>The set of reference colors is called a <glossterm>colorspace</glossterm>. The
+                most common colorspace for a screen is RGB, where the reference colors are Red, Green
+                and Blue. Printed works tend to use CMYK (Cyan, Magenta, Yellow, Black). Since we're
+                dealing with rendering to a screen, and because OpenGL requires it, we will use the
+                RGB colorspace.</para>
+            <para>So a pixel in OpenGL is defined as 3 values on the range [0, 1] that represent a
+                color in the RGB colorspace. This will get extended slightly, as we deal with
+                transparency later.</para>
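+            <para>Since most image tools describe channels as integers on the range [0, 255], a
+                hypothetical conversion to this floating-point representation (not an actual OpenGL
+                function) is a simple division:</para>
+            <programlisting>struct Color
+{
+    float r, g, b;
+};
+
+//Map 8-bit-per-channel values onto the [0, 1] range.
+Color ColorFromBytes(unsigned char r, unsigned char g, unsigned char b)
+{
+    Color c;
+    c.r = r / 255.0f;
+    c.g = g / 255.0f;
+    c.b = b / 255.0f;
+    return c;
+}</programlisting>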
+        </section>
+        <section>
+            <title>Shader</title>
+            <para>A <glossterm>shader</glossterm> is a program designed to be run on a renderer as
+                part of the rendering operation. Regardless of the kind of rendering system in use,
+                shaders can only be executed at certain points in that rendering process. These
+                    <glossterm>shader stages</glossterm> represent hooks where a user can add
+                arbitrary algorithms to create a specific visual effect.</para>
+            <para>In terms of rasterization as outlined above, there are several shader stages where
+                arbitrary processing is both economical for performance and offers high utility to
+                the user. For example, the transformation of an incoming vertex to clip space is a
+                useful hook for user-defined code, as is the processing of a fragment into final
+                colors and depth.</para>
+            <para>Shaders for OpenGL are run on the actual rendering hardware. This can often free
+                up valuable CPU time for other tasks, or simply perform operations that would be
+                difficult if not impossible without the flexibility of executing arbitrary code. A
+                downside of this is that they must live within certain limits, some of them quite
+                ill-defined, that CPU code does not have to.</para>
+            <para>There are a number of shading languages available to various APIs. The one used in
+                this tutorial is the primary shading language of OpenGL. It is called,
+                unimaginatively, the OpenGL Shading Language, or <acronym>GLSL</acronym> for short.
+                It looks deceptively like C, but it is very much <emphasis>not</emphasis> C.</para>
+        </section>
+    </section>
+    <section>
+        <title>What is OpenGL</title>
+        <para>Before we can begin looking into writing an OpenGL application, we must first know
+            what it is that we are writing. What exactly is OpenGL?</para>
+        <section>
+            <title>OpenGL as an API</title>
+            <para>OpenGL is usually looked at as an Application Programming Interface
+                    (<acronym>API</acronym>). The OpenGL API has been exposed to a number of
+                languages. But the one that they all ultimately use at their lowest level is the C
+                API.</para>
+            <para>The API, in C, is defined by a number of typedefs, #defined enumerator values, and
+                functions. The typedefs define basic GL types like <type>GLint</type>,
+                    <type>GLfloat</type> and so forth.</para>
+            <para>Complex aggregates like structs are never directly exposed in OpenGL. Any such
+                constructs are hidden behind the API. This makes it easier to expose the OpenGL API
+                to non-C languages without having a complex conversion layer.</para>
+            <para>In C++, if you wanted an object that contained an integer, a float, and a string,
+                you would create it and access it like this:</para>
+            <programlisting>struct Object
+{
+    int anInteger;
+    float aFloat;
+    const char *aString;
+};
+
+//Create the storage for the object.
+Object newObject;
+
+//Put data into the object.
+newObject.anInteger = 5;
+newObject.aFloat = 0.4f;
+newObject.aString = "Some String";
+</programlisting>
+            <para>In OpenGL, you would use an API that looks more like this:</para>
+            <programlisting>//Create the storage for the object
+GLuint objectName;
+glGenObject(1, &amp;objectName);
+
+//Put data into the object.
+glBindObject(GL_MODIFY, objectName);
+glObjectParameteri(GL_MODIFY, GL_OBJECT_AN_INTEGER, 5);
+glObjectParameterf(GL_MODIFY, GL_OBJECT_A_FLOAT, 0.4f);
+glObjectParameters(GL_MODIFY, GL_OBJECT_A_STRING, "Some String");</programlisting>
+            <para>None of these are actual OpenGL commands, of course. This is simply an example of
+                what the interface to such an object would look like.</para>
+            <para>OpenGL owns the storage for all OpenGL objects. Because of this, the user can only
+                access an object by reference. Almost all OpenGL objects are referred to by an
+                unsigned integer (the <type>GLuint</type>). Objects are created by a function of the
+                form <function>glGen*</function>, where * is the type of the object. The first
+                parameter is the number of objects to create, and the second is a
+                    <type>GLuint*</type> array that receives the newly created object names.</para>
+            <para>To modify most objects, they must first be bound to the context. Many objects can
+                be bound to different locations in the context; this allows the same object to be
+                used in different ways. These different locations are called
+                    <glossterm>targets</glossterm>; all objects have a list of valid targets, and
+                some have only one. In the above example, the fictitious target
+                    <quote>GL_MODIFY</quote> is the location where <varname>objectName</varname> is
+                bound.</para>
+            <para>The functions that actually change values within the object are given a target
+                parameter, so that they could modify objects bound to different targets.</para>
+            <para>Note that not all OpenGL objects are as simple as this example, and the functions
+                that change object state do not all follow these naming conventions. Also, exactly
+                what it means to bind an object to the context is explained below.</para>
+        </section>
+        <section>
+            <title>The Structure of OpenGL</title>
+            <para>The OpenGL API is defined as a state machine. Almost all of the OpenGL functions
+                set or retrieve some state in OpenGL. The only functions that do not change state
+                are functions that use the currently set state to cause rendering to happen.</para>
+            <para>You can think of the state machine as a very large struct with a great many
+                different fields. This struct is called the OpenGL <glossterm>context</glossterm>,
+                and each field in the context represents some information necessary for
+                rendering.</para>
+            <para>Objects in OpenGL are thus defined as a list of fields in this struct that can be
+                saved and restored. <glossterm>Binding</glossterm> an object to a target within the
+                context causes the data in this object to replace some of the context's state. Thus
+                after the binding, future function calls that read from or modify this context state
+                will read or modify the state within the object.</para>
+            <para>Objects are usually represented as <type>GLuint</type> integers; these are handles
+                to the actual OpenGL objects. The integer value 0 is special; it acts as the object
+                equivalent of a NULL pointer. Binding object 0 means to unbind the currently bound
+                object. This means that the original context state, the state that was in place
+                before the binding took place, now becomes the context state.</para>
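+            <para>This model can be sketched in C++ (a deliberately crude illustration of the
+                behavior, not of any real implementation):</para>
+            <programlisting>//The context as a struct. Binding determines which state a
+//state-setting call actually modifies.
+struct ObjectState
+{
+    int anInteger;
+};
+
+struct Context
+{
+    ObjectState contextState;   //used when object 0 is bound
+    ObjectState objects[16];    //object storage, owned by the implementation
+    unsigned int bound;         //0 acts like a NULL pointer
+};
+
+void BindObject(Context *ctx, unsigned int name)
+{
+    ctx->bound = name;
+}
+
+//Modifies the bound object, or the original context state when
+//nothing is bound.
+void SetAnInteger(Context *ctx, int value)
+{
+    if (ctx->bound == 0)
+        ctx->contextState.anInteger = value;
+    else
+        ctx->objects[ctx->bound].anInteger = value;
+}</programlisting>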
+            <para>Do note that this is simply a model of OpenGL's <emphasis>behavior.</emphasis>
+                This is most certainly <emphasis>not</emphasis> how it is actually
+                implemented.</para>
+        </section>
+        <section>
+            <title>The OpenGL Specification</title>
+            <para>To be technical about it, OpenGL is not an API; it is a specification. A document.
+                The C API is merely one way to implement the spec. The specification defines the
+                initial OpenGL state, what each function does, and what is supposed to happen when
+                you call a rendering function.</para>
+            <para>The specification is written by the OpenGL <glossterm>Architectural Review
+                    Board</glossterm> (<acronym>ARB</acronym>), a group of representatives from
+                companies like Apple, NVIDIA, and AMD (the ATi part), among others. The ARB is part
+                of the <link xlink:href="http://www.khronos.org/">Khronos Group</link>.</para>
+            <para>The specification is a very complicated and technical document. I do not suggest
+                that the novice graphics programmer read it. If you do however, the most important
+                thing to understand about it is this: it describes <emphasis>results</emphasis>, not
+                implementation. For example, the spec says that clipping of triangles happens before
+                transforming them from clip-space to normalized device coordinate space. Hardware
+                almost certainly does clipping in normalized device coordinate space, simply because
+                all the vertices are in the same space. It doesn't matter to the results, so it is
+                still a valid OpenGL implementation.</para>
+        </section>
+    </section>
+    <glossary>
+        <title>Glossary</title>
+        <glossentry>
+            <glossterm>pixel</glossterm>
+            <glossdef>
+                <para>The smallest division of a digital image. A pixel has a particular color in a
+                    particular colorspace.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>image</glossterm>
+            <glossdef>
+                <para>A two-dimensional array of pixels.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>colorspace</glossterm>
+            <glossdef>
+                <para>The set of reference colors that define a way of representing a color in
+                    computer graphics. All colors are defined relative to a particular
+                    colorspace.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>rendering</glossterm>
+            <glossdef>
+                <para>The process of taking the source 3D world and converting it into a 2D image
+                    that represents a view of that world from a particular angle.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>rasterization</glossterm>
+            <glossdef>
+                <para>A particular rendering method, used to convert a series of 3D triangles into a
+                    2D image.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>geometry, model, mesh</glossterm>
+            <glossdef>
+                <para>A single object in 3D space made of triangles.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>vertex</glossterm>
+            <glossdef>
+                <para>One of the 3 elements that make up a triangle. Vertices can contain
+                    arbitrary data, but among that data is a 3-dimensional position representing the
+                    location of the vertex in 3D space.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>clip space, clip coordinates</glossterm>
+            <glossdef>
+                <para>A region of three-dimensional space into which vertex positions are
+                    transformed. These vertex positions are 4 dimensional quantities. The fourth
+                    component (W) of clip coordinates represents the visible range of clip space for
+                    that vertex. So the X, Y, and Z component of clip coordinates must be between
+                    [-W, W] to be a visible part of the world.</para>
+                <para>In clip space, positive X goes right, positive Y up, and positive Z
+                    away.</para>
+                <para>Clip-space vertices are output by the vertex processing stage of the rendering
+                    pipeline.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>normalized device coordinates</glossterm>
+            <glossdef>
+                <para>These are clip coordinates that have been divided by their fourth component.
+                    This makes this range of space the same for all components. Vertices with
+                    positions on the range [-1, 1] are visible, and other vertices are not.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>window space, window coordinates</glossterm>
+            <glossdef>
+                <para>A region of three-dimensional space that normalized device coordinates are
+                    mapped to. The X and Y positions of vertices in this space are relative to the
+                    destination image. The origin is in the bottom-left, with positive X going right
+                    and positive Y going up. The Z value is a number on the range [0, 1], where 0 is
+                    the closest value and 1 is the farthest. Vertex positions outside of this range
+                    are not visible.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>scan conversion</glossterm>
+            <glossdef>
+                <para>The process of taking a triangle in window space and converting it into a
+                    number of fragments based on projecting it onto the pixels of the output
+                    image.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>fragment</glossterm>
+            <glossdef>
+                <para>A single element of a scan converted triangle. A fragment can contain
+                    arbitrary data, but among that data is a 3-dimensional position, identifying the
+                    location on the triangle in window space where this fragment originates
+                    from.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>shader</glossterm>
+            <glossdef>
+                <para>A program designed to be executed by a renderer, in order to perform some
+                    user-defined operations.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>shader stage</glossterm>
+            <glossdef>
+                <para>A particular place in a rendering pipeline where a shader can be executed to
+                    perform a computation.</para>
+            </glossdef>
+        </glossentry>
+    </glossary>
+</chapter>

Documents/Tutorial 01/Tutorial 01.xml

+<?xml version="1.0" encoding="UTF-8"?>
+<?oxygen RNGSchema="http://docbook.org/xml/5.0/rng/docbookxi.rng" type="xml"?>
+<?oxygen SCHSchema="http://docbook.org/xml/5.0/rng/docbookxi.rng"?>
+<chapter xmlns="http://docbook.org/ns/docbook" xmlns:xi="http://www.w3.org/2001/XInclude"
+    xmlns:xlink="http://www.w3.org/1999/xlink" version="5.0">
+    <title>Hello, Triangle!</title>
+    <para>It is traditional for tutorials and introductory books on programming languages to
+        start with a program called <quote>Hello, World!</quote> This program is the simplest
+        code necessary to print the text <quote>Hello, World!</quote> It serves as a good test
+        to see that one's build system is functioning and that one can compile and execute
+        code.</para>
+    <para>Using OpenGL to write actual text is rather involved. In lieu of text, our first tutorial
+        will be drawing a single triangle to the screen.</para>
+    <section>
+        <title>Framework and FreeGLUT</title>
+        <para>The source to this tutorial, found in <filename>Tut1 Hello
+                Triangle/tut1.cpp</filename>, is fairly simple. The project file that builds the
+            final executable actually uses two source files: the tutorial file and a common
+            framework file found in <filename>framework/framework.cpp</filename>. The framework file
+            is where the actual initialization of FreeGLUT is done; it is also where <function>main</function> is. This
+            file simply uses functions defined in the main tutorial file.</para>
+        <para>FreeGLUT is a fairly simple OpenGL initialization system. It creates and manages a
+            single window; all OpenGL commands refer to this window. Because windows in various GUI
+            systems require certain book-keeping, FreeGLUT rigidly controls how the user
+            interacts with this window.</para>
+        <para>The framework file expects 4 functions to be defined: <function>init</function>,
+                <function>display</function>, <function>reshape</function>, and
+                <function>keyboard</function>. The <function>init</function> function is called
+            after OpenGL is initialized. This gives the tutorial file the opportunity to load what
+            it needs into OpenGL before actual rendering takes place. The
+                <function>reshape</function> function is called by FreeGLUT whenever the window is
+            resized. This allows the tutorial to make whatever OpenGL calls are necessary to keep
+            the window's size in sync with OpenGL. The <function>keyboard</function> function is
+            called by FreeGLUT whenever the user presses a key. This gives the tutorial the chance
+            to process some basic user input.</para>
+        <para>The <function>display</function> function is where the most important work happens.
+            FreeGLUT will call this function when it detects that the screen needs to be rendered
+            to.</para>
+    </section>
+    <section>
+        <title>Dissecting Display</title>
+        <para>The <function>display</function> function seems on the surface to be fairly simple.
+            However, the functioning of it is fairly complicated and intertwined with the
+            initialization done in the <function>init</function> function.</para>
+        <example>
+            <title>The <function>display</function> Function</title>
+            <programlisting>glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
+glClear(GL_COLOR_BUFFER_BIT);
+
+glUseProgram(theProgram);
+
+glBindBuffer(GL_ARRAY_BUFFER, positionBufferObject);
+glEnableVertexAttribArray(positionAttrib);
+glVertexAttribPointer(positionAttrib, 4, GL_FLOAT, GL_FALSE, 0, 0);
+
+glDrawArrays(GL_TRIANGLES, 0, 3);
+
+glDisableVertexAttribArray(positionAttrib);
+glUseProgram(0);
+
+glutSwapBuffers();</programlisting>
+        </example>
+        <para>Let us examine this code in detail.</para>
+        <para>The first two lines clear the screen. <function>glClearColor</function> is one of
+            those state setting functions; it sets the color to use when clearing the screen. It
+            sets the clearing color to black. <function>glClear</function> does not set OpenGL
+            state; it causes the screen to be cleared. The <literal>GL_COLOR_BUFFER_BIT</literal>
+            parameter means that the clear call will affect the color buffer, causing it to be
+            cleared to the current clearing color.</para>
+        <para>The next line sets the current shader program to be used by all subsequent rendering
+            commands. We will go into detail as to how this works later.</para>
+        <para>The next three commands all set state. These commands set up the coordinates of the
+            triangle to be rendered. They tell OpenGL the location in memory that the positions of
+            the triangle will come from. The specifics of how these work will be detailed
+            later.</para>
+        <para>The <function>glDrawArrays</function> function is, as the name suggests, a rendering
+            function. It uses the current state to generate a stream of vertices that will form
+            triangles.</para>
+        <para>The next two lines are simply cleanup work, undoing some of the setup that was done
+            for the purposes of rendering.</para>
+        <para>The last line, <function>glutSwapBuffers</function>, is a FreeGLUT command, not an
+            OpenGL command. The OpenGL framebuffer, as we set up in
+                <filename>framework.cpp</filename>, is double-buffered. This means that the image
+            that is currently being shown to the user is <emphasis>not</emphasis> the same image we
+            are rendering to. Thus, all of our rendering is hidden from view until it is shown to
+            the user. This way, the user never sees a half-rendered image.
+                <function>glutSwapBuffers</function> is the function that causes the image we are
+            rendering to be displayed to the user.</para>
+    </section>
+    <section>
+        <title>Following the Data</title>
+        <para>In the <link linkend="tut_00">basic background section</link>, we described the
+            functioning of the OpenGL pipeline. We will now revisit this pipeline in the context of
+            the code in tutorial 1. This will give us an understanding about the specifics of how
+            OpenGL goes about rendering data.</para>
+        <section>
+            <title>Vertex Transfer</title>
+            <para>The first stage in the rasterization pipeline is transforming vertices to clip
+                space. Before OpenGL can do this however, it must receive a list of vertices. So the
+                very first stage of the pipeline is sending triangle data to OpenGL.</para>
+            <para>This is the data that we wish to transfer:</para>
+            <programlisting>const float vertexPositions[] = {
+    0.75f, 0.75f, 0.0f, 1.0f,
+    0.75f, -0.75f, 0.0f, 1.0f,
+    -0.75f, -0.75f, 0.0f, 1.0f,
+};</programlisting>
+            <para>Each line of 4 values represents a 4D position of a vertex. These are four
+                dimensional because, as you may recall, clip-space is 4D as well. These vertex
+                positions are already in clip space. What we want OpenGL to do is render a triangle
+                based on this vertex data. Since every 4 floats represents a vertex's position, we
+                have 3 vertices: the minimum number for a triangle.</para>
+            <para>Even though we have this data, OpenGL cannot use it directly. OpenGL has some
+                limitations on what memory it can read from. You can allocate vertex data all you
+                want yourself; OpenGL cannot directly see any of your memory. Therefore, the first
+                step is to allocate some memory that OpenGL <emphasis>can</emphasis> see, and fill
+                that memory with our data. This is done with something called a <glossterm>buffer
+                    object.</glossterm></para>
+            <para>A buffer object is a linear array of memory allocated by OpenGL at the behest of
+                the user. This memory is controlled by the user, but the user has only indirect
+                control over it. Think of a buffer object as an array of GPU memory. The GPU can
+                read this memory quickly, so storing data in it has performance advantages.</para>
+            <para>The buffer object in the tutorial was created during initialization. Here is the
+                code responsible for creating the buffer object:</para>
+            <example>
+                <title>Buffer Object Initialization</title>
+                <programlisting>void InitializeVertexBuffer()
+{
+    glGenBuffers(1, &amp;positionBufferObject);
+    
+    glBindBuffer(GL_ARRAY_BUFFER, positionBufferObject);
+    glBufferData(GL_ARRAY_BUFFER, sizeof(vertexPositions), vertexPositions, GL_STATIC_DRAW);
+    glBindBuffer(GL_ARRAY_BUFFER, 0);
+}</programlisting>
+            </example>
+            <para>The first line creates the buffer object, storing the handle to the object in the
+                global variable <varname>positionBufferObject</varname>. Though the object now
+                exists, it doesn't own any memory yet. That is because we have not allocated any
+                with this object.</para>
+            <para>The <function>glBindBuffer</function> function makes the buffer object the
+                currently bound buffer to the <literal>GL_ARRAY_BUFFER</literal> binding target. As
+                mentioned in Tutorial 0, objects in OpenGL usually have to be bound to the context
+                in order for them to do anything, and buffer objects are no exception.</para>
+            <para>The <function>glBufferData</function> function allocates memory for the buffer
+                currently bound to <literal>GL_ARRAY_BUFFER</literal>, which is the one we just
+                created. We already have some vertex data; the problem is that it is in our memory
+                rather than OpenGL's memory. This function allocates enough GPU memory to store our
+                vertex data. The third parameter is a pointer to the data to initialize the buffer
+                with; we give it our vertex data. The fourth parameter is something we will look at
+                in future tutorials.</para>
+            <para>The second bind buffer call is simply cleanup. By binding the buffer object 0 to
+                    <literal>GL_ARRAY_BUFFER</literal>, we cause the buffer object previously bound
+                to that target to become unbound from it. This was not strictly necessary, as any
+                later binds to this target will simply unbind what is already there. But unless you
+                have very strict control over your rendering, it is usually a good idea to unbind
+                the objects you bind.</para>
+            <para>This is all just to get the vertex data in the GPU's memory. But buffer objects
+                are not formatted; as far as OpenGL is concerned, all we did was fill a buffer
+                object with random binary data. We now need to do something that tells OpenGL that
+                there is vertex data in this buffer object.</para>
+            <para>We do this in the rendering code. That is the purpose of these lines:</para>
+            <programlisting>glBindBuffer(GL_ARRAY_BUFFER, positionBufferObject);
+glEnableVertexAttribArray(positionAttrib);
+glVertexAttribPointer(positionAttrib, 4, GL_FLOAT, GL_FALSE, 0, 0);</programlisting>
+            <para>The first function we have seen before. It simply says that we are going to use
+                this buffer object.</para>
+            <para>The second function, <function>glEnableVertexAttribArray</function> is something
+                we will explain in the next section, when we talk about where
+                    <varname>positionAttrib</varname> comes from. Suffice it to say that this
+                function tells OpenGL that the vertex data called <varname>positionAttrib</varname>
+                will be provided in rendering calls. Without this function, the next one is
+                unimportant.</para>
+            <para>The third function is the real key. <function>glVertexAttribPointer</function>,
+                despite having the word <quote>Pointer</quote> in it, does not deal with pointers.
+                Instead, it deals with buffer objects.</para>
+            <para>This function tells OpenGL where a particular piece of vertex data is coming from.
+                The buffer that is bound to <literal>GL_ARRAY_BUFFER</literal> at the time that this function is called
+                is the buffer object that will be associated with this piece of data.</para>
+            <para>What this particular function call is saying is this. <quote>The piece of vertex
+                    data called <varname>positionAttrib</varname> comes from the buffer object
+                        <varname>positionBufferObject</varname>. This piece of vertex data contains
+                    32-bit float values, and each piece is a sequence of 4 of them. The data starts
+                    at the 0th byte of the buffer object, and each set of 4 32-bit floats is tightly
+                    packed together.</quote> This means that our data of 12 floats represents enough
+                information for the 3 vertices of a single triangle; this is exactly what we
+                want.</para>
+            <para>The specifics of this function call will be discussed in later tutorials.</para>
+            <para>Once OpenGL knows where to get its vertex data from, it can now use that vertex
+                data to render.</para>
+            <programlisting>glDrawArrays(GL_TRIANGLES, 0, 3);</programlisting>
+            <para>This function seems very simple on the surface, but it does a great deal. The
+                second and third parameters represent the start index and the number of indices to
+                read from our vertex data. We start at the 0th index, and read 3 vertices from it.
+                The first parameter tells OpenGL that it is to take every 3 vertices that it gets as
+                an independent triangle. Thus, it will read 3 vertices and connect them to form a
+                triangle.</para>
+            <para>Again, we will go into details in another tutorial.</para>
+        </section>
+        <section>
+            <title>Vertex Processing and Shaders</title>
+            <para>Now that we can tell OpenGL what the vertex data is, we come to the next stage of
+                the pipeline: vertex processing. This is one of two programmable stages that we will
+                cover in this tutorial, so this involves the use of a
+                <glossterm>shader.</glossterm></para>
+            <para>A shader is simply a program that runs on the GPU. There are several possible
+                shader stages in the pipeline, and each has its own inputs and outputs. The purpose
+                of a shader is to take its inputs, as well as potentially various other data, and
+                convert them into a set of outputs.</para>
+            <para>Each shader is executed over a set of inputs. It is important to note that a
+                shader, of any stage, operates <emphasis>completely independently</emphasis> of any
+                other shader of that stage. There can be no crosstalk between separate executions of
+                a shader. Execution for each set of inputs starts from the beginning of the shader
+                and continues to the end. A shader defines what its inputs and outputs are, and it
+                is illegal for a shader to complete without writing to all of its outputs.</para>
+            <para>Vertex shaders, as the name implies, operate on vertices. Specifically, each
+                invocation of a vertex shader operates on a <emphasis>single</emphasis> vertex.
+                These shaders must output, among any other user-defined outputs, a clip-space
+                position for that vertex. Where this comes from is up to the shader.</para>
+            <para>Shaders in OpenGL are written in the OpenGL Shading Language
+                    (<acronym>GLSL</acronym>). This language looks suspiciously like C, but it is
+                very much not C. It has far too many limitations to be C (for example, recursion is
+                forbidden). This is what our simple vertex shader looks like:</para>
+            <example>
+                <title>Vertex Shader</title>
+                <programlisting>#version 150
+
+in vec4 position;
+void main()
+{
+    gl_Position = position;
+}</programlisting>
+            </example>
+            <para>This looks fairly simple. The first line states that the version of GLSL used by
+                this shader is version 1.50. A version declaration is required for all GLSL
+                shaders.</para>
+            <para>The next line defines an input to the vertex shader. The input is called
+                    <varname>position</varname> and is of type <type>vec4</type>: a 4-dimensional
+                vector of floating-point values.</para>
+            <para>As with C, a shader's execution starts with the <function>main</function>
+                function. This shader is very simple, copying the input <varname>position</varname>
+                into something called <varname>gl_Position</varname>. This is a variable that is
+                    <emphasis>not</emphasis> defined in the shader; that is because it is a standard
+                variable defined by every vertex shader. If you see an identifier in GLSL that
+                starts with <quote>gl_,</quote> then it must be a built-in identifier.</para>
+            <para><varname>gl_Position</varname> is defined as:</para>
+            <programlisting>out vec4 gl_Position;</programlisting>
+            <para>Recall that the minimum a vertex shader must do is generate a clip-space position
+                for the vertex. That is what <varname>gl_Position</varname> is: it is the output
+                that represents a clip-space position.</para>
+            <formalpara>
+                <title>Vertex Attributes</title>
+                <para>Inputs to and outputs from a shader stage come from somewhere and go to
+                    somewhere. Thus, the input <varname>position</varname> must be filled in with
+                    data somewhere. So where does that data come from? Inputs to a vertex shader are
+                    called <glossterm>vertex attributes</glossterm>.</para>
+            </formalpara>
+            <para>You might recognize something similar to the term <quote>vertex attribute.</quote>
+                For example, <quote>glEnable<emphasis>VertexAttrib</emphasis>Array</quote> or
+                        <quote>gl<emphasis>VertexAttrib</emphasis>Pointer.</quote></para>
+            <para>This is how data flows down the pipeline in OpenGL. When rendering starts, vertex
+                data in a buffer object is read based on setup work done by
+                    <function>glVertexAttribPointer</function>. This function describes where the data
+                for an attribute comes from. The connection between a particular call to
+                    <function>glVertexAttribPointer</function> and the string name of an input value
+                to a vertex shader is somewhat complicated. This is where that mysterious variable,
+                    <varname>positionAttrib,</varname> comes into play.</para>
+            <para>The details of compiling a shader will be gone over a bit later, but the
+                connection is made with this call:</para>
+            <programlisting>positionAttrib = glGetAttribLocation(theProgram, "position");</programlisting>
+            <para>The variable <varname>theProgram</varname> represents the vertex shader (and the
+                fragment shader, but that's for later). The function
+                    <function>glGetAttribLocation</function> takes the given string that specifies a
+                vertex input, and it returns a number that represents that particular input. This
+                number is then used in subsequent <function>glVertexAttribPointer</function> calls
+                and the like to represent the attribute for <quote>position.</quote></para>
+        </section>
+        <section>
+            <title>Rasterization</title>
+            <para>All that has happened thus far is that 3 vertices have been given to OpenGL and it
+                has transformed them with a vertex shader into 3 positions in clip-space. Next, the
+                vertex positions are transformed into normalized-device coordinates by dividing the
+                3 XYZ components of the position by the W component. In our case, W is always 1.0,
+                so the positions are already effectively in normalized-device coordinates.</para>
+            <para>After this, the vertex positions are transformed into window coordinates. This is
+                done with something called the <glossterm>viewport transform</glossterm>. This is so
+                named because of the function used to set it up, <function>glViewport</function>.
+                The tutorial calls this function every time the window's size changes. Remember that
+                the framework calls <function>reshape</function> whenever the window's size changes.
+                So the tutorial's implementation of <function>reshape</function> is this:</para>
+            <example>
+                <title>Reshaping Window</title>
+                <programlisting>void reshape (int w, int h)
+{
+    glViewport(0, 0, (GLsizei) w, (GLsizei) h);
+}</programlisting>
+            </example>
+            <para>This tells OpenGL what region of the window we are rendering to. In this
+                case, we change it to match the full available area; without this function, resizing
+                the window would have no effect on the rendering. Also, make note of the fact that
+                we make no effort to keep the aspect ratio constant.</para>
+            <para>Recall that window coordinates are in a lower-left coordinate system. So the point
+                (0, 0) is the bottom left of the window. This function takes the bottom left
+                position as the first two coordinates, and the width and height of the viewport
+                rectangle as the other two coordinates.</para>
+            <para>Once in window coordinates, OpenGL can now take these 3 vertices and scan-convert
+                them into a series of fragments. In order to do this however, OpenGL must decide what
+                the list of vertices represents.</para>
+            <para>OpenGL can interpret a list of vertices in a variety of different ways. The way
+                OpenGL interprets vertex lists is given by the draw command:</para>
+            <programlisting>glDrawArrays(GL_TRIANGLES, 0, 3);</programlisting>
+            <para>The enum <literal>GL_TRIANGLES</literal> tells OpenGL that every 3 vertices of the
+                list should be taken to be a triangle. Since we passed only 3 vertices, we get 1
+                triangle.</para>
+        </section>
+        <section>
+            <title>Fragment Processing</title>
+            <para>A fragment shader is used to compute the output color(s) of a fragment. The inputs
+                of a fragment shader include the window-space XYZ position of the fragment. It can
+                also include user-defined data, but we will get to that.</para>
+            <para>Our fragment shader looks like this:</para>
+            <example>
+                <title>Fragment Shader</title>
+                <programlisting>#version 150
+
+out vec4 outputColor;
+void main()
+{
+   outputColor = vec4(1.0f, 1.0f, 1.0f, 1.0f);
+}</programlisting>
+            </example>
+            <para>As with the vertex shader, the first line states that the shader uses GLSL version
+                1.50.</para>
+            <para>The next line specifies an output for the fragment shader. The output variable is
+                of type <type>vec4</type>.</para>
+            <para>The main function simply sets the output color to a 4-dimensional vector, with all
+                of the components as 1.0f. This sets the Red, Green, and Blue components of the
+                color to full intensity, which is 1.0; this creates the white color of the triangle.
+                The fourth component is something we will see in later tutorials.</para>
+            <para>This fragment shader does not do anything with the position input.</para>
+            <para>After the fragment shader executes, the fragment output color is written to the
+                output image.</para>
+        </section>
+    </section>
+    <section>
+        <title>Making Shaders</title>
+        <para>We glossed over exactly how these text strings called shaders actually get used. We
+            will go into some detail on that now.</para>
+        <note>
+            <para>If you are familiar with how shaders work in other APIs, that will not help you
+                here. OpenGL shaders work very differently from the way they work in other
+                APIs.</para>
+        </note>
+        <para>Shaders are written in a C-like language. So OpenGL uses a very C-like compilation
+            model. In C, each individual .c file is compiled into an object file. Then, one or more
+            object files are linked together into a single program (or static/shared library).
+            OpenGL does something very similar.</para>
+        <para>A shader string is compiled into a <glossterm>shader object</glossterm>; this is
+            analogous to an object file. One or more shader objects is linked into a
+                <glossterm>program object</glossterm>.</para>
+        <para>A program object in OpenGL contains code for <emphasis>all</emphasis> of the shaders
+            to be used for rendering. In the tutorial, we have a vertex and a fragment shader; both
+            of these are linked together into a single program object. Building that program object
+            is the responsibility of this code:</para>
+        <example>
+            <title>Program Initialization</title>
+            <programlisting>void InitializeProgram()
+{
+    std::vector&lt;GLuint> shaderList;
+    
+    shaderList.push_back(CreateShader(GL_VERTEX_SHADER, strVertexShader));
+    shaderList.push_back(CreateShader(GL_FRAGMENT_SHADER, strFragmentShader));
+    
+    theProgram = CreateProgram(shaderList);
+    
+    positionAttrib = glGetAttribLocation(theProgram, "position");
+}</programlisting>
+        </example>
+        <para>The first statement simply creates a list of the shader objects we intend to link
+            together. The next two statements compile our two shader strings. The
+                <function>CreateShader</function> function is a utility function defined by the
+            tutorial that compiles a shader.</para>
+        <para>Compiling a shader into a shader object is a lot like compiling source code. Most
+            important of all, it involves error checking. This is the implementation of
+                <function>CreateShader</function>:</para>
+        <example>
+            <title>Shader Creation</title>
+            <programlisting>GLuint CreateShader(GLenum eShaderType, const std::string &amp;strShaderFile)
+{
+    GLuint shader = glCreateShader(eShaderType);
+    const char *strFileData = strShaderFile.c_str();
+    glShaderSource(shader, 1, &amp;strFileData, NULL);
+    
+    glCompileShader(shader);
+    
+    GLint status;
+    glGetShaderiv(shader, GL_COMPILE_STATUS, &amp;status);
+    if (status == GL_FALSE)
+    {
+        GLint infoLogLength;
+        glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &amp;infoLogLength);
+        
+        GLchar *strInfoLog = new GLchar[infoLogLength + 1];
+        glGetShaderInfoLog(shader, infoLogLength, NULL, strInfoLog);
+        
+        const char *strShaderType = NULL;
+        switch(eShaderType)
+        {
+        case GL_VERTEX_SHADER: strShaderType = "vertex"; break;
+        case GL_GEOMETRY_SHADER: strShaderType = "geometry"; break;
+        case GL_FRAGMENT_SHADER: strShaderType = "fragment"; break;
+        }
+        
+        fprintf(stderr, "Compile failure in %s shader:\n%s\n", strShaderType, strInfoLog);
+        delete[] strInfoLog;
+    }
+
+    return shader;
+}</programlisting>
+        </example>
+        <para>An OpenGL shader object is, as the name suggests, an object. So the first step is to
+            create the object with <function>glCreateShader</function>. This function creates a
+            shader of a particular type (vertex or fragment), so it takes a parameter that tells
+            what kind of object it creates. Since each shader stage has certain syntax rules and
+            pre-defined variables and constants, the type of shader must be specified when the
+            object is created.</para>
+        <note>
+            <para>Shader and program objects are objects in OpenGL. But they work rather differently
+                from other kinds of OpenGL objects. For example, creating buffer objects, as shown
+                above, uses a function of the form <quote>glGen*</quote> where * is
+                    <quote>Buffer</quote>. It takes a number of objects to create and a list to put
+                those object handles in.</para>
+            <para>There are many other differences between shader/program objects and other kinds of
+                OpenGL objects.</para>
+        </note>
+        <para>The next step is to actually compile the text shader into the object. The C-style
+            string is retrieved from the C++ <classname>std::string</classname> object, and it is
+            fed into the shader object with the <function>glShaderSource</function> function. The
+            first parameter is the shader object to put the string into. The next parameter is the
+            number of strings to put into the shader. Compiling multiple strings into a single
+            shader object works analogously to including header files in C source files, except
+            that a .c file explicitly lists the files it includes, while you must manually add
+            them with <function>glShaderSource</function>.</para>
+        <para>The next parameter is an array of const char* strings. The last parameter is normally
+            an array of lengths of the strings. We pass in <literal>NULL</literal>, which tells
+            OpenGL to assume that the string is null-terminated. In general, unless you need to use
+            the null character in a string, there is no need to use the last parameter.</para>
+        <para>Once the strings are in the object, they are compiled with
+                <function>glCompileShader</function>, which simply takes the shader object to
+            compile.</para>
+        <para>After compiling, we need to see if the compilation was successful. We do this by
+            calling <function>glGetShaderiv</function> to retrieve the
+                <literal>GL_COMPILE_STATUS</literal>. If this is <literal>GL_FALSE</literal>, then the shader failed to
+            compile; otherwise compiling was successful.</para>
+        <para>If compilation fails, we do some error reporting. It prints a message to stderr that
+            explains what failed to compile. It also prints an info log from OpenGL that describes
+            the error; think of this log as the compiler output from a regular C compilation.</para>
+        <para>After creating both shader objects, we then pass them on to the
+                <function>CreateProgram</function> function:</para>
+        <example>
+            <title>Program Creation</title>
+            <programlisting>GLuint CreateProgram(const std::vector&lt;GLuint> &amp;shaderList)
+{
+    GLuint program = glCreateProgram();
+    
+    for(size_t iLoop = 0; iLoop &lt; shaderList.size(); iLoop++)
+    	glAttachShader(program, shaderList[iLoop]);
+    
+    glLinkProgram(program);
+    
+    GLint status;
+    glGetProgramiv (program, GL_LINK_STATUS, &amp;status);
+    if (status == GL_FALSE)
+    {
+        GLint infoLogLength;
+        glGetProgramiv(program, GL_INFO_LOG_LENGTH, &amp;infoLogLength);
+        
+        GLchar *strInfoLog = new GLchar[infoLogLength + 1];
+        glGetProgramInfoLog(program, infoLogLength, NULL, strInfoLog);
+        fprintf(stderr, "Linker failure: %s\n", strInfoLog);
+        delete[] strInfoLog;
+    }
+    
+    return program;
+}</programlisting>
+        </example>
+        <para>This function is fairly simple. It first creates an empty program object with
+                <function>glCreateProgram</function>. This function takes no parameters; remember
+            that program objects are a combination of <emphasis>all</emphasis> shader stages.</para>
+        <para>Next, it attaches each of the previously created shader objects to the program, by
+            calling the function <function>glAttachShader</function> in a loop over the
+                <classname>std::vector</classname> of shader objects. The program does not need to
+            be told what stage each shader object is for; the shader object itself remembers
+            this.</para>
+        <para>Once all of the shader objects are attached, the code links the program with
+                <function>glLinkProgram</function>. Similar to before, we must then fetch the
+            linking status by calling <function>glGetProgramiv</function> with
+                <literal>GL_LINK_STATUS</literal>. If it is GL_FALSE, then the linking failed and we
+            print the linking log. Otherwise, we return the created program.</para>
+        <formalpara>
+            <title>Vertex Attribute Indexes</title>
+            <para>The last line of <function>InitializeProgram</function> is the key to how
+                attributes are linked between the vertex array data and the vertex program's
+                input.</para>
+        </formalpara>
+        <programlisting>positionAttrib = glGetAttribLocation(theProgram, "position");</programlisting>
+        <para>The function <function>glGetAttribLocation</function> takes a successfully linked
+            program and a string naming one of the inputs of the vertex shader in that program. It
+            returns the attribute index of that input. Therefore, when we use this program, if we
+            want to send some vertex data to the <varname>position</varname> input variable, we
+            simply use the <varname>positionAttrib</varname> value we retrieved from
+                <function>glGetAttribLocation</function> in our call to
+                <function>glVertexAttribPointer</function>.</para>
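As a sketch, rendering code could then use the queried index when setting up the attribute array. This fragment assumes a valid OpenGL context and the tutorial's <code>positionBufferObject</code>; it is illustrative, not a complete program:

```cpp
// Sketch: feeding vertex data to the shader input whose index we
// queried with glGetAttribLocation.  Assumes a linked program and
// the tutorial's positionBufferObject.
glBindBuffer(GL_ARRAY_BUFFER, positionBufferObject);
glEnableVertexAttribArray(positionAttrib);
glVertexAttribPointer(positionAttrib, 4, GL_FLOAT, GL_FALSE, 0, 0);

glDrawArrays(GL_TRIANGLES, 0, 3);

glDisableVertexAttribArray(positionAttrib);
```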
+        <formalpara>
+            <title>Using Programs</title>
+            <para>To tell OpenGL that rendering commands should use a particular program object, the
+                    <function>glUseProgram</function> function is called. In the tutorial this is
+                called twice in the <function>display</function> function. It is called with the
+                global <varname>theProgram</varname>, which tells OpenGL that we want to use that
+                program for rendering until further notice. It is later called with 0, which tells
+                OpenGL that no programs will be used for rendering.</para>
+        </formalpara>
+        <note>
+            <para>For the purposes of these tutorials, using program objects when rendering is
+                    <emphasis>not</emphasis> optional. OpenGL does have, in its compatibility
+                profile, default rendering state that takes over when a program is not being used.
+                We will not be using this, and you are encouraged to avoid its use as well.</para>
+        </note>
+    </section>
+    <section>
+        <title>Cleanup</title>
+        <para>The tutorial allocates a lot of system resources. It allocates a buffer object, which
+            represents memory on the GPU. It creates two shader objects and a program object. But it
+            never explicitly deletes any of this.</para>
+        <para>Part of this is due to the nature of FreeGLUT, which does not provide hooks for a
+            cleanup function. But part of it is also due to the nature of OpenGL itself. In a simple
+            example such as this, there is no need to delete anything. OpenGL will clean up its own
+            assets when OpenGL is shut down as part of window deactivation.</para>
+        <para>It is generally good form to delete objects that you create before shutting down
+            OpenGL. And you certainly should do it if you encapsulate objects in C++ objects, such
+            that destructors will delete the OpenGL objects. But it isn't strictly necessary.</para>
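If you do want to delete them, the deletion functions mirror the creation functions. A hypothetical cleanup fragment along these lines (not part of the tutorial; assumes the tutorial's <code>theProgram</code> and <code>positionBufferObject</code>):

```cpp
// Hypothetical cleanup fragment; the tutorial never calls these.
// Shader objects can be deleted with glDeleteShader once the
// program has been linked; they are no longer needed after that.
glDeleteProgram(theProgram);
glDeleteBuffers(1, &positionBufferObject);
```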
+    </section>
+    <section>
+        <title>In Review</title>
+        <para>At this point, you have a good general overview of how things work in OpenGL. You know
+            how to compile and link shaders, how to pass some basic vertex data to OpenGL, and how
+            to render a triangle.</para>
+        <section>
+            <title>Further Study</title>
+            <para>Even with a simple tutorial like this, there are many things to play around with
+                and investigate.</para>
+            <itemizedlist>
+                <listitem>
+                    <para>Change the color value set by the fragment shader to different values. Use
+                        values in the range [0, 1], and see what happens when you go outside that
+                        range.</para>
+                </listitem>
+                <listitem>
+                    <para>Change the positions of the vertex data. Keep position values in the [-1,
+                        1] range, then see what happens when triangles go outside this range. Notice
+                        what happens when you change the Z value of the positions (note: nothing
+                        should happen). Keep W at 1.0 for now.</para>
+                </listitem>
+                <listitem>
+                    <para>Change the values that <function>reshape</function> gives to
+                            <function>glViewport</function>. Make them bigger or smaller than the
+                        window and see what happens. Shift them around to different quadrants within
+                        the window.</para>
+                </listitem>
+                <listitem>
+                    <para>Change the <function>reshape</function> function so that it respects
+                        aspect ratio. This means that the area rendered to may be smaller than the
+                        window area. Also, try to make it so that it always centers it within the
+                        window.</para>
+                </listitem>
+                <listitem>
+                    <para>Change the clear color, using values in the range [0, 1]. Notice how this
+                        interacts with changes to the viewport.</para>
+                </listitem>
+                <listitem>
+                    <para>Add another 3 vertices to the list, and change the number of vertices sent
+                        in the <function>glDrawArrays</function> call from 3 to 6. Add more and play
+                        with them.</para>
+                </listitem>
+            </itemizedlist>
+        </section>
+        <section>
+            <title>OpenGL Functions of Note</title>
+            <glosslist>
+                <glossentry>
+                    <glossterm>glClearColor, glClear</glossterm>
+                    <glossdef>
+                        <para>These functions clear the current viewable area of the screen.
+                                <function>glClearColor</function> sets the color to clear, while
+                                <function>glClear</function> with the
+                                <literal>GL_COLOR_BUFFER_BIT</literal> value causes the image to be
+                            cleared with that color.</para>
+                    </glossdef>
+                </glossentry>
+                <glossentry>
+                    <glossterm>glGenBuffers, glBindBuffer, glBufferData</glossterm>
+                    <glossdef>
+                        <para>These functions are used to create and manipulate buffer objects.
+                                <function>glGenBuffers</function> creates one or more buffer
+                            objects, <function>glBindBuffer</function> attaches a buffer to a
+                            location in the context, and <function>glBufferData</function>
+                            allocates the buffer object's memory and fills it with data provided
+                            by the user.</para>
+                    </glossdef>
+                </glossentry>
+                <glossentry>
+                    <glossterm>glEnableVertexAttribArray, glDisableVertexAttribArray,
+                        glVertexAttribPointer</glossterm>
+                    <glossdef>
+                        <para>These functions control vertex attribute arrays.
+                                <function>glEnableVertexAttribArray</function> activates the given
+                            attribute index, <function>glDisableVertexAttribArray</function>
+                            deactivates the given attribute index, and
+                                <function>glVertexAttribPointer</function> defines the format and
+                            source location (buffer object) of the vertex data.</para>
+                    </glossdef>
+                </glossentry>
+                <glossentry>
+                    <glossterm>glDrawArrays</glossterm>
+                    <glossdef>
+                        <para>This function initiates rendering, using the currently active vertex
+                            attributes and the current program object (among other state). It causes
+                            a number of vertices to be pulled from the attribute arrays in
+                            order.</para>
+                    </glossdef>
+                </glossentry>
+                <glossentry>
+                    <glossterm>glViewport</glossterm>
+                    <glossdef>
+                        <para>This function defines the current viewport transform. The viewport is
+                            defined as a region of the window, specified by the bottom-left
+                            position and a width/height.</para>
+                    </glossdef>
+                </glossentry>
+                <glossentry>
+                    <glossterm>glCreateShader, glShaderSource, glCompileShader</glossterm>
+                    <glossdef>
+                        <para>These functions create a working shader object.
+                                <function>glCreateShader</function> simply creates an empty shader
+                            object of a particular shader stage. <function>glShaderSource</function>
+                            sets strings into that object; multiple calls to this function simply
+                            overwrite the previously set strings.
+                                <function>glCompileShader</function> causes the shader object to be
+                            compiled with the previously set strings.</para>
+                    </glossdef>
+                </glossentry>
+                <glossentry>
+                    <glossterm>glCreateProgram, glAttachShader, glLinkProgram</glossterm>
+                    <glossdef>
+                        <para>These functions create a working program object.
+                                <function>glCreateProgram</function> creates an empty program
+                            object. <function>glAttachShader</function> attaches a shader object to
+                            that program. Multiple calls attach multiple shader objects.
+                                <function>glLinkProgram</function> links all of the previously
+                            attached shaders into a complete program.</para>
+                    </glossdef>
+                </glossentry>
+                <glossentry>
+                    <glossterm>glUseProgram</glossterm>
+                    <glossdef>
+                        <para>This function causes the given program to become the current program.
+                            All rendering taking place after this call will use this program for the
+                            various shader stages. If program 0 is given, then no program is
+                            current.</para>
+                    </glossdef>
+                </glossentry>
+                <glossentry>
+                    <glossterm>glGetAttribLocation</glossterm>
+                    <glossdef>
+                        <para>This function retrieves the attribute index of a named attribute. It
+                            takes the program to find the attribute in, and the name of the vertex
+                            shader input variable whose attribute index is being queried.</para>
+                    </glossdef>
+                </glossentry>
+            </glosslist>
+        </section>
+    </section>
+    <glossary>
+        <title>Glossary</title>
+        <glossentry>
+            <glossterm>Buffer Object</glossterm>
+            <glossdef>
+                <para>An OpenGL object that represents a linear array of memory, containing
+                    arbitrary data. The contents of the buffer are defined by the user, but the
+                    memory is allocated by OpenGL. Data in buffer objects can be used for many
+                    purposes, including storing vertex data to be used when rendering.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>Vertex Attribute</glossterm>
+            <glossdef>
+                <para>A single input to a vertex shader. Each vertex attribute is a vector of up to
+                    4 elements in length. Vertex attributes are drawn from buffer objects; the
+                    connection between buffer object data and vertex inputs is made with the
+                        <function>glVertexAttribPointer</function> and
+                        <function>glEnableVertexAttribArray</function> functions. Each vertex
+                    attribute in a particular program object has an index; this index can be queried
+                    with <function>glGetAttribLocation</function>. The index is used by the various
+                    other vertex attribute functions to refer to that specific attribute.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>Viewport Transform</glossterm>
+            <glossdef>
+                <para>The process of transforming vertex data from normalized device coordinate
+                    space to window space. It specifies the viewable region of a window.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>Shader Object</glossterm>
+            <glossdef>
+                <para>An object in the OpenGL API that is used to compile shaders and represent the
+                    compiled shader's information. Each shader object is typed based on the shader
+                    stage that it contains data for.</para>
+            </glossdef>
+        </glossentry>
+        <glossentry>
+            <glossterm>Program Object</glossterm>
+            <glossdef>
+                <para>An object in the OpenGL API that represents the full sequence of all shader
+                    processing to be used when rendering. Program objects can be queried for
+                    attribute locations and various other information about the program. They also
+                    contain some state that will be seen in later tutorials.</para>
+            </glossdef>
+        </glossentry>
+    </glossary>
+</chapter>

Documents/Tutorial 02/Tutorial 02.xml

+<?xml version="1.0" encoding="UTF-8"?>
+<?oxygen RNGSchema="http://docbook.org/xml/5.0/rng/docbookxi.rng" type="xml"?>
+<?oxygen SCHSchema="http://docbook.org/xml/5.0/rng/docbookxi.rng"?>
+<chapter xmlns="http://docbook.org/ns/docbook" xmlns:xi="http://www.w3.org/2001/XInclude"
+    xmlns:xlink="http://www.w3.org/1999/xlink" version="5.0">
+    <title>OpenGL's Moving Triangle</title>
+    <para>This tutorial will be building off of the previous tutorial. In that tutorial, we had a
+        single, static triangle. Here, we will move it around.</para>
+    <section>
+        <title>Moving the Vertices</title>
+        <para>The simplest way one might think to move a triangle or other object around is to
+            simply modify the vertex position data directly. From the previous tutorial, we learned
+            that the vertex data is stored in a buffer object. This is what
+                <filename>tut2a.cpp</filename> does.</para>
+        <para>The modifications are done in two steps. The first step is to generate the X, Y offset
+            that will be applied to each position. The second is to apply that offset to each vertex
+            position. The generation of the offset is done with the
+                <function>ComputePositionOffset</function> function:</para>
+        <example>
+            <title>Computation of Position Offsets</title>
+            <programlisting>void ComputePositionOffsets(float &amp;fXOffset, float &amp;fYOffset)
+{
+    const float fLoopDuration = 5.0f;
+    const float fScale = 3.14159f * 2.0f / fLoopDuration;
+    
+    float fElapsedTime = glutGet(GLUT_ELAPSED_TIME) / 1000.0f;
+    
+    float fCurrTimeThroughLoop = fmodf(fElapsedTime, fLoopDuration);
+    
+    fXOffset = cosf(fCurrTimeThroughLoop * fScale) * 0.5f;
+    fYOffset = sinf(fCurrTimeThroughLoop * fScale) * 0.5f;
+}</programlisting>
+        </example>
+        <para>This function computes offsets that repeat in a loop over time. The offsets produce
+            circular motion, and they return to the beginning of the circle every 5 seconds
+            (controlled by <varname>fLoopDuration</varname>). The function
+                <function>glutGet(GLUT_ELAPSED_TIME)</function> retrieves the integer time in
+            milliseconds since the application started. The <function>fmodf</function> function
+            computes the floating-point modulus of the time: it divides the first parameter by the
+            second and returns the remainder. Thus, it returns a value on the range [0,
+                <varname>fLoopDuration</varname>), which is what we need to create a periodically
+            repeating pattern.</para>
+        <para>The <function>cosf</function> and <function>sinf</function> functions compute the
+            cosine and sine respectively. It isn't important to know exactly how these functions
+            work, but together they trace out a circle of radius 1. By multiplying by 0.5f, the
+            code shrinks the circle down to a radius of 0.5.</para>
+        <para>Once the offsets are computed, the offsets have to be added to the vertex data. This
+            is done with the <function>AdjustVertexData</function> function:</para>
+        <example>
+            <title>Adjusting the Vertex Data</title>
+            <programlisting>void AdjustVertexData(float fXOffset, float fYOffset)
+{
+    std::vector&lt;float> fNewData(ARRAY_COUNT(vertexPositions));
+    memcpy(&amp;fNewData[0], vertexPositions, sizeof(vertexPositions));
+    
+    for(int iVertex = 0; iVertex &lt; ARRAY_COUNT(vertexPositions); iVertex += 4)
+    {
+        f