Issues

Issue #728 resolved

Consideration on replacing GLee

Jason McKesson
created an issue

GLee is the library that Love2D relies on to access OpenGL extensions. However, it is quite old and unsupported. Any bugs that appear in it would have to be fixed by the Love2D developers themselves. Looking at the repository, I can see that several upgrades to GLee have been made manually to support later OpenGL versions.

Is there any chance of GLee being replaced with something else? I am aware that GLee is nice and simple to use, but you don't have to sacrifice simplicity to get functionality or a well-supported loader.

I actually maintain a project that could be used instead of GLee: the OpenGL Loader Generator. It's basically a system that generates OpenGL function loaders, for any platform, for any use. As for how actively it is maintained: when OpenGL 4.4 was released, it was the second loader to support it, only 30 minutes behind GLEW.

And it can do GLee-style loading quite well, so if you don't want to call an initialization function, you don't have to. Though you'll still need one if you want to check extension availability (just as with GLee).
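
To give a rough idea, explicit initialization and extension checks look something like this (a sketch only; the exact identifiers depend on which style and options you generate with, so treat these names as illustrative):

    /* Hypothetical header produced by the generator. */
    #include "gl_core_3_3.h"

    /* Call once an OpenGL context is current. */
    void initGL(void)
    {
        /* Explicit initialization, for styles that use it... */
        if (ogl_LoadFunctions() == ogl_LOAD_FAILED)
        {
            /* ...handle the failure... */
        }

        /* ...which also enables extension checks, as with GLee: */
        if (ogl_ext_ARB_texture_storage)
        {
            /* glTexStorage2D and friends are safe to call. */
        }
    }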

The main feature of the generator is being able to generate headers and source files that contain just the versions and extensions you want. So rather than getting everything (and thus having 2.5MB of source files), you can pare things down to what you actually need. This also prevents you from accidentally using something you shouldn't, because it's simply not there.

If, for example, you want to make sure that nobody accidentally uses an OpenGL 3.0+ feature, just don't export those versions.

I've forked from the Love repo, and I would be interested in having a go at doing the port myself. If it works, is that something a pull request might be accepted for?

Comments (6)

  1. Alex Szpakowski

    Yeah, GLee should definitely be replaced. Its code generator for creating GLee.c/.h from the official GL headers doesn't even work properly anymore, since the layout of the headers available on opengl.org is a bit different now that they're generated from the XML data.

    I've looked at a few different options for replacement, but I think I want whatever we use to have combined runtime desktop GL and GL ES 2.0+ support. I made a (now out of date) proof-of-concept ES 2 port of LÖVE several months ago, and I ended up using an outdated fork of GLEW because it provided combined desktop and ES loading. IMO, the ES 2 port's codebase wouldn't be as nice if it had to have either a completely separate backend or a bunch of #ifdefs everywhere, instead of the combined GL + ES detection / loading.

    I haven't looked into glLoadGen too much, but would it be very difficult to modify it to do that? I recently did some experiments with someone's WIP loader (GLAD) where I hacked in combined support (along with a bunch of other fixes and cleanup which haven't made it upstream), and the fact that it parses the official XML files made it pretty easy to add.

  2. Jason McKesson reporter

    It depends on what you mean by "do that" with regard to OpenGL ES support.

    If you mean to generate separate-but-compatible loaders, so that different platform builds can #include the appropriate header, that's possible (though it would not be without effort, since I'd now have to add a fourth specification type). There can be compile-time or runtime switches to detect which header was included, for the places where it matters (GL ES 2.0 is rather picky about how it does image uploading, for example).
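
    For instance (all names here are hypothetical, not existing Love2D or glLoadGen identifiers):

    /* Per-platform selection of the generated header. */
    #ifdef LOVE_GLES2
    #   include "gles2_loader.h"       /* generated from the ES spec */
    #   define LOVE_USING_GLES 1
    #else
    #   include "gl_desktop_loader.h"  /* generated from the GL spec */
    #   define LOVE_USING_GLES 0
    #endif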

    There would still be some issues, though, particularly for extensions like ARB_framebuffer_object. The desktop GL version doesn't have ARB suffixes, but the GL ES 2.0 one does. And that's not the only extension where that happens. The "solution" that was used in initOpenGLFunctions would probably work (and probably better than it does for buffer objects), but there may be quite a few such extensions. Also, that would not be particularly easy to automate.
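
    Something along these lines, assuming the loader exposes entry points as assignable function pointers (as generated loaders typically do):

    /* Fold a suffixed extension entry point into the core name,
     * in the spirit of what initOpenGLFunctions already does: */
    if (glGenFramebuffers == NULL && glGenFramebuffersEXT != NULL)
        glGenFramebuffers = glGenFramebuffersEXT;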

    If you mean to generate some kind of amalgam header that includes a common subset of desktop GL and GL ES and loads those common functions from one or the other, no. Even if it were possible, I wouldn't be convinced that it's a good idea.

  3. Alex Szpakowski

    There would still be some issues, though, particularly for extensions like ARB_framebuffer_object. The desktop GL version doesn't have ARB suffixes, but the GL ES 2.0 one does.

    FBOs are core in ES 2+, so there are no ARB suffixes for them (and the ES 1.x extension is suffixed OES).

    KHR_debug is common to desktop and ES, and its tokens have the KHR suffix in ES but not in desktop GL - but the XML already basically handles that for you. The only issue there for a GL loader is that people might get a bit confused when reading the header.

    Are you saying there's an enum with different values in desktop and ES?

    The only real issue I had with generating a combined GL+ES loader was the typedefs, which would probably just need a single #ifdef to determine which to use and then everything would be rosy.
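
    E.g. something like this (the exact set of divergent typedefs would come from diffing the two official headers; these two are just examples):

    #include <stddef.h>              /* ptrdiff_t */

    #ifdef LOVE_GLES2
    #include <KHR/khrplatform.h>     /* khronos_* types */
    typedef khronos_intptr_t GLintptr;
    typedef khronos_ssize_t  GLsizeiptr;
    #else
    typedef ptrdiff_t GLintptr;
    typedef ptrdiff_t GLsizeiptr;
    #endif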

  4. Jason McKesson reporter

    but the XML already basically handles that for you

    I'm not sure what you mean by "handles that". It has one list of functions for desktop GL, and another list of functions for GL ES. The ARB-suffixed version does have an "alias" attribute defined for it, but what exactly would you do with that? Pretend that one function is the other? Load one function pointer into the other's data?

    Also, not everything shared between them has an "alias" attribute. For example, glTexStorage2D from ARB_texture_storage/GL 4.2 has an equivalent glTexStorage2DEXT from EXT_texture_storage. But there's no "alias" attribute specifying that. Now you probably won't be using these particular functions, but my point is that, unless you go through the entire spec and verify that every function you want to use has an alias tag, you can't rely on it.
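
    To make that concrete, here's the manual fallback you'd end up writing (a sketch; it assumes the loader exposes glTexStorage2D as an assignable pointer, and it uses SDL's proc lookup since that's what Love2D has available):

    /* Manual fallback for when the XML provides no alias data: */
    glTexStorage2D = (PFNGLTEXSTORAGE2DPROC)
        SDL_GL_GetProcAddress("glTexStorage2D");
    if (glTexStorage2D == NULL)
        glTexStorage2D = (PFNGLTEXSTORAGE2DPROC)
            SDL_GL_GetProcAddress("glTexStorage2DEXT");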

    I'm always leery about trying to have code pretend that desktop OpenGL and GL ES are equivalent. That's because they're not, and pretending that they are is dangerous. Consider this code from Love2D:

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 2, 2, 0, GL_RGBA, GL_UNSIGNED_BYTE, px); /* GL_RGBA8 is a sized internal format */
    

    This is not legal OpenGL ES 2.0 code. Oh, I'm sure some ES 2.0 implementations will execute it (*cough* NVIDIA *cough*), but it's not legal according to the specification. And that's extremely dangerous, since you won't know it's broken until you try to test on a platform that implements the specification correctly.

    The use of sized internal formats like GL_RGBA8 is forbidden in ES 2.0. This is a common trap in code that tries to be API neutral between ES and desktop GL. ES 2.0 deduces the size of the internal format from the pixel transfer parameters you pass in.
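
    Written out per API, the legal versions of that upload differ only in the internalformat argument (LOVE_GLES2 here is a hypothetical build flag):

    #ifdef LOVE_GLES2
    /* ES 2.0: internalformat must be unsized and must match 'format'. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 2, 2, 0, GL_RGBA, GL_UNSIGNED_BYTE, px);
    #else
    /* Desktop GL: request the sized format explicitly. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 2, 2, 0, GL_RGBA, GL_UNSIGNED_BYTE, px);
    #endif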

    This bug would have been caught if the ES 2.0 and desktop GL headers were separate, because ES 2.0 doesn't have the GL_RGBA8 enumerator at all. So compiling it for ES 2.0 would have failed. That's why I strongly suggest that you keep the builds separate.

    I understand the desire to have things be simple, where most of your rendering code doesn't have to care about the differences between ES 2.0 and desktop GL. And you can have that. But you need a way to verify that those differences are actually being taken into account when they matter. And the most effective way to do that is to catch those errors at compile time, not to wait until runtime testing to discover that someone checked in something that subtly breaks the system.

    The only real issue I had with generating a combined GL+ES loader was the typedefs, which would probably just need a single #ifdef to determine which to use and then everything would be rosy.

    Wait. If you're already doing an #ifdef to choose between a desktop GL and an OpenGL ES build... why does it matter whether you have one header or two? All of your OpenGL inclusion already goes through that OpenGL.h header of yours. What is the functional difference between having OpenGL.h #include a desktop GL or GL ES header based on the build setting, and having a single GL header that #ifdefs around typedefs and such?

  5. Alex Szpakowski

    This bug would have been caught if the ES 2.0 and desktop GL headers were separate, because ES 2.0 doesn't have the GL_RGBA8 enumerator at all. So compiling it for ES 2.0 would have failed. That's why I strongly suggest that you keep the builds separate.

    The same could be said about the differences between any similar extensions or core GL versions. It's definitely not a reason to avoid doing this since we're already doing it by supporting more than one GL version as well as various extensions in one header. And that particular difference was not very difficult at all to deal with at the time. As I've said, I already did this with a GLEW fork. :)

    Wait. If you're already doing an #ifdef to choose between a desktop GL and an OpenGL ES build... why does it matter whether you have one header or two? All of your OpenGL inclusion already goes through that OpenGL.h header of yours. What is the functional difference between having OpenGL.h #include a desktop GL or GL ES header based on the build setting?

    The difference is that all of the code in src/modules/graphics/opengl/, aside from one single #ifdef for typedefs, will compile on all supported platforms. There is basically nothing large there that should be exclusive to desktop or ES, just things that require a few different lines here or a different enum there.
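
    In other words, most of it reduces to small runtime branches like this (a sketch; isGLES stands in for whatever flag the combined loader ends up providing):

    /* One enum differs; everything around it compiles everywhere. */
    GLenum internalformat = isGLES ? GL_RGBA : GL_RGBA8;
    glTexImage2D(GL_TEXTURE_2D, 0, internalformat, 2, 2, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, px);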

    I'm always leery about trying to have code pretend that desktop OpenGL and GL ES are equivalent. That's because they're not, and pretending that they are is dangerous.

    I agree, they're different. But they're also nearly identical. The big differences are in what sort of costs and constraints are placed on the game's code, which is not really meant to be handled by LÖVE's internals.

  6. Jason McKesson reporter

    I see your point. You'd have to #ifdef out code that accessed functions/enums that didn't exist in GL ES, rather than simply letting the runtime checks catch them.

    Sadly, I don't think glLoadGen can help you here. The generator is, at its core, designed to generate code for one specification at a time. Whether GL, WGL, or GLX, it simply has no notion of writing multiple specs into the same generated file.

    Giving it such a notion would be... difficult. And somewhat antithetical to its purpose. After all, the whole point of the tool is to generate lean, efficient loading code in headers and source files.

    While glLoadGen can't really help, that doesn't mean I can't. I do have some experience parsing through the XML files (obviously). So if you need some help pulling together some kind of loader generation framework, just let me know.
