OpenGL error that seemingly occurs in Debug mode only

I’m following along with the OpenGL tutorial on LearnOpenGL up to this point, where the instructor explains how to load a 3D object file using Assimp and render it with OpenGL. In my code, there is the following function:

fn catchOpenGLError(comptime location: usize) void {
    var error_num: usize = 0;
    while (true) {
        const error_code = c.glGetError();
        if (error_code == c.GL_NO_ERROR) {
            break;
        }
        error_num += 1;
        std.debug.print("OpenGL error code {d} encountered\n", .{error_code});
    }
    if (error_num > 0) {
        std.debug.print("{d} total error(s) encountered before line {d} in source code\n", .{ error_num, location });
    }
}

and it is intended to be callable from anywhere, like so: catchOpenGLError(@src().line).

The problem is that in Debug mode, the program outputs this

OpenGL error code 1281 encountered
1 total error(s) encountered before line 129 in source code

while in ReleaseFast mode (where the code is compiled with zig build -Drelease=true), nothing gets printed to the screen.

This behavior is extremely confusing to me. Setting aside things like undefined behavior introduced by erroneous programming, I figured that the optimization mode should not affect the well-defined execution of core elements of the program and language, much less something that’s very much external to the language, as OpenGL is to Zig here.

I’ve made a compressed tarball of the project here, reduced to as much of a minimally reproducible example as I can manage. You need GLFW and Assimp to compile the program. I’m using Zig 0.15.1. I’ve tried everything I can think of to no avail, so I’d really appreciate it if someone could pinpoint the issue here. Thanks a ton for reading this far.

It most likely means that you pass an undefined value, which gets optimized away in release modes, into one of the OpenGL functions.

However, it is difficult to figure out which function it actually is, since the catchOpenGLError calls are misplaced.
So I’ll give you a few tips for OpenGL debugging:

  • If you must use glGetError, then you need to place it after each OpenGL API call (or at least after each block of API calls). Otherwise it gets much harder to figure out where the error actually came from.
    In your code you seem to have placed it in front of the API calls, so the information that an error happened at some point before the check is maximally useless, because you have to inspect every place where the function is called.
  • Use glDebugMessageCallback. It requires OpenGL 4.3, but that’s available on almost all devices nowadays (except macOS), and it is a lot more helpful, since it tells you what exactly went wrong and in which function. (A minimal setup sketch follows this list.)
  • If that doesn’t help, you can also use RenderDoc, capture the first frame, and it will show you more information for the failing functions, including the values that were passed into them.
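Here is a minimal sketch of such a setup, assuming the same c import namespace as the project and an OpenGL 4.3+ context (with GLFW you would also request a debug context via c.glfwWindowHint(c.GLFW_OPENGL_DEBUG_CONTEXT, c.GLFW_TRUE) before creating the window); the function names here are made up for illustration:

fn glDebugOutput(
    source: c.GLenum,
    message_type: c.GLenum,
    id: c.GLuint,
    severity: c.GLenum,
    length: c.GLsizei,
    message: [*c]const c.GLchar,
    user_param: ?*const anyopaque,
) callconv(.c) void {
    _ = source;
    _ = id;
    _ = user_param;
    // The driver passes the message length, so we can slice without relying on the 0 terminator.
    const text = message[0..@as(usize, @intCast(length))];
    std.debug.print("GL debug (type 0x{x}, severity 0x{x}): {s}\n", .{ message_type, severity, text });
}

fn enableGLDebugOutput() void {
    c.glEnable(c.GL_DEBUG_OUTPUT);
    // Synchronous mode delivers the message on the thread of the offending call.
    c.glEnable(c.GL_DEBUG_OUTPUT_SYNCHRONOUS);
    c.glDebugMessageCallback(glDebugOutput, null);
}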

There is nothing in your example that jumps out as the obvious culprit. The error code you get, 1281, is GL_INVALID_VALUE, meaning an invalid value was passed as an argument, if that is helpful at all.

On a side note, glGetError only stores a single error; errors do not get queued. Once an error is set, it stops recording errors until you call glGetError to fetch it, which also resets the state to GL_NO_ERROR, so calling it in a loop is not needed.
EDIT: Disregard my lies! As was pointed out below, it will still record errors of different types. The loop is correct.

@src() does work in release builds, so something else is going on; it is not the builtin behaving differently. The likely culprit is some error that is safety-checked in debug builds but treated as undefined behavior in release ones. This could be a GL function failing, its result not being checked at the time, and uninitialized values then being used because the function failed and never set them the way you expect.

It might also be better to share a repo for the minimal example instead of a tarball. I only speak for myself, but I would rather review some code in the browser than download a sizable archive. It is likely possible for us to locate the error without needing to replicate the environment and build the project.

const texture_image_file_absolute_path = try std.fmt.allocPrint(allocator, "{s}/{s}", .{ self.model_folder_absolute_path, texture_file_name_normal });

This returns a non-0-terminated string.

fn create(self: *Texture, texture_image_file_absolute_path: []const u8, pixel_type: c.GLenum, texture_unit_offset: u32) void {
    var width: c_int = undefined;
    var height: c_int = undefined;
    var channel_num: c_int = undefined;
    const texture_image_data = c.stbi_load(@ptrCast(texture_image_file_absolute_path), &width, &height, &channel_num, 0);

Here you pass that string to stbi_load, which expects a 0-terminated string ([*:0]const u8). In debug modes the buffer the string is formatted into is initialized with @memset(byte_ptr[0..byte_count], undefined) (0xAA bytes), which is optimized away in unsafe release modes. This means that in debug modes, stb will read 0xAA garbage after the path. In release modes you might be lucky and get a 0 in the right spot after the end of the string.

This also means that you get a null pointer back from stbi_load() (which you should probably check) and that width/height/channel_num will remain undefined. The reason GL logs GL_INVALID_VALUE instead of segfaulting is that 0xAA… is negative when interpreted as a signed integer, so GL won’t even try to dereference the pointer. The fact that it passes in release modes might once again be luck (e.g. the values happening to be initialized to 0).
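A minimal sketch of one way to fix it, collapsing the call site and the start of create into one snippet; the allocator, the changed error-union return type, and the error name are assumptions for illustration:

const path = try std.fmt.allocPrint(allocator, "{s}/{s}", .{ self.model_folder_absolute_path, texture_file_name_normal });
defer allocator.free(path);
// Duplicate the path with a guaranteed 0 terminator before handing it to C.
const path_z = try allocator.dupeZ(u8, path);
defer allocator.free(path_z);

var width: c_int = 0;
var height: c_int = 0;
var channel_num: c_int = 0;
const texture_image_data = c.stbi_load(path_z.ptr, &width, &height, &channel_num, 0);
if (texture_image_data == null) {
    std.debug.print("stbi_load failed for {s}\n", .{path_z});
    return error.TextureLoadFailed; // requires create to return an error union
}
defer c.stbi_image_free(texture_image_data); // free once the texture has been uploaded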


Not quite correct; you should still use a loop. You’re right that once an error like INVALID_VALUE has been recorded, GL won’t record subsequent INVALID_VALUE errors until the first one has been cleared (so 10 invalid GL commands followed by repeated glGetError() calls will still yield just one INVALID_VALUE, then NO_ERROR), but it might still “queue up” other, different error codes like INVALID_ENUM. The only way to completely clear all errors is to use a loop.
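A hypothetical illustration of that behavior (the two calls are deliberately invalid; note that the spec allows multiple pending flags to be returned in any order):

c.glEnable(0xFFFF); // bogus capability: records GL_INVALID_ENUM (1280)
c.glLineWidth(-1.0); // non-positive width: records GL_INVALID_VALUE (1281)
std.debug.print("{d}\n", .{c.glGetError()}); // 1280 or 1281 (order is implementation-defined)
std.debug.print("{d}\n", .{c.glGetError()}); // the other of the two
std.debug.print("{d}\n", .{c.glGetError()}); // 0 (GL_NO_ERROR) once drained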


Thank you so much! This was indeed caused by an unchecked null pointer returned from stbi_load. In Debug mode, width and height are left as 0xAAAAAAAA, which, interpreted as a c_int (the type of width and height in my code, and also of the corresponding parameters in the prototype of glTexImage2D, on 64-bit systems), is -1431655766, a negative number; in ReleaseFast mode they end up (most of the time) as 0. That leads to OpenGL reporting GL_INVALID_VALUE (given when “a value parameter is not a legal value for that function”, which actually makes total sense now) in Debug mode but not in ReleaseFast mode (the error is still reported in ReleaseSafe mode, though).

On another note, while I marked @castholm’s post as the solution, I still really appreciate everyone’s input here. I wouldn’t have gotten my misconception about how OpenGL queues up error codes cleared otherwise.

I know no one asked, but some of the helpful lessons I took away are that diligence must be exercised whenever a []u8 is passed to a C function as a string, and that one should not be lazy about checking C functions’ return values for null pointers.

Off topic, but in
fn catchOpenGLError(comptime location: usize) void {

location should not be comptime. Essentially what you’re doing is forcing the compiler to generate a new version of the function for every line it’s called from. This only serves to duplicate machine code unnecessarily. Instead, your function signature should look like this:
fn catchOpenGLError(location: std.builtin.SourceLocation) void {
which additionally removes the need to include .line in the calls to this function. In practice location will be optimized to a pointer, so you’re not paying the cost of copying it.
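For completeness, here is a sketch of the adjusted function, called as catchOpenGLError(@src()); the body is the same as before, only the signature and the final print change:

fn catchOpenGLError(location: std.builtin.SourceLocation) void {
    var error_num: usize = 0;
    while (true) {
        const error_code = c.glGetError();
        if (error_code == c.GL_NO_ERROR) {
            break;
        }
        error_num += 1;
        std.debug.print("OpenGL error code {d} encountered\n", .{error_code});
    }
    if (error_num > 0) {
        std.debug.print("{d} total error(s) encountered before {s}:{d}\n", .{ error_num, location.file, location.line });
    }
}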
