So I’m currently learning OpenGL, and I chose to learn it with Zig. I struggled with a strange issue for hours until I finally found the fix.
Here’s the function I use to create a shader:
pub fn init(allocator: Allocator, vertex_path: []const u8, fragment_path: []const u8) !Shader {
    const vertex_file = try std.fs.cwd().openFile(vertex_path, .{});
    const fragment_file = try std.fs.cwd().openFile(fragment_path, .{});
    defer vertex_file.close();
    defer fragment_file.close();

    // read both shader sources into memory
    const vertex_stat = try vertex_file.stat();
    const vertex_source = try vertex_file.readToEndAlloc(allocator, vertex_stat.size);
    defer allocator.free(vertex_source);

    const fragment_stat = try fragment_file.stat();
    const fragment_source = try fragment_file.readToEndAlloc(allocator, fragment_stat.size);
    defer allocator.free(fragment_source);

    // vertex shader
    const vertex = gl.CreateShader(gl.VERTEX_SHADER);
    defer gl.DeleteShader(vertex);
    gl.ShaderSource(vertex, 1, &.{vertex_source.ptr}, null);
    gl.CompileShader(vertex);
    try checkCompileErrors(vertex, .vertex);

    // fragment shader
    const fragment = gl.CreateShader(gl.FRAGMENT_SHADER);
    defer gl.DeleteShader(fragment);
    gl.ShaderSource(fragment, 1, &.{fragment_source.ptr}, null);
    gl.CompileShader(fragment);
    try checkCompileErrors(fragment, .fragment);

    // link both stages into a program
    const id = gl.CreateProgram();
    gl.AttachShader(id, vertex);
    gl.AttachShader(id, fragment);
    gl.LinkProgram(id);
    try checkCompileErrors(id, .program);

    return .{
        .id = id,
    };
}
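For reference, checkCompileErrors isn’t shown above; it’s just the usual compile/link status check. Here’s a simplified sketch of it (written against zigglgen-style bindings, so treat the exact types as approximate):

// Simplified sketch of checkCompileErrors: query the status and log the info log on failure.
fn checkCompileErrors(object: c_uint, kind: enum { vertex, fragment, program }) !void {
    var success: c_int = 0;
    var info_log: [1024:0]u8 = undefined;
    if (kind == .program) {
        gl.GetProgramiv(object, gl.LINK_STATUS, &success);
        if (success == gl.FALSE) {
            gl.GetProgramInfoLog(object, info_log.len, null, &info_log);
            std.log.err("program link error: {s}", .{std.mem.sliceTo(&info_log, 0)});
            return error.LinkFailed;
        }
    } else {
        gl.GetShaderiv(object, gl.COMPILE_STATUS, &success);
        if (success == gl.FALSE) {
            gl.GetShaderInfoLog(object, info_log.len, null, &info_log);
            std.log.err("{s} shader compile error: {s}", .{ @tagName(kind), std.mem.sliceTo(&info_log, 0) });
            return error.CompileFailed;
        }
    }
}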
Here’s the vertex shader that was causing the issue:
#version 330 core
layout (location = 0) in vec3 aPos;

void main() {
    gl_Position = vec4(aPos, 1.0);
}
Originally, I read the shader file using the reader() interface, like this:
const vertex_stat = try vertex_file.stat();
const vertex_source = try vertex_file.reader().readAllAlloc(allocator, vertex_stat.size);
defer allocator.free(vertex_source);
When I ran my program like this, I got a shader compile error, and the only way to “fix” it was to add spaces after each line of the GLSL code, which made no sense to me.
However, when I switched to readToEndAlloc() without the reader() interface, like this:
const vertex_stat = try vertex_file.stat();
const vertex_source = try vertex_file.readToEndAlloc(allocator, vertex_stat.size);
defer allocator.free(vertex_source);
The shader compiled perfectly.
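In case it helps with reproducing this, one way to compare what the two read paths actually hand to gl.ShaderSource would be to dump the buffer right after reading (just a sketch):

// Sketch: print the length and raw bytes of the loaded source so the result of
// reader().readAllAlloc() can be compared byte for byte with readToEndAlloc().
std.debug.print("loaded {d} bytes: {any}\n", .{ vertex_source.len, vertex_source });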
What really confused me is that the fragment shader compiled fine even when using the reader() interface. Here’s the fragment shader that worked:
#version 330 core
out vec4 FragColor;

void main()
{
    FragColor = vec4(0.5, 0.5, 0.5, 0.0);
}
If anyone knows why this happened, please let me know. I spent a lot of time trying to track this down, and I’d love to understand what was going on under the hood. Sorry for the long post.