Thoughts on working with numbers in Zig
Disclaimer: NOT a native English speaker, NOT an expert programmer, NOT an
expert on Zig.
First of all, I do not want this to be some kind of rant; I just want to offer
some constructive criticism of what I consider an otherwise near 10/10
programming and development experience.
Now that I have a few months of hobby-programming in Zig under my belt,
including a little stupid hobby game (far from anything “real”), I feel like
sharing my one and probably only real complaint about the Zig programming
language:
Working with numbers in Zig.
The language refuses to guess the programmer's intent and will not automatically
cast numbers unless the conversion cannot lose information. This sounded
good to me in theory, but in practice it has turned into a seemingly endless
source of frustration. It is not like other safety features, such as a
strong type system or a “borrow checker”, which you eventually learn and which
then stop nagging you as much. Numbers in Zig will continue to nag you forever!
Below is an example of code that Zig will refuse to compile:
for (0..10) |i| {
    const position = m.vec3(i * 1.5, i * 1.5, i * 1.5);
    try backend.drawCube(position, .{}, rotation, "material-name");
}
Why? Because i is an unsigned 64-bit integer (a usize), and you are not allowed
to multiply that by a floating point number. Well… I am sorry, but I did not
ask for an unsigned 64-bit integer; I just asked you to count from 0 to 9 and
draw cubes with an offset.
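To make it compile, the loop ends up looking roughly like this (assuming, as in
my code, that m.vec3 and backend.drawCube take f32 parameters):

for (0..10) |i| {
    // i is a usize, so it has to be converted explicitly before mixing it with floats.
    const f: f32 = @floatFromInt(i);
    const position = m.vec3(f * 1.5, f * 1.5, f * 1.5);
    try backend.drawCube(position, .{}, rotation, "material-name");
}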
This kind of compiler behavior just feels condescending to me, as if the
programmer cannot be trusted not to cause some kind of bug whenever numbers
are involved. Zig is not like that with other things, such as memory management.
There are many other examples; I just picked one. Another irritating instance
of working with numbers in Zig is when you find yourself casting the same
number back and forth just to please the compiler.
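A made-up but typical example: a value is needed as an f32 for the math but as
a usize for indexing, so it gets converted one way and then right back:

const heights = [_]f32{ 0.5, 1.0, 1.5, 2.0 };
const x: f32 = 2.7;
// f32 -> usize, just to be allowed to index the array...
const index: usize = @intFromFloat(x);
// ...and usize -> f32 again, to keep doing math with the same value.
const offset = heights[index] + (x - @as(f32, @floatFromInt(index)));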
And while it probably saves us from making a mistake once in a while, I believe
it to be a net negative for the development process. Here are some reasons:
- It makes programming less fun and more tedious than it has to be.
- It messes with the natural readability of mathematical expressions, as these
two examples show:
Natural:
a = b / c;
Working with numbers in Zig:
const a: f32 = @as(f32, @floatFromInt(b)) / c;
This, I believe, makes it harder to debug the parts of your code that contain
lots of math, such as a graphics shader (which, luckily, we write in other
languages).
As an aside, I have noticed that writing parsers (e.g. during Advent of Code) and
state machines (which are often useful in things like games) is particularly
pleasant in Zig (especially with the labeled switch). I cannot help but wonder
if the reason for that is that the language designers write a lot of that kind
of code. In that case, maybe I can hope that the Zig core team will one day
include someone who deals a lot with math.
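For anyone who has not tried it, here is a rough, made-up sketch of the kind of
labeled-switch state machine I mean (using the syntax available in recent Zig
versions; the enum and names are just an example):

const State = enum { start, running, done };

fn step() void {
    // Each branch can jump straight to the next state with `continue :label`,
    // which reads very naturally for state machines.
    state: switch (State.start) {
        .start => continue :state .running,
        .running => {
            // ... do the actual work for this state here ...
            continue :state .done;
        },
        .done => {},
    }
}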
Thank you for reading my (not-a-rant) post.
