I’m completely lost here. I just started with Zig two days ago and I’ve been testing it out with some simple code snippets. However, I’ve been stuck for the past few hours while doing simple bit twiddling. This code won’t compile:
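A minimal sketch that triggers the same class of error (the function name `bit` and the `n: u6` parameter here are stand-ins, not the original snippet):

```zig
// Hypothetical reproduction: the literal 1 is a comptime_int, and Zig
// refuses to shift it by a runtime-known amount because the literal has no fixed width.
fn bit(n: u6) u64 {
    return 1 << n; // compile error on the shift
}
```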
You have to tell it what the bit width of the integer literal should be (with @as, as markus said); otherwise the exact semantics of the left shift are not well defined.
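Continuing the hypothetical sketch above, giving the literal a concrete type is enough to make it compile:

```zig
fn bit(n: u6) u64 {
    // The literal now has a fixed 64-bit width, so shifting it by a runtime u6 is well defined.
    return @as(u64, 1) << n;
}
```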
I’m using 0.10.1. Coercion to a u64 worked, but I’m surprised Zig couldn’t automatically infer the type. It seems like it’d be annoying to constantly cast RHS bit shift operands to a u6 and then explicitly cast integer literals to the desired type.
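On 0.10.x the two casts end up looking something like this (a sketch; `setBits` is a made-up helper, and later Zig versions changed `@intCast` to infer its result type rather than take it as a first argument):

```zig
// Zig 0.10.x-style casts: @as for the literal, @intCast for the shift amount.
fn setBits(count: usize) u64 {
    var x: u64 = 0;
    var i: usize = 0;
    while (i < count) : (i += 1) {
        // The RHS of a u64 shift must be a u6, since log2(64) == 6.
        x |= @as(u64, 1) << @intCast(u6, i);
    }
    return x;
}
```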
Why should (1 << n) be a u64 instead of, say, a u128? Why is x the defining type rather than (1 << n)? Why should n being u6 imply anything about the type of (1 << n)?
Zig, by and large, does not assume or imply. In addition, to keep compilation fast, it tends not to walk long dependency chains just to infer something about a type.