I have a result type defined like this:
pub const Result = switch (action) {
    .map => union(enum) {
        ignore: void,
        replace: Expr,
    },
    .filter => union(enum) {
        ignore: void,
        discard: void,
        replace: Expr,
    },
};
I have both left and right values of this type and would like to switch on a combination of them, but there's no switch (left, right) in Zig.
I figured I'd convert both left and right to integer values, shift and OR them together, and switch on the result.
So I did:
const R_IGNORE: u16 = @intFromEnum(.ignore);
const R_REPLACE: u16 = @intFromEnum(.replace);
const R_DISCARD = if (action == .walk) @intFromEnum(.discard) else 0;
const L_IGNORE = R_IGNORE << 16;
const L_REPLACE = R_REPLACE << 16;
const L_DISCARD = if (action == .walk) R_DISCARD << 16 else 0;
and then:
const value = (@intFromEnum(left) << 16) | @intFromEnum(right);
switch (value) {
    L_DISCARD | R_DISCARD => return .discard,
}
How do I fix the resulting error and is there a better approach to switching on a combination of enum values?
src/ir.zig:386:58: error: type 'u1' cannot represent integer value '16'
const value = (@intFromEnum(left) << 16) | @intFromEnum(right);
^~
src/ir.zig:386:58: error: type 'u0' cannot represent integer value '16'
const value = (@intFromEnum(left) << 16) | @intFromEnum(right);
^~
src/ir.zig:386:58: error: type 'u0' cannot represent integer value '16'
const value = (@intFromEnum(left) << 16) | @intFromEnum(right);
^~
src/ir.zig:386:58: error: type 'u0' cannot represent integer value '16'
const value = (@intFromEnum(left) << 16) | @intFromEnum(right);
dimdin
April 7, 2025, 8:50am
const value = @as(u32, @intFromEnum(left)) << 16 | @as(u32, @intFromEnum(right));
One bit (u1) can hold a two-value enum. Shifting a 1-bit type by 16 bits requires at least a 17-bit type to have a meaningful result.
How does the compiler come up with u0 and u1, though?
The compiler forces you to use a shift-amount type that can only represent the number of bits you are allowed to shift by. To be honest, I'm not sure why it's asking for a u0, because even a u1 can be shifted 0 or 1 times; the only type you can't shift at all (excluding non-integers) is u0.
joed
April 7, 2025, 10:31am
You can't shift a u1 by more than 0 bits without possible overflow.
var x: u1 = 1;
x <<= 0; // fine
x <<= 1; // overflows!
For any integer type uXX you can compute the integer type needed to store a shift amount using std.math.Log2Int(uXX):
std.math.Log2Int(u32) == u5
std.math.Log2Int(u33) == u6
std.math.Log2Int(u1) == u0
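To tie this back to the original error: a sketch (not from the thread) showing that widening the operand *before* shifting is what makes the 16-bit shift legal, because the shift-amount type follows the left operand's width. The variable names here are made up for illustration.

```zig
const std = @import("std");

test "shift amount type follows operand width" {
    // e.g. @intFromEnum of a two-field union(enum) tag is a u1
    const tag: u1 = 1;
    // `tag << 16` would not compile: Log2Int(u1) == u0 cannot represent 16.
    // Widening to u32 first makes the shift-amount type u5, which holds 16.
    const combined = @as(u32, tag) << 16;
    try std.testing.expectEqual(@as(u32, 0x1_0000), combined);

    comptime std.debug.assert(std.math.Log2Int(u1) == u0);
    comptime std.debug.assert(std.math.Log2Int(u32) == u5);
}
```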
You can't shift at all without possible overflow; how is this different?
joed
April 7, 2025, 10:37am
Actually yeah, you're right: any shift could overflow.
The general rule for Zig shifts, though, is that you're not allowed to shift an X-bit integer by more than X-1 bits, probably because it doesn't make much semantic sense to do so. Left-shifting any u32 by 32 bits is always zero, for example.
Just realised that; we must be sharing a braincell.
I ended up with this. I could make two versions of fn value, one for comptime and one for runtime, but hopefully the compiler will optimize it for me.
The single switch beats a whole lot of if statements hands down!
const value = struct {
    inline fn value(left: Result, right: Result) u32 {
        return @as(u32, @intFromEnum(left)) << 16 | @as(u32, @intFromEnum(right));
    }
}.value;

if (action == .filter) {
    switch (value(left, right)) {
        value(.discard, .discard) => return .discard,
        value(.discard, .ignore) => return .{ .replace = rhs },
        value(.discard, .replace) => return .{ .replace = right.replace },
        value(.ignore, .discard) => return .{ .replace = lhs },
        value(.replace, .discard) => return .{ .replace = left.replace },
        else => unreachable,
    }
}
Is there a way to improve this further?
Probably generate an enum type to switch on, for exhaustive switching.
My enum type, Result, is at the beginning of the thread. I need to switch on a combination of 2 values of type Result. How would I generate an enum type to switch on here?
I meant generate an enum for the combinations of Result, so you could:
switch (value) {
    .discard_discard => // ...,
    .discard_ignore => // ...,
    // ...
}
Any pointers on how to do this at comp time and then map a runtime value onto it?
Set the tag for each variant to be the math you already do, so you can just @enumFromInt() on it.
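A sketch of that suggestion (everything below is hypothetical, not from the thread): write the combination enum by hand with tag values that mirror the existing `(left << 16) | right` packing. This assumes the .filter variant's tags are ignore = 0, discard = 1, replace = 2, which is their declaration order and Zig's default tag assignment.

```zig
// Hypothetical sketch: tag values mirror (left << 16) | right.
const Pair = enum(u32) {
    ignore_ignore = 0 << 16 | 0,
    ignore_discard = 0 << 16 | 1,
    ignore_replace = 0 << 16 | 2,
    discard_ignore = 1 << 16 | 0,
    discard_discard = 1 << 16 | 1,
    discard_replace = 1 << 16 | 2,
    replace_ignore = 2 << 16 | 0,
    replace_discard = 2 << 16 | 1,
    replace_replace = 2 << 16 | 2,
};

inline fn pair(left: Result, right: Result) Pair {
    // All nine tag combinations are members of Pair, so the
    // @enumFromInt cast can never hit an unnamed value.
    return @enumFromInt(@as(u32, @intFromEnum(left)) << 16 |
        @as(u32, @intFromEnum(right)));
}
```

The switch then covers every case by name (`switch (pair(left, right)) { .discard_discard => ..., ... }`) and the compiler enforces exhaustiveness, so the `else => unreachable` arm disappears.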