First of all, the compiler does not know the actual value of .Value_COUNT on its own.
Unlike in C, Zig enums are scoped, so multiple enums can use the same field names.
In this case you’d need to write Values.Value_COUNT so the compiler knows which enum you are referring to.
Secondly, the compiler doesn’t allow implicit conversions between enums and integers.
To convert explicitly, you can use @intFromEnum(Values.Value_COUNT).
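For example, a minimal sketch, assuming a hypothetical definition of Values (the real field names aren’t shown in the thread):

const std = @import("std");

// Hypothetical enum layout; only Value_COUNT comes from the original question.
const Values = enum {
    Red,
    Green,
    Blue,
    Value_COUNT, // sentinel member equal to the number of fields above it
};

pub fn main() void {
    // Fully qualified name plus an explicit conversion to an integer.
    const count: usize = @intFromEnum(Values.Value_COUNT);
    std.debug.print("count = {}\n", .{count});
}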
Ah! Thank you. I forgot to mention that I tried @intFromEnum, but specifying the full enum name was the missing piece. I’m confused why I had to fully specify the enum name, since there is only one with that value, but at least it works!
The prefix . syntax infers its type from the expected type at that location. E.g.
const my_val: Values = .Value_COUNT;
This works because .Value_COUNT is an “enum literal”, and when it gets assigned to a variable of type Values it implicitly coerces to Values.Value_COUNT. It’s similar to how you can do:
const x: i32 = 5;
5 is a numeric literal (specifically a comptime_int), but when you assign it to an i32 it coerces to that type. 5 by itself does not have a defined bit representation or even a bit width; it’s not until you give it a type with those properties that they are defined.
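To make that concrete, here is a small sketch (runnable with zig test) showing that a comptime_int has no fixed width until you give it a concrete type:

test "comptime_int has no fixed width" {
    const huge = 1 << 200; // fine: still an untyped comptime_int, arbitrary precision
    const byte: u8 = 200;  // fine: 200 fits once a concrete type is given
    // const bad: u8 = 300; // compile error: 300 cannot be represented in a u8
    _ = huge;
    _ = byte;
}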
As the other commenter said, Zig does not drop the enum’s members into the global namespace, so to use one you have to tell Zig in some way to look inside Values. That is, Values.Value_COUNT, @as(Values, .Value_COUNT), or the assignment examples above would all work.
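Putting those spellings side by side, again with a hypothetical Values and checked at compile time:

const std = @import("std");

// Hypothetical enum, as in the earlier sketch.
const Values = enum { Red, Green, Blue, Value_COUNT };

const a = Values.Value_COUNT;        // fully qualified
const b: Values = .Value_COUNT;      // enum literal coerced by the declared type
const c = @as(Values, .Value_COUNT); // enum literal coerced via @as

comptime {
    // All three spellings name the same enum value.
    std.debug.assert(a == b and b == c);
}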
The way I would probably do this is not with an enum, but with a const variable.
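Reading that as “keep Values a plain enum and put the count in a separate constant”, a minimal sketch might look like the following (the field names and value_count are made up; std.meta.fields derives the count from the enum so it stays in sync):

const std = @import("std");

// Hypothetical field names; the count lives in a constant rather than a sentinel member.
const Values = enum { Red, Green, Blue };
const value_count: usize = std.meta.fields(Values).len;

pub fn main() void {
    std.debug.print("{} values\n", .{value_count});
}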