The docs for @bitSizeOf state that:

> This function returns the number of bits it takes to store T in memory if the type were a field in a packed struct/union. The result is a target-specific compile time constant.
Why is the result “target-specific”? Shouldn’t packed fields have the same size on all targets? For example, shouldn’t a u13 always be 13 bits when used as a field in a packed struct?
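
For concreteness, here is a minimal sketch of what I mean (the struct name and fields are made up for illustration):

```zig
const std = @import("std");

// Hypothetical example type, just for illustration.
const Example = packed struct {
    x: u13,
    y: u3,
};

pub fn main() void {
    // My expectation: 13 and 16 on every target, since packed
    // struct fields are laid out bit-by-bit with no padding.
    std.debug.print("u13: {} bits\n", .{@bitSizeOf(u13)});
    std.debug.print("Example: {} bits\n", .{@bitSizeOf(Example)});
}
```

Is there some target where these values would differ?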