Why is the result of @bitSizeOf target-specific?

The docs for @bitSizeOf state that:

This function returns the number of bits it takes to store T in memory if the type were a field in a packed struct/union. The result is a target-specific compile time constant.

Why is the result “target-specific”? Shouldn’t packed fields have the same size on all targets? For example, shouldn’t a u13 be 13 bits on all targets when in a packed struct?

Yes, u13 is 13 bits, but usize is target-specific.
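A quick way to see this for yourself is to print the values; here is a minimal sketch (the exact output for usize depends on the target you build for, e.g. 64 on x86_64, 32 on wasm32):

```zig
const std = @import("std");

pub fn main() void {
    // Fixed-width integer: always 13 bits, on every target.
    std.debug.print("u13:   {} bits\n", .{@bitSizeOf(u13)});

    // usize matches the target's pointer width, so this varies.
    std.debug.print("usize: {} bits\n", .{@bitSizeOf(usize)});
}
```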


Just elaborating a little: all uX / iX / fX are of course consistent across all targets, but *T (pointers) and usize / isize vary in size depending on the target architecture, and c_int, c_long, c_longdouble, etc. vary depending on the target ABI. The sketch below illustrates the difference.
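For example (a minimal sketch; the architecture- and ABI-dependent lines will print different values depending on what you compile for):

```zig
const std = @import("std");

pub fn main() void {
    // Fixed-width types: identical on every target.
    std.debug.print("u13:    {} bits\n", .{@bitSizeOf(u13)});
    std.debug.print("f32:    {} bits\n", .{@bitSizeOf(f32)});

    // Architecture-dependent: pointer width.
    std.debug.print("*u8:    {} bits\n", .{@bitSizeOf(*u8)});
    std.debug.print("usize:  {} bits\n", .{@bitSizeOf(usize)});

    // ABI-dependent: C interop types.
    std.debug.print("c_int:  {} bits\n", .{@bitSizeOf(c_int)});
    std.debug.print("c_long: {} bits\n", .{@bitSizeOf(c_long)});
}
```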
