Hi,
Looking at the Zig grammar, there does not appear to be a way to distinguish between 32-bit and 64-bit floating-point literals at the source level. Are floating-point literals always parsed at the highest available precision and then down-cast to the target precision? Or does the parser look at the target type first, and then decide which precision of standard-library conversion to use when translating the ASCII text of the literal into a fixed-width bit pattern?
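For example (a minimal sketch of what I mean, assuming a recent Zig version for the `@bitCast` syntax):

```zig
const std = @import("std");

pub fn main() void {
    // Same literal text, two target precisions. The question is whether
    // "0.1" is parsed directly to each target type, or parsed once at
    // maximum precision and then narrowed to f32/f64.
    const a: f32 = 0.1;
    const b: f64 = 0.1;

    // Print the raw bit patterns to inspect what each literal became.
    std.debug.print("f32 bits: 0x{x:0>8}\n", .{@as(u32, @bitCast(a))});
    std.debug.print("f64 bits: 0x{x:0>16}\n", .{@as(u64, @bitCast(b))});
}
```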