I’m encountering a compile error when parsing JSON to a struct (.object) that has an i64 member. It only shows up on Zig 0.15.1; it compiled fine on pre-0.15.1 versions.
The following shows the compile problem. Comment and uncomment the f64 and the i64 fields to see it.
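A minimal sketch of the setup (simplified; `Config` and the field name are placeholders, not the original code):

```zig
const std = @import("std");

const Config = struct {
    count: ?i64 = null, // triggers the compile error on 0.15.1
    // count: ?f64 = null, // compiles fine
};

pub fn main() !void {
    const text =
        \\{"count": 42}
    ;

    // Parse to a generic std.json.Value first, then into the struct.
    const parsed_value = try std.json.parseFromSlice(std.json.Value, std.heap.page_allocator, text, .{});
    defer parsed_value.deinit();

    const parsed = try std.json.parseFromValue(Config, std.heap.page_allocator, parsed_value.value, .{});
    defer parsed.deinit();

    std.debug.print("count = {any}\n", .{parsed.value.count});
}
```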
For some reason, when parsing to Value, it’s parsing the number as a float instead of an int. Looking at the source, I don’t see how that could happen: it shouldn’t parse as a float unless it’s -0 or contains a ., e, or E.
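If I’m reading it right, that check boils down to something like this (my paraphrase, not the stdlib’s actual code):

```zig
const std = @import("std");

/// A number slice is treated as an integer unless it is "-0" or contains
/// '.', 'e', or 'E'.
fn looksLikeInteger(bytes: []const u8) bool {
    if (std.mem.eql(u8, bytes, "-0")) return false;
    return std.mem.indexOfAny(u8, bytes, ".eE") == null;
}

test looksLikeInteger {
    try std.testing.expect(looksLikeInteger("42"));
    try std.testing.expect(!looksLikeInteger("42.0"));
    try std.testing.expect(!looksLikeInteger("4e2"));
    try std.testing.expect(!looksLikeInteger("-0"));
}
```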
Thanks for the idea. Adding null as a default value still causes the same compile problem.
Note that it’s a compile error, not a runtime error. I’m not sure whether it’s a comptime computation error in the JSON parsing code or a compiler regression, since it compiles and runs fine on pre-0.15.1. I just want to see whether people have noticed it before or whether it’s a real bug. If it is, I’ll file a bug report.
That was just an FYI in case you wanted that behaviour; I thought you might, since you were using optional types. It’s not at all relevant to the issue.
Looking again, this looks like a regression in the compiler. For some reason I thought it would know that switch branch wouldn’t be encountered, but it’s a runtime condition, so of course it’s going to evaluate it.
The compile error says a comptime number has a value too large to fit in an f64.
What it should do is recognise the condition will always be false and move on.
Do you need to have a std.json.Value for some reason?
If not, you can parse straight from the slice to your struct.
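Something along these lines (untested sketch; `Config` stands in for your struct):

```zig
const std = @import("std");

const Config = struct {
    count: ?i64 = null,
};

pub fn main() !void {
    const text =
        \\{"count": 42}
    ;

    // Go straight from the byte slice to the struct, no intermediate Value.
    const parsed = try std.json.parseFromSlice(Config, std.heap.page_allocator, text, .{});
    defer parsed.deinit();

    std.debug.print("count = {any}\n", .{parsed.value.count});
}
```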
Looking at it from a JavaScript perspective, it’s not surprising tbh, because all numbers in JS are f64 (ignoring the newish BigInt stuff, which isn’t supported by JSON.parse/stringify anyway).
So I would expect a spec-compliant JSON parser to parse all numbers into an f64 first and only then convert to whatever type the user wants. That would mean integers requiring more than 53 bits lose data, which in turn should trigger either a Zig compile error or at least a runtime error.
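For example, 2^53 and 2^53 + 1 collapse to the same f64 (quick sketch to illustrate the point):

```zig
const std = @import("std");

test "f64 cannot distinguish 2^53 from 2^53 + 1" {
    const a: u64 = 1 << 53; // 9007199254740992
    const b: u64 = (1 << 53) + 1; // 9007199254740993
    const fa: f64 = @floatFromInt(a);
    const fb: f64 = @floatFromInt(b); // rounds down to 9007199254740992.0
    try std.testing.expectEqual(fa, fb);
}
```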
The spec doesn’t place any limits on the precision of JSON numbers. I know this because of a bug in the DynamoDB console that caused silent data loss, back when I was still at AWS. DynamoDB, which uses JSON, doesn’t limit the precision; JSON.parse does. A certain part of the console was doing a round trip between strings and parsed JSON objects, silently corrupting user data. I don’t recall whether it was actually fixed, or whether they just started forbidding the use of that part of the UI if your data contained numbers outside the range of an f64, because I was mostly working with a different team soon after the bug came up.
If you want to be spec compliant and fully interoperate with everything, then IMO a compliant parser should attempt to parse the number as the type you are deserializing into, not as an f64. If your field is an i64, then it should parse as an i64 and error if it’s out of range. For safe round-tripping, the parser should also support just dumping the ASCII bytes that represent the number into a newtype’d slice instead of parsing it.
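Something like this, as a rough sketch (RawJsonNumber and its helpers are made-up names; wiring it into a parser would need a custom parse hook, which I’m hand-waving here):

```zig
const std = @import("std");

/// Hypothetical newtype: holds the untouched ASCII bytes of a JSON number so
/// nothing is rounded through an f64; the caller decides how to interpret it.
pub const RawJsonNumber = struct {
    bytes: []const u8,

    pub fn toInt(self: RawJsonNumber, comptime T: type) !T {
        return std.fmt.parseInt(T, self.bytes, 10);
    }

    pub fn toFloat(self: RawJsonNumber, comptime T: type) !T {
        return std.fmt.parseFloat(T, self.bytes);
    }
};

test RawJsonNumber {
    // 2^53 + 1 would not survive a trip through an f64, but the raw bytes do.
    const n = RawJsonNumber{ .bytes = "9007199254740993" };
    try std.testing.expectEqual(@as(i64, 9007199254740993), try n.toInt(i64));
}
```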
Serialization is another question that I haven’t really thought about all that much. One option would be to have two different modes: a .loose mode that allows numbers of any size to be serialized (as well as allowing other vagueness in the spec that may cause interoperability problems), and a .strict_ijson mode that errors if your data cannot be represented as I-JSON, so, among other things, you would get an error.Overflow or error.OutOfRange if a number can’t fit into an f64.
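As a rough sketch, the option could look something like this (names taken from this post; nothing that exists today):

```zig
/// Hypothetical serializer option mirroring the two modes described above.
pub const NumberMode = enum {
    /// Write numbers of any size verbatim; maximally expressive, but peers
    /// that go through an f64 (e.g. JavaScript) may not round-trip them.
    loose,
    /// Error (e.g. error.Overflow or error.OutOfRange) whenever a value
    /// cannot be represented under I-JSON (RFC 7493), i.e. exactly as an f64.
    strict_ijson,
};
```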