Compile error when parsing JSON into a struct with an i64 field

I hit a compile error when parsing JSON into a struct (.object) that has an i64 member. It only shows up on Zig 0.15.1; the same code compiled fine before 0.15.1.

The following program reproduces the problem. Comment and uncomment the f64 and i64 fields to compare.


const std = @import("std");

const Foo = struct {
    name:   []const u8,
    // num:    ?f64,   // The f64 type works fine.
    num:    ?i64,   // The i64 type causes compile problem on 0.15.1.
};

pub fn main() !void {
    var gpa = std.heap.GeneralPurposeAllocator(.{}){};
    const alloc = gpa.allocator();

    const json1 = "{ \"name\": \"foobar\", \"num\":10 }";
    const parsed1 = try std.json.parseFromSlice(std.json.Value, alloc, json1, .{});
    const value1 = parsed1.value;
    std.debug.print("value1: {any}\n", .{value1});

    const parsed2: std.json.Parsed(Foo) = try std.json.parseFromValue(Foo, alloc, value1, .{});
    const foo2 = parsed2.value;
    std.debug.print("foo2: {any}\n", .{foo2});
}

The compile error is:

C:\zig\lib\std\json\static.zig:570:44: error: type 'f64' cannot represent integer value '9223372036854775807'
                    if (f > std.math.maxInt(T)) return error.Overflow;
                            ~~~~~~~~~~~~~~~^~~
referenced by:
    innerParseFromValue__anon_33336: C:\zig\lib\std\json\static.zig:588:55
    innerParseFromValue__anon_32968: C:\zig\lib\std\json\static.zig:663:72
    parseFromValueLeaky__anon_30785: C:\zig\lib\std\json\static.zig:186:31
    parseFromValue__anon_30700: C:\zig\lib\std\json\static.zig:172:43
    main: json_test.zig:19:70
    callMain [inlined]: C:\zig\lib\std\start.zig:627:37
    WinStartup: C:\zig\lib\std\start.zig:443:53
    comptime: C:\zig\lib\std\start.zig:68:30
    start: C:\zig\lib\std\std.zig:110:27

For some reason, when parsing to Value it's parsing the number as a float instead of an int. Looking at the source, I don't see how that could happen: it shouldn't parse as a float unless it's -0 or has a ., e or E in it.

Try stepping through it with a debugger.

FYI, because it's an easy mistake to make:

Optional fields won't be parsed to null if they're missing from the JSON.

The parser will use the default field value if one is available; otherwise parsing errors out.

So to get that behaviour, you have to give the field a default value of null.
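
With the Foo struct from the original post, that looks like:

const Foo = struct {
    name: []const u8,
    // Defaulting to null lets the parser treat a missing "num" key as null
    // instead of raising a missing-field error.
    num: ?i64 = null,
};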

Thanks for the idea. Adding null as the default value still causes the same compile error.

Note that it's a compile error, not a runtime error. I'm not sure whether it's a comptime computation error in the JSON parsing code or a compiler regression, since the same code compiles and runs fine pre-0.15.1. I just want to see whether people have noticed it before, or whether it's a real bug. If it is, I'll file a bug report.

That was just an FYI in case you wanted that behaviour; I thought you might, since you're using optional types. It's not relevant to the issue itself.

Looking again, this looks like a regression in the compiler. For some reason I thought it would know that switch branch wouldn't be encountered, but it's a runtime condition, so of course it's going to analyze it.

The compile error is saying that a comptime integer can't be represented exactly as an f64: maxInt(i64) is 9223372036854775807, which needs more than f64's 53 bits of mantissa.

What it should do is carry out the comparison without forcing a lossy coercion of the integer, and move on.
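
A reduced sketch of the failing check, lifted from the error output above (with T substituted by i64):

const std = @import("std");

// On 0.15.1 this fails to compile: std.math.maxInt(i64) is the comptime_int
// 9223372036854775807, which has no exact f64 representation, so the
// coercion needed for the comparison is rejected at compile time.
fn checkRange(f: f64) !void {
    if (f > std.math.maxInt(i64)) return error.Overflow;
}

test "force analysis of checkRange" {
    try checkRange(10.0);
}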

Do you need to have a std.json.Value for some reason?
If not, you can parse straight from the slice to your struct.
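
For example, reusing json1, alloc, and the Foo struct from the original post:

const parsed = try std.json.parseFromSlice(Foo, alloc, json1, .{});
defer parsed.deinit();
std.debug.print("foo: {any}\n", .{parsed.value});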

Just checked: it's fixed on master.


Good to know. Thanks.

Looking at it from a JavaScript perspective, it's not surprising, to be honest: all numbers in JS are f64 (ignoring the newish BigInt stuff, which isn't supported by JSON.parse/JSON.stringify anyway).

So I would expect a spec-compliant JSON parser to parse all numbers into an f64 first and only then convert to whatever type the user wants. That would mean integers needing more than 53 bits lose data, which in turn should trigger either a Zig compile error or at least a runtime error.
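
A quick sketch of where that 53-bit limit bites:

const std = @import("std");

test "f64 cannot represent every 64-bit integer" {
    const n: i64 = 9007199254740993; // 2^53 + 1
    const f: f64 = @floatFromInt(n); // rounds to 9007199254740992.0
    try std.testing.expect(@as(i64, @intFromFloat(f)) != n);
}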

The spec doesn't place any limits on the precision of JSON numbers. I know this because of a silent data-loss bug in the DynamoDB console back when I was still at AWS. DynamoDB, which uses JSON, doesn't limit precision; JSON.parse does. A certain part of the console was doing a round trip between strings and parsed JSON objects, silently corrupting user data. I don't recall whether it was actually fixed, or whether they just started forbidding that part of the UI when your data contained numbers outside the range of an f64, because I was mostly working with a different team soon after the bug came up.

If you want to be spec-compliant and interoperate with everything, then IMO a parser should attempt to parse the number as the type you are deserializing into, not as an f64. If your field is an i64, it should parse as an i64 and error if the value is out of range. For safe round-tripping, the parser should also support just dumping the ASCII bytes that represent the number into a newtype'd slice instead of parsing it:

const Number = struct { []const u8 };
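
For what it's worth, std.json already has an escape hatch in that direction: if the target type declares a jsonParse function, the parser calls it instead of the default logic. A rough sketch from memory; the exact signature and token names should be double-checked against std/json before relying on it:

const std = @import("std");

const Number = struct {
    raw: []const u8,

    pub fn jsonParse(
        allocator: std.mem.Allocator,
        source: anytype,
        options: std.json.ParseOptions,
    ) !Number {
        _ = options;
        // Take the raw token bytes without interpreting them as i64/f64.
        switch (try source.nextAlloc(allocator, .alloc_always)) {
            .allocated_number => |bytes| return .{ .raw = bytes },
            else => return error.UnexpectedToken,
        }
    }
};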

Serialization is another question that I haven't really thought about all that much. One option would be to have two modes: a .loose mode that allows numbers of any size to be serialized (as well as allowing other vagueness in the spec that may cause interoperability problems), and a .strict_ijson mode that errors if your data cannot be represented as I-JSON, so, among other things, you would get an error.Overflow or error.OutOfRange if a number can't fit into an f64.
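
Roughly like this, where the mode names and the check are purely illustrative, not an existing std.json API:

const NumberMode = enum { loose, strict_ijson };

fn checkSerializable(mode: NumberMode, n: i64) !void {
    switch (mode) {
        // .loose: emit the digits as-is, whatever the magnitude.
        .loose => {},
        // .strict_ijson: I-JSON numbers must survive an f64 round trip.
        .strict_ijson => {
            const f: f64 = @floatFromInt(n);
            // Widen to i128 so converting the rounded float back can't
            // itself overflow an i64.
            if (@as(i128, @intFromFloat(f)) != n) return error.Overflow;
        },
    }
}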