Probably a rookie mistake: mismatched type error in zig-bincode

Hello Zig forum, I am fairly new to Zig, and at the moment I'm trying to write a small chat client-server project to check out the language.

For this I am using a binary format known as bincode, and fortunately an implementation exists in Zig, but 0.12.0 seems to be problematic for this library.

A pull request on that repository does fix a fair share of issues, but given a struct that has a []const u8 field, the compiler is not happy:

bincode.zig:202:62: error: expected type '@typeInfo(@typeInfo(@TypeOf(bincode.read__anon_4260)).Fn.return_type.?).ErrorUnion.error_set!usize', found 'u64'
                            .unsigned => std.math.cast(U, z) orelse return error.FailedToCastZZ,

Struct example:

pub const MyStruct = struct {
    somefield: MyTestEnum,
    spec: []const u8,
};

I have tried 0.11.0, and there the issue does not occur. I have been scratching my head for the past 4 days, so I came here hoping for a different perspective.

Hey, welcome to Ziggit :slight_smile:

That error message happens when you have not unwrapped an error union.

const x = allocator.alloc(u8, 100); // x is an error union: Allocator.Error![]u8
x[0]; // will give you that kind of error

const x = try allocator.alloc(u8, 100); // try unwraps the error union
x[0]; // will be fine

Can you post the code that’s causing you to have that problem?


Adding to what @AndrewCodeDev posted: this is also common when you use a try inside a function that does not return an error. In that case you have to catch the error and handle it, as in the sketch below.
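
A minimal sketch (the reader parameter and the fallback value are just for illustration):

fn readOneByte(reader: anytype) u8 {
    // `try reader.readByte()` would not compile here: the function's
    // return type is a plain u8, so there is no error set to propagate into.
    return reader.readByte() catch 0; // handle the error locally instead
}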

const Mint = struct {
    authority: bincode.Option([32]u8),
    supply: u64,
    decimals: []count u8,
    is_initialized: bool,
    freeze_authority: bincode.Option([32]u8),
};

var buffer = std.ArrayList(u8).init(testing.allocator);
defer buffer.deinit();

const expected: Mint = .{
    .authority = bincode.Option([32]u8).from([_]u8{ 1, 2, 3, 4 } ** 8),
    .supply = 1,
    .decimals = "1234",
    .is_initialized = true,
    .freeze_authority = bincode.Option([32]u8).from([_]u8{ 5, 6, 7, 8 } ** 8),
};

try bincode.write(buffer.writer(), expected, .{});

const actual = try bincode.readFromSlice(testing.allocator, Mint, buffer.items, .{});
defer bincode.readFree(testing.allocator, actual);
This code from the test scope still triggers the issue. I wish I could explain it better, but can you set up a similar environment and test it out?

Your error is coming from bincode.zig – I don’t believe that’s a standard library file?

Can you post the code surrounding:

.unsigned => std.math.cast(U, z) orelse return error.FailedToCastZZ,

If that’s someone else’s library, we can probably find the issue and submit a PR to get it fixed (or maybe it has been fixed?)

bincode.zig is the library I import locally from the GitHub link I mentioned earlier in the OP.

You can fetch it here: https://raw.githubusercontent.com/lithdew/bincode-zig/ef3056b897bcc451c922407144620f98ba5f7caf/bincode.zig

Ah, gotcha - yeah, that’s quite a function.

Reading a slice from a stream of bytes seems… odd… to me. Slices are pointers so I’m not sure what you’d be getting from that operation. Philosophically speaking, it seems like prohibiting slices makes sense.

I’ll have to look at it more a bit later, but if you remove the slice, it works?

So far only the slices are giving me a headache; 0.11.0 works, though.

I’d like to know if that’s actually a bug, though. It compiles in 0.11.0, but is that actually correct?

Since a slice is supposed to tell me “go here to get data” (again, it’s just a pointer), you’d be reading its length and address from a stream of bytes.

Now, I can clearly see in your case that we can guarantee that the address will be valid, but that isn’t going to be true in general.

On this line, you are setting the slice to static string data:

.decimals = "1234",

However, that slice can hold any generic address. What does it mean to read that off a series of bytes in the context of a larger program where we don’t know if the address and length that is read is accurate?
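
To make that concrete, a []u8 is, roughly speaking, nothing more than a pointer and a length (a sketch, not the compiler’s actual definition):

// Serializing these two words verbatim would bake in an address that
// means nothing to any other process or run of the program.
const FakeSlice = extern struct {
    ptr: [*]u8,
    len: usize,
};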

I don’t know; I’m not sure. The thing that’s troubling me is that the original author of this library last updated it when 0.11.0 was the latest release. I managed to compile it, like you said, on 0.11.0.

Okay - I think you need the answer to that question first here (I could be wrong), but here’s my thinking…

If you serialize something like decimals: [N]u8, decimal_count: u32 or something like that, then you can always guarantee you know where the decimals are and you know how many you have.
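
Something like this sketch, where the field names and the fixed size N = 16 are made up for illustration:

const Decimals = struct {
    data: [16]u8, // fixed storage: no pointer to serialize
    count: u32, // how many entries of `data` are actually in use

    fn inUse(self: *const Decimals) []const u8 {
        return self.data[0..self.count];
    }
};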

You may have been getting a false positive on your last test if you were setting it up the way it happens above. That’s guaranteed to work. But if I just read a stream of bytes off the disk and it tells me that I have string data at address x with length n, that seems like a silent setup for a fatal error.

Again, I’d have to look at their code more later.

The size is not always known.

As in not known at all? As in “it could be 3, or it could be greater than 10 million” or is it within some expressible range? If it is, you can set N to be the size of your range and say how many of them are in use.

If that’s the case, then I can’t see how that would work. You’d need to set the address of the slice to valid memory after you have deserialized it because it could just be pointing to garbage memory.

Either way, the error message is essentially saying this:

bincode.zig:202:62: error: expected type '@typeInfo(@typeInfo(@TypeOf(bincode.read__anon_4260)).Fn.return_type.?).ErrorUnion.error_set!usize', found 'u64'

This part:

@typeInfo(@TypeOf(bincode.read__anon_4260)).Fn.return_type.?

Refers to the return type of the function. Since it’s a generic function, you get read__anon_4260, which is its monomorphized handle. The return_type.? unwraps that return type, because this part…

'@typeInfo(...).ErrorUnion.error_set!usize',

Is extracting the error set that the function can return and joining it to usize. So something is failing because the compiler expects “any error from read, or a usize” and it’s getting a plain u64 instead.
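
You can reproduce the same shape of error with a tiny standalone sketch (a hypothetical function, not the library’s code). On a target where usize is narrower than u64, a bare u64 cannot coerce into the !usize return type:

const std = @import("std");

fn readLen(z: u64) error{Overflow}!usize {
    // return z; // on a 32-bit target: expected 'error{Overflow}!usize', found 'u64'
    return std.math.cast(usize, z) orelse error.Overflow; // explicit cast works everywhere
}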

As in a slice with an undefined size, like a []u8.

Okay, so I dug into the code a bit more and I’m narrowing it down… for anyone following along at home, here’s the branch of the read function we’re talking about:

.Pointer => |info| {
    switch (info.size) {
        .One => {
            // single-item pointer: allocate one child value and read into it
            const data = try gpa.create(info.child);
            errdefer gpa.destroy(data);
            data.* = try bincode.read(gpa, info.child, reader, params);
            return data;
        },
        .Slice => {
            // slice: read the element count first (note: read as a usize),
            // then allocate and read each element in turn
            const entries = try gpa.alloc(info.child, try bincode.read(gpa, usize, reader, params));
            errdefer gpa.free(entries);
            for (entries) |*entry| {
                entry.* = try bincode.read(gpa, info.child, reader, params);
            }
            return entries;
        },
        else => {},
    }
},

That gets unrolled from a Struct branch that walks through the field types.

So what’s happening here is that write stores the slice’s elements in place, and then read allocates a new slice and copies them back one at a time. So I was half right - they do take that into account: values are read off the bytes by reading the child type for each element. That’s what handles the slice issue.
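
In other words, the wire format is roughly “length, then elements”. A minimal sketch of that scheme for []u8 (not the library’s exact code):

const std = @import("std");

// write: length prefix followed by the element bytes
fn writeSlice(writer: anytype, items: []const u8) !void {
    try writer.writeInt(u64, items.len, .little);
    try writer.writeAll(items);
}

// read: recover the length, allocate fresh memory, then fill in the elements
fn readSlice(gpa: std.mem.Allocator, reader: anytype) ![]u8 {
    const len = try reader.readInt(u64, .little);
    const items = try gpa.alloc(u8, std.math.cast(usize, len) orelse return error.Overflow);
    errdefer gpa.free(items);
    try reader.readNoEof(items);
    return items;
}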

I’ll download it tonight and play around with it to see what’s causing your error.

Thanks for your patience. When I was scrolling through the library I also stumbled upon that portion of the lib, but I wasn’t exactly sure…

No problem - 4 days seems like a painful amount of time. Someone may snipe this answer before I get around to downloading it, but I’ll help you get an answer by tonight :slight_smile:

Are you compiling for a target where usize is not u64?
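
If so, that would explain the mismatch: std.math.cast(U, z) produces a u64 where the function’s return type wants a usize. A hypothetical sketch of the kind of change that would line the types up (not necessarily the library’s actual fix):

// cast straight to usize so this prong’s result matches the !usize
// return type even on targets where usize is narrower than u64
.unsigned => std.math.cast(usize, z) orelse return error.FailedToCastZZ,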


I also want to point out that decimals should be []const u8 - const, not count.