Stuck with std.fmt.parseInt error

Hi! I’m trying to learn Zig by building an ANSI color code library.

I’m loving the language, but I got stuck trying to convert a hex color code to an RGB color code, specifically when using std.fmt.parseInt to convert the hex code to an integer.

Can someone please help me understand why I keep getting an error{Overflow,InvalidCharacter} in the code below?

inline fn parse(comptime attribute: anytype) []const u8 {
    switch (@TypeOf(attribute)) {
    [...]
    Color.Hex => {
        const hex_code: []const u8 = if (attribute.code[0] == '#') attribute.code[1..] else attribute.code;
        const hex_to_dec: u32 = try std.fmt.parseInt(u32, hex_code, 16) catch @compileError("Error parsing hex code " ++ attribute.code);
    [...]
}

The Color.Hex is a struct that has the hex color code as one of its fields and the type of the color (background or foreground) as the other.

When I call this function in a test or in a script, I get the following error:

src/uniduni_t.zig:72:41: error: expected type '[]const u8', found 'error{Overflow,InvalidCharacter}'

const hex_to_dec: u32 = try std.fmt.parseInt(u32, hex_code, 16) catch @compileError("Error parsing hex code " ++ attribute.code);

I’m no IT professional, so I’m really proud of what I accomplished, despite the simplicity of the library. If you want to look at the code, it’s here: fwinter/uniduni_t - Codeberg.org (not updated with my attempt to convert hex to RGB).

Thanks in advance!

If you print hex_code after declaring it, what do you get?

Wait, the error is that you’re using both try and catch. If you use try, the error is returned from the function. If you use catch, you handle it on the spot. So you use one or the other, but not both at the same time.
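For illustration, a minimal sketch of the two forms (hypothetical helpers, not your library’s code):

const std = @import("std");

// try: the error propagates to the caller, so the return type must be an error union (note the !)
fn parseWithTry(hex: []const u8) !u32 {
    return try std.fmt.parseInt(u32, hex, 16);
}

// catch: the error is handled on the spot, so the return type can stay a plain u32
fn parseWithCatch(hex: []const u8) u32 {
    return std.fmt.parseInt(u32, hex, 16) catch 0; // fall back to 0 on failure
}

test "try vs catch" {
    try std.testing.expectEqual(@as(u32, 0xffffff), try parseWithTry("ffffff"));
    try std.testing.expectEqual(@as(u32, 0), parseWithCatch("zz"));
}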

Oh, and welcome to Ziggit! :smile_cat:


Thanks for the welcoming message. Are you the YouTuber who made Zig in Depth? That course is helping me a lot.

Anyway, thanks for the response! I updated the code but I’m still getting an error, although the code prints the message I put in the @compileError function:

src/uniduni_t.zig:72:101: error: expected type '[]const u8', found 'error{Overflow,InvalidCharacter}'
const hex_to_dec: u32 = std.fmt.parseInt(u32, hex_code, 16) catch |e| @compileError(e);
^
src/uniduni_t.zig:897:34: note: called from here
const parsed_code = parse(Color.Hex{
~~~~~^
src/uniduni_t.zig:912:42: note: called from here
var actual = Uniduni_t.init().hex("#ffffff", .foreground);

The updated code is the following:

    inline fn parse(comptime attribute: anytype) []const u8 {
        switch (@TypeOf(attribute)) {
            [...]
            Color.Hex => {
                const hex_code: []const u8 = if (attribute.code[0] == '#') attribute.code[1..] else attribute.code;
                std.debug.print("{s}: {any}\n{s}: {any}\n", .{ hex_code, @TypeOf(hex_code), attribute.code, @TypeOf(attribute.code) });
                const hex_to_dec: u32 = std.fmt.parseInt(u32, hex_code, 16) catch |e| @compileError(e);
                std.debug.print("{any}\n", .{hex_to_dec});

Thanks for all the help!

Yeah that’s me! :^) Keep in mind that Zig in Depth is strictly focused on Zig version 0.11, so if you’re trying things in 0.12 or 0.13-dev, they may not work. I’m making a new series, Zig Master, for the most recent versions.

About your code, what’s the value of hex_code? It can give us clues as to why it’s not parsing.

Edit: If the @compileError is not letting you see the debug print output, you can remove it temporarily.
Edit2: Or you can print the value at compile time with @compileLog.
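For reference, here’s a minimal sketch of that kind of comptime debugging (standalone, not your actual code). Keep in mind that @compileLog itself reports a "found compile log statement" error, so it’s strictly a temporary debugging aid:

const code: []const u8 = "#ffffff";
// container-level declarations are evaluated at comptime, so this mirrors the stripping logic
const hex_code = if (code[0] == '#') code[1..] else code;

comptime {
    // prints the comptime value of hex_code in the compiler output
    @compileLog(hex_code);
}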

Wow. I’m a fan of your YouTube channel. Congratulations on your work! I’m already looking forward to Zig Master.

I started developing this library in version 0.11 and I’m currently on version 0.12, but I’ve been stuck with this error since version 0.11.

The @compileLog output is the following:

Compile Log Output:
@as(*const [22:0]u8, "{s}: {any}\n{s}: {any}\n"), @as(struct{comptime []const u8 = "#ffffff"[0..6], comptime type = []const u8, comptime []const u8 = "#ffffff"[0..7], comptime type = []const u8}, .{ "#ffffff"[0..6], []const u8, "#ffffff"[0..7], []const u8 })

Thanks.

So from what I see, hex_code still starts with # and that’s why parseInt is failing. It’s like the if (attribute.code[0] == '#') is always false.

You are right about the if statement not being processed. That’s another error I will try to deal with.

But the error we’ve been discussing still occurs even if I pass a hex code without the '#' char as a parameter. Look:

src/uniduni_t.zig:72:83: error: Error parsing hex code a7c080
const hex_to_dec: u32 = std.fmt.parseInt(u32, hex_code, 16) catch @compileError("Error parsing hex code " ++ attribute.code);
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

src/uniduni_t.zig:31:46: note: called from here
                    const parsed_code = parse(value);
                                        ~~~~~^~~~~~~
src/example6.zig:9:39: note: called from here
    const first = Uniduni_t.init().add(.{ Color.Hex.fg("a7c080"), Color.Hex.bg("2d353b") }); // First way
                  ~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
referenced by:
    callMain: /usr/lib/zig/std/start.zig:511:32
    callMainWithArgs: /usr/lib/zig/std/start.zig:469:12
    remaining reference traces hidden; use '-freference-trace' to see all reference traces

Compile Log Output:
@as(*const [22:0]u8, "{s}: {any}\n{s}: {any}\n"), @as(struct{comptime []const u8 = "a7c080"[0..6], comptime type = []const u8, comptime []const u8 = "a7c080"[0..6], comptime type = []const u8}, .{ "a7c080"[0..6], []const u8, "a7c080"[0..6], []const u8 })

EDIT1: It’s curious that when I pass the #ffffff hex code, the compile log shows "#ffffff"[0..6] as the value of hex_code, since the if statement is initializing the hex_code variable as "#ffffff"[1..].

I checked and the if statement is working... I don’t know why this is happening, but I guess the error is related to the one we’ve been discussing.

I’ll study the code again to see if I can guess what is wrong.

EDIT2: I ran only this part of the code, and I’m still getting an 'error{Overflow,InvalidCharacter}' as the return:

Color.Hex => {
    const hex_code: []const u8 = "ffffff"; //if (attribute.code[0] == '#') attribute.code[0..6] else attribute.code;
    @compileLog("{s}: {any}\n{s}: {any}\n", .{ hex_code, @TypeOf(hex_code), attribute.code, @TypeOf(attribute.code) });
    const hex_to_dec: u32 = std.fmt.parseInt(u32, hex_code, 16) catch @compileError("Error parsing hex code " ++ hex_code);
    @compileLog("{any}\n", .{hex_to_dec});
},

src/uniduni_t.zig:72:83: error: Error parsing hex code ffffff
const hex_to_dec: u32 = std.fmt.parseInt(u32, hex_code, 16) catch @compileError("Error parsing hex code " ++ hex_code);
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

[...]
Compile Log Output:
@as(*const [22:0]u8, "{s}: {any}\n{s}: {any}\n"), @as(struct{comptime []const u8 = &.{ 102, 102, 102, 102, 102, 102 }[0..6], comptime type = []const u8, comptime []const u8 = "#ffffff"[0..7], comptime type = []const u8}, .{ &.{ 102, 102, 102, 102, 102, 102 }[0..6], []const u8, "#ffffff"[0..7], []const u8 })

EDIT3: Even if I put a string literal as the buffer parameter of the parseInt function, I still get an error:

[...]
const hex_to_dec: u32 = std.fmt.parseInt(u32, "ffffff", 16) catch @compileError("Error parsing hex code " ++ hex_code);

Hey, I got curious and tried playing with the parseInt function, running on v0.12.0. This works perfectly fine, as do the tests in fmt.zig of the std lib. Can you run those?

const std = @import("std");
const expect = std.testing.expect;

test "parse hex one byte" {
    const hex = "ff";
    const expected = 255;
    const result = try std.fmt.parseInt(u8, hex, 16);
    try expect(result == expected);
}

test "parse hex two bytes" {
    const hex = "ffff";
    const expected = 65535;
    const result = try std.fmt.parseInt(u16, hex, 16);
    try expect(result == expected);
}

test "parse hex three bytes" {
    const hex = "ffffff";
    const expected = 16777215;
    const result = try std.fmt.parseInt(u24, hex, 16);
    try expect(result == expected);
}

test "parse hex three bytes (explicit type)" {
    const hex: []const u8 = "ffffff";
    const expected = 16777215;
    const result = try std.fmt.parseInt(u24, hex, 16);
    try expect(result == expected);
}

test "parse hex three bytes (from slice)" {
    const hex_code = "#ffffff";
    const hex = if (hex_code[0] == '#') hex_code[1..] else hex_code;
    const expected = 16777215;
    const result = try std.fmt.parseInt(u24, hex, 16);
    try expect(result == expected);
}

test "parse hex three bytes (from pointer)" {
    const bytes = "#ffffff";
    const hex_code = &bytes;
    const hex = if (hex_code.*[0] == '#') hex_code.*[1..] else hex_code.*;
    const expected = 16777215;
    const result = try std.fmt.parseInt(u24, hex, 16);
    try expect(result == expected);
}

What if you use catch @panic("Woops");? I suspect the @compileError is being evaluated at compile time no matter what. Only a compile-time-known conditional can prevent an @compileError from being triggered.
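Something along these lines (a minimal standalone sketch, not your library’s parse function):

const std = @import("std");

fn parseHex(hex_code: []const u8) u32 {
    // the failure path is handled at runtime, so nothing forces a compile error
    return std.fmt.parseInt(u32, hex_code, 16) catch @panic("Woops");
}

test "catch @panic defers the failure to runtime" {
    try std.testing.expectEqual(@as(u32, 0xa7c080), parseHex("a7c080"));
}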


Yes, they ran perfectly fine. I’ve tested the parseInt function in another file and it worked like a charm, so I know it’s something I’m doing wrong in my code :slight_smile:

That’s it. The code ran!

Thanks a lot for all the help!


I’ll try to explain all the errors for a deeper understanding of what’s going on. At every step you get compile errors for different reasons.

inline fn parse(comptime attribute: anytype) []const u8 {
    // ...
    const hex_to_dec: u32 = try std.fmt.parseInt(u32, hex_code, 16) catch @compileError("Error parsing hex code " ++ attribute.code);
    // ...
}

It is useful to read the full error message:

src\main.zig:14:32: error: expected type '[]const u8', found 'error{Overflow,InvalidCharacter}'
            const hex_to_dec = try std.fmt.parseInt(u32, hex_code, 16) catch @compileError("Error parsing hex code " ++ attribute.code);
                               ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
src\main.zig:9:46: note: function cannot return an error
inline fn parse(comptime attribute: anytype) []const u8 {
                                             ^~~~~~~~~~

In this case the compilation stops even before reaching the catch @compileError part. try std.fmt.parseInt(u32, hex_code, 16) is syntactic sugar for

std.fmt.parseInt(u32, hex_code, 16) catch |e| return e

So it tries to return an error of type error{Overflow,InvalidCharacter}, but parse doesn’t return errors; its return type is []const u8.

Of course, if you change the return type of parse to ![]const u8, you’ll get a different compilation error, but that’s too off-topic. I’m just trying to explain the errors you see.

src/uniduni_t.zig:72:101: error: expected type '[]const u8', found 'error{Overflow,InvalidCharacter}'
const hex_to_dec: u32 = std.fmt.parseInt(u32, hex_code, 16) catch |e| @compileError(e);
^

You see this error because @compileError takes one parameter, comptime msg: []const u8. It expects a string, but you give it e, which is an error of type error{Overflow,InvalidCharacter}.

If you want to convert an error to a string, use @errorName(e). Again, it doesn’t resolve your core problem, because changing e to @errorName(e) still gives another error. I won’t explain that one to stay on topic.
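For example (a small sketch, independent of your code):

const std = @import("std");

test "@errorName turns an error value into a string" {
    const e: anyerror = error.InvalidCharacter;
    try std.testing.expectEqualStrings("InvalidCharacter", @errorName(e));
}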

const hex_to_dec: u32 = std.fmt.parseInt(u32, "ffffff", 16) catch @compileError("Error parsing hex code " ++ hex_code);

This @compileError always fires, as @dude_the_builder said. I’ll try to explain the reason.

A @compileError is only skipped if the branch it occurs in is never compiled by the compiler. For example,

if (false) {
    @compileError("This branch is not compiled");
}

This compiles without errors, because false is comptime-known, and the compiler discards such branches.

Similarly, it works with all branching constructs, including catch.

@as(error{E}!i32, 42) catch @compileError("This branch is not compiled");

@as(error{E}!i32, 42) is a comptime-known value, so the compiler discards the catch branch at comptime.

In your case, even though all arguments of std.fmt.parseInt(u32, "ffffff", 16) are comptime-known, the function is still executed at runtime, because that’s the default behavior in Zig. A function call is always executed at runtime, except when it happens in a comptime-only context (for example, when initializing container-level constants, or when you explicitly call the function in a comptime block/expression).

Since the call is deferred to runtime, it’s unknown at comptime what exactly is returned. At comptime the compiler just knows that std.fmt.parseInt(u32, "ffffff", 16) returns a value of type error{Overflow,InvalidCharacter}!u32. Hence, it has to compile all branches, including your catch. That’s why this @compileError is triggered.

@dude_the_builder’s suggestion to change this @compileError to @panic("Woops") works, because it lets the code compile successfully and defers the error (if it occurs) to runtime.

But there is a but. I don’t see your whole parse code and can’t say whether making the attribute parameter comptime is reasonable, but let’s assume you really need it to be known at comptime. In this case, you can force evaluation of parseInt at comptime, thus allowing @compileError again.

const hex_to_dec = comptime std.fmt.parseInt(u32, hex_code, 16) catch @compileError("Error parsing hex code " ++ attribute.code);

This is possible because u32, hex_code, and 16 are all comptime-known values. Note that you don’t have to force hex_code to be evaluated at comptime, because it already is: in this line no function is called, and the compiler evaluates everything at comptime automatically (again, because attribute.code is comptime-known).

const hex_code = if (attribute.code[0] == '#') attribute.code[1..] else attribute.code;

This approach is better than the version with @panic, because if you have an error, you know about it at comptime instead of runtime. It’s always good to move your errors to comptime where possible.

But there is another but. You should consider whether you really need the attribute parameter to be comptime. Currently your function is locked to comptime-known arguments only. In the code you’ve shown, there is nothing that inherently has to be locked to comptime. You could write a function that can be executed with both comptime and runtime arguments. This is more useful, because the code becomes reusable in more contexts. In that case the caller may decide whether they want to execute the function at runtime or at comptime, where possible.

Note that by having an attribute: anytype parameter your function is still considered generic. The type of attribute is still comptime-known, but the value of attribute isn’t. comptime attribute: anytype makes the value comptime-known too.
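A small sketch of the difference (hypothetical functions, just for illustration):

const std = @import("std");

fn describe(attribute: anytype) []const u8 {
    // only the type of attribute is comptime-known here; its value may be runtime-only
    return @typeName(@TypeOf(attribute));
}

fn describeComptime(comptime attribute: anytype) []const u8 {
    // the value itself is comptime-known, so it can feed comptime-only machinery
    return std.fmt.comptimePrint("{any}", .{attribute});
}

test "anytype vs comptime anytype" {
    var runtime_value: u32 = 7;
    _ = &runtime_value;
    try std.testing.expectEqualStrings("u32", describe(runtime_value));
    try std.testing.expectEqualStrings("7", describeComptime(7));
}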

In the case when attribute is not comptime, you’d have to try parseInt or catch the error from it with @panic. The try approach is better, since it gives the caller more options (the caller may somehow handle the error and continue execution). And if your function is called in a comptime context, you can still catch with @compileError.

Here is a reproducible example. Please note that when you change hex_comptime to, for example, #zfffff, you’ll get a comptime error. And if you change hex_runtime to #zfffff, the compilation will be successful, but you’ll get a runtime error.

const std = @import("std");

const Color = struct {
    const Hex = struct {
        code: []const u8
    };
};

inline fn parse(attribute: anytype) ![]const u8 {
    switch (@TypeOf(attribute)) {
        Color.Hex => {
            const hex_code = if (attribute.code[0] == '#') attribute.code[1..] else attribute.code;
            const hex_to_dec = try std.fmt.parseInt(u32, hex_code, 16);
            _ = hex_to_dec;
            return "";
        },
        else => {
            @compileError("Wrong type");
        },
    }
}

pub fn main() !void {
    const hex_comptime: Color.Hex = .{ .code = "#ffffff" };
    _ = comptime parse(hex_comptime) catch |e| @compileError("Error " ++ @errorName(e) ++ " parsing hex code: " ++ hex_comptime.code);
    var hex_runtime: Color.Hex = .{ .code = "#ffffff" };
    _ = try parse(hex_runtime);
    _ = &hex_runtime;
}

Sorry for the wall of text.


Why is try f() catch legal? Whichever one binds first sucks in the error, and the other is left operating on something that is no longer an error union. It would be like trying to try or catch a non-error-returning function (which is a compile-time error).

Seems like a type checking bug.


No, thank you for the master class!


I can’t test this at the moment, but I believe it isn’t legal. That’s why I mentioned that you have to do one or the other but not both at the same time.


Is it? try f(); is just syntactic sugar for f() catch |err| return err;
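A quick sketch of that equivalence (hypothetical helpers):

const std = @import("std");

fn withTry(s: []const u8) !u32 {
    return try std.fmt.parseInt(u32, s, 10);
}

fn withCatchReturn(s: []const u8) !u32 {
    return std.fmt.parseInt(u32, s, 10) catch |err| return err;
}

test "try is sugar for catch-return" {
    try std.testing.expectError(error.InvalidCharacter, withTry("x"));
    try std.testing.expectError(error.InvalidCharacter, withCatchReturn("x"));
}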


it does not:

Sorry, I misunderstood you.


@Tosti, there’s no need to apologize for the long message.

Thank you so much for your lesson. It helped me to understand some questions I was trying to answer.

My code needs to be evaluated at compile time, so I was trying to find a way to make it work.

I am already grateful for all the help I got here to discover the bug in my code. Ziggit has a really good community and I’m thankful for all of your effort in explaining things.


Strictly speaking, it may be legal. It depends on the return type of f. It turns out, you can have an error union with a payload which itself is an error union. This code compiles.

fn myTest() !void {
    // var to force the compilation of all branches
    var t: error{A}!error{B}!void = {};
    _ = &t;
    _ = try t catch |e| std.debug.print("{}\n", .{ e });
}

More than that, this also compiles.

fn myTest() !void {
    var t: error{A}!error{B}!error{C}!void = {};
    _ = &t;
    _ = try t catch |e1| std.debug.print("{}\n", .{ e1 }) catch |e2| std.debug.print("{}\n", .{ e2 });
}

You may have it as deep as you want.

Some @compileLog-ing helped me reveal the precedence. It’s interpreted as (parentheses added):

var t: (error{A}!(error{B}!(error{C}!void))) = {};
_ = (((try t) catch |e1| std.debug.print("{}\n", .{ e1 })) catch |e2| std.debug.print("{}\n", .{ e2 }));

Unfortunately, I haven’t found a way to assign error.B or error.C to t. Assigning error.A works.

var t1: error{A}!error{B}!error{C}!void = error.A; // compiles
var t2: error{A}!error{B}!error{C}!void = error.B; // error: expected type 'error{A}', found type 'error{B}'
var t3: error{A}!error{B}!error{C}!void = error.C; // error: expected type 'error{A}', found type 'error{C}'

By the way, error{A}!error{B} is an illegal type.

error: error union with payload of error set type 'error{B}' not allowed

I’m not sure that something like error{A}!error{B}!void is useful in actual code, especially since assigning error.B to a variable of this type doesn’t work. But at least the type can exist in the current version of Zig.
