Why does a value have to be `var` to be `deinit`-able?

Consider the following code:

    const std = @import("std");

    fn createAHashMap(allocator: std.mem.Allocator) std.AutoHashMap(u8, u32) {
        var response_map = std.AutoHashMap(u8, u32).init(allocator);
        response_map.put('a', 1) catch unreachable;
        response_map.put('b', 2) catch unreachable;
        return response_map;
    }

    pub fn main() !void {
        var gpa = std.heap.GeneralPurposeAllocator(.{}){};
        defer _ = gpa.deinit();
        const allocator = gpa.allocator();

        const my_map = createAHashMap(allocator);
        defer my_map.deinit();
        std.debug.print("Value of 'a' is {}\n", .{my_map.get('a').?});
    }

I declared `my_map` as `const` because, to my mind, in the context of `main` it is const: it’s not being mutated, and none of its keys or values change. However, attempting to compile this code gives:

    scratch.zig:32:17: error: expected type '*hash_map.HashMap(u8,u32,hash_map.AutoContext(u8),80)', found '*const hash_map.HashMap(u8,u32,hash_map.AutoContext(u8),80)'
        defer my_map.deinit();

I’m curious about this design decision. Is `deinit`-ing seen as a type of mutation? I guess that’s technically true, but it doesn’t feel like the same thing as adding or removing a value to/from a map. Is this a philosophical choice, or is there a technical reason why it had to be true? Or is there a category of bug that this saves programmers from writing?

        pub fn deinit(self: *Self) void {
            self.unmanaged.deinit(self.allocator);
            self.* = undefined;
        }

Looks like it’s purely to poison the memory (`self.* = undefined`) in Debug and ReleaseSafe modes
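
To see what that poisoning buys you, here’s a minimal sketch (assuming the same `std.AutoHashMap` API as in the question; whether the bad access actually traps depends on the build mode):

    const std = @import("std");

    pub fn main() !void {
        var gpa = std.heap.GeneralPurposeAllocator(.{}){};
        defer _ = gpa.deinit();

        var map = std.AutoHashMap(u8, u32).init(gpa.allocator());
        try map.put('a', 1);
        map.deinit(); // self.* = undefined fills the struct with 0xaa in Debug

        // Any later use now reads poisoned memory rather than stale but
        // plausible-looking data, so Debug/ReleaseSafe builds are far more
        // likely to crash right here instead of silently corrupting state.
        _ = map.get('a'); // use-after-deinit: illegal, but much easier to catch
    }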


Because `deinit` takes a non-`const` pointer: the method call `my_map.deinit()` implicitly passes `&my_map`, and taking the address of a `const` binding only gives you a `*const` pointer, which can’t coerce to the `*Self` that `deinit` wants. As @biom4st3r said, it does this to poison the memory in Debug and ReleaseSafe builds, to make use-after-frees easier to catch.
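
So the fix for the code in the question is a one-word change (a sketch under the same assumptions as that code):

    // Declaring the binding as var makes &my_map a *HashMap rather than a
    // *const HashMap, which is what deinit(self: *Self) requires.
    var my_map = createAHashMap(allocator);
    defer my_map.deinit();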
