What makes "ban returning pointer to stack memory" difficult?

By the same token, most uncomplicated bugs of this type are shallow. I’ve certainly returned a pointer into a stack frame a few times, but the next thing I try to do with that code exposes the mistake.
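
Something like this, say (a made-up minimal example, not code from anyone’s project):

const std = @import("std");

fn greeting(name: []const u8) []const u8 {
    var buf: [64]u8 = undefined;
    // The returned slice points into buf, which lives in this stack frame.
    return std.fmt.bufPrint(&buf, "hello, {s}", .{name}) catch unreachable;
}

pub fn main() void {
    // The very first use of the result reads dead stack memory, which is
    // usually enough to surface the mistake immediately in testing.
    std.debug.print("{s}\n", .{greeting("zig")});
}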

We do get new users here who are confused by how memory works in Zig and don’t understand what went wrong. But we shouldn’t optimize for that case either.

I believe calling this just ‘costly’ is, subjectively, an underestimate, but there’s some space in the correctness / expense map which might be acceptable.

I have a concern with this which perhaps isn’t obvious. You mentioned GCC and Clang, which can catch some bugs like this and issue warnings. Zig doesn’t have warnings.

Which raises a problem with catching some-but-not-all of something. Namely, the project has a goal to standardize the language, so that there’s a basis for multiple implementations. This requires that any given text file either compiles or it doesn’t, and that a program which does compile behaves in substantially the same way everywhere.

This is a bad fit for a project like you’re suggesting, because the only way to get it is to draw an arbitrary and complex-to-describe line: a conforming Zig compiler must catch every case of this bug on one side of the line, and must permit every case on the other. Improving the analysis further would be a nonstandard extension. The conformance suite would have to include bugs which must not be caught, or an implementation could easily refuse to compile programs which the standard allows.

I’d like Zig to have a whole pluggable validation system, with some validators included with the compiler and others third party. This would be a great addition to something like that. But I don’t think it should be part of the specification for the compiler, and that means that the compiler itself should not be doing it.

Such bugs can be hard to detect if the pointer returned is high up on the stack. Consider the following:

const std = @import("std");

fn foo(i: usize) []const u8 {
    if (i == 0) {
        // buffer lives in this stack frame, so the returned slice dangles as
        // soon as foo returns.
        var buffer: [64]u8 = undefined;
        return std.fmt.bufPrint(&buffer, "It's a cookbook!", .{}) catch unreachable;
    } else {
        // Recursing first buries the buggy frame deep in the stack, so the
        // stale bytes may survive long enough to look correct.
        return foo(i - 1);
    }
}

pub fn main() !void {
    for (1..16) |i| {
        std.debug.print("{s}\n", .{foo(i)});
    }
}
�����
�������
������
@����@����
It's a cookbook!
�����
It's a cookbook!
It's a cookbook!
It's a cookbook!
It's a cookbook!
It's a cookbook!
It's a cookbook!
It's a cookbook!
It's a cookbook!
It's a cookbook!

With a StackFallbackAllocator this happens quite easily, since allocations are served from the stack buffer first and only fall back to the heap once it runs out:

const std = @import("std");

var gpa = std.heap.DebugAllocator(.{}).init;

fn foo() []const u8 {
    // sfb, including its 64,000-byte buffer, lives in this stack frame; the
    // small allocation below is served from that buffer, so the returned
    // slice dangles once foo returns, even though nothing was leaked.
    var sfb = std.heap.stackFallback(64 * 1000, gpa.allocator());
    return std.fmt.allocPrint(sfb.get(), "It's a cookbook!", .{}) catch unreachable;
}

pub fn main() !void {
    std.debug.print("{s}\n", .{foo()});
    if (!gpa.detectLeaks()) {
        std.debug.print("No leaks!\n", .{});
    }
}
It's a cookbook!
No leaks!

I don’t think this is necessary. If your program is using dangling pointers, it’s undefined behavior, and the compiler can do whatever it wants. It may choose to reject the program, or it may choose to compile it into garbage; either way, it’s still compliant. Note that we are not talking about valid programs here: if the compiler does compile the code, whatever comes out of it is fundamentally invalid.
For standardization purposes, we can just say: “if the compiler can prove that every code path leads to undefined behavior, it needs to reject the program”. Different compilers will have different proving capabilities; some may compile it (it’s still a garbage program), and some won’t.
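
As a hypothetical sketch of how that rule plays out (the function names are mine, nothing here is from the thread):

const std = @import("std");

fn dangling(n: u32) *const u32 {
    const x: u32 = n + 1;
    return &x; // address of a stack local: compiles today, dead on return
}

pub fn main() void {
    // Every code path here dereferences a dangling pointer. Under the rule
    // above, a compiler that can prove that must reject the program; one whose
    // analysis can't see it may compile it, and both would be compliant.
    std.debug.print("{}\n", .{dangling(41).*});
}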

Zig isn’t doing undefined behavior. It’s doing illegal behavior, and some of the distinctions between these are clearer than others.

One of the known goals is “comptime-known illegal behavior will not compile”. But note that my simple examples are not comptime-known, as that concept currently exists in the compiler. A change there would be a change to what comptime knowability means, as distinct from adding static checks of one sort or another.
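
For contrast, here’s a minimal sketch of something that is comptime-known today (an out-of-bounds index I made up, not one of the examples above):

const std = @import("std");

pub fn main() void {
    const data = [_]u8{ 1, 2, 3, 4 };
    // Both the array length and the index are comptime-known, so this is
    // rejected with a compile error rather than caught at runtime.
    const x = data[4];
    std.debug.print("{}\n", .{x});
}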

Another is “runtime-known illegal behavior will panic in safe modes”, and as Andrew pointed out, adding poorly-behaved stack uses to the detection list is a goal.
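
The runtime-known counterpart of the sketch above would be something like this (again illustrative, not from the thread):

const std = @import("std");

fn byteAt(slice: []const u8, i: usize) u8 {
    // i is only runtime-known here, so nothing is rejected at compile time;
    // in Debug and ReleaseSafe builds the bounds check panics instead.
    return slice[i];
}

pub fn main() void {
    std.debug.print("{}\n", .{byteAt("abcd", 4)});
}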

I want to re-emphasize the distinction between what is comptime-known and what could be statically determined while compiling. They’re different concepts.

And again, I think having an entire validation mechanism built into the toolchain would be excellent, and would really elevate Zig above other tools in the same design space (further elevate it, that is).

I also think “a Zig compiler will, or will not, compile a text file, and the same decision will be reached for any given text file” is a pretty good goal here. To me that’s the distinction between illegal and undefined in a nutshell. YMMV.

We’re not stuck trying to find common ground between a bunch of compilers already out in the wild, so the constraints the C committee operates under shouldn’t be propagated here, given all the experience we’ve had with the downsides of that decision.

But validation is great, it’s open ended and there’s always more of it. Let’s have that by all means.
