How much data is too much to pass around on the stack?

I am deserializing data that is at most the length of an Ethernet frame (about 1500 bytes). (Exact sizes are approximate; I don't know them off the top of my head.)

I could represent this two ways:

pub const Segment = struct {
    mbx_header: mailbox.Header, // 6 bytes
    coe_header: coe.Header, // 2 bytes
    seg_header: SDOSegmentHeaderServer, // 8 bytes
    data: std.BoundedArray(u8, data_max_size), // up to 1400 bytes
};

or

pub const Segment = struct {
    mbx_header: mailbox.Header, // 6 bytes
    coe_header: coe.Header, // 2 bytes
    seg_header: SDOSegmentHeaderServer, // 8 bytes
    data: [] const u8, // up to 1400 bytes
};

I know that passing around too much data on the stack (perhaps megabytes) is concerning from a portability standpoint: the default main-thread stack is usually about 8 megabytes on Linux, but embedded platforms can have far smaller stacks.

Also, in the slice case I will have to be more careful about the lifetime of the buffer I deserialize from; otherwise I risk the slice dangling.
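To make the lifetime concern concrete, here is a hedged sketch (the deserialize function, field layout, and the 16-byte header offset are all hypothetical, not from any real codebase) of how the slice variant borrows the frame buffer:

```zig
const std = @import("std");

// Hypothetical sketch: a deserializer that returns a slice pointing
// into the caller-provided frame buffer. The Segment is only valid
// as long as `frame` is; reusing or freeing the buffer leaves
// `data` dangling.
const Segment = struct {
    data: []const u8,
};

fn deserialize(frame: []const u8) Segment {
    // Headers are assumed to occupy the first 16 bytes (6 + 2 + 8).
    return .{ .data = frame[16..] };
}

test "slice borrows the frame buffer" {
    var frame: [32]u8 = undefined;
    @memset(&frame, 0xAB);
    const seg = deserialize(&frame);
    try std.testing.expectEqual(@as(usize, 16), seg.data.len);
    // If `frame` were overwritten with the next incoming frame here,
    // seg.data would silently refer to the new contents.
}
```

With the BoundedArray variant this hazard disappears, because the bytes are copied into the struct at deserialization time.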

What else should I be thinking about?

As long as you don’t do it recursively to any great depth, a few kilobytes on the stack should not be a problem.
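You can sanity-check this at comptime. A rough sketch (the header types below are placeholder byte arrays standing in for the real mailbox.Header etc., so the exact size is illustrative):

```zig
const std = @import("std");

// Placeholder field types matching the sizes in the question
// (6 + 2 + 8 header bytes, 1400 bytes of payload capacity).
const Segment = struct {
    mbx_header: [6]u8,
    coe_header: [2]u8,
    seg_header: [8]u8,
    data: std.BoundedArray(u8, 1400),
};

test "segment fits comfortably on the stack" {
    // BoundedArray adds a usize length field, so the whole struct is
    // on the order of 1.4 KiB: far below even small embedded stacks.
    try std.testing.expect(@sizeOf(Segment) < 2048);
}
```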

One thing to watch out for with large arrays is that Zig sometimes inserts an extra memcpy when an array is accessed by value. For example, for (segment.data.buffer) |byte| { ... } would likely cause Zig to copy the entire array. As long as you use segment.data.slice(), it should not be a problem.
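A small sketch of the safe pattern (field sizes are illustrative; constSlice() is used here since the function takes a const pointer, and like slice() it returns a pointer-plus-length view rather than a copy):

```zig
const std = @import("std");

const data_max_size = 1400;
const Segment = struct {
    data: std.BoundedArray(u8, data_max_size),
};

fn sumBytes(segment: *const Segment) u32 {
    var total: u32 = 0;
    // Iterating over a view into the existing storage copies nothing,
    // and it also skips the unused tail of the buffer past `len`.
    for (segment.data.constSlice()) |byte| total += byte;
    return total;
}

test "sum over the valid portion only" {
    var seg = Segment{ .data = .{} };
    try seg.data.appendSlice(&[_]u8{ 1, 2, 3 });
    try std.testing.expectEqual(@as(u32, 6), sumBytes(&seg));
}
```

Iterating the view also means the loop runs over only the bytes actually present, not the full 1400-byte capacity.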

I imagine a memory-starved embedded system would be more likely to be heapless, since you can’t afford to set aside chunks of memory for long durations. Running out of stack space would then mean running out of memory altogether.