I’m trying to debug a test case on macOS. The code uses fstatat(),
imported from C, to get the size of a file:
const std = @import("std");

const c = @cImport({
    @cInclude("unistd.h");
    @cInclude("fcntl.h");
    @cInclude("sys/stat.h");
});

pub fn main() !void {
    const dirfd = c.openat(c.AT_FDCWD, ".", c.O_DIRECTORY | c.O_RDONLY);
    if (dirfd < 0) return error.UnableToOpenDirectory;

    var info: c.struct_stat = undefined;
    if (c.fstatat(dirfd, "test.zig", &info, 0) != 0) return error.UnableToGetStat;

    std.debug.print("size = {d}\n", .{info.st_size});
}
size = 0
The program stats its own source file, so a size of zero is obviously wrong.
The equivalent code in C works correctly:
#include <unistd.h>
#include <fcntl.h>
#include <sys/stat.h>
#include <stdio.h>

int main() {
    int dirfd = openat(AT_FDCWD, ".", O_DIRECTORY | O_RDONLY);
    if (dirfd < 0) goto error;

    struct stat info;
    if (fstatat(dirfd, "test.c", &info, 0) < 0) goto error;

    printf("size = %lld\n", (long long)info.st_size); /* st_size is off_t, not size_t */
    return 0;

error:
    printf("Error\n");
    return 1;
}
size = 372
Print-out of the stat struct:
cimport.struct_stat{
    .st_dev = 16777222,
    .st_mode = 1052,
    .st_nlink = 13,
    .st_ino = 2151778714020,
    .st_uid = 20,
    .st_gid = 0,
    .st_rdev = 1757947205,
    .st_atimespec = cimport.struct_timespec{ .tv_sec = 621266321, .tv_nsec = 1757947204 },
    .st_mtimespec = cimport.struct_timespec{ .tv_sec = 155724084, .tv_nsec = 1757947204 },
    .st_ctimespec = cimport.struct_timespec{ .tv_sec = 159462706, .tv_nsec = 505 },
    .st_birthtimespec = cimport.struct_timespec{ .tv_sec = 8, .tv_nsec = 4096 },
    .st_size = 0,
    .st_blocks = 0,
    .st_blksize = 0,
    .st_flags = 0,
    .st_gen = 2863311530,
    .st_lspare = -1431655766,
    .st_qspare = { -6148914691236517206, -6148914691236517206 }
}
The size (505) is showing up in the tv_nsec field of st_ctimespec, and -6148914691236517206 in hex is, as you might have guessed, 0xAAAA_AAAA_AAAA_AAAA, the 0xAA fill pattern Zig writes into undefined memory in debug builds. So the tail of the struct was never written, and it seems I’m calling the 32-bit version of the function with a 64-bit struct. Is there a define that needs to be set or something?
The macOS version is Monterey :sadface: