Mapping a [64]u8 buffer to a struct

I took only part of the ELF header and observed something strange:

const std = @import("std");

const ElfHdr1 = extern struct {
    sign: u32, // 7F 45 4C 46, '[DEL]ELF'
    clas: u8,  // 2 for 64-bit
    data: u8,  // endianness, 1 for LE
    vers: u8,  // == 1
};

const ElfHdr2 = packed struct {
    sign: u32, // 7F 45 4C 46, '[DEL]ELF'
    clas: u8,  // 2 for 64-bit
    data: u8,  // endianness, 1 for LE
    vers: u8,  // == 1
};

pub fn main() !void {
    std.debug.print("sizeof(ElfHdr1) = {}\n", .{@sizeOf(ElfHdr1)});
    std.debug.print("sizeof(ElfHdr2) = {}\n", .{@sizeOf(ElfHdr2)});
}

This prints 8 for both variants:

$ ./elf 
sizeof(ElfHdr1) = 8
sizeof(ElfHdr2) = 8

But shouldn't it be 7?

Same in C:

#include <stdio.h>

struct elf_hdr {
    int sign;
    char clas;
    char data;
    char vers;
} __attribute__((packed));

int main(void) {
    printf("sizeof(elf_hdr) = %zu\n", sizeof(struct elf_hdr));
}

This prints 7, as it should:

$ ./a.out 
sizeof(elf_hdr) = 7

Is this a bug, or am I misunderstanding something completely?