I won’t mind! My contract is that you can ping me liberally for whatever reason at whatever time without thinking twice, but, symmetrically, I reserve the right to not feel bad if I don’t respond!
I wouldn’t say that handles are particularly “hot” in the temporal-locality sense — this is quite an old trick. I personally first learned about it from the famous https://gameprogrammingpatterns.com book (though I “spoke prose” before that in the small, when working with graphs).
I would also maybe caution against talking about “the Handle pattern”. As with terms like OOP and FP, this is a nebulous concept which covers many quite different patterns & language features, and, while it is useful to be able to refer to an element of this cloud, when you actually want to think about it you have to think very concretely about a specific implementation, because the details matter a lot. Are compressed pointers in the HotSpot JVM handles?
From this point of view:
Zig’s enum(usize) { _ } is “a happy accident of language design” (I heard this memorable turn of phrase from yole). It is a way to give an integer a fancy, snobbish top-hat, which prevents it from mingling with lowly ints. I don’t think _ was meant for that; I think the original use-case was “I want an enum with a list of known variants, but sometimes I need to store an unknown one”, and only later it was discovered that you can flip it around and make _ the 99% use-case.
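To make that original use-case concrete, here is a minimal sketch (the opcode names are mine, purely illustrative):

```zig
// A non-exhaustive enum: the named variants are the ones we know about,
// and `_` says that any other value of the tag type is also a valid member.
const Opcode = enum(u8) {
    nop = 0x00,
    halt = 0x01,
    _, // bytes we don't (yet) know how to interpret
};

// Converting an arbitrary integer is fine precisely because the enum is
// non-exhaustive; for an exhaustive enum, @enumFromInt of a value that
// doesn't match any variant is a safety-checked error.
fn decode(byte: u8) Opcode {
    return @enumFromInt(byte);
}
```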
This is a distinct feature from newtypes as found in Haskell/Go/Odin. Those languages allow for
type email = string;
but you can’t const email = enum([]const u8) { _ } in Zig. At the same time, in low-level languages, 99% of the time you want a newtype, you want to newtype a handle/index, so Zig gets to eat 80% of the cake using 0% of extra features.
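To make the handle/index case concrete, a tiny sketch (all names invented for illustration):

```zig
const std = @import("std");

// Two distinct handle types over the same underlying integer.
// Both are "just a u32", but the compiler won't let them mingle.
const UserId = enum(u32) { _ };
const GroupId = enum(u32) { _ };

const Users = struct {
    names: []const []const u8,

    fn name(self: Users, id: UserId) []const u8 {
        return self.names[@intFromEnum(id)];
    }
};

test "handles are just indexes, but typed" {
    const users = Users{ .names = &.{ "alice", "bob" } };
    const id: UserId = @enumFromInt(1);
    try std.testing.expectEqualStrings("bob", users.name(id));

    // const gid: GroupId = @enumFromInt(1);
    // _ = users.name(gid); // error: expected UserId, found GroupId
}
```

The commented-out lines are the point: mixing up two kinds of indexes becomes a compile error, at zero runtime cost.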
Ada’s Access types are something else entirely. They are nominal pointers, and the interesting thing about them is not whether they are indexes or pointers internally (I actually don’t know that!), but the associated memory-management machinery. This is something I’d love to understand better: how does Ada actually manage memory? I think it sits “between” pure stack allocation and general RAII, but I am confused by everything I’ve read so far. I can’t concisely explain how Ada does unbounded strings, or how it returns dynamically-sized values.