Clang (and probably GCC as well) has an extension type, __uint128_t, a 128-bit unsigned integer. On macOS for arm64, <stdlib.h> transitively includes a header that uses this type, leading to a compilation failure.
Comment #1 by bugzilla — 2021-12-02T09:17:28Z
Unfortunately, DMD does not support 128 bit integers.
Not sure what to do about this, can it be #define'd away with a macro?
Comment #2 by dave287091 — 2021-12-02T19:28:56Z
(In reply to Walter Bright from comment #1)
> Unfortunately, DMD does not support 128 bit integers.
>
> Not sure what to do about this, can it be #define'd away with a macro?
It’d be nice to support 128-bit integers. Some algorithms require a 128-bit multiply.
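For context, the kind of 128-bit multiply such algorithms need can be emulated on top of 64-bit arithmetic. A minimal sketch (not part of the thread; the helper name mul64x64 and the u128 struct are illustrative):

```c
#include <assert.h>
#include <stdint.h>

/* Portable 64x64 -> 128-bit unsigned multiply built from 32-bit halves;
   the primitive a compiler could lower a __uint128_t multiply to. */
typedef struct { uint64_t lo, hi; } u128;

static u128 mul64x64(uint64_t a, uint64_t b)
{
    uint64_t a_lo = (uint32_t)a, a_hi = a >> 32;
    uint64_t b_lo = (uint32_t)b, b_hi = b >> 32;

    uint64_t p0 = a_lo * b_lo;   /* contributes to bits   0..63  */
    uint64_t p1 = a_lo * b_hi;   /* contributes to bits  32..95  */
    uint64_t p2 = a_hi * b_lo;   /* contributes to bits  32..95  */
    uint64_t p3 = a_hi * b_hi;   /* contributes to bits  64..127 */

    /* Sum the middle partial products; carries land in the high word. */
    uint64_t mid = (p0 >> 32) + (uint32_t)p1 + (uint32_t)p2;

    u128 r;
    r.lo = (mid << 32) | (uint32_t)p0;
    r.hi = p3 + (p1 >> 32) + (p2 >> 32) + (mid >> 32);
    return r;
}
```

The same decomposition works for the add-with-carry and shift operations a full software __uint128_t would need.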
As far as I can tell, __uint128_t appears only in struct definitions that save the ARM NEON registers when someone uses the deprecated <ucontext.h> API. The actual usage is here:
struct __darwin_arm_neon_state64
{
    __uint128_t __v[32];
    __uint32_t  __fpsr;
    __uint32_t  __fpcr;
};
struct __darwin_arm_neon_state
{
    __uint128_t __v[16];
    __uint32_t  __fpsr;
    __uint32_t  __fpcr;
};
These structs only end up in <stdlib.h> because it pulls in machine-specific type headers, even though it doesn’t actually use them.
Possibly you could fake it with something like this:
struct fake_u128 {
    _Alignas(16) unsigned long long a;
    unsigned long long b;
};
#define __uint128_t struct fake_u128
Comment #3 by bugzilla — 2021-12-05T08:10:18Z
Yeah, having a fake type should do it. But it should be a typedef, not a #define.
It's pretty clear that when dmd forks the preprocessor, it needs to insert some definitions at the beginning of the translation unit. I don't think there's any other reasonable way to support the variety of C extensions out there.
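A typedef-based stand-in along those lines might look like this. This is only a sketch of the idea, not dmd's actual injected definition; it mirrors __uint128_t's size (16 bytes) and 16-byte alignment so the Darwin structs above lay out correctly, without providing 128-bit arithmetic:

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical stand-in a compiler driver could inject under the name
   __uint128_t before preprocessing system headers. Named fake_u128 here
   to avoid colliding with the real builtin when compiled with Clang/GCC. */
typedef struct {
    _Alignas(16) uint64_t lo;   /* forces 16-byte alignment, like the real type */
    uint64_t hi;
} fake_u128;

/* The Darwin NEON-state struct from the thread, rebuilt on the fake type:
   its size must match the real layout (32 * 16 + 2 * 4, padded to 520). */
struct darwin_arm_neon_state64_sketch {
    fake_u128 v[32];
    uint32_t  fpsr;
    uint32_t  fpcr;
};
```

Because the fake type is only ever stored, never computed with, matching size and alignment is all the headers require.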