The following code (a reduced varint encoder) works as expected on Linux x86_64 when built with no flags or with only one of -O and -release, but fails when both -O -release are specified:
---
import core.stdc.stdio;
ubyte foo(uint n) {
    ubyte[5] buf = void;
    ubyte wsize;
    while (true) {
        if ((n & ~0x7F) == 0) {
            buf[wsize++] = cast(ubyte)n;
            break;
        } else {
            buf[wsize++] = cast(ubyte)((n & 0x7F) | 0x80);
            n >>= 7;
        }
    }
    printf("%hhu\n", wsize);
    return buf[0];
}

void main() {
    printf("%hhx\n", foo(3));
}
---
More specifically, with -O -release the output (printf()s are used here to keep the generated assembly short) is »1 e0« instead of the expected »1 3«, and the program crashes most of the time with varying errors (segfault, illegal instruction, glibc free() assertion, …) – stack corruption?