Found by Iain Buclaw:
```
void main()
{
    ubyte[1] w = cast(ubyte[1]) "A"; // Passes
    uint[1] x = cast(uint[1]) "\xFF\xAA\xDD\xEE"; // e2ir error in the backend
    ulong[2] y = cast(ulong[2]) "\x11\x22\x33\x44\x55\x66\x77\x88\xAA\xBB\xCC\xDD\xEE\xFF\x00\x99"; // e2ir error in the backend
}
```
These should be semantic errors, not backend errors.
```
Error: e2ir: cannot cast `"\xff\xaa\xdd\xee"` of type `string` to type `uint[1]`
Error: e2ir: cannot cast `"\x11\"3DUfw\x88\xaa\xbb\xcc\xdd\xee\xff\0\x99"` of type `string` to type `ulong[2]`
```
Comment #1 by bugzilla — 2024-01-30T20:50:29Z
Consider:
```
void test()
{
    const char[8] a = "12345678";
    ulong[1] i = cast(ulong[1])a;           // works
    ulong[1] j = cast(ulong[1])"12345678";  // fails to compile
}
```
Both should compile. It being a hex string should be irrelevant.
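For illustration, a minimal sketch of the workaround implied by the example above (the function name and values are illustrative, not part of the report): binding the literal to a fixed-size char array first takes the cast down the path that already compiles.
```
void workaround()
{
    // Based on comment #1: a reinterpreting cast from a static char
    // array is accepted, while the same cast applied directly to the
    // string literal is rejected in the backend (e2ir).
    const char[4] tmp = "\xFF\xAA\xDD\xEE";
    uint[1] x = cast(uint[1]) tmp;
}
```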
Comment #2 by robert.schadek — 2024-12-13T19:32:53Z