Comment #0 by bearophile_hugs — 2014-11-19T10:27:39Z
I think (with the current dynamic array design) decreasing an array's length should be @nogc:
void main() @nogc {
    int[3] a = [1, 2, 3];
    auto b = a[];
    b = b[0 .. $ - 1]; // OK
    assert(b.length == 2);
    b.length--; // Error, rejects-valid
}
dmd 2.067alpha gives:
test2.d(6,13): Error: setting 'length' in @nogc function main may cause GC allocation
In theory the following should be @nogc as well, but I think that can be done only with some weak form of dependent typing, and only when the increase/decrease values are compile-time constants:
void main() @nogc {
    int[3] a = [1, 2, 3];
    auto b = a[];
    b.length -= 1;
    assert(b.length == 2);
    b.length += -1;
    assert(b.length == 1);
}
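In the meantime, the slicing form that the compiler already accepts can be wrapped in a small helper, so call sites read like a length decrement but stay @nogc. This is only a sketch; the `shrink` name and signature are hypothetical, not part of any library:

```d
// Hypothetical helper: shrinking via slicing never reallocates,
// so the compiler accepts it in @nogc code, unlike `arr.length -= by`.
T[] shrink(T)(T[] arr, size_t by) @nogc nothrow @safe
{
    assert(by <= arr.length); // caller must not shrink past zero
    return arr[0 .. $ - by];
}

void main() @nogc
{
    int[3] a = [1, 2, 3];
    auto b = a[];
    b = b.shrink(1); // same effect as the rejected `b.length--`
    assert(b.length == 2);
}
```

This works because a slice of an existing array only narrows the (pointer, length) pair; it is growing the length that may trigger a GC allocation.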
Comment #1 by robert.schadek — 2024-12-13T18:35:44Z