std.range.Stride has an off-by-one error in its length() function, which causes the reported length to be one less than the actual length whenever _input.length % _n != 0.
import std.stdio, std.range;

void main() {
    uint[] foo = [1, 2, 3, 4, 5];
    auto s = stride(foo, 2);
    writeln(s.length); // 2
    uint realLength = 0;
    foreach (elem; s) {
        realLength++;
    }
    writeln(realLength); // 3
}
This can be fixed by changing the length function in std.range.Stride to the following:
size_t length()
{
    return (_input.length % _n == 0) ?
        _input.length / _n :
        _input.length / _n + 1;
}
The fix can be verified by the following test case:
import std.stdio, std.range;

void main() {
    foreach (l; 0 .. 10) {
        foreach (s; 1 .. l) {
            uint[] foo = new uint[l];
            auto st = stride(foo, s);
            auto len1 = st.length;
            uint len2 = 0;
            foreach (elem; st) {
                len2++;
            }
            assert(len1 == len2);
            writeln(len1, "\t", len2);
        }
    }
}
Comment #1 by andrei — 2009-08-27T22:31:31Z
I fixed length like this:
return (_input.length - 1) / _n + 1;
Thanks!
Comment #2 by andrei — 2009-08-27T23:38:34Z
(In reply to comment #1)
> I fixed length like this:
>
> return (_input.length - 1) / _n + 1;
>
> Thanks!
In fact this doesn't work for _input.length == 0. So I rewrote it as:

return (_input.length + _n - 1) / _n;
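As a quick illustration of why the formula from comment #1 fails on an empty range while the ceiling-division form does not: _input.length is an unsigned size_t, so (_input.length - 1) wraps around when the length is 0. The following minimal standalone sketch (with hypothetical helper names lengthV1 and lengthV2, not part of the patch) shows the difference:

import std.stdio;

void main()
{
    // Hypothetical helpers mirroring the two formulas; not part of std.range.
    size_t lengthV1(size_t len, size_t n) { return (len - 1) / n + 1; }
    size_t lengthV2(size_t len, size_t n) { return (len + n - 1) / n; }

    // len == 0: (0 - 1) wraps to size_t.max, so the first formula is wrong.
    writeln(lengthV1(0, 2)); // huge wrapped value on a 64-bit target (wrong)
    writeln(lengthV2(0, 2)); // 0 (correct)

    // Non-empty inputs: both formulas agree.
    writeln(lengthV1(5, 2), " ", lengthV2(5, 2)); // 3 3
}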