Comment #0 by bearophile_hugs — 2012-05-30T04:57:00Z
I have code that generates arrays of mutable chars, and I have to convert them to BigInts. I'd like this to be allowed:
    import std.bigint: BigInt;
    void main() {
        char[] s1 = "123".dup;
        assert(BigInt(s1) == 123);
        char[] s2 = "0xABC".dup;
        assert(BigInt(s2) == 2748);
    }
DMD 2.060 alpha gives:
...\dmd2\src\phobos\std\bigint.d(97): Error: function std.internal.math.biguintcore.BigUint.fromHexString (string s) is not callable using argument types (char[])
...\dmd2\src\phobos\std\bigint.d(97): Error: cannot implicitly convert expression (s[2u..__dollar]) of type char[] to string
...\dmd2\src\phobos\std\bigint.d(99): Error: function std.internal.math.biguintcore.BigUint.fromDecimalString (string s) is not callable using argument types (char[])
...\dmd2\src\phobos\std\bigint.d(99): Error: cannot implicitly convert expression (s) of type char[] to string
test.d(4): Error: template instance std.bigint.BigInt.__ctor!(char[]) error instantiating
Current workaround: I use a cast:
    char[] s1 = "123".dup;
    assert(BigInt(cast(string)s1) == 123);
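As a side note, the cast works but aliases mutable data as a string, which breaks string's immutability guarantee if s1 is later written to. A sketch of a safer variant of the same workaround, using .idup to make an immutable copy instead of casting:

```d
import std.bigint : BigInt;

void main() {
    char[] s1 = "123".dup;
    // .idup copies the buffer into immutable storage, yielding a
    // string without casting away mutability; later writes to s1
    // cannot affect the copy.
    assert(BigInt(s1.idup) == 123);

    char[] s2 = "0xABC".dup;
    assert(BigInt(s2.idup) == 2748);
}
```

The extra allocation is the price for not violating the type system; once BigInt's constructor accepts char[] directly, neither the cast nor the copy should be needed.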
Comment #1 by github-bugzilla — 2012-07-01T19:53:10Z