Bug 9725 – std.string.format does wasteful UTF decoding
Status
RESOLVED
Resolution
FIXED
Severity
normal
Priority
P3
Component
phobos
Product
D
Version
D2
Platform
All
OS
All
Creation time
2013-03-14T21:14:00Z
Last change time
2013-07-13T06:48:25Z
Keywords
performance, pull
Assigned to
nobody
Creator
dlang-bugzilla
Comments
Comment #0 by dlang-bugzilla — 2013-03-14T21:14:17Z
import std.string;
void main() { format("\xFF"); }
This program throws, indicating that the "format" function performs UTF-8 decoding.
This is a pointless waste of cycles. All format specifiers are in the lower-ASCII range, and both the input and output are UTF-8. If the format specifier syntax ever requires reading exactly one Unicode character, it should decode only at that point, rather than reading the entire string one dchar at a time and then encoding the result back into a UTF-8 string.
It is worth noting that writefln does not have the same issue.
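For illustration, a minimal sketch (not the actual Phobos implementation, and writeUpToSpec is a made-up name) of the byte-level approach: since every specifier character is ASCII, the formatter can locate '%' byte by byte and pass the literal text through verbatim, with no decode/encode round trip.

// Sketch only, assuming ASCII-only format specifiers.
void writeUpToSpec(ref string fmt, scope void delegate(const(char)[]) sink)
{
    size_t i = 0;
    while (i < fmt.length && fmt[i] != '%')
        ++i;                   // byte-wise scan; never decodes UTF-8
    sink(fmt[0 .. i]);         // emit the literal chunk unchanged
    fmt = fmt[i .. $];         // leave "%..." for the specifier parser
}

void main()
{
    string fmt = "héllo %s";   // non-ASCII bytes in the literal part
    writeUpToSpec(fmt, (const(char)[] s) { /* append s to the output */ });
    assert(fmt == "%s");
}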
Comment #1 by dlang-bugzilla — 2013-03-14T21:20:54Z
Actually, it looks like a bug in a static if in Appender.put(). Looking into it.
Comment #2 by dlang-bugzilla — 2013-03-14T21:35:24Z
Looks like the fix to issue 5663 wasn't complete, because const(immutable(char)) is immutable(char), not const(char).
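A self-contained demonstration of the collapse; the put reduction below is hypothetical, not the actual Appender source, and std.traits.Unqual is one way to write a qualifier-insensitive test:

import std.traits : Unqual;

// const over immutable collapses to plain immutable...
static assert(is(const(immutable(char)) == immutable(char)));
// ...so immutable(char) is not the same type as const(char):
static assert(!is(immutable(char) == const(char)));
// Stripping qualifiers matches every flavor of char:
static assert(is(Unqual!(immutable(char)) == char));

// Hypothetical reduction of the kind of branch that goes wrong:
void put(T)(T[] items)
{
    static if (is(T == char) || is(T == const(char)))
        pragma(msg, T.stringof ~ ": fast byte-copy branch");
    else
        pragma(msg, T.stringof ~ ": slow per-dchar branch");
}

void main()
{
    put("abc");   // T is immutable(char): the slow branch is chosen
}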
Comment #3 by dlang-bugzilla — 2013-03-14T21:41:28Z