Bug 22612 – std.json doesn't parse duplicate keys

Status
NEW
Severity
normal
Priority
P3
Component
dlang.org
Product
D
Version
D2
Platform
All
OS
All
Creation time
2021-12-20T07:07:00Z
Last change time
2024-12-15T15:27:19Z
Assigned to
No Owner
Creator
Răzvan Ștefănescu
Moved to GitHub: dlang.org#3992

Comments

Comment #0 by rumbu — 2021-12-20T07:07:00Z
    auto j = parseJSON(`{ "key": 1, "key" : 2 }`);
    writeln(j); // outputs only {"key":2}

Of course, j["key"] contains 2; value 1 gets lost in translation. According to ECMA-404: "The JSON syntax does not impose any restrictions on the strings used as names, does not require that name strings be unique, and does not assign any significance to the ordering of name/value pairs."
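
For reference, a self-contained D version of the snippet above (a minimal sketch; the assert simply encodes the observation that j["key"] ends up holding 2):

    import std.json;
    import std.stdio;

    void main()
    {
        // Duplicate "key" in the input: parseJSON keeps only the last value.
        auto j = parseJSON(`{ "key": 1, "key" : 2 }`);
        writeln(j);                    // prints {"key":2}
        assert(j["key"].integer == 2); // the first value (1) is silently dropped
    }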
Comment #1 by salihdb — 2021-12-20T09:14:40Z
Probably, but the 1st value under the same name is overwritten by the 2nd. In summary, data ends up pointing to a single value. Node.js has the same behaviour:

    const jsonStr = '{"name":"Salih","age":42,"name":"SALIH"}';
    const data = JSON.parse(jsonStr);

    var assert = require('assert');
    assert(data.name != "Salih");
Comment #2 by b2.temp — 2021-12-20T10:48:42Z
The spec quoted is clearly only about syntax, not about the semantics of conflicting keys. Off topic, but YAML is better in that respect.
Comment #3 by rumbu — 2021-12-20T15:03:57Z
(In reply to Salih Dincer from comment #1)
> Probably, but the 1st value under the same name is overwritten by the 2nd.
> In summary, data ends up pointing to a single value.
>
> Node.js has the same behaviour:
>
>     const jsonStr = '{"name":"Salih","age":42,"name":"SALIH"}';
>     const data = JSON.parse(jsonStr);
>
>     var assert = require('assert');
>     assert(data.name != "Salih");

I don't expect anything else from JavaScript :)

I would never have posted this if I hadn't encountered such JSON in the wild. Initially I had the same reaction, but the client providing that JSON stream pointed me to the ECMA standard and I lost all my arguments. I know that the corresponding RFC says you SHOULD NOT have duplicate keys, but SHOULD is interpreted by some people as "CAN". There are even npm packages for Node.js which can handle duplicate keys.

The idea is that, as long as duplicate keys are allowed by the standard, std.json should at least document that the last value wins. There are other possible approaches as well: first value wins, raise an error, or silently turn duplicate keys into arrays (sketched below).
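
To make those alternatives concrete, here is a minimal, self-contained sketch of the four policies. This is illustration only, not std.json code: it assumes a parser that hands over an object's members in document order, and buildObject and DupPolicy are hypothetical names.

    import std.json;
    import std.stdio;
    import std.typecons : Tuple, tuple;

    // Hypothetical policies for an object that contains the same key twice.
    enum DupPolicy { lastWins, firstWins, error, arrays }

    // Builds a JSON object from members given in document order,
    // resolving duplicate keys according to the chosen policy.
    JSONValue buildObject(Tuple!(string, JSONValue)[] members, DupPolicy policy)
    {
        JSONValue[string] result;
        foreach (m; members)
        {
            const key = m[0];
            auto val = m[1];
            auto existing = key in result;
            if (existing is null)
            {
                result[key] = val;
                continue;
            }
            final switch (policy)
            {
                case DupPolicy.lastWins:
                    result[key] = val;           // overwrite the earlier value
                    break;
                case DupPolicy.firstWins:
                    break;                       // keep the earlier value
                case DupPolicy.error:
                    throw new JSONException("duplicate key: " ~ key);
                case DupPolicy.arrays:
                    // Fold duplicates into a JSON array: 1, 2 -> [1, 2]
                    if (existing.type == JSONType.array)
                        result[key] = JSONValue(existing.array ~ val);
                    else
                        result[key] = JSONValue([*existing, val]);
                    break;
            }
        }
        return JSONValue(result);
    }

    void main()
    {
        auto members = [tuple("key", JSONValue(1)), tuple("key", JSONValue(2))];
        writeln(buildObject(members, DupPolicy.lastWins));  // {"key":2}
        writeln(buildObject(members, DupPolicy.firstWins)); // {"key":1}
        writeln(buildObject(members, DupPolicy.arrays));    // {"key":[1,2]}
    }

std.json's parseJSON currently behaves like lastWins, which matches the {"key":2} output shown in comment #0.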
Comment #4 by dfj1esp02 — 2021-12-20T15:29:23Z
A documentation issue, right?
Comment #5 by rumbu — 2021-12-20T16:34:11Z
(In reply to anonymous4 from comment #4)
> A documentation issue, right?

Yes, if we assume that std.json is not ECMA compliant. I don't understand why stdx.data.json hasn't already replaced the old std.json; it can also handle my duplicate-key case, since the parser is public.
Comment #6 by salihdb — 2021-12-20T17:17:50Z
> ... std.json should at least document that the last value wins ...

You're right...
Comment #7 by robert.schadek — 2024-12-15T15:27:19Z
THIS ISSUE HAS BEEN MOVED TO GITHUB https://github.com/dlang/dlang.org/issues/3992 DO NOT COMMENT HERE ANYMORE, NOBODY WILL SEE IT, THIS ISSUE HAS BEEN MOVED TO GITHUB