Bug 19056 – UDAs can be added to imports but not retrieved

Status
NEW
Severity
normal
Priority
P3
Component
dmd
Product
D
Version
D2
Platform
All
OS
All
Creation time
2018-07-04T17:54:30Z
Last change time
2024-12-13T18:59:28Z
Assigned to
No Owner
Creator
elpenguino+D
Moved to GitHub: dmd#17869

Comments

Comment #0 by elpenguino+D — 2018-07-04T17:54:30Z
It is currently legal to attach UDAs to imports, but there is no way to retrieve them. For example:

```
enum X;
@X import y = std.stdio;
pragma(msg, __traits(getAttributes, y));
```

I would expect the compiler to either print the attributes or emit an error, not print an empty tuple.
Comment #1 by razvan.nitu1305 — 2018-07-06T09:50:48Z
It seems that UDAs do not work with aliases, and `@X import y = std.stdio;` creates an alias behind the scenes. Note that:

```
enum X;
int a;
@X alias g = a;
pragma(msg, __traits(getAttributes, g));
```

prints an empty tuple too, because `__traits(getAttributes)` first resolves the symbol and then checks whether any attributes exist, in order to accommodate this case:

```
enum X;
@X int a;
alias g = a;
pragma(msg, __traits(getAttributes, g));
```

which prints `tuple((X))`. In my opinion, the best solution would be to make adding UDAs to aliases illegal, and also to disallow UDAs on imports as long as imports create aliases behind the scenes.
Comment #2 by smauldersauce — 2018-07-08T10:48:01Z
> In my opinion, the best solution would be to make adding UDAs to aliases illegal and also not allow UDAs to imports as long as imports create aliases behind the scenes.

I disagree; this would cause a lot of breaking changes and would make it difficult to write code that would otherwise be simple. Attributes are allowed to be applied to aliases, even though they do nothing, for simplicity:

```
enum value = 10;
struct Foo
{
@nogc:
    alias bar = value; // should be an error to allow "@nogc:"
                       // to apply to whole scope for simplicity
}
```

Just as some attributes can be applied to symbols where they do not do anything:

```
@nogc int variable; // Still allowed, doesn't do anything
```
Comment #3 by smauldersauce — 2018-07-08T10:48:55Z
should not be an error*
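To make the trade-off in comment #2 concrete, here is a minimal sketch (the struct and member names are hypothetical, not taken from the thread) of an attribute label covering both a member function, where `@nogc` matters, and an alias, where it is silently ignored; rejecting the ignored case would force the alias out of the block.

```
enum limit = 100;

struct Counter
{
@nogc:                      // label applies to every declaration below
    alias max = limit;      // ignored here; an error instead would force
                            // moving the alias out of the @nogc: block
    int count;              // likewise ignored on a plain field

    int next() { return count + 1; }  // here @nogc actually takes effect
}
```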
Comment #4 by andrei — 2018-07-09T14:58:47Z
I think named imports and aliases should be entirely transparent, so shouldn't accept attributes (syntactically or semantically). That means fixing this bug entails rejecting the code. Will discuss with Walter.
Comment #5 by bugzilla — 2018-07-09T19:16:36Z
First off, I agree that aliases should be aliases, not modifications to other symbols; i.e. attributes must not affect the alias. (Changing this behavior would have far-reaching consequences in the internal compiler implementation and would risk a lot of unexpected corner cases.)

As to making

```
@X alias g = a; // (1)
```

illegal (as opposed to simply ignoring the @X), consider:

```
@X { alias g = a; } // (2)
@Y: alias g = a;    // (3)
```

Making those illegal will cause problems. Then the issue is: should (1) be rejected as a special case while ignoring the attributes in (2) and (3)? This behavior isn't the case anywhere else. The three forms are equivalent, it's an easy rule to remember, and easy to have a consistent, correct implementation. We'd be trading "why don't attributes affect the alias?" for "why does the attribute syntax have different behavior w.r.t. aliases?" Is that a net improvement?

Therefore I'm opposed to adding a special case to reject (1). The compiler is working as intended and designed. (I have also changed this issue to "Enhancement" for that reason.)

Note that a similar issue has come up recently with:

```
int foo(scope int x)
```

Should `scope` be an error or be ignored? The answer is: ignored, because making it an error would make writing generic code awkward, clumsy, and ugly.
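For the `scope int x` point, a short sketch of the kind of generic code that would become awkward if ignored attributes were errors (the template name `inspect` is hypothetical): the same parameter list is instantiated with both value and reference types, so `scope` has to be tolerated where it means nothing.

```
import std.stdio : writeln;

// One signature for all T: `scope` restricts escaping for reference types
// and is simply ignored for value types such as int. If the value-type
// case were an error, this template would need a per-type special case.
void inspect(T)(scope T value)
{
    writeln(value);
}

void main()
{
    inspect(42);      // T = int: `scope` has no effect, still compiles
    int local = 7;
    inspect(&local);  // T = int*: `scope` is meaningful
}
```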
Comment #6 by razvan.nitu1305 — 2018-07-11T11:11:06Z
> Making those illegal will cause problems. Then the issue is: should (1) be rejected as a special case while ignoring the attributes in (2) and (3)? This behavior isn't the case anywhere else. The three forms are equivalent, it's an easy rule to remember, and easy to have a consistent, correct implementation.

```
final: int a;
final { int b; }
```

compile just fine. Whereas:

```
final int a;
```

does not.
Comment #7 by robert.schadek — 2024-12-13T18:59:28Z
THIS ISSUE HAS BEEN MOVED TO GITHUB: https://github.com/dlang/dmd/issues/17869. DO NOT COMMENT HERE ANYMORE; NOBODY WILL SEE IT.