Discussion: Local functions infer attributes?
Manu via Digitalmars-d
2014-09-28 02:42:22 UTC
Permalink
void f() pure nothrow @nogc
{
    void localFunc()
    {
    }

    localFunc();
}

Complains because localFunc is not @nogc or nothrow.
Doesn't complain about pure though.

Is it reasonable to say that the scope of the outer function is
nothrow+@nogc, and therefore everything declared within should also be so?
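(For reference, the workaround today is to repeat the attributes on the local
function by hand - a minimal sketch, nothing more, which is exactly the
duplication in question:)

void f() pure nothrow @nogc
{
    void localFunc() pure nothrow @nogc // repeated by hand; inference would make this redundant
    {
    }

    localFunc();
}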
deadalnix via Digitalmars-d
2014-09-28 02:56:55 UTC
Permalink
On Sunday, 28 September 2014 at 02:42:29 UTC, Manu via
Post by Manu via Digitalmars-d
{
void localFunc()
{
}
localFunc();
}
Doesn't complain about pure though.
Is it reasonable to say that the scope of the outer function is
nothrow+@nogc, and therefore everything declared within should also be so?
No, as the function could be returned and used elsewhere, where
these attributes aren't necessary. Also, inferring everything is
quite expensive and we want D to compile fast.

But maybe inference could be triggered on error?
Walter Bright via Digitalmars-d
2014-09-28 03:08:05 UTC
Permalink
Post by Manu via Digitalmars-d
{
void localFunc()
{
}
localFunc();
}
Doesn't complain about pure though.
Is it reasonable to say that the scope of the outer function is
nothrow+@nogc, and therefore everything declared within should also be so?
No as the function could be returned and used elsewhere, where these attribute
aren't necessary. Also, inferring everything is quite expensive and we want D to
compile fast.
But maybe inference could be triggered on error ?
Since the function body is always present, inference should always be done. It
isn't costly. Should file an enhancement request on this.
deadalnix via Digitalmars-d
2014-09-28 03:50:12 UTC
Permalink
Post by Walter Bright via Digitalmars-d
Since the function body is always present, inference should
always be done. It isn't costly. Should file an enhancement
request on this.
If you have a control flow graph to analyze it can become quite
costly.
Walter Bright via Digitalmars-d
2014-09-28 04:04:21 UTC
Permalink
Post by Walter Bright via Digitalmars-d
Since the function body is always present, inference should always be done. It
isn't costly. Should file an enhancement request on this.
If you have a control flow graph to analyze it can become quite costly.
Nope! It happens as part of semantic analysis.
deadalnix via Digitalmars-d
2014-09-28 05:48:19 UTC
Permalink
Post by Walter Bright via Digitalmars-d
On Sunday, 28 September 2014 at 03:08:09 UTC, Walter Bright
Post by Walter Bright via Digitalmars-d
Since the function body is always present, inference should
always be done. It
isn't costly. Should file an enhancement request on this.
If you have a control flow graph to analyze it can become
quite costly.
Nope! It happens as part of semantic analysis.
Obviously, it does. The complexity of the computation can still
go quite high.
Walter Bright via Digitalmars-d
2014-09-28 06:25:02 UTC
Permalink
Post by Walter Bright via Digitalmars-d
Post by Walter Bright via Digitalmars-d
Since the function body is always present, inference should always be done. It
isn't costly. Should file an enhancement request on this.
If you have a control flow graph to analyze it can become quite costly.
Nope! It happens as part of semantic analysis.
Obviously, it does. The complexity of the computation still can go quite high.
If you look at how it is currently implemented, you'll see it doesn't go high.
bearophile via Digitalmars-d
2014-09-28 08:49:35 UTC
Permalink
the function could be returned and used elsewhere, where these
attribute aren't necessary.
Filed:
https://issues.dlang.org/show_bug.cgi?id=13550

Bye,
bearophile
Walter Bright via Digitalmars-d
2014-09-28 08:54:00 UTC
Permalink
Post by bearophile via Digitalmars-d
https://issues.dlang.org/show_bug.cgi?id=13550
Thanks!
Trass3r via Digitalmars-d
2014-09-30 08:33:26 UTC
Permalink
Also, inferring everything is quite
expensive and we want D to compile fast.
Doesn't the compiler have to do that anyway?
I'd expect a proper compiler to check that my code actually is what
I claim it is. It's quite easy to mark something as e.g. @nogc in
the first version and later on add code with allocations.
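A minimal sketch of that check (standard @nogc enforcement; exact error
wording varies by compiler version):

void process() @nogc
{
    int[16] buffer;                // fine in the first version: no GC involved

    // A later edit that allocates no longer compiles:
    // auto more = new int[](32);  // Error: cannot use 'new' in @nogc function
}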
deadalnix via Digitalmars-d
2014-09-30 22:39:53 UTC
Permalink
Post by Trass3r via Digitalmars-d
Also, inferring everything is quite
expensive and we want D to compile fast.
Doesn't the compiler have to do that anyway?
I'd expect a proper compiler to check if my code is actually
what
I claim it is. It's quite easy to mark something as e.g. nogc in
the first version and later on add code with allocations.
It is for the function body, but most qualifiers are transitive,
so it depends on the inference of other functions.
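A small sketch of that transitivity (assuming an ordinary, non-template
helper, whose attributes are not inferred): checking f's body alone is not
enough, because the verdict depends on what the callee is allowed to claim.

void helper() {}           // ordinary function: no attributes, none inferred

void f() @nogc nothrow
{
    helper();              // rejected: f's @nogc/nothrow depend on helper's
                           // attributes, not just on f's own body
}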
Andrei Alexandrescu via Digitalmars-d
2014-09-28 12:21:10 UTC
Permalink
Post by Manu via Digitalmars-d
{
void localFunc()
{
}
localFunc();
}
Doesn't complain about pure though.
Is it reasonable to say that the scope of the outer function is
nothrow+@nogc, and therefore everything declared within should also be so?
Interesting. I'd guess probably not, e.g. a function may define a static
local function and return its address (without either throwing or
creating garbage), whereas that local function itself may do whatever it
pleases.
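A minimal sketch of that case (illustrative only): the outer function merely
hands out the address, so it can stay pure nothrow @nogc even though the
local function allocates and may throw.

void function() f() pure nothrow @nogc
{
    static void localFunc()
    {
        auto a = new int[](10);   // allocates and may throw; f never calls it
    }
    return &localFunc;            // taking the address is fine in pure nothrow @nogc
}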

However, local functions have their body available by definition, so they
should have all deducible attributes deduced. That should take care of
the problem.


Andrei

P.S. I also notice that my latest attempt at establishing communication
has remained ignored.
Manu via Digitalmars-d
2014-09-29 00:38:28 UTC
Permalink
On 28 September 2014 22:21, Andrei Alexandrescu via Digitalmars-d
Post by Andrei Alexandrescu via Digitalmars-d
Post by Manu via Digitalmars-d
{
void localFunc()
{
}
localFunc();
}
Doesn't complain about pure though.
Is it reasonable to say that the scope of the outer function is
nothrow+@nogc, and therefore everything declared within should also be so?
Interesting. I'd guess probably not, e.g. a function may define a static
local function and return its address (without either throwing or creating
garbage), whereas that local function itself may do whatever it pleases.
However, local functions have their body available by definition so they
should have all deducible attributes deducted. That should take care of the
problem.
Andrei
P.S. I also notice that my latest attempt at establishing communication has
remained ignored.
I was out of town (was on my phone), and now I'm home with 2 guests,
and we're working together. I can't sit and craft a pile of example
cases until I'm alone and have time to do so. I haven't ignored it,
but I need to find the time to give you what you want.

That said, my friend encountered one of my frequently recurring pain
cases himself yesterday:
struct S(T...)
{
    void f(T args) {}
}

S!(int, ref S) fail; // <-- no clean way to do this. I need this very
                     // frequently, and he reached for it too, so I can't be that weird.
Walter Bright via Digitalmars-d
2014-09-29 00:51:25 UTC
Permalink
Post by Manu via Digitalmars-d
That said, my friend encountered one of my frequently recurring pain
struct S(T...)
{
void f(T args) {}
}
S!(int, ref S) fail; // <-- no clean way to do this. I need this very
frequently, and he reached for it too, so I can't be that weird.
S!(int, S*)
Manu via Digitalmars-d
2014-09-29 01:31:40 UTC
Permalink
On 29 September 2014 10:51, Walter Bright via Digitalmars-d
Post by Walter Bright via Digitalmars-d
Post by Manu via Digitalmars-d
That said, my friend encountered one of my frequently recurring pain
struct S(T...)
{
void f(T args) {}
}
S!(int, ref S) fail; // <-- no clean way to do this. I need this very
frequently, and he reached for it too, so I can't be that weird.
S!(int, S*)
That's different.

I feel like I have to somehow justify to you guys how meta code works
in D. I have meta code that is no less than 5 layers deep. It's
complex, but at the same time, somehow surprisingly elegant and simple
(this is the nature of D I guess).
If I now assume throughout my meta "pointer means ref", then when I
actually pass a pointer in, the meta can't know if it was meant to be
a ref or not. It results in complex explicit logic to handle at almost
every point due to a loss of information.

You can't call f() with the same syntax anymore (you need an '&'),
which is a static if in the meta; you can't use the S* arg in the same
meta (it needs a '*'), which is another static if. Assignments are
changed, and unexpected indexing mechanics appear. When implementation
logic expects and understands the distinction between pointers and
refs, this confuses that logic. When I interface between languages
(everything I ever do binds to at least C++, and in this case, also
Lua), this complicates the situation.

I can't conflate 2 things that aren't the same. It leads to a lot of
mess in a lot of places.
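A rough sketch of the kind of branching being described (hypothetical names,
not Manu's actual code): once a pointer stands in for ref, the meta code
needs a static if at each use site just to recover the intent, and the caller
needs '&' where it previously didn't.

import std.traits : isPointer;

struct Binding(T...)
{
    void f(T args)
    {
        foreach (i, A; T)
        {
            static if (isPointer!A)
            {
                // Was this "really" a ref, or a genuine pointer? The
                // information is gone, so every use site has to branch.
                auto v = *args[i];
            }
            else
            {
                auto v = args[i];
            }
        }
    }
}

void main()
{
    Binding!(int, int*) b;
    int x;
    b.f(1, &x);   // caller now needs '&' where a ref parameter would not
}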
Walter Bright via Digitalmars-d
2014-09-29 02:43:17 UTC
Permalink
Post by Manu via Digitalmars-d
Post by Walter Bright via Digitalmars-d
S!(int, S*)
That's different.
I feel like I have to somehow justify to you guys how meta code works
in D. I have meta code that is no less than 5 layers deep. It's
complex, but at the same time, somehow surprisingly elegant and simple
(this is the nature of D I guess).
If I now assume throughout my meta "pointer means ref", then when I
actually pass a pointer in, the meta can't know if it was meant to be
a ref or not. It results in complex explicit logic to handle at almost
every point due to a loss of information.
You can't call f() with the same syntax anymore (you need an '&')
which is a static if in the meta, you can't use the S* arg in the same
meta (needs a '*') which is another static if. Assignments are
changed, and unexpected indexing mechanics appear. When implementation
logic expects and understands the distinction between pointers and
ref's, this confuses that logic. When I interface between languages
(everything I never do binds to at least C++, and in this case, also
Lua), this complicates the situation.
I can't conflate 2 things that aren't the same. It leads to a lot of
mess in a lot of places.
You're right that tuples in D cannot contain storage classes (and ref is just
one storage class, there's also out and in, etc.).

You can use auto ref, but I haven't understood why that doesn't work for you.
Timon Gehr via Digitalmars-d
2014-09-29 04:23:07 UTC
Permalink
Post by Walter Bright via Digitalmars-d
You're right that tuples in D cannot contain storage classes
import std.traits : ParameterTypeTuple;

void foo(ref int x){}
alias p = ParameterTypeTuple!foo;
pragma(msg, p); // (ref int)

(But this does not help.)
Walter Bright via Digitalmars-d
2014-09-29 05:16:09 UTC
Permalink
Post by Timon Gehr via Digitalmars-d
Post by Walter Bright via Digitalmars-d
You're right that tuples in D cannot contain storage classes
void foo(ref int x){}
alias p=ParameterTypeTuple!foo;
pragma(msg, p); // (ref int)
(But this does not help.)
You're right, I had forgotten about that.
Daniel N via Digitalmars-d
2014-09-30 06:06:44 UTC
Permalink
Post by Timon Gehr via Digitalmars-d
Post by Walter Bright via Digitalmars-d
You're right that tuples in D cannot contain storage classes
void foo(ref int x){}
alias p=ParameterTypeTuple!foo;
pragma(msg, p); // (ref int)
(But this does not help.)
Well, only if you are sufficiently desperate. ;)

import std.traits : ParameterTypeTuple;

struct S(alias T)
{
    void f(ParameterTypeTuple!T p)
    {
    }
}

S!((ref int x, int y){}) s;
Manu via Digitalmars-d
2014-09-30 07:07:14 UTC
Permalink
On 30 September 2014 16:06, Daniel N via Digitalmars-d
Post by Daniel N via Digitalmars-d
Post by Timon Gehr via Digitalmars-d
Post by Walter Bright via Digitalmars-d
You're right that tuples in D cannot contain storage classes
void foo(ref int x){}
alias p=ParameterTypeTuple!foo;
pragma(msg, p); // (ref int)
(But this does not help.)
Well, only if you are sufficiently desperate. ;)
struct S(alias T)
{
void f(ParameterTypeTuple!T p)
{
}
}
S!((ref int x, int y){}) s;
I have actually thought of that ;) ... but I tend to think that only D
users present on this forum are likely to make sense of that code, and why.
deadalnix via Digitalmars-d
2014-09-30 07:18:17 UTC
Permalink
On Tuesday, 30 September 2014 at 07:10:08 UTC, Manu via
Post by Manu via Digitalmars-d
I have actually thought of that ;) ... but I tend to think that only D
users present on this forum are likely to make sense of that code, and why.
You can probably wrap this in a nice library solution with an
explicit name.
Daniel N via Digitalmars-d
2014-09-30 07:48:59 UTC
Permalink
Post by deadalnix via Digitalmars-d
On Tuesday, 30 September 2014 at 07:10:08 UTC, Manu via
Post by Manu via Digitalmars-d
I have actually thought of that ;) ... but I tend to think that only D
users present on this forum are likely to make sense of that code, and why.
You can probably wrap this in a nice library solution with an
explicit name.
Hmm, I see your point... otherwise maybe user-facing code could survive
using a string?

import std.traits : ParameterTypeTuple;

struct S_impl(alias T)
{
    void f(ParameterTypeTuple!T p)
    {
    }
}

template S(string decl)
{
    mixin("alias S = S_impl!((" ~ decl ~ ") {});");
}

S!"ref int x, int y" s;
John Colvin via Digitalmars-d
2014-09-30 13:47:21 UTC
Permalink
On Tuesday, 30 September 2014 at 07:10:08 UTC, Manu via
Post by Manu via Digitalmars-d
On 30 September 2014 16:06, Daniel N via Digitalmars-d
Post by Daniel N via Digitalmars-d
Post by Timon Gehr via Digitalmars-d
Post by Walter Bright via Digitalmars-d
You're right that tuples in D cannot contain storage classes
void foo(ref int x){}
alias p=ParameterTypeTuple!foo;
pragma(msg, p); // (ref int)
(But this does not help.)
Well, only if you are sufficiently desperate. ;)
struct S(alias T)
{
void f(ParameterTypeTuple!T p)
{
}
}
S!((ref int x, int y){}) s;
I have actually thought of that ;) ... but I tend to think that only D
users present on this forum are likely to make sense of that code, and why.
Perhaps this might help you a little:

http://code.dlang.org/packages/storageclassutils

sure, it's not as elegant as one would like, but it at least
provides some basic utility.
John Colvin via Digitalmars-d
2014-09-30 13:58:51 UTC
Permalink
Post by deadalnix via Digitalmars-d
On Tuesday, 30 September 2014 at 07:10:08 UTC, Manu via
Post by Manu via Digitalmars-d
On 30 September 2014 16:06, Daniel N via Digitalmars-d
On Monday, 29 September 2014 at 04:23:08 UTC, Timon Gehr
Post by Timon Gehr via Digitalmars-d
Post by Walter Bright via Digitalmars-d
You're right that tuples in D cannot contain storage classes
void foo(ref int x){}
alias p=ParameterTypeTuple!foo;
pragma(msg, p); // (ref int)
(But this does not help.)
Well, only if you are sufficiently desperate. ;)
struct S(alias T)
{
void f(ParameterTypeTuple!T p)
{
}
}
S!((ref int x, int y){}) s;
I have actually thought of that ;) ... but I tend to think that only D
users present on this forum are likely to make sense of that code, and why.
http://code.dlang.org/packages/storageclassutils
sure, it's not as elegant as one would like, but it at least
provides some basic utility.
Also, I just wrote it in a few hours after seeing your post, so
obviously there are plenty of improvements that could be made.
Manu via Digitalmars-d
2014-09-29 21:31:50 UTC
Permalink
On 29 September 2014 12:43, Walter Bright via Digitalmars-d
Post by Walter Bright via Digitalmars-d
Post by Manu via Digitalmars-d
Post by Walter Bright via Digitalmars-d
S!(int, S*)
That's different.
I feel like I have to somehow justify to you guys how meta code works
in D. I have meta code that is no less than 5 layers deep. It's
complex, but at the same time, somehow surprisingly elegant and simple
(this is the nature of D I guess).
If I now assume throughout my meta "pointer means ref", then when I
actually pass a pointer in, the meta can't know if it was meant to be
a ref or not. It results in complex explicit logic to handle at almost
every point due to a loss of information.
You can't call f() with the same syntax anymore (you need an '&')
which is a static if in the meta, you can't use the S* arg in the same
meta (needs a '*') which is another static if. Assignments are
changed, and unexpected indexing mechanics appear. When implementation
logic expects and understands the distinction between pointers and
ref's, this confuses that logic. When I interface between languages
(everything I never do binds to at least C++, and in this case, also
Lua), this complicates the situation.
I can't conflate 2 things that aren't the same. It leads to a lot of
mess in a lot of places.
You're right that tuples in D cannot contain storage classes (and ref is
just one storage class, there's also out and in, etc.).
I know, that's my whole point. I think 'storage class' is the original sin.
in is declared as scope const, or in my fantasy world scope(const(T)).

out is the only interesting storage class (conceptually) I've used. It
is effectively just an alias for ref, but it also encodes some special
non-type information, which is to initialise prior to the call.
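A tiny sketch of that "initialise prior to the call" behaviour (plain D,
nothing hypothetical): an out parameter is passed by reference like ref,
but is reset to .init on entry to the callee.

void fetch(out int result)
{
    // result is already int.init (0) here, regardless of what the caller passed
    result = 42;
}

void main()
{
    int x = 7;
    fetch(x);
    assert(x == 42); // had fetch not assigned, x would have been 0, not 7
}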
Post by Walter Bright via Digitalmars-d
You can use auto ref, but I haven't understood why that doesn't work for you.
I'm not writing Java or Python code here. I consider it a fundamental
quality of a 'systems language', or perhaps just a native language,
that I have the explicit power to produce the binary I intend.
I can't have the compiler deciding if something is to be ref or not.
My code is rarely only consumed by other D code. I have lots of
cross-language linkage, and also dynamic libraries (which expect a
specific ABI).

auto ref wouldn't work in that situation I gave above anyway...
S!(auto ref T) is no more valid than S!(ref T).
Daniel N via Digitalmars-d
2014-10-11 05:09:08 UTC
Permalink
Post by Walter Bright via Digitalmars-d
Post by Manu via Digitalmars-d
That said, my friend encountered one of my frequently recurring pain
struct S(T...)
{
    void f(T args) {}
}
S!(int, ref S) fail; // <-- no clean way to do this. I need this very
                     // frequently, and he reached for it too, so I can't be that weird.
S!(int, S*)
Yet another solution...

http://dpaste.dzfl.pl/e28c55416ce2

... locally you always use ref, but when instantiating another
template you need to forward the original pointer type to the
next level... this way it's possible to avoid an explosion of
'static if's.

Andrei Alexandrescu via Digitalmars-d
2014-09-29 08:02:43 UTC
Permalink
Post by Manu via Digitalmars-d
I was out of town (was on my phone), and now I'm home with 2 guests,
and we're working together. I can't sit and craft a pile of example
cases until I'm alone and have time to do so. I haven't ignored it,
but I need to find the time to give you what you want.
Thanks, don't feel under any obligation, express or implied, to follow
through.
Post by Manu via Digitalmars-d
That said, my friend encountered one of my frequently recurring pain
struct S(T...)
{
void f(T args) {}
}
S!(int, ref S) fail; // <-- no clean way to do this. I need this very
frequently, and he reached for it too, so I can't be that weird.
I understand. The short answer to this is D cannot do that and we cannot
afford to change the language to make it do that.

There are two longer answers.

The first longer answer is there are ways to do that if it's a
necessity, as you know and probably did. I agree it's not clean/easy.

The second longer answer is if you do this it means you're trying to
write C++ in D: in C++ ref is part of the type (well... sort of) and
therefore someone coming from C++ and translating the respective designs
into D will find the change frustrating.

I've seen this pattern several times. Most recent was in conjunction
with iterators vs. ranges. There's discussion going on in C++ circles
about adding ranges to C++. One common issue raised by people gravitates
around designs where the choice of using iterators is long foregone, and the
assumption that replacing iterators with ranges throughout should just work.
Take a look:
http://article.gmane.org/gmane.comp.lib.boost.devel/191978

The reality is algorithms implemented with ranges look different from
algorithms implemented with iterators. Similarly, generic code with D
will look different from generic code with C++. Our hope is, of course,
that all told range-based and D-generics code has advantages going for
it, but the most dramatic of those advantages most certainly won't be
discovered and reaped in designs that are copies of the respective C++ ones.

So you can't be that weird and neither is your friend, but I speculate
that both of you have a solid C++ background :o).

About this situation in particular, yes, ref being part of the type does
help this declaration, but overall its value as a design decision for the
C++ programming language is questionable. T& is not a first-class type and
that hurts everyone everywhere - you can't create a value of that type, and
virtually all parts of the language have special rules telling what happens
when the actual type is a reference type. I think Walter made the right
call here and we shouldn't change anything.


Andrei
deadalnix via Digitalmars-d
2014-09-29 23:25:49 UTC
Permalink
On Monday, 29 September 2014 at 08:02:45 UTC, Andrei Alexandrescu
Post by Andrei Alexandrescu via Digitalmars-d
I understand. The short answer to this is D cannot do that and
we cannot afford to change the language to make it do that.
There are two longer answers.
I think this is because ref has several conflated meanings:
- "I want to mess with the argument" (à la swap). This is the
meaning it has right now.
- "Borrowing". This is the same as the previous behaviour, except
for classes and delegates (so the whole scope story), as they are
natural "reference types".
- "Do not copy", aka const ref. This one is for performance
reasons. It doesn't really matter if a copy is made or not, as the
damn thing is const, but one wants to avoid expensive copies when
the thing passed down is fat.

Each of them has its own set of cases where you want and do
not want ref.
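For concreteness, a small sketch of the three uses listed above (ordinary D,
purely illustrative):

struct Fat { ubyte[4096] payload; }

// 1. "Mess with the argument" (à la swap)
void swap(ref int a, ref int b) { auto t = a; a = b; b = t; }

// 2. "Borrowing": give the callee temporary access to one instance for the
//    duration of the call, without copying it and without handing it over
void update(ref Fat f) { f.payload[0] = 1; }

// 3. "Do not copy", aka const ref, purely to avoid copying something fat
size_t total(const ref Fat f) { return f.payload.length; }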
Manu via Digitalmars-d
2014-09-30 00:32:59 UTC
Permalink
On 30 September 2014 09:25, deadalnix via Digitalmars-d
Post by deadalnix via Digitalmars-d
On Monday, 29 September 2014 at 08:02:45 UTC, Andrei Alexandrescu
Post by Andrei Alexandrescu via Digitalmars-d
I understand. The short answer to this is D cannot do that and we cannot
afford to change the language to make it do that.
There are two longer answers.
- "I want to mess up with the argument" (à la swap). This is the
meaning it has right now.
- "Burrowing". Which is the same as previous behavior, except
for classes and delegates, so the whole scope story, as they are
natural "reference types".
- "Do not copy" aka const ref. This one is for performance
reason. It doesn't really matter if a copy is made or not as the
damn thing is const, but one want to avoid expensive copy when
the thing passed down is fat.
I don't see ref that way at all. I see it as much simpler than that:
ref is a type of pointer. It's effectively T(immutable(*)).
Its uses are emergent from what it is; a good way to pass big things
around in argument lists, or to share references to a single instance of
something.
It also offers an advantage due to the pointer itself being immutable;
you don't need the pointer syntax baggage (*, &) when dealing with
refs, which is very convenient in user code.

I think all this 'meaning' is a mistake; all that does is confuse the
matter. You don't attribute such 'meaning' to a normal pointer; it's a
primitive type.
If it's defined by some conceptual rubbish, then when you attempt to
use it in some situation that doesn't perfectly fit the conceptual
attribution, you find yourself in awkward logical situations with
weird edge cases.
If you just say "it's an immutable pointer, and follows value syntax
semantics for practical reasons", then it's very clear what it does.
It's also very clear what that means to the ABI, and relationships
with other languages.
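A small sketch of that framing (plain D, just illustrative): through a ref
you use value syntax, and there is no way to re-point it, which is the
"immutable pointer" part.

void f(ref int x, int* p)
{
    x = 5;      // value syntax straight through the reference; no '*' anywhere
    *p = 5;     // the pointer needs its syntax baggage
    p = &x;     // a pointer can be re-aimed at something else...
    // ...but there is no syntax to make x refer to a different variable:
    // x aliases the lvalue it was bound to at the call site.
}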

A similar mistake from C is where 'int' doesn't mean '32bits', it
means some conceptual nonsense that seemed like a good idea at the
time, but the result is, you don't really know what 'int' is, and
everyone reinvents it so it works reliably (typedef int32 something).
Post by deadalnix via Digitalmars-d
Each of them have their own set of cases where you want and do
not want ref.
I agree, cases where ref should be used are non-trivial. It may be a
result of external factors, some user logic, or explicit request from
user code. The fact that it lives outside the type system makes that
whole reality very awkward indeed.
deadalnix via Digitalmars-d
2014-09-30 00:44:57 UTC
Permalink
On Tuesday, 30 September 2014 at 00:33:08 UTC, Manu via
Post by Manu via Digitalmars-d
I don't see ref that way at all. I see it so much simpler than
ref is a type of pointer. It's effectively T(immutable(*)).
It's uses are emergent from what it is; a good way to pass big things
around in argument lists, or share references to a single instance of
something.
"I don't agree with your diagnostic. I'm suing ref conforming of
cases 2 and 3 of your diagnostic".

That sounds like a self defeating statement.
Manu via Digitalmars-d
2014-09-30 00:55:41 UTC
Permalink
On 30 September 2014 10:44, deadalnix via Digitalmars-d
Post by deadalnix via Digitalmars-d
On Tuesday, 30 September 2014 at 00:33:08 UTC, Manu via
Post by Manu via Digitalmars-d
ref is a type of pointer. It's effectively T(immutable(*)).
It's uses are emergent from what it is; a good way to pass big things
around in argument lists, or share references to a single instance of
something.
"I don't agree with your diagnostic. I'm suing ref conforming of
cases 2 and 3 of your diagnostic".
That sounds like a self defeating statement.
... huh?

I'm not saying your analysis of the use cases is false, I'm just saying
that I feel the reasoning is backwards, and possibly conceptually
limiting. I think the use cases should be emergent from the definition,
not that the definition should be defined by (read: limited to) the
typical uses.

You said ref conflates several meanings; I'm just framing it
differently. I don't think it conflates them, I think they're all good
uses of the tool.
Manu via Digitalmars-d
2014-09-30 00:03:27 UTC
Permalink
On 29 September 2014 18:02, Andrei Alexandrescu via Digitalmars-d
Post by Andrei Alexandrescu via Digitalmars-d
Post by Manu via Digitalmars-d
I was out of town (was on my phone), and now I'm home with 2 guests,
and we're working together. I can't sit and craft a pile of example
cases until I'm alone and have time to do so. I haven't ignored it,
but I need to find the time to give you what you want.
Thanks, don't feel under any obligation express or implied to follow
through.
Well, after your unflattering caricature, I intend to take the time to
try and craft some good examples of various cases.
I also have a strong suspicion (valid or otherwise) that there is
literally nothing I could do to convince you, no matter the quality of
my evidence.
There will always be the "it's too late", "it's a breaking change!",
defence. Unless you and Walter are both invested in a change, I'm
pretty certain it's impossible.
So, your observation may well have been fair. Perhaps that is a true
pattern of mine, but I don't think 'I'm just a dick like that', I
think there is reason behind it.

There are certainly times in the past where I've gone to great lengths
to try and prove my cases. In more recent times, and as my (few)
outstanding daily problems with the language have become more
controversial, I've become fairly convinced that such effort on my
part is just wasting my time, and only serves as a process for me to
highlight my frustrations to myself.
You obviously don't approach programming the same way I do; I can't
appeal to you.

Upon closer examination of my mood and feelings interacting with this
forum recently, I realise that this is a significant factor in my
interaction, although perhaps being mostly subconscious. Ie, I'm more
likely to disappear and try and get some work done, than spend the
time trying to win a futile argument.
I did try very hard to extract myself from this NG, but it seems it's
very hard to resist! When topics I care about appear, or when I'm just
wildly frustrated about something that agitates me on a daily basis, I
keep coming back! >_<

Point is, I feel like I've been an engaged member of this community
for a long time now, but I feel like I have practically nothing more
to add.
My experience and industry use cases are no longer interesting;
they've already presented whatever value they had. I get the feeling
I've effected all of the change that I am capable of, and I need to
decide if I'm happy with D as is, or not, rather than maintain my
ambient frustration that it's sitting at 99%, with the last 1%
unattainable to me unless I fork the language for my own use :/

Trouble for me is, I've invested so much time now. I find myself in a
very awkward situation where I'm too far in... I can't go back to C++,
but the situation is such that I'll never convince the rest of my
colleagues to jump on board. Believe it or not, I'm a lot more
reasonable (and patient) than most.


I'm quite certain that the thing Scott Meyers mentioned about the C++
committee (where practically anything can be used to justify anything,
so getting changes approved is mostly an appeal to the emotional state
of the people in charge) is already present in D these days.
Post by Andrei Alexandrescu via Digitalmars-d
Post by Manu via Digitalmars-d
That said, my friend encountered one of my frequently recurring pain
struct S(T...)
{
void f(T args) {}
}
S!(int, ref S) fail; // <-- no clean way to do this. I need this very
frequently, and he reached for it too, so I can't be that weird.
I understand. The short answer to this is D cannot do that and we cannot
afford to change the language to make it do that.
There are two longer answers.
The first longer answer is there are ways to do that if it's a necessity, as
you know and probably did. I agree it's not clean/easy.
The second longer answer is if you do this it means you're trying to write
C++ in D: in C++ ref is part of the type (well... sort of) and therefore
someone coming from C++ and translating the respective designs into D will
find the change frustrating.
What about someone *interacting* with C++, not just 'coming from'?
I also interact with Lua and C# a lot. Lua is ref by nature, C# also
uses ref a lot.
In 6 years, the only time I've found an opportunity to use D in
isolation, is a small webserver in vibe.d.

My opinion is that ref is agreed to be critically important, as
demonstrated to me by all the other languages I use.
Post by Andrei Alexandrescu via Digitalmars-d
I've seen this pattern several times. Most recent was in conjunction with
iterators vs. ranges. There's discussion going on in C++ circles about
adding ranges to C++. One common issue raised by people gravitates around
designs where the choice of using iterators is long foregone, and would
think that replacing iterators with ranges throughout should just work. Take
a look: http://article.gmane.org/gmane.comp.lib.boost.devel/191978
The reality is algorithms implemented with ranges look different from
algorithms implemented with iterators. Similarly, generic code with D will
look different from generic code with C++. Our hope is, of course, that all
told range-based and D-generics code has advantages going for it, but the
most dramatic of those advantages most certainly won't be discovered and
reaped in designs that are copies of the respective C++ ones.
I'm not writing ranges or generics. Those things are fine, and they
are also self-contained within D.

I obviously use D completely differently to you, and value an entirely
different set of features.
Trust me, I'm not copying designs from C++, I use D specifically for
use cases where C++ is utterly impotent.

For me, 80% of generic code is about binding things (and a further 15%
is serialisation). As said before, C++, C#, Lua; it is super-common in
gamedev for the ecosystem to be made of (at least) 3 significant
languages.
Bindings are traditionally cumbersome and brittle, and one of the biggest
hassles, frustrations, and sources of time/productivity loss using C++ for
game tech is maintaining such bindings.
D is the only language I know powerful enough to automate that mess.
This is why D was attractive to me, and it is still why D is
attractive to me. I _can't escape ref_ when writing this sort of code,
and I write a lot of it.
I don't think it's at all fair to say "I'm using D wrong", or "stop
trying to be a C++ user in D", that's not the case here..


It's also a bit strange to hear you say all this when the key focus of
development at the moment is "C++, GC", "C++, GC".
Post by Andrei Alexandrescu via Digitalmars-d
So you can't be that weird and neither is your friend, but I speculate that
both of you have a solid C++ background :o).
No, I think we're pretty normal, and populous.
Post by Andrei Alexandrescu via Digitalmars-d
About this situation in particular, yes, ref being part of the type does
help this declaration, but overall as a design decision for the C++
programming value its value is questionable. T& is not a first class type
and that hurts everyone everywhere - you can't create a value of that types,
and virtually all parts of the language have special rules telling what
happens when the actual type is a reference type.
You're saying T& is not a first-class type and hurts everyone
everywhere, I agree, and D is no different, except it is even worse.
I'd like to explore ref as a first-class type. Do you have reason to
believe that's completely unworkable? I'm not suggesting to clone
C++'s design. I think there's room for a design that improves on C++.
Post by Andrei Alexandrescu via Digitalmars-d
I think Walter made the right call here and we shouldn't change anything.
This pretty much validates my opening suspicion. What do I do? STFU
and deal with it?

I suspect you'd think very differently if you worked with the people I
do, had the conversations with colleagues I have, and struggled with
the code I have for as long as I have. I dread to think how much ref
has cost me in $/hr terms.
You make ref sound worthless. Is there actually any situation where
you find value in ref?
If it's meaningless to you, and so 'un-D', why not let it have useful
meaning for those who find it significant, important even? :/
bearophile via Digitalmars-d
2014-09-30 00:29:24 UTC
Permalink
Post by Manu via Digitalmars-d
Trouble for me is, I've invested so much time now.
If you think your choice was the wrong one, don't invest even
more in something you think is wasted effort. Otherwise, if you
like D, then try to improve it from the inside, writing
dmd/Phobos/druntime pull requests, instead of doing it from the
outside.
Post by Manu via Digitalmars-d
I find myself in a very awkward situation where I'm too far
in... I can't go back to C++,
Have you taken a look at Rust?

Bye,
bearophile
Manu via Digitalmars-d
2014-09-30 01:02:36 UTC
Permalink
On 30 September 2014 10:29, bearophile via Digitalmars-d
Post by Manu via Digitalmars-d
Trouble for me is, I've invested so much time now.
If your think your choice was the wrong one, don't invest even more in
something you think is wasted effort.
It's not to say it's the 'wrong choice'. I'm definitely an early
adopter by nature, and in the case of D, I backed the only horse that
I found realistic to solve my industry's chronic abuse.

Perhaps I was being unrealistic when I thought I'd be able to get more
colleagues on board than I have?
It's just super annoying when the things that send them running are so
bloody trivial! (although, apparently important)
In the case of ref, I can't think of any programmers that I've
introduced to D that haven't complained about ref within their first
hour or 2 of interaction.
I certainly hit the wall with ref within hours of contact with D, and
6 years later, it's exactly as bad as it was within those first few
hours.

The biggest issue inhibiting people getting on board though, by far,
is the debugging experience. #1 issue, hands down.
Practical issues > language issues.
Otherwise if you like D, then try to
improve it from the inside, writing dmd/Phobos/druntime pull requests,
instead of doing it from the outside.
I'd never have my PRs pulled.

I'm also not as interested in language development as it might appear.
I'm interested in writing code and getting work done, and minimising
friction.
I'm interested in more efficient ways to get my work done, and also
opportunities to write more efficient code, but that doesn't mean I
want to stop doing my work and instead work on HOW I do my work.
Post by Manu via Digitalmars-d
I find myself in a very awkward situation where I'm too far
in... I can't go back to C++,
Have you taken a look at Rust?
Yeah, it's just too weird for me to find realistic. It also more
rigidly asserts its opinions on you, which are, in many cases, not
optimal. Rust typically shows a performance disadvantage, which I care
about.
Perhaps more importantly, for practical reasons, I can't ever imagine
convincing a studio of hundreds of programmers to switch to Rust. C++
programmers can learn D by osmosis, but the staff retraining burden to
move to Rust seems completely unrealistic to me.
bearophile via Digitalmars-d
2014-09-30 01:29:13 UTC
Permalink
Post by Manu via Digitalmars-d
In the case of ref, I can't think of any programmers that I've
introduced to D that haven't complained about ref within their
first
hour or 2 of interaction.
Most of the time I have no problems with D's ref. Perhaps you are
trying to use D too much like you use C++.
Post by Manu via Digitalmars-d
I'd never have my PR's pulled.
"Working from the inside" also means writing patches that have a
sufficiently high probability of getting pulled after some
changes and improvements.
Post by Manu via Digitalmars-d
I'm also not as interested in language development as it might
appear.
Yet you discuss language design all the time. I've
discussed a lot about D, but often the topics that I have a bit
more reliable opinions on are only the ones where I have direct
experience (like Ranges). That's why I have suggested writing
patches. With them you may be able to refine your own opinions
about D.
Post by Manu via Digitalmars-d
Yeah, it's just too weird for me to find realistic.
I don't see it as that weird. It's a lot like C crossed with
the most obvious and simplified parts of ML, plus memory-area
tracking, and small bits from Erlang and C++ and little more. It
contains only small amounts of OOP, exceptions and GC, and
currently its generics are still first-order only, so it looks
simple and cohesive. I think an average programmer can learn it
well enough to be productive for single-threaded user code (not for
library code) in two or three months or less. But I think you
should not even try to use it as you use Ada (unlike D).

Bye,
bearophile
ixid via Digitalmars-d
2014-09-30 09:13:00 UTC
Permalink
Otherwise if you like D, then try to improve it from the inside, writing
dmd/Phobos/druntime pull requests, instead of doing it from the outside.
I'd never have my PR's pulled.
I'm also not as interested in language development as it might appear.
I'm interested in writing code and getting work done, and minimising friction.
I'm interested in more efficient ways to get my work done, and also
opportunities to write more efficient code, but that doesn't mean I
want to stop doing my work and instead work on HOW I do my work.
Post by Manu via Digitalmars-d
I find myself in a very awkward situation where I'm too far in... I can't
go back to C++,
Have you taken a look at Rust?
Yeah, it's just too weird for me to find realistic. It also more rigidly
asserts it's opinions on you, which are in many cases, not optimal. Rust
typically shows a performance disadvantage, which I care about.
Perhaps more importantly, for practical reasons, I can't ever imagine
convincing a studio of hundreds of programmers to switch to rust. C++
programmers can learn D by osmosis, but staff retraining burden to
move to Rust seems completely unrealistic to me.
You're a vital alternative voice; please try to stick with us.
The interest your talk and presence generated for D was huge, and
the games industry should be a major target for D. I also suspect
Andrei is doing a major project at the moment which is making him
uncharacteristically harsh in his responses: from his POV he's
doing something massive to help D while the community has gone
into a negative mode.

I'm surprised at how little importance insufficient control over
ref seems to be given, though my understanding is pretty basic.
It feels a little similar to inlining.

It might be an effective argument to give bearophile some of the
problematic code and see what his idiomatic D version looks like
and if what you're after is elegantly achievable. Clunky code
would seem like a stronger argument at this point after many
words have been exchanged. I think people are not really aware of
the issues and if they believe the things are truly achievable
with the language as it stands they can demonstrate it, with
benchmarks etc.
Sad panda via Digitalmars-d
2014-09-30 09:52:44 UTC
Permalink
Post by ixid via Digitalmars-d
Otherwise if you like D, then try to improve it from the inside, writing
dmd/Phobos/druntime pull requests, instead of doing it from the outside.
I'd never have my PR's pulled.
I'm also not as interested in language development as it might appear.
I'm interested in writing code and getting work done, and minimising friction.
I'm interested in more efficient ways to get my work done, and also
opportunities to write more efficient code, but that doesn't mean I
want to stop doing my work and instead work on HOW I do my work.
Post by Manu via Digitalmars-d
I find myself in a very awkward situation where I'm too far in... I can't
go back to C++,
Have you taken a look at Rust?
Yeah, it's just too weird for me to find realistic. It also more rigidly
asserts it's opinions on you, which are in many cases, not optimal. Rust
typically shows a performance disadvantage, which I care about.
Perhaps more importantly, for practical reasons, I can't ever imagine
convincing a studio of hundreds of programmers to switch to rust. C++
programmers can learn D by osmosis, but staff retraining burden to
move to Rust seems completely unrealistic to me.
You're a vital alternative voice, please try to stick with us.
The interest your talk and presence generated for D was huge and the
games industry should be a major target for D. I also suspect Andrei
is doing a major project at the moment which is making him
uncharacteristically harsh in his responses, from his POV he's doing
something massive to help D while the community has gone into a
negative mode.
+10 <3

Pardon the pandering, but I actually see Andrei, Walter and you as
making up the trinity of idealism, codegen pragmatism, and
industry use respectively.
Andrei Alexandrescu via Digitalmars-d
2014-09-30 12:30:26 UTC
Permalink
from his POV he's doing something massive to help D while the community
has gone into a negative mode.
Indeed that's an accurate characterization of my POV. -- Andrei
bearophile via Digitalmars-d
2014-09-30 12:58:35 UTC
Permalink
Post by ixid via Digitalmars-d
It might be an effective argument to give bearophile some of
the problematic code and see what his idiomatic D version looks
like and if what you're after is elegantly achievable.
Manu is quite a bit more expert than me in the kind of code he writes.
So what you propose is just going to show my limits/ignorance...

Bye,
bearophile
bachmeier via Digitalmars-d
2014-09-30 14:38:48 UTC
Permalink
I also suspect Andrei is doing a major project at the moment
which is making him uncharacteristically harsh in his
responses, from his POV he's doing something massive to help D
while the community has gone into a negative mode.
There are only two kinds of languages: the ones people complain
about and the ones nobody uses. The "negative mode" will likely
become more negative as D continues to grow in popularity.
Walter Bright via Digitalmars-d
2014-09-30 22:34:25 UTC
Permalink
Post by ixid via Digitalmars-d
It might be an effective argument to give bearophile some of the
problematic code and see what his idiomatic D version looks like and if
what you're after is elegantly achievable.
Or heck, ask the n.g. Lots of people here are very creative in their
solutions to various D problems.

You've shown me code that is essentially "I want to do XYZ with ref" but
it's still at a low level - step up a layer or two.
via Digitalmars-d
2014-09-30 11:45:35 UTC
Permalink
On Tuesday, 30 September 2014 at 00:03:40 UTC, Manu via
Post by Manu via Digitalmars-d
I've affected all of the change that I am capable of, and I need to
decide if I'm happy with D as is, or not, rather than maintain my
ambient frustration that it's sitting at 99%, with the last 1%
unattainable to me unless I fork the language for my own use :/
Try to add some rudimentary support for ref in the compiler
yourself?

It is not a big semantic change (it is only pointers after all).
Wyatt via Digitalmars-d
2014-09-30 13:08:55 UTC
Permalink
On Tuesday, 30 September 2014 at 11:45:37 UTC, Ola Fosheim
Post by via Digitalmars-d
(it is only pointers after all).
Semi-tangential to this discussion, but this bit hits on
something I've been thinking for a little while... ref is, at its
core, trying to be a non-nullable pointer. And I get the strong
sense that it's failing at it.

-Wyatt
via Digitalmars-d
2014-09-30 13:20:07 UTC
Permalink
Post by Wyatt via Digitalmars-d
Semi-tangential to this discussion, but this bit hits on
something I've been thinking for a little while... ref is, at
its core, trying to be a non-nullable pointer. And I get the
strong sense that it's failing at it.
That's a very perceptive observation. :)

So you also get a sense that D could have non-nullable reference
semantics for a ref-type?

And possibly extend it to be assignable as a
const-mutable-reference (pointer) using a dedicated operator or
using the more ugly "&" notation or a cast?

In Simula you had two assignment operators that made the
ref/value issue visually distinct.

a :- b; // a points to b
x := y; // x is assigned the value of y

It is analogous to the "==" vs "is" distinction that many
languages make. I think it makes sense to have that visual
distinction. Makes code easier to browse.