??= has an oddity with inference #3031

I honestly don't know what's happening here, but somehow, when applying `??=` to a nullable variable with an inferred generic method (in this case, `.fold`), it infers the type to be nullable when it obviously is not. See below:
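The example in question (reproduced verbatim from where it is quoted in the comments below):

```dart
int? value;

int sum(List<int> values) {
  return value ??= values.fold(0, (a, b) => a + b);
}
```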
Comments

It's probably not wrong, but it is slightly annoying. The reason is the context type used for the second operand of `??=`. The context type of the second operand of `value ??= e` is the type of `value`, which is `int?`. If you write `value ??= values.fold(0, (a, b) => a + b);`, the `fold` invocation gets the context type `int?`, so its type argument is inferred as `int?`, and the combiner `(a, b) => a + b` fails because `a` then has type `int?`. If you write the whole thing in a context expecting `int`, as in the `return` statement here, the outer context doesn't help; the inner context type still wins. That comes back to bite us with `fold`, which picks its type argument from the context type.

So, generally, inference for assignments (composite or not) in a context where the value is also used will use the type of the assigned variable as the context type, and then worry later about whether the resulting type can be used in the outer context.

Is it something we can improve? We could use a greatest(ish) lower bound of the context types. I'd probably only do that if one type is a subtype of the other. But then the result should be valid for all the contexts. (A kind of context promotion?)

(I'm a little surprised that the context type doesn't seem to be the parameter type of the setter, for a getter/setter pair; instead it's just the type of the getter, even when the two types differ.)
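For illustration, a sketch of the workaround this analysis suggests: pass `fold`'s type argument explicitly, so the nullable context type no longer drives inference.

```dart
int? value;

int sum(List<int> values) {
  // The right operand's context type is `int?` (the type of `value`),
  // which would make `fold` infer the type argument `int?`. Writing
  // `<int>` explicitly sidesteps that inference, and the static type
  // of the whole `??=` expression becomes `int`.
  return value ??= values.fold<int>(0, (a, b) => a + b);
}
```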
Just happened to browse here and thought I'd pitch in my two cents. @lrhn gave a fantastic explanation of the compiler's reasoning behind how the "context type" of parameters is determined and how they're inferred in their relevant "contexts". I'd like to take all of that and wrap it in one simple example that illustrates this topic very succinctly. Imagine your code,

```dart
int? value;

int sum(List<int> values) {
  return value ??= values.fold(0, (a, b) => a + b);
}
```

rewritten as such,

```dart
int? value;

int sum(List<int> values) {
  value ??= values.fold(0, (a, b) => a + b);
  return value;
}
```

This makes it a bit clearer how Dart is interpreting your code. What is (or rather, what should be) the type of `value` when it is returned? Since `value` is a top-level variable, the assignment on the previous line doesn't promote it, so the returned expression still has type `int?`.

I have a hunch this is similar to the problem with accessing nullable class members. Imagine a scenario like this:

```dart
class StringPrinter {
  String? s;
  StringPrinter(this.s);

  void printLength() {
    if (s != null) {
      print(s.length); // errors
    }
  }
}
```

The compiler will complain that "The property 'length' can't be unconditionally accessed because the receiver can be 'null'. Try making the access conditional (using '?.') or adding a null check to the target ('!')." Basically, this means that even though you've checked that `s` isn't `null`, the compiler can't guarantee it is still non-`null` when it is read again on the next line: `s` is really a getter, and nothing stops it from producing `null` the second time. Only local variables get promoted by such a check.

Hopefully this explanation makes sense to you (and that it's accurate, heh).
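For illustration, a sketch of the usual fix implied by this explanation: copy the field into a local variable, which can be promoted by a null check.

```dart
class StringPrinter {
  String? s;
  StringPrinter(this.s);

  void printLength() {
    final s = this.s; // Locals can be promoted; fields (getters) cannot.
    if (s != null) {
      print(s.length); // OK: `s` is promoted to `String` here.
    }
  }
}
```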
I think the treatment of the context type of an assignment is relevant here. For example:

```dart
X showContextType<X>(Object? o) {
  print(X);
  return o as X;
}

void main() {
  dynamic d;
  int i = d = showContextType(0); // Prints 'dynamic'.
}
```

As we can see, the right hand side of the assignment `d = showContextType(0)` gets the context type `dynamic`, the declared type of `d`, even though the enclosing declaration provides the context type `int`. So we're getting a less precise static typing because we allow the inner assignment's target to determine the context type all by itself.

You could say that this is unnecessary, we should just reorder the code into separate statements, but the point is that the language could do better. We could allow the context type schema of the assignment's own context (here: `int`) to play a role as well. In general, we could consider using a context type schema for the right hand side that is the greatest lower bound of the type of the assigned variable and the context type schema of the assignment expression itself.

@skylon07, you mention the fact that a null check only promotes local variables, which is exactly why the rewritten version still doesn't get an `int` out of `value`. So let's consider an example where we use a local variable to ensure that we faithfully reproduce the semantics of `??=`:

```dart
int intId(int i) => i; // Used to give `fold` the context type `int`.

int sum(List<int> values) {
  var v = value;
  return v != null ? v : value = intId(values.fold(0, (a, b) => a + b));
}
```

This illustrates that we could use the improved context type schema for assignments (here: `int`, obtained from the parameter type of `intId`, rather than `int?`) and the original example would then work.

@stereotype441, WDYT? This would be a breaking change. Also, I mentioned some operations on type schemas that may only be well defined for types (for instance, taking the greatest lower bound). Do you see that as a significant issue?

On the other hand, we're talking about changing a context type schema to a subtype schema, and doing that only in the case where an assignment occurs as a subterm of another expression. Those scenarios might not be very breaking in practice. If it isn't too breaking then it seems to be doing "the right thing", and the original example would compile as expected.
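To see the contrast directly, here is a small variant of the `showContextType` example (a sketch; the printed output is noted in comments):

```dart
X showContextType<X>(Object? o) {
  print(X);
  return o as X;
}

void main() {
  dynamic d;
  int a = showContextType(0);     // Prints 'int': the context is `int`.
  int b = d = showContextType(0); // Prints 'dynamic': the inner
                                  // assignment's target supplies the context.
  print([a, b]);
}
```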
@lrhn said:

> We could use a greatest(ish) lower bound of the context types. I'd probably only do that if one type is a subtype of the other.

@eernstg made a similar suggestion:

> In general, we could consider using a context type schema for the right hand side that is the greatest lower bound of the type of the assigned variable and the context type schema of the assignment expression itself.
Greatest lower bound is fully defined for type schemas. The definition is here: https://github.com/dart-lang/language/blob/main/resources/type-system/inference.md#upper-bound. So no worries about this change not being well-defined. (Thanks @leafpetersen for pointing this out to me when I couldn't find it back in November 😃)

I don't see any issue with these proposals. Personally I would probably go with Erik's suggestion of always using the greatest lower bound (rather than Lasse's more conservative suggestion of only doing it when one context type is a subtype of the other), just because it feels more principled to me. I'm guessing the reason for Lasse's more conservative suggestion is that our GLB algorithm isn't perfect, and doesn't actually produce the greatest bound. But I suspect that it won't come up very often, and besides, since context types are only a suggestion, even if the context is occasionally more specific than it should be, it will probably be extremely rare for that to actually create a problem.

It certainly would be worth prototyping the change and running it through the internal codebase to see if it breaks anything!
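As a toy model of the two proposals (all names and the hard-coded type lattice below are hypothetical, purely for sketching; `glb` stands in for the spec's greatest-lower-bound on type schemas, which a real implementation computes structurally):

```dart
typedef Ty = String; // Types represented by name, for sketching only.

bool isSubtype(Ty s, Ty t) =>
    s == t || (s == 'int' && t == 'int?'); // Tiny hard-coded lattice.

Ty glb(Ty s, Ty t) =>
    isSubtype(s, t) ? s : (isSubtype(t, s) ? t : 'Never');

// Lasse's conservative rule: combine the two candidate contexts only
// when one is a subtype of the other; otherwise keep today's behavior
// (the inner assignment's context wins).
Ty combineConservative(Ty inner, Ty outer) =>
    isSubtype(inner, outer) || isSubtype(outer, inner)
        ? glb(inner, outer)
        : inner;

void main() {
  print(combineConservative('int?', 'int')); // int
  print(glb('List<num>', 'Iterable<int>'));  // Never: the risky case.
}
```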
Yes, I'm very worried about what GLB can come up with, more than LUB. If it becomes `Never`, inference driven by that context produces useless results.

Just consider what would happen with a multiple assignment like `x = y = [1, 2, 3];` where the two variables' types differ.
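For illustration, a sketch of the risky shape: two context types where neither is a subtype of the other, so a too-eager GLB could answer `Never` even though a workable type exists.

```dart
void main() {
  List<num> x;
  Iterable<int> y;
  // Neither List<num> nor Iterable<int> is a subtype of the other, yet
  // List<int> satisfies both. With an explicit element type there is no
  // inference for a context type (good or bad) to influence:
  x = y = <int>[1, 2, 3];
  print([x, y]);
}
```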
Right, but since type schemas are only suggestions, not required types, this would not be a problem at all in your example, because there are no types to be inferred: the type of the list literal `[1, 2, 3]` is `List<int>` no matter what the context is.

Or am I missing something?
No, that example does work. We need inference to use the context type for something:

```dart
List<num> x;
List<int> y;
x = y = [1, 2, 3].fold([], (x, y) => x..add(y));
```

A little harder to get to, but still doable. It type-checks today because we infer `List<int>` as the type argument of `fold` from the context type `List<int>` (the type of `y`), which also makes `[]` a `List<int>`. If we tried to GLB the two context types and got anything other than `List<int>`, the inference of the `fold` invocation would change, and likely break.

Context type schemas are only suggestions during bottom-up inference; we don't complain if the static type doesn't match the context type (in most cases). But they are dictates during top-down inference.
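For reference, a runnable version of the example, with the inference steps noted (a sketch):

```dart
List<num> demo() {
  List<num> x;
  List<int> y;
  // Top-down: the fold invocation's context type is List<int> (the type
  // of y), so its type argument is List<int> and `[]` is inferred as
  // <int>[]. Bottom-up: the result type List<int> is then assignable to
  // List<num> for the outer assignment.
  x = y = [1, 2, 3].fold([], (x, y) => x..add(y));
  return x;
}

void main() => print(demo()); // [1, 2, 3]
```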
I think we have two separate choices on the table here:

1. Which assignments should get the improved context type schema: plain assignments only, or composite ones like `??=` as well?
2. How should the two candidate context types be combined: by GLB, or by something more conservative?

The main point I was making was that we should focus on plain assignments. That is, we should allow the context type schema of the enclosing construct to be combined with the type of the assigned variable when computing the context for the right hand side of `v = e`. If we have that, then the treatment of `??=` follows, because `??=` is defined in terms of a plain assignment. Other forms like `+=` would be covered by the same rule.

So that was the first question. The second one is about using GLB or something more conservative. I tend to prefer using GLB, but it would of course be nice to have some concrete data about the breakage, if any.

Note that this example would continue to work, because the GLB of `List<num>` and `List<int>` is `List<int>`:

```dart
List<num> x;
List<int> y;
x = y = [1, 2, 3].fold([], (x, y) => x..add(y));
```

but this one would fail (because the GLB of `List<num>` and `Iterable<int>` is not a useful context for inferring the `fold` invocation):

```dart
List<num> x;
Iterable<int> y;
x = y = [1, 2, 3].fold([], (x, y) => x..add(y));
```

However, this example fails already today, because `fold` gets the type argument `Iterable<int>` from the current context, and `Iterable<int>` has no `add` method.
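For completeness, a sketch showing how the failing variant manifests today, plus an explicit type argument that compiles under today's rules (and would keep compiling under either proposal):

```dart
void main() {
  List<num> x;
  Iterable<int> y;
  // Fails today: fold's type argument is inferred as Iterable<int> from
  // the context (the type of y), and Iterable<int> has no `add` method:
  //   x = y = [1, 2, 3].fold([], (x, y) => x..add(y)); // compile error
  // Spelling the type argument out compiles:
  x = y = [1, 2, 3].fold<List<int>>([], (x, y) => x..add(y));
  print([x, y]);
}
```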