
Optional (?) in function signature interferes with inference of completely unrelated type #37163

Closed
Herriau opened this issue Mar 2, 2020 · 2 comments · Fixed by #37261
Labels: Bug (A bug in TypeScript), Fix Available (A PR has been opened for this issue)

Herriau commented Mar 2, 2020

TypeScript Version: 3.8.3

Search Terms:
wrong inferred generic type infer operator optional argument signature interference

Expected behavior:

In the example below CType should have been resolved to number.

Actual behavior:

CType was resolved to number | boolean | undefined.

Interestingly, when I remove the optional marker ? from the r parameter in the signature of either SomeAbstractClass.foo or SomeAbstractClass.bar, CType is inferred correctly, even though such a change seems to play no role at all in the inference of CType.
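To illustrate the workaround described above, here is a sketch with renamed classes and a runtime body for the base class so it can execute. Note that on compilers that include the fix (TypeScript 3.9 and later), both variants infer number, so the discrepancy is only observable on 3.8.x:

```typescript
// Variant with the optional marker removed from foo's parameter
// (the reporter's workaround): CType then resolves to number.
class Base {
  set<K extends keyof this>(key: K, value: this[K]): this[K] {
    return value;
  }
}

abstract class AbstractClass<C, M, R> extends Base {
  foo!: (r: R) => void; // note: `r`, not `r?` -- this is the workaround
  bar!: (r?: any) => void;
  abstract baz(c: C): Promise<M>;
}

class Impl extends AbstractClass<number, string, boolean> {
  async baz(c: number): Promise<string> {
    return `${c}`;
  }
}

type CType<T> = T extends AbstractClass<infer C, any, any> ? C : never;

// Strict type-equality check: this assignment compiles only if the
// two type arguments are identical.
type Equals<A, B> =
  (<T>() => T extends A ? 1 : 2) extends (<T>() => T extends B ? 1 : 2)
    ? true
    : false;

const cIsNumber: Equals<CType<Impl>, number> = true;
console.log(cIsNumber); // true
```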

Related Issues:

Code

declare class SomeBaseClass {
  set<K extends keyof this>(key: K, value: this[K]): this[K];
}

abstract class SomeAbstractClass<C, M, R> extends SomeBaseClass {
  foo!: (r?: R) => void;
  bar!: (r?: any) => void;
  abstract baz(c: C): Promise<M>;
}

class SomeClass extends SomeAbstractClass<number, string, boolean> {
  async baz(context: number): Promise<string> {
    return `${context}`;
  }
}

type CType<T> = T extends SomeAbstractClass<infer C, any, any> ? C : never;
type MType<T> = T extends SomeAbstractClass<any, infer M, any> ? M : never;
type RType<T> = T extends SomeAbstractClass<any, any, infer R> ? R : never;

type SomeClassC = CType<SomeClass>; // = number | boolean | undefined ✗ (expected number)
type SomeClassM = MType<SomeClass>; // = string ✓
type SomeClassR = RType<SomeClass>; // = boolean ✓
Output
"use strict";
class SomeAbstractClass extends SomeBaseClass {
}
class SomeClass extends SomeAbstractClass {
    async baz(context) {
        return `${context}`;
    }
}
Compiler Options
{
  "compilerOptions": {
    "noImplicitAny": true,
    "strictNullChecks": true,
    "strictFunctionTypes": true,
    "strictPropertyInitialization": true,
    "strictBindCallApply": true,
    "noImplicitThis": true,
    "noImplicitReturns": true,
    "useDefineForClassFields": false,
    "alwaysStrict": true,
    "allowUnreachableCode": false,
    "allowUnusedLabels": false,
    "downlevelIteration": false,
    "noEmitHelpers": false,
    "noLib": false,
    "noStrictGenericChecks": false,
    "noUnusedLocals": false,
    "noUnusedParameters": false,
    "esModuleInterop": true,
    "preserveConstEnums": false,
    "removeComments": false,
    "skipLibCheck": false,
    "checkJs": false,
    "allowJs": false,
    "declaration": true,
    "experimentalDecorators": false,
    "emitDecoratorMetadata": false,
    "target": "ES2017",
    "module": "ESNext"
  }
}

Playground Link: Provided

@RyanCavanaugh RyanCavanaugh added the Bug A bug in TypeScript label Mar 4, 2020
@RyanCavanaugh RyanCavanaugh added this to the TypeScript 3.9.0 milestone Mar 4, 2020

RyanCavanaugh commented Mar 4, 2020

What is even happening here...

interface BaseType<T1, T2>  {
  set<K extends keyof this>(key: K, value: this[K]): this[K];

  useT1(c: T1): void;
  useT2(r?: T2): void;
  unrelatedButSomehowRelevant(r?: any): void;
}

interface InheritedType extends BaseType<number, boolean> {
  // This declaration shouldn't do anything...
  useT1(_: number): void
}

// Structural expansion of InheritedType
interface StructuralVersion  {
  set<K extends keyof this>(key: K, value: this[K]): this[K];

  useT1(c: number): void;
  useT2(r?: boolean): void;
  unrelatedButSomehowRelevant(r?: any): void;
}

type GetT1<T> = T extends BaseType<infer U, any> ? U : never;

type T1_of_InheritedType = GetT1<InheritedType>; // = number | boolean | undefined ✗ (expected number)

// S2: number | boolean | "useT1" | "useT2" | "unrelatedButSomehowRelevant" | undefined
type S2 = GetT1<StructuralVersion>; // = number | boolean | undefined ✗ (expected number)

ahejlsberg (Member) commented

This is definitely a strange one, but it's an easy fix.

Here's what's happening: During inference we obtain base signatures (using getBaseSignature) in which we substitute constraints for type parameters declared in the signatures. This is a fine thing to do for an inference source signature, but not so much for an inference target signature because it may, in rare cases, create unintended new inference targets. That's what's happening in the examples above. Specifically, in an inference target, the this[K] type in the set method becomes BaseType<T1, T2>[keyof BaseType<T1, T2>] which becomes a union of the function types of the methods in the class. We then proceed to infer from each method type in the source to each method type in the target, and we get meaningless results.
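The indexed-access expansion described above can be illustrated at the surface level. This is a sketch with a hypothetical two-method interface, not the compiler's internal representation:

```typescript
// When K's constraint `keyof this` is substituted for K, the type
// `this[K]` becomes an indexed access over every member of the
// interface, i.e. a union of all its method types.
interface Base<T1, T2> {
  useT1(c: T1): void;
  useT2(r?: T2): void;
}

// Indexed access over all keys yields the union of the member types:
// ((c: number) => void) | ((r?: boolean) => void)
type AllMembers = Base<number, boolean>[keyof Base<number, boolean>];

// Each individual member type is assignable to that union; when such a
// union appears in an inference *target*, every source method ends up
// matched against every member of the union, producing meaningless
// inference results like the ones in the examples above.
const pick: AllMembers = (c: number): void => {
  console.log(c);
};
console.log(typeof pick); // "function"
```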

The simple fix is to use the erased signature in the target. In an erased signature we substitute any for type parameters declared in the signature. That causes this[K] to become any, meaning we're not creating unintended new inference targets.
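On a compiler that includes the fix (TypeScript 3.9 and later), the reporter's reduced case resolves as expected. A minimal check, with a runtime body for the base class so the snippet is executable:

```typescript
// The reporter's example, made runnable; under the fix,
// CType<SomeClass> is exactly number.
class SomeBaseClass {
  set<K extends keyof this>(key: K, value: this[K]): this[K] {
    return value;
  }
}

abstract class SomeAbstractClass<C, M, R> extends SomeBaseClass {
  foo!: (r?: R) => void;
  bar!: (r?: any) => void;
  abstract baz(c: C): Promise<M>;
}

class SomeClass extends SomeAbstractClass<number, string, boolean> {
  async baz(context: number): Promise<string> {
    return `${context}`;
  }
}

type CType<T> = T extends SomeAbstractClass<infer C, any, any> ? C : never;

// Compiles only if CType<SomeClass> is exactly number.
type Equals<A, B> =
  (<T>() => T extends A ? 1 : 2) extends (<T>() => T extends B ? 1 : 2)
    ? true
    : false;

const fixed: Equals<CType<SomeClass>, number> = true;
console.log(fixed); // true
```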
