Why can't TypeScript infer a generic type when it is in a nested object?

Tags:

typescript

I've got a type that TypeScript can't infer the generic of.

interface Foo<A> {
    [name: string] : {
        foo : A
    }
}

function makeFoo<A>(foo: Foo<A>) : Foo<A>{
    return foo
}

// Works fine when manually specifying the types
const manuallyTyped : Foo<string | number> = {
    a: {
        foo: '1'
    },
    b: {
        foo: 3
    }
}

// ERROR, Can't infer type as Foo<string | number>
makeFoo({
    a: {
        foo: '1'
    },
    b: {
        foo: 3
    }
})

Originally, I was using the type below but I wanted to make the values of the object objects themselves. Inference works just fine when the indexed signature is flat.

interface FlatFoo<B> {
    [name: string] : B
}

function makeFlatFoo<B>(bar: FlatFoo<B>): FlatFoo<B>{
    return bar
}

// Correctly has type FlatFoo<string | number>
const inferred = makeFlatFoo({
    a: 'a',
    b: 2
})

Does anyone have an explanation and/or a recommendation for getting this to work?

asked Sep 13 '18 by Anthony Naddeo

1 Answer

This is a problem similar to the ones in this question and this question. When TypeScript makes multiple covariant inferences for the same type parameter (in the first example, number and string for A), it tries to pick one of them that is a supertype of the others; it does not infer a union, except in the special case where the inferences are literal types of the same primitive type. If TypeScript appears to infer a union type in other cases, it is because some other language feature is at work.

In the case of makeFlatFoo, that feature is the implicit index signature generated for an object literal type, which takes the union of the types of the properties a and b, namely string | number. string | number is matched against B, you get a single inference of string | number for B, and everything works.

In makeFoo, however, the return type of the implicit index signature is Foo<string> | Foo<number>. When this is matched against Foo<A>, the union gets broken up and you get two different inferences, string and number, for A.
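The union-only-for-same-primitive-literals special case can be seen in a small sketch (the `pick` helper here is hypothetical, just to create two covariant inference sites for one type parameter):

```typescript
// Hypothetical helper: two covariant inference sites for the same T.
function pick<T>(a: T, b: T): T {
    return a;
}

// Literal types of the same primitive: TypeScript unions them,
// so s has type 'x' | 'y'.
const s = pick('x', 'y');

// Mixed primitives: no union is inferred. pick('x', 1) errors with
// "Argument of type '1' is not assignable to parameter of type 'x'"
// unless the type argument is supplied explicitly:
const mixed = pick<string | number>('x', 1);

console.log(s, mixed);
```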

While the following example based on your answer compiles without error:

function makeFoo<A, F extends Foo<A>>(foo: F) : F{
    return foo
}

const result = makeFoo({
    a: {
        foo: '1'
    },
    b: {
        foo: 3
    }
});

you'll see that A is {} and the type of result is { a: { foo: string; }; b: { foo: number; }; }, so you haven't succeeded in converting the object to a Foo<T> type. Instead, you could use a type parameter FA to capture the return type of the implicit index signature and then use a distributive conditional type to pull out the actual types of the foo property, as in this answer:

interface FlatFoo<FA> { 
    [name: string]: FA;
}
type FooPropTypes<FA> = FA extends { foo: infer A } ? A : never;
function makeFoo<FA extends {foo: unknown}>(foo: FlatFoo<FA>) : Foo<FooPropTypes<FA>> {
    return <any>foo
}
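Putting the answer's pieces together, a quick usage check (a self-contained sketch; at runtime this is just an identity function, the payoff is the inferred type):

```typescript
interface Foo<A> {
    [name: string]: { foo: A };
}

interface FlatFoo<FA> {
    [name: string]: FA;
}

type FooPropTypes<FA> = FA extends { foo: infer A } ? A : never;

function makeFoo<FA extends { foo: unknown }>(foo: FlatFoo<FA>): Foo<FooPropTypes<FA>> {
    return <any>foo;
}

// Inferred as Foo<string | number>: the implicit index signature gives
// FA = { foo: string } | { foo: number }, and the distributive
// conditional type FooPropTypes extracts string | number from it.
const result = makeFoo({
    a: { foo: '1' },
    b: { foo: 3 }
});

console.log(result.a.foo, result.b.foo);
```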
answered Nov 05 '22 by Matt McCutchen