In TypeScript, I have a function that accepts a generic parameter with a constraint:
function f1<U extends string>(param: U): U {
  return param;
}
const a1 = f1('hello');
// a1's type is 'hello' -- Great!
Now I want callers to be able to optionally add another type to the return type. To make T optional, though, I have to give it a default, which forces me to give U a default as well (a required type parameter cannot follow an optional one). As soon as a caller passes an explicit type argument for T, TypeScript stops inferring U and falls back to that default:
function f2<T = never, U extends string = string>(param: U): U | T {
  return param;
}
const b1 = f2('hello');
// b1's type is 'hello' -- Great!
const b2 = f2<boolean>('hello');
// b2's type is string | boolean -- Terrible: I want the type to be 'hello' | boolean.
const b3 = f2<boolean, 'hello'>('hello');
// b3's type is 'hello' | boolean -- Poor: The type is correct but the API is redundant.
So my question is: is there a way to have TypeScript keep inferring U from the parameter, even when T is supplied explicitly? I never want to supply a default type for U; I always want TypeScript to infer it. Pseudo-code showing the API I'd like:
function f3<T = never, U extends string = infer>(param: U): U | T {
  return param;
}
const c1 = f3('hello');
// c1's type is 'hello' -- Great!
const c2 = f3<boolean>('hello');
// c2's type is 'hello' | boolean -- Great!
Unfortunately, this is not possible today. There is a PR to add partial type argument inference using the _ sigil, but it has been inactive for quite a while.
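For reference, under that proposal the call site would presumably look something like the following (hypothetical syntax, not valid TypeScript today; this is my reading of the proposal, not a confirmed API):

const b2 = f2<boolean, _>('hello');
// '_' would mean "infer this type argument", so U would still be
// inferred as 'hello' and b2's type would be 'hello' | boolean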
The only workaround is to use function currying: splitting the call in two gives each function its own type arguments, so T can be supplied explicitly on the outer call while U is still inferred on the inner one. It is not ideal:
function f2<T = never>() {
  return function <U extends string = string>(param: U): U | T {
    return param;
  };
}
const b2 = f2<boolean>()('hello');
// b2's type is 'hello' | boolean -- U is now inferred, at the cost of an extra call
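Note that the extra call is needed even when you don't want to supply T at all. The upside is that the defaults still behave sensibly: with T left as never, the union collapses back to the inferred literal:

const b1 = f2()('hello');
// b1's type is 'hello', because 'hello' | never reduces to 'hello'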