Add digit and other character classes to template literal types. #46674

Open · 5 tasks done
everett1992 opened this issue Nov 3, 2021 · 10 comments
Labels: Awaiting More Feedback · Suggestion

Comments

@everett1992

Suggestion

πŸ” Search Terms

  • template literal types
  • ${number}

βœ… Viability Checklist

My suggestion meets these guidelines:

  • This wouldn't be a breaking change in existing TypeScript/JavaScript code
  • This wouldn't change the runtime behavior of existing JavaScript code
  • This could be implemented without emitting different JS based on the types of the expressions
  • This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, new syntax sugar for JS, etc.)
  • This feature would agree with the rest of TypeScript's Design Goals.

⭐ Suggestion

TypeScript's template literal types support some built-in non-string types, including `number`. These types are not documented anywhere I can find, certainly not on the Template Literal Types page. `${number}` matches any string that `Number` can parse, which includes floating-point, hexadecimal, binary, and scientific notation.
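For instance (a quick sketch of the assignability behavior just described):

```ts
let n: `${number}`

n = '1.5'  // allowed: floating point
n = '0x10' // allowed: hexadecimal
n = '0b10' // allowed: binary
n = '1e21' // allowed: scientific notation
```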

I think TypeScript users will use `${number}` to incorrectly define types that should only accept decimals. I recommend adding `Digits` and other common character classes¹.

πŸ“ƒ Motivating Example

Using `${number}px` seems like an interesting way to define APIs that require CSS pixel values like `10px`, but TypeScript's definition of `${number}` allows unexpected values like `0x1px`, which CSS doesn't support. I can find that exact example in the TypeScript codebase.

```ts
let pixels: `${number}px`

pixels = '100px'

// oh no
pixels = '0x1px'
pixels = '0b1px'
pixels = '1e21px'
```

I don't expect TypeScript to understand CSS syntax, but the behavior of `${number}` allows values that are not valid CSS units. I think `${number}` is generally not the behavior people expect, and it will be misused (especially because it's not well documented).

πŸ’» Use Cases

Refining the current `${number}` use case.

Footnotes

  1. Maybe using POSIX character classes as a starting point. ↩

@andrewbranch (Member)

> These types are not documented anywhere I can find

They’re just https://www.typescriptlang.org/docs/handbook/2/everyday-types.html#the-primitives-string-number-and-boolean

andrewbranch added the Awaiting More Feedback and Suggestion labels on Nov 3, 2021
@jcalz (Contributor) commented Nov 4, 2021

@andrewbranch but the behavior of `${number}` isn't really documented there, is it? It's not sufficient to say that it has something to do with number, since there are apparently two completely different ways of interpreting it.

I am in the apparent minority of people who think that it should correspond to those values `${n}` where `n` is of type `number` (that's what the syntax implies, doesn't it? doesn't it?! 😭). But it actually corresponds to those strings `s` where `+s` is a finite number (which just plain has nothing to do with template literals, right? right? 😡).
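To make the distinction concrete (a small sketch; `0x1f` stands in for any hex literal):

```ts
// Interpretation 1 (what the syntax suggests): strings that a template
// literal `${n}` can actually produce for some n of type number.
const viaTemplate = `${0x1f}` // "31": String(0x1f) never yields "0x1f"

// Interpretation 2 (the actual behavior): strings s where +s is a finite number.
const s: `${number}` = '0x1f' // accepted, even though no number stringifies to it
```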

This isn't spelled out anywhere that I know of other than GitHub issues. Oh yeah, see #41893 where I ~~ranted about~~ calmly discussed this before.

@everett1992 (Author) commented Nov 4, 2021

@andrewbranch `1_000` can be assigned to `number`, but `'1_000'` cannot be assigned to `${number}`. There is a similar caveat for `bigint` and `1n`. I don't think that documentation covers template literal types.
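A quick check of that caveat (sketch):

```ts
const n: number = 1_000        // ok: numeric separators are valid in number literals
const s: `${number}` = '1_000' // error: Number('1_000') is NaN, so the string is rejected
```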

@MartinJohns (Contributor) commented Nov 4, 2021

Somewhat related: #46109. Basically `${number}` is whatever JavaScript can parse as a number.

@andrewbranch (Member)

Yeah, I was probably being unhelpfully brief, but I intended to point out that there's no such thing as "character classes" in TypeScript, so anything you want to put in a template literal type hole must be a first-class type. This suggestion might therefore be a bigger can of worms than you realize, and/or actually a duplicate of issues that have nothing to do with template literal types (issues asking for int types or other more restrictive number subtypes).

@fatcerberus commented Nov 5, 2021

> so anything you want to put in a template literal type hole must be a first-class type

Admittedly, it does seem a bit odd that TS wants to see first-class types in a position where it's actually going to match them against a specific string representation of said type. Template literal types don't really have much to do with actual template literals at the end of the day; they're effectively just a poor man's regex engine piggybacking on the type system.

@jcalz (Contributor) commented Nov 5, 2021

@fatcerberus but it looks like only `number` (and maybe `bigint` or any other "infinite" type) acts that way.

Every other type you are allowed to put inside a template literal type hole seems to obey a straightforward rule: `` `[${T},${U}]` `` corresponds to the output of template literal strings of the form `` `[${t},${u}]` `` where `t` is of type `T` and `u` is of type `U`. The specific string representation is just "what template literal strings output", in other words, the output of `String(t)` or `"" + u`. But for `number` it's backwards.

Like, if `T` is `1 | 2 | 3` and `U` is `boolean`, then `` `[${T},${U}]` `` is `"[1,false]" | "[1,true]" | "[2,false]" | "[2,true]" | "[3,false]" | "[3,true]"`. You don't get `"[1.0,false]"` in there, or `"[2,0]"`, or anything else that can be coerced from a string into a value of type `T` or `U`. You just get whatever can be coerced into a string from a value of type `T` or `U`.
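Spelled out as code (this compiles as-is and expands exactly as described):

```ts
type T = 1 | 2 | 3
type U = boolean
type Pairs = `[${T},${U}]`
// = "[1,false]" | "[1,true]" | "[2,false]" | "[2,true]" | "[3,false]" | "[3,true]"
```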

But with number it is just Ζ¨bΙΏΙ’wΚžΙ”Ι’d. Did I mention it's backwards? 😰 πŸ™ƒ 🀯

@adevine commented Jan 18, 2023

I know this is an old issue, but I just wanted to give another example/reason why I think it makes sense to implement this, and perhaps another way to implement a solution. Lots of times people end up implementing a `Digit` type like:

```ts
type Digit = 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9;
```

Then they do things like `` type TwoDigitString = `${Digit}${Digit}`; ``. Of course, the problem with that is that, due to the limit on union type complexity, you can have at most 4 digits in a template literal type.

However, none of that combinatorial explosion applies to `${number}` placeholders. E.g., in our code base we basically define a `DateTimeString` type as:

```ts
type DateTimeString =
  `${number}${number}${number}${number}-${number}${number}-${number}${number}T${number}${number}:${number}${number}:${number}${number}.${number}${number}${number}Z`;
```

Obviously that type accepts a lot more strings than are valid `DateTimeString`s (most importantly it allows extra digits, but, usefully, it errors out if there are fewer than the expected number of digits), but we've found it catches the majority of the mistakes we're likely to make.
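For example, a sketch checking the acceptance/rejection behavior described in this comment:

```ts
const ok: DateTimeString = '2023-01-18T12:34:56.789Z'    // accepted
const bad: DateTimeString = '2023-1-18T12:34:56.789Z'    // error: too few digits in the month
const loose: DateTimeString = '9999-99-99T99:99:99.999Z' // also accepted, per the caveat above
```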

So, perhaps reframing the issue: could it be possible to define a "character class" type that is only meant to be used in template literal type placeholders, such that instead of blowing out the combination of all possible union types, that "character class" type is treated as a predicate on acceptable input strings, just like the way `${number}` works? This way I could define our `DateTimeString` type using `${Digit}` instead of `${number}`, without worrying about the explosion of all possible types in the union.

I'm not very familiar with the innards of TypeScript, so apologies if this approach has obvious shortcomings; it just seems like a way to get the benefits of the way `${number}` works, but user-defined.

@fider commented Nov 12, 2023

For people looking for a string-like int type, `bigint` will partially solve the problem:

```ts
let intStr: `${bigint}`
intStr = `23`    // allowed
intStr = `0b101` // allowed
intStr = `1.23`  // error
```

@winstxnhdw commented Mar 10, 2024

> For people looking for a string-like int type, `bigint` will partially solve the problem:
>
> ```ts
> let intStr: `${bigint}`
> intStr = `23`    // allowed
> intStr = `0b101` // allowed
> intStr = `1.23`  // error
> ```

This is really cool but it still allows octal/hex formats through :(

EDIT:

This works better.

```ts
export type Digit = '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' | '8' | '9'

// Resolves to the original string (V) when T consists solely of decimal
// digits; otherwise resolves to the literal type '0'.
export type StringNumber<T extends string, V = T> = T extends Digit
  ? V
  : T extends `${Digit}${infer R}`
    ? StringNumber<R, V>
    : `0`
```
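One way to apply it (the `takeDigits` helper here is hypothetical, shown only to illustrate the check):

```ts
// Hypothetical helper: the parameter type intersects T with StringNumber<T>,
// which collapses to never for any non-digit string.
declare function takeDigits<T extends string>(s: T & StringNumber<T>): void

takeDigits('123') // ok: StringNumber<'123'> is '123'
takeDigits('0x1') // error: StringNumber<'0x1'> is '0', so '0x1' is rejected
```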
