> `null` is only rarely used in JS, `undefined` is by far the more common one, and of course the types reflect which one is a possible value, so it's not like you can miss it by accident.
That's not true. `undefined` is not even a keyword, and it doesn't exist in JSON; `null` is what you would use if you wanted to specify that a variable has "no value" in JS or JSON (again, because there is no `undefined` in JSON). The `undefined` value only indicates that a variable "does not exist". The difference is important, as the behavior differs depending on how you use them.
> I just always use the full form with the `===` operator, e.g. `if (variable !== undefined)` instead of `if (variable)`
This is bad practice, as you're only testing for `undefined` here, not `null`; this is basically the only case where strict equality isn't useful. You want to test with `if (variable != null)`, which matches both `null` and `undefined` but not the other falsy values that `if (variable)` catches.
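To illustrate the difference, a minimal sketch with a hypothetical `describe` helper; `!= null` treats only `null` and `undefined` as "nothing", while a truthiness check would also reject `0`, `''` and `false`:

```typescript
// Hypothetical helper: `!= null` matches exactly null and undefined,
// nothing else.
function describe(value: unknown): string {
  if (value != null) {
    return "has a value";
  }
  return "null or undefined";
}

console.log(describe(undefined)); // "null or undefined"
console.log(describe(null));      // "null or undefined"
console.log(describe(0));         // "has a value" (falsy, but not nullish)
console.log(describe(""));        // "has a value" (falsy, but not nullish)
```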
> There is now also the `??` operator
Right, the nullish coalescing operator (which Dart/C# also have) tests for both `null` and `undefined` (well, `void 0`, since the `undefined` identifier can be shadowed) and not for other falsy values.
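A small sketch of that, with hypothetical values; `??` only falls back on `null`/`undefined`, while `||` also falls back on any falsy value:

```typescript
// 0 is falsy but not nullish, so the two operators disagree on it.
const count = 0 as number | undefined;

console.log(count ?? 10); // 0  (kept: 0 is not null/undefined)
console.log(count || 10); // 10 (discarded: 0 is falsy)

const missing = null as string | null;
console.log(missing ?? "default"); // "default"
```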
Regarding the `if`, it only makes sense to test for things that the type actually contains.
So if I have a type `User|undefined`, testing `if (user === undefined || user === null)` is pointless, as `null` is not part of that type. It would be different if it were. Similarly, it makes no sense to test a variable of `User` type for value `false`, like it would if the type were `User|boolean`, or even `User|false` which you can do in TypeScript (not that you should). So `if (variable)` is still bad (ok for booleans, tolerable for some types but better avoided). You only test for what the variable can contain.
Some types you encounter are `T|null` - mostly DOM things (e.g. the return value of getElementById), but some JSON stuff as well like you said. Even though a lot of JSON just has optional properties instead, which for all intents and purposes work like `|undefined` when reading them.
From my experience, it is much more common to deal with `undefined` in the JS ecosystem, whether that is by omitting properties from objects, or explicitly assigning/returning `undefined` somewhere.
`T|null|undefined` types are even rarer. Only for those does it make sense to test for both `null` and `undefined`.
By the way, you can also have types like

```typescript
type RemoteData<T> = { ready: false } | { ready: true, data: T }
```

in TypeScript, which is better than

```typescript
{ ready: boolean, data?: T }
```
because you don't have to worry about "does `data` still make sense when it is non-ready?", the type clearly tells you that it doesn't (of course you can also have a different type where it would, the point is to accurately describe the structures you deal with). Then, if you do a check
```typescript
if (remote.ready) {
  console.log(remote.data.length)
}
```
it knows that `data` exists inside that `if` block, so you can safely access it without `!` or `?`. Similarly, inside `if (!remote.ready) { }`, you cannot access it at all. It also won't let you pass an object with `{ ready: true }` but no `data` of the required type where a `RemoteData<T>` is expected.
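A self-contained, runnable sketch of the pattern, with a hypothetical `describeRemote` function:

```typescript
type RemoteData<T> = { ready: false } | { ready: true, data: T }

function describeRemote(remote: RemoteData<string[]>): string {
  if (remote.ready) {
    // Narrowed to { ready: true, data: string[] } here.
    return `loaded ${remote.data.length} items`;
  }
  // Narrowed to { ready: false }; accessing remote.data would not compile.
  return "not ready";
}

console.log(describeRemote({ ready: true, data: ["a", "b"] })); // "loaded 2 items"
console.log(describeRemote({ ready: false }));                  // "not ready"
```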
> So if I have a type `User|undefined`, testing `if (user === undefined || user === null)` is pointless, as `null` is not part of that type.
It's very relevant. Your TypeScript definitions are meaningless at runtime, when what executes is JS with all of TypeScript's type information erased; the only relevant checks are the runtime JS type checks.
> `if (user === undefined || user === null)` is pointless
Right, it is verbose & unnecessary, and so is testing only `if (user === undefined)`, which is inadequate; I don't know anyone who'd use it, since the recommended `if (user == null)` is shorter and more complete for testing whether a value is `null` or `undefined`. If you're validating JSON, you should know that there is no `undefined` type in JSON, but any value could be `null`.
> From my experience, it is much more common to deal with `undefined` in the JS ecosystem, whether that is by omitting properties from objects, or explicitly assigning/returning `undefined` somewhere.
If a type wants to declare that a property exists but doesn't have a value, it would be `null`, which is also the only option you have in JSON to do so.
Your TypeScript definitions are for your code structure and API description so you can benefit from its static analysis, but you should be aware they do nothing at runtime, where all type annotations are erased. So if you try to define your Vue component with an optional property, e.g.:

```typescript
class MyComponent extends Vue {
  p1?: string;
  p2 = null;
}
```

the `p1?: string` declaration gets stripped away completely, but assigning a `null` value instead (as with `p2`) ensures the property exists and is made reactive.
Yes, it is theoretically possible for a TypeScript type to be "wrong" and the underlying variable to have a different type.
But there are only a few ways in which that can happen:
- buggy type definitions: a JS library whose `.d.ts` files "lie" about the actual types. This is no different from any other bug in a library; it needs to be reported and fixed. In my experience, such bugs are not very common (I've yet to encounter one).
- non-validated external data (JSON coming over HTTP): like I said, you have to validate external data (or assume that anything can be anything). You kinda have to do it in all languages, though ones with reflection and a deserializing library can do it somewhat automatically and throw a runtime error if deserialization fails (although that is not full validation).
- incorrect explicit casts and using `any`: e.g. `const a: number = ('asdf' as any)`, but less stupid and more accidental. This is a bug in the casting code, similar to an incorrect `reinterpret_cast` in C++. There is usually little reason to do explicit casts in TypeScript code outside of very small, isolated pieces of code.
Which means you generally don't have to worry about any of this. Relying on those guarantees is good enough for 99+% of situations (I mean, some people even use dynamically typed languages and survive without doing tons of checks on every other line of code).
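As a sketch of what validating external data can look like, here is a hand-written type guard for a hypothetical `User` shape (real code might use a validation library instead):

```typescript
type User = { name: string, email: string }

// Runtime check that narrows `unknown` to `User` for the type system.
function isUser(value: unknown): value is User {
  return (
    typeof value === "object" &&
    value !== null &&
    typeof (value as Record<string, unknown>).name === "string" &&
    typeof (value as Record<string, unknown>).email === "string"
  );
}

const raw: unknown = JSON.parse('{"name":"Bob","email":"bob@example.com"}');
if (isUser(raw)) {
  console.log(raw.email); // typed as string inside this block
}
console.log(isUser(JSON.parse('{"name":"Bob"}'))); // false: email missing
console.log(isUser(null));                         // false
```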
---
Testing stuff which is not expected (by the specified type) to be `null` for `null` is just as pointless as testing it for the number 10. There is just no "valid" way for the `null` to have gotten there (just like the number 10). It would be extremely paranoid and only add noise to the code.
If there is a valid way, then the type needs to be `T|null` (and then you need to test it), not `T`, otherwise it's a bug. For TypeScript code, this will be checked automatically, for JS library type definitions, it needs to be made correct manually.
Regarding JSON and undefined: you can omit properties in JSON. E.g. you sometimes have `{"user": {"name": "Bob", "email": "bob@example.com"}}`, and sometimes just `{"user": {"name": "Bob"}}` without the email. Such an optional "email" can of course be expressed in TypeScript, and can be tested with `=== undefined`, which will return true if it's not there. And such a pattern of omitting properties from objects (not just in JSON) is much more common in JS than assigning `null` to things.
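A small sketch of such an optional property, assuming a hypothetical `ApiUser` type:

```typescript
type ApiUser = { name: string, email?: string }

const withEmail: ApiUser = JSON.parse('{"name":"Bob","email":"bob@example.com"}');
const withoutEmail: ApiUser = JSON.parse('{"name":"Bob"}');

console.log(withEmail.email === undefined);    // false
console.log(withoutEmail.email === undefined); // true: the property was omitted
console.log("email" in withoutEmail);          // false: it's not there at all
```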
You are much more likely to find a JS object where some property is sometimes there and sometimes not there but never `null`, than an object where it is always there but sometimes `null`. That's what I mean when I say `undefined` is much more common than `null` in the JS world. But neither is a problem, you just need to have types which correctly describe what values the properties can contain.
---
So typically none of this is a problem in practice, as long as you have correct type definitions for the JS libraries which you use (i.e. there is no bug in them, just like there should be no bugs in the library itself) and don't do funky casts with `any`.
> Yes, it is theoretically possible for a TypeScript type to be "wrong" and the underlying variable to have a different type.
Unlike most languages, TypeScript is just a high-level static analysis service that compiles to JS, which then executes in a JS VM. It has no control over what happens at runtime; it doesn't know about external libraries, user input, or how your TypeScript functions will be invoked. All it knows are the "truths" you tell it to assume, and it makes no attempt to validate whether those assertions are accurate. It's not that TypeScript is wrong; it can only verify the compile-time state of your program. It doesn't know, and can't control, the runtime state of the JS VM: what arguments your functions are invoked with, what values properties are set to, how instances or the prototype chain are manipulated, etc.
> Testing stuff which is not expected (by the specified type) to be `null` for `null` is just as pointless as testing it for the number 10. There is just no "valid" way for the `null` to have gotten there (just like the number 10). It would be extremely paranoid.
It's strange to read that runtime JS type checks are pointless when they're the only way to validate for sure what a type is. You seem to think you can resolve issues by declaring TypeScript annotations differently, when they have absolutely no impact at runtime: your type annotations provide no runtime guarantees. TypeScript can't tell you whether your type annotation is accurate now or will remain accurate in the future, whether what you cast to is accurate, or whether your code is invoked as you expect; you're telling TypeScript what the only types for a variable are, but those assertions have no effect at runtime. Ask yourself, if you're only checking for `undefined`, what the behavior of your program is when it's called with `null`. Since JS functions have positional arguments, it's very common for them to be invoked with `null` argument values when a value must be provided for each argument, whereas a function is almost never explicitly called with an `undefined` value unless it's undergoing regression testing for bugs.
Given that any JS variable can be assigned both `undefined` and `null`, there's no good reason to use the strict equality check `=== undefined`; it's always the wrong type check unless you need to tell the difference between `null` and `undefined`. Otherwise it's just a more verbose, inadequate check for nothingness.
I know how it works. I'm talking about practical implications.
In practice, say I have
```typescript
type User = { name: string, email: string }
let loggedInUser: User = { name: 'Bob', email: 'bob@example.com' }
```
or even
```typescript
// Can't change the contents, only reassign the whole variable
let loggedInUser: Readonly<User> = ...
```
and only ever use that `loggedInUser` from TypeScript code (because my entire codebase is TypeScript), and I avoid the 3 scenarios mentioned above (buggy JS library type definitions, non-validated external data, casting/any/`!`).
How would a `null` email ever get inside `loggedInUser`? It just won't happen. I know there are no run-time checks - I don't need any run-time checks. I don't need 100% theoretical guarantees. I can be reasonably certain that `loggedInUser.email` will always be a non-null non-undefined string. It's a good enough™ practical guarantee. A null-check on `loggedInUser.email` would be pure confusing noise (because anyone reading it would go "wait, this can be null? I thought it was non-null").
Do you have a practical explanation of how a `loggedInUser.email === null` situation might occur given the above assumptions?
> Given any JS variable can be assigned both `undefined` and `null`
Any JS variable can be assigned literally anything. I don't need to check that `loggedInUser.email` is not null, just like I don't need to check that `loggedInUser.email` is not a boolean or a number. Because how would that even happen? Not having to do those checks is half the point of having a static type system in the first place (the other half is being notified of type errors at compile/edit time).
If you need to defensively check whether something so basic that is never supposed to happen has happened, then the code is already a giant unmaintainable mess.
E.g. `const a = [1, 2, 3]` will let you do `a[5]` and still give it type `number` instead of `number|undefined`, as mentioned in the GitHub issue. Many other languages also do this (though they have runtime checks/errors).
So that kinda sucks, but the rest still stands (there is still no practically realistic way for `null` or anything other than `undefined` to get there). Hopefully they'll fix that in the future with a new compiler setting.
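A runnable sketch of that behavior; the annotation on `v` is widened by hand here, since by default TypeScript types `a[5]` as plain `number` (a compiler option along these lines, `noUncheckedIndexedAccess`, makes such accesses `number | undefined`):

```typescript
const a = [1, 2, 3];

// Out-of-bounds access yields `undefined` at runtime, even though the
// default static type of `a[5]` is `number`.
const v: number | undefined = a[5];

console.log(v);               // undefined
console.log(typeof v);        // "undefined"
console.log(v === undefined); // true
```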