In JavaScript, the following code works like so:
> var foo = { bar: 1 }
undefined
> foo
{ bar: 1 }
> 'bar' in foo
true
> !'bar' in foo
false
> 'baz' in foo
false
> !'baz' in foo
false
Why does using ! with in not return true when checking whether an object does not contain a property?
You need to do !('baz' in foo); otherwise it's the same as (!'baz') in foo (which looks up 'false' in foo), the reason being operator precedence.
!'foo' in {false: true} //as in 'false' in {false: true}
//true
!('foo' in {false: true}) //as in, does a key "foo" not exist in the object
//true
You need to apply the ! to the result of the in expression:
!("key" in object)
Right now it is parsed like this:
(!"key") in object
!"key" evaluates to false, so that is the same as doing
false in object
which, because in coerces its left-hand operand to a string, becomes
"false" in object