Good point about the backward compatibility issue. It was never going to be easy, so I'll let this one slide. Still, it would have been nice to have had that fixed for py3.
It is perfectly logical if you were around when the bool type was added to Python (in 2.3, via PEP 285).
Prior to the introduction of an actual bool type, 0 and 1 were the official representation of truth values, much as in C89. To avoid needlessly breaking non-ideal but working code, the new bool type had to behave exactly like 0 and 1. That goes beyond mere truth value and extends to all integer operations. While no one would recommend using a boolean result in a numeric context, and few would recommend testing equality to determine truth, nobody wanted to find out the hard way just how much existing code does exactly that. Hence the decision to make True and False masquerade as 1 and 0, respectively. It is simply a historical artifact of the language's evolution.
In fact, give this a try:
>>> True == 1
True
>>> True == 0
False
>>> False == 0
True
>>> False == 1
False
>>> True + 2
3
>>> False - 5
-5
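The subclass relationship itself is easy to verify:
>>> issubclass(bool, int)
True
>>> isinstance(True, int)
True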
@thp: Good point. This is something Python 3 should have corrected while the opportunity presented itself.
There's nothing to fix. bool being a subclass of int is perfectly natural, at least to anyone who has read Iverson (google "Iverson bracket" sometime). It lets you do all sorts of wonderful things, for example:
sum(cond(x) for x in seq): how many x in seq satisfy cond
mystr += "\n" * keepends: append "\n" only if keepends is True
(falsevalue, truevalue)[condition]: an easy two-way selector
...
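A quick sketch of those idioms in action (the sequence and conditions below are illustrative, not from the original comment):
>>> seq = [3, -1, 4, -1, 5]
>>> sum(x < 0 for x in seq)          # count the elements satisfying a condition
2
>>> keepends = True
>>> "last line" + "\n" * keepends    # "\n" * True is "\n", "\n" * False is ""
'last line\n'
>>> ("no", "yes")[len(seq) > 3]      # a bool indexes a tuple as 0 or 1
'yes'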