Re: Style question: comparison between signed and unsigned?
Tue, 23 Sep 1997 15:31:47 -0500 (CDT)

And lo, Theodore Y. Ts'o saith unto me:
> From: "Leslie F. Donaldson" <>
> Date: Tue, 23 Sep 1997 11:25:31 -0500 (EST)
> >>Quite frankly, anybody who claims that the extra cast is a good thing
> >>should be shot on the spot - the extra cast is an abomination and has _no_
> >>redeeming features except to shut up a warning from a compiler that thinks
> >>it knows better than the programmer.
> Very true, but the problem isn't the compiler, or the programmer; it is
> the person that designed the interface to read().
> This statement is typical of the sort of ivory-tower academics who like
> to go around pointing fingers at people, but who probably couldn't write
> a robust program themselves. (Those who can't do, teach.)
Nonetheless, finger-pointing at the academics hasn't led to any "robust"
solutions here either. Assuming the implementors of "read" really meant
for it to return a size_t when it doesn't return -1 (not necessarily a
safe assumption), the "(size_t) i" code is closest to correct... but if
someone screwed up "read" and accidentally returned a negative number,
this code would also screw up (remember, "read" here refers to any
function with the "-1 or int...or was that -1 or size_t?" return
convention).

> The read(),
> write(), interfaces aren't going to change, folks. They predated gcc,
> and they will likely outlast gcc. Saying that it's the fault of those
> who invented the entire Unix system call interface doesn't help things.
It does point out a lack of clear understanding of the notional type
these functions return (even with an unsigned size_t, I'm still not sure
what the right type for the non-(-1) returns is). And that confusion
won't help someone who has to do a 2G+ read() on a 32-bit machine...

> The fact of the matter is, by having the compiler issue these warnings,
> it makes folks much more likely to ignore *all* compiler warnings, since
> so many of them will be false positives.
So far the compiler seems to emit them only when sizeof() is used in
the comparison; literals and variables of the same type as the one
holding the return value are fine. This in turn just requires that the
programmer make sure he isn't sometimes asking for a 32-bit compare
of -4 against 3 billion...