Message-Id: <E1HHmuB-0001lW-E1@be1.lrz>
Date: Thu, 15 Feb 2007 21:08:35 +0100
From: Bodo Eggert <7eggert@....de>
To: Sergei Organov <osv@...ad.com>,
Linus Torvalds <torvalds@...ux-foundation.org>,
J.A. Magallón <jamagallon@....com>, Jan Engelhardt <jengelh@...ux01.gwdg.de>,
Jeff Garzik <jeff@...zik.org>,
Linux Kernel Mailing List <linux-kernel@...r.kernel.org>,
Andrew Morton <akpm@...ux-foundation.org>
Subject: Re: somebody dropped a (warning) bomb
Sergei Organov <osv@...ad.com> wrote:
> Linus Torvalds <torvalds@...ux-foundation.org> writes:
>> Exactly because "char" *by*definition* is "indeterminate sign" as far as
>> something like "strlen()" is concerned.
>
> Thanks, I now understand that you either don't see the difference
> between "indeterminate" and "implementation-defined" in this context or
> consider it non-essential, so I think I've got your point.
If you don't code for one specific compiler with specific settings, there
is no implementation defining the signedness of char, and any code using
char* is wrong unless it handles both cases correctly. Therefore it's
either always wrong to call your char* function with char*, unsigned
char* _and_ signed char* unless you can guarantee not to overflow any of
them, or it's always correct to call char* functions with any of these.
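To illustrate (a minimal sketch, the function name is made up): a correct
char* string function never depends on the sign in the first place:

#include <stddef.h>

/* Works identically whether plain char is signed or unsigned: it never
 * compares character values, it only tests for the terminating '\0'. */
static size_t my_strlen(const char *s)
{
        const char *p = s;

        while (*p)
                p++;
        return (size_t)(p - s);
}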
Of course, if you happen to code for specific compiler settings, one
signedness of char will become real and one of the warnings will be
legitimate. And if pigs fly, they should wear goggles to protect their
eyes ...
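With gcc you can even pick the pig explicitly: -fsigned-char and
-funsigned-char select the signedness. A toy demonstration (assuming
gcc's usual wrap-around conversion for the out-of-range constant):

#include <stdio.h>

int main(void)
{
        char c = (char)0x80;    /* 128 unsigned, -128 signed (on gcc) */

        /* gcc -funsigned-char: prints "128, non-negative"
         * gcc -fsigned-char:   prints "-128, negative"   */
        printf("%d, %s\n", c, c < 0 ? "negative" : "non-negative");
        return 0;
}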
>> THE FACT IS, THAT "strlen()" IS DEFINED UNIVERSALLY AS TAKING "char *".
>
> So just don't pass it "unsigned char*". End of story.
Using signed chars for strings is wrong in most countries on earth. It
was wrong when the first IBM PC came out in 1981, and creating a compiler
in 1987 that defaults to signed char is a sure sign of originating from
an isolated country and knowing nothing about this planet. Using signed
chars in comparisons is especially wrong, and casting each char to
unsigned before comparing is likely to be forgotten. Signed character
strings are useless because there is no such thing as char(-23), and if
these strings weren't cast to unsigned inside all IO functions, they
wouldn't work correctly.

Only because many programmers don't compare chars do most programs work
outside the USA. I repeat: thanks to using signed chars, the programs
only work /by/ /accident/! Promoting the use of signed char strings is
promoting bugs and endangering the stability of all our systems. You
should stop this bullshit now, instead of increasing the pile.
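To spell the accident out (a contrived sketch; 0xfc is 'ü' in Latin-1):

#include <stdio.h>

int main(void)
{
        char c = (char)0xfc;    /* 'ü' in Latin-1; -4 as a signed char */

        if (c >= 0x80)                  /* never true if char is signed */
                printf("non-ASCII (naive test)\n");

        if ((unsigned char)c >= 0x80)   /* the cast everybody forgets */
                printf("non-ASCII (correct test)\n");
        return 0;
}

Forget the cast once and a perfectly valid character silently falls
through -- that is the /by/ /accident/ part.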
>> That BY DEFINITION means that "strlen()" cannot care about the sign,
>> because the sign IS NOT DEFINED UNIVERSALLY!
>>
>> And if you cannot accept that fact, it's your problem. Not mine.
>
> I never had problems either with strlen() or with this warning, so I
> was curious why the warning is such a problem for the kernel.
The warning is a problem because either it clutters your build log,
hiding real bugs; or you add typecasts that hide errors whenever a char*
function is called; or you disable it globally, hiding real signedness
bugs. Whichever way you take it, it's wrong.
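Concretely, the three bad options look like this (a sketch: frame_len
and its buffer are invented, the warning text is gcc's for
-Wpointer-sign):

#include <string.h>

size_t frame_len(const unsigned char *buf)
{
        /* 1) plain call: gcc warns "pointer targets in passing argument
         *    1 of 'strlen' differ in signedness" at every such call
         *    site, burying warnings about real bugs. */
        /* return strlen(buf); */

        /* 2) cast: this warning goes away -- and so would a warning
         *    about a genuinely wrong pointer cast the same way. */
        return strlen((const char *)buf);

        /* 3) or build with -Wno-pointer-sign and lose the warning
         *    everywhere, including where it would be legitimate. */
}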
--
Funny quotes:
12. He who laughs last thinks slowest.