On Wed, 25 Feb 2009 15:20:34 +0100, Tom St Denis
<tstdenis@xxxxxxxxxxxxxxxx> wrote:
Eivind LM wrote:
You're assigning an "int" type to a char. splint will warn you about
this, even
though it's perfectly reasonable and well understood [not to mention
portable]
code. Is that useful?
I'll repeat myself: If the compiler can guarantee that I don't lose
precision in the assignment, then I don't want a warning.
Don't mix types then? I know of no reason to use "char" unless you're
dealing with strings or octet data. I'd never use a char in a
day-to-day expression (e.g. as an index to an array, counter, etc).
However, if I have
int a;
ask_user(&a);
char b = a;
then I think a warning is in place.
Why? It's defined behaviour. Your real problem is mixing types, not
the promotion problems.
So you think it is a problem to mix types? Then we agree on something. The
code example was a response to your paragraph above where you wrote that
"assigning int type to char is perfectly reasonable and well understood".
I would not write such code. If I mistakenly assign an int to a char, then
I would like a warning, no matter how well defined the behaviour is.
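To make it concrete, here is a complete version of that sketch (scanf stands
in for the hypothetical ask_user, and the exact wording of any diagnostic
depends on the GCC version):

#include <stdio.h>

int main(void)
{
    int a = 0;
    char b;

    if (scanf("%d", &a) != 1)
        return 1;

    b = a;              /* implicit int -> char; if a happens to be 256, b ends up as 0 on a typical 8-bit char */
    printf("%d\n", b);
    return 0;
}

Compiling this with gcc -Wconversion points at the "b = a;" assignment, which
is exactly the hint I am asking for: nothing undefined happens, but the value
can silently change.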
As I wrote earlier, I consider these as two totally different things.
In the first case, the value is changed. In the second case, the value
is not changed.
But it has defined behaviour.
The first case might have well defined behaviour. But anyway, my value
is changed by 20%. If I wanted to skip the decimals from 2.5, then I
would have cast the value to an int explicitly. That's why I want a
warning in the cases where any of my values are implicitly changed.
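In code, the difference I am talking about is roughly this (a minimal sketch):

void example(double d)          /* imagine d arrives as 2.5 */
{
    int i = d;                  /* implicit: the value silently becomes 2 */
    int j = (int)d;             /* explicit: the truncation is clearly asked for */
    (void)i;
    (void)j;
}

A warning on the first assignment is exactly what I want; the cast on the
second line states the intent in the source (and, being explicit, is not
something -Wconversion complains about).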
Don't mix types? If you're writing a DSP or other math library chances
are you wouldn't have random functions that take int and some that take
float.
Exactly: don't mix types. Don't send a double as a parameter to a function
that takes an int (which you wrote is well defined behaviour). Don't compare
a float to an int (which you earlier wrote is perfectly valid).
If I do something like that by mistake, then I would like the compiler to
warn me, no matter if it's valid or well defined, because the code might
not do what I intended.
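The kind of slips I have in mind look something like this (set_count is just
a made-up function for illustration):

void set_count(int n);              /* hypothetical: a function taking an int */

void caller(double x, float f, int n)
{
    set_count(x);                   /* the double is silently truncated at the call */
    if (f == n) {                   /* float compared to int; n is converted to float first */
        /* ... */
    }
}

Both lines compile without a peep by default; a warning would make me stop
and check whether that is really what I meant.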
I am just saying that 1) I would like to have a warning whenever an
implicit conversion happens that might be "value destroying". And 2)
since I consider this a serious issue, I expect the other warnings
in GCC (probably also those warnings that I am not aware of) to be
serious as well. That's why I would like to enable the whole lot to
find out what they can teach me.
Ok, but -Wconversion exists. Don't go tacking that onto -Wall so us
programmers who know what we're doing get stuck with it.
Yes, I found -Wconversion to be very useful. I wonder how many other flags
there are in GCC that might prove to be just as useful for me. If there
was a -Weverything flag, then it would be easy to find out.
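What I mean is that today every flag has to be listed by hand, something like
(just an example command line, with a made-up file name):

g++ -Wall -Wextra -Wconversion -c myfile.cpp

and I can only add a flag after I already know it exists. A -Weverything
option would save me from that.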
I will not go about tacking any warnings on anyone. The only thing I'm
saying about -Wall is that the name is confusing and should be changed.
So you are saying that the unlikely cases are less serious? Like the
int to char assignment, that works fine because the int is *likely* to
be in [0,255]? Then it turns out that the int can be 256 before
assignment to char, in a very special corner case. How serious this is
does not depend on how likely it is.
No, it's less serious because it's defined behaviour.
We are talking about behaviour which is possibly unintended, right? That's
when I would like a warning. I don't understand why you think the
consequence (or seriousness) of the unintended behaviour is related to how
likely it is to happen, or to whether the behaviour is well defined or not.
Generally, I would rather say less likely cases are more serious than
highly likely cases. The highly likely cases are usually easy to discover
while testing the software anyway. The less likely cases are the ones
that are hard to find when you test, and they cause the most hard-to-debug
problems you receive after release.
I have yet to have any real defects found by trivial and
hypersensitive syntax checking. Wait till you have a 60,000-line
project with hundreds of interdependencies between functions, then
you'll start worrying about something a little more serious than defined
behaviour.
Ok. I have about 50,000 lines of C++ code so far. The lines are spread
over different libraries though, so it's not all one project.
So I won't say no thanks if GCC has the ability to warn me about the less
likely cases.
I have to ask you, what percentage of bugs do you suppose are attributed
to storing ints in chars (or similar)? 10%? 1%? 0.001%? And how much
will you miss because you spend time worrying about things like this
instead of just developing properly to start with?
Less likely does not mean less serious.
I am human, and make mistakes from time to time. I expect to keep making
mistakes, even after the next 50 years of experience with C++. But I don't
want to make a mistake 50 years from now that a GCC warning today
could have taught me to avoid.
Just like micro-optimizations can be time consuming and wasteful, so can
micro-linting.
If you want to learn more about C, pick up the ISO C draft and read
it. Don't rely on the warnings from GCC to teach you what is and
isn't good C code.
I have Bjarne's book for C++, and think it is a great reference. But I
can't go about reading the whole thing and expect to be a fluent C++
programmer the next day. There are several ways to learn. One good way
for me is if possible problems in my own code are pointed out to me as
early as possible. That way I can look up in the book to find what the
problem is, and consider whether the problem is a real issue or not.
Afterwards, I will actually remember what I read in the spec, since it
was directly related to my own code.
Yeah, but again, you want warnings for things that aren't errors or
undefined behaviour. Where do you draw the line?
If you want to learn how to develop software, just pick problems and
solve them with software. Then test and verify, document and support.
GCC won't teach you how to be a good developer. And frankly, there is a
heck of a lot more to being a software developer than knowledge of the
syntax of a given language.
But that does not make the syntax part less important for me.
I think I understand your concern. But once again, I don't think a cast
is mindless or useless if it actually changes the data value. The above
cast does not change the data value, and I agree it should not be
necessary.
But it's your type of thinking that leads to those warnings in the first
place. Then customers get wind of that and *demand* that we address
them. It's really annoying.
By "those warnings", you mean a warning for something that is absolutely
sure to not be a problem under any circumstance?
Could you please write and send me some example code that causes a
non-trivial warning with gcc, and where you can prove that there are no
potential problems with the code? I have yet to see such a warning, and it
would be very educational for me to see.
I agree it takes more than warning-free code to be bug-free. But some
of the hard-to-debug bugs can be avoided by warnings, so I want to use
the warnings for all they are worth.
Ok, but while you're wasting time chasing down every useless warning,
you're *not* learning about proper defensive coding, you're *not*
learning about common defects, and you're *not* becoming a good software
developer.
If you really want to learn how to debug/fix software, get familiar with
gdb, valgrind, and the like. Learn about common defects like buffer
overflows/overruns, race conditions, etc.
I use gdb and valgrind. I have done my time debugging writes outside array
boundaries. I have used pthreads and debugged race conditions. But I still
care about compiler warnings. I don't think there is a contradiction there.
But we definitely have very different ideas about this, and probably
won't get any closer to agreeing. But thanks for your opinions, I
learned a lot! :)
Just wait till you have customers with "coding standards" like MISRA or
whatever that say things like "goto can never be used." Right after you
put together a package which uses them exclusively (for error
handling). Pointless coding rules (among which I lump useless warnings)
lead people to miss the bigger picture, and in the end to the real defects
that plague large software projects. You don't see it now, maybe because you
haven't been on the working end of a large project, but trust me. You
won't gain experience until you actually work on projects, and those
projects will have defects, and your defects will likely not be syntax
related.
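For the record, the pattern such rules outlaw is roughly this (a
stripped-down sketch, not from any real package):

#include <stdlib.h>

int do_work(void)
{
    char *buf1 = NULL, *buf2 = NULL;
    int err = -1;

    buf1 = malloc(1024);
    if (buf1 == NULL) goto cleanup;

    buf2 = malloc(1024);
    if (buf2 == NULL) goto cleanup;

    /* ... the real work ... */
    err = 0;

cleanup:
    free(buf2);       /* free(NULL) is a no-op, so this is safe on every path */
    free(buf1);
    return err;
}

One exit path, no leaks on any error branch. Ban goto outright and the same
logic turns into nested ifs or duplicated free() calls, which is where the
real defects creep in.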
You keep using the word "likely". If there is even the slightest chance that
one of the warnings can save me one of the really hard debugging sessions,
then I will keep caring about compiler warnings.
Eivind