Eivind LM wrote:
You're assigning an "int" type to a char. splint will warn you about
this, even
though it's perfectly reasonable and well understood [not to mention
portable]
code. Is that useful?
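For concreteness, here's a minimal sketch of the kind of assignment I mean (the variable names are made up, and the exact warning text varies between splint and GCC versions):

#include <stdio.h>

int main(void)
{
    int digit = 7;           /* imagine this comes from a computation */
    char c = '0' + digit;    /* the result clearly fits in a char, yet
                                splint (and GCC's -Wconversion) will
                                typically warn, since the range of the int
                                expression can't be proven at this point */
    putchar(c);
    putchar('\n');
    return 0;
}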
I'll repeat myself: if the compiler can guarantee that I don't lose
precision in the assignment, then I don't want a warning.
Don't mix types then? I know of no reason to use "char" unless you're
dealing with strings or octet data. I'd never use a char in a
day-to-day expression (e.g. as an index into an array, a counter, etc.).
However, if I have
int a;
ask_user(&a);
char b = a;
then I think a warning is in order.
Why? It's defined behaviour. Your real problem is mixing types, not
the promotion problems.
As I wrote earlier, I consider these two totally different things.
In the first case the value is changed; in the second case it is not.
But it has defined behaviour.
The first case might have well-defined behaviour. But anyway, my value
is changed by 20%. If I had wanted to drop the decimals from 2.5, then I
would have cast the value to an int explicitly. That's why I want a
warning in the cases where any of my values are implicitly changed.
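Just to make this concrete (a made-up sketch, not code from a real project): the implicit conversion silently changes my value, while the explicit cast says the truncation is intended.

#include <stdio.h>

int main(void)
{
    double x = 2.5;
    int a = x;        /* implicit conversion: a becomes 2, the value is
                         silently changed by 20% -- this is what I want a
                         warning for */
    int b = (int)x;   /* explicit cast: dropping the decimals is clearly
                         what I asked for, so no warning is needed */
    printf("%d %d\n", a, b);
    return 0;
}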
Don't mix types? If you're writing a DSP or other math library, chances
are you wouldn't have a random mix of functions that take int and
functions that take float.
I am just saying that 1) I would like a warning whenever an
implicit conversion happens that might be "value destroying", and 2)
since I consider this a serious issue, I expect the other
warnings in GCC (probably also those warnings that I am not aware of)
to be serious as well. That's why I would like to enable the whole lot
to find out what they can teach me.
Ok, but -Wconversion exists. Don't go tacking it onto -Wall so that
those of us who know what we're doing get stuck with it.
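To be clear, the flag is already there if you want it; it just isn't part of -Wall or -Wextra. A made-up one-file illustration (the file name and exact warning wording are only for illustration and differ between GCC versions):

/* conv.c */
char narrow(int n)
{
    return n;   /* gcc -Wall -Wextra -c conv.c   -> quiet
                   gcc -Wconversion  -c conv.c   -> "conversion from 'int'
                   to 'char' may change value" (wording varies) */
}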
So you are saying that the unlikely cases are less serious? Like the
int-to-char assignment, which works fine because the int is *likely* to
be in [0,255]? Then it turns out that the int can be 256 before the
assignment to char, in a very special corner case. How serious this is
does not depend on how likely it is.
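That corner case, written out (the numbers are made up, and I use unsigned char so the wrap-around is at least well defined):

#include <stdio.h>

int main(void)
{
    int count = 256;             /* the "can't happen" value */
    unsigned char slot = count;  /* on the usual 8-bit-char targets this
                                    silently becomes 0 (256 mod 256) */
    printf("%d -> %d\n", count, slot);
    return 0;
}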
No, it's less serious because it's defined behaviour.
Generally, I would rather say the less likely cases are more serious than
the highly likely ones. The highly likely cases are usually easy to
discover while testing the software anyway. The less likely cases are
the ones that are hard to find when you test, and the source of the most
hard-to-debug problems you receive after release.
I have yet to have a single defect found by trivial and
hypersensitive syntax checking. Wait till you have a 60,000-line
project with hundreds of interdependencies between functions; then
you'll start worrying about something a little more serious than defined
behaviour.
So I won't say no thanks if GCC has the ability to warn me about the less
likely cases.
I have to ask you: what percentage of bugs do you suppose can be attributed
to storing ints in chars (or similar)? 10%? 1%? 0.001%?
And how much will you miss because you spend time worrying about things
like this instead of just developing properly to start with?
Just as micro-optimization can be time-consuming and wasteful, so can
micro-linting.
If you want to learn more about C, pick up the ISO C draft and read
it. Don't rely on the warnings from GCC to teach you what is and
isn't good C code.
I have Bjarne's book for C++, and think it is a great reference. But I
can't go about reading the whole thing and expect to be a fluent C++
programmer the next day. There are several ways to learn. One good way
for me is having possible problems in my own code pointed out to me as
early as possible. That way I can look the problem up in the book and
consider whether it is a real issue or not.
Afterwards, I will actually remember what I read in the spec, since it
was directly related to my own code.
Yeah, but again, you want warnings for things that aren't errors or
undefined behaviour. Where do you draw the line?
If you want to learn how to develop software, just pick problems and
solve them with software. Then test and verify, document and support.
GCC won't teach you how to be a good developer. And frankly, there is a
heck of a lot more to being a software developer than knowledge of the
syntax of a given language.
I think I understand your concern. But once again, I don't think a
cast is mindless or useless if it actually changes the data value. The
above cast does not change the data value, and I agree it should not
be necessary.
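(This is not the cast from earlier in the thread, just a made-up pair to show the distinction I mean:)

int main(void)
{
    int n = 200;
    long wide = n;          /* widening: the value can never change, so a
                               cast here would be pure noise */
    char narrow = (char)n;  /* narrowing: the value may well change, so the
                               explicit cast at least documents the intent */
    (void)wide;
    (void)narrow;
    return 0;
}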
But it's your type of thinking that leads to those warnings in the first
place. Then customers get wind of them and *demand* that we address
them. It's really annoying.
I agree it takes more than being warning-free to be bug-free. But some
of the hard-to-debug bugs can be avoided by warnings, so I want to use
the warnings for all they are worth.
Ok, but while you're wasting time chasing down every useless warning,
you're *not* learning about proper defensive coding, you're *not*
learning about common defects, and you're *not* becoming a good software
developer.
If you really want to learn how to debug/fix software, get familiar with
gdb, valgrind, and the like. Learn about common defects like buffer
overflows/overruns, race conditions, etc.
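The sort of defect I mean, as a made-up sketch: nothing here mixes types and no conversion warning fires, but valgrind flags the invalid writes the moment someone passes a long argument.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(int argc, char **argv)
{
    char *buf = malloc(8);       /* room for 7 characters plus '\0' */
    if (buf == NULL || argc < 2)
        return 1;
    strcpy(buf, argv[1]);        /* heap overrun whenever argv[1] is long */
    puts(buf);
    free(buf);
    return 0;
}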
But we definitely have very different ideas about this, and probably
won't get any closer to agreeing. Thanks for your opinions though; I
learned a lot! :)
Just wait till you have customers with "coding standards" like MISRA or
whatever that say things like "goto can never be used." Right after you
put together a package which uses goto exclusively (for error
handling). Pointless coding rules (among which I lump useless warnings)
lead people to miss the bigger picture, and in the end the real defects
that plague large software projects.
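For the record, the goto pattern in question is the usual single-exit cleanup idiom, roughly as sketched below (the resource names and sizes are made up):

#include <stdio.h>
#include <stdlib.h>

int process(const char *path)
{
    int ret = -1;
    FILE *f = NULL;
    char *buf = NULL;

    f = fopen(path, "rb");
    if (f == NULL)
        goto out;

    buf = malloc(4096);
    if (buf == NULL)
        goto out;

    if (fread(buf, 1, 4096, f) == 0)
        goto out;

    /* ... use buf ... */
    ret = 0;

out:                       /* a single cleanup path for every error */
    free(buf);             /* free(NULL) is a no-op */
    if (f != NULL)
        fclose(f);
    return ret;
}

int main(int argc, char **argv)
{
    return (argc > 1 && process(argv[1]) == 0) ? 0 : 1;
}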
You don't see it now, maybe because you haven't been on the working end
of a large project, but trust me. You won't gain experience until you
actually work on projects, and those projects will have defects, and
your defects will likely not be syntax-related.
Tom