Eivind LM wrote:
So you think it is a problem to mix types? Then we agree on something.
The code example was a response to your paragraph above where you
wrote that "assigning int type to char is perfectly reasonable and
well understood". I would not write such code. If I mistakenly assign
an int to a char, then I would like a warning, no matter how well
defined the behaviour is.
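For concreteness, a minimal sketch of the assignment in question (file
name and code are illustrative); GCC's -Wconversion, which is not part
of -Wall, is the switch that reports it:

    /* narrow.c -- compile with: gcc -Wconversion -c narrow.c */
    char narrow(int value)
    {
        char c = value;   /* defined behaviour (implementation-defined for
                             out-of-range values), yet -Wconversion warns
                             that the conversion may change the value      */
        return c;
    }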
I think it's more important to not make the mistake in the first place.
If you're writing code where you freely interchange data types all
willy-nilly, you have bigger problems than what warnings GCC emits.
It's like saying you need spell check in your email client to write well.
Exactly: don't mix types. Don't send a double as parameter to a
function that takes int (which you wrote is well defined behaviour).
Don't compare a float to an int (which you earlier wrote is perfectly
valid).
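A small sketch of those two cases (the helper function and values are
invented for illustration):

    #include <stdio.h>

    static int take_int(int n) { return n + 1; }

    int main(void)
    {
        double d = 3.7;
        float  f = 2.0f;
        int    i = 2;

        int r = take_int(d);   /* double -> int: well defined, truncates to 3,
                                  silently dropping the .7; -Wconversion warns */
        if (f == i)            /* float vs int: valid, i is converted to float
                                  before the comparison                        */
            puts("equal");
        printf("%d\n", r);
        return 0;
    }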
But I would expect known behaviour. For example, if I were writing a
FIR or IIR function and happened to have int data, I wouldn't expect a
warning from passing an int to a function that accepts float. If I
passed an "int *" to a function that takes "float *" I would expect a
warning because the code is clearly wrong and won't work properly.
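Roughly those two situations, as a sketch (function names invented):

    float scale(float x)   { return x * 0.5f; }
    void  fill(float *dst) { dst[0] = 1.0f; }

    void demo(void)
    {
        int sample = 42;
        int buf[4] = {0};

        float y = scale(sample);   /* int -> float: quiet under -Wall, and the
                                      value is preserved here                  */
        /* fill(buf); */           /* int * -> float *: clearly wrong; GCC
                                      reports the incompatible pointer type
                                      without any extra flags                  */
        (void)y; (void)buf;
    }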
If I do something like that by mistake, then I would like the compiler
to warn me, no matter if it's valid or well defined, because the code
might not do what I intended.
But you shouldn't be in a position where you're freely interchanging
data types in random expressions anyways. If you are, you need to
re-write your algorithm from scratch.
I'm not trying to force any warnings on anyone. The only thing I'm
saying about -Wall is that the name is confusing and should be changed.
Except for everyone else who lives with it and is getting on just fine.
I'm ok with an additional flag, I just don't want -Wall to change (in
this respect anyways).
So you are saying that the unlikely cases are less serious? Like the
int-to-char assignment, which works fine because the int is *likely*
to be in [0,255]? Then it turns out that the int can be 256 before
assignment to char, in a very special corner case. How serious this
is does not depend on how likely it is.
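A sketch of that corner case (using unsigned char so the wrap-around is
fully defined):

    #include <stdio.h>

    int main(void)
    {
        int count = 256;           /* "likely" to be in [0,255] -- but not here */
        unsigned char c = count;   /* defined behaviour: reduced modulo 256     */
        printf("%d\n", c);         /* prints 0, which is probably not intended  */
        return 0;
    }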
No, it's less serious because it's defined behaviour.
We are talking about behaviour which is possibly unintended, right?
That's when I would like a warning. I don't understand why you think
the consequence (or seriousness) of the unintended behaviour is
related to its likelihood to fail, or whether the behaviour is well
defined or not.
Because not everyone accidentally mixes types. If I store a long in an
unsigned char that I know is in range [or I don't care about the
higher-order bits], I don't want my compiler bitching and whining to me
over something that has clearly defined behaviour.
Let me put it this way: you can write perfectly syntactically correct
code that has the complete opposite meaning of what you want, for
example "if (a = 3) { ... }". I'm for catching that one because it's a
typo in 99% of cases and is good to find.
Whereas storing a long in a char is *not* a typo, it's a design flaw,
and if you're worried about losing precision there, it means you don't
know what you're doing.
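To make the contrast concrete, a minimal sketch (names invented; the
flags mentioned are GCC's):

    /* contrast.c -- compile with: gcc -Wall -c contrast.c */
    int contrast(int a, long big)
    {
        unsigned char low = big;   /* a design decision, not a typo: silent
                                      under -Wall, flagged only by -Wconversion */
        if (a = 3)                 /* a typo for (a == 3): -Wall, via
                                      -Wparentheses, warns about the assignment
                                      used as a truth value                     */
            return low;
        return 0;
    }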
Ok. I have about 50,000 lines of C++ code so far. The lines are spread
over different libraries though, so it's not the same project.
And you think loss of precision is your biggest problem? ... Ok.
Less likely does not mean less serious.
It's an irresponsible use of time to hunt down and fix things that
aren't actually bugs when you can very likely have real bugs in your
software.
I am a human, and make mistakes from time to time. I expect to keep
making mistakes, even after the next 50 years of experience with C++.
But I don't want to make a mistake 50 years from now that a GCC
warning today could have taught me to avoid.
And what I'm trying to tell you is that you're not better served by having
pedantic warnings about things that aren't undefined behaviour or
obvious typos.
Just like micro-optimizations can be time consuming and wasteful, so
can micro-linting.
I like how you didn't reply to this.
But that does not make the syntax part less important for me.
The syntax should be second nature to you. I resort to looking at the
draft or the ANSI C spec maybe once a year, and even then it's over very
obscure things that you don't see in day-to-day development tasks.
You should know operator precedence and associativity off the top
of your head; you should know the type promotions of expressions and
whatnot right away.
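Two everyday instances of what that means (values chosen for
illustration):

    #include <stdio.h>

    int main(void)
    {
        unsigned char a = 200, b = 100;
        int sum = a + b;                 /* integer promotion: both operands
                                            become int, so sum is 300, not 44  */

        int flags = 0x04;
        int wrong = flags & 0x04 == 0;   /* precedence: == binds tighter than &,
                                            so this is flags & (0x04 == 0),
                                            which is 0; gcc -Wall suggests
                                            parentheses here                   */
        printf("%d %d\n", sum, wrong);
        return 0;
    }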
By "those warnings", you mean a warning for something that is
absolutely sure to not be a problem under any circumstance?
Not everyone is so unsure about the syntax and language as you are.
Could you please write and send me some example code that causes a
non-trivial warning with gcc, and where you can prove that there are no
potential problems with the code? I have yet to see such a warning,
and it would be very educational for me to see.
Well, the warning you desired that started this thread is a good
example. The sorts of things splint warns about are good examples too,
and so on.
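A sketch of the kind of code meant here: the value is provably in range,
yet the warning still fires (assuming GCC's -Wconversion; the function
name is invented):

    unsigned char clamp_store(int i)
    {
        if (i < 0)   i = 0;
        if (i > 255) i = 255;
        return i;     /* i is provably in [0,255] at this point, but gcc
                         -Wconversion typically still warns that the
                         conversion to unsigned char may change the value,
                         since the diagnostic does not follow control flow */
    }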
You keep using the word "likely". If there is even the slightest chance
that one of the warnings can save me one of the really hard debugging
sessions, then I will keep caring about compiler warnings.
And you will miss a whole slew of real problems because you're worried
about micro-linting your code.
More warnings are only a good idea if the warnings are in fact useful and
likely to represent real-life bugs. Warning about the type promotion of
expressions is just annoying and, frankly, a complete waste of time. Put
it this way: for every warning you want to see, try to imagine what
percentage of bugs in the real world can be attributed to it.
Now you're gonna say "but if there is a chance ..." ... but then I'll
say the time you waste on it is time not spent shoring up your code,
then you'll say "but if there is a chance ..." and I'm just going to
give up now.
Tom