On 04/10/2016 01:00, René Scharfe wrote:
On 03.10.2016 at 19:09, Kevin Bracey wrote:
As such, NULL checks can still be elided even with your change. If you
effectively change your example to:
if (nmemb > 1)
	qsort(array, nmemb, size, cmp);
if (!array)
	printf("array is NULL\n");
then array is only checked for NULL when nmemb <= 1. You can see GCC doing
that in Compiler Explorer - it effectively turns the second if into an
"else if".
We don't support array == NULL together with nmemb > 1, so a segfault
is to be expected in such cases, and thus NULL checks can be removed
safely.
Possibly true in practice.
But technically wrong by the C standard - behaviour is undefined if the
pointer passed to qsort is invalid. You can't formally expect the defined
behaviour of a segfault when passing NULL into qsort. (Hell, maybe that
qsort has its own NULL check and silently returns! cf. printf - some
printfs will segfault when passed NULL, some print "(null)".) I've
worked on systems that don't fault reads from NULL, only writes, so those
might not segfault there if the data at NULL happened to already look
sorted...
And obviously there's the language lawyer favourite possibility of the
call causing nasal flying monkeys or whatever.
So if it's not a program error in your code for array to be NULL when
nmemb is zero, and you want a diagnostic for array == NULL with nmemb
non-zero, I think you should put that diagnostic into sane_qsort as an
assert or something, rather than rely on qsort's undefined behaviour being
a segfault:
static void sane_qsort(void *array, size_t nmemb, size_t size,
		       int (*cmp)(const void *, const void *))
{
	if (nmemb >= 1) {
		assert(array);	/* defined diagnostic, not hoped-for segfault */
		qsort(array, nmemb, size, cmp);
	}
}
Can't invoke undefined behaviour from NULL without triggering the
assert. (Could still have other invalid pointers, of course).
Usually I am on the side of "no NULL checks", as I make the assumption
that we will get a segfault as soon as NULL pointers are used, and those
are generally easy to diagnose. But seeing a compiler apply this sort of
new trickery on the back of undefined behaviour makes me more nervous
about doing so...
To make that check really work, you have to do:
if (array)
	qsort(array, nmemb, size, cmp);
else
	printf("array is NULL\n");
So maybe your "sane_qsort" should be checking array, not nmemb.
It would be safe, but arguably too much so, because non-empty arrays with
a NULL pointer wouldn't segfault anymore, and would thus become harder to
identify as the programming errors they are.
Well, you get the print. Although I guess you're worrying about the
second if being real code, not a debugging check.
I must say, this is quite a courageous new optimisation from GCC. It
strikes me as finding a language lawyer loophole that seems to have been
intended for something else (mapping library functions directly onto
CISCy CPU intrinsics), and using it to invent a whole new optimisation
that seems more likely to trigger bugs than optimise any significant
amount of code in a desirable way.
Doubly weird, as there's no (standard) language support for this. I don't
know how you'd define a "my_qsort" that triggered the same optimisations
(the closest I can think of is a compiler extension - rough sketch below).
I've seen similar
library-knowledge-without-any-way-to-reproduce-in-user-code
optimisations like "malloc returns a new pointer that doesn't alias with
anything existing" (and no way to reproduce the optimisation with
my_malloc_wrapper). But those seemed to have a clear performance
benefit, without any obvious traps. Doubtful about this one.
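(The nearest equivalent I can think of is a compiler extension rather than
the language itself: declaring my_qsort with GCC's nonnull function
attribute should, as I understand it, let the optimiser make the same
"this pointer can't be NULL" assumption about the caller's argument -
which rather underlines that standard C gives you no way to say it. A
rough sketch, assuming GCC or Clang and their documented attribute
spelling:)

#include <stddef.h>

/*
 * Hypothetical wrapper declaration: nonnull(1, 4) tells the compiler
 * that base and cmp must never be NULL, so after a call it may assume
 * the caller's pointer is non-NULL and delete later NULL checks -
 * the same class of optimisation, but via an extension, not ISO C.
 */
void my_qsort(void *base, size_t nmemb, size_t size,
	      int (*cmp)(const void *, const void *))
	__attribute__((nonnull(1, 4)));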
Kevin