Re: [PATCH] Makefile: dedup git-ls-files output to prevent duplicate targets

Ævar Arnfjörð Bjarmason <avarab@xxxxxxxxx> writes:

> I pointed out then that with --sort-by-file added we:
>
>  * Don't group the translations by C/SH/Perl anymore
>  * Change the sort order within files, to be line/sorted instead of
>    line/order (i.e. first occurring translations first)
>
> I suggested then to just use $(sort) on the respective lists.
>
> So why not just:
>
>  1. Switch to the $(FOUND_C_SOURCES) (good)
>  2. Filter that by C/Perl/SH as before (just a simple $(filter))
>  3. $(sort) that (which as noted, also de-dupes it)
>
> Then we don't have any of the behavior change of --sort-by-file, and we
> don't have to carefully curate the ls-files/find commands to not include
> duplicates (although as seen here that seems to have been a useful
> canary in the "find" case).

Does "--sort-by-file" really mean that?

The option is documented to sort the output by file location, but
does that mean that without the option (i.e. by default) there is
no guarantee about the output order?  Or are we sure that the
output is sorted in the order of the input files, and that this is
guaranteed to hold in the future?

If we are depending on a certain ordering of the output produced
by the gettext suite of programs, I would keep the option,
regardless of what we do to their input, if I were running the
i18n part of this project.

But I am not, so I would not complain if --sort-by-file is dropped
against my advice ;-)
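
For reference, a minimal GNU make sketch of the $(filter)+$(sort)
approach from the quoted steps; only FOUND_C_SOURCES is named in the
quoted message, so the input list and the other variable names below
are purely illustrative:

    # Illustrative input list; in the real Makefile this would come
    # from "git ls-files" or "find".  Note builtin/add.c appears twice.
    FOUND_SOURCE_FILES = git.c builtin/add.c git-send-email.perl \
                         git-sh-setup.sh builtin/add.c

    # Step 2: filter by language, as before.
    FOUND_C_SOURCES = $(filter %.c,$(FOUND_SOURCE_FILES))
    FOUND_SH_SOURCES = $(filter %.sh,$(FOUND_SOURCE_FILES))
    FOUND_PERL_SOURCES = $(filter %.perl,$(FOUND_SOURCE_FILES))

    # Step 3: $(sort) sorts lexically and removes duplicate words,
    # so builtin/add.c ends up in the C list only once.
    LOCALIZED_C = $(sort $(FOUND_C_SOURCES))
    LOCALIZED_SH = $(sort $(FOUND_SH_SOURCES))
    LOCALIZED_PERL = $(sort $(FOUND_PERL_SOURCES))

With that, the de-duplication happens in make itself, independently of
whether xgettext is invoked with --sort-by-file.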




