Re: git-cvsexportcommit fails for huge commits


On Wed, Dec 12, 2007 at 01:21:14AM -0800, Junio C Hamano wrote:

> > +sub xargs_safe_pipe_capture {
> > +	my $MAX_ARG_LENGTH = 1024;
> > +	my $cmd = shift;
> > +	my @output;
> > +	while(@_) {
> > +		my @args;
> > +		my $length = 0;
> > +		while(@_ && $length < $MAX_ARG_LENGTH) {
> > +			push @args, shift;
> > +			$length += length($args[$#args]);
> > +		}
> > +		push @output, safe_pipe_capture(@$cmd, @args);
> > +	}
> > +	return @output;
> > +}
> > +
> 
> Makes me wonder why you are not spawning xargs by doing it by hand.  If

Because we are reading the output, and spawning xargs could lead to a
pipe deadlock: with both the child's stdin and stdout connected to
pipes, the writer and the reader can each block waiting for the other.
This could be avoided with a tempfile.
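The tempfile workaround mentioned above could be sketched like this (a
Python illustration with a hypothetical `capture_via_tempfile` helper;
this is not code from git):

```python
import subprocess
import sys
import tempfile

def capture_via_tempfile(cmd):
    """Run cmd with stdout sent to a temp file, then read the file back.

    The child writes to a regular file, never to a pipe, so it cannot
    block on a full pipe buffer no matter how much output it produces --
    the classic deadlock risk when the parent is simultaneously feeding
    the child's stdin.
    """
    with tempfile.TemporaryFile() as out:
        subprocess.run(cmd, stdout=out, check=True)
        out.seek(0)
        return out.read().decode()

# Works even for commands producing more output than a pipe buffer holds.
print(capture_via_tempfile([sys.executable, "-c", "print('ok')"]))
```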

> the path at the beginning happens to be longer than 1024 then you will
> run path-less "cvs status"?

No, read the loop again: $length starts at 0, so the inner loop body
always runs at least once, and every cvs invocation therefore gets at
least one path, even a path longer than 1024.
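The batching logic under discussion can be shown in a standalone sketch
(Python here, with a hypothetical `batch_args` helper; the real code is
the Perl above). Because the length check happens before each argument
is added and the length starts at 0, every batch contains at least one
argument:

```python
MAX_ARG_LENGTH = 1024  # same cap as in the patch

def batch_args(args):
    """Split args into batches whose cumulative length stays near the cap.

    Mirrors the nested while loops of xargs_safe_pipe_capture: the outer
    loop runs until all args are consumed; the inner loop always admits
    the first argument of a batch, so an over-long path still yields a
    one-element batch rather than an empty command line.
    """
    args = list(args)
    batches = []
    while args:
        batch = []
        length = 0
        while args and length < MAX_ARG_LENGTH:
            batch.append(args.pop(0))
            length += len(batch[-1])
        batches.append(batch)
    return batches

# A 2000-char path exceeds the cap but still forms its own batch:
#   batch_args(["x" * 2000, "a", "b"]) -> [["xxx...x"], ["a", "b"]]
```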

-Peff
-
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
