Re: [PATCH] gc: ignore old gc.log files

On Thu, Feb 09, 2017 at 11:57:12PM -0000, Philip Oakley wrote:

> > > +gc.maxLogAge::
> > > + If the file gc.log exists, then `git gc --auto` won't run
> > > + unless that file is more than maxLogAge seconds old.  Default
> > > + is 86400, one day.
> 
> Is there a reason why one day is chosen? If maintenance staff are available
> 24/7 then a shorter time would be appropriate, but if it's a 5 day work week
> then they may want longer. Is there a particular case it targets?
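For reference, the proposed behavior amounts to roughly the following check. This is only a sketch, not the actual patch: the `gc.log` path and the GNU `stat -c %Y` call are assumptions (on BSD/macOS the equivalent is `stat -f %m`).

```shell
# Skip auto-gc when a gc.log exists and is younger than the maxLogAge cutoff.
log="$(git rev-parse --git-dir)/gc.log"
max_age=86400  # the proposed one-day default, in seconds
if [ -e "$log" ] && [ $(( $(date +%s) - $(stat -c %Y "$log") )) -lt "$max_age" ]; then
    echo "recent gc.log found; git gc --auto would skip this run" >&2
fi
```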

I'm pretty sure the one-day time limit isn't scientific. It's just a
number we've been throwing around.

I'm not sure why maintenance staff availability matters, though. The
limit basically needs to be long enough that we're not doing _too_ many
fruitless gc's, because that wastes resources. But you'd prefer not to
go too long without a gc for a repository that needs one.

The root cause of the error could be any number of issues. But for the
case that David cares about most, you basically want to keep retrying
until the too-many-objects condition goes away. That usually happens on
a 2-week timer (the default `gc.pruneExpire`). So trying once per day
to see whether that timer has expired feels about right.

That's certainly not science, but hopefully it at least frames the
general ballpark.

One possible option would be to auto-scale it with the pruneExpire time.
I don't know if people actually tweak that value or not.
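In the meantime, anyone who does want a different window could just set it per-repo. A hypothetical example, assuming the `gc.maxLogAge` name from this patch survives review (the final config name may differ):

```shell
# Stretch the retry window to cover a 5-day work week
# instead of the proposed one-day default.
git config gc.maxLogAge $((5 * 86400))
```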

-Peff


