Re: commit sized around 100 gb in changes failed to push to a TFS remote - Git

Hi Aram
On 14/06/2019 17:47, Aram Maliachi (WIPRO LIMITED) wrote:
To @Git Community
From the perspective of an Azure DevOps support engineer: I have a customer who is unable to push, failing with the following error:

fatal: The remote end hung up unexpectedly
failed to push some refs into https://zelos.healthcare.siemens.com/tfs/Hoover/VA20A.DevInt.Gvfs/_git/Saturn

Compared with the remote, the local repository has only one additional change: a commit with SHA 504aedfdbb on a branch called gitTest.
The layout is as follows:

[Remote] - master
b946c27c

[Local] - gitTest branch
504aedfdbb
b946c27c


Important data:
- The commit 504aedfdbb contains over 100 GB of file changes
- The remote Git repository is hosted on a TFS server
- The customer isn't building code; they are using the remote essentially as a storage service <- We understand this is not best practice, but it is how the customer is using Git and TFS. If the @Git Community could confirm/elaborate on this, the customer may change their current approach.

Things tried:
- Reset the local repository's history back to the latest shared commit b946c27c and committed something small, which pushed successfully to a brand new branch on the remote by running $ git push origin <name of local branch>
- Cherry-picked the commit onto local master and attempted to push = failed. <- This makes me think the failure is caused entirely by the oversized commit.
- Increased the http.postBuffer configuration = failed. Rolled the configuration back to the default per the MSFT docs https://docs.microsoft.com/en-us/azure/devops/repos/git/rpc-failures-http-postbuffer?view=azure-devops (a sketch of these commands follows this list)
- Since this is a TFS server, I initially thought this could be caused by insufficient disk storage on the server hosting the TFS product. But @Vimal Thiagaraj has confirmed that the repository size limit depends on the remote TFS databases, not on the server itself. Is there a limit on these databases, or on how much change a single Git commit can contain?
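For reference, a minimal sketch of the commands behind the steps above, assuming origin is the TFS remote; the test branch name and the empty test commit are placeholders for what the customer actually ran:

$ git reset --hard b946c27c                  # roll local history back to the last shared commit
$ git commit --allow-empty -m "push test"    # small test commit (empty commit used here for illustration)
$ git push origin HEAD:refs/heads/push-test  # succeeded (placeholder branch name)

$ git checkout master
$ git cherry-pick 504aedfdbb                 # re-apply the oversized commit onto local master
$ git push origin master                     # failed as described

$ git config http.postBuffer 524288000       # raise the HTTP post buffer (value in bytes)
$ git config --unset http.postBuffer         # roll back to the default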

Things I've suggested to customer:
- commit more frequently in smaller batches (see the sketch after this list)
- understand that Git is meant for collaborating on and tracking versions of files over time; it is not a cloud storage provider
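To illustrate the smaller-batches suggestion, one possible workflow is to stage and push the data a directory at a time; the paths and commit messages here are hypothetical:

$ git add data/part-01/
$ git commit -m "Add data, part 01"
$ git push origin gitTest

$ git add data/part-02/
$ git commit -m "Add data, part 02"
$ git push origin gitTest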

Would appreciate any insight on this @Git Community. Thanks to @Phillip Oakley who took the time to answer last time I posted a question to this mailing list.

Can you confirm the operating system versions and Git versions for the machine doing the push and for the server attempting to receive it? In particular, is either of them a Windows system, which currently has a 32-bit size limit (i.e. sizeof(long))?
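If it helps, those details can be gathered quickly on each machine; the Windows command shown is just one way of getting the OS version:

$ git --version                                              # Git version on client and server
$ uname -a                                                   # OS details on Linux
C:\> systeminfo | findstr /B /C:"OS Name" /C:"OS Version"    # OS details on Windows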

I have a patch series in review on the Git-for-Windows GitHub repo (#2179) that should allow objects and packs larger than 4 GB. However..

There may still be a limit in the 'transfer' process (100 GB could take a long time, hit timeouts, break internal virus checkers that monitor the feed, etc.).

Philip


