commit sized around 100 GB in changes failed to push to a TFS remote - Git


 



To @Git Community
From the perspective of an Azure DevOps support engineer: I have a customer who is unable to push, with the following error:

fatal: The remote end hung up unexpectedly
failed to push some refs to https://zelos.healthcare.siemens.com/tfs/Hoover/VA20A.DevInt.Gvfs/_git/Saturn

The local repository has only one change compared to the remote: a commit with SHA 504aedfdbb on a branch called gitTest.
That said, the layout is as follows:

[Remote] - master
b946c27c

[Local] - gitTest branch
504aedfdbb
b946c27c


Important data:
- The commit 504aedfdbb contains over 100 GB of file changes (a way to verify this is sketched after this list)
- The remote git repository is a TFS server
- The customer isn't building code - they are using the remote more or less as a storage service <- We understand this is not best practice, but it is how the customer is using Git and TFS. If the @Git Community could confirm/elaborate on this, the customer may change their current approach.
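For anyone who wants to double-check the size figure, here is roughly how the new blobs between the shared commit and the big commit can be totalled. This is a sketch of the standard rev-list/cat-file idiom, not the customer's literal command:

    $ git rev-list --objects b946c27c..504aedfdbb |
        git cat-file --batch-check='%(objecttype) %(objectsize) %(rest)' |
        awk '$1 == "blob" { total += $2 } END { print total " bytes in new blobs" }'

The first command lists every object introduced by the big commit, the second prints each object's type and size, and the awk filter sums the blob (file content) sizes.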

Things tried:
- reset the local history back to the latest shared commit b946c27c, committed something small, and successfully pushed it to a brand-new branch on the remote by running $ git push origin <name of local branch> (sketched after this list)
- cherry-picked the commit onto local master and attempted to push = failed. <- this makes me think the failure is caused entirely by the oversized commit
- raised the http.postBuffer configuration = failed. Rolled the configuration back to the default, per the MSFT docs (also sketched after this list): https://docs.microsoft.com/en-us/azure/devops/repos/git/rpc-failures-http-postbuffer?view=azure-devops
- since this is a TFS server, I initially thought this could be caused by insufficient disk capacity on the server hosting the TFS product. But @Vimal Thiagaraj has confirmed that repository size limits depend on the remote TFS databases, not on the server itself. Is there a limit on these databases, or on how much change a single git commit can contain?
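For clarity, the reset and cherry-pick tests were along these lines. The branch and file names (smallTest, small.txt) are placeholders, not what the customer actually used:

    $ git checkout -b smallTest b946c27c     # start a new branch at the shared commit
    $ echo test > small.txt
    $ git add small.txt
    $ git commit -m "small test commit"
    $ git push origin smallTest              # succeeds

    $ git checkout master
    $ git cherry-pick 504aedfdbb             # replay the oversized commit
    $ git push origin master                 # fails with the same hang-up error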
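The http.postBuffer experiment looked roughly like this; the exact buffer value we tried may have differed, and the linked MSFT doc recommends removing the setting again, which we did:

    $ git config http.postBuffer 524288000   # raise the buffer to 500 MB
    $ git push origin gitTest                # still fails
    $ git config --unset http.postBuffer     # roll back to the default per the doc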

Things I've suggested to the customer:
- commit more frequently, in smaller batches (sketched below)
- understand that git is designed to collaborate on and track versions of files over time - it is not a cloud storage provider
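To make the smaller-batches suggestion concrete, one way to split the existing 100 GB commit into pushable pieces would be something like the following. The directory names are placeholders for whatever natural subsets the customer's data has:

    $ git checkout gitTest
    $ git reset b946c27c                     # drop the commit but keep all changes in the working tree
    $ git add data-part1/
    $ git commit -m "Add data, part 1"
    $ git push origin gitTest
    $ git add data-part2/
    $ git commit -m "Add data, part 2"
    $ git push origin gitTest
    (repeat until everything is committed and pushed)

Note this rewrites the unpushed gitTest history, which is safe here because 504aedfdbb never made it to the remote.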

Would appreciate any insight on this, @Git Community. Thanks to @Phillip Oakley, who took the time to answer the last time I posted a question to this mailing list.



