> On 20 Jul 2017, at 09:41, Volodymyr Sendetskyi <volodymyrse@xxxxxxxxxx> wrote:
>
> It is known that git handles the storage of binary files in its
> repositories poorly. This is especially true for large files: even
> without any changes to these files, their copies are snapshotted on
> each commit. So even repositories with a small amount of code can grow
> very fast in size if they contain some large binary files. SVN handles
> this much better, because it changes the server-side version of a file
> only when the file itself has actually changed.
>
> So the question is: why not implement some feature that would handle
> this problem?
>
> Of course, I don't know git's internal structure and the way it works,
> plus some nuances (likely about the snapshots and the way they are
> made), so handling this may be a big problem. But the easiest feature
> for me as an end user would be something like '.gitbinary', where I
> can list binary files that should behave like they do on SVN, or
> something even more optimal if you can implement it. Maybe there would
> be a need for separate kinds of repositories, or even servers. But
> that would be a great change and a logical next step in git's
> evolution.

GitLFS [1] might be the workaround you want. There are also efforts to
bring large file support natively to Git [2].

I tried to explain GitLFS in more detail here:
https://www.youtube.com/watch?v=YQzNfb4IwEY
(a minimal usage sketch is appended below the references)

- Lars

[1] https://git-lfs.github.com/
[2] https://public-inbox.org/git/20170620075523.26961-1-chriscool@xxxxxxxxxxxxx/
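
In case a concrete example helps, here is a minimal sketch of how GitLFS
is typically enabled for a repository. The "*.psd" pattern and the file
and branch names are only illustrations; substitute whatever binary file
types and names you actually use:

  # one-time setup per machine: install the LFS hooks into git
  $ git lfs install

  # tell LFS which file patterns to manage; this writes to .gitattributes
  $ git lfs track "*.psd"

  # commit .gitattributes so the tracking rules are shared with others
  $ git add .gitattributes
  $ git commit -m "Track Photoshop files with Git LFS"

  # from now on, matching files are stored in the repository as small
  # pointer files, and their contents are uploaded to the LFS store on push
  $ git add design.psd
  $ git commit -m "Add design file"
  $ git push origin master

Note that this requires the git-lfs client to be installed and a hosting
server that supports LFS (GitHub, GitLab, Bitbucket, etc.).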