[no subject]

Hi, I have about 55 GB of content, consisting of binary files that are
each under 100 MB (so Git LFS is not strictly required), from a project
that has almost filled an entire hard drive. I am trying to add all of
the contents to a git repo and push it to GitHub, but every time I do

git add .

in the folder with my contents, after initializing the repo and setting
my remote, git starts copying all the files into .git/objects, making
the .git folder grow rapidly in size. All the files are binaries, so
git cannot produce meaningful diffs between versions anyway, and there
seems to be no reason to cache each version.
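
(To watch the growth, I have been checking the object store with
standard commands, e.g.

git count-objects -vH
du -sh .git/objects

after running git add.)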

Is there any way, such as editing the git attributes or changing how
files are staged, to store only indexes or references to the files in
the repository rather than copying them into the .git folder, while
still being able to push all the data to GitHub?
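
(As a sketch of the attributes idea, what I had in mind is something
like the following in .gitattributes; the pattern is just an example
for my files, and the binary and delta attributes are from
gitattributes(5):

*.dat binary -delta

As far as I can tell this only changes how git packs the binaries and
does not stop git from copying the blobs into .git/objects, so I
suspect attributes alone are not enough.)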


