Hello,
is there a safe way to garbage-collect old commits from a git repository?
Let's say I want to always keep only the last 100 commits and throw
everything older away, i.e. achieve a similar goal as git clone
--depth=100, but on the server side. I had partial success with doing a
shallow clone and then converting it to a bare repo while removing the
shallow flag from .git/config. But I didn't like that solution and
wasn't really sure what its consequences might be in terms of data
integrity and forward compatibility with newer git versions.
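For reference, the workaround I tried looks roughly like this (a minimal
sketch against a throwaway repo; the paths and depth value are arbitrary):

```shell
set -e
tmp=$(mktemp -d)

# Build a throwaway origin repo with 5 commits.
git init -q "$tmp/origin"
cd "$tmp/origin"
git config user.email you@example.com
git config user.name you
for i in 1 2 3 4 5; do echo "$i" > f; git add f; git commit -qm "c$i"; done

# Shallow-clone only the last 2 commits; the file:// URL matters,
# because --depth is silently ignored for plain local paths.
git clone -q --depth=2 "file://$tmp/origin" "$tmp/shallow"
cd "$tmp/shallow"
count=$(git rev-list --count HEAD)   # only 2 commits are reachable
echo "$count"

# The truncated history is recorded in .git/shallow. Deleting that
# marker "unshallows" the repo on paper, but the oldest remaining
# commit still names a parent object that was never fetched, which
# is exactly the integrity question I am unsure about.
rm -f .git/shallow
```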
To tell you more about my USE CASE:
I want to create free open-source software similar to Dropbox, but
based on git. My idea is the following:
1.) Automatically pull/commit/push changed files to/from several laptops
to a single git server (and forcefully resolve all conflicts; this will
work unless you plan to use it for software development).
2.) On the central server, maintain tags indicating the latest commits
synchronized to individual laptops.
3.) On the server, delete old commits that are no longer needed by any
laptop to sync its worktree. Once synced, delete these commits on the
laptops as well. (Optionally leave e.g. 1 month or 1 GB of old commits
in case you need to roll back; possibly keep the history only on the
server while deleting it from the clients.)
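Step 2, and the "no longer needed by any laptop" test in step 3, could be
bookkept with plain refs instead of tags; a sketch (the refs/sync/laptop-*
names are made up for illustration):

```shell
set -e
tmp=$(mktemp -d)
git init -q "$tmp/server"
cd "$tmp/server"
git config user.email you@example.com
git config user.name you
for i in 1 2 3 4 5; do echo "$i" > f; git add f; git commit -qm "c$i"; done

# One ref per laptop, pointing at the last commit it has synchronized.
git update-ref refs/sync/laptop-a HEAD~2   # laptop A is 2 commits behind
git update-ref refs/sync/laptop-b HEAD     # laptop B is fully synced

# The oldest commit any laptop still needs is the common ancestor of
# all sync refs; everything older is a candidate for deletion.
cutoff=$(git merge-base --octopus \
    $(git for-each-ref --format='%(objectname)' refs/sync/))
git log -1 --format=%s "$cutoff"           # prints "c3"
```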
This way the computers can stay in sync forever without running out of
disk space, because old commits are removed.
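One way the server-side deletion itself might work (hedged: this relies on
the old grafts mechanism plus git filter-branch, and I have only tried it
on toy repos) is to declare the cutoff commit a new root, rewrite history
so the cut becomes permanent, and then garbage-collect everything older:

```shell
set -e
tmp=$(mktemp -d)
git init -q "$tmp/server"
cd "$tmp/server"
git config user.email you@example.com
git config user.name you
for i in 1 2 3 4 5; do echo "$i" > f; git add f; git commit -qm "c$i"; done

old_root=$(git rev-list --max-parents=0 HEAD)   # c1, the original root
cutoff=$(git rev-parse HEAD~2)                  # keep only the last 3 commits

# Declare the cutoff commit parentless via the (deprecated) grafts file,
# then bake the graft into real commit objects with filter-branch.
echo "$cutoff" > .git/info/grafts
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch -f -- --all >/dev/null
rm -f .git/info/grafts

# Drop filter-branch's backup refs and all reflogs that still pin the
# old commits, then let gc actually delete the unreachable objects.
rm -rf .git/refs/original
git reflog expire --expire=now --all
git gc --prune=now --quiet
git rev-list --count HEAD        # history is now 3 commits deep
```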
E.g. if I accidentally add some very big file to the synced folder and
then delete it, it will eventually get deleted everywhere once everybody
is in sync again.
I am aware that this is not something git was designed for, but to me
it seems like it should be more than doable. Do you think any of you
could give me some hints on how to approach this problem, please?
These are some projects that inspired me to explore this route:
https://github.com/presslabs/gitfs
https://www.syncany.org/
https://www.cis.upenn.edu/~bcpierce/unison/
https://etckeeper.branchable.com/
--
Best regards
Tomáš Mudruňka - SPOJE.NET s.r.o.