I have a git repo with some very large binaries in it. I no longer need them, and I don't care about being able to check out those files from earlier commits. So, to reduce the repo size, I want to delete the binaries from the history altogether.
After a web search, I concluded that my best (only?) option is to use git filter-branch:

git filter-branch --index-filter 'git rm --cached --ignore-unmatch big_1.zip big_2.zip etc.zip' HEAD
Does this seem like a good approach so far?
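As far as I can tell, filter-branch alone won't shrink the repo: it keeps backup refs under refs/original/, and the old objects stay reachable through the reflog until it expires. A sketch of the cleanup I believe is needed afterwards (hedged; run it in the rewritten repo after verifying the result looks right):

```shell
# Drop the backup refs that filter-branch leaves under refs/original/
git for-each-ref --format='%(refname)' refs/original/ |
  xargs -n 1 git update-ref -d

# Expire the reflog immediately so the old commits become unreachable
git reflog expire --expire=now --all

# Repack and prune to actually reclaim the disk space
git gc --prune=now --aggressive
```

Until these steps run, a fresh `git count-objects -v` will still show the old pack size, which is an easy way to check whether the cleanup worked.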
Assuming the answer is yes, I have another problem to contend with. The git manual has this warning:
WARNING! The rewritten history will have different object names for all the objects and will not converge with the original branch. You will not be able to easily push and distribute the rewritten branch on top of the original branch. Please do not use this command if you do not know the full implications, and avoid using it anyway, if a simple single commit would suffice to fix your problem. (See the "RECOVERING FROM UPSTREAM REBASE" section in git-rebase(1) for further information about rewriting published history.)
We have a remote repo on our server. Each developer pushes to and pulls from it. Based on the warning above (and my understanding of how git filter-branch works), I don't think I'll be able to run git filter-branch on my local copy and then push the changes.
So, I'm tentatively planning to go through the following steps:
- Tell all my developers to commit, push, and stop working for a bit.
- Log into the server and run the filter on the central repo.
- Have everyone delete their old copies and clone again from the server.
Does this sound right? Is this the best solution?
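One detail I'd want to confirm before step 2: the command earlier rewrites only HEAD, but a shared central repo presumably has other branches and tags that still reference the binaries. My understanding is that a rewrite of every ref would look something like this (hedged; --tag-name-filter cat re-points existing tags at the rewritten commits, and -- --all passes all refs to the rev-list behind filter-branch):

```shell
# Rewrite every branch and tag, not just the current branch
git filter-branch --index-filter \
  'git rm --cached --ignore-unmatch big_1.zip big_2.zip etc.zip' \
  --tag-name-filter cat -- --all
```

Signed tags would lose their signatures when rewritten this way, which may matter depending on your release process.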