I just set something like this up at work using pip, Fabric and git. The flow is basically like this, and borrows heavily from this script:
- In our source tree, we maintain a requirements.txt file. We'll maintain this manually.
- When we do a new release, the Fabric script creates an archive based on whatever tree-ish we pass it.
- Fabric will find the SHA for what we're deploying with `git log -1 --format=format:%h TREEISH`. That gives us `SHA_OF_THE_RELEASE`.
- Fabric will get the last SHA for our requirements file with `git log -1 --format=format:%h SHA_OF_THE_RELEASE -- requirements.txt`. This spits out the short version of the hash, like `1d02afc`, which is the SHA of that file for this particular release.
- The Fabric script will then look into a directory where our virtualenvs are stored on the remote host(s).
- If there is no directory named `1d02afc`, a new virtualenv is created and set up with `pip install -E /path/to/venv/1d02afc -r /path/to/requirements.txt`.
- If there is an existing `/path/to/venv/1d02afc`, nothing is done.
The little magic part of this is passing whatever tree-ish you want to git and having it do the packaging (from Fabric). By using `git archive my-branch`, `git archive 1d02afc`, or whatever else, I'm guaranteed to get the right packages installed on my remote machines.
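Putting the git side together, the commands the script composes look roughly like this. Again a sketch under assumptions: the helper names are mine, and a real Fabric task would pass these strings to `local()`/`run()` and capture the output:

```python
def release_sha_cmd(treeish):
    """Command that prints the short SHA for the release tree-ish."""
    return "git log -1 --format=format:%h " + treeish

def requirements_sha_cmd(release_sha, path="requirements.txt"):
    """Command that prints the short SHA of the last commit touching
    requirements.txt as of the release commit."""
    return "git log -1 --format=format:%h {0} -- {1}".format(release_sha, path)

def archive_cmd(treeish, output):
    """Command that packages any tree-ish (branch, tag, SHA) into a tarball."""
    return "git archive -o {0} {1}".format(output, treeish)
```

Because `git archive` and `git log` both accept any tree-ish, branches, tags, and raw SHAs all flow through the same pipeline unchanged.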
I went this route since I really didn't want extra virtualenvs floating around if the packages hadn't changed between releases. I also don't like the idea of keeping the actual packages I depend on in my own source tree.