Backing Up Bitbucket Code to a Synology

Published on Monday, September 28, 2015


In my last post, I talked about getting my Synology to do a nightly backup of all my GitHub repos. Since all of them are public, it's a pretty simple process to clone them all locally. This time, I want to talk about Bitbucket, which is where I keep all my private repos. Because they're private, and because some of them are owned by teams, the process is a bit more involved.

I started with an excellent Bash script from the BibSonomy team which covers most of the bases; it can retrieve private repos along with all of their issues and wiki entries, and it works with both git and Mercurial. My Synology runs BusyBox, which means the default shell is the Almquist shell (ash), not Bash. Rather than install Bash, I rewrote the script a bit so it would work in ash - mostly a matter of changing how the functions were defined.
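Concretely, that meant dropping Bash's "function" keyword, which ash doesn't support. A minimal sketch of the change (the clone_repo function here is made up for illustration, not taken from the actual script):

```shell
# Bash-only form -- ash rejects the "function" keyword:
#   function clone_repo {
#       git clone "$1"
#   }

# POSIX form -- works in both Bash and ash:
clone_repo() {
    echo "would clone: $1"
}

clone_repo "git@bitbucket.org:hartez/example.git"
```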

The other challenge was that the original script assumed that the box it was running on already had the correct SSL CA certificates installed for curl. That wasn't the case on my Synology, so I grabbed cacert.pem from the curl site and dropped it into the folder with the script, then modified the script to call curl with the --cacert cacert.pem option. It's quick and dirty, but it got the process up and running.
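One way to keep that tidy is a small wrapper so every call picks up the local bundle in one place. This is just a sketch of the idea - the bb_curl name and the API URL in the comment are illustrative, not from the original script:

```shell
# Hypothetical wrapper: route every curl call through the CA bundle that
# sits next to the script, instead of the (missing) system cert store.
CACERT="$(pwd)/cacert.pem"

bb_curl() {
    curl --cacert "$CACERT" "$@"
}

# e.g. bb_curl -u "$USER_NAME:$API_KEY" "https://bitbucket.org/api/1.0/user/repositories"
```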

The script has four dependencies: curl, jq, git, and hg (Mercurial). curl and jq were already installed on my Synology. Setting up git and Mercurial is easy - just use the packages provided by SynoCommunity. The Mercurial package added hg to my path; for git, you'll need to add /volume1/@appstore/git/bin to your path or set it before you call the script (export PATH="/volume1/@appstore/git/bin:$PATH").
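Before scheduling anything, it's worth a quick sanity check that all four tools actually resolve on the box. A small sketch (the check_deps helper is hypothetical, not part of the script):

```shell
# Make the SynoCommunity git binaries visible for this session.
export PATH="/volume1/@appstore/git/bin:$PATH"

# Report whether each named command resolves on the current PATH.
check_deps() {
    for dep in "$@"; do
        if command -v "$dep" >/dev/null 2>&1; then
            echo "found: $dep"
        else
            echo "MISSING: $dep"
        fi
    done
}

check_deps curl jq git hg
```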

The full version of the modified script can be found here.

To retrieve your personal repos, set USER_NAME to your Bitbucket user name and API_KEY to your password before calling the script:

export USER_NAME=hartez
export API_KEY=[password]

For teams, you'll need to generate an API key by going to "Manage Team" and selecting API key under "Access Management". Then you can retrieve everything for your team:

export USER_NAME=[team name]
export API_KEY=[api key]

I do this in a script which handles all of my teams (4 at the moment), zips everything up, and cleans up older archives:


cd /volume1/homes/ez/documents/bitbucket-backup

# Add git to the path temporarily
export PATH="/volume1/@appstore/git/bin:$PATH"

# Use the current date and time to name the working dir and archive
DEST=$(date "+%Y%m%d%H%M%S")
BACKUP_FILE="$DEST.tar.xz"

# My Stuff
export USER_NAME=hartez
export API_KEY=[password]
# (run the backup script here)

# Teams (one block like this per team)
export USER_NAME=[team 1]
export API_KEY=[api key]
# (run the backup script here)


# Archive everything
tar -Jcf "$BACKUP_FILE" "$DEST"

# Clean up the working directory
rm -rf "$DEST"

# Clean up backups over 30 days old
find /volume1/homes/ez/documents/bitbucket-backup/*.xz -type f -mtime +30 -delete

That's all there is to it.

It's actually quite a relief to know that I've got all this code backed up locally and off-site. I don't really expect anything to go tragically wrong with either Bitbucket or GitHub, but until I got these backups running I hadn't realized how much psychic weight there was in that tiny, nagging worry that just maybe all my work would get lost. It's definitely worth a little time to set this sort of thing up, if only for the peace of mind. Hopefully these articles will make it a bit easier for someone else.

Update: I was checking up on my backup the other day and noticed that my personal projects (the ones which weren't under a group) weren't backing up anymore. It turns out that the above solution isn't compatible with personal accounts if you've got two-factor authentication enabled. Since there's currently no option to set up an API key for personal accounts, getting this working with 2FA in place would require jumping through a bunch of OAuth hoops. Hopefully application-specific passwords will be implemented soon and I can move to using that feature; if not, eventually I'm going to be spending some painful hours dealing with OAuth. Either way, I'll put up another post detailing the process.