Back up your server to Backblaze B2 with Duplicity


Amazon S3 has been around for more than ten years now, and I have been happily using it for offsite backups of my servers for a long time. Backblaze’s cloud backup service has been around for about as long, and I have been just as happily using it for offsite backups of my laptop.

In September 2015, Backblaze launched a new product, B2 Cloud Storage, and while S3 standard pricing is pretty cheap (Glacier is even cheaper), B2 claims to be “the lowest cost high performance cloud storage in the world”. The first 10 GB of storage is free, as is the first 1 GB of daily downloads. For my small server backup needs this sounds perfect.

My backup tool of choice is duplicity, a command line backup tool that supports encrypted archives and a whole load of different storage services, including S3 and B2. It was a simple matter to create a new bucket on B2 and update my backup script to send the data to Backblaze instead of Amazon.
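Duplicity addresses a B2 bucket with a URL of the form `b2://account_id:application_key@bucket`, so switching targets is a one-line change. A minimal sketch (the credential values and the old S3 target shown for comparison are placeholders, not real values):

```shell
#!/bin/bash
# Placeholders -- substitute your own B2 credentials and bucket name.
B2_ACCOUNT_ID=account_id
B2_APPLICATION_KEY=application_key
BUCKET=my-bucket-name

# Old S3-style target, shown for comparison:
S3_TARGET="s3://s3.amazonaws.com/${BUCKET}"

# New target in the form duplicity's B2 backend expects:
B2_TARGET="b2://${B2_ACCOUNT_ID}:${B2_APPLICATION_KEY}@${BUCKET}"
echo "$B2_TARGET"
```

Everything else in the script — the dump, the encryption, the retention options — stays the same; only the destination URL changes.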

Here is a simple backup script that uses duplicity to keep one month’s worth of backups. In this example we dump a few MySQL databases but it could easily be expanded to back up any other data you wish.

#!/bin/bash

########################################################################
# A b2 backup script
# Uses duplicity (http://duplicity.nongnu.org/)
#
# Run this daily and keep 1 month's worth of backups
########################################################################

# b2 variables
B2_ACCOUNT_ID=account_id
B2_APPLICATION_KEY=application_key
BUCKET=my-bucket-name

# GPG
ENCRYPT_KEY=gpg_key_id
export PASSPHRASE=key_passphrase

# Database credentials
DB_USER='root'
DB_PASS='password'

# Backup these databases
DATABASES=(my_db_1 my_db_2 my_db_3) 

# Working directory
WORKING_DIR=/root/bak

########################################################################

# Make the working directory
mkdir -p "$WORKING_DIR"

#
# Dump the databases
#
for database in "${DATABASES[@]}"; do
  mysqldump -u"$DB_USER" -p"$DB_PASS" "$database" > "$WORKING_DIR/$database.sql"
done

# Send them to b2
duplicity --full-if-older-than 7D --encrypt-key="$ENCRYPT_KEY" "$WORKING_DIR" "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET"

# Verify
duplicity verify --encrypt-key="$ENCRYPT_KEY" "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET" "$WORKING_DIR"

# Cleanup
duplicity remove-older-than 30D --force --encrypt-key="$ENCRYPT_KEY" "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET"

# Remove the working directory
rm -rf "$WORKING_DIR"

Run this via a cron job, something like this:

0 4 * * * /root/b2_backup.sh >>/var/log/duplicity/backup.log 2>&1
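One thing to note: the script above removes the working directory unconditionally, even if a dump or upload step failed. A minimal sketch of one way to guard each step (the `run_step` helper is hypothetical, and `true`/`false` stand in for real mysqldump or duplicity invocations):

```shell
#!/bin/bash
# Hypothetical helper: run a command and log to stderr on failure,
# so a broken step can short-circuit the rest of the script.
run_step() {
  if ! "$@"; then
    echo "backup step failed: $*" >&2
    return 1
  fi
}

# "true" and "false" stand in for real backup commands.
run_step true && echo "dump ok"
run_step false || echo "upload failed; leaving working directory in place"
```

In the real script you would wrap the mysqldump and duplicity calls this way, and only run the final `rm -rf` once every step has succeeded.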

3 Comments

  1. Here is how you can restore your backups:

    export B2_ACCOUNT_ID=???
    export B2_APPLICATION_KEY=???
    export BUCKET=???
    export ENCRYPT_KEY=???
    export PASSPHRASE=???
    ulimit -n 1024
    
    duplicity --encrypt-key=$ENCRYPT_KEY list-current-files b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET
    
    duplicity --encrypt-key=$ENCRYPT_KEY b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET restore
  2. Tamás says

    Good one. I ran some tests and have some advice: verify always reports 0 differences found, even if you modify or add files in the source. It reports the correct information as soon as you use the --compare-data option, so you might want to use that.
