Amazon S3 has been around for more than ten years now, and I have been happily using it for offsite backups of my servers for a long time. Backblaze’s cloud backup service has been around for about the same length of time, and I have been happily using it for offsite backups of my laptop, also for a long time.
In September 2015, Backblaze launched a new product, B2 Cloud Storage, and while S3 standard pricing is pretty cheap (Glacier is even cheaper), B2 claims to be “the lowest cost high performance cloud storage in the world”. The first 10 GB of storage is free, as is the first 1 GB of daily downloads. For my small server backup requirements this sounds perfect.
My backup tool of choice is duplicity, a command line backup tool that supports encrypted archives and a whole load of different storage services, including S3 and B2. It was a simple matter to create a new bucket on B2 and update my backup script to send the data to Backblaze instead of Amazon.
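If you have not used B2 before, the bucket can be created from the Backblaze web interface or with the b2 command-line tool. A minimal sketch, assuming the b2 CLI is installed; the account ID, application key and bucket name below are placeholders:

# Authorise against your B2 account, then create a private bucket
# (bucket names are globally unique, so pick your own)
b2 authorize_account account_id application_key
b2 create_bucket my-bucket-name allPrivate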
Here is a simple backup script that uses duplicity to keep one month’s worth of backups. In this example we dump a few MySQL databases, but it could easily be expanded to back up any other data you wish.
#!/bin/bash
########################################################################
# A b2 backup script
# Uses duplicity (http://duplicity.nongnu.org/)
#
# Run this daily and keep 1 month's worth of backups
########################################################################

# b2 variables
B2_ACCOUNT_ID=account_id
B2_APPLICATION_KEY=application_key
BUCKET=my-bucket-name

# GPG
ENCRYPT_KEY=gpg_key_id
export PASSPHRASE=key_passphrase

# Database credentials
DB_USER='root'
DB_PASS='password'

# Backup these databases
DATABASES=(my_db_1 my_db_2 my_db_3)

# Working directory
WORKING_DIR=/root/bak

########################################################################

# Make the working directory
mkdir -p "$WORKING_DIR"

#
# Dump the databases
#
for database in "${DATABASES[@]}"; do
    mysqldump -u"$DB_USER" -p"$DB_PASS" "$database" > "$WORKING_DIR/$database.sql"
done

# Send them to b2, taking a full backup weekly and incrementals in between
duplicity --full-if-older-than 7D --encrypt-key="$ENCRYPT_KEY" "$WORKING_DIR" "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET"

# Verify
duplicity verify --encrypt-key="$ENCRYPT_KEY" "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET" "$WORKING_DIR"

# Clean up anything older than 30 days
duplicity remove-older-than 30D --force --encrypt-key="$ENCRYPT_KEY" "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET"

# Remove the working directory
rm -rf "$WORKING_DIR"
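Once the script has run, you can check what has been stored with duplicity’s collection-status command, which lists the full and incremental backup chains in the bucket. A quick sketch, reusing the variables from the script above:

# List the backup sets currently stored in the bucket
duplicity collection-status --encrypt-key="$ENCRYPT_KEY" "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET"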
Run this via a cron job, something like this:
0 4 * * * /root/b2_backup.sh >>/var/log/duplicity/backup.log
Very interesting article. Could you give an example of how the process of restoring a backup would work?
Thanks
Here is how you can restore your backups. A sketch reusing the variables from the script above; the target directory is a placeholder:
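# Export the GPG passphrase first so duplicity can decrypt the archive
export PASSPHRASE=key_passphrase

# Restore the most recent backup into /root/restore
# (the target directory must not already exist)
duplicity restore --encrypt-key="$ENCRYPT_KEY" "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET" /root/restore

# Or go back in time with --time, e.g. the backup from three days ago
duplicity restore --time 3D --encrypt-key="$ENCRYPT_KEY" "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET" /root/restore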
Good one. I ran some tests and have some advice: verify always shows “0 difference found”, even if you modify or add files to the source. It displays correct info as soon as you use the --compare-data option, so you might want to use that.
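For reference, with the script above the verify step would then look something like this (--compare-data makes duplicity read back and compare file contents rather than just metadata):

# Compare actual file data against the latest backup, not just metadata
duplicity verify --compare-data --encrypt-key="$ENCRYPT_KEY" "b2://$B2_ACCOUNT_ID:$B2_APPLICATION_KEY@$BUCKET" "$WORKING_DIR"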