Automate backup of your web site

Below is a method of backing up your web site's document root and MySQL database. It assumes these are the only things on your server you care about, but you could easily extend it to include any set of directories you wish to back up.

Put this in cron (it runs the script daily at 1:01 AM):

# backup my site:
1 1 * * * /scripts/backup.sh > /dev/null 2>&1

The contents of /scripts/backup.sh are:

#!/bin/sh

mkdir -p /backup/
DOMAIN=domain.tld
DOCUMENT_ROOT=/path/to/document/root
DATABASE_NAME=dbname
DATABASE_USER=user
DATABASE_PASS=pass
DATE=$(date +"%Y-%m-%d_%H_%M_%S")

# Dump the database and compress it on the fly:
mysqldump -u "$DATABASE_USER" -p"$DATABASE_PASS" --opt "$DATABASE_NAME" | gzip > "/backup/$DATABASE_NAME.$DATE.sql.gz"

# Archive the document root; bail out if the cd fails so we don't tar up the wrong directory:
cd "$DOCUMENT_ROOT" || exit 1
tar -czf "/backup/$DOMAIN.$DATE.tar.gz" .

# Clean up backups older than 30 days (-exec ... + is safe with spaces in file names, unlike xargs):
find /backup -type f -mtime +30 -exec rm -f {} +

# Send files to S3:
s3syncdir /backup/

This requires some explanation. The mysqldump statement uses "--opt", which is a great option when dumping your MySQL database. It enables extended inserts (multiple rows in one INSERT statement) and, more importantly, puts statements at the top of the dump that disable the keys before the inserts, then rebuild the indexes all at once after every row is loaded. This gives you a tremendously faster import.
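To make that concrete, here is roughly what the relevant portion of such a dump looks like (abridged; the table name and rows are made up for illustration):

/*!40000 ALTER TABLE `posts` DISABLE KEYS */;
INSERT INTO `posts` VALUES (1,'first post'),(2,'second post'),(3,'third post');
/*!40000 ALTER TABLE `posts` ENABLE KEYS */;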

The gzip reduces the file size dramatically.
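Restoring is just the reverse. A minimal sketch, assuming the file names produced by the script above (DATE stands for the actual timestamp in the file name):

gunzip -c /backup/dbname.DATE.sql.gz | mysql -u user -ppass dbname
tar -xzf /backup/domain.tld.DATE.tar.gz -C /path/to/document/root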

The cleanup removes files older than 30 days. Change the value to whatever suits you, whether you only need a few days of backups or many more.
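If you expect to tweak the retention period, it may be cleaner to pull it into a variable at the top of the script. A small sketch; DAYS_TO_KEEP is my name for it, not part of the original script:

DAYS_TO_KEEP=30
find /backup -type f -mtime +"$DAYS_TO_KEEP" -exec rm -f {} +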

The last command requires the most explanation… it syncs the contents of your /backup directory with your Amazon S3 account. See the article "Amazon S3 Tools, Using PHP" to learn how to construct s3syncdir. If you don't have an S3 account, then figure out some other way to get your backup files offsite.
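If S3 isn't an option, a plain rsync over SSH to any other machine you control accomplishes the same thing. A sketch, where the host and remote path are placeholders for your own offsite box:

rsync -az /backup/ user@offsite.example.com:/backup/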

Dave.
