Backing up WordPress Two Ways

Posted by: TomS on June 27, 2010 @ 9:03 pm

WordPress 3.0 is out and seems to be picking up quite a few downloads.  I’m about ready to take the plunge, but up till now, I’ve neglected the backup side of my site, so now seems like as good a time as any to get an automated backup tool running.  It’ll give me a little peace of mind when I perform the upgrade to WordPress 3.0, and it will give me a frequent snapshot of the site should I ever need to restore it for other reasons (moving hosting providers, recovering from a site attack, etc.).

I’ve seen quite a few ways that one can back up their WordPress site, and I’ve decided to implement two of them.  The first will be a purely command-line method using Linux/Unix shell scripts that will copy backups of the site down to my home machine from my hosting provider.  For my second approach, I’ll use one of the WordPress backup plugins to store my backups with an internet-based storage provider.

Backups the Command Line Way


For my command line approach to backing up, I’m going to follow the basic steps outlined below.

  1. Backup the WordPress Blog on my hosting provider (Daily)
  2. Copy the recent backups from my hosting provider onto my home computer, removing the backup files from the hosting provider afterwards (Weekly)
  3. Clean up my home computer to retain only two weeks of backups.

Backup Script

Backing up a WordPress site consists of two parts.  First, you must back up your database.  The database holds most of the variable data that you’ve added to your site.  In addition to the data in the database, WordPress stores some information directly on the file system (uploaded images, custom themes, etc.).  The backup script I will create should perform the following:

  • Create a temp directory.
  • Backup WordPress Database to temp directory.
  • Backup WordPress Web Directory to temp directory.
  • Archive and Compress all files into a uniquely named archive that includes a date/timestamp.
  • Clean up temp directory.

The script I made ended up looking like this:

#!/bin/sh

#configuration - placeholder values, fill these in for your own site
MYSQL_USERNAME=wp_user
MYSQL_PASSWORD=wp_password
MYSQL_DATABASE=wordpress
SITE_DIR=/var/www/wordpress          #WordPress web directory
BACKUP_DIR=/home/tom/backups/        #where finished archives go (note trailing slash)

BACKUP_NAME=hackrunner-backup-$(date +%Y%m%d-%H%M)
TMP_DIR=/tmp/$BACKUP_NAME
DB_BACKUP=db.sql
CDIR=$(pwd)

#create and switch to temp directory
echo "Creating temp directory $TMP_DIR"
mkdir --parents "$TMP_DIR" || { echo "ERROR: Could not create temp directory"; exit 1; }
cd "$TMP_DIR" || { echo "ERROR: Could not switch to temp directory"; exit 1; }

#backup mysql
echo "Backing up database to $DB_BACKUP"
mysqldump -u "$MYSQL_USERNAME" --password="$MYSQL_PASSWORD" "$MYSQL_DATABASE" > "$DB_BACKUP" || { echo "ERROR: Could not back up mysql db"; exit 1; }

#backup website
echo "Backing up website to site"
mkdir site || { echo "ERROR: Could not create temp site directory"; exit 1; }
cp -R "$SITE_DIR" site || { echo "ERROR: Could not back up web site"; exit 1; }

#tar and compress backup
echo "Creating backup archive $BACKUP_NAME.tar.gz"
tar czf "$BACKUP_DIR$BACKUP_NAME.tar.gz" * || { echo "ERROR: Could not create backup archive"; exit 1; }

#switch back to the initial working directory before deleting the temp directory
cd "$CDIR"

#get rid of temp files
echo "Cleaning up temp directory"
rm -rf "$TMP_DIR" || { echo "ERROR: Could not clean up temp directory"; }

The script should be pretty self-explanatory, but I’ll point out a couple of things.

  • My DB is MySQL, so I use the mysqldump command to get a backup of the DB. There are several different methods for backing up a DB, and they differ from product to product. Check your DB documentation as a reference, but mysqldump works pretty well for MySQL.
  • My site is relatively small, so I chose to do a complete full backup each time this script runs.  There are options for incremental backups, which would save some space by only storing the diffs between backups, but for my purposes, it’s just not worth it.
  • This script simply creates backups each time it runs.  In the next section, I’ll go about retrieving the files on my home computer and cleaning up the directory.
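For completeness, restoring is just the backup in reverse: unpack an archive, reload the database dump, and copy the site files back into the web root.  Here’s a runnable sketch of that path; since a real archive and a live MySQL server aren’t assumed, it fabricates a small stand-in archive (matching the db.sql and site layout above), uses scratch directories in place of the real web root, and only echoes the mysql reload command:

```shell
#!/bin/sh
#sketch of restoring from one of the backup archives
set -e
WORK=$(mktemp -d); cd "$WORK"

#--- stand-in for a real archive produced by the backup script ---
mkdir site
echo "-- mysqldump output --" > db.sql
echo "hello" > site/index.html
tar czf hackrunner-backup-demo.tar.gz db.sql site

#--- the actual restore steps ---
RESTORE_DIR=$(mktemp -d)
tar xzf hackrunner-backup-demo.tar.gz -C "$RESTORE_DIR"

#reload the database (shown only; this needs a live MySQL server)
echo 'mysql -u $MYSQL_USERNAME --password=$MYSQL_PASSWORD $MYSQL_DATABASE < '"$RESTORE_DIR/db.sql"

#copy the site files back into the web root (a scratch dir here)
SITE_DIR=$(mktemp -d)
cp -R "$RESTORE_DIR/site/." "$SITE_DIR"
```

It’s worth doing a dry run like this once before you actually need it, so the restore procedure isn’t a surprise during an emergency.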

The last piece of this is to make sure that my backup script runs daily.  Specifically, I want the script to run each night at 11:00 PM.  To do this, I just add an entry to my cron schedule that looks like the one below:

0 23 * * * /usr/local/bin/
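As a quick refresher on the crontab format, the five leading fields are minute, hour, day of month, month, and day of week, followed by the command to run.  Annotated, an entry that fires at 11:00 PM every night looks like this (backup-wordpress.sh is a hypothetical name standing in for the script above):

```
#min  hour  day-of-month  month  day-of-week  command
0     23    *             *      *            /usr/local/bin/backup-wordpress.sh
```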

Retrieving Backups

Now that I had backups generating for my site on a nightly basis, I wanted to copy them down to my home machine.  I also didn’t want backups to accumulate on my hosting provider’s server, so I wanted to clean up the backups on the hosting provider’s machines.  In order to do this, I created another simple script on my home machine that does the following:

  • Copy latest backups down from hosting provider machine
  • Remove all backups from hosting provider machine
  • Remove all backups from local server older than 2 weeks.

My home server is already set up with RAID mirroring, and once a week it does a full backup to an external drive, which I periodically swap with a second drive in a fire-proof safe.  As long as I have the backups on my home machine, I feel pretty comfortable that site backups will be reliably available should I ever need them.

The script I created for copying files down and cleaning up the server is below:


#!/bin/sh

#configuration - placeholder values, fill these in for your own setup
USER=tom
HOST=example-host.com
HOSTLOC=/home/tom/backups/           #backup directory on the hosting provider (note trailing slash)
PATTERN='hackrunner-backup-*'
DEST=/backups/hackrunner/            #local backup directory

#copy files down to local server (quote the remote arg so the glob expands remotely)
scp "$USER@$HOST:$HOSTLOC$PATTERN" "$DEST" || { echo "ERROR: Could not copy latest backups from remote server"; exit 1; }

#remove files from remote server
ssh "$USER@$HOST" "rm $HOSTLOC$PATTERN" || { echo "ERROR: Could not remove backups from remote server"; exit 1; }

#cleanup local server
find "$DEST" -name "$PATTERN" -ctime +14 -exec rm {} \; || { echo "ERROR: Could not clean up backups from local server"; exit 1; }

A few quick notes…

  • First I use scp to copy all the recent backups down from the server.  I use SSH public/private keys to login, so no passwords are required.
  • There is no srm command, so I use ssh to send a remote command to the server to delete all backups after the download has completed.
  • I then use find to find all backups on my local server that have a ctime greater than 14 days (two weeks) and delete those files.
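That find-based retention step is easy to sanity-check locally before trusting it with real backups.  Here’s a small runnable demo of the same cleanup; the directory is a scratch one, the file names are made up, and it uses -mtime rather than -ctime only because touch can backdate a file’s mtime but not its ctime:

```shell
#!/bin/sh
#demo of the two-week cleanup using find
set -e
DEST=$(mktemp -d)
PATTERN='hackrunner-backup-*'

#one stale backup (20 days old) and one fresh backup
touch -d "20 days ago" "$DEST/hackrunner-backup-old.tar.gz"
touch "$DEST/hackrunner-backup-new.tar.gz"

#delete matching files older than 14 days; only the stale one should go
find "$DEST" -name "$PATTERN" -mtime +14 -exec rm {} \;
```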

I then configured my local server to run the script every Monday at 12:00 AM via cron.  On my local machine, I also pipe the output of the script to a log file and, using tee, also direct it to be mailed to me.  This cron entry looks like the following:

0 0 * * 1 /usr/local/bin/ 2>&1 | tee -a /var/log/hackrunner-backup.log | mail -t -s " fetch backups"

While a little bit involved, that setup gets full backups of my site down to my home server, where I can ensure their safety.  I know that as long as I keep my home backups secure, I’ll always have a snapshot available for a restore.

Backups the Plugin Way

Cloud-based storage has become a lot more prevalent these days, and it’s desirable for two big reasons.  One, it’s usually fairly easy; the storage provider does most of the work for you.  And two, providers usually offer a lot more redundancy than any individual can provide on their own hardware.

IDrive is a WordPress plugin that automates the backup process for you and stores your backups on their cloud storage.  They offer 2GB of storage for free, and additional storage at pretty affordable prices.  The plugin itself has a lot of nice features, including secure, incremental backups, email notifications, and automated restores.

I first saw the IDrive plugin and decided to give it a go.  Their site has a step-by-step walkthrough with screenshots, but installing and configuring the plugin really is as easy as it seems.

  • Install Plugin
  • Activate Plugin
  • Go to Settings for Plugin
  • Sign up for IDrive Account
  • Configure settings (schedule, transfer method, notifications)
  • Start your first backup

Overall, the plugin seems really nice, and I’ll keep it around as an alternative backup for my site.  If I had one complaint, it’s that it takes a while to actually get a full backup the first time.  I assume further incremental backups will be faster, and I’m sure part of it is due to my hosting provider, but my roll-your-own backup scheme definitely completes much faster than the IDrive service.

Final Thoughts

When it comes down to it, WordPress is really just a bunch of files on disk and a database.  It’s pretty simple to find or make a backup option that works best for you.  I usually like to roll my own, so I went through the process of creating a few shell scripts to manually back up the files.  It’s nice to know that I have complete control of that process.

The IDrive plugin, on the other hand, took all of one minute to set up and backs up my site just as well, if not better.  Personally, I will keep both options going, so that if IDrive ever falls on hard times or my own home server gets fried, I’ll still have an alternative source of backups.  For most people, though, I think something like IDrive is really the way to go.  It’s very simple to install and configure, and the free cloud storage provided by IDrive is probably plenty for most people.

3 responses to “Backing up WordPress Two Ways”

  1. […] backup system(s) for my WordPress blog have been running for well over a week now, and the upgrade notice in WordPress finally got the […]

  2. […] me that I needed to set up some kind of automated backup routine for this beta blog. So I took this backup script on Hackrunner and tweaked it to work with Binero (who I use to host this blog). This script will […]

  3. The scripts provide you a couple of packages db/files which are easy to restore if something goes wrong with the site.

    In the plugin option, suppose your site gets wiped out, it’s not clear how you can restore the whole… I’d stick with the scripts option.

    Good article.
