Back Up Your Web Site

One of the MOST important things in running your own web site (even more important if you're running web sites for clients) is making sure that you have good backups, in case something goes wrong.

“Something going wrong” could include any of the following:

  • You tinkering around on your production site, messing it up, and needing to go back (tsk, tsk)
  • Problems with your server or hosting provider
  • Your site is hacked – rolling back to a good backup (and then properly securing the site) is one of the best ways to deal with this

No matter what the reason, if you don't have a backup, you can't go back.

My rule is to never trust anyone other than myself with my data backups. Even if my hosting provider says they run constant backups and can restore my data from up to a year ago (FYI, I've never seen a promise like that), I'm still going to make my own backups. It can take a hosting provider a long time to do a restore (remember, they have a LOT of data to sift through to get to yours), and some providers charge for restores (I can't blame them, considering what goes into a good backup system).

Personally, I like to have a fresh backup before I make any changes, just in case, even though I make my changes on my development copy first (hint, hint).

I should never have to go to my hosting provider for a restore.

So, here's the trusty script that fits the majority of my backup cases. It's designed for a site on a Linux server that uses a MySQL database and stores all of its files and folders under a single folder (the web root). You will need SSH access, or the ability to upload the script to your server and run it (possibly through your provider's control panel or a cron job). You will also need to create a folder (outside of your web root) to store the backups.

#!/bin/bash
#set -xv
#Edit these variables
backupDir="/backups/"
siteName="mysite"
siteFolder="/var/www/html"
dbHost="localhost"
dbName="mysitedb"
dbUser="mysitedbuser"
dbPass="mysitedbpassword"
daysToKeepBackups=30
#Stop editing

dateTimeStamp=$(date '+%Y%m%d_%H%M')
dbBackupFile="${dbName}_${dateTimeStamp}.sql"
dbBackupFilePath="${backupDir}${dbBackupFile}"
zipFile="${backupDir}${siteName}_${dateTimeStamp}.zip"

#Zip the web root
zip -r "$zipFile" "$siteFolder"

#Dump the database and add it to the zip; skipped if dbHost is empty
if [ -n "$dbHost" ]; then
    cd "$backupDir" || exit 1
    #Note: --password on the command line is visible in the process list;
    #on a shared server, consider a ~/.my.cnf file instead
    mysqldump --user="$dbUser" --password="$dbPass" --host="$dbHost" "$dbName" > "$dbBackupFilePath"
    zip -j "$zipFile" "$dbBackupFile"
    rm -f "$dbBackupFilePath"
fi

#Remove this site's backups older than daysToKeepBackups
find "$backupDir" -maxdepth 1 -name "${siteName}_*" -mtime +"$daysToKeepBackups" -exec rm {} \;

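As mentioned above, you can run this through cron. A crontab entry like the one below runs the backup every night at 2:00 AM (the script path is just an example; use wherever you saved it):

#Run the site backup nightly at 2:00 AM
0 2 * * * /home/myuser/bin/backup_mysite.sh
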
Some web applications keep all of the web files (html, php, etc.) in the web root and store additional data in a folder outside of it. With a simple modification to the script (the added dataFolder variable and the extra folder on the zip line below), we can handle that.

#!/bin/bash
#set -xv
#Edit these variables
backupDir="/backups/"
siteName="mysite"
siteFolder="/var/www/html"
dataFolder="/var/www/data"
dbHost="localhost"
dbName="mysitedb"
dbUser="mysitedbuser"
dbPass="mysitedbpassword"
daysToKeepBackups=30
#Stop editing

dateTimeStamp=$(date '+%Y%m%d_%H%M')
dbBackupFile="${dbName}_${dateTimeStamp}.sql"
dbBackupFilePath="${backupDir}${dbBackupFile}"
zipFile="${backupDir}${siteName}_${dateTimeStamp}.zip"

#Zip the web root and the data folder
zip -r "$zipFile" "$siteFolder" "$dataFolder"

#Dump the database and add it to the zip; skipped if dbHost is empty
if [ -n "$dbHost" ]; then
    cd "$backupDir" || exit 1
    mysqldump --user="$dbUser" --password="$dbPass" --host="$dbHost" "$dbName" > "$dbBackupFilePath"
    zip -j "$zipFile" "$dbBackupFile"
    rm -f "$dbBackupFilePath"
fi

#Remove this site's backups older than daysToKeepBackups
find "$backupDir" -maxdepth 1 -name "${siteName}_*" -mtime +"$daysToKeepBackups" -exec rm {} \;

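Of course, a backup is only useful if you can restore it. Here's a rough sketch of a restore, using the paths and credentials from the script above; the timestamped file names are placeholders for whichever backup you're restoring:

#Put the site files back (zip stores paths relative to /, so extract there)
unzip -o /backups/mysite_20240101_0200.zip -d /

#Pull the SQL dump out of the zip (it sits at the top level) and reload it
unzip -o /backups/mysite_20240101_0200.zip mysitedb_20240101_0200.sql -d /tmp
mysql --user=mysitedbuser --password=mysitedbpassword --host=localhost mysitedb < /tmp/mysitedb_20240101_0200.sql
rm -f /tmp/mysitedb_20240101_0200.sql
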
To really protect yourself, I recommend downloading your backups to another machine on a regular basis; a backup that only lives on the server it's protecting disappears along with the server.
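
For example, from your home or office machine (the user and host below are placeholders for your own), something like this pulls everything in the server's backup folder down locally:

#Copy the server's backup folder to a local machine
rsync -avz user@myserver.example.com:/backups/ ~/site-backups/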
