How to back up multiple servers

Just writing down the process so I don’t forget. If anyone else gets some use out of it, that’s cool too…

Here’s how I just set up my Mac to automatically back up 2 servers, as well as my home directory, to an external FireWire drive. The process uses tools that ship with Mac OS X, so it won’t cost a dime. And it’s automatable, so I won’t forget to run it.

Set up SSH to allow automated connection

Following these instructions, boiled down to the bare essentials below. Run these commands from the “client” machine (in my case, my desktop box in my cube) – the machine where all the data will wind up.

% ssh-keygen -t dsa
% scp ~/.ssh/id_dsa.pub [USERNAME]@[SERVER]:~/.ssh/authorized_keys2

Repeat the scp step for each server. (When ssh-keygen asks for a passphrase, leave it empty – otherwise the automated script will stall waiting for one.)

This will allow your account on that client machine to SSH, rsync and SCP without being prompted for a password every time, making it possible to automate the process.
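One wrinkle: sshd is picky about permissions on ~/.ssh, and the scp above clobbers any authorized_keys2 that already exists on the server. Here’s a sketch of a gentler install step to run on the server instead – the install_key name and the /tmp path are just names I made up for illustration:

```shell
#!/bin/sh
# install_key: append a public key to an authorized_keys2 file with the
# permissions sshd insists on, instead of overwriting the whole file.
# Run this on the server after copying id_dsa.pub over (e.g. to /tmp).
install_key() {
    pubkey="$1"     # path to the copied .pub file
    sshdir="$2"     # normally $HOME/.ssh
    mkdir -p "$sshdir"
    chmod 700 "$sshdir"                       # sshd refuses looser perms
    cat "$pubkey" >> "$sshdir/authorized_keys2"
    chmod 600 "$sshdir/authorized_keys2"
}

# Example (hypothetical paths):
# install_key /tmp/id_dsa.pub "$HOME/.ssh"
```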

Create a shell script to automate backups

The script uses rsync to copy directories from the server(s) to a local volume (preferably an external drive). I created the file at ~/bin/backup_servers.sh, but it can live anywhere. Replace [SERVER] with the IP/domain of the server to be backed up. Replace [DIRECTORY] with the directory on the server to be backed up (could be something like /Library/WebServer). Replace [DEST_DIRECTORY] with the directory that will contain the backup (could be something like /Volumes/BackupDrive/[SERVER]/[DIRECTORY]).

#!/bin/sh

echo "Backing up [SERVER] [DIRECTORY] directory"
rsync -rtlzv --ignore-errors -e ssh [USERNAME]@[SERVER]:[DIRECTORY] [DEST_DIRECTORY] > [DEST_DIRECTORY]/backup.log

Tweak directories as needed, but this should create a backup of the server directory on the external hard drive (without deleting local files that have since been removed on the server – add rsync’s --delete flag if you want an exact mirror instead). If you want to back up more than one server, or more than one directory, just keep repeating the echo/rsync lines as needed (changing the values for each server/directory, of course). I have 5 entries in my script, copying a couple of directories from 2 servers, and also backing up my home directory on my desktop machine.

Automate it via cron

I have my cron tasks defined in a file at ~/mycrontab, so I just added this to the end of that file:

30	4	*	*	*		~/bin/backup_servers.sh > /dev/null

So now, every morning at 4:30 AM, the servers and directories that I’ve specified in ~/bin/backup_servers.sh get backed up to my external FireWire drive (assuming it’s on and mounted). I’m sure I could be doing fancier things to make this process even smoother, but it seems to be working fine for now.
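For reference, here’s how the fields in that crontab line break down (standard cron syntax, nothing Mac-specific), plus the reload step that’s easy to forget:

```shell
# ~/mycrontab - field order is: minute hour day-of-month month day-of-week command
#
#   30  4  *  *  *  ~/bin/backup_servers.sh > /dev/null
#   i.e. minute 30 of hour 4, every day of every month: 4:30 AM daily.
#
# Editing ~/mycrontab by itself does nothing; load it into cron afterwards:
#   crontab ~/mycrontab
#   crontab -l    # verify the active entries
```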

At the moment, I have stuff backed up from the 2 servers, and the important stuff from my PowerBook (iPhoto library, iTunes library, Documents, etc.) gets copied to the desktop, where it gets backed up automatically to the external drive.

4 thoughts on “How to back up multiple servers”

  1. I was just thinking that it’s a pretty big departure from the rest of the *NIX world, where you just edit a standard text file. A .plist isn’t technically much more than that, but the only place it exists is on MacOSX. There’s likely a very good technical reason for making the change though.

  2. The idea is that launchd will replace both cron and watchdog for launching services. If you’re not into editing plists by hand, I normally copy one of the existing ones and edit it with Property List Editor (part of the Dev Tools). There are some fancier plist editors out there, and also some launchd editors that roll the whole file for you.

  3. Whoah! Thanks for the heads-up, Bill. I’ve been merrily cron-ing stuff for years now. So, I’ll have to edit .plist files to do this in the future?

  4. Since cron appears to be on its way out in favor of launchd, you might want to try this

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <key>Label</key>
        <string>net.darcynorman.backup.plist</string>
        <key>LowPriorityIO</key>
        <true/>
        <key>Nice</key>
        <integer>1</integer>
        <key>ProgramArguments</key>
        <array>
            <string>/User/dnorman/bin/backup_server.sh</string>
            <string>daily</string>
        </array>
        <key>StartCalendarInterval</key>
        <dict>
            <key>Hour</key>
            <integer>4</integer>
            <key>Minute</key>
            <integer>30</integer>
        </dict>
    </dict>
    </plist>

    You will then have to activate it in launchd with

    launchctl load ~/Library/LaunchAgents/net.darcynorman.backup.plist

    Reference: http://www.afp548.com/article.php?story=20050620071558293

    Cron wasn’t running by default on early 10.4 client installs, so launchd was the option. It’s also supposed to catch jobs that failed to run because the computer wasn’t active at the scheduled time.
