I manage a few servers for my employer as well as for myself, and backups are always an issue on all of them. Do we purchase a commercial product? Do we try an open source solution? Do we just not do a backup? Yep, I have heard that last one before. The issues always seem to boil down to time vs. money. Usually we ain't got time and we can't spend money. So, to back up my personal server with minimal effort and maximum usefulness, I wrote this script. It's very customizable, but it does the job.
To explain it simply, it dumps my two database systems to SQL files, copies every file in the backup list (directories and individual files), then creates an archive for local storage and pushes a copy of everything offsite. You could tweak this a bit depending on your storage capacity. Currently, my local storage is 500GB and my offsite storage is a cloud service with 10GB. Since I can't store more than 10GB offsite, I keep only the latest copy of my files on the cloud system, with three days' worth of archives stored locally. It works well for me and gives me a disaster recovery option: I can simply mount my offsite cloud storage on any Linux server and start my services back up.
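As a rough sketch of that recovery path (this assumes the cloud provider exposes the files over SSH and that sshfs is installed; USERNAME and CLOUDSTORAGE.NET are the same placeholders used in the script below):

# Mount the offsite copy read-only on the replacement server
mkdir -p /mnt/offsite
sshfs -o ro USERNAME@CLOUDSTORAGE.NET:/home/USERNAME/backup /mnt/offsite

# Reload the database dumps, then sync files back into place, e.g.:
mysql -u root -p < /mnt/offsite/mysql-full-dump.sql
psql -U postgres -f /mnt/offsite/pgsql-full-dump.sql postgres
rsync -rlt /mnt/offsite/etc/httpd/ /etc/httpd/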
#!/bin/bash
# Nightly backup: dump databases, mirror the listed paths into $BACKUP_DIR,
# archive locally, then push the working copy offsite.

DATE=$(date +%Y%m%d)
BACKUP_FILENAME="backup-$DATE"
BACKUP_FILES=("/opt/" "/srv/" "/var/named/" "/root/" "/etc/httpd/" "/etc/php.ini" "/etc/murmur.ini")
BACKUP_DIR="/backup"
BACKUP_ARCHIVE_DIR="/backup_archive"
BACKUP_ARCHIVE_COUNT=3
MYSQL_CMD="mysqldump -u root -p****** --all-databases 2> /dev/null > $BACKUP_DIR/mysql-full-dump.sql"
PGSQL_CMD="pg_dumpall -f $BACKUP_DIR/pgsql-full-dump.sql"
CLOUD_CMD="rsync -rlth --stats --delete -e '/usr/bin/sshpass -p****** /usr/bin/ssh' $BACKUP_DIR/ USERNAME@CLOUDSTORAGE.NET:/home/USERNAME/backup/ &> /dev/null"

### DO NOT EDIT BELOW THIS LINE ###

echo "Dumping MySQL Environment..."
eval $MYSQL_CMD
echo "Dumping PGSQL Environment..."
eval $PGSQL_CMD

# Mirror each listed path into the matching path under $BACKUP_DIR.
for i in "${BACKUP_FILES[@]}"; do
    if [ -d "$i" ]; then
        mkdir -p "$BACKUP_DIR$i"
    else
        # Plain files only need their parent directory created.
        mkdir -p "$BACKUP_DIR$(dirname "$i")"
    fi
    echo "Syncing $i..."
    rsync -rltmq --delete "$i" "$BACKUP_DIR$i"
done

echo "Creating local archive..."
tar -cjf "$BACKUP_ARCHIVE_DIR/$BACKUP_FILENAME.tar.bz2" "$BACKUP_DIR/" &> /dev/null

# Keep only the newest $BACKUP_ARCHIVE_COUNT archives.
echo "Maintaining local archives..."
COUNT=0
for FILE in $(ls -t "$BACKUP_ARCHIVE_DIR"); do
    COUNT=$((COUNT+1))
    if [ "$COUNT" -gt "$BACKUP_ARCHIVE_COUNT" ]; then
        rm -f "$BACKUP_ARCHIVE_DIR/$FILE"
    fi
done

echo "Syncing backup to cloud..."
eval $CLOUD_CMD
echo "Backup completed"
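To make it hands-off, schedule it with cron. A minimal example, assuming the script is saved as /root/bin/backup.sh (a hypothetical path; use wherever you keep it):

# /etc/crontab entry: run the backup nightly at 2:30 AM as root
30 2 * * * root /root/bin/backup.sh >> /var/log/backup.log 2>&1

The log redirect gives you a simple record of each run, and a nightly schedule means the three local archives cover the last three days.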