A while ago I was looking for a good backup script and couldn't find one. It had to be fast, able to cope with large amounts of data, and configurable, so I could set how many days of backups to keep.
The folder structure below has to be created manually. The rotation script takes anything in the incoming folder and moves it to the appropriate destination backup archive.
drwxr-xr-x 15 root root 4096 Apr 23 06:44 backup.daily
drwxr-xr-x  4 root root 4096 Apr  1 06:29 backup.monthly
-rwxr-x---  1 root root 1386 Feb 21 11:53 backup.sh
drwxr-xr-x 10 root root 4096 Apr 20 06:38 backup.weekly
drwxr-xr-x  2 root root 4096 Apr 23 06:44 incoming
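If you're setting this up from scratch, a minimal sketch of the commands that create this layout (using the example path from the scripts below; adjust it to your server):

# Create the storage folder and the four subfolders the rotation script expects
mkdir -p /home/backups/your_website_name/{backup.daily,backup.weekly,backup.monthly,incoming}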
Here is my original script, translated to English:
backup.sh
#!/bin/bash
# Julius Zaromskis
# Backup rotation

# Storage folder where to move backup files
# Must contain backup.monthly backup.weekly backup.daily folders
storage=/home/backups/your_website_name

# Source folder where files are backed up
source=$storage/incoming

# Destination file names
date_daily=`date +"%d-%m-%Y"`
#date_weekly=`date +"%V sav. %m-%Y"`
#date_monthly=`date +"%m-%Y"`

# Get current month day and week day number
month_day=`date +"%d"`
week_day=`date +"%u"`

# Optional check if source files exist. Email if failed.
if [ ! -f $source/archive.tgz ]; then
  ls -l $source/ | mail your@email.com -s "[backup script] Daily backup failed! Please check for missing files."
fi

# It makes sense to run this script daily. We take files from the source folder
# and move them to the appropriate destination folder.

# On the first day of the month
if [ "$month_day" -eq 1 ]; then
  destination=backup.monthly/$date_daily
else
  # On Saturdays
  if [ "$week_day" -eq 6 ]; then
    destination=backup.weekly/$date_daily
  else
    # On any regular day
    destination=backup.daily/$date_daily
  fi
fi

# Move the files
mkdir $destination
mv -v $source/* $destination

# daily - keep for 14 days
find $storage/backup.daily/ -maxdepth 1 -mtime +14 -type d -exec rm -rv {} \;

# weekly - keep for 60 days
find $storage/backup.weekly/ -maxdepth 1 -mtime +60 -type d -exec rm -rv {} \;

# monthly - keep for 300 days
find $storage/backup.monthly/ -maxdepth 1 -mtime +300 -type d -exec rm -rv {} \;
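The retention windows are driven by find's -mtime test: a directory last modified more than N days ago matches +N. If you want to verify what would be purged before trusting the -exec rm, you can dry-run the same test with -print; a sketch using the daily folder:

# List daily backup folders older than 14 days without deleting anything
find /home/backups/your_website_name/backup.daily/ -maxdepth 1 -mtime +14 -type d -print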
Please note: only file rotation is performed; it is up to you to copy your backup into the source folder (incoming). For speed, files are moved out of the source folder rather than copied! Here is what I use on one of my servers to back up an entire website:
/etc/cron.daily/my_website_backup
#!/bin/bash
#@author Julius Zaromskis
#@description Backup script for your website

BACKUP_DIR=/home/backups/your_website_name
FILES_DIR=/var/www/your_website_name

# Dump MySQL tables
mysqldump -h 127.0.0.1 -u admin -pyourpassword database > $BACKUP_DIR/incoming/mysql_dump.sql

# Compress tables and files
tar -cvzf $BACKUP_DIR/incoming/archive.tgz $BACKUP_DIR/incoming/mysql_dump.sql $FILES_DIR

# Cleanup
rm $BACKUP_DIR/incoming/mysql_dump.sql

# Run backup rotate
cd $BACKUP_DIR
bash backup.sh
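One caveat: passing the password with -p on the command line exposes it in the process list while mysqldump runs. MySQL clients can read credentials from an option file instead; a sketch, assuming the script runs as root so mysqldump picks up /root/.my.cnf (the file name and contents here are illustrative):

# /root/.my.cnf, readable by root only (chmod 600):
#   [client]
#   user=admin
#   password=yourpassword

# With the option file in place, drop -u/-p from the dump command
mysqldump -h 127.0.0.1 database > $BACKUP_DIR/incoming/mysql_dump.sql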
Using this! Really clear, simple code that is well laid out.
Thanks… nicely done!!
Hi all, we have just developed web3backup; it's a PHP script that backs up MySQL, SVN, and local files with binary incremental backup, and rotates daily, weekly, monthly, and yearly backups. You can get it at:
http://www.exteon.ro/en/products/programming-tools/web3backup
Thanks a lot for the great script, a real timesaver!
Recent mail clients are fussier. You need to put the email address at the end of the line, i.e. change:
ls -l $source/ | mail your@email.com -s "[backup script] Daily backup failed! Please check for missing files."
to
ls -l $source/ | mail -s "[backup script] Daily backup failed! Please check for missing files." your@email.com
"mkdir $destination" expands to mkdir backup.monthly/$date_daily, which doesn't work unless the script is run from inside $storage; it probably needs $storage/$destination instead. Same with mv.
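For anyone applying that fix, a sketch of the two changed lines (untested; the added -p also makes the mkdir safe to re-run):

# Move the files (paths anchored to $storage so the script works from any directory)
mkdir -p $storage/$destination
mv -v $source/* $storage/$destination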
First off, GREAT script! A couple of things I thought of in my time scripting over the years: (1) combine the scripts into functions and call those; (2) set a minimum number of backups to keep. Currently it will delete them all if, for example, the backup quit working properly but the script kept running and performing its regular "housekeeping". I am going to update the script and share it here. Thanks again, I love the thoughtfulness you have put into it so far! Have a nice day!
OK, as I said I would, I updated the script. What I mainly did was: (1) combined the scripts and created a main() function; (2) added checks for things like whether the folders exist and the backup worked; (3) made sure that at least X backups always remain, as a user-defined option. Though if the purge is run seldom it could still delete everything; I might fix that later. Anyway, let me know what you think of this version: https://github.com/mikequick-dev/bash-scripts/blob/master/bash-general/backup_www_folder.sh
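The minimum-count idea described above can be approximated with a simple guard before each purge; a sketch (min_keep and the count logic are illustrative, not taken from the linked script). It skips a purge when few backups remain, though it can still dip below the minimum in a single pass, matching the caveat in the comment:

# Purge daily backups only while more than $min_keep remain (illustrative)
min_keep=3
count=$(find $storage/backup.daily/ -mindepth 1 -maxdepth 1 -type d | wc -l)
if [ "$count" -gt "$min_keep" ]; then
  find $storage/backup.daily/ -maxdepth 1 -mtime +14 -type d -exec rm -rv {} \;
fi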
This is also a very good backup script: https://github.com/ccztux/glsysbackup
Just wanted to drop a note and say thanks for this post. I used it yesterday to set up my own backup archiving on a Synology NAS. It is still useful info.
Still usable. I've built my backups based on this one. Thanks!