I am looking for a strategy for backing up data. I have an Ubuntu 10.04 box that essentially acts as a server. I work on multiple UNIX systems, and I back up critical databases, directories, files, etc. daily via a combination of tar, bzip2, sshfs, and scp. As a result I have daily "snapshots", and this is beginning to take its toll on my storage space. I'm okay with snapshots, but I was hoping to find out if there is a better way. Any ideas or pointers are greatly appreciated. One thing I am not familiar with and haven't studied is the idea of using tar, etc., to "sync" to an existing tar file. Maybe that's a better approach, so if anyone can point me in the right direction on that as well, I'd appreciate it. asked 28 May '10, 15:00 Andy |
I've used two strategies, depending on my needs. For both, I have an online RAID6 array containing a directory for each machine I'm backing up. Currently I'm using a Thecus N5200, which has been reliable for a few years.
1) rsync is fast and will help your snapshot problem a bit. It overwrites files having the same name with the newer version. It does NOT delete files removed on the original machine, which may be a plus or a minus depending on whether your priority is archiving or restoring.
2) For snapshots using an efficient diff scheme to minimize duplication, I use rdiff-backup. It's much slower than rsync, but it has allowed me to restore working machines when they've been totally clobbered. It also has a feature for deleting old snapshots. Read all about it (after you install it from Synaptic or Applications/Ubuntu Software Center) with: man rdiff-backup
Good luck. answered 28 May '10, 15:39 DBA |
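A minimal sketch of the two approaches described above; the host name (backupbox) and paths are illustrative assumptions, so adjust them to your own layout:
rsync -av /home/ backupbox:/backups/thismachine/home/
rdiff-backup /home/ backupbox::/backups/thismachine/home
rdiff-backup --remove-older-than 4W /backups/thismachine/home
The first command mirrors /home to the backup box (without --delete, files removed locally remain on the target, as noted above). The second stores an incremental snapshot chain with rdiff-backup, and the third, run on the machine holding that chain, prunes increments older than four weeks.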
Well, if you are a Perl-oriented person, then http://search.cpan.org/~lbrocard/Dackup-0.44/lib/Dackup.pm is your friend. I use it for everything and anything. Just write a small backup application (about 20 lines of code), set it up under cron, and all problems are solved (for me at least). answered 17 Aug '10, 07:53 Robert Bakaric |
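A hypothetical crontab entry for such a script (the script name and paths here are made up; the script itself is whatever small Dackup wrapper you write):
0 2 * * * /usr/bin/perl /home/you/bin/nightly-backup.pl >> /var/log/nightly-backup.log 2>&1
This runs the backup every night at 02:00 and appends its output to a log file.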
rdiff-backup used with http://smxi.org/site/about.htm#rbxi answered 12 Aug '10, 01:47 craigevil |
Tar is not really meant for incremental backups unless you're using tape. rsync is the best solution I've found. I have a script that has worked well for me for years, which you can download from http://www.unc.edu/~adamsonj/files/jbackup-0.1.tar.gz It's licensed under the GPL. Keep in mind that experts (i.e. paranoids) recommend keeping backups on multiple media and in multiple locations, e.g. in case your house burns down or some other catastrophe destroys your whole system (including the external hard drive you are mirroring to). Enjoy! Joel answered 19 Jun '10, 19:49 trashbird1240 |
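A hedged sketch of the multiple-location advice, assuming a locally mounted external drive and an off-site host (the paths and host name are placeholders):
rsync -a --delete /srv/data/ /media/external/data-mirror/
rsync -a --delete -e ssh /srv/data/ you@offsite.example.com:/backups/data-mirror/
Note that --delete makes each copy an exact mirror (files deleted at the source disappear from the mirror), trading archival history for a clean restore image; drop it if you want removed files to linger.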
Unison File Synchronizer works wonderfully for this. http://www.cis.upenn.edu/~bcpierce/unison/ answered 18 Jun '10, 09:11 ananth.p |
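A minimal example of a Unison run, assuming a hypothetical backup host; -batch makes it synchronize without asking questions:
unison -batch /home/you ssh://backupserver//backups/you
Unlike a one-way mirror, Unison propagates changes in both directions, so it behaves more like a synchronizer than a pure backup tool.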
rsnapshot will give you the benefits of daily snapshots without taking much more space than a single snapshot (at least for normal usage patterns; if you have a ton of data changing on a daily basis, you'll want to explore other options).
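A minimal rsnapshot sketch, assuming /backups/snapshots/ as the destination; rsnapshot keeps space usage low by hard-linking unchanged files between snapshots, so only changed files consume extra disk. Note that fields in /etc/rsnapshot.conf must be separated by tabs, not spaces:
snapshot_root   /backups/snapshots/
interval        daily   7
interval        weekly  4
backup          /home/  localhost/
backup          /etc/   localhost/
Then schedule it from cron, e.g. a nightly line like:
30 3 * * * /usr/bin/rsnapshot daily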
--jeremy answered 11 Jun '10, 02:16 jeremy ♦♦ |
rsync is a good option; I also find simple-backup and luckybackup useful in their own ways. Obviously, if you JUST want to back up folders and files, that is different from backing up entire systems, for which you would use a different toolset, like Clonezilla. https://help.ubuntu.com/community/BackupYourSystem/SimpleBackupSuite answered 28 May '10, 22:24 Ron ♦ |
I don't have an 'answer' as such, but rsync could be very useful to you. For example, the command below only copies files (or parts of files) if they differ, so nothing unnecessary is transferred.
rsync -e ssh -au /var/www/. root@server2:/var/www/.
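Here -a is archive mode (recursive, preserving permissions, ownership, timestamps and symlinks), -u skips files that are already newer on the destination, and -e ssh runs the transfer over SSH. Run it from cron and you have a low-effort nightly mirror.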