Hello, I'm looking for a backup tool for my (Arch) Linux machine. On the Arch Linux wiki I've found this list: backup programs. There are so many programs that I don't know which one to choose, and when I search for reviews on the web I keep finding several others that look great. I'm looking for something that does incremental backups (to save disk space and time) and makes it easy to recover and manipulate the backups. I'd like something easily customisable (with a bash script maybe) to do things like:
- "save my system every week, delete after two months"
- "save my home folder every day except the following folders; don't follow the links (to another partition or elsewhere) but keep them"
- "do an incremental backup every x and a full backup every y"
- "save x every day, delete after a week, but keep one backup each at one week, two weeks, one month and three months old"
I'm only using Linux, so I don't need a cross-platform solution.
I tried rsync some time ago. It's a great tool, but I had problems keeping users and permissions. Also, rsync doesn't let you recover anything other than the last backup (stop me if I'm wrong). I've heard a lot about rdiff-backup but have never tried it; its advantage is being able to recover previous backups. The wiki also lists link-backup. I'd never heard of it, but it looks great and I may test it. Does anyone know it?
Unison: I've seen some good reviews. It has bidirectional synchronisation (a feature I don't really need).
rdup: another program I don't know (based on hdup, like duplicity, but it looks more powerful). I like the spirit of the program, "don't reinvent the wheel again and again": instead of doing the backup itself, it uses other Unix tools for that. It can do compression and encryption. The problem is that it copies the full file and not just the difference (but if one backup fails, that's then less of a problem). If someone has tested it, I'd really like to hear comments.
What do you use and why? Please expand on your point and explain the main features of the program compared to the others. asked 24 Apr '10, 08:39 martvefun
It sounds like rsnapshot should satisfy most of your needs: http://www.linuxquestions.org/linux/articles/Jeremys_Magazine_Articles/Backups_with_rsnapshot
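For reference, a minimal sketch of what the relevant parts of an rsnapshot setup might look like; the paths, intervals and times below are placeholders, not a recommended policy, and rsnapshot requires tabs between fields in its config file:

    # /etc/rsnapshot.conf (excerpt; fields must be tab-separated)
    snapshot_root   /backup/snapshots/
    interval        daily   7
    interval        weekly  4
    interval        monthly 3
    backup          /home/  localhost/
    backup          /etc/   localhost/

    # crontab entries that drive the rotation
    30 3 * * *   /usr/bin/rsnapshot daily
    0  4 * * 1   /usr/bin/rsnapshot weekly
    30 4 1 * *   /usr/bin/rsnapshot monthly

Each snapshot appears as a full directory tree under /backup/snapshots/, but unchanged files are hard links, so the extra space per snapshot is small.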
--jeremy answered 24 Apr '10, 20:02 jeremy ♦♦
Thank you, but why not use rsync directly?
(25 Apr '10, 11:48)
martvefun
rsnapshot is both easier to set up and gets you additional functionality that you'd need to manually replicate or go without if you used rsync alone. --jeremy
(25 Apr '10, 16:20)
jeremy ♦♦
I would recommend sbackup: http://sourceforge.net/projects/sbackup It is very easy to configure and provides a configuration GUI. answered 25 Apr '10, 16:06 Jazz ♦
I use that for my Ubuntu server; very easy and convenient.
(27 Apr '10, 21:18)
atilla
sbackup is now unsupported and it doesn't work on the latest Ubuntu. It's been replaced by nssbackup (not so simple backup). I looked at many options before settling on backintime. One to keep an eye on is Time Drive (http://www.oak-tree.us/blog/index.php/science-and-technology/time-drive)
(13 May '10, 12:26)
PJO
rsync seems to fit the bill. If you prefer a GUI version of rsync, have a look at Back In Time. http://lifehacker.com/5212899/back-in-time-does-full-linux-backups-in-one-click answered 03 May '10, 09:44 beachboy2
The biggest problem with backintime is that it does ONLY incremental backups. If I lose my computer, an incremental backup without the first full backup is useless.
(15 May '10, 08:46)
martvefun
I prefer writing a good shell script that uses rsync to send your backed-up data elsewhere. You can configure an array of directories and a network target URI, then write a script that loops over those directories and pushes each one to the target (a minimal sketch of the idea follows).
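Something along these lines; the directories, the target host and the option choices are placeholders, not a finished script:

    #!/bin/bash
    # Directories to back up and where to send them (placeholders).
    SOURCES=(/etc /home /var/www)
    TARGET="backupuser@backuphost:/srv/backups/$(hostname)"

    for dir in "${SOURCES[@]}"; do
        # -a preserves permissions, owners, groups, symlinks and times;
        # run it as root (or use --fake-super on the receiver) so ownership survives.
        rsync -a --delete --relative "$dir" "$TARGET/" || echo "rsync failed for $dir" >&2
    done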
This design is intended primarily for cron jobs, firing off anywhere from every other week to once a month, and is set up so that all the user has to do is configure what gets backed up and where the backups are sent. It's a good fit for a home file server, but it can even send backups across the Internet if you so desire. answered 03 May '10, 16:45 Yaro Kasear
There are basically four ways to back up data (let's say we have 1 GB of data to back up):
1/ One backup of the data using rsync. Pros: fast; only 1 GB of space needed for the backup. Cons: only one backup.
2/ Multiple copies of the data (using rsync, cp, tar or zip), say keeping the last 4 weeks. Pros: multiple aged backups; for each backup you have the full directory structure of the data. Cons: 4 GB of space needed.
3/ Incremental backups (using tar or zip), say keeping 1 full backup and 3 incrementals. Pros: multiple aged backups; a bit more than 1 GB of space needed. Cons: the incremental backups contain only the modified files, so it's quite difficult to find the files you want to restore.
4/ Rsync + hard links (the best way, imo), say keeping the last 4 weeks. Pros: multiple aged backups; a bit more than 1 GB of space needed; each backup contains the full directory structure of the data. Cons: slower.
How (4) works: it takes multiple full backups, but by using hard links between the files in backup N and the files from backup N-1 it creates the illusion of multiple full backups. The data is actually only stored in the first backup; the following backups are only links, plus the differences (files added or changed between backups). Rsnapshot (command line) and BackInTime (GUI) work this way; a minimal sketch with plain rsync follows this answer. answered 03 May '10, 22:41 rndmerle
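A minimal sketch of method (4) with plain rsync and --link-dest; the source path, destination and retention of four snapshots are assumptions for illustration:

    #!/bin/bash
    # Rotating hard-linked snapshots with rsync --link-dest (sketch).
    SRC=/home/
    DEST=/backup

    # Drop the oldest snapshot and shift the others up by one.
    rm -rf "$DEST/daily.3"
    for i in 2 1 0; do
        [ -d "$DEST/daily.$i" ] && mv "$DEST/daily.$i" "$DEST/daily.$((i+1))"
    done

    # New snapshot: files unchanged since the previous run become hard links
    # to the copies in daily.1, so every daily.N looks like a full backup
    # while only the changed files take extra space.
    rsync -a --delete --link-dest="$DEST/daily.1" "$SRC" "$DEST/daily.0/"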
I can recommend Luckybackup: http://luckybackup.sourceforge.net/ Very easily configurable via its GUI, based on rsync, and it automatically creates cron jobs. answered 05 May '10, 08:23 Dion
Seconded and agreed. I was going to recommend this too, but you beat me to it.
(25 Aug '10, 14:45)
Ron ♦
Dear friend, you can also use DAR for backup. It can take differential as well as full backups. See the following links: http://gradha.sdf-eu.org/textos/dar-differential-backup-mini-howto.en.html and http://www.softpedia.com/progDownload/DAR-Download-130423.html
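A minimal sketch of a full-plus-differential cycle with dar, assuming the data lives in /home and the archives go under /backup (the names are placeholders):

    # Full backup of /home into /backup/full_home.1.dar (sketch).
    dar -c /backup/full_home -R /home

    # Later: a differential archive containing only what changed since the
    # full one, using the full archive (-A) as the reference.
    dar -c /backup/diff_home -A /backup/full_home -R /home

    # Restore: extract the full archive first, then the differential on top
    # of it (-w suppresses the overwrite warnings).
    dar -x /backup/full_home
    dar -x /backup/diff_home -w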
answered 05 May '10, 09:30 rahuldevalone
From what I've read, dar seems good, thanks.
(05 May '10, 11:08)
martvefun
I recommend a backup expert tool (which suggests a commercial product) since you mentioned incrementals. I use a product from Acronis which has worked great for me in the past. It is intuitive and has many features, including the ability to back up and restore dissimilar OSes for those who have a dual-boot setup. I hesitate to recommend it now since it has gotten kind of pricey: the home backup product costs $50 and the plus pack is another $30, as opposed to the $35 I paid for both products about 3 years ago. If you consider your data very important, that $80 may well be worth it. FYI: generally speaking, backing up data is relatively easy, but the recovery can be a bitch if you don't manage it properly. answered 03 May '10, 19:42 jpvrla
One solution that seems to have been overlooked here is Amanda (http://en.wikipedia.org/wiki/Advanced_Maryland_Automatic_Network_Disk_Archiver); it's quite popular and feature-rich, and its development is also supported for Enterprise-level systems.
There is always a trade-off in customisation if you use an off-the-shelf product like the many suggested in these answers. On the other hand, you could have a simple set of scripts to deploy your own backup system once you are familiar with the basic concepts of a "rotation" scheme like GFS, where you have monthly full backups, weekly partial ones and daily incremental ones: http://en.wikipedia.org/wiki/Backup_rotation_scheme Compressors like zip or rar allow you to process only the modified/new files and save space at the same time (a small sketch follows this answer).
Also remember to keep a copy of really important data off-site, for instance on a CD at a friend's place or with one of those online (cloud) file storage services like JungleDisk. answered 12 May '10, 10:13 pmarini
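A minimal sketch of the "process only modified/new files" idea with zip, combined with a GFS-flavoured naming scheme; the archive paths, the /home source and the day-of-week/day-of-month choices are assumptions:

    #!/bin/bash
    # Daily: -u only adds files that are new or newer than what is already
    # in the archive, so unchanged files are not recompressed.
    zip -r -u /backup/daily-home.zip /home

    # Weekly and monthly: start a fresh full archive, GFS-style, with the
    # date in the name so old full archives can be rotated out separately.
    [ "$(date +%u)" = 7 ]  && zip -r "/backup/weekly-home-$(date +%Y%m%d).zip" /home
    [ "$(date +%d)" = 01 ] && zip -r "/backup/monthly-home-$(date +%Y%m).zip" /home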
Write your own backup routine as a bash script. The tar command is about as useful as a Swiss Army knife for that task and can be wrapped with some minimal scripting to do exactly what you want. Otherwise you are picking from one of a million backup utilities that may all do something slightly different from what you really need.
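A minimal sketch of that approach using GNU tar's incremental mode; the paths, the weekly-full/daily-incremental schedule and the file names are assumptions:

    #!/bin/bash
    # Weekly full + daily incremental backups of /home with GNU tar (sketch).
    SNAR=/backup/home.snar        # tar's snapshot file, records what has changed
    STAMP=$(date +%Y%m%d)

    if [ "$(date +%u)" = 7 ]; then
        # Sunday: new full backup; removing the snapshot file resets the chain.
        rm -f "$SNAR"
        tar --create --gzip --listed-incremental="$SNAR" \
            --file="/backup/home-full-$STAMP.tar.gz" /home
    else
        # Other days: only files changed since the last run go into the archive.
        tar --create --gzip --listed-incremental="$SNAR" \
            --file="/backup/home-incr-$STAMP.tar.gz" /home
    fi

    # To restore, extract the full archive first, then each incremental in
    # order, passing --listed-incremental=/dev/null so tar also replays deletions.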