

Hello,

I'm looking for a backup tool for my (Arch) Linux system.

On the Arch Linux wiki, I found this list: backup programs. There are so many programs that I don't know which one to choose, and when I search for reviews on the web I keep finding several others that look great.

I'm looking for something that does incremental backups (to save disk space and time) and makes the backups easy to recover and manipulate. I'd like something easily customisable (with a bash script, maybe) to do things like "back up my system every week, delete after two months", "back up my home folder every day except the following folders, don't follow the links (to another partition or elsewhere) but keep them", "do an incremental backup every x and a full backup every y", "back up x every day, delete after a week, but keep one backup each at one week, two weeks, one month and three months old", ...

I only use Linux, so I don't need a cross-platform solution.

I tried rsync some time ago. It's a great tool, but I had problems keeping users and permissions. Also, rsync doesn't let you recover anything other than the last backup (stop me if I'm wrong).

I've heard a lot about rdiff-backup but have never tried it. Its advantage is being able to recover previous backups.

The wiki also lists link-backup. I'd never heard of it, but it looks great and I may test it. Does anyone know it?

Unison: I've seen some good reviews. It has bidirectional synchronisation (a feature I don't really need).

rdup: another program I don't know (based on hdup, like duplicity, but it looks more powerful). I like the spirit of the program, "don't reinvent the wheel again and again": instead of doing the backup itself, it uses other Unix tools for that. It can do compression and encryption. The problem is that it copies the full file and not the difference (but then if one backup fails, it's less of a problem). If someone has tested it, I'd really like to hear comments.

What do you use, and why? Please develop your point and explain the main features of the program compared to the others.

asked 24 Apr '10, 08:39 by martvefun (edited 24 Apr '10, 13:02)

Write your own backup routine as a bash script. The tar command is about as versatile as a Swiss Army knife for that task and can be wrapped with some minimal scripting to do exactly what you want. Otherwise you are picking from one of a million backup utilities that all may do something slightly different from what you really need.

(05 May '10, 08:50) shreddies
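As a minimal sketch of such a tar wrapper (assuming GNU tar for its `--listed-incremental` snapshot mode; all paths here are throwaway temp dirs, purely illustrative):

```shell
#!/bin/sh
# Minimal sketch of a home-grown tar backup routine (assumes GNU tar).
# Uses throwaway temp dirs so the example is self-contained.
set -e
SRC=$(mktemp -d)
DEST=$(mktemp -d)
echo "v1" > "$SRC/file.txt"

# Full (level-0) backup; the .snar snapshot file records what was saved.
tar --listed-incremental="$DEST/state.snar" -czf "$DEST/full.tar.gz" -C "$SRC" .

# Change something, then take an incremental backup against the snapshot.
sleep 1
echo "v2" > "$SRC/file.txt"
tar --listed-incremental="$DEST/state.snar" -czf "$DEST/incr1.tar.gz" -C "$SRC" .

# Restore: extract the full archive first, then each incremental in order.
RESTORE=$(mktemp -d)
tar --listed-incremental=/dev/null -xzf "$DEST/full.tar.gz" -C "$RESTORE"
tar --listed-incremental=/dev/null -xzf "$DEST/incr1.tar.gz" -C "$RESTORE"
cat "$RESTORE/file.txt"
```

From there, cron plus a few `rm` lines on dated archives give you the "delete after two months" style retention rules the question asks about.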




Simple Backup for local HDD to external HDD, and CrashPlan Pro to go from your local PC to the cloud: $7.99 a month for unlimited space.

answered 03 Mar '13, 20:03 by Ron ♦

Try Simple Backup (sbackup); it works wonders for me.

answered 04 Nov '12, 10:13 by Ron ♦

I would recommend Duplicity, which is by the same primary author as rdiff-backup. It does incremental backups nicely, and it supports S3 storage on the backend, which is how I'm currently using it. Plus, restoration is pretty straightforward.

This is the site: http://duplicity.nongnu.org/index.html

answered 31 Oct '12, 14:53 by ranton


sbackup
rsync
bacula
amanda
clonezilla

...all of these and a few others are good. While the tool is important, no one here has mentioned methodology, which I'll recommend here.

Do a full weekly backup and then daily incremental backups. So if you do a full backup on Friday and your system borks on Monday, you need only restore the full Friday backup followed by the incremental backups of Saturday and Sunday to get back to where you were before the crash.

Have redundant redundancy that is redundant. In other words, have a RAID 1 setup, do your backups as noted above, and also have those same backups backed up not only locally, but off-site as well, such as via an online storage facility.

Lastly, the most overlooked and rarely ever done step: testing your backups! Backups that are never tested are worthless. All too often a system crashes and it's only then that the person finds out that their backup scheme has failed them, because they never tested the backups before the system crashed.

All the geeky tools, scripts, etc. are absolutely worthless unless the backups can be used to restore the system. There are many ways to bake the cake, but if the cake isn't edible, what good is it, ya know what I mean?
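The "test your backups" step can be sketched in a few lines of shell: restore the archive into a scratch directory and diff it against the live data (the paths here are throwaway temp dirs, purely illustrative):

```shell
#!/bin/sh
# Sketch of the "test your backups" step: restore into a scratch directory
# and diff against the live data. Throwaway temp dirs, purely illustrative.
set -e
SRC=$(mktemp -d)
mkdir -p "$SRC/docs"
echo "important" > "$SRC/docs/data.txt"

WORK=$(mktemp -d)
tar -czf "$WORK/backup.tar.gz" -C "$SRC" .

# The actual test: a restore you have never tried is a backup you do not have.
SCRATCH=$(mktemp -d)
tar -xzf "$WORK/backup.tar.gz" -C "$SCRATCH"
diff -r "$SRC" "$SCRATCH" && echo "backup verified"
```

Run the same check against the off-site copy too; a corrupt upload fails in exactly the same silent way.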

answered 25 Aug '10, 14:52 by Ron ♦

I recommend Amanda as well. It's hard to set up, but once you've got it running, it keeps on going. Amanda Backup

answered 25 Aug '10, 08:42 by EricTRA

Hi, I'm using safekeep + rdiff-backup, and it saved my neck when a RAID 5 broke (one disk physically damaged and one out of sync).

http://safekeep.sourceforge.net/index.shtml

answered 13 May '10, 18:36 by Eduard Malinschi

One solution that seems to have been overlooked here is Amanda (http://en.wikipedia.org/wiki/Advanced_Maryland_Automatic_Network_Disk_Archiver). It's quite popular and feature-full, and its development is also supported for enterprise-level systems.

There is always a trade-off in customisation if you use an off-the-shelf product, like the many suggested in these answers. On the other hand, you could have a simple set of scripts to deploy your own backup system, once you are familiar with the basic concepts of "rotation", like GFS, where you have full monthly backups, partial weekly ones and incremental daily ones: http://en.wikipedia.org/wiki/Backup_rotation_scheme
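A rotation scheme like GFS can be sketched as a small shell helper that decides which tier a given date falls into. The policy below (monthly full on the 1st, weekly full on Sundays, incremental otherwise) and the tier names are illustrative assumptions, and `date -d` requires GNU date:

```shell
#!/bin/sh
# Illustrative GFS helper: pick a backup tier for a date (YYYY-MM-DD).
# Assumed policy: monthly full on the 1st, weekly full on Sundays,
# daily incremental otherwise. Requires GNU date for the -d flag.
backup_tier() {
    dom=$(date -d "$1" +%d)   # day of month, 01-31
    dow=$(date -d "$1" +%u)   # day of week, 1=Mon .. 7=Sun
    if [ "$dom" = "01" ]; then
        echo "full-monthly"
    elif [ "$dow" = "7" ]; then
        echo "full-weekly"
    else
        echo "incremental"
    fi
}

backup_tier "2010-05-01"   # the 1st of the month -> full-monthly
```

A nightly cron job can then call the helper with today's date and branch to the matching tar or rsync command.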

Compressors like zip or rar let you process only the modified/new files while also saving space. Also remember to keep a copy of really important data off-site, for instance on a CD at a friend's place, or using one of those online (cloud) file-storage services like JungleDisk.

answered 12 May '10, 10:13 by pmarini

Dear friend, you can also use DAR for backups. It can do differential backups and full backups.

See the following links:

http://gradha.sdf-eu.org/textos/dar-differential-backup-mini-howto.en.html

http://www.softpedia.com/progDownload/DAR-Download-130423.html

Enjoy!
answered 05 May '10, 09:30 by rahuldevalone

From what I've read, dar seems good. Thanks.

(05 May '10, 11:08) martvefun

I can recommend Luckybackup: http://luckybackup.sourceforge.net/

Very easily configurable with a GUI, based on rsync. It automatically creates cron jobs.

answered 05 May '10, 08:23 by Dion

Seconded and agreed. I was going to recommend this too, but you beat me to it.

(25 Aug '10, 14:45) Ron ♦

There are basically four ways to back up data (let's say we have 1 GB of data to back up):

1/ One backup of the data using rsync. Pros: fast; only 1 GB of space needed for the backup. Cons: only one backup.

2/ Multiple copies of the data (using rsync, cp, tar or zip). Let's say we keep the last 4 weeks. Pros: multiple aged backups; for each backup, you have the full directory structure of the data. Cons: 4 GB of space needed.

3/ Incremental backups (using tar or zip). Let's say we keep 1 full backup and 3 incrementals. Pros: multiple aged backups; a bit more than 1 GB of space needed. Cons: the incremental backups contain only the modified files, so it's quite difficult to find the files you want to restore.

4/ Rsync + hard links (the best way, IMO). Let's say we keep the last 4 weeks. Pros: multiple aged backups; a bit more than 1 GB of space needed; each backup contains the full directory structure of the data. Cons: slower.

How (4) works: it takes multiple full backups but, by using hard links between files in backup N and files from backup N-1, it creates the illusion of multiple full backups. The data is actually stored only in the first backup; the following backups are just links, plus the differences (files added or changed between backups).

Rsnapshot (command line) and BackInTime (GUI) work this way (4).

answered 03 May '10, 22:41 by rndmerle


Asked: 24 Apr '10, 08:39

Seen: 11,723 times

Last updated: 03 Mar '13, 20:03
