Please note that LinuxExchange will be shutting down on December 31st, 2016.


Hello,

I'm looking for a backup tool for my (Arch) Linux system.

On the Arch Linux wiki, I found this list: backup programs. There are so many programs that I don't know which one to choose, and when I search for reviews on the web I keep finding several others that look great.

I'm looking for something that does incremental backups (to save disk space and time) and makes it easy to recover and manipulate the backups. I'd like something easily customisable (with a bash script, maybe) to do things like "back up my system every week, delete after two months", "back up my home folder every day except the following folders, don't follow the links (to another partition or elsewhere) but keep them", "do an incremental backup every x and a full backup every y", "back up x every day, delete after a week, but keep one backup each at one week, two weeks, one month and three months old", ...

I only use Linux, so I don't need a cross-platform solution.

I tried rsync some time ago. It's a great tool, but I had problems keeping users and permissions. Also, rsync doesn't let you recover anything other than the last backup (stop me if I'm wrong).

I've heard a lot about rdiff-backup but never tried it. Its advantage is being able to recover previous backups.

The wiki also lists link-backup. I'd never heard of it, but it looks great and I may test it. Does anyone know it?

Unison: I've seen some good reviews. It has bidirectional synchronisation (a feature I don't really need).

rdup: another program unknown to me (based on hdup, like duplicity, but it looks more powerful). I like the program's spirit of "not reinventing the wheel again and again": instead of doing the backup itself, it uses other Unix tools for that. It can do compression and encryption. The problem is that it copies the full file rather than the difference (though if one backup fails, that's then less of a problem). If someone has tested it, I'd really like to hear their comments.

What do you use, and why? Please elaborate on your choice and explain the program's main features compared to the others. Thank you.

asked 24 Apr '10, 08:39


martvefun

edited 24 Apr '10, 13:02

Write your own backup routine as a bash script. The tar command is about as useful as a Swiss Army knife for that task and can be wrapped with some minimal scripting to do exactly what you want. Otherwise you are picking from one of a million backup utilities that all may do something slightly different from what you really need.
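As a sketch of that approach, GNU tar's `--listed-incremental` mode handles the full-plus-incremental scheme the question asks about; the paths below are examples:

```shell
#!/bin/sh
# Sketch of a tar-based incremental backup (GNU tar); paths are examples.
BACKUP_DIR=/backup
SNAPSHOT="$BACKUP_DIR/home.snar"   # tar's incremental state file

# The first run (when the .snar file doesn't exist) creates a full
# backup; later runs with the same .snar file only archive files
# changed since the previous run. Delete the .snar file to force a
# fresh full backup.
tar --listed-incremental="$SNAPSHOT" -czf \
    "$BACKUP_DIR/home-$(date +%Y%m%d).tar.gz" /home/user
```

Wrapped in a weekly cron job, plus a `find ... -mtime ... -delete` line for retention, this covers most of the "full backup every y, incremental every x" requirement.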

(05 May '10, 08:50) shreddies




I recommend an expert backup tool (which suggests a commercial product), since you mentioned incrementals. I use a product from Acronis which has worked well for me in the past. It is intuitive and has many features, including the ability to back up and restore dissimilar OSes for those who have a dual-boot setup. I hesitate to recommend it now, since it has gotten kind of pricey: the home backup product costs $50 and the plus pack is another $30, as opposed to the $35 I paid for both products about three years ago. If you consider your data very important, that $80 may well be worth it.

FYI: generally speaking, backing up data is relatively easy, but recovery can be a bitch if you don't manage it properly.


answered 03 May '10, 19:42


jpvrla

I've been liking Back In Time on my desktop very much. On the headless server(s) rsnapshot gets my vote.


answered 03 May '10, 18:51


Kevin

I prefer writing a good shell script that uses rsync to send your backed-up data elsewhere. You can configure an array of directories and a network target URI, then write a script that, in this order:

  1. Begins a loop over each entry in the directory array.
  2. Adds the contents of the current directory to an archive and compresses it.
  3. Loops to the next directory in the array.
  4. Once the array is done, timestamps the backup's name.
  5. Opens an rsync connection to the specified network target and transfers the compressed archive to the specified destination.
  6. Deletes the oldest archive, as it's not likely to be useful anymore (a maximum number of archives could even be made configurable).

This design is strongly intended for cron jobs, maybe firing off every other week to once a month, and all the user has to do is configure what is backed up and where the backups are sent. It could be a good fit for a home file server, but it can even send backups across the Internet if you so desire.
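The steps above might look something like this in bash; the directories, host, and retention count are placeholder values:

```shell
#!/bin/bash
# Sketch of the routine described above; all names are example values.
DIRS=(/etc /home/user/documents)              # what to back up
HOST="backupuser@fileserver"                  # hypothetical target host
REMOTE_DIR="/srv/backups"                     # hypothetical target path
KEEP=8                                        # maximum archives to keep

STAMP=$(date +%Y%m%d-%H%M%S)                  # step 4: timestamp the name
ARCHIVE="/tmp/backup-$STAMP.tar.gz"

# steps 1-3: archive and compress the configured directories
tar -czf "$ARCHIVE" "${DIRS[@]}"

# step 5: transfer the compressed archive to the network target
rsync -a "$ARCHIVE" "$HOST:$REMOTE_DIR/"

# step 6: delete the oldest archives beyond the configured maximum
ssh "$HOST" \
    "cd $REMOTE_DIR && ls -1t backup-*.tar.gz | tail -n +$((KEEP + 1)) | xargs -r rm -f"

rm -f "$ARCHIVE"
```

The retention line sorts archives newest-first and removes everything past the `KEEP`th entry, so the remote side never grows without bound.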


answered 03 May '10, 16:45


Yaro Kasear

rsync seems to fit the bill. If you prefer a GUI version of rsync, have a look at Back In Time.

http://backintime.le-web.org/

http://lifehacker.com/5212899/back-in-time-does-full-linux-backups-in-one-click


answered 03 May '10, 09:44


beachboy2

The biggest problem with Back In Time is that it does ONLY incremental backups. If I lose my computer, an incremental backup without the first full backup is useless.

(15 May '10, 08:46) martvefun

I would recommend sbackup: http://sourceforge.net/projects/sbackup

It is very easy to configure (provides a configuration GUI), and some of its features are:
- manual or automated backups
- purging of older backups
- a simple interface for configuring when to backup (uses crontab)
- include/exclude files and folders using paths, file types, file size, or regular expressions


answered 25 Apr '10, 16:06


Jazz ♦

I use that for my Ubuntu server, very easy and convenient.

(27 Apr '10, 21:18) atilla

sbackup is now unsupported and doesn't work on the latest Ubuntu. It's been replaced by nssbackup (Not So Simple Backup). I looked at many options before settling on Back In Time. One to keep an eye on is Time Drive (http://www.oak-tree.us/blog/index.php/science-and-technology/time-drive)

(13 May '10, 12:26) PJO

It sounds like rsnapshot should satisfy most of your needs: http://www.linuxquestions.org/linux/articles/Jeremys_Magazine_Articles/Backups_with_rsnapshot

rsnapshot is a Perl-based utility for saving snapshots of local and remote filesystems. It uses rsync and hard links to create multiple, full filesystem backups, yet only requires slightly more disk space than a single snapshot plus incremental archives.
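A minimal rsnapshot.conf along those lines, covering the "keep one backup at one week, two weeks, one month..." wish from the question (note that rsnapshot requires tabs, not spaces, between fields; older versions spell `retain` as `interval`; paths are examples):

```
# /etc/rsnapshot.conf -- fields MUST be tab-separated
snapshot_root	/backup/snapshots/

# rotation levels: 7 dailies, 4 weeklies, 3 monthlies
retain	daily	7
retain	weekly	4
retain	monthly	3

# what to back up (example paths)
backup	/home/user/	localhost/
backup	/etc/	localhost/
```

Cron then drives the rotation, e.g. `rsnapshot daily` each night and `rsnapshot weekly` once a week.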

--jeremy


answered 24 Apr '10, 20:02


jeremy ♦♦

Thank you, but why not use rsync directly?

(25 Apr '10, 11:48) martvefun

rsnapshot is both easier to set up and gets you additional functionality that you'd otherwise need to manually replicate, or go without, if you used rsync alone.

--jeremy

(25 Apr '10, 16:20) jeremy ♦♦