Simple way to back up a folder
What's a good way to back up a folder?
Looking to make monthly backups. Simply doing this:
Code:
cp -r /path/to/folder/* /backup/folder/
Then there's the problem of monitoring every folder in case I move a folder. So, cp -r isn't going to work. Another solution is to have two backup hard drives, and cp -r onto alternating drives every month. It's gigabytes of data, and most of it is already in compressed format. So, tar and similar commands only increase the amount of time required. |
Just to satisfy my curiosity, what is wrong with using a real backup program? |
Quote:
rsync -av --delete /source/ /target/
Perhaps this will give you some inspiration for incremental backups using rsync and hardlinks: https://wiki.alienbase.nl/doku.php?id=linux:rsnapshot |
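A minimal sketch of the hard-link snapshot idea behind that link, using rsync's --link-dest. The function name, timestamp format and directory layout here are my own invention, not from the wiki page:

```shell
#!/bin/sh
# snapshot SRC DST: create DST/<timestamp> as a full-looking copy of SRC,
# hard-linking files that are unchanged since the previous snapshot so
# each run only stores the deltas on disk.
snapshot() {
    src=$1
    dst=$2
    stamp=$(date +%Y-%m-%d_%H%M%S)
    mkdir -p "$dst"
    # Most recent existing snapshot, if any (directories sort by timestamp).
    last=$(ls -1d "$dst"/*/ 2>/dev/null | tail -n 1)
    if [ -n "$last" ]; then
        rsync -a --delete --link-dest="$last" "$src/" "$dst/$stamp/"
    else
        rsync -a --delete "$src/" "$dst/$stamp/"
    fi
}
```

Each snapshot directory looks like a complete backup, but unchanged files cost no extra space because they are hard links into the previous snapshot.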
I suggest `restic`. Very happy with it. Incremental backup is amazingly fast!
For Slackware, see http://slackbuilds.org/repository/15...?search=restic You can find a quick-start intro and plenty of documentation at https://restic.net/ I should add that backups are fully encrypted, so there's no confidentiality risk if you back up to a removable USB stick or hard disk. |
Also you would not have to do both onsite and offsite backups on the same day. You could stagger them two weeks apart. |
tar has an option to update only changed files (-u / --update, uncompressed archives only). 7zip and others have an "only add if changed" option as well. Some don't support saving permissions and users/groups, so consider that. cpio and rsync are options too. |
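A small sketch of tar's append-if-newer behaviour mentioned above. Note that -u only works on an uncompressed archive (you can't update a .tar.gz in place), and the archive grows over time because the old copies stay in it:

```shell
#!/bin/sh
cd "$(mktemp -d)"            # work in a scratch directory

mkdir -p demo
echo v1 > demo/file.txt
tar -cf backup.tar demo      # initial archive

sleep 1
echo v2 > demo/file.txt      # the file changes later

tar -uf backup.tar demo      # appends only files newer than the archived copy
tar -tf backup.tar           # file.txt now appears twice; on extraction,
                             # the last (newest) copy wins
```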
Creating something like a new tar file for every backup has the advantage that you will have several backups of different ages. If you accidentally break something in the original source and make a new backup, you will at least still have some older backup lying around.
The same can also be accomplished with rsync to a single directory structure, if the target resides on a file system with some kind of snapshot functionality. Whenever someone is considering different backup solutions I usually point to this good old set of pages with things to consider: http://www.taobackup.com/ regards Henrik |
rsync mo better
Code:
rsync -av --progress /path/to/folder/* /backup/folder/
I sometimes `alias backup='rsync -av --progress'` to ease the wear and tear of the old fingerbones. Incremental built in. I use cp for monthly image backups, rsync for dailies and weeklies. |
https://www.tecmint.com/linux-system-backup-tools/ provides a list of the best 23, but not the other 104. |
If you want file versioning, then I can recommend rsnapshot: https://rsnapshot.org/. It's simple and robust, and hasn't disappointed me once since I started using it. It optimises space usage by using a combination of rsync, hard links and deltas. For example: I have a 62 GB data set, for which I started using rsnapshot almost 2 years ago. At this point I have 7 daily, 4 weekly, 12 monthly and 1 annual backups in 77 GB of hard drive space. |
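A minimal rsnapshot.conf sketch matching that retention scheme. The snapshot root and backup path are placeholders, and note that rsnapshot requires tabs (not spaces) between fields:

```
config_version	1.2
snapshot_root	/backup/snapshots/

retain	daily	7
retain	weekly	4
retain	monthly	12
retain	yearly	1

backup	/home/me/data/	localhost/
```

The retain levels are rotated by invoking rsnapshot from cron with the level name, e.g. `rsnapshot daily` every night and `rsnapshot weekly` once a week.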
export bkpdate="`date +%y%m%d'_'%H%M%S`"
cd the_dir_to_backup
tar -zcvf backup_${bkpdate}.tar.gz *
Copy the resulting file to some other media. A backup kept on the same media is worth nothing. |
There are lots of ways of using tar, but I prefer: tar -cvf /backups/wibble.bkp.tar -C /somewhere/wibble . Use of -C and '.' makes it easier to restore to an alternate location while still using absolute paths on the command line. For "archival backups" I use GNU tar listed-incrementals. For disaster recovery (i.e. latest point-in-time snapshot) backups I use rsync. I don't like to use any backup tools that would need installing before I can restore from them. By sticking with tar and rsync I know they will already be present. |
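A minimal sketch of the GNU tar listed-incremental approach mentioned above. The directory and file names are placeholders; the key idea is that the .snar snapshot file records file state, so each archive made against it contains only what changed since the previous run:

```shell
#!/bin/sh
cd "$(mktemp -d)"            # work in a scratch directory

mkdir -p data
echo one > data/a.txt

# Level 0 (full) backup; also creates the state file backup.snar.
tar --listed-incremental=backup.snar -cf level0.tar data

sleep 1
echo two > data/b.txt

# Level 1 backup against the same .snar: contains only the new file.
tar --listed-incremental=backup.snar -cf level1.tar data
```

To restore, extract the archives in order with `--listed-incremental=/dev/null` (or `--incremental`) so tar applies them as increments.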