[SOLVED] Restoring a single folder from an Evolution backup in Ubuntu 20.04
Coming here from 30 years of using Windows and Outlook...
In Outlook, there is a choice of restoring, if need be, either an entire backup or a single folder only.
It depends on how the backup was made. Linux has both tar and rsync, either of which will make a backup of whatever extent you specify, and restore any part of it, or all of it. Either can be run from the command line or from a cronjob.

If you want to use a GUI, there is grsync, which is a graphical frontend for rsync and makes things easier for those who are unfamiliar with terminal commands. There are also GUI tools for tar, but I've never used them. Linux has both GUI and command-line tools for almost everything, and one can choose to use any combination. I tend to use the command line for many things, because it's quicker and easier for me than using a mouse/keyboard combo. But that's just me (and lots of other experienced users).

If the backup was done in Windows, then a restore may be possible from Linux, but it depends on what was used to make it, and the file format of the backup.

FWIW, Evolution is a good substitute for Outlook, and is available in most repositories. I think there are migration tools available, but I haven't used Evolution in years, so I'm not really familiar with it. For email and calendar I use Thunderbird, which provides all I need. YMMV.
Hard for me to know if you are a boy or a girl, and you see, I like to know who I am talking with, if you don't mind.
I was asking about EMAIL backup and restore. I am using Evolution in Linux and made a backup using the option in Evolution that creates a tar.gz file. I wanted to try to restore one of my folders from that backup, not the entire thing. I asked the question in the Evolution forum, but I didn't get an answer.
Now, to my machine backup: I would love to have a GUI-type tool to do it. Should I care about what format it keeps it in? I am looking for a backup system that makes a COPY of my folder into another folder on an external drive. I want a COPY, not a compressed file, so that if I need to restore a single file or a single folder, I can sift through the backup, find it, and copy it back to my machine.
At this stage, I am not comfortable with terminal commands. I know there are ways to create a script that will do it, but I have no idea how.
I'm a guy, I guess, actually an old geezer. Binary male, 73 years old. Tired and retired.
You can restore the entire tar.gz file, any folder, or a single file. A tar.gz file is a compressed archive that serves the same purpose as a .zip file, and files can be extracted from it just as from a .zip. There is a GUI front-end to tar, xarchiver, that makes it relatively easy. The xarchiver interface is very similar to the Windows Zip program.
As I think I posted in another thread, rsync is the tool you want to make a backup of your drive. There is a GUI front-end for that, too, called grsync. Rsync syncs files in one direction - source > destination. The initial sync will be long, but subsequent runs will be much shorter if you set it to only copy newer files. That's the standard way to do backups: no need to transfer gigabytes of files that haven't had a byte changed and probably never will.

Tar was the original backup utility (Tape ARchive), but it produces archives which are concatenated and, with gzip, compressed, much like a .zip file. If you want the files to be backed up just as they are, rsync (or grsync) is the tool to do that.
You can run almost any Linux app from the command line and append --help to get a brief explanation of the options and how to use them. Use man, as in "man command-name", for a more verbose explanation. Then you have a better idea of what the choices mean in the GUI.
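If you ever want to try it from the command line, here is a self-contained sketch (all names are made up for the demo; for a real Evolution backup, list the archive with tar -tzf first, since the stored paths vary by Evolution version):

```shell
# Build a small demo archive, then restore just one folder from it.
# All names here are made up for the demo.
mkdir -p demo/mail/Inbox demo/mail/Sent
echo "message 1" > demo/mail/Inbox/1.txt
echo "message 2" > demo/mail/Sent/2.txt
tar -czf backup.tar.gz -C demo .     # like Evolution's backup option

# List the contents to find the folder you want:
tar -tzf backup.tar.gz

# Extract ONLY the Inbox folder, into a separate location:
mkdir -p restore
tar -xzf backup.tar.gz -C restore ./mail/Inbox
ls restore/mail/Inbox                # shows only 1.txt
```

The same member-name trick works on the real backup once you know the stored path of the folder you want.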
OK, I downloaded it, installed it, and ran a session, Source -> Destination, and the result is exactly what I wanted, but:
1) Do I need to create a separate session for each folder I want to back up?
2) Is there an option to automate this, say, once a day at 2:00 pm?
If I need to create a script to do the automation, I have no idea how to even start.
I tried responding to your direct email, and it got rejected. The error was:
THE ACCOUNT <xxxx@lavabit.com> HAS BEEN LOCKED FOR INACTIVITY
Lavabit went out of business, AFAIK. I used to have an account there, but got a notice they were folding. Oh well, I suppose I need to update my profile.
What are you using, bare rsync, or grsync? Rsync works perfectly doing what you want, as a cronjob. You should read up on cron. But the basics would be
Code:
crontab -e
that edits the crontab, using the default text editor. I don't know what that is in Ubuntu - there are several - but I like nano.
In the editor, create a line similar to the following, substituting your own system info, folders, etc.:
Code:
0 14 * * * rsync -auz /source-directory/ /destination
That will run rsync every day at 14:00, or 2 pm. For 2 am, use 2 instead of 14. Easier would be @daily instead of the numbers and asterisks, but that runs every night at midnight; if you want it done at another time, use the syntax I used. The parameters -auz mean archive (preserve everything and recurse into subdirectories), only update changed files, and compress the data during transfer to save time. Run rsync --help for more detailed information, and 'man cron' for more information about cron, the utility which reads the crontab files and executes them. You can also find more detailed help on the web.
You do have to run separate sessions for separate folders, but look at the --exclude option, which could allow you to use one session for multiple subdirectories under one upper-level directory, excluding ones you don't want to back up. I don't know your system or what you want to back up.
Oh, and be careful with the trailing slashes on the source and destination. Read that part carefully, or you may not get exactly what you want.
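If in doubt, you can see the trailing-slash rule for yourself with a throwaway test (all names here are made up):

```shell
# Demonstrate rsync's trailing-slash rule with throwaway folders.
mkdir -p src/sub dest1 dest2
echo hello > src/sub/file.txt

# WITH a trailing slash on the source, rsync copies the CONTENTS
# of src into dest1 (you get dest1/sub/...):
rsync -a src/ dest1/

# WITHOUT a trailing slash, rsync copies the folder src itself
# into dest2 (you get dest2/src/sub/...):
rsync -a src dest2/

ls dest1      # -> sub
ls dest2      # -> src
```

The destination's trailing slash, by contrast, makes no difference for an existing directory.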
At this stage, I try to stay with the GUI as much as I can. When I am done setting up a working machine, I will no doubt venture into Terminal mode.
I installed Grsync, and it seems I have two major problems:
1) It seems I have to create a separate task for each folder, unless I move all my folders into one TOP folder and then have a single task for that.
2) I can't find a way to automate it...
Maybe I can do my configuration, and then with your help execute a cron job?
GUIs can't be automated - at least I don't know of a way, not in Linux nor in Windows. A cron job is easy, as long as you know the details of what you want to do. Computers are not smart, and they can't read your mind. They only do exactly what you tell them to do, exactly the way you tell them to do it. In order to set up a cron job, you have to know the time(s) you want it done and what you want done. For your use, that means the folders to be backed up, the destination(s) for them, how you want it done, and what feedback you want. It would be useful to know where in the filesystem structure all the locations are, so fully qualified path names would be necessary.
OK, from your previous feedback: 0 14 * * * rsync -auz /source-directory/ /destination. I will assume something like this, copied from my current grsync:
folder1, folder2, and folder3 are without the / at the end, because I wanted them to be copied into the destination including the folders themselves, not just the contents of those folders...
If so far I am correct:
1) Where on my machine do I place the file?
2) How do I set it to get started each time I log in?
I don't need any feedback; I check the backup once a week, so I'll see if there was an error during the COPY (yes, I don't want any compression). But I would like to keep several generations of backup - say 5 - for safety.
Rsync doesn't keep multiple copies, just one. The initial run copies everything, but after that it just sends changed files. It synchronizes them, thus the name, rsync. You could get multiple copies by syncing to a different destination each day, then rotating around, but that way lies madness. The destination gets synced every day, so if the destination gets borked, it will be completely restored the next run. Rsync is very robust, and checks to ensure that the files are valid after copying.
My suggestion is to use a cron job to schedule the backups. You do not create a separate file, unless you just want to call one script to do everything. When you run
Code:
crontab -e
a file is created which the cron daemon reads and executes. You don't need to be logged in; it runs in the background, as long as the computer is powered.

I schedule my cron jobs for the middle of the night when nothing else is happening, so they won't bog things down when I'm working. I have some that execute at midnight, and some at 3am. I don't log out or power the computer off; I keep it on full time, although it does go into sleep mode.

If you turn your computer off every time you finish, this is going to be a problem. If rsync runs 3 times every time you boot the computer, it's going to be very slow getting ready. Please, read up on running cron. There are many, many websites which explain it.
Hello again,
I am a very square kind of guy... At 8:00 pm I sit down after dinner to watch the news and a movie. Hence, my backup job can be invoked at that time; I will not be sitting at the computer then.
I do turn it off before going to sleep. Turning it on in the morning should not invoke the cron job, since it is scheduled for 8 pm, right?
The reason I want to keep several backups is to avoid a bunch of bad files overwriting the previous good backup in the destination.
So, is my suggested setup for the .sh file correct? I read the tutorial about rsync, but it doesn't tell me how to include 20 folders from source to destination in one command.
You can run your cron jobs at any time you choose. Cron never sleeps.
If you want to maintain multiple snapshot backups, tar is the only sensible way to do it. The internet, and almost all business servers, run on Linux. Those businesses have huge amounts of money invested in their computer files, and they back them up with tar. It has been doing that reliably for many years.

If you want to try to use cp (copy) to transfer multiple directories to multiple folders multiple times, you're on your own. I won't even try to set that up. That way lies madness. You can use the GUI utility xarchiver to check any tarball and look at every individual file in it, and restore the whole thing, individual folders, or individual files, to the location of your choice. It's easy to keep track of archives by date, not so easy for a dozen directory trees.

I'm willing to help you set up either rsync or tar, but trying to manage what you are suggesting is more than I'm willing to do. What you need and what you want may not be the same. IMO you need a backup of your entire home tree, every day, either through rsync to individual files in one place, or by tar, saving multiple separate backups. Trying to do individual subdirectories under it is more work than it's worth. I suggest one daily backup, run through cron, at whatever time you choose, of /home/alex, excluding nothing.
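To make that concrete, here is a minimal self-contained sketch of dated tar backups with pruning (the demo paths stand in for the real ones, e.g. /home/alex and the SD card mount):

```shell
# Self-contained sketch: one dated tar.gz per run, pruned to the
# 5 newest. Substitute your real source (e.g. /home/alex) and
# destination (e.g. /media/alex/New-SD-512/DTBackup).
SRC=demo-home
DEST=demo-backups
mkdir -p "$SRC" "$DEST"
echo data > "$SRC/file.txt"

# One archive per day, named by date (YYYY-MM-DD):
tar -czf "$DEST/home-$(date +%F).tar.gz" \
    -C "$(dirname "$SRC")" "$(basename "$SRC")"

# Keep only the 5 newest archives:
ls -1t "$DEST"/home-*.tar.gz | tail -n +6 | xargs -r rm --
```

Run daily from cron, this gives five browsable generations, each openable in xarchiver.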
Hello again, and good afternoon to you. Here is a "picture" of my case:
My destination is an external 512 GB SD card, on which I have a folder
/media/alex/New-SD-512/DTBAckup
I want all the above 10 folders, and their subfolders and files, to be COPIED (uncompressed), once a day, into the folder DTBackup on the SD card. To keep versions, I would like to have 7 jobs, one for each day of the week, so that the Sunday backup overwrites the previous Sunday's, etc...
This job will run automatically, once a day at 8:00 pm. If for some reason my computer is NOT ON at that time, that backup will be skipped.
Is that too big a job for you to set up the details for me? I hope not. I will replace the folder names myself, of course.
Check out rsnapshot. It will provide the automation you seek, and it creates and maintains multiple generations as you describe them. It uses rsync "under the hood". The documentation is extensive.
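For a rough idea, the heart of an rsnapshot setup for your scheme would be a few lines in /etc/rsnapshot.conf, plus one cron line. This is a sketch only - check the documentation, and note that rsnapshot requires TABs, not spaces, between fields:

```
# /etc/rsnapshot.conf (fragment, fields separated by TABs)
snapshot_root	/media/alex/New-SD-512/DTBackup/
retain	daily	7
backup	/home/alex/	localhost/

# crontab line to run it at 8 pm:
# 0 20 * * * /usr/bin/rsnapshot daily
```

Unchanged files are hard-linked between the seven daily.N snapshots, so the extra generations take little additional space, and each snapshot is a plain browsable folder.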