LinuxQuestions.org
Linux - Newbie This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's this is the place!

Old 08-17-2020, 03:35 PM   #1
alex4buba
Member
 
Registered: Jul 2020
Posts: 624

Rep: Reputation: Disabled
Restoring a single folder from an Evolution backup in Ubuntu 20.04


Coming here after 30 years of using Windows and Outlook...
In Outlook, there is a choice of restoring, if need be, an entire backup or a single folder only.

Do we have such a facility in Linux?

Cheers
Alex
 
Old 08-18-2020, 09:12 AM   #2
sgosnell
Senior Member
 
Registered: Jan 2008
Location: Baja Oklahoma
Distribution: Debian Stable and Unstable
Posts: 1,943

Rep: Reputation: 542
It depends on how the backup was made. Linux has both tar and rsync, either of which will make a backup of whatever extent you specify, and restore any part of it, or all of it. Either can be run from the command line or from a cron job. If you want to use a GUI, there is grsync, which is a graphical frontend for rsync and makes things easier for those who are unfamiliar with terminal commands. There are also GUI tools for tar, but I've never used them.

Linux has both GUI and command-line tools for almost everything, and one can choose to use any combination. I tend to use the command line for many things, because it's quicker and easier for me than using a mouse/keyboard combo. But that's just me (and lots of other experienced users).

If the backup was done in Windows, then a restore may be possible from Linux, but it depends on what was used to make it, and the file format of the backup. FWIW, Evolution is a good substitute for Outlook, and is available in most repositories. I think there are migration tools available, but I haven't used Evolution in years, so I'm not really familiar with it. For email and calendar I use Thunderbird, which provides all I need. YMMV.
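For illustration, here is how restoring a single folder from a .tar.gz archive looks on the command line. The directory names below are invented for the demo; on a real Evolution backup you would first run `tar -tzf` to see the actual paths inside the archive:

```shell
# Build a throwaway archive, then restore just one folder from it.
work=$(mktemp -d)
cd "$work"
mkdir -p mail/Inbox mail/Sent
echo "message" > mail/Inbox/msg1
tar -czf backup.tar.gz mail                    # create a sample archive
tar -tzf backup.tar.gz                         # list what is inside
mkdir restore
tar -xzf backup.tar.gz -C restore mail/Inbox   # extract only the Inbox folder
ls restore/mail/Inbox                          # -> msg1
```

The same selective extraction can be done point-and-click from an archive GUI, since both read the same archive format.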
 
Old 08-18-2020, 04:09 PM   #3
alex4buba
Member
 
Registered: Jul 2020
Posts: 624

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by sgosnell View Post
It depends on how the backup was made. Linux has both tar and rsync, either of which will make a backup of whatever extent you specify, and restore any part of it, or all of it. Either can be run from the command line or from a cronjob. If you want to use a GUI, there is grsync, which is a graphical frontend for rsync and makes things easier for those who are unfamiliar with terminal commands. There are also GUI tools for tar, but I've never used them. Linux has both GUI and command-line tools for almost everything, and one can choose to use any combination. I tend to use the command line for many things, because it's quicker and easier for me than using a mouse/keyboard combo. But that's just me (and lots of other experienced users). If the backup was done in Windows, then a restore may be possible from Linux, but it depends on what was used to make it, and the file format of the backup. FWIW, Evolution is a good substitute for Outlook, and is available in most repositories. I think there are migration tools available, but I haven't used Evolution in years, so I'm not really familiar with it. For email and calendar I use Thunderbird, which provides all I need. YMMV.
Hard for me to know if you are a boy or a girl, and you see - I like to know whom I am talking with, if you don't mind.

I was asking about EMAIL backup and restore. I am using Evolution in Linux and made a backup with the option in Evolution which creates a tar.gz file. I wanted to try to restore one of my folders from that backup, not the entire thing. I asked the question in the Evolution forum, but I didn't get an answer.

Now, about my machine backup: I would love to have a GUI-type tool to do it. Should I care about what format it keeps it in? I am looking for a backup system that does a COPY of my folders into another folder on an external drive. I want a COPY, not a compressed file, so that if I need to restore a single file or a single folder, I can sift through the backup, find it and copy it back to my machine.

At this stage, I am not comfortable with terminal commands. I know there are ways to create a script that will do it, but I have no idea how.

Over to you, thanks for your time

Alex
 
Old 08-18-2020, 04:27 PM   #4
sgosnell
Senior Member
 
Registered: Jan 2008
Location: Baja Oklahoma
Distribution: Debian Stable and Unstable
Posts: 1,943

Rep: Reputation: 542
I'm a guy, I guess, actually an old geezer. Binary male, 73 years old. Tired and retired.

You can restore the entire tar.gz file, any folder in it, or a single file. A tar.gz archive serves the same purpose as a .zip file, just in a different format, and files can be extracted from it just as from a .zip. There is a GUI front-end to tar, xarchiver, that makes it relatively easy. The xarchiver interface is very similar to the Windows Zip program.

As I think I posted in another thread, rsync is the tool you want for making a backup of your drive. There is a GUI front-end for that, too, called grsync. Rsync syncs files in one direction - source > destination. The initial sync will be long, but subsequent runs will be much shorter if you set it to only copy newer files. That's the standard way to do backups: no need to transfer gigabytes of files that haven't had a byte changed and probably never will. Tar was the original backup utility (Tape ARchive), but it produces tar.gz files, which are concatenated and compressed like a .zip file. If you want the files backed up just as they are, rsync (or grsync) is the tool for that.

You can run almost any Linux app from the command line and append --help to get a brief explanation of the options and how to use them. Use man, as in "man package-name", for a more verbose explanation. Then you have a better idea of what the choices mean in the GUI.
 
Old 08-18-2020, 06:07 PM   #5
alex4buba
Member
 
Registered: Jul 2020
Posts: 624

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by sgosnell View Post
I'm a guy, I guess, actually an old geezer. Binary male, 73 years old. Tired and retired.

You can restore the entire tar.gz file, any folder, or a single file. A tar.gz file is just a .zip file in a different format, and files can be extracted just as from a .zip. There is a GUI front-end to tar, xarchiver, that makes it relatively easy. The xarchiver interface is very similar to the Windows Zip program.

As I think I posted in another thread, rsync is the tool you want to make a backup of your drive. There is a GUI front-end for that, too, called grsync. Rsync syncs files in one direction - source > destination. The initial sync will be long, but subsequent runs will be much shorter if you set it to only copy newer files. That's the standard way to do backups. No need to transfer gigabytes of files that haven't had a byte changed and probably never will. Tar was the original backup utility (Tape ARchive) but it makes tar.gz files which are compressed and concatenated like a .zip file. If you want the files to be backed up just as they are, rsync (or grsync) is the tool to do that.

You can run almost any Linux app from the command line and append --help, to get a brief explanation of the options and how to use them. Use man, as in "man package-name" for a more verbose explanation. Then you have a better idea of what the choices mean in the GUI.
OK, downloaded, installed and ran a session, Source->Destination, and the result is exactly what I wanted, but:
1) Do I need to create a separate session for each folder I want to back up?
2) Is there an option to automate this, say once a day at 2:00pm?

If I need to create a script to do the automation, I have no idea how to even start.
I tried responding to your direct email and it got rejected; the error was:

THE ACCOUNT <xxxx@lavabit.com> HAS BEEN LOCKED FOR INACTIVITY

Cheers
Alex

Last edited by alex4buba; 08-18-2020 at 06:12 PM.
 
Old 08-18-2020, 07:05 PM   #6
sgosnell
Senior Member
 
Registered: Jan 2008
Location: Baja Oklahoma
Distribution: Debian Stable and Unstable
Posts: 1,943

Rep: Reputation: 542
Lavabit went out of business, AFAIK. I used to have an account there, but got a notice they were folding. Oh well, I suppose I need to update my profile.

What are you using, bare rsync or grsync? Rsync works perfectly doing what you want, as a cron job. You should read up on cron. But the basics would be
Code:
crontab -e
that edits the crontab, using the default editor. I don't know what that is in Ubuntu; there are several. I like nano.
In the editor, create a line similar to the following, substituting your own system info, folders, etc.
Code:
0 14 * * * rsync -auz /source-directory/ /destination
That will run rsync every day at 14:00, or 2pm. For 2am use 2 instead of 14. Easier would be @daily instead of the numbers and asterisks, but that runs every night at midnight; if you want it done at another time, use the syntax I used. The parameters a, u and z mean archive (preserve everything, recursing into subdirectories), only update changed files, and compress the data during transfer to save time. Run rsync --help for more detailed information. Run 'man cron' for more information about cron, the daemon which reads the crontab files and executes them. You can also find more detailed help on the web.

A good rsync tutorial is here: https://phoenixnap.com/kb/rsync-command-linux-examples. Putting "linux cron tutorial" into my search engine returns more hits than I care to read, so just pick one.

You do have to run separate sessions for separate folders, but look at the --exclude option, which could allow you to use one session for multiple subdirectories under one upper-level directory, excluding ones you don't want to back up. I don't know your system or what you want to back up.

Oh, and be careful with the trailing slashes on the source and destination. Read that part carefully, or you may not get exactly what you want.
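To make the trailing-slash point concrete, here is the difference demonstrated with throwaway directories (names invented for the demo):

```shell
# Trailing-slash rule in rsync: with a slash, only the contents are copied.
t=$(mktemp -d)
mkdir -p "$t/folder1" "$t/no_slash" "$t/slash"
touch "$t/folder1/a.txt"
rsync -au "$t/folder1"  "$t/no_slash/"  # no slash: creates no_slash/folder1/a.txt
rsync -au "$t/folder1/" "$t/slash/"     # slash: creates slash/a.txt directly
```

So to keep each backed-up folder as its own subdirectory at the destination, leave the trailing slash off the source.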

Last edited by sgosnell; 08-18-2020 at 07:12 PM.
 
Old 08-18-2020, 09:23 PM   #7
alex4buba
Member
 
Registered: Jul 2020
Posts: 624

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by sgosnell View Post
Lavabit went out of business, AFAIK. I used to have an account there, but got a notice they were folding. Oh well, I suppose I need to update my profile.

What are you using, bare rsync, or grsync? Rsync works perfectly doing what you want, as a cronjob. You should read up on cron. But the basics would be
Code:
crontab -e
that edits the crontab, using the default bash editor. I don't know what that is in Ubuntu, there are several. I like nano.
In the editor, create a line similar to the following, substituting your own system info, folders, etc.
Code:
* 14 * * * rsync -auz /source-directory/ /destination
That will run rsync every day at 14:00, or 2pm. For 2AM use 2 instead of 14. Easier would be @daily instead of the asterisks and numbers, but that happens every night at midnight. If you want it done at another time, use the syntax I used. The parameters aru mean archive (preserve everything), recurse into subdirectories, only update changed files, and compress the data during transfer to save time and space. Run rsync --help for more detailed information. Run 'man cron' for more information about cron, the utility which reads the crontab files and executes them. You can also find more detailed help on the web.

A good rsync tutorial is here: https://phoenixnap.com/kb/rsync-command-linux-examples Putting linux cron tutorial into my search engine returns more hits than I care to read, so just pick one.

You do have to run separate sessions for separate folders, but look at the --exclude option, which could allow you to use one session for multiple subdirectories under one upper-level directory, excluding ones you don't want to back up. I don't know your system or what you want to back up.

Oh, and be careful with the trailing slashes on the source and destination. Read that part carefully, or you may not get exactly what you want.
At this stage, I try to stay with a GUI as much as I can. When I am done setting up a working machine, I will no doubt venture into terminal mode.

I installed Grsync, and it seems I have two major problems:
1) It seems I have to create a separate task for each folder, unless I move all my folders into a TOP folder and then have a single task for that.
2) I do not find a way to automate it...

Maybe I can do my configuration and then, with your help, execute a cron job?

Cheers
Alex
 
Old 08-18-2020, 09:32 PM   #8
sgosnell
Senior Member
 
Registered: Jan 2008
Location: Baja Oklahoma
Distribution: Debian Stable and Unstable
Posts: 1,943

Rep: Reputation: 542
GUIs can't be automated. At least I don't know of a way, not in Linux nor in Windows. A cron job is easy, as long as you know the details of what you want to do. Computers are not smart, and they can't read your mind. They only do exactly what you tell them to do, exactly the way you tell them to do it. In order to set up a cron job, you have to know the time(s) you want it run, and what you want done. For your use, that means the folders to be backed up, and the destination(s) for them. How you want it done, and what feedback you want. It would be useful to know where in the filesystem structure all the locations are, so fully-qualified path names would be necessary.
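As a sketch of what a crontab line with fully-qualified paths might look like (the folder and drive names are placeholders, not the OP's actual paths; edit the crontab with `crontab -e`):

```
# minute hour day-of-month month day-of-week  command
0 20 * * * /usr/bin/rsync -au /home/alex/folder1 /media/alex/WD_Extern/DTBackup/
```

This would run the backup at 20:00 (8pm) every day. Cron runs jobs with a minimal environment, which is why spelling out absolute paths for both the command and the folders is the safe habit.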
 
Old 08-18-2020, 10:20 PM   #9
alex4buba
Member
 
Registered: Jul 2020
Posts: 624

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by sgosnell View Post
GUIs can't be automated. At least I don't know of a way, not in Linux nor in Windows. A cron job is easy, as long as you know the details of what you want to do. Computers are not smart, and they can't read you mind. They only do exactly what you tell them to do, exactly the way you tell them to do it. In order to set a cron job, you have to know the time(s) you want it done, and what you want to do. For your use, that means the folders to be backed up, and the destination(s) for them. How you want it done, and what feedback you want. It would be useful to know where in the filesystem structure all the locations are, so fully-qualified domain names would be necessary.
OK, from your previous feedback: 0 14 * * * rsync -auz /source-directory/ /destination. I will assume something like this, copied from my current grsync:

0 14 * * * rsync -auz /home/alex/folder1 /media/alex/WD_Extern/DTBackup
0 14 * * * rsync -auz /home/alex/folder2 /media/alex/WD_Extern/DTBackup
0 14 * * * rsync -auz /home/alex/folder3 /media/alex/WD_Extern/DTBackup
etc...

inside a file called, for example, mybackup.sh

Am I right so far?

folder1, folder2 and folder3 are without the / at the end, because I wanted them copied into the destination including the folders themselves, not just their contents...

If so far I am correct:

1) Where on my machine do I place the file?
2) How do I set it to start each time I log in?

I don't need any feedback; I check the backup once a week, so I will see if there was an error during the COPY (yes, I don't want any compression). But I would like to keep several - say 5 - generations of backup, for safety.


Cheers
Alex
 
Old 08-18-2020, 10:49 PM   #10
sgosnell
Senior Member
 
Registered: Jan 2008
Location: Baja Oklahoma
Distribution: Debian Stable and Unstable
Posts: 1,943

Rep: Reputation: 542
Rsync doesn't keep multiple copies, just one. The initial run copies everything, but after that it just sends changed files. It synchronizes them, thus the name, rsync. You could get multiple copies by syncing to a different destination each day, then rotating around, but that way lies madness. The destination gets synced every day, so if the destination gets borked, it will be completely restored on the next run. Rsync is very robust, and checks to ensure that the files are valid after copying.

My suggestion is to use a cron job to schedule the backups. You do not create a separate file, unless you just want to call one script to do everything. When you run
Code:
crontab -e
a file is created which the cron daemon reads and executes. You don't need to be logged in; it runs in the background, as long as the computer is powered on. I schedule my cron jobs for the middle of the night when nothing else is happening, so they won't bog things down when I'm working. I have some that execute at midnight, and some at 3am. I don't log out or power the computer off; I keep it on full time, although it does go into sleep mode. If you turn your computer off every time you finish, this is going to be a problem: a plain cron job that comes due while the machine is off is simply skipped. Please read up on running cron. There are many, many websites which explain it.
 
Old 08-18-2020, 11:01 PM   #11
alex4buba
Member
 
Registered: Jul 2020
Posts: 624

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by sgosnell View Post
Cron doesn't keep multiple copies, just one. The initial run copies everything, but after that it just sends changed files. It synchronizes them, thus the hame, rsync. You could get multiple copies by syncing to a different destination each day, then rotate around, but that way lies madness. The destination gets synced every day, so if the destination gets borked, it will be completely restored the next run. Rsync is very robust, and checks to insure that the files are valid after copying.

My suggestion is to use a cron job to schedule the backups. You do not create a separate file, unless you just want to call one script to do everything. When you run
Code:
crontab -e
a file is created which the cron daemon reads and executes. You don't need to be logged in, it runs in the background, as long as the computer is powered. I schedule my cron jobs for the middle of the night when nothing else is happening, so they won't bog things down when I'm working. I have some that execute at midnight, and some at 3am. I don't log out or power the computer off, I keep it on full time, although it does go into sleep mode. If you turn your computer off every time you finish, this is going to be a problem. If rsync runs 3 times every time you boot the computer, it's going to be very slow getting ready. Please, read up on running cron. There are many, many websites which explain it.
Hello again,

I am a very square kind of guy... At 8:00pm I sit after dinner to watch the news and a movie. Hence, my backup job can be invoked at that time; I will not be sitting at the computer then.
I do turn it off before going to sleep. Turning it on in the morning should not invoke the cron job, since it is scheduled for 8pm, right?
The reason I want to keep several backups is to avoid a bunch of bad files overriding the previous backup in the destination.

So, is my suggested setup for the .sh file correct? I read the tutorial about rsync, but it doesn't tell me how to include 20 folders from source to destination in one command.

Nor does it tell me how to automate it.

It is late now at your place, go to sleep

Thanks again
Alex
 
Old 08-19-2020, 09:03 AM   #12
sgosnell
Senior Member
 
Registered: Jan 2008
Location: Baja Oklahoma
Distribution: Debian Stable and Unstable
Posts: 1,943

Rep: Reputation: 542
You can run your cron jobs at any time you choose. Cron never sleeps.

If you want to maintain multiple snapshot backups, tar is the only sensible way to do it. The internet, and almost all business servers, run on Linux. Those businesses have huge amounts of money invested in their computer files, and many of them back up with tar. It has been doing the job reliably for many years. If you want to try to use cp (copy) to transfer multiple directories to multiple folders multiple times, you're on your own. I won't even try to set that up. That way lies madness.

You can use the GUI utility xarchiver to check any tarball, look at every individual file in it, and restore the whole thing, individual folders, or individual files, to the location of your choice. It's easy to keep track of archives by date, not so easy for a dozen directory trees.

I'm willing to help you set up either rsync or tar, but trying to manage what you are suggesting is more than I'm willing to do. What you need and what you want may not be the same. IMO you need a backup of your entire home tree, every day, either through rsync to individual files in one place, or by tar, saving multiple separate backups. Trying to do individual subdirectories under it is more work than it's worth. I suggest one daily backup, run through cron, at whatever time you choose, of /home/alex, excluding nothing.
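One way to get rotating tar generations, one per weekday, is to put the day name in the archive file name, so each weekday's archive is overwritten a week later. The paths below are stand-ins created for the demo:

```shell
# Sketch: one tar archive per weekday, overwritten on the same day next week.
w=$(mktemp -d)                                  # stands in for the real drive
mkdir -p "$w/home/alex/folder1" "$w/DTBackup"
echo data > "$w/home/alex/folder1/file"
# $(date +%A) expands to the day name, e.g. "Sunday", giving 7 rotating archives.
tar -czf "$w/DTBackup/home-$(date +%A).tar.gz" -C "$w/home" alex
ls "$w/DTBackup"                                # e.g. home-Sunday.tar.gz
```

Run daily from cron, this yields seven generations with no extra bookkeeping; the -C flag keeps the paths inside the archive relative, so a restore can go anywhere.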
 
Old 08-19-2020, 03:53 PM   #13
alex4buba
Member
 
Registered: Jul 2020
Posts: 624

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by sgosnell View Post
You can run your cron jobs at any time you choose. Cron never sleeps.

If you want to maintain multiple snapshot backups, tar is the only sensible way to do it. The internet, and almost all business servers, run on Linux. Those businesses have huge amounts of money invested in their computer files, and they back it up with tar. It has been doing that reliably for many years. If you want to try to use cp (copy) to transfer multiple directories to multiple folders multiple times, you're on your own. I won't even try to set that up. That way lies madness. You can use the GUI utility xarchiver to check any tarball and look at every individual file in it, and restore the whole thing, individual folders, or individual files, to the location of your choice. It's easy to keep track of archives by date, not so easy for a dozen directory trees. I'm willing to help you set up either rsync or tar, but with trying to manage what you are suggesting is more than I'm willing to do. What you need and what you want may not be the same. IMO you need a backup of your entire home tree, every day, either through rsync to individual files in one place, or by tar, saving multiple separate backups. Trying to do individual subdirectories under it is more work than it's worth. I suggest one daily backup, run through cron, at whatever time you choose, of /home/alex, excluding nothing.
Hello again and good afternoon to you. Here is a "picture" of my case:

Source :

/home/alex/folder1
/home/alex/folder2
/home/alex/folder3
/home/alex/...

A total of 10 folders currently

My destination is to an external SD 512GB card, in it I have a folder

/media/alex/New-SD-512/DTBackup

I want all the above 10 folders, with their subfolders and files, to be COPIED (uncompressed), once a day, into the folder DTBackup on the SD card. To keep versions, I would like to have 7 jobs, one for each day of the week, so that the Sunday backup will override the previous Sunday's, etc...

This job will run automatically, once a day at 8:00pm. If for some reason my computer is NOT ON at that time, that backup will be skipped.

Is that too big a job for you to set up the details for me? I hope not. I will replace the folder names myself, of course.

Thanks in advance

Alex
 
Old 08-19-2020, 04:17 PM   #14
scasey
LQ Veteran
 
Registered: Feb 2013
Location: Tucson, AZ, USA
Distribution: CentOS 7.9.2009
Posts: 5,735

Rep: Reputation: 2212
Check out rsnapshot. It will provide the automation you seek, and it creates and maintains multiple generations as you describe them. It uses rsync "under the hood". The documentation is extensive.
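For the curious, an rsnapshot configuration doing roughly what the OP describes might contain lines like the following. This is only a sketch: the paths are placeholders, rsnapshot requires TAB characters (not spaces) between fields, and older versions spell the `retain` directive `interval`:

```
snapshot_root	/media/alex/New-SD-512/DTBackup/
retain	daily	7
backup	/home/alex/	localhost/
```

A single daily cron line (`rsnapshot daily`) then keeps seven rotating, uncompressed, browsable copies, using hard links so unchanged files take almost no extra space.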
 
Old 08-19-2020, 04:37 PM   #15
sgosnell
Senior Member
 
Registered: Jan 2008
Location: Baja Oklahoma
Distribution: Debian Stable and Unstable
Posts: 1,943

Rep: Reputation: 542
I was unaware of rsnapshot, but it does look like exactly what the OP is after. I'll have to give it a try.
 
  


Tags
backup, restore


