Mirror a folder to a second (PGP encrypted) folder
Hello,
I would like to back up my Documents folder to the cloud, but encrypted.
Is there any way to mirror two folders?
/home/user/Documents -> not encrypted
/home/user/Nextcloud/Documents -> encrypted
I will work in the non-encrypted Documents folder, and the system would have to encrypt the files and mirror them to the Nextcloud folder.
Obviously, the encryption must be done in my local folder, and then the encrypted file has to be moved to the Nextcloud folder.
Depending on how you set up this folder, more details might need to be known.
It should be, or can be, unlocked and viewable at some point by the user, via text or GUI. One could make a cron job or script to take their work and then rsync it to this folder, I'd think.
1. Compress the folder to a tar.gz file.
2. Encrypt the tar.gz file.
3. Transfer to your cloud storage.
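The three steps above could be sketched roughly as follows. This is a hedged, self-contained example: a temp directory stands in for the real home directory, the paths and passphrase are placeholders, and symmetric encryption (`-c`) is used only so the sketch runs without a keyring; the thread's actual goal is public-key encryption with `--encrypt --recipient <key>`.

```shell
#!/bin/sh
set -e
BASE=$(mktemp -d)                            # stands in for $HOME
mkdir -p "$BASE/Documents" "$BASE/Nextcloud"
echo "sample" > "$BASE/Documents/note.txt"   # sample content so the sketch has input

# 1. Compress the folder to a tar.gz file
tar -czf "$BASE/documents.tar.gz" -C "$BASE" Documents

# 2. Encrypt the archive (swap -c for --encrypt --recipient <key> in real use)
gpg --batch --yes --pinentry-mode loopback \
    --passphrase "example" -c "$BASE/documents.tar.gz"

# 3. Transfer the encrypted archive to the cloud-synced folder,
#    then remove the plaintext intermediate
mv "$BASE/documents.tar.gz.gpg" "$BASE/Nextcloud/"
rm "$BASE/documents.tar.gz"
```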
Thanks for your reply. I was looking for per-file encryption for easier management, but this could be good as a backup solution. I may also use deja-dup to create an incremental backup.
Quote:
Originally Posted by jefro
Depending on how you set up this folder, more details might need to be known.
It should be, or can be, unlocked and viewable at some point by the user, via text or GUI. One could make a cron job or script to take their work and then rsync it to this folder, I'd think.
I would like to have an encrypted copy of each file so I can eventually restore a single file without downloading the whole backup, even from another computer. If you have any better idea, I'm all ears.
Sorry, I thought you wanted the entire directory as one. But doing the individual files shouldn't be difficult, either. You would need a simple script to loop through the directory encrypting each file and transferring it. I don't know the makeup of your file list, but some version of gpg --encrypt-files would seem to fit the bill. Assuming you have no gpg files in the directory to begin with, and assuming you will use scp with public key authentication, and assuming you are using bash, a simple script might be
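The script body itself was not preserved in this thread; the following is only a hypothetical sketch of the loop described (encrypt each file, transfer it). All paths are placeholders, and symmetric encryption plus a local copy are used so the sketch runs standalone; for the setup described in the post, swap in `--encrypt --recipient <key>` and `scp` with public-key authentication.

```shell
#!/bin/sh
set -e
SRC=$(mktemp -d)     # stands in for /home/user/Documents
DEST=$(mktemp -d)    # stands in for the transfer destination
echo "sample" > "$SRC/example.txt"

for f in "$SRC"/*; do
    [ -f "$f" ] || continue                 # top-level files only
    gpg --batch --yes --pinentry-mode loopback \
        --passphrase "example" -c "$f"      # writes "$f.gpg"
    mv "$f.gpg" "$DEST/"                    # transfer step (scp in real use)
done
```

The plaintext files stay where they are; only the encrypted copies leave the source folder.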
Of course you would have to use your information, not the bogus stuff I used for example purposes and you might want to use rsync or something, but this shows the general idea. And I would also be very careful to follow the KISS (Keep It Simple) rule. It sounds as if it would be very easy to make things far more complicated than necessary.
Once the script works, run it as a cron job. Just be sure you are using full file paths in all cases and not relying on a $PATH environment variable which can cause all sorts of problems.
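A cron entry along those lines might look like this; the script and log paths are placeholders, with full paths used throughout as advised above:

```
# m  h  dom mon dow  command
30 2 * * * /home/user/bin/encrypt-and-sync.sh >> /home/user/encrypt-backup.log 2>&1
```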
Thank you very much. Yes, it seems quite easy and useful. The only problem with this solution is that the script can't handle directories, and in Documents I have many directories and sub-directories.
If this makes the script too complicated, I could opt for the single-file backup. It's not my favorite solution, but better than nothing.
I've found many backup programs, but none makes an encrypted copy of individual files. They either do a whole encrypted backup (like DejaDup) or copy the individual files unencrypted (like LuckyBackup).
OK, it took some work since bash is not my strong suit but there is actually a very simple way to do what you are asking. I simply needed to keep checking for KISS! Try this (with your info, of course):
Code:
find <directory to start in> -type f -exec gpg --encrypt --recipient <recipient's public key ID> {} \;
For example:
Code:
find . -type f -exec gpg --encrypt --recipient "George" {} \;
to encrypt all files in the current directory and subdirectories. For more complicated situations check out the man page for find.
Then, of course, you transfer the gpg files wherever you wish and remove them with another find command. NOTE: this assumes there are no gpg files already in the directories; otherwise you will get a query about replacing or renaming (a problem when running as a cron job), and you risk losing them, or are guaranteed to lose them, during the deletion step.
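The cleanup pass mentioned could look like the following. This is a self-contained demo: a temp directory stands in for the documents folder, with a couple of sample files created just to show the effect.

```shell
#!/bin/sh
set -e
DIR=$(mktemp -d)                  # stands in for the documents folder
touch "$DIR/a.txt" "$DIR/a.txt.gpg"
mkdir -p "$DIR/sub"
touch "$DIR/sub/b.txt.gpg"

# Delete only the encrypted copies, at every depth,
# leaving the original files untouched
find "$DIR" -type f -name '*.gpg' -delete
```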
After a little more thought, as dangerous as that can be, if the above idea is too simple and inflexible you might try looking at rsnapshot. It is not really set up for this, but, although I haven't tried something quite like what you are asking for, it is quite flexible and with some jury rigging it might work. It uses rsync but allows other scripts to be run before and after it runs rsync. It also lets you enter commands to rsync that you need in addition to what it issues. It also allows quite generous wildcarded lists of files and directories to be included and excluded. With some thought and experimentation I'll bet you could get it to work. I have used it for some years as my backup system and like it very much for being able to make it fit my every need. Of course the simpler solution above would be preferable if it works, but if not . . . .
It uses rsync but allows other scripts to be run before and after it runs rsync.
Good! So I can run your first script to encrypt everything, then use rsync to update only the .gpg files in the Nextcloud directory. That way I should have a complete incremental backup of the encrypted files. What do you think?
Quote:
Originally Posted by Patrick59
Hi,
Just a small question: is your Nextcloud already encrypted?
I'm using the e.foundation free cloud space (/e/OS is an ungoogled Android OS developed by the French creator of Mandrake/Mandriva, which I really recommend). To my knowledge, client-side encryption is not available, so I consider it unencrypted.
I think that is entirely possible. It may take some work and some experimentation, but my gut feeling is that it is absolutely possible. It may not even be as hard to do as I initially thought and certainly would then be possible as a cron job. The only real gotcha about rsnapshot is to be sure you use tabs in the configuration file to separate values instead of spaces. They warn you about this but it always happens. Habits can be so terrible.