LinuxQuestions.org
Old 03-26-2020, 10:59 AM   #1
Enry211
LQ Newbie
 
Registered: Jan 2010
Distribution: Fedora Mate
Posts: 23

Rep: Reputation: 0
Mirror a folder to a second (PGP encrypted) folder


Hello,
I would like to back up my Documents folder to the cloud, but encrypted.

Is there any way to mirror two folders like this?

/home/user/Documents -> not encrypted
/home/user/Nextcloud/Documents -> encrypted

I will work in the non-encrypted Documents folder, and the system should encrypt the files and mirror them to the Nextcloud folder.
Obviously the encryption must happen in my local folder first, and only then should the encrypted file be moved to the Nextcloud folder.

Thanks!
 
Old 03-26-2020, 12:30 PM   #2
agillator
Member
 
Registered: Aug 2016
Distribution: Mint 19.1
Posts: 419

Rep: Reputation: Disabled
I would do it in three steps:

1. Compress the folder to a tar.gz file.
2. Encrypt the tar.gz file.
3. Transfer to your cloud storage.
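For example, a rough sketch of those three steps, assuming gpg is already set up with your public key and that the Nextcloud client syncs /home/username/Nextcloud (all paths and the key ID below are placeholders, not your actual setup):
Code:
#!/bin/bash
# 1. Compress the Documents folder into a single archive
tar -czf /tmp/documents-backup.tar.gz -C /home/username Documents
# 2. Encrypt the archive with the recipient's public key
gpg --encrypt --recipient "Your Key ID" --output /tmp/documents-backup.tar.gz.gpg /tmp/documents-backup.tar.gz
# 3. Drop the encrypted archive into the Nextcloud sync folder
cp /tmp/documents-backup.tar.gz.gpg /home/username/Nextcloud/
# Clean up the intermediate files
rm /tmp/documents-backup.tar.gz /tmp/documents-backup.tar.gz.gpg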
 
1 member found this post helpful.
Old 03-26-2020, 03:25 PM   #3
jefro
Moderator
 
Registered: Mar 2008
Posts: 22,020

Rep: Reputation: 3630
How you plan to set up this folder might need to be known.

It should be, or at least can be, unlocked and viewable to the user at some point, via text or GUI. One could make a cron job or script to take their work and then rsync it to this folder, I'd think.
 
Old 03-26-2020, 03:33 PM   #4
Enry211
LQ Newbie
 
Registered: Jan 2010
Distribution: Fedora Mate
Posts: 23

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by agillator View Post
I would do it in three steps:

1. Compress the folder to a tar.gz file.
2. Encrypt the tar.gz file.
3. Transfer to your cloud storage.
Thanks for your reply. I was looking for single-file encryption for easier management, but this could be good as a backup solution. I may also use deja-dup to create an incremental backup.

Quote:
Originally Posted by jefro View Post
How you plan to set up this folder might need to be known.

It should be, or at least can be, unlocked and viewable to the user at some point, via text or GUI. One could make a cron job or script to take their work and then rsync it to this folder, I'd think.
I would like to have an encrypted copy of each file so that I can restore a single file without downloading the whole backup, even from another computer. If you have a better idea, I'm all ears.
 
Old 03-26-2020, 07:34 PM   #5
agillator
Member
 
Registered: Aug 2016
Distribution: Mint 19.1
Posts: 419

Rep: Reputation: Disabled
Sorry, I thought you wanted the entire directory as one. But doing the individual files shouldn't be difficult, either. You would need a simple script to loop through the directory encrypting each file and transferring it. I don't know the makeup of your file list, but some version of gpg --encrypt-files would seem to fit the bill. Assuming you have no gpg files in the directory to begin with, and assuming you will use scp with public key authentication, and assuming you are using bash, a simple script might be
Code:
#!/bin/bash
# Encrypt every file in Documents with the recipient's public key (creates .gpg copies alongside the originals)
gpg --encrypt-files --recipient "George Whomever" /home/username/Documents/*
# Copy the encrypted files to the remote host (destination is a placeholder; use your own)
scp /home/username/Documents/*.gpg username@remotehost:backup/
# Remove the local encrypted copies
rm /home/username/Documents/*.gpg
Of course you would have to use your own information, not the bogus stuff I used for example purposes, and you might want to use rsync or something, but this shows the general idea. I would also be very careful to follow the KISS (Keep It Simple) rule. It sounds as if it would be very easy to make things far more complicated than necessary.

Once the script works, run it as a cron job. Just be sure you are using full file paths in all cases and not relying on the $PATH environment variable, which can cause all sorts of problems.
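Purely as an illustration, a hypothetical crontab entry that runs such a script nightly by its full path (the script name and log location are made up):
Code:
# Run the backup script every night at 02:00, using full paths throughout
0 2 * * * /home/username/bin/encrypt-and-upload.sh >> /home/username/backup.log 2>&1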
 
1 member found this post helpful.
Old 03-27-2020, 02:12 AM   #6
Enry211
LQ Newbie
 
Registered: Jan 2010
Distribution: Fedora Mate
Posts: 23

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by agillator View Post
Sorry, I thought you wanted the entire directory as one. But doing the individual files shouldn't be difficult, either. You would need a simple script to loop through the directory encrypting each file and transferring it. I don't know the makeup of your file list, but some version of gpg --encrypt-files would seem to fit the bill. Assuming you have no gpg files in the directory to begin with, and assuming you will use scp with public key authentication, and assuming you are using bash, a simple script might be
Code:
#!/bin/bash
# Encrypt every file in Documents with the recipient's public key (creates .gpg copies alongside the originals)
gpg --encrypt-files --recipient "George Whomever" /home/username/Documents/*
# Copy the encrypted files to the remote host (destination is a placeholder; use your own)
scp /home/username/Documents/*.gpg username@remotehost:backup/
# Remove the local encrypted copies
rm /home/username/Documents/*.gpg
Of course you would have to use your own information, not the bogus stuff I used for example purposes, and you might want to use rsync or something, but this shows the general idea. I would also be very careful to follow the KISS (Keep It Simple) rule. It sounds as if it would be very easy to make things far more complicated than necessary.

Once the script works, run it as a cron job. Just be sure you are using full file paths in all cases and not relying on the $PATH environment variable, which can cause all sorts of problems.
Thank you very much. Yes, it seems quite easy and useful. The only problem with this solution is that the script can't handle directories: in Documents I have many directories and sub-directories.
If that makes the script too complicated, I could fall back to the single-archive backup. It's not my favorite solution, but better than nothing.

I've found a lot of backup software, but none of it makes an encrypted copy of individual files. They either do a whole encrypted backup (like DejaDup) or copy individual files unencrypted (like LuckyBackup).
 
Old 03-27-2020, 04:29 AM   #7
agillator
Member
 
Registered: Aug 2016
Distribution: Mint 19.1
Posts: 419

Rep: Reputation: Disabled
OK, it took some work since bash is not my strong suit, but there is actually a very simple way to do what you are asking. I simply needed to keep checking for KISS! Try this (with your info, of course):
Code:
find <directory to start in> -type f -exec gpg --encrypt --recipient <identity for public password> {} \;
For example:
Code:
find . -type f -exec gpg --encrypt --recipient "George" {} \;
to encrypt all files in the current directory and its subdirectories. For more complicated situations, check out the man page for find.
Then, of course, you transfer the .gpg files wherever you wish and remove them with another find command. NOTE: This assumes there are no .gpg files in the directories to begin with; otherwise you will get a prompt about replacing or renaming (a problem when running as a cron job), and those pre-existing .gpg files risk being lost, or are guaranteed to be lost, during the deletion step.
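For example, a rough sketch of that transfer-and-clean-up step, assuming the encrypted copies should end up in the locally synced Nextcloud folder (paths are examples only):
Code:
# Mirror only the .gpg files into the Nextcloud folder, keeping the directory structure
rsync -av --include='*/' --include='*.gpg' --exclude='*' /home/username/Documents/ /home/username/Nextcloud/Documents/
# Then delete the local encrypted copies
find /home/username/Documents -type f -name '*.gpg' -delete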

Are we there yet?
 
Old 03-27-2020, 04:52 AM   #8
agillator
Member
 
Registered: Aug 2016
Distribution: Mint 19.1
Posts: 419

Rep: Reputation: Disabled
After a little more thought, as dangerous as that can be: if the above idea is too simple and inflexible, you might try looking at rsnapshot. It is not really set up for this, and I haven't tried something quite like what you are asking for, but it is quite flexible, and with some jury-rigging it might work. It uses rsync but allows other scripts to be run before and after it runs rsync. It also lets you pass extra options to rsync in addition to the ones it issues itself, and it allows quite generous wildcarded lists of files and directories to be included and excluded. With some thought and experimentation I'll bet you could get it to work. I have used it for some years as my backup system and like it very much, because I can make it fit my every need. Of course the simpler solution above would be preferable if it works, but if not . . . .
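As a very rough, untested sketch of how the pieces might fit together, an excerpt of a hypothetical rsnapshot.conf (the snapshot root, retention, and pre-exec script are made-up examples, and every field must be separated by tabs, not spaces):
Code:
# Hypothetical rsnapshot.conf excerpt -- fields MUST be separated by tabs
snapshot_root	/home/username/snapshots/
retain	daily	7
# Encrypt the Documents files before rsnapshot invokes rsync
cmd_preexec	/home/username/bin/encrypt-documents.sh
backup	/home/username/Documents/	localhost/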
 
Old 03-27-2020, 05:08 AM   #9
Patrick59
Member
 
Registered: Apr 2007
Location: North France
Distribution: Slackware64 15.0
Posts: 74

Rep: Reputation: 19
Hi,

Just a small question: is your Nextcloud already encrypted?
 
Old 03-27-2020, 05:17 AM   #10
Enry211
LQ Newbie
 
Registered: Jan 2010
Distribution: Fedora Mate
Posts: 23

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by agillator View Post
It uses rsync but allows other scripts to be run before and after it runs rsync.
Good! So I can run your first script to encrypt everything, then use rsync to update only the .gpg files in the Nextcloud directory. That way I should have a complete incremental backup of the encrypted files. What do you think?

Quote:
Originally Posted by Patrick59 View Post
Hi,

Just a small question: is your Nextcloud already encrypted?
I'm using the e.foundation free cloud space (/e/OS is an ungoogled Android OS developed by the French creator of Mandrake/Mandriva; I really recommend it). As far as I know, client-side encryption is not available, so I consider the storage unencrypted.

Last edited by Enry211; 03-27-2020 at 05:19 AM.
 
Old 03-27-2020, 06:37 AM   #11
agillator
Member
 
Registered: Aug 2016
Distribution: Mint 19.1
Posts: 419

Rep: Reputation: Disabled
I think that is entirely possible. It may take some work and some experimentation, but my gut feeling is that it is absolutely possible. It may not even be as hard to do as I initially thought and certainly would then be possible as a cron job. The only real gotcha about rsnapshot is to be sure you use tabs in the configuration file to separate values instead of spaces. They warn you about this but it always happens. Habits can be so terrible.
 
1 member found this post helpful.
Old 03-27-2020, 06:38 AM   #12
Enry211
LQ Newbie
 
Registered: Jan 2010
Distribution: Fedora Mate
Posts: 23

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by agillator View Post
I think that is entirely possible. It may take some work and some experimentation, but my gut feeling is that it is absolutely possible. It may not even be as hard to do as I initially thought and certainly would then be possible as a cron job. The only real gotcha about rsnapshot is to be sure you use tabs in the configuration file to separate values instead of spaces. They warn you about this but it always happens. Habits can be so terrible.
I'll try it in the next few days! Thanks!
 
  



Tags: backup, encryption, mirror, pgp


