I want to back up some files from a Windows shared folder to my CentOS 7 machine. I need a script or tool that copies the shared files into a local folder every 1 or 2 seconds, placing each copy in a new subfolder named by date-time.
note:
1. The files don't stay in the shared folder for long. They are temporary files that exist for a few seconds before the application removes them, so I want to back them up before the app deletes them.
2. I could not find a way to make cron run tasks at second intervals.
You might find that kind of rapidity kills your CPU or network bandwidth, so you would definitely want to monitor it.
While it is true you can only schedule cron jobs by the minute, the script run by that cron job can do repetitive tasks using loops and sleep.
e.g. Your script could be something like copyem.sh with:
Code:
#!/bin/bash
start=$(date +%s)       # epoch seconds when this run began
echo "Start is $start"
end=$(( start + 59 ))   # stop just under a minute later
echo "End is $end"
cur=$start              # initialize to the start time
until [[ $cur -gt $end ]]
do echo "Time is $cur"
   cp -p /pathto/share/* /pathto/save/
   sleep 1
   cur=$(date +%s)      # refresh on each pass
done
You'd then tell cron to run copyem.sh every 1 minute. The script records its start time as the number of seconds since 1970, then adds 59 seconds (just under a minute) to determine when it should end, and initializes the cur variable to the start time. The until loop checks whether cur is greater than end; if not, it runs the commands between the do and the done. You'd substitute the first command with the copy you want to do. The script then waits 1 second and refreshes cur on each pass. Once cur is greater than end the loop stops, at which point your next 1-minute cron job does exactly the same thing.
You might have to play with the timing a little (e.g. set end to $start + 55 rather than + 59) to be sure the runs from cron don't step on each other. You also might have to adjust it depending on how long each copy takes. You might be tempted to remove the sleep altogether on the theory that the copy would take more than 1 second, but that risks the loop launching the next copy before the previous one is complete.
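The OP also asked for each copy to land in a new subfolder named by date-time. A minimal sketch of one such pass is below; the crontab line and script path are assumptions, and the temp directories stand in for the real share and backup paths (/pathto/share, /pathto/save) so the snippet runs as-is:

```shell
#!/bin/bash
# Crontab entry to launch the script at the start of every minute
# (hypothetical install path):
#   * * * * * /usr/local/bin/copyem.sh

# Demo dirs so this runs anywhere; swap in the real paths.
SRC="$(mktemp -d)"   # stands in for the mounted Windows share
BASE="$(mktemp -d)"  # stands in for the local backup root
echo "sample" > "$SRC/report.tmp"

# One pass: copy everything into a fresh date-time-named subfolder.
dest="$BASE/$(date +%Y-%m-%d_%H-%M-%S)"
mkdir -p "$dest"
cp -p "$SRC"/* "$dest"/
ls "$dest"
```

Inside the loop from the script above, you would compute a fresh $dest like this on every pass instead of copying into a single /pathto/save/ directory.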
Last edited by MensaWater; 04-19-2019 at 03:24 PM.
Maybe something like inotifywait or inotifywatch would make more sense than a cron job?
Read their man pages; it's all fairly straightforward.
On my system, these programs are part of inotify-tools.
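For illustration, an event-driven watcher with inotifywait (from inotify-tools) could look like the sketch below; the paths are assumptions. Note the caveat raised later in the thread: inotify events are not delivered for files created on the remote side of a CIFS mount, so this only helps when the files appear through the local kernel:

```shell
#!/bin/bash
# Sketch: copy each newly created file into a date-time subfolder
# as soon as inotify reports it. Paths are hypothetical.
watch_and_copy() {
  local src="$1" base="$2"
  inotifywait -m -e create -e moved_to --format '%w%f' "$src" |
  while IFS= read -r file; do
    dest="$base/$(date +%Y-%m-%d_%H-%M-%S)"
    mkdir -p "$dest"
    cp -p "$file" "$dest"/
  done
}
# Usage (assumed paths): watch_and_copy /pathto/share /pathto/save
```

Unlike the cron loop, this reacts the instant a file appears rather than polling every second.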
I want to backup some files on Windows shared folder, from my Centos 7
I had thought about inotify, but it isn't going to work when the files are created on another system. I don't think there is a "clean" way to accomplish the OP's goals since it appears there is no control over the application. Files may be duplicated as well as deleted while being copied.
I've used rsync to make incremental backups. Rsync makes a full backup the first time and, on every run after that, if there are changes in the source folders it puts the changed files into a new destination folder named in date-time format. If you want real-time sync, then lsyncd is the best solution.
Some Links : Rsync | Lsyncd
Other solutions are just variations on one-way real-time synchronisation, like NFS+AutoFS, Samba+AutoFS, etc. Otherwise, as "MensaWater" said, you will put a heavy load on your CPU and memory; it would be something like a boot loop, with backups every 2 seconds.
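A minimal sketch of the rsync approach, with each run landing in a date-time folder: rsync's --link-dest option hard-links files that are unchanged since the previous run, so every dated folder looks like a full backup while only changed files consume space. The temp directories are stand-ins for the real source and backup paths:

```shell
#!/bin/bash
# Sketch: rsync incremental backups into date-time folders (assumed paths).
command -v rsync >/dev/null || { echo "rsync not installed"; exit 0; }

SRC="$(mktemp -d)"; BASE="$(mktemp -d)"  # stand-ins for real dirs
echo "v1" > "$SRC/data.txt"

prev=""
for run in 1 2; do
  dest="$BASE/$(date +%Y-%m-%d_%H-%M-%S)_$run"
  if [ -n "$prev" ]; then
    # Hard-link unchanged files against the previous dated backup.
    rsync -a --link-dest="$prev" "$SRC"/ "$dest"/
  else
    rsync -a "$SRC"/ "$dest"/   # first run: plain full copy
  fi
  prev="$dest"
done
ls "$BASE"
```

For the OP's case this would still need to run from a loop or from lsyncd, since rsync alone does not watch for changes.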