Need a script for daily backup of Oracle archive files
This is the daily work I'm doing to take archive file backups.
Each day Oracle generates some 10-20 archive files, depending upon usage.
So, how can I write a script to copy only the latest files?
To make it run daily, you can do this in one of two ways:
Edit the user crontab (crontab -e)
Make the script executable (chmod +x script.sh) and place it in /etc/cron.daily
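For reference, the two options above might look like this (the script name and path are assumptions, adjust to your setup):

```shell
# Option 1: user crontab entry (crontab -e).
# Fields are: minute hour day-of-month month day-of-week command
# This runs the script every day at 02:00:
00 02 * * * /u02/script/dailybkp.sh

# Option 2: no crontab entry needed; make the script executable
# and place it in the system's daily cron directory:
#   chmod +x dailybkp.sh
#   cp dailybkp.sh /etc/cron.daily/
```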
Now I'm not familiar with how Oracle files are backed up, but looking at the files you are copying, it would seem that they increment in value (filename changes). You can either specify *.arc, or use other commands to narrow down which files are copied.
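One way to "narrow down which files are copied" is a timestamp marker file: copy only archive files newer than the marker, then refresh the marker. A minimal sketch of that idea follows; the paths and demo file are stand-ins, not your real Oracle directories:

```shell
#!/bin/sh
# Sketch of the "copy only the newest archive files" idea, using a marker
# file's timestamp. Paths here are demo assumptions, not your real ones.
SRC=/tmp/arc_demo/src        # stand-in for the Oracle archive directory
DST=/tmp/arc_demo/bkp        # stand-in for the backup destination
MARK="$DST/.last_backup"

mkdir -p "$SRC" "$DST"
touch "$SRC/log_0001.arc"    # demo file so the sketch runs standalone

if [ -f "$MARK" ]; then
    # copy only .arc files created/modified after the previous run
    find "$SRC" -name '*.arc' -newer "$MARK" -exec cp {} "$DST" \;
else
    cp "$SRC"/*.arc "$DST"   # first run: copy everything
fi
touch "$MARK"                # next run copies only files newer than this
```

On each run, anything Oracle wrote since the last run is newer than the marker and gets copied; everything older is skipped.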
I want the exported file names to be abc_2101.dmp and abc_2101.log (2101 is the day and month).
I want to execute this script daily at 02:00 AM.
So I edited the crontab (crontab -e) as:
Quote:
00 02 * * * ./u02/script/dailybkp.sh
Now what I want to know is:
1. Are the steps for crontab -e correct?
2. Will the crontab execute the script daily at 02:00 AM?
3. How can I rename the .log and .dmp filenames daily according to the date? For example, abc_2001 is today's day and month. How can I replace that with a variable?
Give the full path to the backup script in your crontab, not a relative path (one that begins with a dot, which means start from wherever you currently are). Something like this: 00 02 * * * /u02/script/dailybkp.sh
Quote:
I want the exported file names to be abc_2101.dmp and abc_2101.log (2101 is the day and month).
However, "2101" is not easy to search for, while "20070121" is (use the format '+%Y%m%d'). Anyway, add the date with the date command.
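A minimal sketch of building the dated filenames this way (the abc_ prefix comes from the thread; the commented exp call is an assumption about your export command, adjust to your setup):

```shell
#!/bin/sh
# Sketch: build dated export filenames with the date command.
# '+%Y%m%d' (e.g. 20070121) sorts and searches better than just day+month.
TODAY=$(date '+%Y%m%d')
DMP="abc_${TODAY}.dmp"
LOG="abc_${TODAY}.log"
echo "export file: $DMP, log file: $LOG"
# The exp call itself would then use the variables (parameters assumed):
#   exp user/password file="$DMP" log="$LOG"
```

Because the names are computed at run time, the same script produces a fresh pair of filenames every day with no manual renaming.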
.. I think you mean you edited your post whilst I was replying. Why on earth would you do that? It makes this thread very difficult to read, understand and follow.
OK, forget everything. I have got a solution and it is working, but I have a few questions for UnSpawn.
Phrasing it like that and addressing one person when asking questions seems a bit inconsiderate towards those that helped you with other solutions. Besides that person may not be around to answer.
1. What is the purpose of "exit 0"?
It sets the exit status of the script to what humans read as "OK".
2. If I'm not using "exit 0", is there any harm?
No, no harm at all.
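To see the difference concretely: without an explicit exit, a script returns the status of its last command; "exit 0" overrides that with "OK". A small sketch, using throwaway sub-shells in place of the backup script:

```shell
#!/bin/sh
# Sketch: a script's exit status is that of its last command unless an
# explicit "exit" overrides it. Two throwaway sub-shells demonstrate this.
sh -c 'false'            # last command fails...
NO_EXIT=$?               # ...so the sub-shell's status is non-zero
sh -c 'false; exit 0'    # same failure, but "exit 0" overrides it
WITH_EXIT=$?             # ...so the status is 0 ("OK")
echo "without exit 0: $NO_EXIT   with exit 0: $WITH_EXIT"
```

This matters under cron, which treats a non-zero status as failure; "exit 0" simply makes the success case explicit.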
3. Will the crontab execute daily at 02:00 AM?
You just reposted what you got from Tredegar.
Is there any reason to doubt his reply?
And is there any reason for you not to actually test it yourself? Come on!
Once again, sorry for reopening this thread. I tested with the cp command and it was working fine,
so I thought it should work for the exp command also.
But it's not working, and it's not throwing any error either.
The crontab -e has...
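The thread ends without a fix, but one common cause of exactly this symptom (an assumption, not confirmed in the thread) is that cron starts scripts with a minimal environment: exp may not be on the PATH, and ORACLE_HOME/ORACLE_SID are unset. Exporting them inside the script is the usual remedy. Every value below is a placeholder:

```shell
#!/bin/sh
# Hypothetical fix sketch: set the Oracle environment explicitly, because
# cron does not source your login profile. All values are placeholders.
ORACLE_HOME=/u01/app/oracle/product/10.2.0   # placeholder install path
ORACLE_SID=ORCL                              # placeholder instance name
PATH="$ORACLE_HOME/bin:/bin:/usr/bin"
export ORACLE_HOME ORACLE_SID PATH
# Then run the export as before, e.g. (parameters assumed):
#   exp user/password file=abc_2101.dmp log=abc_2101.log
echo "ORACLE_HOME=$ORACLE_HOME ORACLE_SID=$ORACLE_SID"
```

Running the script by hand works because your login shell already has these set, which would explain "works with cp interactively, silently does nothing for exp under cron".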