data push from box to removable usb drive dies, then dies, then dies again.
An Oracle DBA playing SysAdmin, so not a total idiot, but creeping up on it.
I have a USB drive mounted, and when I use cp to try to get about 70G of data from a staged drive (no RAID) sitting on my Red Hat box onto the USB drive, it dies, every damn time.
Any of you brilliant boys have any genius for a girl geek? I'm dying here...
What type of data is it: a single file, an archive, etc.? Any idea how much data is being copied before it dies?
What is the make/model of the drive?
Do you know how it is formatted, i.e. the file system type (NTFS, ext3, etc.)?
Any error messages?
Lisle,
Will do, but a cursory glance tells me there may be a problem, since I don't want to untar these files: they then become 300G and my USB drive is 250.
If nothing else, I'm now exposed to cpio. Thank you for that.
Any idea how much data is being copied before it dies?
It varies. The last "death" was two files, one a couple hundred MB and one about 30G, which died after 5.7.
What is the make/model of drive?
Vendor: WD Model: 2500BEV External Rev: 1.75
Do you know how it is formatted? i.e file system type? (NTFS, ext3 etc)
It was NTFS, but I ran mke2fs and mounted it with no problem; small stuff goes off and on without a single hitch.
Any error messages?
Yep, all about being out of memory, after which the kernel kills the process. (This is a puzzle, because the data is actually a backup that goes from our 0+1 box to the non-RAID staged drive with no problems, night after night. I just can't get it from that staged drive to the USB drive by tarring/gzipping the files onto it.)
Clearly cp isn't the ticket.
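One hedged workaround for the "300G untarred vs. 250G drive" squeeze is to stream tar through split, so the full-size archive never has to exist anywhere. Here is a miniature demonstration with throwaway files; a real run would use multi-gigabyte pieces and write them to the USB mount point, whose path I can only guess at:

```shell
# Build a small source tree, stream it through tar+gzip, and split the
# stream into 1 MiB pieces ("-" tells split to read from stdin).
mkdir -p srcdir
echo "sample data" > srcdir/file.txt
tar czf - srcdir | split -b 1M - backup.tgz.part.
# Reassembling the pieces yields a valid archive again:
cat backup.tgz.part.* | tar tzf -
```

The shell expands `backup.tgz.part.*` in alphabetical order, which matches the order split wrote them in, so simple concatenation restores the stream.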
Last edited by bodyofabanshee; 03-14-2012 at 02:30 PM.
I'm going to guess you've formatted the disk with a 1 KB block size, and that's why it's dying. NTFS would have worked fine, but you might not have been able to mount the NTFS partition read-write.
elfenlied,
Since there's nothing on it, is it possible to rerun mke2fs? The first time, I ran it as follows: mke2fs -j -m 1
Also, if I understand the layout of the following, and it is a matter of the block size being too small, why would it die at 1G sometimes and other times get all the way to 6G? And would that account for the syslog saying the process ran out of memory and was killed?
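If the 1 KB block-size guess above is right, rerunning mke2fs with an explicit 4 KiB block size is one sketch of a fix; the device name and mount point below are hypothetical, so check fdisk -l or dmesg for the real ones first:

```shell
# WARNING: mke2fs wipes the partition. /dev/sdb1 and /mnt/usb are
# made-up names; confirm the actual device before running this.
umount /mnt/usb
mke2fs -j -m 1 -b 4096 /dev/sdb1   # same flags as before, plus 4 KiB blocks
mount /dev/sdb1 /mnt/usb
```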
Plan "B" could include putting the USB drive onto a workstation and copying the data from the server to it over the network using scp or something. Just thinking out loud.
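That plan could look something like the line below; the hostname and paths are invented for illustration:

```shell
# From a workstation with the USB drive mounted at /mnt/usb, pull the
# archive off the server over ssh. Host and paths are hypothetical.
scp redhatbox:/staged/backup.tar.gz /mnt/usb/
```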
If it's a limit because of the file size, you can try the split command with the -b option and a suitable chunk size to write pieces of the file to the USB disk. On the target side you can use cat to concatenate them again, and maybe check the result afterwards with md5sum.
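The split/cat/md5sum round trip can be sketched like this; the 1 MiB chunk size and filenames are only for demonstration (the real copy would use a far larger -b and write the pieces to the USB mount):

```shell
# Make a sample file, split it into 1 MiB pieces, reassemble, and verify.
dd if=/dev/urandom of=bigfile bs=1M count=4 2>/dev/null
split -b 1M bigfile bigfile.part.       # produces bigfile.part.aa, .ab, ...
cat bigfile.part.* > bigfile.rejoined   # glob order matches split order
md5sum bigfile bigfile.rejoined         # the two checksums should match
```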
Update, or more questions than answers depending on your point of view:
What is literally happening is that my system is running out of memory. We have plenty of memory for daily use (for moving the backup to the staged drive, etc.), but it gets maxed out EVERY time I use cp to try to put these files on the USB drive.
Is there a way to keep cp from journaling, or whatever it's doing? Can I just pipe the dang thing?
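For what it's worth, cp does no journaling of its own; a plausible culprit is the kernel's page cache filling with dirty pages faster than the slow USB drive can drain them. Two hedged alternatives, with made-up paths, are dd with O_DIRECT to bypass the cache, or rsync:

```shell
# Copy one archive while bypassing the page cache entirely (O_DIRECT).
# Source and destination paths are hypothetical; substitute your own.
dd if=/staged/backup.tar.gz of=/mnt/usb/backup.tar.gz bs=4M oflag=direct

# Or let rsync handle the copy and report progress as it goes.
rsync -av --progress /staged/backup.tar.gz /mnt/usb/
```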