Red Hat: This forum is for the discussion of Red Hat Linux.
It should be OK. However, are you using hardware or software RAID on the SATA and the IDE side? The easier way is:
1. Boot via a live CD.
2. Copy the entire system from the IDE disk to the SATA disk.
3. After all the files are copied, boot the Fedora Core rescue CD (2, 3, 4, 5 or 6, up to you).
4. chroot into the '/' directory on the SATA disk.
5. Edit /etc/fstab and /boot/grub/menu.lst to suit the new environment.
6. Run grub-install on the new hard disk (/dev/sda or /dev/sdb or ...).
7. Restart.
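As a rough sketch, the steps above might look like this from a live CD, assuming the old IDE root is /dev/hda1 and the new SATA root is /dev/sdb1 (adjust to your hardware). The run() wrapper only echoes each command; swap in real execution once you have double-checked the device names:

```shell
#!/bin/sh
# Dry-run sketch of the copy-and-reinstall steps; device names are assumptions.
OLD=/dev/hda1   # old IDE root partition (assumed)
NEW=/dev/sdb1   # new SATA root partition (assumed)

run() { echo "+ $*"; }    # change 'echo "+ $*"' to "$@" to really execute

run mkdir -p /mnt/old /mnt/new
run mount "$OLD" /mnt/old
run mount "$NEW" /mnt/new
run cp -ax /mnt/old/. /mnt/new/              # copy the whole system, keeping permissions
run mount --bind /dev /mnt/new/dev           # so grub-install can see the disks
run chroot /mnt/new grub-install /dev/sdb    # write GRUB to the new disk's MBR
```

Nothing here is destructive as written; it just prints the commands so you can sanity-check them before running for real.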
Please don't simply remove your old RAID; the above is just the logical outline, and I'd never try it without keeping the old array around as a fallback. Remember that your Linux must be able to detect the SATA devices, and I assume you don't have any encrypted partitions.
If you use hardware RAID, it is transparent to your OS: the computer sees it as a normal hard disk, probably sda or sdb.
Once you have copied everything from the old hard disk to the new SATA disk, the majority of the work is done. However, you still need to make sure the new disk is able to boot: on an IDE hard disk, the first 512 bytes (the MBR) hold the boot code, the partition table, etc. I don't know exactly how hardware RAID handles this, but I think there is not much difference, and you need to set it up or your Linux may not boot.
Normally the way we do this is with the command grub-install /dev/yournewhdd.
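For what it's worth, you can poke at that boot area yourself. This sketch builds a dummy 512-byte "MBR" image in /tmp (not a real disk, just an illustration) and shows the 0x55 0xAA boot signature that sits at offset 510:

```shell
#!/bin/sh
# The MBR is one 512-byte sector: ~446 bytes of boot code, a 64-byte
# partition table, and a 2-byte signature (0x55 0xAA) at offset 510.
IMG=/tmp/mbr-demo.img
dd if=/dev/zero of="$IMG" bs=512 count=1 2>/dev/null
printf '\125\252' | dd of="$IMG" bs=1 seek=510 conv=notrunc 2>/dev/null
od -An -tx1 -j 510 -N 2 "$IMG"    # shows the 55 aa signature bytes
```

On a real machine you could point the same od command at /dev/sda (read-only) to see the signature grub-install wrote.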
I assume your SATA device has only two partitions:
/dev/sdb1 /
/dev/sdb2 swap
The FC rescue disk probably mounts the SATA array at /mnt/sysimage (I forget the exact path; you can check with 'mount').
- Run 'chroot /mnt/sysimage' (this simulates having booted into the new hard disk, though definitely not 100%).
- Edit /etc/fstab (remember: after the chroot), and make sure the / and swap entries are correct.
- Edit /boot/grub/menu.lst (you may need to fix the root=(hdX,Y) line and the root=/dev/... argument on the kernel line).
- Run grub-install /dev/sdb (I'm taking /dev/sdb as your SATA disk and /dev/sda as the IDE disk).
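For the two-partition layout above, the edited files might end up looking roughly like this (hypothetical contents; your filesystem type, kernel version, and GRUB device mapping will differ, so treat every value here as an assumption to verify):

```
# /etc/fstab (inside the chroot)
/dev/sdb1   /      ext3   defaults   1 1
/dev/sdb2   swap   swap   defaults   0 0

# /boot/grub/menu.lst (inside the chroot)
title Fedora Core
    root (hd0,0)                  # first partition of whichever disk the BIOS boots
    kernel /boot/vmlinuz-2.6.x ro root=/dev/sdb1
    initrd /boot/initrd-2.6.x.img
```

Note that GRUB's (hd0,0) refers to BIOS boot order, not Linux device names, so it may stay (hd0,0) even though the Linux side changes from /dev/hda1 to /dev/sdb1.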
After this, your system should be able to boot. Remember, this only works if your Linux supports your new SATA RAID hardware.
kstan, I wanted to take a moment to update you and the thread on the outcome.
Once I realized that I was going from software RAID to software RAID, I knew it wouldn't work. The system had an IDE Promise card, which conflicted with the SATA RAID I was adding to replace the array.
With that in mind, I worked out that two of these servers (new parts plus services) would cost about $2,000.
I convinced the client to go with all new hardware and install SUSE.
Ultimately the bill was $3,120, and instead of two older machines with huge disks they ended up with two new machines with huge disks for about $1,000 more.
The cost was time, plus copying over 440 GB between the old and new server (times two), but it was a solution.
=) "They wish"; no, I meant that for "1000 more" in dollars they ended up with totally new hardware instead of still using the two old servers and just upgrading their disk systems.
Essentially the idea was to take three 160 GB drives and move to two 750 GB SATA drives (per machine), giving them about 1397 GB of space versus the roughly 460 GB they had, I think.
So I quoted about $2,200 for upgrading the systems; that didn't work out, so I ended up building entirely new machines, which, yes, gives them new machines while they keep the old ones, so they end up with:
(This is times two)
460 GB x 2
1397 GB x 2
.... between their two locations (they rsync this stuff nightly, incrementally)
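As an aside on the numbers: two "marketing" 750 GB drives come out to about 1397 GiB once the OS reports them in binary units, which matches the figure above. A quick sanity check, assuming nothing beyond standard awk:

```shell
# Drive vendors count decimal gigabytes (10^9 bytes); the OS reports
# binary GiB (2^30 bytes), so 2 x 750 GB shows up as roughly:
awk 'BEGIN { printf "%.0f GiB\n", 2 * 750e9 / 2^30 }'    # prints: 1397 GiB
```

The same decimal-vs-binary gap explains why three "160 GB" drives report noticeably less than 480 GB in df.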