I'm absolutely stumped on this one. I'll try to explain it as clearly as possible, but it gets a bit confusing.
I have a homemade machine based on an Asus K8V SE Deluxe mobo. This motherboard has four SATA connectors driven by two onboard SATA RAID controllers: one VIA chip and one Promise chip.
I have been running stable for over a year with two Hitachi 250GB HDs in a RAID 1 configuration on the VIA RAID chip. However, one of the drives is beginning to fail, so I'm going to RMA it.
I'm pretty cautious about losing data, so I plunked down some dough for a couple of Western Digital 250GB HDs to run RAID 1 on the Promise controller. I installed them fine, formatted them, backed up my data, and was all ready to pull the failing Hitachi out, knowing I had my data copied to both WD drives. I patted myself on the back for my preparation and forethought.
So I de-RAIDed the two Hitachi drives using the VIA utility, confirmed I could boot to the good drive, and then wiped, repartitioned, and reformatted it and reinstalled Windows XP Pro, since it needed rebuilding anyway. When I came back up, I could see the newly rebuilt Hitachi drive (C) and the dying Hitachi drive (D), but I was expecting to also see the Western Digital array (E). My supposedly functioning WD drives running RAID 1 on the Promise controller weren't assigned a drive letter, even though the Promise BIOS reported the array as functional during startup. In fact, if I go into Disk Management, I can see the Western Digital RAID 1 array, but it shows as unpartitioned space.
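In case it helps with diagnosis, this is the sort of thing I can run from a command prompt to double-check what XP sees, using diskpart (just a rough sketch; I'm assuming the Promise array enumerates as a single disk, and the disk number below is only illustrative):

    C:\> diskpart
    DISKPART> list disk          <- the WD RAID 1 set should appear here as one disk
    DISKPART> select disk 2      <- assuming the array is disk 2; check list disk first
    DISKPART> list partition     <- where I'd expect the lost partition to show up
    DISKPART> list volume        <- volumes with no letter have a blank Ltr column
    DISKPART> exit

If "list partition" comes back empty on the array, that would match the unpartitioned space Disk Management is showing me.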
It gets weirder. I made a mistake while trying to see which partitions were visible when booting from the Windows XP CD: I accidentally did a repair install of Windows on (C), which of course screwed Windows up, so I had to repartition, reformat, and reinstall Windows again on that drive. That seemed to go fine. Then I started moving drives around to see whether one of the Western Digital drives would show up if I hooked it to a VIA connector, but no luck. When I moved the drives back, I could only boot to the failing drive (what was D). If I try to boot to the drive I just rebuilt, I get a blinking cursor toward the upper left of the screen and no activity. Booting to the failing drive, I now have only one drive letter, and Disk Management shows two unallocated disks underneath my properly functioning but failing disk (one for the newly rebuilt drive and one for the Promise array).
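From what I've read, a blinking cursor right after POST usually means the BIOS handed off to the drive but the master boot code or the active-partition pointer is bad. I'm guessing (and this is only a guess) that the XP Recovery Console could fix that on the rebuilt Hitachi, something like:

    (boot from the XP CD and press R at the "Welcome to Setup" screen)
    C:\WINDOWS> fixmbr      <- rewrites the master boot code, leaves the partition table alone
    C:\WINDOWS> fixboot c:  <- writes a fresh boot sector on the system partition

I haven't run these yet, though, because I don't want to make things worse while the Promise array is in limbo.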
That's two partitions that seem to have disappeared, even though the disks themselves still show up in Disk Management. What might be causing this problem, and how do I get them back?
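The only lead I've turned up so far is TestDisk, which supposedly can rescan a disk for lost partitions and rewrite the partition table. As I understand it (and I'd appreciate confirmation before I touch the array), the sequence is roughly:

    testdisk_win.exe
      -> select the disk (the Promise array should appear as one disk)
      -> partition table type: Intel
      -> Analyse, then Quick Search
      -> write the recovered table only if the partition listing looks right

Does that sound like the right approach here, or is there something RAID-specific I'd be stepping on?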
Also of note: a couple of weeks before installing the 3rd and 4th HDs, I upgraded my PSU to a 500W unit. My case temperature reads within reason and the drives are not too warm to the touch. The repeatability of the problem, even from a cold startup, leads me to think it isn't thermal.