So I recovered a degraded disk, and a folder containing a project I did last week is corrupt:
"The file or directory is corrupt and unreadable"
Class action suit, anyone?
Does anyone know how I might start trying to recover this data?
Does anyone know WHY this would've happened? And how can I test the RAID?
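(For anyone hitting the same "corrupt and unreadable" error: one common starting point is a filesystem check. This is just a sketch; the drive letter is a placeholder, and you should copy off anything still readable before letting it repair the volume.)

```shell
:: Run from an elevated Command Prompt. Replace D: with the affected volume.
:: /f fixes filesystem errors; /r also locates bad sectors and recovers readable data.
chkdsk D: /f /r
```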
I have now installed RST 9.6 on 7 new or upgraded systems. The driver has worked correctly on all of them, without any drives being dropped at random. It should be noted, however, that the upgraded systems all worked satisfactorily with older versions of IMSM.
It is now over 4 weeks since I updated my own system (WD Blacks in RAID10) from 8.8 to 9.6, again working without any problems at all. Previously when this system was updated from 8.8 to 8.9 it dropped drives twice within 2 hours, before I realised that the driver was the cause, and reverted to 8.8.
My experience supports the information (probably from Intel) that it was a corner-case condition which was only discovered once Intel had a system that exhibited the failure; the problem was finally fixed in 9.6.
I do find the system tray popup saying that the "system is protected" a nuisance, especially for users conditioned by the "missing drive" popup from 8.9. The message can be turned off, but it seems that turning it off also turns off the useful "verification & repair" popup. The popup arrangement in 9.6 seems over-complicated and not as good as that in 8.8, which gave all the information required without being intrusive under normal conditions.
I have RAID 1 with 220.127.116.114 on W7 64-bit. It seems to be working fine unless I have to do a forced restart when the computer hangs; then I always have to rebuild a disk. Would this still be the situation if I bought a dedicated RAID card to use instead? Appreciate any opinions or experiences. Thx Ray
Whenever I have to do a forced reboot, one of the two discs has to totally rebuild (a few hours; two 2 TB Seagate Enterprise drives). Today, after rebuilding, it said the disc had failed. We tested each one separately and they worked fine. On reboot, the startup wanted to reconfigure the RAID. I'm done with this RAID; it's been more trouble than it's worth. I am going to try an Adaptec for something more robust. Hope it works better.
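(For anyone else testing drives individually before giving up on them: smartmontools' smartctl reads the drive's own SMART health data and can run its built-in self-tests. A sketch below; the device names are examples and differ per OS, e.g. /dev/sda on Linux or /dev/sdX equivalents under the Windows build.)

```shell
# Show SMART health and attributes (watch reallocated and pending sector counts).
smartctl -a /dev/sda
# Start the drive's built-in short self-test, then check the results log.
smartctl -t short /dev/sda
smartctl -l selftest /dev/sda
```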
I've reported the dropped drive/degraded RAID problems I had using the older MSM 8.9 with a Gigabyte EX58UD5, then nothing but success with the new RST 18.104.22.1684. Still running great.
But last week I installed Win7 x64 onto an ASUS P5E-VM HDMI system with a RAID 5 as the data drive. I didn't bother updating the Intel drivers because the P5E uses an ICH9R controller. I assumed the driver problems only affected the ICH10R.
Not true, apparently. Within a few days, I got the characteristic non-responsiveness, then the indication that the drive was dropped from the RAID and the RAID was degraded. I then loaded the RST driver and, after reformatting the "bad" drive, put it back into the RAID.
It's working fine so far.
I have the same problem with the Win7 64-bit OS and the forced restart, using 22.214.171.1244. I use all Seagate drives of the same size and type. Seagate indicated that after 9 reported failures under 8.9, I would no longer be able to send drives back. Now I am getting the same thing. One showed as failed 10 days ago, so I updated to 126.96.36.1994, and now it rebuilds every other day or whenever I do a forced restart. I have had the cache on and off, and it seems to work better with it on... but I need to hear if others have any suggestions.
I'd had too much trouble with it until I installed the latest driver, ver 188.8.131.524, several weeks ago, and my RAID became amazingly stable with not a single drop, failure or rebuild. BTW, I am using Win7 64-bit Ultimate with a 4-drive RAID-10.