Ok, here is the thing. The list you provided is accurate for Intel(R) motherboards. On third-party boards you may get the option if the motherboard manufacturer added the feature. Bottom line: on Intel(R) motherboards, the list of supported RAID levels and configurations is here: http://www.intel.com/support/chipsets/imsm/sb/CS-022304.htm On third-party boards you will need to check with the manufacturer (Asus*, etc.), as they may have added the support. PV.
You can call Intel and ask them; they will tell you that the chipset is capable but not supported. Nowhere in the datasheets or on the website does it state that the ICH10R will support a 6-drive RAID 10. To find out what customizations the chipset allows, you would need to get in contact with one of the engineers who works there (good luck).
Actually, if you study the Intel docs, they only state that four disks are needed for RAID 10. I've never found any documentation stating that six or more drives aren't supported.
I've been using a 6-drive RAID 10 since Labor Day 2009. I can tell you without a shadow of a doubt that 6-drive RAID 10 works on my Asus Rampage Extreme with ICH9R. I had people in the Asus forums in sheer disbelief that this was possible; it is, and it works. Keep in mind that you can't grow a 4-disk RAID 10 into a 6-disk RAID 10 on ICH9R, though I think I read that ICH10R has this capability.
6 x Seagate 7200.11 (CC1H firmware) 1.5 TB drives = a 1.4 TB MBR C: drive (2 TB is the maximum for a boot volume on this controller) and a 2.8 TB GPT D: drive. I also partitioned a small dedicated 15 GB virtual-memory volume to house a 12 GB system swap file.
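The capacity math above can be sketched as a quick back-of-the-envelope check. This is my own illustration, not anything from Intel's tools: it assumes RAID 10 yields half the raw capacity, and that the gap between the drives' decimal terabytes and what Windows reports comes from the TB-vs-TiB difference. Function names are made up for the example.

```python
def raid10_usable_tb(num_drives, drive_tb):
    """Usable capacity of a RAID 10 set: half the raw capacity (mirrored stripes)."""
    if num_drives < 4 or num_drives % 2 != 0:
        raise ValueError("RAID 10 needs an even number of drives, 4 or more")
    return (num_drives // 2) * drive_tb

def tb_to_tib(tb):
    """Convert decimal terabytes (10**12 bytes) to binary tebibytes (2**40 bytes)."""
    return tb * 10**12 / 2**40

usable = raid10_usable_tb(6, 1.5)   # 6 drives mirrored in pairs -> 4.5 TB usable
print(f"usable: {usable} TB = {tb_to_tib(usable):.2f} TiB")
```

That ~4.09 TiB of usable space lines up roughly with the 1.4 TB C: plus 2.8 TB D: split described above.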
It took 8+ hours to build the C: volume, then another 10+ hours to build the D: volume.
I posted 6-disk RAID 10 benchmarks on Seagate.com using Matrix Storage 8.9, but now I use the Intel Rapid Storage driver set (much friendlier to tweakers who crash their PCs from time to time). Here's the link: http://forums.seagate.com/t5/Internal-ATA-and-Serial-ATA/6-Disk-Raid-10-Is-this-good-performance/m-p/38334
You are going to think I'm crazy for what just happened. I have finished building this rig and it screams. The performance is outstanding.
I have a Rampage Extreme 2, i7 920, Radeon 4890, and a Thor's Hammer cooler in a Sniper case.
I'm not a gamer as such; I just want a solid machine for editing. I won't be overclocking either.
All was good until I delved into the RAID area.
I have been playing with RAID for a few months now on the old machine and thought I had it figured out.
With no problems I set up the boot volume on four WD disks in RAID 10. I also put in a 1G WD as a hot spare. All was splendid as I updated Windows and the drivers and was running performance tests, when drive 2 failed and the array started rebuilding to the 1G drive. That was disappointing, as the performance slowed.
I rebooted; drive 2 and the 1G drive were not recognised in the BIOS. It did boot, for the last time.
Then drive 3 failed while the degraded array was rebuilding. Then, poof the magic dragon, blue screen of death.
I am now set up with a single WD500 that was not involved in this carnage, as that's all I have left. The five drives I used in the RAID are completely dead.
I have done tests on all the SATA ports with the one good drive, and the BIOS is happy and finds it. The others don't even spin up and cannot be detected. I went as far as to check the 7th SATA port on the board, with the same result. In all cases my BIOS settings were correct, and I tried to detect the drives with RAID switched both on and off. It appears all five drives are cactus!
How can Intel Storage Manager destroy five drives? Surely I am missing something? Have you heard of this before?
All the best,
Just a little edit: I do want to RAID the operating system for speed. I went with RAID 10 so I would have a safety net. Well, I fell through that.
Plan B is to make a separate RAID set for editing and rendering.
Sorry, I've never heard of something like this specifically, but I don't think the Intel drivers had anything to do with your experience.
Were those WD drives supported for RAID (not all drives are)? Is there new firmware for them? Did you pick up a nasty virus? Also, rebooting during a rebuild is not a best practice. I can't assess what happened, but those are a few things that come to mind as possible causes.
TIGR's post shows a 6-drive RAID array, but I don't see where it shows it's RAID 10. I just installed the latest Intel Rapid Storage Technology drivers and management software on a socket 1366 mobo with the X58 chipset (Gigabyte GA-X58A-UD5). It would only allow creation of a 4-drive RAID 10 array. It allows creation of a 6-drive RAID array, but only with RAID 5.
Has anyone figured out how to build a 6-drive RAID 10 array on a mobo's ICH10R?
Hmm, are you using WD Advanced Format Green drives? They have a tendency to drop out of RAID arrays because they don't support TLER (Time-Limited Error Recovery). The Advanced Format drives don't let you alter the TLER or the spin-down timer (which is preset to 8 seconds, I think).
The WD Greens are not at all suitable for RAID because of this. Also, the Advanced Format (4K-sector) drives need their partitions aligned to the stripe size and the volume start sector, otherwise you will suffer extremely bad performance.
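The alignment point can be sketched as a simple check. This is my own illustration of the rule of thumb, not a vendor tool: it assumes a partition avoids read-modify-write penalties when its starting byte offset is a multiple of both the 4096-byte physical sector and the array's stripe size. The numbers used are the common Vista/Win7 1 MiB default start and the old XP-style start at sector 63.

```python
SECTOR_4K = 4096  # physical sector size of Advanced Format drives, in bytes

def is_aligned(start_offset_bytes, stripe_size_bytes):
    """True if the partition start lines up with both 4K sectors and the stripe size."""
    return (start_offset_bytes % SECTOR_4K == 0 and
            start_offset_bytes % stripe_size_bytes == 0)

# Vista/Win7 default partition start at 1 MiB: aligned for a 64 KiB stripe.
print(is_aligned(1_048_576, 65_536))   # True
# Old XP-style start at sector 63 (63 * 512 = 32,256 bytes): misaligned.
print(is_aligned(63 * 512, 65_536))    # False
```

The 1 MiB start works because it is a multiple of every power-of-two stripe size the controller offers, which is why newer Windows versions default to it.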
If you scroll the image to the left, you can see there is a 6-disk RAID 10 set up.
From Google searches it appears some people have managed to set up a 6-disk RAID 10 using older versions of the driver and utilities (like version 8.9).
The latest driver appears to be from December 2010, and it still looks like there's no support for what should be a fairly simple feature to include.
Does anyone know if Intel is releasing driver support for 6-disk RAID 10 (does it require a BIOS update)?
Is there a mod that can be made to the .inf file(s) to enable 6-disk support (basically overriding the default Intel settings)?
I'd really like a solution to this issue, as it seems pretty basic, unless I'm missing something and I'm totally off the mark...