I have a computer running two 256 GB Crucial M4 SSDs (CT256M4SSD2) in a RAID 0 (striped) array on an ASUS P9X79 Pro, using Intel's built-in RAID controller.
I recently installed Windows 8 Pro in UEFI mode on this RAID array, wiping a fully functional non-UEFI Windows 7 installation.
Now, whenever the computer is left running for about an hour, the system no longer sees those drives. Since those drives contain Windows, this leads to various forms of BSODs. If Intel RSTe (Intel's RAID manager) is running at the time, it reports that the disk backing the RAID array has been removed.
Once this happens, if I reset the computer, it will no longer boot. Entering BIOS setup shows that the SATA 3 (6Gbps) ports that those disks are connected to are both empty.
If I then power down the system completely, then turn it on again, the drives reappear, but the problem repeats after another hour or so.
I have tentatively determined that the problem occurs even when Windows is not running (for example, when booted into the installation environment from a UEFI flash drive).
I don't think there has been any data corruption since this started happening, although I have had two strange issues with a Git repo on that disk. sfc /ScanNow and Intel's disk check (in RSTe) both find nothing.
Does anyone know what might cause this?
Answer
Make sure you're running the latest firmware on your Crucial M4s. There is a well-known firmware bug that causes the drives to drop out roughly once an hour (which manifests as BSODs in Windows) once they reach 5,184 power-on hours.
You can use a SMART diagnostic such as CrystalDiskInfo or smartmontools to check the power-on hours and see if you recently passed the magic number.
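For example, with smartmontools installed, a command along these lines dumps the SMART attribute table (the device name here is only an illustration; behind Intel's RAID you may need to point it at whichever physical drive your OS exposes):

    smartctl -A /dev/sda

Look for attribute 9 (Power_On_Hours) in the output; if its raw value is at or past 5,184, this bug is the likely culprit. CrystalDiskInfo shows the same attribute in its GUI.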
Crucial patched this bug in a January 2012 firmware update. The update leaves your data intact, but back up anything important first, even if the hourly dropouts force you to do it in chunks.
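After flashing, you can confirm that the new firmware took by checking the drive identification data, e.g. (again, adjust the device name for your system):

    smartctl -i /dev/sda

which prints the model, serial number, and firmware revision; CrystalDiskInfo also displays the firmware field.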