Thursday, April 16, 2015

windows 7 - SSD read-write failure with large amounts of data

The software I am using sometimes requires up to 36 hours of constant read-write (at max SATA II speed, due to motherboard limitations) and CPU processing, working with a few gigabytes of program-generated numerical data for scientific computing. Although Windows file-system checkers and the PC manufacturer's hardware assistant find nothing wrong with the SSD, I am worried that the SSD may have been 'overused' and has become prone to failures under large amounts of data; for example, the application stalled and produced no further output, yet Windows did not crash or terminate the application. Replacing the SSD with the original HDD (from which the OS was cloned onto the SSD) shows none of these issues.

With small amounts of data (and therefore shorter execution times), the application, OS, and SSD are fine, and I did not have this issue a while ago. I have read that SSDs do not generally fail in this way, although there has often been a spinning blue circle next to the mouse cursor just after boot-up, and I recently had to re-install some other software. I suspect the fault may lie with the original cloning of the OS rather than with the SSD hardware.

How can I find out whether there is actually anything physically wrong with the SSD? The PC starts up and shuts down normally, the Windows Experience Index has not changed, and 'normal use' of the PC is unaffected.
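
One way to start answering the last question is to read the drive's SMART data directly. Below is a minimal Python sketch that does this by calling smartctl from smartmontools; it assumes smartctl is installed and on the PATH, and that the SSD is the first physical disk (smartctl's /dev/sda naming on Windows), so the device name and the attribute names listed (which vary by SSD vendor) are assumptions to adjust for your system.

import subprocess

# SMART attribute names commonly associated with SSD wear or media errors
# (names differ between vendors; treat this list as an example, not canonical).
SUSPECT_ATTRIBUTES = (
    "Reallocated_Sector_Ct",
    "Wear_Leveling_Count",
    "Media_Wearout_Indicator",
    "Reported_Uncorrect",
    "CRC_Error_Count",
)

def smart_report(device="/dev/sda"):
    # -H prints the overall health assessment, -A the full attribute table.
    result = subprocess.run(
        ["smartctl", "-H", "-A", device],
        capture_output=True, text=True, check=False,
    )
    print(result.stdout)
    # Highlight any lines that mention the wear/error attributes above.
    for line in result.stdout.splitlines():
        if any(attr in line for attr in SUSPECT_ATTRIBUTES):
            print("check this attribute:", line.strip())

if __name__ == "__main__":
    smart_report()

If the overall health check passes and the reallocation/wearout attributes are close to their fresh values, a physical SSD fault becomes less likely, which would point back at the cloned OS installation as the more plausible culprit.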

