Need advice on replacing failed drive

Benjamin
I had what appeared to be a drive go bad (FreeNAS 9.3, 12-drive RAID-Z3 array). I went through the replacement procedure, but upon rebooting the drive was showing as UNAVAIL and the drive replacement option was not being shown in the GUI.

After checking dmesg, I see no drives on any of the 4 ports on the motherboard controller (Dell C2100). I had previously replaced drives attached to the LSI controller without any issue. I doubt the replacement drive is bad (I have another one I can try), so there are a few possibilities: the port on the MB went bad (in which case I can connect to a different port), a backplane issue, or just a loose connection somewhere.

Assuming the hardware-level issue is taken care of, will the drive, once recognized by the OS, be displayed as an option for replacement while the current drive is in an UNAVAIL state? I have seen some reports of problems replacing a drive that shows as UNAVAIL. And if not through the GUI, is there a procedure I should follow (maybe via the CLI) to get the system back to a healthy state?
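For what it's worth, this is the rough CLI flow I'd expect based on the ZFS documentation; the pool name "tank", the GUID, and the device name are placeholders rather than values from my system, and I realize the FreeNAS GUI normally handles extra steps (GPT partitioning, swap) that a raw zpool replace would skip:

  zpool status -v tank                                 # the UNAVAIL member is listed with a numeric GUID
  zpool replace tank 1234567890123456789 /dev/ada0     # resilver onto the new disk (placeholder GUID/device)
  zpool status tank                                    # monitor resilver progress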

On another note - my LSI controller is running firmware version 20.00.04. Is it safe to upgrade to FreeNAS 9.10, and if so, is it safe to do the upgrade while the pool is in a degraded state?

Thanks in advance,

Benjamin
 

Mirfster
After checking dmesg, I see no drives on any of the 4 ports on the motherboard controller (Dell C2100)
Can you let me know whether the backplane on your C2100 has the two Mini-SAS (SFF-8087) connectors or the three SFF-8484 connectors?

On another note - my LSI controller is running firmware version 20.00.04. Is it safe to upgrade to FreeNAS 9.10, and if so, is it safe to do the upgrade while the pool is in a degraded state?
You can run 9.10 with that firmware version; you will get a warning about P21, which can be ignored (discussed here). Before proceeding, though, let's get your system running correctly first.
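If you want to double-check what the card actually reports before touching anything, the driver banner and the LSI flash utility will both show it (this assumes the mps driver and that sas2flash is available, which it should be on stock FreeNAS):

  dmesg | grep -i mps     # driver banner includes the firmware, e.g. "mps0: Firmware: 20.00.04.00"
  sas2flash -list         # lists the controller's firmware and BIOS versions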
 
Benjamin
Thanks for the lightning-fast reply! I will head down to the data center tomorrow morning (when we do not run backups to the NAS) to check for sure, but if memory serves, it has the three SFF-8484s. I think there were 2 cables coming from the LSI to the backplane, and the 4 motherboard ports connect to the last SFF-8484. I have 8 drives on the LSI and 4 on the MB.
 

Mirfster

Doesn't know what he's talking about
Joined
Oct 2, 2015
Messages
3,215
Okay, since you said you appear to have lost all 4 drives attached to the MB, I am presuming it is actually the cable that is bad. Either that or the SFF-8484 connection got dislodged.
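Once you have reseated or replaced the cable, a quick sanity check from the shell will confirm the four MB-attached drives are back (the device names below are just examples of what FreeBSD assigns):

  camcontrol devlist      # every detected disk shows up here with its ada/da name
  dmesg | grep -i ada     # the motherboard SATA drives attach as adaN devices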

Side note: you may want to consider getting the backplane with just the two Mini-SAS (SFF-8087) connectors. That is actually an expander backplane and will let you run all those drives off a single SFF-8087 connection if so inclined (you can use both connections as well). Also, the SATA ports on the motherboard are only 3Gbps, as opposed to the LSI's 6Gbps (I am assuming you are using something like the 9211-8i...).
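If you want to see what link speed a given drive actually negotiated, smartctl will tell you (ada0 below is just an example device name):

  smartctl -i /dev/ada0 | grep -i sata
  # e.g. "SATA Version is:  SATA 3.0, 6.0 Gb/s (current: 3.0 Gb/s)"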

Us Dell C2100/FS12-TY users have got to stick together; I have at least 7 of them here with me, not counting the ones at client sites. :)
 