X10SL7-F and LSI 9211-8i card and disk detection problems.

Status
Not open for further replies.

tscholak

Dabbler
Joined
Jun 3, 2014
Messages
10
Maybe I should revise what I said a month ago.
Initially I thought the problems were gone. However, in the meantime I have had a couple of reboots of the server, and sometimes, though not always, the controllers cannot be initialized (or maybe it's only one of them; it's hard to tell).
The mpt SAS driver's boot messages contain the string "fault_state(0x2622)". I guess that's your "IOC Fault 0x4002622".
A reboot helps, but, yeah, it's very annoying. Supermicro is totally unsupportive; they just say they won't do anything because my additional RAID card hasn't been validated by their lab for the X10SL7.
LSI won't help either; they say it's a problem with the board's BIOS.
I've reset the BIOS configuration a number of times, and I don't think I have set anything unusual there.
Recently Supermicro released a v19 IT firmware for the X10SL7. Maybe upgrading helps; check out their FTP server.
 

jjstecchino

Contributor
Joined
May 29, 2011
Messages
136
Well, after I posted here I started doing some more research. The M1015 is a PCIe 2.0 card. I believe the X10 BIOS at its default setting (Auto) does not always select the correct PCIe version. I set the PCIe negotiation to Gen 2 in the BIOS for the second PCIe x16 slot, and so far I have not had any more errors.
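If you want to confirm from the OS that the slot really renegotiated, the link speed can be read back after a reboot; here is a quick sketch on FreeNAS/FreeBSD (mps0 is just the usual device name for these controllers, and the exact capability output varies a bit by FreeBSD version, so treat the grep as an example, not gospel):

Code:
# show the LSI controller's PCIe capability lines; a "speed 5.0" entry means Gen 2 was negotiated
pciconf -lc | grep -A 6 '^mps'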
 

jjstecchino

Contributor
Joined
May 29, 2011
Messages
136
P.S. I would not upgrade any of the firmware to Phase 19, as FreeNAS 9.2.1.6 is on driver version 16. It is best to have the firmware and driver at the same version.
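If you want to double-check what you are actually running, the mps driver prints both versions at boot; something like this should show them (the exact wording of the line is from memory and may vary slightly between FreeNAS releases):

Code:
# expect a line along the lines of "mps0: Firmware: 16.00.00.00, Driver: 16.00.00.00-fbsd"
dmesg | grep -i firmware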
 

tscholak

Dabbler
Joined
Jun 3, 2014
Messages
10
OK, just to clarify: you propose changing the setting shown in the first attached screenshot (bios pcie.jpg) to the one in the second (bios pcie 2.jpg)?
 

jjstecchino

Contributor
Joined
May 29, 2011
Messages
136
Correct
 

jjstecchino

Contributor
Joined
May 29, 2011
Messages
136
Rebooted 10 times and no errors, whereas before it would error and drop to kdb about half of the time.
 

tscholak

Dabbler
Joined
Jun 3, 2014
Messages
10
I can confirm that! I also did 10 or so reboots, no problems.
Wow, I didn't think this could be solved in the BIOS.
Kudos to you, sir.
 

raidflex

Guru
Joined
Mar 14, 2012
Messages
531
Thanks for the fix! I had the same issue for a while and now everything is working great.
 

SirMaster

Patron
Joined
Mar 19, 2014
Messages
241
FWIW, I know this isn't the most relevant info for FreeNAS users (I use ZFS on Linux), but I was having the exact problems described in this thread.

My LSI 9211-8i came from eBay with FW v11, and the onboard controller on my X10 was at v16 when I was having the issues.

I flashed v19 (firmware plus BIOS) on both, and now they work perfectly together on every boot with PCIe set to the default Auto setting.

So perhaps look forward to when they add the v19 driver to FreeNAS, I guess :)
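For anyone wanting to reproduce this, the usual way to flash these cards is LSI's sas2flash utility; a rough sketch of the typical sequence is below. The file names are the ones from the standard 9211-8i IT-mode package and will differ for the onboard 2308, which gets its images from Supermicro's FTP package, and with two controllers installed you also have to select the right one.

Code:
# list installed SAS2 controllers with their current firmware and BIOS versions
sas2flash -listall
# write the IT firmware image and the boot ROM (option BIOS) to the controller
sas2flash -o -f 2118it.bin -b mptsas2.rom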
 

tscholak

Dabbler
Joined
Jun 3, 2014
Messages
10
AFAIK the stock mpt2sas driver in the Linux kernel is also only at v16.
 

SirMaster

Patron
Joined
Mar 19, 2014
Messages
241
Hmm, well, the v19 firmware appears to be working perfectly on my Linux box. Should I downgrade for any reason?

I'm using stock Debian 7.6 and never installed any separate mpt2sas drivers.

What's the downside to using FW v19 if the drivers are v16? Am I risking something?
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
It's a good question that I believe hasn't been scientifically answered.

From what I know, it's what LSI recommends, but I haven't seen any specific problems caused by mismatched firmware and drivers (that I can recall right now, at least).

What's also unknown is whether either of them is designed to work with older versions of the other.

Guinea pigs are always welcome to test these kinds of things.
 

SirMaster

Patron
Joined
Mar 19, 2014
Messages
241
Well, my zpool has been under pretty heavy stress testing for the last couple of days with the v19 firmware and I have not seen any issues. I do have a complete backup array of my zpool in a separate system, so I may just leave it at v19 firmware and see if anything ever happens.

Linux says my drivers are:

Code:
root@nick-server:/# modinfo mpt2sas
version:        16.100.00.00


And the LSI site lists 16.255.01.00 for the v19 package, so yes, there are newer drivers. I could install the v19 drivers, though, since the readme lists Debian 7 as supported.
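If you want to compare that against what the card itself is running, mpt2sas logs the firmware version when it loads; something along these lines should dig it out (the exact log wording may vary by kernel version):

Code:
# mpt2sas prints e.g. "LSISAS2008: FWVersion(16.00.00.00), ..." at driver load
dmesg | grep -i fwversion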
 

jjstecchino

Contributor
Joined
May 29, 2011
Messages
136
Maybe in FW v19 they fixed the PCIe negotiation, and the M1015 can now correctly negotiate PCIe Gen 2 when the X10 BIOS is set to Auto.
I'm not sure about side effects of having a firmware and driver version mismatch; there may be glitches that aren't immediately evident, though.
 

SirMaster

Patron
Joined
Mar 19, 2014
Messages
241
I decided to just put them both back to v16 (firmware plus BIOS), and I booted a few times; it always worked with both cards just fine, with PCIe still on Auto.

So I'm not sure why this issue still seems so random...

(v11 FW/BIOS) card + (v16 FW/BIOS) onboard + PCIe Auto = didn't work most of the time.

(v19 FW/BIOS) card + (v19 FW/BIOS) onboard + PCIe Auto = worked dozens of times, 100%.

(v16 FW/BIOS) card + (v16 FW/BIOS) onboard + PCIe Auto = worked a few times, 100% so far.
 

jjstecchino

Contributor
Joined
May 29, 2011
Messages
136
I don't know if it helps, but in my case it would not work most of the time if the M1015 was connected to the Intel RES2SV240 expander. With the expander disconnected it would boot most of the time and only error occasionally. I'm running FreeNAS 9.2.1.6 with FW 16 on both the LSI 2008 and the 2308.
Are you having problems on Linux as well?
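For anyone hitting the same thing: a quick way to see whether the disks behind the expander were actually detected on FreeNAS is simply to list what CAM sees (nothing expander-specific here, drives that didn't come up just won't appear in the list):

Code:
# list every disk/device CAM currently sees, with its bus and target
camcontrol devlist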
 

SirMaster

Patron
Joined
Mar 19, 2014
Messages
241
No, I only had a problem when the LSI 2008 card was at the old v11 firmware it came with from eBay. When both my 2008 and 2308 were at v19, or now at v16 with BIOS, it has booted correctly every time so far with PCIe at the default Auto setting. Also, my 2308 is always connected to a Chenbro SAS expander, and that always works too.
 

tscholak

Dabbler
Joined
Jun 3, 2014
Messages
10
I got it working with the v16 FW, v16 drivers, and the BIOS "trick" described above.
I don't see myself tinkering with that again.

For reference, I have an LSI 9211-8i card seated in the first slot of the X10SL7.
 

i3luefire

Explorer
Joined
Jan 4, 2014
Messages
69
Thank you! Setting the M1015's slot to Gen 2 in the BIOS made this problem go away for me. I had also changed all the OpROMs to EFI/UEFI in the BIOS, but it still hit the MPT error 01h 99% of the time. Once I changed the M1015 slot to Gen 2, everything worked fine again. I have only booted once so far, though; I can come back and update later if I boot some more without issue, or if I have issues.
 