Hi all,
I have a problem with disk recognition.
This is my scenario:
I have a 2U rack server with a 12-bay backplane connected to an LSI 3008 controller. VMware ESXi 7.0.3 is installed on the server, and DSM runs in a virtual machine with the LSI controller in passthrough. I followed this guide: https://xpenology.com/forum/topic/62547-tutorial-install-dsm-7x-with-tinycore-redpill-tcrp-loader-on-esxi/
So I have 2 SATA controllers plus the LSI 3008 controller, where the physical disks are connected. At the moment only 3 disks are connected (/dev/sdn, /dev/sdo and /dev/sdp) and they are recognized.
If I add a disk (can I do it hot-plug?), I can see it via the SSH shell, but DSM does not see it.
As a test I added a WD Red 3 TB disk, and the system recognizes it as /dev/sdq:
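In case it helps, this is the kind of rescan I can trigger from the SSH shell to make the kernel notice a hot-added disk (a generic Linux sketch, not DSM-specific; which hostN belongs to the LSI 3008 varies, so all hosts are rescanned):

```shell
#!/bin/sh
# Sketch: ask every SCSI host to rescan its bus for hot-added disks.
# "- - -" is the wildcard for channel/target/LUN.
for scan in /sys/class/scsi_host/host*/scan; do
    [ -w "$scan" ] && echo "- - -" > "$scan"
done
# Then list the sd* block devices the kernel currently knows about.
ls /sys/block | grep '^sd' || true
```

After this the disk shows up in the kernel (fdisk sees it), but DSM still does not list it.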
root@NAStore:~# fdisk -l |grep /dev/sd
Disk /dev/sda: 21 GiB, 22548578304 bytes, 44040192 sectors
/dev/sda1 8192 16785407 16777216 8G fd Linux raid autodetect
/dev/sda2 16785408 20979711 4194304 2G fd Linux raid autodetect
Disk /dev/sdn: 14.6 TiB, 16000900661248 bytes, 31251759104 sectors
/dev/sdn1 8192 16785407 16777216 8G Linux RAID
/dev/sdn2 16785408 20979711 4194304 2G Linux RAID
/dev/sdn3 21241856 31251554303 31230312448 14.6T Linux RAID
Disk /dev/sdo: 14.6 TiB, 16000900661248 bytes, 31251759104 sectors
/dev/sdo1 8192 16785407 16777216 8G Linux RAID
/dev/sdo2 16785408 20979711 4194304 2G Linux RAID
/dev/sdo3 21241856 31251554303 31230312448 14.6T Linux RAID
Disk /dev/sdp: 14.6 TiB, 16000900661248 bytes, 31251759104 sectors
/dev/sdp1 8192 16785407 16777216 8G Linux RAID
/dev/sdp2 16785408 20979711 4194304 2G Linux RAID
/dev/sdp3 21241856 31251554303 31230312448 14.6T Linux RAID
Disk /dev/sdq: 2.7 TiB, 3000592982016 bytes, 5860533168 sectors
Question: why do the device letters on the LSI controller start from sdn for me? What did I do wrong?
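I suspect this is related to the SataPortMap / DiskIdxMap values that the TCRP loader passes in user_config.json, since those control how many ports each controller exposes and at which drive index each controller starts. For reference, the relevant section looks roughly like this (the values here are only an illustration, not my actual config; other keys omitted):

```json
{
  "extra_cmdline": {
    "SataPortMap": "1",
    "DiskIdxMap": "0d"
  }
}
```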
At the moment I have DSM 7.1.1-42962 Update 5 installed with TCRP 0.9.4.4.
Can you help me get the other disks recognized (potentially 9 more of the 12 bays the server supports)? Thanks