Hi,
I need your help.
I have resisted updating to DSM 6 for a very long time, but I finally reached the point where I had to take the plunge.
I have been running Xpenology DSM 5.2-5644 Update 3 (DS3615xs) on ESXi 5.5.0. I have 16 drives, so I modified my synoinfo.conf to set maxdisks=24 along with:
esataportcfg="0xff000000"
usbportcfg="0x300000000"
internalportcfg="0xffffff"
I made a test VM first and brought it to the same spec as my real VM, with the exception of the 16 drives.
The update of the test VM went fine, except that the 50 MB boot drive became visible in Storage Manager. I don't know whether that is a problem or not.
Anyway, I proceeded to update my real VM with the RAID disks still connected.
The installation went fine, but after logging in for the first time, DSM reported that the RAID had crashed.
Checking the volume manager, I can see that maxdisks has reverted to 8.
After that I edited synoinfo.conf back to maxdisks=24 and restarted the VM, but maxdisks is still 8.
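For reference, here is roughly the edit I am applying, shown against a demo copy of the file. I am assuming DSM 6 restores /etc/synoinfo.conf from the /etc.defaults copy on boot, which would explain the revert to 8; if so, the same edit would need to go into both /etc/synoinfo.conf and /etc.defaults/synoinfo.conf (the paths and that restore behavior are my assumption, not something I have confirmed):

```shell
# Demo file standing in for /etc/synoinfo.conf (stock DSM 6 defaults assumed).
conf=./synoinfo.conf.demo
cat > "$conf" <<'EOF'
maxdisks="8"
esataportcfg="0xff0000"
usbportcfg="0x300000"
internalportcfg="0xffff"
EOF

# Apply the same four changes I used on DSM 5.2.
# On a real box this loop would cover both copies, e.g.:
#   for f in /etc/synoinfo.conf /etc.defaults/synoinfo.conf; do ... done
sed -i \
  -e 's/^maxdisks=.*/maxdisks="24"/' \
  -e 's/^esataportcfg=.*/esataportcfg="0xff000000"/' \
  -e 's/^usbportcfg=.*/usbportcfg="0x300000000"/' \
  -e 's/^internalportcfg=.*/internalportcfg="0xffffff"/' "$conf"

# Show the result so I can verify the values took.
grep . "$conf"
```
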
I really hope that my RAID is not actually crashed and that this is fixable somehow, but I really need your help, gentlemen.
Do you have any suggestions I can try on my test VM?
Do you need any more information to give a good answer?
Thanks!