n40l_Boxhead

N40L Major issues in getting 6.1.4 installed


This is a bit similar to the thread regarding the N54L I read a bit earlier.

 

Firstly, I have an HP N40L Microserver, which was happily running 5.2.X for some years (and, to be frank, should have been left running it).

 

I foolishly decided to upgrade to 6.1.4, using Jun's 1.02b bootloader and following Polanskiman's detailed how-to. (Not to impugn their work; more that I should have heeded the "if it ain't broke, don't fix it" adage...)

 

What happens is that it will run (initially), and the box will be found by Synology Assistant. I then run the install (I've been trying "migration"), which "seems" to run fine. Then, after reboot, it does not appear on the network. At all. It simply goes back to the same three options: bare-metal install, bare-metal reinstall, and a VMware/ESXi install option. So clearly, the installation isn't working.

 

Just like the person in the N54L thread, I then have to unplug the thing from power and re-image the USB stick, and it will then be seen on the network. Trouble is, it only gives me the option to "recover", which does something for about two seconds, goes into the ten-minute-long "reboot" mode, and then the device cannot be seen on the network. (And so round we go again.)

 

So, right now I have the thing sitting at "recover" in Synology Assistant (and the web version), which I do NOT want to use, as it does not work. Is there a way I can somehow get into a CLI and manually wipe and reinstall completely from scratch, to at least get it up and running? Or is it bricked?

 

It is running (I think) the latest "modded" version of HP's 041 BIOS (dated 10/01/2013). C1E and quick boot are both disabled.

 

On initial cold boot, both the onboard NIC (whose MAC address is included in the bootloader's grub.cfg file) and the dual-port NIC I added are detected. I've used several different USB sticks when creating (and attempting to deploy) the bootloader and installing the system, and I've amended the PID and VID settings in the grub.cfg file to reflect the correct values for each. I used a serial number generated by https://xpenogen.github.io/serial_generator/index.html, which was also included in the grub.cfg file.
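For anyone following along, the VID/PID can be read from `lsusb` (e.g. "ID 0951:1666" means vid=0x0951, pid=0x1666) and patched into Jun's grub.cfg with sed. A minimal sketch; the VID/PID/serial values and the grub.cfg path below are placeholders, not my actual values:

```shell
#!/bin/sh
# Sketch: patch a USB stick's VID/PID and a generated serial into Jun's
# grub.cfg. All three values are PLACEHOLDERS -- read the real VID/PID
# from `lsusb` for your stick.
GRUB_CFG="grub.cfg"       # path on the mounted boot partition (assumed)
VID="0x0951"              # placeholder vendor ID
PID="0x1666"              # placeholder product ID
SN="1234XYZ123456"        # placeholder serial from the generator

# Create a sample file with the loader's defaults so this sketch is
# self-contained; on a real stick you would skip this step.
cat > "$GRUB_CFG" <<'EOF'
set vid=0x058f
set pid=0x6387
set sn=0000000000000
EOF

# Replace the default values in place (GNU sed).
sed -i "s/^set vid=.*/set vid=$VID/" "$GRUB_CFG"
sed -i "s/^set pid=.*/set pid=$PID/" "$GRUB_CFG"
sed -i "s/^set sn=.*/set sn=$SN/"   "$GRUB_CFG"

cat "$GRUB_CFG"
```

Editing the file by hand in a text editor works just as well; the point is simply that all three `set` lines must match your hardware and serial.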

 

The box also runs five 6 TB WD Red drives, with the fifth connected via the external eSATA port (plus the appropriate BIOS settings to support this config). It's also running 8 GB of RAM (although I doubt that this would be an issue).

 

I'm also considering going back to 5.2 (and staying there), but I don't think that's an option, as the installer will detect a "busted" later installation and won't allow the downgrade...

 

So, in a nutshell, I'm a bit stuck.

 

 

 

 


Hmmm...

 

In a fit of "I'll try anything to get it going", I chucked the old (XPEnoboot) USB stick in and used the "install 5.6.44" option (but pointed it at the 6.1.x .pat file). It (obviously) had some "issues", BUT it at least allowed me to then chuck in the USB key with Jun's 1.02b loader, AND I could see the box as "migrateable" in Synology Assistant. I then did a clean install (again pointing it at the same .pat file), and it rebooted.

 

Much to my surprise, the initial screen (for inputting user name etc.) greeted me. So, it was up and running.

 

I've turned off auto-updating, but did install Update 3, so I'm now on 6.1.7-15284 Update 3. I've also remapped the appropriate drive (this box functions primarily as a backup target for a PowerEdge T320 I run), so backups can proceed as normal.


If you have an .xpenoboot directory in the root of DSM I strongly suggest you delete it. Simply SSH into the box and issue the following command:

rm -rf /.xpenoboot

It will save you some headaches.
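If you'd rather check before deleting, a cautious variant (a sketch only; `ROOT` defaults to the current directory here so it can be tried safely, but on a real DSM box it would be `/`):

```shell
#!/bin/sh
# Sketch: remove a leftover XPEnoboot directory only if it exists.
# On a real DSM box you would run with ROOT=/ as root; the default "."
# lets the sketch run harmlessly anywhere.
ROOT="${ROOT:-.}"
mkdir -p "$ROOT/.xpenoboot"   # simulate the leftover directory for the demo

if [ -d "$ROOT/.xpenoboot" ]; then
    rm -rf "$ROOT/.xpenoboot"
    echo "removed $ROOT/.xpenoboot"
else
    echo "no $ROOT/.xpenoboot found"
fi
```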

