Porkchop

Tutorial: How to downgrade from DSM 6.2 to 6.1 - Recovering a bricked system


Hello! This is a short guide on how to downgrade from DSM 6.2 to DSM 6.1 after a failed upgrade. I made this mistake myself, so I'm sharing how to fix it!

 

  • You're going to need a spare HDD or SSD. Make sure there is nothing on it. 
  • You've installed DSM before, so you should have your synoboot.img from Jun's loader.
  • The current bootloader on your USB stick is now NON-FUNCTIONAL; you will need to reflash the stick with Jun's loader (see the sketch just below if you want to do it from a command line). 
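
If you're reflashing from a Linux or macOS machine and don't have an imaging tool handy, a raw block copy of the image onto the stick does the job. Here is a minimal Python sketch, assuming the image is named synoboot.img and the stick shows up as /dev/sdX (both are placeholders - verify the device node with lsblk first, because every byte on it gets overwritten):

```python
#!/usr/bin/env python3
"""Raw-copy a loader image onto a USB stick (a dd equivalent).

The image path and device node below are placeholders -- double-check
the device with lsblk before running, since it gets fully overwritten.
Run with root privileges.
"""
import os
import shutil
import sys

IMAGE = "synoboot.img"   # Jun's loader image you already have
DEVICE = "/dev/sdX"      # hypothetical device node -- change this!

def flash(image: str, device: str, block_size: int = 4 * 1024 * 1024) -> None:
    with open(image, "rb") as src, open(device, "wb") as dst:
        shutil.copyfileobj(src, dst, length=block_size)
        dst.flush()
        os.fsync(dst.fileno())   # make sure everything actually hits the stick
    print(f"Wrote {image} to {device}")

if __name__ == "__main__":
    if len(sys.argv) == 3:
        IMAGE, DEVICE = sys.argv[1], sys.argv[2]
    flash(IMAGE, DEVICE)
```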

 

  1. Once you have your new, reflashed USB stick, unplug ALL drives from the motherboard. Make sure there are NO drives connected to any SATA port.
  2. Connect your new, empty drive to the motherboard and boot from the USB stick (you will need a keyboard for this step). The first option in the GRUB menu is the one you want.
  3. Use http://find.synology.com or Synology Web Assistant to find your NAS. (If neither can see it, the scan sketch after these steps may help.)
  4. Follow the prompts and install DSM 6.1 from a PAT file. DO NOT INSTALL 6.2. 
    • IF YOU LEFT A DRIVE IN, IT WILL SAY RECOVER. GO BACK TO STEP 2. 
  5. Let DSM do its thing and eventually you will have a working 6.1. 
  6. While the PC is ON, connect your original drives (the ones with your data on them) to the motherboard. They will pop up in Storage Manager and say Unused.
  7. You might need to reboot at this point. It took me some fiddling around, but eventually DSM will show a prompt to recover the crashed partition. Click Yes.
  8. All of your data will appear, and DSM will automatically perform RAID scrubbing. 
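
If neither find.synology.com nor the Assistant can see the box (step 3), a quick sweep of your subnet for DSM's default web port at least tells you whether it's on the network. A rough Python sketch, assuming DSM answers on its default port 5000 and your LAN is 192.168.1.0/24 (adjust both to your setup):

```python
#!/usr/bin/env python3
"""Sweep a /24 subnet for hosts answering on DSM's default web port.

Fallback for when find.synology.com / Synology Assistant comes up empty.
The subnet below is an assumption -- change it to match your LAN.
"""
import socket

SUBNET = "192.168.1."   # hypothetical subnet -- adjust to your network
PORT = 5000             # DSM's default HTTP port
TIMEOUT = 0.3           # seconds to wait per host

def scan() -> None:
    for host in range(1, 255):
        addr = f"{SUBNET}{host}"
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(TIMEOUT)
            if s.connect_ex((addr, PORT)) == 0:
                print(f"Possible DSM web UI at http://{addr}:{PORT}")

if __name__ == "__main__":
    scan()
```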


Thanks, this worked for me!

In my case my update to 6.1.7 didn't work.

I wasn't able to find my NAS.

So I followed the steps, and on step 4 I chose 6.1.7 again. On step 6 I removed my spare SSD, then reconnected my other drives and restarted from the menu.

Once I got back to the main screen it was like nothing had ever happened; everything was back to normal.


5.2 Boot Loader - forceinstall - 6.1 Manual Installation - Boot to Jun's Mod 1.02b - 6.1 Migration - Downgrade Completed!
 


Hello, I've tried your guide.

I took out my 4 disks (with data) and booted my HP ProLiant Gen8 with Jun's loader on the USB key and one blank disk.

I can access the NAS with DSM 6.1.

BUT, when I swap out the blank disk to put my 4 disks back in, the NAS is invisible on my network again.

Can I leave the blank disk in and add 3 of my 4 disks? In that case, will I get my data back?

Thanks

On 25/06/2018 at 4:58 PM, Porkchop said:

Hello! This is a short guide on how to downgrade from DSM 6.2 to DSM 6.1 after a failed upgrade. I made this mistake myself, so I'm sharing how to fix it! 

 

  • You're going to need a spare HDD or SSD. Make sure there is nothing on it. 
  • You've installed DSM before, so you should have your synoboot.img from Jun's loader.
  • The current bootloader on your USB stick is now NON-FUNCTIONAL; you will need to reflash the stick with Jun's loader. 

 

  1. Once you have your new, reflashed USB stick, unplug ALL drives from the motherboard. Make sure there are NO drives connected to any SATA port.
  2. Connect your new, empty drive to the motherboard and boot from the USB stick (you will need a keyboard for this step). The first option in the GRUB menu is the one you want.
  3. Use http://find.synology.com or Synology Web Assistant to find your NAS.
  4. Follow the prompts and install DSM 6.1 from a PAT file. DO NOT INSTALL 6.2. 
    • IF YOU LEFT A DRIVE IN, IT WILL SAY RECOVER. GO BACK TO STEP 2. 
  5. Let DSM do its thing and eventually you will have a working 6.1. 
  6. While the PC is ON, connect your original drives (the ones with your data on them) to the motherboard. They will pop up in Storage Manager and say Unused.
  7. You might need to reboot at this point. It took me some fiddling around, but eventually DSM will show a prompt to recover the crashed partition. Click Yes.
  8. All of your data will appear, and DSM will automatically perform RAID scrubbing. 

This is a variation of the 'Clean Drive' downgrade process I've described before in several threads.

 

1) Disconnect your RAID drives

2) Connect the spare HDD to SATA port 1

3) Install DSM 6.x from a fresh 1.02b loader, create an admin user and server name matching the live system, DON'T create a volume/RAID. Update DSM to match your pre-beta version

4) Shut down, reconnect the RAID drives to SATA ports 2-n

5) Boot, log in, repair the system partitions on the RAID, shut down

6) Remove the spare HDD and reconnect the RAID drives to SATA ports 1-n

7) Boot, repair packages, etc.

 

 

My additional steps would be to power down the server between drive swaps and to update DSM on the clean drive to the version matching the one that was running before the crash. 
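
For anyone who wants to double-check the repair in step 5 before (or after) clicking it, the state of the md mirrors is visible in /proc/mdstat once you SSH into the box. A small Python sketch that flags degraded arrays, assuming the usual Synology layout where md0 is the DSM system partition mirror (treat it as a sketch, not an official tool):

```python
#!/usr/bin/env python3
"""List md arrays from /proc/mdstat and flag any that look degraded.

Meant to be run on the NAS itself over SSH. Assumes the usual Synology
layout where md0 is the DSM system partition mirror across all drives.
"""
import re

def report(mdstat_path: str = "/proc/mdstat") -> None:
    current = None
    with open(mdstat_path) as f:
        for line in f:
            header = re.match(r"^(md\d+)\s*:", line)
            if header:
                current = header.group(1)
                continue
            # Status lines look like "... blocks [4/3] [UUU_]";
            # an underscore means a member is missing or failed.
            status = re.search(r"\[(\d+)/(\d+)\]\s+\[([U_]+)\]", line)
            if current and status:
                want, have, flags = status.group(1), status.group(2), status.group(3)
                state = "DEGRADED" if "_" in flags else "ok"
                print(f"{current}: {have}/{want} members active [{flags}] -> {state}")

if __name__ == "__main__":
    report()
```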



Thanks a lot, my friend, for your explanations.

I've recovered my Synology and, by the way, all my precious data!!!
It's wonderful.

 


This is a variation of the 'Clean Drive' downgrade process I've described before in several threads.

 

1) Disconnect your RAID drives - Did this

2) Connect the spare HDD to SATA port 1 - Did this

3) Install DSM 6.x from a fresh 1.02b loader, create an admin user and server name matching the live system, DON'T create a volume/RAID. Update DSM to match your pre-beta version - Did this

Actually, I tried to install a PAT version earlier than the last version I was running. Syno Assistant told me which version I had to install, and so I did.

4) Shut down, reconnect the RAID drives to SATA ports 2-n - Did this, but I shut the system down at about 10% of the rebuild process. It occurred to me that leaving the spare HDD in to be part of the rebuild is redundant.

5) Boot, log in, repair the system partitions on the RAID, shut down - Did this

6) Remove the spare HDD and reconnect the RAID drives to SATA ports 1-n - Did this

7) Boot, repair packages, etc.

 

 

My additional steps would be to power down the server between drive swaps and to update DSM on the clean drive to the version matching the one that was running before the crash. 

 

The above was posted by SBV3000, and many thanks. I did make a couple of changes to shorten the process. It worked for me, but no warranty is implied.

Everything marked "Did this" (and the notes after it) is mine; all else belongs to SBV3000 and others.

On 7/20/2018 at 2:03 AM, manfriday said:

This is a variation of the 'Clean Drive' downgrade process I've described before in several threads.

 

1) Disconnect your RAID drives - Did this

2) Connect the spare HDD to SATA port 1 - Did this

3) Install DSM 6.x from a fresh 1.02b loader, create an admin user and server name matching the live system, DON'T create a volume/RAID. Update DSM to match your pre-beta version - Did this

Actually, I tried to install a PAT version earlier than the last version I was running. Syno Assistant told me which version I had to install, and so I did.

4) Shut down, reconnect the RAID drives to SATA ports 2-n - Did this, but I shut the system down at about 10% of the rebuild process. It occurred to me that leaving the spare HDD in to be part of the rebuild is redundant.

5) Boot, log in, repair the system partitions on the RAID, shut down - Did this

6) Remove the spare HDD and reconnect the RAID drives to SATA ports 1-n - Did this

7) Boot, repair packages, etc.

 

 

My additional steps would be to power down the server between drive swaps and to update DSM on the clean drive to the version matching the one that was running before the crash. 

 

The above was posted by SBV3000, and many thanks. I did make a couple of changes to shorten the process. It worked for me, but no warranty is implied.

Everything marked "Did this" (and the notes after it) is mine; all else belongs to SBV3000 and others.

I followed your steps and managed to reinstall DSM 6.1 with a new USB stick and a blank SSD. However, when I insert my original RAID drives (3 out of 4) into the remaining bays, there is no option for me to repair the system partition. Can anyone offer some advice? Thank you so much.

 

HP MicroServer Gen8

RAID: SHR

 

5 hours ago, Kelvin616 said:

I followed your steps and managed to reinstall DSM 6.1 with a new USB stick and a blank SSD. However, when I insert my original RAID drives (3 out of 4) into the remaining bays, there is no option for me to repair the system partition. Can anyone offer some advice? Thank you so much.

 

HP MicroServer Gen8

RAID: SHR

 

I followed the steps from other posters based on a 6.2 upgrade failure. My hardware is Intel-based (J3455M) vs. your AMD (I don't know if this makes a difference). Perhaps try another recovery process from earlier in this thread?


So how crucial is using your original synoboot.img?

 

Mine was backed up on said NAS drive that I can't access yet.

 

Is it OK to just create another one, which will likely have a different generated serial?

 

 

~ Kev

On 7/31/2018 at 8:00 AM, Kelvin616 said:

I followed your steps and managed to reinstall DSM 6.1 with a new USB stick and a blank SSD. However, when I insert my original RAID drives (3 out of 4) into the remaining bays, there is no option for me to repair the system partition. Can anyone offer some advice? Thank you so much.

 

HP MicroServer Gen8

RAID: SHR

 

Glad it worked. You are correct about the spare drive not being needed for the RAID rebuild; that's why I say don't create a volume on it, as it avoids volume/RAID naming issues. What this process does is make a clean 'master' copy of DSM on the system partition of the clean drive, using the standard install process (i.e. no need to mount anything with Ubuntu, mdadm, etc.). DSM then finds 'corrupted' system partitions on the other drives (because of the version mismatch), and in Storage Manager there should be the option to 'Repair System Partition', which when clicked warns you to keep DRIVE 1 connected, etc. On boot, the system looks for a 'good' copy of DSM on the drive on the lowest SATA channel, hence the use of SATA port 1, etc.
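
If you want to see that version mismatch for yourself, DSM records its version in /etc.defaults/VERSION on the system partition. Below is a small Python sketch that prints it; pointing it at a manually mounted copy of another drive's first partition (the mount path in the comment is hypothetical) shows the older/newer build that triggers the 'Repair System Partition' offer. Treat the key names as an assumption based on the usual DSM layout:

```python
#!/usr/bin/env python3
"""Print the DSM version recorded in an /etc.defaults/VERSION file.

With no argument it reads the running system's copy; pass the path of a
manually mounted system partition (e.g. /mnt/olddrive/etc.defaults/VERSION,
a hypothetical mount point) to compare versions across drives. Key names
(majorversion, minorversion, buildnumber) follow the usual DSM layout.
"""
import sys

def read_version(path: str) -> dict:
    info = {}
    with open(path) as f:
        for line in f:
            if "=" in line:
                key, _, value = line.partition("=")
                info[key.strip()] = value.strip().strip('"')
    return info

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "/etc.defaults/VERSION"
    v = read_version(path)
    print(f"{path}: DSM {v.get('majorversion')}.{v.get('minorversion')} "
          f"build {v.get('buildnumber')}")
```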

15 hours ago, GBKev said:

So how crucial is using your original synoboot.img?

 

Mine was backed up on said NAS drive that I can't access yet.

 

Is it OK to just create another one, which will likely have a different generated serial?

 

 

~ Kev

It shouldn't make any difference.


Server back online.  All data intact!  I owe you guys!!  Thanks a million.


I bricked my NAS because, you know, stupidity, and I'm desperately trying to get it back by downgrading.

I'm able to install DSM on the clean drive, reboot, and re-plug one of my original drives, but all I get is the Recover button.

It never boots back into the clean install of DSM.

Not sure what I'm doing wrong.

