About ryerye

  1. Yes, my conclusion is that GPU passthrough will not help for thumbnail/video transcoding. Warning: don't pass through your only GPU, because ESXi does not boot without a GPU; I ended up reinstalling ESXi. Actually, the i7 of the NUC is powerful enough and the CPU conversion doesn't take too long, so after the ESXi reinstall I ended my passthrough experiments.
  2. sl0n, actually the 916+ and 918+ versions natively support GPU hardware transcoding, whereas 3615 and 3617 don't. I was just wondering whether the video conversion process would benefit from this feature.
  3. Hi guys, first I would like to share my minimalistic but powerful enough homelab configuration: an Intel NUC i7 (1TB M.2 NVMe, 32GB DDR4) hosts ESXi 6.7, with around 10 VMs (Windows, Linux, etc.) running on it for homelab testing purposes, plus the "production" Xpenology 6.2 and one test Xpenology 6.2, both 918+ running on ESXi. The production Xpenology has one external 8TB WD Red HDD attached to the NUC exclusively via ESXi passthrough, connected with a 0.5m SATA cable plus an external power supply for the 3.5" HDD. Backup is done via a USB 3 port to two 4TB 2.5" HDDs (direct passthrough to Xpenology), as the NUC has only one SATA port. I am very happy with this minimalistic solution, although I am well aware it is not for everyone.
  + low power consumption and space requirements
  + powerful enough, very good transfer rates and a responsive UI
  + all updates can be tested on the test Xpenology before upgrading the production system
  - the power pins of the SATA cable and the power supply need to be soldered (I did not find any ready-made cable for this specific purpose)
  - backups need some discipline
  Second, I have two simple questions:
  1. Would the hardware transcoding of the 918+ also speed up the video conversion process of Photo Station, i.e. the process of generating lower-quality videos and thumbnails for fast viewing?
  2. I am running 6.2 for 918+ on ESXi and I am able to pass through the Intel 620 GPU of the NUC exclusively to the Xpenology VM. Is that the prerequisite for hardware transcoding in an ESXi environment? How do I get it running, and how can I verify that it is running? Thanks in advance
  4. I have taken a real SN and MAC from an old physical Synology box and set them up on a virtual 3615xs. Still not able to activate the legally purchased camera licence; still the "Connection failed. Please check your network settings" error.
  5. I am well aware that Synology tried to limit moving licences; however, AFAIK they cancelled those limitations again and there should now be no restrictions. At least I could move the licence back from Xpenology 6.02 to the DS112. Therefore I still believe the error has something to do with Synology detecting Xpenology.
  6. Just now I also tried the MAC/SN pair that worked on bare-metal Xpenology 6.02, with the same "connection failed" message. I know I could take my DS112 box and run it for cameras only, but after the ESXi experience I just can't see myself going back to any hardware-based solution. I would love to pay for licences, but I'm not ready to go back to a hardware-based solution, neither bare-metal Xpenology nor the original Synology box; the difference in response time and user experience in general is huge.
  7. Coming back to the initial question with some answers and no solution: I bought a licence for the third camera, which worked well on the DS112 box. The same licence also worked on bare-metal Xpenology 6.02. Now that I have moved over to an ESXi-based environment, the licence does not activate anymore, giving the "connection failed" error. I have tried the SN and MAC pair generated by the Excel file available on the net. I have also tried the original MAC and SN from the DS112, still with the "connection failed" message when activating the licence. It might be that there is some detection mechanism for non-genuine hardware. I also tried falling back to the oldest possible Surveillance Station version, with no success. My workaround: I am running a separate Xpenology instance on ESXi for every pair of cameras. Not very convenient, as they have separate timelines. I am still looking for a more permanent solution, and I'm ready to buy more licences if I get it working.
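Post 3 above asks how to verify that hardware transcoding is actually available. A common unofficial check (assuming DSM running as a 918+ on ESXi with the Intel GPU passed through) is to look for the i915 render node from an SSH shell inside DSM; the device name renderD128 is the usual default and is an assumption here, not something confirmed in the posts:

```shell
# Hedged sketch: check whether a GPU render node is visible inside the VM.
# On a 918+ image with working passthrough you would typically expect
# /dev/dri/card0 and /dev/dri/renderD128 to exist.
if [ -e /dev/dri/renderD128 ]; then
    echo "render node found: hardware transcoding should be possible"
else
    echo "no render node found: transcoding will fall back to the CPU"
fi
```

If the node is missing, the GPU was not exposed to the guest (or the i915 driver did not load), and conversion jobs will generally run on the CPU instead.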