Decairn
Dec 1, 2007

Xaris posted:

... Docker and NZBGet stuff ...

The Docker applications are not Synology specific and therefore have a bigger install/support base, and they're generally very fast to update to the latest builds. They also have a common setup through Docker for your network and directory preferences, which is probably a bit more flexible if you need that option.
In the DSM Docker application, under the Container sub-menu, right-click the container and do a settings export. That saves the config settings - save it somewhere you back up.
Docker backup - in the docker file share, make a directory for each container and point the container at it for its config and runtime data. You can back up each of those, or, depending on the container application, a subset of what's in there. The Hyper Backup utility allows for directory-level selection. It's what I do.
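Roughly, that layout looks like this as a docker-compose sketch (assuming the linuxserver/nzbget image and a /volume1/docker share; PUID/PGID, timezone and the download path are placeholders for your own setup):

code:
version: "3"
services:
  nzbget:
    image: linuxserver/nzbget        # assumption: linuxserver.io image, use whatever you actually run
    container_name: nzbget
    environment:
      - PUID=1026                    # placeholder: your DSM user's UID
      - PGID=100                     # placeholder: your DSM user's GID
      - TZ=Europe/London             # placeholder
    volumes:
      - /volume1/docker/nzbget:/config    # per-container config dir, easy to pick up in Hyper Backup
      - /volume1/downloads:/downloads     # placeholder download share
    ports:
      - "6789:6789"                  # NZBGet web UI
    restart: unless-stopped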

NZBGet can be finicky. Here are my config items of note, running in a Docker container on a 918+: it gets 50-60MB/s without a DSM SSD cache, and jumped to 70-80MB/s on a 1Gb link once SSD cache drives were added. Ensure the news server setting has connections set to 50, or whatever that provider allows. If you have multiple servers, put them in the same level group so they can download in parallel. Turn FlushQueue off. Set ContinuePartial=No. Set ArticleCache=1024. Set DirectWrite=Yes. Set WriteBuffer=1024. PostStrategy=Balanced. DirectUnpack=No. UnpackPauseQueue=Yes. I also have the Completion and FailureLink extension scripts loaded for some added smarts on downloading.
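In nzbget.conf terms, those settings look roughly like this (only the keys mentioned above, everything else left at defaults; the server entry is a placeholder):

code:
# download performance
ArticleCache=1024
DirectWrite=yes
WriteBuffer=1024
FlushQueue=no
ContinuePartial=no
# unpacking / post-processing
PostStrategy=balanced
DirectUnpack=no
UnpackPauseQueue=yes
# news server (connections set to whatever your provider allows)
Server1.Connections=50
Server1.Level=0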

Decairn fucked around with this message at 17:34 on May 6, 2020


Xaris
Jul 25, 2006

Lucky there's a family guy
Lucky there's a man who positively can do
All the things that make us
Laugh and cry

Decairn posted:

The Docker applications are not Synology specific and therefore have a bigger install/support base, and they're generally very fast to update to the latest builds. They also have a common setup through Docker for your network and directory preferences, which is probably a bit more flexible if you need that option.
In the DSM Docker application, under the Container sub-menu, right-click the container and do a settings export. That saves the config settings - save it somewhere you back up.
Docker backup - in the docker file share, make a directory for each container and point the container at it for its config and runtime data. You can back up each of those, or, depending on the container application, a subset of what's in there. The Hyper Backup utility allows for directory-level selection. It's what I do.

NZBGet can be finicky. Here are my config items of note, running in a Docker container on a 918+: it gets 50-60MB/s without a DSM SSD cache, and jumped to 70-80MB/s on a 1Gb link once SSD cache drives were added. Ensure the news server setting has connections set to 50, or whatever that provider allows. If you have multiple servers, put them in the same level group so they can download in parallel. Turn FlushQueue off. Set ContinuePartial=No. Set ArticleCache=1024. Set DirectWrite=Yes. Set WriteBuffer=1024. PostStrategy=Balanced. DirectUnpack=No. UnpackPauseQueue=Yes. I also have the Completion and FailureLink extension scripts loaded for some added smarts on downloading.

Hey, thanks for this! :tipshat: Gonna look into backing up that stuff. It wasn't too hard to install, but I imagine it's sort of annoying to set all the API keys and ports and PUIDs and stuff like that again, so if I could just pop those settings right in next time, that'd be swell. I kinda hosed up my Storage Pool: I really don't need SHR on all 4 of my drives. I could probably keep 2 on SHR for personal files/photos (+ cloud backup) and the other 2 in a non-RAID storage pool for movies/TV that I can just redownload if they fail. So I may be looking at backing up most of it and wiping DSM for a second time sometime soon, unless there's an easy way to downsize an SHR and remove a few drives from it without it going into Degraded status, but a cursory Google says that's not really possible.

I'm surprised an SSD cache really makes that big of a difference. That's the M.2 NVMe slot at the bottom that you can pop one into, right? I did run a speedtest on the NAS and was getting pretty much my full gigabit speed, but I guess that's not the same thing as actually downloading files. I think SABnzbd is working well at the moment, but I'll keep your post saved here in case I need to try NZBGet again.

Xaris fucked around with this message at 19:00 on May 6, 2020

Tanbo
Nov 19, 2013

For those who want an alternative, or just another request method, there's another one that was released a while ago: Requestrr. https://github.com/darkalfx/requestrr
https://i.imgur.com/zrb5i4j.png

Similar to Ombi, just Discord-bot based. It can interface with Ombi (using the users already set up there, or its own), or directly with Radarr/Sonarr. It's an easy way to request without opening ports or firing up a VPN.
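It ships as a Docker image too; a quick-start along these lines should work (purely a sketch - the 4545 web UI port and the config path are assumptions on my part, so check the README for the current values):

code:
# hypothetical quick-start; verify the port and config path against the Requestrr README
docker run -d \
  --name requestrr \
  -p 4545:4545 \
  -v /path/to/requestrr/config:/root/config \
  darkalfx/requestrr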

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness
I'm following a weekly show that isn't supported by Sonarr and I'm wondering if I could still automate it somehow. The releases are mostly named like this:

BRAND.SHOWNAME.YYYY.MM.DD.USAN.720p.WEB.h264-HEEL

with the “YYYY.MM.DD” obviously being the air date of the episode. I used to have an RSS feed set up that would grab every new release that included the brand, show name, and 720p, but I eventually abandoned that solution because it led to a lot of duplicates. I'm wondering if it's possible to do something like that RSS feed, but in a way where SABnzbd doesn't grab a release if one with an identical filename already exists in its history. Ideally, I would like it so that if the following releases of the show get posted (both are the same episode):

BRAND.SHOWNAME.YYYY.MM.DD.720p.HDTV.x264-Star
BRAND.SHOWNAME.YYYY.MM.DD.720p.WEB.h264-HEEL

it would grab whichever one gets posted first and skip the other, though I imagine something like this won't be possible.
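For what it's worth, the matching you're describing boils down to keying each release on brand + show + air date and ignoring the source/group suffix. A rough Python sketch of that logic (purely illustrative - this isn't an existing SABnzbd option, and the release names are the placeholder ones above):

code:
import re

# Episodes already grabbed, keyed on BRAND.SHOWNAME.YYYY.MM.DD only,
# so the HDTV.x264-Star and WEB.h264-HEEL posts count as the same episode.
seen = set()

def episode_key(release_name):
    """Return BRAND.SHOWNAME.YYYY.MM.DD from a release name, or None if it doesn't match."""
    m = re.match(r"^(BRAND\.SHOWNAME\.\d{4}\.\d{2}\.\d{2})\.", release_name)
    return m.group(1) if m else None

def should_grab(release_name):
    key = episode_key(release_name)
    if key is None or key in seen:
        return False   # unrelated release, or this episode was already grabbed
    seen.add(key)
    return True

# Whichever post shows up first wins; the later duplicate is skipped:
print(should_grab("BRAND.SHOWNAME.2020.05.06.720p.HDTV.x264-Star"))  # True
print(should_grab("BRAND.SHOWNAME.2020.05.06.720p.WEB.h264-HEEL"))   # False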

EL BROMANCE
Jun 10, 2006

COWABUNGA DUDES!
🥷🐢😬



I thought wrestling stuff was generally carried by TVDB now, and thus supported in Sonarr. I don't watch that particular one, but my setup picks up their Wednesday one fine, although it sometimes needs a nudge to import after grabbing.

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness

EL BROMANCE posted:

I thought wrestling stuff was generally carried by TVDB now, and thus supported in Sonarr. I don't watch that particular one, but my setup picks up their Wednesday one fine, although it sometimes needs a nudge to import after grabbing.

I only tried Sonarr with it some years ago, so I wasn't aware of this change - thanks for the heads up!

Xaris
Jul 25, 2006

Lucky there's a family guy
Lucky there's a man who positively can do
All the things that make us
Laugh and cry

Tanbo posted:

For those who want an alternative, or just another request method, there's another one that was released a while ago: Requestrr. https://github.com/darkalfx/requestrr
https://i.imgur.com/zrb5i4j.png

Similar to Ombi, just Discord-bot based. It can interface with Ombi (using the users already set up there, or its own), or directly with Radarr/Sonarr. It's an easy way to request without opening ports or firing up a VPN.

Oh hey, that sounds pretty neat, thanks for the tip. I've just recently started using Ombi, but tbf it's kind of a pain to initially get it set up and give out the info, and most of all, to actually have people use it, or remember to use it. It's probably better just to have people message me, but this may be a better option, because everyone has Discord and it's more accessible without port forwarding and account stuff.

I kind of wish I could just use some sort of Plex plugin that integrates an Ombi-like feature to add requests into sonarr/radarr.

Thermopyle
Jul 1, 2003

...the stupid are cocksure while the intelligent are full of doubt. —Bertrand Russell

Xaris posted:

and most of all, to actually have people use it, or remember to use it.

Welcome to every startup's problem. :(

Tanbo
Nov 19, 2013

Xaris posted:

Oh hey, that sounds pretty neat, thanks for the tip. I've just recently started using Ombi, but tbf it's kind of a pain to initially get it set up and give out the info, and most of all, to actually have people use it, or remember to use it. It's probably better just to have people message me, but this may be a better option, because everyone has Discord and it's more accessible without port forwarding and account stuff.

I kind of wish I could just use some sort of Plex plugin that integrates an Ombi-like feature to add requests into sonarr/radarr.

I set up a whole category for media-related stuff in my server: one channel for reporting fakes/faulty files/wrong subs/whatever, another channel where Radarr notifies on additions, and one for Sonarr, though only for a select few shows since I didn't want it spammed too much. Also a channel for "announcements". I still prefer Ombi, but I almost always have Discord open, and I don't always have a tab open for Ombi/Organizr, so it's a bit more convenient to use sometimes.

I also have a channel for Bazarr notifications - I was curious how much of the subtitle work it's doing compared to Emby, and it looks like not much. I want to have Bazarr handle them and disable subtitle downloading in Emby, since Bazarr has features like upgrading subs, but I'm not entirely sure how quickly Bazarr downloads subtitles - is it just once a day, or as soon as something comes in on a notification from Radarr/Sonarr? Edit: looks like every 3 hours based on the settings, and Emby only does it daily. I'll just disable it in Emby and see how it goes.

Tanbo fucked around with this message at 05:34 on May 12, 2020

Burden
Jul 25, 2006

Tanbo posted:

For those who want an alternative, or just another request method, there's another one that was released a while ago: Requestrr. https://github.com/darkalfx/requestrr
https://i.imgur.com/zrb5i4j.png

Similar to Ombi, just Discord-bot based. It can interface with Ombi (using the users already set up there, or its own), or directly with Radarr/Sonarr. It's an easy way to request without opening ports or firing up a VPN.

Does this allow you to approve what gets downloaded first, or does it just send everything straight to Radarr and Sonarr? I like to put shows and movies in certain folders and apply certain settings to them depending on what they are.

Tanbo
Nov 19, 2013

Burden posted:

Does this allow you to approve what gets downloaded first, or does it just send everything straight to Radarr and Sonarr? I like to put shows and movies in certain folders and apply certain settings to them depending on what they are.

Not directly, no. But you can connect it to Ombi instead of Radarr/Sonarr and set up approvals there. You can also link Discord users with Ombi users, so quotas, limits, etc. all still work properly, and you can see who each request is coming from.

Being able to manage the approval process in Ombi is on their feature tracker, near the top in votes, though probably not a short term goal.

quote:

https://app.gitkraken.com/glo/view/card/d0e188f3c18e4917a14ca15920ae6c7a
Allow Admins? to View/Approve/Deny Ombi requests via their configured chat clients


Edit: Also did a little testing with subtitles - Bazarr and Emby both search for and download subs immediately, assuming they're available. The 3-hour timeframe in Bazarr (daily in Emby) is for subtitles that aren't available at that point.

Tanbo fucked around with this message at 05:33 on May 12, 2020

Vykk.Draygo
Jan 17, 2004

I say salesmen and women of the world unite!
I think I don't understand how release profiles work in Sonarr. How can I make it so that Sonarr prefers Amazon rips but will download something else if it can't find any, and will upgrade if it's found at a later time?

EL BROMANCE
Jun 10, 2006

COWABUNGA DUDES!
🥷🐢😬



Vykk.Draygo posted:

I think I don't understand how release profiles work in Sonarr. How can I make it so that Sonarr prefers Amazon rips but will download something else if it can't find any, and will upgrade if it's found at a later time?

My setup is a release profile I call 'webscore', which has the following preferred terms and scores:

.AMZN. 50
.IT. 40
.ITUNES. 40
.NF. 40
.HULU. 40
.DSNY. 40
.WEB. 30

which I then add to every show I track, and it works well. You can tell when it's in effect because if you do a manual search for an episode you'll now see scores next to the results. The scores I've given are a bit arbitrary - I was expecting to have to amend them quite a few times, but honestly this is essentially my day-one attempt with a few tweaks (adding the dots was important!).

I've noticed that .WEB. releases now often come from better sources than the tag indicates, so I get some duplication, but it's not a huge amount and I'm uncapped so I don't really care.

Vykk.Draygo
Jan 17, 2004

I say salesmen and women of the world unite!
That's smart. Thanks.

Sub Rosa
Jun 9, 2010




EL BROMANCE posted:

My setup is a release profile I call 'webscore', which has the following preferred terms and scores:

.AMZN. 50
.IT. 40
.ITUNES. 40
.NF. 40
.HULU. 40
.DSNY. 40
.WEB. 30

which I then add to every show I track, and it works well. You can tell when it's in effect because if you do a manual search for an episode you'll now see scores next to the results. The scores I've given are a bit arbitrary - I was expecting to have to amend them quite a few times, but honestly this is essentially my day-one attempt with a few tweaks (adding the dots was important!).

I've noticed that .WEB. releases now often come from better sources than the tag indicates, so I get some duplication, but it's not a huge amount and I'm uncapped so I don't really care.

How do you do this? The only profile stuff I see isn't configurable like this?

Thermopyle
Jul 1, 2003

...the stupid are cocksure while the intelligent are full of doubt. —Bertrand Russell

Sub Rosa posted:

How do you do this? The only profile stuff I see isn't configurable like this?

You need to be running sonarr v2 v3.


VVV Oops, my bad!

Thermopyle fucked around with this message at 23:28 on May 18, 2020

EL BROMANCE
Jun 10, 2006

COWABUNGA DUDES!
🥷🐢😬



Thermopyle posted:

You need to be running sonarr v3.

I'm sure Thermo just made a typo, but don't want anyone to be confused.

more falafel please
Feb 26, 2005

forums poster

Maybe y'all can help. I'm running Sonarr/Radarr/NZBGet on a beefy PC, storing on a USB external HDD, and then playing on an RPi3 with OSMC (so Kodi), mounting the shared external drive over SMB. The RPi is on WiFi; the PC is hardwired cat6 to the router.

I have trouble with buffering on almost anything that downloads in 1080p. I know the RPi can't handle x265, so I exclude that, but even x264 stuff will get choppy and I'll have to pause so it can buffer. My best guess is that it's bitrate/filesize related rather than decoding related, because bigger files tend to cause more problems.

The easiest fix I've found is to just prioritize 720p profiles in the Any quality category, which seems to work fine, but it seems like the RPi3 should be able to handle 1080p video, the network should be able to handle ~2.5 gigs over 45 minutes, and the HDD and its USB interface should be able to as well, considering I can download the same file to the same drive in about 5 minutes.

What might be my bottleneck? The WiFi? My PS4 is able to get 1080p video over WiFi an inch or two away from the RPi.
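For a rough sense of the numbers (back-of-envelope only): 2.5 GB over 45 minutes is under 8 Mbit/s on average, which even the Pi 3's weak 2.4 GHz WiFi should manage - but streaming has to keep up with the peak bitrate, not the average, which is where a marginal link falls over:

code:
# back-of-envelope average bitrate for a 2.5 GB episode over 45 minutes
size_gb = 2.5
minutes = 45
avg_mbps = size_gb * 8 * 1000 / (minutes * 60)   # GB -> gigabits -> megabits, per second
print(f"{avg_mbps:.1f} Mbit/s average")           # ~7.4 Mbit/s
# Peaks in a 1080p x264 encode can run several times the average, and the
# Pi 3's built-in 2.4 GHz WiFi often sustains only ~20-40 Mbit/s in practice,
# so a noisy link plus a small client-side buffer means stalls.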

norp
Jan 20, 2004

TRUMP TRUMP TRUMP

let's invade New Zealand, they have oil
I wish they'd hurry up and release v3. I use FreeBSD packages rather than Docker, so the upgrade will be a PITA.

Skarsnik
Oct 21, 2008

I...AM...RUUUDE!




norp posted:

I wish they'd hurry up and release v3. I use FreeBSD packages rather than Docker, so the upgrade will be a PITA.

As long as you've got the right mono packages installed, it's just a case of downloading, extracting, and running it.

I'm not sure what mechanism FreeBSD uses to launch services, but neither init nor systemd is hard to figure out.

It wasn't complicated on my old CentOS server, anyway.
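On the systemd side it really is only a few lines; a minimal unit sketch, assuming Sonarr v3 is extracted to /opt/Sonarr and runs as a dedicated sonarr user (the mono path, data directory, and user are placeholders, not a canonical unit file):

code:
[Unit]
Description=Sonarr v3
After=network.target

[Service]
User=sonarr
Group=sonarr
# paths are placeholders for wherever you extracted it and keep its database
ExecStart=/usr/bin/mono --debug /opt/Sonarr/Sonarr.exe -nobrowser -data=/var/lib/sonarr
Restart=on-failure

[Install]
WantedBy=multi-user.target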

UltimoDragonQuest
Oct 5, 2011



more falafel please posted:

What might be my bottleneck? The WiFi? My PS4 is able to get 1080p video over WiFi an inch or two away from the RPi.
Pi WiFi speed is terrible. You might get slightly better results using Plex Server on your PC and the Pi as a client. But at that point you can just use Plex on the PS4.

hot date tonight!
Jan 13, 2009


Slippery Tilde
I was having exactly the same issues when I set up a Pi with OSMC a few weeks ago, but I was sharing over NFS, not Samba. After a bunch of digging I found that NFS shares added through the GUI perform significantly worse than shares mounted in fstab. When I switched over, all of my buffering issues disappeared. I don't know if the same issue applies to Samba shares, but it might be worth a shot.
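For reference, the fstab version of an NFS mount is a single line (a sketch - the server address, export path, mount point, and options are placeholders/common defaults, not OSMC-specific advice):

code:
# /etc/fstab
192.168.1.10:/export/media  /mnt/media  nfs  defaults,noatime,nofail  0  0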

norp
Jan 20, 2004

TRUMP TRUMP TRUMP

let's invade New Zealand, they have oil

Skarsnik posted:

As long as you've got the right mono packages installed, it's just a case of downloading, extracting, and running it.

I'm not sure what mechanism FreeBSD uses to launch services, but neither init nor systemd is hard to figure out.

It wasn't complicated on my old CentOS server, anyway.

It's more that if I do all that, I'll have to undo it at some point in the future to go back to packages.

I can definitely just extract it over the top of the existing install, or beside it, and change the rc.d scripts appropriately.

But like I said... then I have to fix it later when they release it.

Keito
Jul 21, 2005

WHAT DO I CHOOSE ?

more falafel please posted:

Maybe y'all can help. I'm running Sonarr/Radarr/NZBGet on a beefy PC, storing on a USB external HDD, and then playing on an RPi3 with OSMC (so Kodi), mounting the shared external drive over SMB. The RPi is on WiFi; the PC is hardwired cat6 to the router.

I have trouble with buffering on almost anything that downloads in 1080p. I know the RPi can't handle x265, so I exclude that, but even x264 stuff will get choppy and I'll have to pause so it can buffer. My best guess is that it's bitrate/filesize related rather than decoding related, because bigger files tend to cause more problems.

The easiest fix I've found is to just prioritize 720p profiles in the Any quality category, which seems to work fine, but it seems like the RPi3 should be able to handle 1080p video, the network should be able to handle ~2.5 gigs over 45 minutes, and the HDD and its USB interface should be able to as well, considering I can download the same file to the same drive in about 5 minutes.

What might be my bottleneck? The WiFi? My PS4 is able to get 1080p video over WiFi an inch or two away from the RPi.

I find Kodi to be almost useless on WiFi without modifying the caching. The defaults are way too optimistic and will make you buffer with the slightest disruption.

https://kodi.wiki/view/HOW-TO:Modify_the_video_cache
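The short version from that page: drop an advancedsettings.xml into Kodi's userdata folder with a bigger cache. Something along these lines (the exact tags vary a bit by Kodi version, so check the wiki for yours; the numbers are just examples, not recommendations):

code:
<advancedsettings>
  <cache>
    <!-- buffer all filesystems, including SMB/NFS mounts Kodi would otherwise treat as local -->
    <buffermode>1</buffermode>
    <!-- RAM used for the cache, in bytes (~150 MB here); Kodi wants roughly 3x this much free -->
    <memorysize>157286400</memorysize>
    <!-- how aggressively to fill the cache relative to the video bitrate -->
    <readfactor>20</readfactor>
  </cache>
</advancedsettings>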

Skarsnik
Oct 21, 2008

I...AM...RUUUDE!




norp posted:

It's more that if I do all that, I'll have to undo it at some point in the future to go back to packages.

I can definitely just extract it over the top of the existing install, or beside it, and change the rc.d scripts appropriately.

But like I said... then I have to fix it later when they release it.

Assuming config is saved in the same place, that would involve deleting a directory and an init file.

I believe in you, you can do this 🙏

more falafel please
Feb 26, 2005

forums poster

Keito posted:

I find Kodi to be almost useless on WiFi without modifying the caching. The defaults are way too optimistic and will make you buffer with the slightest disruption.

https://kodi.wiki/view/HOW-TO:Modify_the_video_cache

Oh wow. I should mention that the Samba shares are mounted with fstab (much easier than figuring out whatever GUI tools might be there, and OSMC doesn't have much in the way of stuff outside Kodi), so I wouldn't be surprised if Kodi just thinks it's a local drive and doesn't do any caching. That would make sense.

What settings do you use?

derk
Sep 24, 2004
Man, retention is bad on NGD lately. I found a show only 157 days old and it fails. I remember when I first got NGD, I was grabbing stuff that was 4-500 days old, no problem. I'm gonna need a new indexer - what are the good ones now? NGD is nice and cheap; I have it at $40 a year, auto-renew, unlimited.

edit: I signed up for Ninja - instant success and cheaper than NGD!!! Great steal!!! I will keep both; Ninja is the backup, good for those backfills.

derk fucked around with this message at 02:58 on May 29, 2020

EL BROMANCE
Jun 10, 2006

COWABUNGA DUDES!
🥷🐢😬



Good news for macOS Catalina people: it looks like the update that rolled out for Sonarr v3 yesterday 'fixes' the permission issue that needed manual intervention. I got a prompt earlier asking to let Sonarr access my external storage, and the changelog shows

code:
New
Replaced launcher on OSX Catalina so that individual permissions can be assigned
So that makes it a bit easier. I expect Radarr Aphrodite will follow suit.

The Gunslinger
Jul 24, 2004

Do not forget the face of your father.
Fun Shoe
Anyone here farted around with Traefik in unRAID? I've been using one of the combo LetsEncrypt/nginx dockers, but stuff like NZBHydra breaks periodically and it's always a pain to troubleshoot/fix. Is Traefik easy to set up with a custom domain?

Edit: oh, I should post this in the NAS thread.

The Gunslinger fucked around with this message at 14:58 on Jun 14, 2020

Super-NintendoUser
Jan 16, 2004

COWABUNGERDER COMPADRES
Soiled Meat

The Gunslinger posted:

Anyone here farted around with Traefik in unRAID? I've been using one of the combo LetsEncrypt/nginx dockers, but stuff like NZBHydra breaks periodically and it's always a pain to troubleshoot/fix. Is Traefik easy to set up with a custom domain?

Edit: oh, I should post this in the NAS thread.

I have Traefik set up in Docker with custom domains and Let's Encrypt. It's fantastic and pretty easy. Getting wildcard certs took a bit of finagling, but that was because GoDaddy broke some of the APIs. I posted my docker-compose.yml a while back in this thread as a full example (before I set up the wildcards).

I can post my latest config with the wildcard setup if the thread is interested.
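In the meantime, the wildcard part boils down to a DNS-challenge cert resolver plus a tls.domains label on a router. A stripped-down Traefik v2 sketch of that shape (domain, email, credentials, and the whoami test service are all placeholders, not an actual working config):

code:
version: "3"
services:
  traefik:
    image: traefik:v2.2
    command:
      - --providers.docker=true
      - --entrypoints.websecure.address=:443
      - --certificatesresolvers.le.acme.email=you@example.com
      - --certificatesresolvers.le.acme.storage=/letsencrypt/acme.json
      - --certificatesresolvers.le.acme.dnschallenge=true
      - --certificatesresolvers.le.acme.dnschallenge.provider=godaddy
    environment:
      - GODADDY_API_KEY=xxxx        # placeholder credentials
      - GODADDY_API_SECRET=xxxx
    ports:
      - "443:443"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - ./letsencrypt:/letsencrypt

  whoami:                           # stand-in for any service you want behind Traefik
    image: containous/whoami
    labels:
      - traefik.enable=true
      - traefik.http.routers.whoami.rule=Host(`whoami.example.com`)
      - traefik.http.routers.whoami.entrypoints=websecure
      - traefik.http.routers.whoami.tls.certresolver=le
      # ask the resolver for a wildcard so every subdomain shares one cert
      - traefik.http.routers.whoami.tls.domains[0].main=example.com
      - traefik.http.routers.whoami.tls.domains[0].sans=*.example.com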

Skarsnik
Oct 21, 2008

I...AM...RUUUDE!




Let's Encrypt makes it so easy to add multiple domains to a cert that I've never seen much point in trying to get a wildcard to work.

I think mine has about 8 or 9 now tied to a single primary cert.
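With certbot, for instance, extra names are just extra -d flags on the same certificate (the hostnames below are placeholders):

code:
certbot certonly --nginx \
  -d media.example.com \
  -d requests.example.com \
  -d indexer.example.com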

Hughlander
May 11, 2005

Skarsnik posted:

Let's Encrypt makes it so easy to add multiple domains to a cert that I've never seen much point in trying to get a wildcard to work.

I think mine has about 8 or 9 now tied to a single primary cert.

I got API rate-limited by Let's Encrypt from configuring my system before I went to wildcard. I think I use 60 or so hostnames now that it's fully configured.

Matt Zerella
Oct 7, 2002

Norris'es are back baby. It's good again. Awoouu (fox Howl)

Skarsnik posted:

Let's Encrypt makes it so easy to add multiple domains to a cert that I've never seen much point in trying to get a wildcard to work.

I think mine has about 8 or 9 now tied to a single primary cert.

It's just easier to do a wildcard with Traefik and combine it with a wildcard DNS entry.

Then I add my subdomains in my Docker labels and it Just Works.

Skarsnik
Oct 21, 2008

I...AM...RUUUDE!




Hughlander posted:

I got API rate-limited by Let's Encrypt from configuring my system before I went to wildcard. I think I use 60 or so hostnames now that it's fully configured.

:catstare:

Irritated Goat
Mar 12, 2005

This post is pathetic.

Jerk McJerkface posted:

I have Traefik set up in Docker with custom domains and Let's Encrypt. It's fantastic and pretty easy. Getting wildcard certs took a bit of finagling, but that was because GoDaddy broke some of the APIs. I posted my docker-compose.yml a while back in this thread as a full example (before I set up the wildcards).

I can post my latest config with the wildcard setup if the thread is interested.

I think I need something better than a Pi 3. Installing most of the suite of containers makes them all run like poo poo. 😑

The Gunslinger
Jul 24, 2004

Do not forget the face of your father.
Fun Shoe

Jerk McJerkface posted:

I have Traefik set up in Docker with custom domains and Let's Encrypt. It's fantastic and pretty easy. Getting wildcard certs took a bit of finagling, but that was because GoDaddy broke some of the APIs. I posted my docker-compose.yml a while back in this thread as a full example (before I set up the wildcards).

I can post my latest config with the wildcard setup if the thread is interested.

Yeah that would be great if you have the time man, appreciate it. I'm going to try to get it going on unRAID in the next week or two.

Kin
Nov 4, 2003

Sometimes, in a city this dirty, you need a real hero.
I've just upgraded my TV (from a 50" to a 75" 4K-capable one) and was wondering if any of you have a general rule when it comes to resolution and file sizes?

I've actually been sticking to 720p just out of sheer habit, but the bigger screen's making me rethink that given how things look.

The file size obviously goes up, though, and I've seen it fluctuate quite a bit once you get into the 1080p+ range (720p generally seems to sit between 1 and 1.5GB). Is there a max file size you generally set Sonarr to aim for at 1080p, so that it finds things without downloading a 7GB file for a single episode of something?

Or should I actually be starting to look at even higher resolutions now?

Vykk.Draygo
Jan 17, 2004

I say salesmen and women of the world unite!

Kin posted:

I've just upgraded my TV (from a 50" to a 75" 4K-capable one) and was wondering if any of you have a general rule when it comes to resolution and file sizes?

I've actually been sticking to 720p just out of sheer habit, but the bigger screen's making me rethink that given how things look.

The file size obviously goes up, though, and I've seen it fluctuate quite a bit once you get into the 1080p+ range (720p generally seems to sit between 1 and 1.5GB). Is there a max file size you generally set Sonarr to aim for at 1080p, so that it finds things without downloading a 7GB file for a single episode of something?

Or should I actually be starting to look at even higher resolutions now?

I'm a stickler for quality and I've never had a problem with 1080p webrips on a 65" 4K TV. I would say use the release profiles posted towards the top of this page and set the quality to 1080p.

However, looking through the 4K offerings, I'm noticing that they're being released more frequently, and a number of them have HDR. Is there a way to tell Sonarr to download an episode in 4K ONLY if it has "HDR" in the title, and otherwise prefer 1080p?

Burden
Jul 25, 2006

Vykk.Draygo posted:

I'm a stickler for quality and I've never had a problem with 1080p webrips on a 65" 4K TV. I would say use the release profiles posted towards the top of this page and set the quality to 1080p.

However, looking through the 4K offerings, I'm noticing that they're being released more frequently, and a number of them have HDR. Is there a way to tell Sonarr to download an episode in 4K ONLY if it has "HDR" in the title, and otherwise prefer 1080p?

If you're still on version 2 of Sonarr: go to the Indexers tab, and at the bottom is Restrictions. Add HDR as the 'must contain' term, and I'd use HDR as the tag as well. Then when you add a series (but not 'add and search') and go into it, click the wrench to edit the series, add your HDR tag, and tell it to search. It won't automatically fall back to 1080p in version 2, though.

I'm pretty sure the new version of Sonarr does exactly what you're describing, but I haven't messed with it yet.


St. Blaize
Oct 11, 2007
Does anyone have any idea how long extracting a 50-75GB download should take? I'm using NZBGet in a Docker container and it seems slow at around 20-30 minutes. The host system is an HP 290 with a G4900, which may be slower than I think.

Also, my TV doesn't support DTS and I'd like to transcode the audio track - what's the easiest way to automate that?

St. Blaize fucked around with this message at 16:37 on Jun 24, 2020
