|
Do you have your API limits set up right on the indexers in hydra?
|
# ? Feb 22, 2020 09:33 |
|
derk posted:yea, transcoding 4k files is going to choke just about anything. don't offer the 4K movies to non 4K devices that can direct play them. Like i have a separate library for 4K movies, and only certain users have access to that library that i have set up their clients to play those files directly so it does NOT transcode them. Hmm, interesting idea. Will it choke modern processors too, even with Plex hardware transcoding? Seems like a pain to split things into separate libraries manually, unless that can be automated somehow? Like if a movie is 4K, put it in my movies_4k folder instead?
|
# ? Feb 22, 2020 16:42 |
|
I think a modern GPU should be able to handle 4k transcodes. I've got a Quadro P2000 in my Unraid box, and haven't run into any issues serving up 3-4 concurrent 4k hevc streams at 720p and 1080p.
|
# ? Feb 22, 2020 17:08 |
|
I have the standard SABnzbd/Sonarr/Radarr setup going in Windows (same PC as my Plex server), but I'm having issues with around 25% of my downloads not being sorted into the correct folder. They end up on my destination drive under the standard Title.Resolution.Quality.ReleaseGroup name instead of being renamed into Show (Year)\Season folders. I assume this has something to do with something holding the files or folder open so they can't be renamed. Any ideas on what would be causing this? In my Plex server settings, I have "scan my library automatically" (when changes are detected to library folders) and "run a partial scan when changes are detected" (when changes to library folders are detected, only scan the folder that changed). The "scan my library periodically" option is unchecked. Is there any reason to keep the first two both checked, or should I just use one of them? Could this scan lock the download folders so they can't be renamed and sorted properly?
|
# ? Feb 22, 2020 22:10 |
|
Jerk McJerkface posted:This here string: Thanks again for this, I was able to port my setup using your Compose as a starting base with a minimum of fuckery, preserving all the backend container volume data so the move was as transparent to services as could be. Truly, we live in an age of wonders... Now if only apps like NZB360 could support OAuth2, it'd be perfect. Jesse Iceberg fucked around with this message at 23:39 on Feb 22, 2020 |
# ? Feb 22, 2020 23:36 |
|
Jesse Iceberg posted:Thanks again for this, I was able to port my setup using your Compose as a starting base with a minimum of fuckery, preserving all the backend container volume data so the move was as transparent to services as could be. It sort of works if you auth through the browser and then open the app. You can also set up a direct "PORT:" right on the container, and it will bypass Traefik entirely if you go to the port, like I have set up for Calibre in that example (8080:8080). That will allow NZB360 and whatever else you use to work if you configure it to reach them directly. I'd advise against opening up those ports on your firewall and just letting the entire world get directly at your stack, but internally (or over a VPN I guess) it'd work.
|
# ? Feb 23, 2020 01:06 |
|
Does anyone still use the Completion.py script with nzbget? Mine is throwing errors and the dev said he won't update it for Python 3, but I don't know if it's even necessary anymore.
|
# ? Feb 23, 2020 04:22 |
|
I use it and it does what it's meant to. I'm running NZBGet in Docker with no Python issues; I assume it's using the Python environment inside the container.
|
# ? Feb 23, 2020 19:28 |
|
Anyone know what backbone premiumize.me is on? They aren't primarily a Usenet provider, and you have a monthly quota shared between all their stuff, but I've been noticing it has been catching 99% of things that fall to it. I have it at the same priority as the frugal bonus server, since both have a 1TB monthly quota, which also makes it really solid as a backup.
|
# ? Feb 24, 2020 03:19 |
|
Henrik Zetterberg posted:Hmm, interesting idea. Will it also choke modern processors as well, even with Plex hardware transcoding? You set up two different installations of Radarr. In instance 1, you use the 4K profile and tell it to download to movies_4k. In instance 2, you use a 1080 profile and tell it to download to movies_1080. You have to add movies to both instances of Radarr if you want both copies. Each folder is added as a separate library in Plex.
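The two-instance idea can be sketched as a docker-compose fragment; the linuxserver image and the host paths here are assumptions, so adjust them to your own setup:

```yaml
version: "3"
services:
  radarr-1080:
    image: linuxserver/radarr
    ports:
      - "7878:7878"
    volumes:
      - ./config/radarr-1080:/config
      - /mnt/media/movies_1080:/movies
      - /mnt/downloads:/downloads
  radarr-4k:
    image: linuxserver/radarr
    ports:
      - "7879:7878"   # second instance exposed on a different host port
    volumes:
      - ./config/radarr-4k:/config
      - /mnt/media/movies_4k:/movies
      - /mnt/downloads:/downloads
```

Point the first instance at a 1080p quality profile and the second at a 4K profile, then add movies_1080 and movies_4k as separate libraries in Plex.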
|
# ? Feb 24, 2020 07:21 |
|
Sub Rosa posted:Anyone know what backbone premiumize.me is on? They aren't primarily a Usenet provider, and you have a monthly quota shared between all their stuff. But I've been noticing it has been catching 99% of things that fall to it. I have it as same priority as the frugal bonus server since both have a 1TB monthly quota which is also really impeccable as a backup. what is the server address of the premiumize server? i thought you had to use their special software?
|
# ? Feb 24, 2020 16:50 |
|
Henrik Zetterberg posted:Hmm, interesting idea. Will it also choke modern processors as well, even with Plex hardware transcoding? really not a pain at all. have you ever used a .plexignore file before? these files are essential for keeping your 4K movies from showing up in Plex libraries not meant for 4K capable users/clients, especially if your 4K folder is inside your main movie folder like mine is.
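For anyone who hasn't seen one, a .plexignore is just a text file of glob patterns dropped into a library folder; a sketch, assuming the 4K rips sit in a 4k subfolder and carry 2160p in their filenames (both of those are assumptions about your layout):

```
# .plexignore in the root of the non-4K movie library
4k/*
*2160p*
```

Plex skips anything matching these patterns when scanning that library, so the 4K copies never show up there.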
|
# ? Feb 24, 2020 17:46 |
|
SlipperyNipple posted:what is the server address of the premiumize server? i thought you had to use there special software? usenet.premiumize.me, and no, I use it with NZBGet. You strangely get a lot for your money: Usenet, 1TB of cloud storage, VPN, international VoIP, they download torrents for you so your IP is never in the swarm, and they let you download from a lot of file hosts without premium. I found out about it when I was messing with Kodi boxes because it is supported as a debrid service, and kept it long after I gave up on Kodi in favor of just running my own auto-fed Plex server. Edit: Might as well throw out my referral link https://www.premiumize.me/ref/390043947 as we both get 15 bonus days of premium if you buy premium time. Sub Rosa fucked around with this message at 19:01 on Feb 24, 2020 |
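If it helps anyone, the relevant nzbget.conf server block looks roughly like this; the port, connection count, and Level (priority) here are assumptions, and the credentials are placeholders:

```
Server1.Name=premiumize
Server1.Level=1
Server1.Host=usenet.premiumize.me
Server1.Port=563
Server1.Encryption=yes
Server1.Username=your-customer-id
Server1.Password=your-pin
Server1.Connections=10
```

Give it the same Level as the frugal bonus server if you want NZBGet to treat them as equal-priority backups.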
# ? Feb 24, 2020 18:58 |
|
derk posted:really not a pain at all. I have not used that file before but will check it out, thanks!
|
# ? Feb 24, 2020 19:11 |
|
so uh, Usenet has always somewhat intrigued me. I've got a reader working and have successfully downloaded some files, but is this it? Is it basically old-style torrenting without having to faff about with seed/seeder shenanigans? Is a good indexer basically the same as a good tracker?
|
# ? Feb 29, 2020 04:12 |
|
SolusLunes posted:so uh Usenet has always somewhat intrigued me, I've got a reader working, have successfully downloaded some files but Basically, nobody uses a "reader" for downloading from usenet. They use tools like sonarr along with nzbget to automate the poo poo out of it.
|
# ? Feb 29, 2020 05:24 |
|
Goons were very mean the last time people talked about Usenet for actual reading of Usenet posts. Repent and use Google Groups!
|
# ? Feb 29, 2020 05:42 |
|
SolusLunes posted:Is a good indexer basically the same as a good tracker? I'm not entirely sure what you're looking for that you're not getting. Usenet means not having to seed back to ratio or time limits; it's purely one way. E: ah, think I misread you. I thought you were asking along the lines of 'what's the point'. EL BROMANCE fucked around with this message at 15:22 on Feb 29, 2020 |
# ? Feb 29, 2020 05:52 |
|
Thermopyle posted:Basically, nobody uses a "reader" for downloading from usenet. They use tools like sonarr along with nzbget to automate the poo poo out of it. I actually still have a reader (GrabIt) because a lot of stuff posted in music groups doesn't get indexed, but I am definitely a rarity.
|
# ? Feb 29, 2020 08:35 |
|
Yeah that's pretty dumb to think newsreaders no longer have a place. Not everything you might want gets indexed and not everything is on your indexer.
|
# ? Feb 29, 2020 09:44 |
|
BeastOfExmoor posted:I actually still have a reader (GrabIt) because a lot of stuff posted in music groups doesn't get indexed, but I am definitely a rarity. What's your rationale for not running a local indexer?
|
# ? Feb 29, 2020 14:41 |
|
Former Human posted:Yeah that's pretty dumb to think newsreaders no longer have a place. Not everything you might want gets indexed and not everything is on your indexer. I didn't say they didn't have a place.
|
# ? Feb 29, 2020 16:00 |
|
I should investigate that because there’s stuff I want that isn’t out there that I feel like just has to be somewhere.
|
# ? Feb 29, 2020 18:42 |
|
UsenetExpress is having a sale: 4 years unlimited for $95. https://members.usenetexpress.com/signup/?coupon=LEAPYEAR Or 1 year for $29. https://members.usenetexpress.com/signup/?coupon=LEAP1YR Just saw it on reddit.
|
# ? Feb 29, 2020 21:44 |
|
Greatest Living Man posted:What's your rationale for not running a local indexer? I'm not familiar with that at all, but I doubt it would improve my process. Right now I basically download headers for a few genre specific groups every few months and quickly scroll through to see if there's anything I want.
|
# ? Feb 29, 2020 22:15 |
|
Whoa sweet. Burden posted:Usenet express is having a sale 4 years unlimited to $95. Looks like this is on a different backbone than my unlimited ThunderNews. Is it worth it to get a second provider?
|
# ? Feb 29, 2020 22:22 |
|
UsenetExpress only has 1100 days retention, servers are all in the US, and the VPN service is spotty. It's cheap for a reason. https://www.techradar.com/reviews/usenetexpress
|
# ? Mar 1, 2020 05:56 |
|
Former Human posted:UsenetExpress only has 1100 days retention, servers are all in the US, and the VPN service is spotty. It's cheap for a reason. it is also the same as newsgroupdirect, which doesn't exactly have glowing reviews even just in this topic as far as retention goes. 1100 days is a very generous number for them to advertise, probably borderline false advertising. also i don't think i would want to pay for 4 years upfront of anything, especially usenet. end of the year prices will probably be 20 bucks a year the way they keep dropping. SlipperyNipple fucked around with this message at 18:22 on Mar 1, 2020 |
# ? Mar 1, 2020 18:14 |
|
In other news (lol get it) Altopia finally achieved 16 days binary retention after 25 years in business
|
# ? Mar 1, 2020 20:55 |
|
A friend wants me to help them get set up. They’re competent but not as tech savvy as most of us. I was going to set them up with Docker, with everything inside a docker-compose file, because I figure it’s easier for them to have one ‘program’ to open, and updating is as easy as stopping and restarting the containers, versus having them maintain and update NZBGet, Sonarr, Radarr, Hydra and Plex individually and worry about mono versions etc. I know for me everything has got easier to manage since I migrated to Docker, but I’m a much heavier user than I expect them to ever be. Is this overkill and am I going about this the wrong way? Would I be better off setting them up with everything running outside of containers and telling them to just update from inside the apps?
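For context, the kind of skeleton I had in mind; the linuxserver images and the host paths are assumptions, and per-app environment variables are left out:

```yaml
version: "3"
services:
  nzbget:
    image: linuxserver/nzbget
    ports: ["6789:6789"]
    volumes: ["./config/nzbget:/config", "/data:/data"]
  sonarr:
    image: linuxserver/sonarr
    ports: ["8989:8989"]
    volumes: ["./config/sonarr:/config", "/data:/data"]
  radarr:
    image: linuxserver/radarr
    ports: ["7878:7878"]
    volumes: ["./config/radarr:/config", "/data:/data"]
  hydra:
    image: linuxserver/nzbhydra2
    ports: ["5076:5076"]
    volumes: ["./config/hydra:/config"]
  plex:
    image: linuxserver/plex
    network_mode: host
    volumes: ["./config/plex:/config", "/data/media:/media"]
```

The whole stack then starts and stops as one unit with docker-compose up -d and docker-compose down.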
|
# ? Mar 2, 2020 15:46 |
|
Tea Bone posted:A friend wants me to help them get set up. They’re competent but not as tech savvy as most of us. Why don't you ask your friend what he/she actually wants? Explain the pros and cons of each solution, show them how to use Docker if they're interested, and go from there. If you're gonna be the one maintaining/upgrading their solution then sure, install whatever you want. Otherwise no.
|
# ? Mar 2, 2020 16:00 |
|
Tea Bone posted:A friend wants me to help them get set up. They’re competent but not as tech savvy as most of us. Never update dockers in a container. They're ephemeral. You can probably automate the update process with a cron job or a script.
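The cron approach can be as small as one line; the compose file location and the schedule here are assumptions:

```
# crontab entry: pull new images and recreate changed containers
# every Monday at 4am (path is a placeholder)
0 4 * * 1  cd /home/friend/mediastack && docker-compose pull && docker-compose up -d
```

up -d only recreates containers whose image actually changed, so it is safe to run repeatedly.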
|
# ? Mar 2, 2020 16:05 |
|
Matt Zerella posted:Never update dockers in container. They're ephemeral. You can probably automate the update process with a cron job, or a script. Yeah, sorry this is what I meant, personally I think it's easier to recreate the containers than it is to update each app individually.
|
# ? Mar 2, 2020 16:13 |
|
Matt Zerella posted:Never update dockers in container. They're ephemeral. You can probably automate the update process with a cron job, or a script. There is a dockerised tool for this called watchtower.
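Watchtower itself runs as a container and watches the others over the Docker socket; a minimal invocation looks like this (containrrr/watchtower is the project's current image):

```
docker run -d \
  --name watchtower \
  -v /var/run/docker.sock:/var/run/docker.sock \
  containrrr/watchtower
```

By default it periodically checks every running container for a newer image and recreates the container with the same options when one appears.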
|
# ? Mar 3, 2020 00:15 |
|
Tea Bone posted:A friend wants me to help them get set up. They’re competent but not as tech savvy as most of us. Just be aware that the dockers will update themselves and updates can occasionally break functionality. I've only had it happen for QBittorrent for me thus far, but it could theoretically happen to any of the containers.
|
# ? Mar 3, 2020 00:25 |
|
BeastOfExmoor posted:Just be aware that the dockers will update themselves and updates can occasionally break functionality. I've only had it happen for QBittorrent for me thus far, but it could theoretically happen to any of the containers. To be clear, the apps might be configured to do so, but automatically updating themselves is not a general feature of docker containers.
|
# ? Mar 3, 2020 00:35 |
|
To further complicate things, "never update dockers in a container" isn't a hard rule either. My Nextcloud instance needs to be updated through the Nextcloud interface, for example. Before I switched to Unraid I used the previously mentioned watchtower container to keep things automatically updating. It works well, but with any docker updates keep periodic backups of the config folders in case you need to roll back. Occasionally you can get a broken update or have things break due to a major version change.
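A sketch of that backup step, run here against a throwaway directory; swap the two demo paths for wherever your real config folders and backup target live:

```shell
#!/bin/sh
set -e
# Throwaway demo tree under the current directory; replace these two
# paths with your real appdata folder and backup destination.
APPDATA="./demo/appdata"
BACKUPS="./demo/backups"
mkdir -p "$APPDATA" "$BACKUPS"
echo "example setting" > "$APPDATA/settings.conf"   # stand-in for real config

# Timestamped tarball so you can roll back to a known-good config
STAMP=$(date +%Y%m%d)
tar -czf "$BACKUPS/appdata-$STAMP.tar.gz" -C ./demo appdata
ls "$BACKUPS"
```

Run it before a docker-compose pull; restoring is just extracting the tarball over the config folder while the container is stopped.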
|
# ? Mar 3, 2020 08:29 |
|
THF13 posted:To further complicate things, "never update dockers in a container" isn't a hard rule either. My Nextcloud instance needs to be updated through the Nextcloud interface for example.
|
# ? Mar 3, 2020 10:10 |
|
The Linuxserver Nextcloud does some weird updating stuff, but the official Nextcloud image handles migrating the persistent data correctly when you replace the container image, automatically handling major version updates.
|
# ? Mar 3, 2020 14:26 |
|
Sonarr v3 had a bug in the last version where it would hang on the "process monitored downloads" task. Mine had not downloaded anything in 3 days before I noticed. The update that fixes it would also fail to auto-install, so I had to go get the latest version from the website and install it. Also, after installing I had to change what account the (Windows) service runs under to get everything back to normal.
|
# ? Mar 6, 2020 16:35 |