Volguus
Mar 3, 2009

Mortanis posted:

Is there something that's like Sonarr, but for generic content? Magazines, comics, etc? Kind of a pain in the rear end to keep track of all of that sort of stuff by hand.

I don't use Sonarr, but Spotweb has been quite nice so far. It has everything I could ever ask for.

Volguus
Mar 3, 2009
What I have been running for a few years now is a local Spotweb installation. I have an account with some free indexer, but I rarely visit it, only when Spotweb cannot find something recent enough. Last time I looked the MySQL database was at around 2GB or so, and it has been working really well for me so far.

Volguus
Mar 3, 2009
I've been with Supernews for quite a few years now. When I had 100-150 Mbps internet I always hit the speed limit. Now I'm on 250 and I don't always hit it. That is, most of the time I download at max speed, but every now and then I drop to 5 MB (megabytes, not bits) per second. I always assumed it was something with my ISP, my neighborhood, or others in the house hitting the net hard (Netflix and friends). I'm tempted to try Newsdemon, but it seems they don't have that GoT promotion running anymore. Or do they, and it's only active Sunday/Monday?
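For context, the gap between the line speed and those dips, worked with bc (megabits vs. megabytes):

code:
# 250 Mbps line vs. the 5 MB/s dips mentioned above:
echo "scale=2; 250/8" | bc   # theoretical max: 31.25 MB/s
# so 5 MB/s is roughly a sixth of what the line can do.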

Volguus
Mar 3, 2009

sedative posted:

It's only Sunday and Monday, but they said they're not doing that crazy cheap deal anymore. You can try newsgroup ninja for $5.99 https://www.newsgroup.ninja/?promo=unlimitedsale

It's the same backbone as newsdemon.

Oh cool, thanks. Made an account (the signup method is a bit weird, with the dashboard/key basically not protected by anything else). But it is nice to hit 29.5MB/s. I'll see how long it'll last.

Volguus
Mar 3, 2009

UltimoDragonQuest posted:

How does rolling your own newznab work in the modern obfuscated world?
Seems like NZB.su has a lot of obfuscated stuff that doesn't hit most indexers.

I would guess it's just about the groups that one is indexing, but I could be wrong.

Volguus
Mar 3, 2009

Lawen posted:

I was talking to a friend recently who runs a newznab instance and asked how much disk space and network it used (I don't think RAM would be an issue). He said that with ~5 years of indexing most binary groups other than boneless (and possibly not indexing porn groups?), he's using just under 100GB on disk, and when it's pulling down headers it uses maybe 25 Mbps. Not nearly as bad as I'd assumed it would be, and now I'm seriously thinking about running one of my own on AWS or Digital Ocean.

I looked a bit at the newznab source code and they ship a default list of groups. It looks a bit light though. Is there a more comprehensive list of binary groups one should monitor, if one were so inclined to roll a personal newznab? I have been using usenet binary groups for almost a decade now, but I have been spoiled by all the front-ends and tools on top of them that hide the actual groups things are in, so I am a bit in the dark here.
Edit: Bah, it was just a matter of searching a bit and I had all my answers. I guess managing bandwidth, DB space and RAM comes down to managing the monitored groups. Something like boneless would most likely make the requirements skyrocket.

Volguus fucked around with this message at 15:24 on Oct 26, 2017

Volguus
Mar 3, 2009

Thermopyle posted:

I index 44 groups that I found just by googling around for good binary groups.

I've been running it for 5ish years and my db and newznab data directories are using ~10GB. Note that it took me a long time to get up to these 44 groups, so I don't really know what the data storage requirements would be if I was running 44 groups for that whole time.

In the last 30 minutes of indexing it's downloaded 2.02 GB. Mid-morning in America on a Thursday isn't exactly prime usenet activity time, though.

I saw that newznab offers a paid version that comes with the Sphinx search engine. How awful is the search without it?

Volguus
Mar 3, 2009

Acid Reflux posted:

nZEDb is a free fork of newznab that pretty much covers all of nn's paid functions. I've just recently recommissioned an i3 machine that was sitting idle as a dedicated indexer, only 5 groups going back about 45 days at the moment, but it's doing a good job so far. I need to sit down and research some groups to expand it a bit.

Thanks for the info. Before I splurge and give newznab my hard-earned $$$$ I want to see what it's capable of (I know, a fork, but still). I installed nZEDb, activated a group and now I'm the proud owner of a bunch of hashes that range in size from 300MB to 5GB. Yay me.
After some days of monitoring I'll probably add more groups, just to see if there's anything out there besides those hashes.
One question: so far I have only been able to update nZEDb using screen and the shell scripts. The tmux scripts wouldn't work for some reason (tmux couldn't find the sessions). Is newznab the same with updating (same style, screen+bash), or do they actually use cron like a normal person? Or do they run the scheduler internally?

Volguus
Mar 3, 2009

SymmetryrtemmyS posted:

Does it poll the groups more frequently than existing indexers? I see stuff posted on the usergroup IRC two minutes after it's done, but it's not on indexers for another 15 at least.

By default the script sleeps for only 60 seconds but I assume that can be changed somewhere.

Volguus
Mar 3, 2009
Having played with this indexer now, it just occurred to me: the main job of this little piece of code is to categorize usenet posts. In newznab they chose to do it via regexes. This has the advantage that one has full control of what goes where, but the disadvantage that the regexes have to be maintained. I wonder if there's a better way to go about it. I mean, it's just a categorization problem, and people have been able to train computers to recognize images and shapes and so on; surely it can't be that hard to categorize a bunch of text (headers, names, etc.).
Oh well, here's another project that may never be finished, but at least it could be a bit interesting.

Volguus
Mar 3, 2009

Thermopyle posted:

I've started work on a newznab replacement a few times doing just this with TensorFlow. It seems to work OK, but I never get around to actually doing all the work required to get from a really rough prototype to something usable.

I'm not confident it would work with less maintenance than a list of regexes anyway.

Well, not initially, no. But while the newznab approach will always require maintenance, this one has the potential to not need it after a while. One approach (which either could take) would be to crowdsource the maintenance, via either a central server (which could end up being quite expensive) or plain old P2P. But the machine learning approach will always be superior once it gets going.

Volguus
Mar 3, 2009

Boris Galerkin posted:

How does Radarr compare to couchpotato?

Well, Radarr (for all its bloated Sonarr interface) actually managed to get me a movie once in the last 4 months I've had it running. CouchPotato I had running for a few years and it never got me anything.

Volguus
Mar 3, 2009

Thermopyle posted:

That could be because I don't really sit around waiting for stuff. I mostly have no idea what my usenet programs are doing...things just show up in Kodi and I watch them (or not).

I was the same way (well, minus Kodi or Plex or crap like that; things just show up in folders), until one day I went into the basement where all the servers/NAS/crap is. And I heard the NAS HDDs screaming in terror as the needles moved faster than helicopter blades at full speed. What do you know, Sonarr and Radarr have the very nice habit of thrashing the HDDs like it's going out of style. They have tasks that launch every loving minute and no, their schedule cannot be changed. After quite a bit of searching on the net I ended up disabling a bunch of crap in Sonarr and Radarr, and hopefully they'll let my NAS breathe a little.

Those programs are the devil, let me tell you.

Volguus
Mar 3, 2009

Grassy Knowles posted:

I don't have this issue, but I only put completed files on the NAS and use local server storage as the temporary download location.

I do that too (it's way faster than repairing or unrar-ing over the network mount), but they are/were monitoring poo poo I guess.

Volguus
Mar 3, 2009

Thermopyle posted:

Never have had any similar sort of problem, and I have hundreds of shows in Sonarr and 1500 movies in Radarr.

My server is monitored with Grafana/Telegraf so I'd know if anything was thrashing the system.

My NAS is just a NAS box, mounted over the network via NFS/SMB by the various servers and services that need said mount. Two of those services were Sonarr and Radarr, until I told them to take a hike and unmounted the NAS from their VM. Given that 2 of the HDDs in the NAS are WD Reds at 5400RPM, speed was never its strong point.

Volguus
Mar 3, 2009

wolrah posted:

When running on the same system as the storage they can use system APIs to know if something changed. When accessing it over SMB they take a more brute force approach.

Any reason you couldn't just have your entire download apparatus directly on the NAS?

Directly on the NAS? That thing has a CPU that's powered by a drunken mouse (which I am not feeding). ARM something, I believe. I've got plenty of CPU power available for repairing and unraring things; the NAS just does storage over the network, nothing more.

Secondly, my download apparatus is quite complicated, with custom applications that I built over the years that do all kinds of crap. I understand that the NAS has a Linux OS running on it, but everything is just much easier and faster if I have computers/VMs handling that and just shoving things onto the NAS for long-term storage when done.

CPU on the NAS:

code:
processor       : 0
model name      : ARMv7 Processor rev 1 (v7l)
BogoMIPS        : 34.37
Features        : half thumb fastmult vfp edsp vfpv3 vfpv3d16 tls idivt
CPU implementer : 0x56
CPU architecture: 7
CPU variant     : 0x1
CPU part        : 0x581
CPU revision    : 1

Hardware        : Marvell Armada 370/XP (Device Tree)
Revision        : 0000
Serial          : 0000000000000000
Great little CPU, but I wouldn't run par2 on it and expect to be done at some point this decade.

Volguus fucked around with this message at 01:27 on Dec 28, 2017

Volguus
Mar 3, 2009

Laserface posted:

Sonarr is very solid, but Radarr is still weird like CP was all those years ago. I had added a film before it released to blu-ray, and it did not download when it was released.

I manually found a file that met my criteria and downloaded it, and THEN Radarr took the credit for it and moved/renamed it for me.

I'm assuming Radarr is not as far along dev-wise as Sonarr?

Other than the godawful user interface, and now this issue with working my NAS too hard, I didn't have a problem with Sonarr/Radarr either. Unfortunately, I'm not aware of a better one (Sickbeard is old, and I don't remember exactly what the issue was, but something went wrong that forced me to move to Sonarr).

Volguus
Mar 3, 2009

wolrah posted:

I'd just upgrade the NAS, personally. There are plenty of nice options that run on decently capable x86-64 CPUs. Or DIY it, which is what I actually did because it was cheaper and I don't value my own time.

If you want to keep the split setup, NFS or iSCSI might be a better way to do it.

Well, there is no reason to upgrade the NAS yet, since it's working just fine. What it does, it does well. And yes, everything other than a couple of Windows machines in the house uses NFS to talk to the thing (32k read/write buffers, though some only use 8k). It's not the speed of the NAS; it handles that well. But when you have a crappy program scanning the harddrives like there's no tomorrow every minute/hour/whatever, it doesn't matter what you have. Throwing hundreds of $ down the toilet because Sonarr sucks balls is not acceptable. Luckily it can be told to stop this nonsense, even though I would have preferred to just tell it to take it easier. Ultimately I hope to replace it with something saner.
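For reference, the mounts are plain NFS entries; a sketch of what such an /etc/fstab line looks like (the hostname and export path are made up, but rsize/wsize of 32768 is the 32k buffer mentioned above):

code:
# Hypothetical /etc/fstab entry: NAS over NFS with 32k read/write buffers.
nas:/volume1/data  /mnt/nas  nfs  rw,hard,rsize=32768,wsize=32768  0  0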

Volguus
Mar 3, 2009

astral posted:

The problem in that scenario isn't Sonarr, and it probably isn't Radarr either.

Except that those were the processes that were thrashing the HDDs/mount point at that time. My tools do not do any folder scanning, and except for one, they do not even touch the mounted NAS. What's with the Sonarr-defending bandwagon ("need a more powerful NAS", "need to put everything in one place or else", "it definitely isn't Sonarr, you're doing it wrong") when I caught the culprit with its pants down?

Volguus
Mar 3, 2009

astral posted:

It's simply way more likely you had (or still have, if you're still using them) something grossly misconfigured or merely misinterpreted what was causing the thrashing, that's all. What you described was by no means normal behavior, as several posters attested to.

From my investigation through its source code (quite a mess, and if you're not familiar with Owin ... good luck), one of the tasks that run very often (Downloaded Episodes Scan, maybe?) was the cause of everything. As for configuration, I guess it could be misconfigured:

- nzbhydra as indexer
- nzbget as download client (with completed download handling now disabled and Failed Download Handling set to No)
- and nothing else really; profiles and quality are just defaults, Connect is empty, metadata is disabled
- Media Management does not rename anything, does not import extra files and ignores deleted episodes

All I need from a program like Sonarr is to watch for a show when it appears, grab the NZB and send it to the download client. That's all there is to it, nothing less and definitely nothing more. What mine was also doing was moving an episode from the TV folder to its show folder, in the appropriate season folder, after it had been downloaded. Of course, since it gets the information from nzbget on where the episode was downloaded, and since the mount path to the NAS is identical on both computers, it shouldn't have to scan anything, just do a simple move over NFS (which should be fast, as it's just a rename). And yet, it doesn't seem to be.

For now though, a crippled Sonarr (I unmounted the download path, so it's running blind at the moment) still does what I need it to do, so all is good until I find or make a replacement.

Volguus
Mar 3, 2009

astral posted:

Downloaded Episodes Scan was a task that ran if you were using the drone factory folder, which wasn't recommended (and is deprecated nowadays anyway). You should make sure the drone factory setting is empty and the interval is set to zero, so that it's disabled. The better way is to link Sonarr to your download client with "Completed Download Handling", so it can pick new downloads up directly from the download client and file them away, but even that is optional and can be turned off if you don't want it.

If you perchance accidentally had drone factory enabled and also for some reason set it to the same folder that housed the rest of your media library, that is a misconfiguration that would have resulted in a comical periodic scan of your entire media library (rather than just a newly-downloaded-but-not-yet-filed-away-items folder, which was what the drone factory was for) in search of new items it hadn't seen before.

That's the way it was (drone factory is still empty and set to 0), and I had Completed Download Handling set to Yes (it's now set to No). I found it safer to just remove Sonarr's access to the media files completely. Now it can do whatever crap it wants in its own little VM, without any chance of thrashing the NAS, until the day I put it to rest definitively. It does complain that "OMG, cannot see episode files", but it can suck a bag of dicks as far as I'm concerned.

Volguus
Mar 3, 2009
Yes, maybe that's what it wants: to be on the same machine as the download client. I have Sonarr and Radarr on an Ubuntu VM, while the download client is on a different machine (Fedora), both having the NAS mounted under the same folder (/mnt/nas). It could be Linux, it could be the separation, it could be ... well, anything really.

Volguus
Mar 3, 2009
Which is why they were both mapped exactly the same: /mnt/nas/Downloads/complete. Yes, I've read that part of the wiki; yes, it should just work. It probably has to do with it being a network mount: Sonarr (probably written with Windows in mind) doesn't know that, and therefore probably thinks everything is on the local hard drive. Maybe in Windows it's easy to tell from C# whether a drive is a network drive. In Linux it's probably not that trivial.
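Not that it can't be done at all, mind you; from a shell you can at least ask what filesystem backs a path (a sketch using GNU stat; /mnt/nas is the mount from above):

code:
#!/bin/sh
# Rough check whether a path sits on a network filesystem.
# %T prints the filesystem type name (e.g. "nfs", "cifs", "ext2/ext3").
fstype=$(stat -f -c %T /mnt/nas)
case "$fstype" in
        nfs*|cifs|smb*) echo "network mount: $fstype" ;;
        *)              echo "local filesystem: $fstype" ;;
esac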

Thermopyle posted:

I think somewhere Sonarr specifically recommends not having your storage on a separate machine, but I'm not positive where, or if that's right.

Really it sounds like Sonarr isn't the tool you want and you're shoehorning it.

It definitely isn't the tool that I want, but the drat Sickbeard stopped working many moons ago. It was almost perfect, as it did only what I needed it to do, nothing more. Maybe I'll check what went wrong with it and fix it, if it isn't too difficult. All I need are the basics to automate that crap for me, in the quietest, easiest, fastest way possible. Sonarr does too many things, and does them badly.

Volguus
Mar 3, 2009

Laserface posted:

FWIW a lot of my issues with Sonarr did clear up when I moved SABnzbd+'s completed download location to the same local machine Sonarr and Radarr were installed on. It still isn't perfect, and neither was Sickbeard, but Sickbeard was at least reliable in its function.

If sickbeard still worked I would still be using that. There was nothing else about Sonarr that had any appeal to me at all.

Sickbeard had only one issue: it didn't paginate when listing the episodes of a series. So if you have a TV show that has gotten to season 25, with 20+ episodes per season, well, better not click there. Other than that it was perfect.
The main reason I have Sonarr and Radarr on a separate machine is that I have that Ubuntu VM running for other things; Ubuntu apparently gets a bit more love than, say, Fedora, the installation was faster and easier, and they have a repo available for updates. The VM host, however, is running Fedora, and that's where I have a ton of HDD space (2TB HDD), so nzbget is staying there.

But I also admit that I hated Sonarr the first time I laid eyes on it (that UI, ughhh), before even trying it out, so short of an amazing performance that thing doesn't get any praise from me. I will replace it the first chance I get.

Volguus
Mar 3, 2009
Newsgroup Ninja decided that offering free Usenet was not good for business, so yesterday they told me to give them a credit card soon. It's still cheap ($5.99), and I'm still getting 29MB/s (that's bytes, not bits), so yeah ... quite worth it.

Dongattack posted:

Two people, in Russia and India, tried to access my Synology NAS DS218+. It says their IPs were auto-blocked. How can I stop stuff like that? I don't want people to try and get into it D:

edit: "relax it happens to everyone" and "don't have SSH enabled when you're not using it" is what Google told me. I'm pretty sure I still have SSH on from setting up some stuff; just gonna try to remember how I turned it on so I can turn it off.

edit2: found it under Terminal & SNMP in Control Panel. You're welcome, random Google person googling this 2 years from now, and welcome to our dead and gay forums

"relax it happens to everyone"
"dont have SSH enabled when you're not using it"

And that's all you can do (almost). If you have a computer/device/whatever accessible from the internet, on any port, there is a 100% chance that someone from the internet will try to access it. What can you do to minimize the intrusions, while still having the device available from outside:

1) Whitelist IPs. That is, block everything and only allow certain known IPs to contact you on whatever port you're listening. It may or may not be possible to do this.
2) If you're using well known services (like SSH) change the port. It doesn't do anything to protect you, but it may discourage flyby "hackers" that just probe for known ports.
3) Secure as much as you can the service that you are exposing. For SSH one thing can be to disable password authentication (only public key). To only allow your user to log in. Little tricks that can make ssh a tiny little bit safer.
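A minimal sshd_config sketch for point 3, assuming OpenSSH ("yourname" is a placeholder):

code:
# Hypothetical /etc/ssh/sshd_config hardening, per point 3 above.
# Replace "yourname" with your actual user and reload sshd afterwards.
PasswordAuthentication no
PubkeyAuthentication yes
PermitRootLogin no
AllowUsers yourname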

Volguus fucked around with this message at 04:53 on Jan 28, 2018

Volguus
Mar 3, 2009

RoboBoogie posted:

I need some automation in my process.

I have a VPS in France that downloads linux isos pushed by Sonarr, using nzbget. After a download finishes, I run a list of commands to push the unrar'd folders to an rclone folder for post-processing by Sonarr:

code:
mv * ../staging && cd ../staging && rclone move . "linux:staging/" && rm -rf * && cd ../linuxiso 
Then Sonarr moves the files to the right folder and the file magically shows up on Plex. This is not girlfriend-friendly and is not a smooth process, as I need to log in and run the list of commands for the magic to happen.

What is the best way to create a script that will move the folder to the rclone drive and then delete the empty folder after it completes unrar-ing?

Cheers

Here's a move.sh script that I use to move downloads into a folder. Feel free to modify and improve it as you see fit:

code:
#!/bin/sh

##############################################################################
### NZBGET POST-PROCESSING SCRIPT                                          ###

# Move the completed download into a per-category folder.

### NZBGET POST-PROCESSING SCRIPT                                          ###
##############################################################################

# NZBGet post-processing exit codes: 93 = success, 94 = error, 95 = skip.

if [ "$NZBPP_TOTALSTATUS" != "SUCCESS" ]; then
        echo "[ERROR] This nzb-file was not processed correctly, terminating the script"
        exit 95
fi

dest_dir="/target_folder/$NZBPP_CATEGORY"

# Fall back to the base folder when the download has no category.
if [ -z "$NZBPP_CATEGORY" ]; then
        dest_dir="/target_folder"
fi

# du -sk reports the size in KB.
size=$(du -sk "$NZBPP_DIRECTORY" | cut -f 1)
echo "dir size is $size KB"
start=$(date +%s)

mkdir -p "$dest_dir"
mv -f "$NZBPP_DIRECTORY" "$dest_dir"
result=$?

end=$(date +%s)
duration=$((end - start))
# Avoid division by zero when the move takes under a second.
[ "$duration" -eq 0 ] && duration=1
speed=$((size / duration))

echo "Moved $NZBPP_NZBNAME of size $size KB in $duration seconds, an average of $speed KB/s"

if [ $result -eq 0 ]; then
        echo "move success"
        # Tell NZBGet where the download now lives.
        echo "[NZB] DIRECTORY=$dest_dir"
        exit 93
fi

echo "move failed"
exit 94
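For the rclone part of your question specifically: a sketch of how the mv step above could be swapped for rclone (using the "linux:staging/" remote from your post; an untested assumption on my part):

code:
# Push the finished download to the rclone remote instead of a local move,
# then drop the now-empty source directory.
rclone move "$NZBPP_DIRECTORY" "linux:staging/" && rmdir "$NZBPP_DIRECTORY"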

Volguus
Mar 3, 2009

derk posted:

You should be able to install nano. VI is confusing to use, especially for a rookie such as yourself; unless you have documentation on how to navigate and execute things in VI, I highly suggest NOT using it!

Oh come on, figuring out how to exit vi is a rite of passage.

Volguus
Mar 3, 2009

nuvan posted:

I tried to upgrade a few months back, but my system has Java 9 installed, and hydra wouldn't work with it. At the time it was set to wontfix, as the underlying problem was with Spring, the web framework it uses.

You can download Java 8 from Oracle, unpack it somewhere and point nzbhydra at that JRE instead of the global one. Then you can still have Java 9, but use 8 for those programs that only work with it.

Volguus
Mar 3, 2009

el_caballo posted:

Is it worth it to switch all my poo poo (SAB, Hydra, Radarr, Sonarr, etc.) to Docker if I'm just running a Windows 10 file server? Hydra1 seems to be getting a lot of HTTP 500 errors and complaints from Radarr/Sonarr about being unresponsive, so I was going to try out Hydra2, and that led to Docker research.

I read about Docker a while ago and was interested, but then read something about there not being much benefit if you weren't running it on a NAS box. It seems to be even more popular now.

I am of the belief that if I do not need technology X, I do not employ technology X. There are very valid use cases for Docker; running these programs on a server is not one of them. On the other hand, there is probably no harm in doing so, if your goal is to learn more about Docker.

Volguus
Mar 3, 2009

Thermopyle posted:

A benefit of running hydra2 in docker is not having to install JRE on your server. Same goes for radarr/sonarr with regards to Mono....but I don't have as much against Mono as I do against JRE.

It becomes especially useful when different apps require different versions of JRE or Mono.

I am not familiar with Mono, whether it can be installed outside of the package manager or not, but Java certainly can be. You can have 100 JREs installed on a computer that do not conflict with each other at all. All you need to do is:

  • Get the tar.gz of the needed JRE
  • Unpack said tar.gz somewhere
  • Launch the application with the JRE it needs, e.g. /some/path/jre1.2.3/bin/java -jar application.jar -whatever -arguments -it -needs
  • Profit?

For the more "complicated" applications, one may need to create a shell script that launches their shell script. Your shell script only needs to set the environment variables the other app needs: JAVA_HOME, JRE_HOME, and maybe update PATH so it has /some/path/jre1.2.3/bin at the start. Done.
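Something like this, assuming the JRE was unpacked to /some/path/jre1.2.3 as above (/opt/someapp/start.sh is a placeholder for the app's own launcher):

code:
#!/bin/sh
# Hypothetical wrapper: run an app under a privately unpacked JRE.
# /some/path/jre1.2.3 is the example path above; /opt/someapp is made up.
JAVA_HOME=/some/path/jre1.2.3
JRE_HOME="$JAVA_HOME"
PATH="$JAVA_HOME/bin:$PATH"
export JAVA_HOME JRE_HOME PATH
exec /opt/someapp/start.sh "$@"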

Volguus
Mar 3, 2009

Thermopyle posted:

Yeah, I know. I wasn't saying that it's been impossible to use multiple JREs up to the introduction of Docker. I'm saying that Docker is just as valid a solution to that problem (and maybe even easier) and when you couple that with not wanting to manage a bunch of different JREs on top of managing your JRE-needing applications, Docker seems pretty attractive.

If you have the HDD space, sure. When I played with Docker (getting an application of mine built by a container, and creating a slim container to host it) I found myself with a 30GB /var/lib/docker after a couple of hours. That's not very nice. Not very nice at all.

Volguus
Mar 3, 2009

The Gunslinger posted:

... and removes the few security concerns I had about running a media server. ...

It shouldn't. Docker is not a VM. One can get out of a VM as well, but it is a little bit harder; getting out of Docker is comparatively a breeze. If you had security concerns running a media server before, the same concerns should still exist. As one former Docker developer said: do not run code that you don't trust in Docker. The goal of Linux cgroups is not security but resource usage control and accounting, and namespaces, again, are not a security replacement. A VM presents a much smaller attack surface to the host system.

Volguus
Mar 3, 2009

Dongattack posted:

Do you guys pay for a usenet indexer? I've just been using the free ones and they seem to work perfectly, but I'm wondering if I'm missing out on anything.

I used to, 10 years ago, for a very popular one that shut down after a few years (I forget the name). Only a year or two ago I convinced myself to give $10 to nzbcat. But normally, no. They're a dime a dozen, appearing and disappearing like mushrooms after a rain. I've been using free ones as much as I can.

Volguus
Mar 3, 2009

Rexxed posted:

Spotweb still exists and kind of works, but it seems like the Netherlands is its largest userbase. If you need NL subs for something, it's got them.

While some movies do have Dutch subs, most have them as external files, so they can be disabled/deleted. Some movies do have them hardcoded though, that's true. Since it's a set-and-forget system, not having it is kinda foolish. I have had mine for a long time now (since newzbin died) and I am very happy with it. I get a lot of crap from there.

Volguus
Mar 3, 2009

salted hash browns posted:

Why would anyone use an unlimited account? With block accounts you pay ~$0.04 per GB in a 500GB block, and unlimited is ~$10/mo. It would take 100TB/mo+ to recoup that cost with an unlimited monthly account.

Huh? I don't quite follow that math. What I see there is $20 per 500GB block, and $10/month for unlimited. So the only way the block is worth it is if I download less than 250GB per month; to make it worthwhile I would need to buy that 500GB block less often than every two months. It's even worse if you look at the yearly plan ($90/year), and even so, 250GB per month is nothing.
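Worked out with bc, using the prices above:

code:
# Block: $20 per 500GB; unlimited: $10/month.
echo "scale=2; 20/500" | bc   # block price per GB: .04 dollars
echo "10/0.04" | bc           # GB that $10 buys at block rates: 250
# So unlimited wins above ~250GB/month, nowhere near 100TB.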

I use newsgroupdirect now, $75/year. Relatively OK: 50 connections, 3700 days retention.

Volguus
Mar 3, 2009

The Diddler posted:

I've started watching everything I can on Netflix because you can skip the intro on pretty much everything. I didn't think I would care about that until I started watching older shows with 60-75 seconds worth of intro.

A media player should have no issue skipping in 10-second intervals. Sure, it can get a bit annoying, but paying $10/month to avoid that is a bit ... strange. That being said, I do still have Netflix (even though I don't need or want it), only because that's the only way the family let me cancel cable many years ago. I would drop it in a heartbeat if I could.

Volguus
Mar 3, 2009

Sub Rosa posted:

I never used NZBHydra, but I do have 4 indexers I use, so it seems like something I should check out. Are people still using NZBHydra, or should I start straight away with NZBHydra 2?

Endorsing NzbHydra2. About Docker: sure, if you feel strongly about it. There is nothing to "install" or "configure" about Java (just unzip the thing somewhere and be done), so installing the plain version works just as well.

Volguus
Mar 3, 2009

Heners_UK posted:

I'll be honest, I felt the same way about it until I tried Docker. The ease was huge once I spent a little time learning how it worked (probably the same amount of time a manual install would have taken me), well beyond any specific issues people may have with Java, Mono, Python etc.

I'm personally a bit surprised that I find this easier than package managers, but sure enough I do.

Oh, I tried Docker; I've used it for various purposes: running applications in a set environment, creating a known, stable and ephemeral build environment for applications, etc. It's fine, nothing wrong with it. I just find it a bit over the top that whenever a fart needs to be made on a computer, the first thing people say is "do it in Docker". Sure, it may smell and all, but it's your fart, so jesus, let it be. It's quite ridiculous.
When installing an application requires nothing more than untarring a file ... eh, whatever, it's your poo poo, do whatever you like in it.

Volguus
Mar 3, 2009

ClassH posted:

Anyone have oznzb and hydra2? For some reason it won't work, but it seems to test fine in Radarr or Sonarr.
javax.net.ssl.SSLHandshakeException: Received fatal alert: handshake_failure
when trying to add it.

hydra2 recently had an update which broke SSL (I'm not familiar with the details of the issue). The developer said that the latest update fixes all the troubles, but it could be that you have to update manually.

Volguus
Mar 3, 2009

frameset posted:

People still use Sickbeard? I thought that died years ago.

No program ever dies. Just development on it may stop.
