|
Mortanis posted:Is there something that's like Sonarr, but for generic content? Magazines, comics, etc? Kind of a pain in the rear end keeping track of all of that sort of stuff by hand. I don't use Sonarr but Spotweb has been quite nice so far. It has everything I could ever ask for or want.
|
# ¿ Apr 9, 2016 03:39 |
|
What I have been running for a few years now is a local Spotweb installation. I have an account with some free indexer that I rarely visit, only when Spotweb cannot find something recent enough. Last time I looked the MySQL database was at around 2GB, and it has been working really well for me so far.
|
# ¿ Sep 22, 2016 21:00 |
|
I've been with Supernews for quite a few years now. When I had 100-150 Mbps internet I always hit the speed limit. Now I'm on 250 and I don't always hit it. That is, most of the time I download at max speed, but every now and then I drop to 5 MB (megabytes, not bits) per second. I always assumed it was something with my ISP, my neighborhood, or others in the house hitting the net hard (netflix and friends). I'm tempted to try newsdemon, but it seems they don't have that GoT promotion running anymore. Or do they, and is it only active Sunday/Monday?
|
# ¿ Aug 12, 2017 00:13 |
|
sedative posted:It's only Sunday and Monday, but they said they're not doing that crazy cheap deal anymore. You can try newsgroup ninja for $5.99 https://www.newsgroup.ninja/?promo=unlimitedsale Oh cool, thanks. Made an account (the signup method is a bit weird, with the dashboard/key basically not protected by anything else). But it is nice to hit 29.5MB/s. I'll see how long it lasts.
|
# ¿ Aug 12, 2017 01:10 |
|
UltimoDragonQuest posted:How does rolling your own newznab work in the modern obfuscated world? I would guess it's just about the groups that one is indexing, but I could be wrong.
|
# ¿ Oct 25, 2017 14:08 |
|
Lawen posted:I was talking to a friend recently who runs a newznab instance and asked how much disk space and network it used (I don't think RAM would be an issue). He said with ~5 years of indexing most binary groups other than boneless (and possibly not indexing porn groups?) he's using just under 100GB on disk and when it's pulling down headers it uses maybe 25mbps. Not nearly as bad as I'd assumed it would be and now I'm seriously thinking about running one of my own in AWS or Digital Ocean. I looked a bit in the newznab source code and they have a list of groups by default. It looks to be a bit light though. Is there a more comprehensive list of binary groups that one should monitor if one were so inclined to roll a personal newznab? I have been using usenet binary groups for almost a decade now, but I have been spoiled by all the front-ends and tools on top that hide the actual groups things are in, so I am a bit in the dark here. Edit: Bah, it was just a matter of searching a bit and I had all my answers. I guess managing bandwidth, DB space and RAM comes down to managing the monitored groups. Something like boneless would most likely make the requirements skyrocket. Volguus fucked around with this message at 15:24 on Oct 26, 2017 |
# ¿ Oct 26, 2017 14:43 |
|
Thermopyle posted:I index 44 groups that I found just by googling around for good binary groups. I saw that newznab offers a paid version that comes with the Sphinx search engine. How awful is the search without it?
|
# ¿ Oct 26, 2017 15:32 |
|
Acid Reflux posted:nZEDb is a free fork of newznab that pretty much covers all of nn's paid functions. I've just recently recommissioned an i3 machine that was sitting idle as a dedicated indexer, only 5 groups going back about 45 days at the moment, but it's doing a good job so far. I need to sit down and research some groups to expand it a bit. Thanks for the info. Before I splurge and give newznab my hard-earned $$$$ I wanna see what it's capable of (I know, a fork, but still). I installed nZEDb, activated a group and now I'm the proud owner of a bunch of hashes that range in size from 300MB to 5GB. Yay me. After some days of monitoring I'll probably add more groups just to see if there's anything out there besides those hashes. One question: so far I was only able to update nZEDb using screen and the shell scripts. The tmux scripts wouldn't work for some reason (tmux couldn't find the sessions). Is newznab the same with updating (same style, screen+bash) or do they actually use cron like a normal person? Or do they run the scheduler internally?
|
# ¿ Oct 27, 2017 04:47 |
|
SymmetryrtemmyS posted:Does it poll the groups more frequently than existing indexers? I see stuff posted on the usergroup IRC two minutes after it's done, but it's not on indexers for another 15 at least. By default the script sleeps for only 60 seconds but I assume that can be changed somewhere.
|
# ¿ Oct 27, 2017 12:17 |
|
Having played with this indexer now, it just occurred to me: the main job of this little piece of code is to categorize usenet posts. In newznab they chose to do it via regexes. This has the advantage that one has full control of what goes where, but the disadvantage that they have to be maintained. I wonder if there's a better way to go about it. I mean, it's just a categorization problem, and people have been able to train computers to recognize images and shapes and so on; surely it can't be that hard to categorize a bunch of text (headers, names, etc.). Oh well, here's another project that may never be finished, but at least it could be a bit interesting.
|
# ¿ Oct 27, 2017 14:33 |
|
Thermopyle posted:I've started work on a newznab replacement a few times doing just this with TensorFlow. It seems to work OK, but I never get around to actually doing all the work required to get from a really rough prototype to something usable. Well, not initially, no. But while the newznab approach will always require maintenance, this one has the potential to not need it after a while. One approach that could be taken (by both) would be to crowdsource the maintenance, via either a central server (that could end up being quite expensive) or plain old P2P. But the machine learning approach will always be superior once it gets going.
|
# ¿ Oct 27, 2017 21:44 |
|
Boris Galerkin posted:How does Radarr compare to couchpotato? Well, Radarr (for all its bloated Sonarr interface) actually managed to get me a movie once in the 4 months I've had it running. CouchPotato I had running for a few years; it never got me anything.
|
# ¿ Dec 21, 2017 23:36 |
|
Thermopyle posted:That could be because I don't really sit around waiting for stuff. I mostly have no idea what my usenet programs are doing...things just show up in Kodi and I watch them (or not). I was the same way (well, minus kodi or plex or crap like that, things just show up in folders), until one day I went down to the basement where all the servers/nas/crap is. And I heard the NAS HDDs screaming in terror, the needles moving faster than helicopter blades at full speed. What do you know, sonarr and radarr have the very nice habit of thrashing the HDDs like it's going out of style. They have tasks that launch every loving minute and no, their schedule cannot be changed. After quite a bit of searching on the net I ended up disabling a bunch of crap in sonarr and radarr, and hopefully they'll let my NAS breathe a little. Those programs are the devil, let me tell you.
|
# ¿ Dec 27, 2017 23:31 |
|
Grassy Knowles posted:I don't have this issue, but I only put completed files on the NAS and use local server storage as the temporary download location. I do that too (it's way faster than repairing or unrar-ing over the network mount), but they are/were monitoring poo poo I guess.
|
# ¿ Dec 27, 2017 23:36 |
|
Thermopyle posted:Never have had any similar sort of problem and I have hundreds of shows in Sonarr and 1500 movies in Radarr. My NAS is just a NAS box, mounted over the network via NFS/SMB by the various servers and services that need said mount. Two of those services were sonarr and radarr, until I told them to take a hike and unmounted the NAS from their VM. Given that 2 of the HDDs in the NAS are WD Reds at 5400RPM, speed was never its strong point.
|
# ¿ Dec 28, 2017 00:28 |
|
wolrah posted:When running on the same system as the storage they can use system APIs to know if something changed. When accessing it over SMB they take a more brute force approach. Directly on the NAS? That thing has a CPU that's powered by a drunken mouse (which I am not feeding). ARM something, I believe. I have plenty of CPU power available for repairing and unraring things; the NAS just does storage over the network, nothing more. Secondly, my download apparatus is quite complicated, with custom applications that I built over the years that do all kinds of crap. I understand that the NAS has a linux OS running on it, but everything is just much easier and faster if I have computers/VMs handling that and just shoving things onto the NAS when done, for long-term storage. CPU on the NAS: code:
Volguus fucked around with this message at 01:27 on Dec 28, 2017 |
# ¿ Dec 28, 2017 01:13 |
|
Laserface posted:Sonarr is very solid but Radarr is still weird like CP was all those years ago. I had added a film before it released to blu-ray which did not download when it was released. Other than the godawful user interface and now this issue with working my NAS too hard, I didn't have a problem with Sonarr/Radarr either. Unfortunately, I'm not aware of a better one (sickbeard is old, and I don't remember exactly what the issue was, but something went wrong with it that forced me to move to sonarr).
|
# ¿ Dec 28, 2017 01:45 |
|
wolrah posted:I'd just upgrade the NAS, personally. There are plenty of nice options that run on decently capable x86-64 CPUs. Or DIY it, which is what I actually did because it was cheaper and I don't value my own time. Well, there is no reason to upgrade the NAS yet, since it's working just fine. What it does, it does well. And yes, everything other than a couple of windows machines in the house uses NFS to talk to the thing (32k read/write buffers, though some only use 8k). It's not the speed of the NAS; it handles that well. But when you have a crappy program scanning the hard drives like there's no tomorrow every minute/hour/whatever, it doesn't matter what you have. Throwing hundreds of $ down the toilet because sonarr sucks balls is not acceptable. Luckily it can be told to stop this nonsense, even though I would have preferred to just tell it to take it easier. Ultimately I hope to replace it with something saner.
|
# ¿ Dec 28, 2017 03:45 |
|
astral posted:The problem in that scenario isn't Sonarr, and it probably isn't Radarr either. Except that those were the processes that were thrashing the HDDs/mount point at that time. My tools are not doing any folder scanning and, except for one, they are not even touching the mounted NAS. What's with the Sonarr-defending bandwagon: "need a more powerful NAS, need to put everything in one place or else, definitely isn't sonarr, you're doing it wrong," when I caught the culprit with its pants down?
|
# ¿ Dec 28, 2017 04:10 |
|
astral posted:It's simply way more likely you had (or still have, if you're still using them) something grossly misconfigured or merely misinterpreted what was causing the thrashing, that's all. What you described was by no means normal behavior, as several posters attested to. From my investigation through its source code (quite a mess, and if you're not familiar with Owin ... good luck), one of the tasks that run very often (Downloaded Episodes Scan, maybe?) was the cause of everything. As for configuration, I guess it could be misconfigured:
- nzbhydra as indexer
- nzbget as download client (with completed download handling now disabled and Failed Download Handling set to No)
- nothing else really; profiles and quality are just defaults, connect is empty, metadata is disabled
- Media management does not rename anything, does not import extra files, and ignores deleted episodes
All I need from a program like sonarr is to watch for a show to appear, grab the nzb and send it to the download client. That's all there is to it, nothing less and definitely nothing more. What mine was also doing was moving the episode from the TV folder to its show folder, in the appropriate season folder, after it was downloaded. Of course, since it gets the information from nzbget on where the show was downloaded, and since the mount path to the NAS is identical on both computers, it shouldn't have to scan anything: just a simple move over NFS (which should be fast, as it's just a rename). And yet, that doesn't seem to be the case. For now though, a crippled sonarr (I unmounted the download path, so it's running blind at the moment) still does what I need it to do, so all is good until I find or make a replacement.
|
# ¿ Dec 28, 2017 04:44 |
|
astral posted:Downloaded Episodes Scan was a task that ran if you were using the drone factory folder, which wasn't recommended (and is deprecated nowadays anyway). You should make sure the drone factory setting is empty and set to an interval of zero to make sure it's disabled. The better way is to link Sonarr to your download client with "Completed Download Handling" so it can pick new downloads up directly from the download client and file them away, but even that is an optional thing that can be turned off if you don't want it. That's the way it was (drone factory still is empty and set to 0) and I had completed download handling set to Yes (it's now set to No). I found it safer to just remove sonarr's access to the media files completely. Now it can do whatever crap it wants, in its own little world.
|
# ¿ Dec 28, 2017 05:41 |
|
Yes, maybe that's what it wants: to be on the same machine as the download client. I have sonarr and radarr on an ubuntu VM, while the download client is on a different machine (fedora), both having the NAS mounted under the same folder (/mnt/nas). It could be linux, it could be the separation, it could be ... well, anything really.
|
# ¿ Dec 28, 2017 15:33 |
|
Which is why they were both mapped exactly the same: "/mnt/nas/Downloads/complete". Yes, I've read that part of the wiki; yes, it should just work. It probably has to do with it being a network mount: Sonarr (probably being written with windows in mind) doesn't know that, so it probably thinks everything is on the local hard drive. Maybe in windows it's easy to tell from C# whether a drive is a network drive or not. In linux it's probably not that trivial. Thermopyle posted:I think somewhere Sonarr specifically recommends not having your storage on a separate machine, but I'm not positive where or if that's right. It definitely isn't the tool that I want, but the drat sickbeard stopped working many moons ago. It was almost perfect, as it did only what I needed it to do, nothing more. Maybe I'll have to check what went wrong with it and fix it if it isn't too difficult. All I need are the basics, to automate that crap for me in the quietest, easiest, fastest way possible. Sonarr does too many things, and does them badly.
|
# ¿ Dec 28, 2017 19:49 |
|
Laserface posted:FWIW a lot of my issues with Sonarr did clear up when I moved SABnzbd+'s completed download location to the same local machine Sonarr and Radarr were installed on. It still isn't perfect, neither was Sickbeard, but Sickbeard was at least reliable in its function. Sickbeard had only one issue: it didn't do pagination when listing the episodes of a series. If you have a TV show that's gotten to season 25 with 20+ episodes per season, well, better not click there. Other than that it was perfect. The main reason I have sonarr and radarr on a separate machine is that I have that ubuntu VM running for other things; ubuntu apparently gets a bit more love than, say, fedora, the installation was faster and easier, and they have a repo available for updates. The VM host however is running fedora and that's where I have a ton of HDD space (2TB hdd), so nzbget is staying there. But I also admit that I hated sonarr the first time I laid my eyes on it (that UI, ughhh), before even trying it out, so short of an amazing performance that thing doesn't get any praise from me. I will replace it the first chance I get.
|
# ¿ Dec 28, 2017 23:42 |
|
Newsgroup Ninja decided that offering free Usenet was not good for business, so yesterday they told me to give them a credit card soon. It's still cheap ($5.99), still getting 29MB/s (that's bytes, not bits), so yea ... quite worth it. Dongattack posted:Two people, in russia and india, tried to access my Synology NAS DS218+. Their IP was auto blocked it says. How can i stop stuff like that? I don't want people to try and get into it D: "relax it happens to everyone" "dont have SSH enabled when you're not using it" And that's about all you can do (almost). If you have a computer/device/whatever accessible from the internet, on any port, there is a 100% chance that someone on the internet will try to access it. What you can do to minimize intrusions, while still having the device available from outside:
1) Whitelist IPs. That is, block everything and only allow certain known IPs to contact you on whatever port you're listening on. It may or may not be possible to do this.
2) If you're using well-known services (like SSH), change the port. It doesn't do anything to protect you, but it may discourage flyby "hackers" that just probe for known ports.
3) Secure the service you are exposing as much as you can. For SSH that can mean disabling password authentication (public key only) and only allowing your own user to log in. Little tricks that can make ssh a tiny little bit safer. Volguus fucked around with this message at 04:53 on Jan 28, 2018 |
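A minimal sketch of those three points, assuming OpenSSH and iptables on a typical Linux box; the port (2222), the whitelisted IP (203.0.113.7) and the username are all made-up examples:

```shell
# Sketch only -- OpenSSH on Linux assumed; port, IP and username are examples.

# 2) + 3) lines for /etc/ssh/sshd_config:
#      Port 2222                    # move off the well-known port 22
#      PasswordAuthentication no    # public key auth only
#      AllowUsers myuser            # only this user may log in

# 1) whitelist: accept SSH only from one known IP, drop everyone else
iptables -A INPUT -p tcp --dport 2222 -s 203.0.113.7 -j ACCEPT
iptables -A INPUT -p tcp --dport 2222 -j DROP
```

On a Synology specifically the firewall UI can do the whitelisting instead of raw iptables, but the idea is the same.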
# ¿ Jan 28, 2018 04:47 |
|
RoboBoogie posted:i need some automation in my process Here's a move.sh script that I use to move downloads into a folder. Feel free to modify and improve it as you see fit: code:
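The script itself didn't survive the archive; as a stand-in, here's a minimal sketch of what a move script like that might look like (the NAS paths in the usage comment are assumptions, not the original's):

```shell
#!/bin/sh
# Minimal stand-in for the lost move.sh: move each completed download
# directory under a destination folder. Not the original poster's script.
move_complete() {
    src=$1 dst=$2
    for dir in "$src"/*/; do
        [ -d "$dir" ] || continue       # no completed downloads yet
        name=$(basename "$dir")
        mkdir -p "$dst/$name"
        mv "$dir"* "$dst/$name/" && rmdir "$dir"
    done
}

# e.g.: move_complete /mnt/nas/Downloads/complete /mnt/nas/TV
```

Drop a call like the example into cron, or run it by hand after a batch finishes.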
|
# ¿ Mar 9, 2018 00:30 |
|
derk posted:you should be able to install nano. VI is confusing to use, especially for a rookie such as yourself; unless you have documentation on how to navigate and execute in VI, I highly suggest NOT using it! Oh come on, figuring out how to exit vi is a rite of passage.
|
# ¿ Mar 9, 2018 15:17 |
|
nuvan posted:I tried to upgrade a few months back, but my system has Java 9 installed, and hydra wouldn't work with it. At the time it was set to wontfix, as the underlying problem was with spring, the web framework it uses. You can download Java 8 from oracle, unpack it somewhere and point nzbhydra at that JRE instead of the global one. Then you can still have java 9, but use 8 for those programs that only work with it.
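Something along these lines, for instance; a tiny helper that points the current shell at an unpacked JRE, so whatever you launch next picks it up (the JRE path and the jar name are made-up examples):

```shell
# pick_jre: make an unpacked JRE the one the current shell uses.
# The JRE path and hydra jar in the usage comment are made-up examples.
pick_jre() {
    export JAVA_HOME=$1
    export PATH="$JAVA_HOME/bin:$PATH"
}

# e.g.: pick_jre /opt/jre1.8.0_202 && java -jar /opt/nzbhydra/nzbhydra.jar
```

The system-wide default java is untouched; only processes started from this shell see the Java 8 binaries first on PATH.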
|
# ¿ Jun 11, 2018 18:05 |
|
el_caballo posted:Is it worth it to switch all my poo poo (SAB, Hydra, Radar, Sonar, etc.) to Docker stuff if I'm just running a Windows 10 file server? Hydra1 seems to be getting a lot of http 500 errors and complaints from Radar/Sonar as being unresponsive, so I was going to try out Hydra2 and that led to Docker research. I am of the belief that if I do not need technology X, I do not employ technology X. There are very valid use cases for Docker, running these programs on a server is not one of them. On the other hand, there probably is no harm in doing so, if your goal is to learn more about Docker.
|
# ¿ Jun 18, 2018 17:36 |
|
Thermopyle posted:A benefit of running hydra2 in docker is not having to install JRE on your server. Same goes for radarr/sonarr with regards to Mono....but I don't have as much against Mono as I do against JRE. I am not familiar with mono, whether it can be installed outside of the package manager or not, but java certainly can be. You can have 100 JREs installed on a computer that do not conflict with each other at all. All you need to do is unpack each one into its own folder and point each application at the one it needs.
For the more "complicated" applications, one may need to create a shell script that launches their shell script. Your shell script only needs to set the environment variables the app needs: JAVA_HOME, JRE_HOME, and maybe update PATH so it has /some/path/jre1.2.3/bin/ at the start. Done.
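That kind of launcher wrapper can even be generated. A sketch, where every path involved is an assumption:

```shell
#!/bin/sh
# make_wrapper: write a small launcher that pins an app to its own JRE,
# then hands off to the app's real start script. All paths are examples.
make_wrapper() {
    jre=$1 launcher=$2 out=$3
    cat > "$out" <<EOF
#!/bin/sh
export JAVA_HOME=$jre
export JRE_HOME=$jre
export PATH=$jre/bin:\$PATH
exec $launcher "\$@"
EOF
    chmod +x "$out"
}

# e.g.: make_wrapper /some/path/jre1.2.3 /opt/someapp/someapp.sh ~/bin/someapp
```

After that, running the generated wrapper starts the app under its private JRE while the rest of the system keeps whatever java it had.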
|
# ¿ Jun 18, 2018 18:36 |
|
Thermopyle posted:Yeah, I know. I wasn't saying that it's been impossible to use multiple JREs up to the introduction of Docker. I'm saying that Docker is just as valid a solution to that problem (and maybe even easier) and when you couple that with not wanting to manage a bunch of different JREs on top of managing your JRE-needing applications, Docker seems pretty attractive. If you have the HDD space, sure. When I played with docker (getting an application of mine built by a container and creating a slim container to host it) I found myself with a 30GB /var/lib/docker after a couple of hours. That's not very nice. Not very nice at all.
|
# ¿ Jun 18, 2018 19:31 |
|
The Gunslinger posted:... and removes the few security concerns I had about running a media server. ... It shouldn't. Docker is not a VM. One can get out of a VM as well, but it is a little bit harder; getting out of docker is comparatively a breeze. If you had security concerns running a media server before, the same concerns should still exist. As a former docker developer said: do not run code that you don't trust in docker. The goal of Linux cgroups is not security but resource usage control and accounting, and namespaces, again, are not a security replacement. A VM presents a much smaller attack surface on the host system.
|
# ¿ Jun 18, 2018 20:26 |
|
Dongattack posted:Do you guys pay for a usenet indexer? I've just been using the free ones and they seem to work perfectly, but wondering if i'm missing out on anything. I used to, 10 years ago, for a very popular one that shut down after a few years (I forget the name). Only a year or two ago I convinced myself to give $10 to nzbcat. But normally, no. They're a dime a dozen, appearing and disappearing like mushrooms after a rain. I've been using free ones as much as I can.
|
# ¿ Jun 28, 2018 15:04 |
|
Rexxed posted:Spotweb still exists and works kind of, but it seems like the Netherlands is its largest userbase. If you need NL subs for something it's got it. While there are dutch subs on some movies, most have them as external files, so they can be disabled/deleted. Some movies do have them hardcoded, that's true. Since it's a set-and-forget system, not having one is kinda foolish. I have had mine for a long time now (since newzbin died) and I am very happy with it. I get a lot of crap from there.
|
# ¿ Jun 29, 2018 01:56 |
|
salted hash browns posted:Why would anyone use an unlimited account? With block accounts you pay ~$0.04 per GB in a 500GB block, and unlimited is ~$10/mo. It would take 100TB/mo+ to recoup that cost with a unlimited monthly account. Huh? I don't quite follow that math. What I see there is $20 per 500GB block and $10/month for unlimited. So the only way the block is worth it is if I download less than 250GB per month; to make it worthwhile I would need to buy a 500GB block less often than once every 2 months. It's even worse if you look at the yearly plan ($90/year), and even so, 250GB per month is nothing. I use newsgroupdirect now, $75/year. Relatively OK: 50 connections, 3700 days retention.
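The break-even in that math checks out with a few lines of shell, using the quoted prices ($20 per 500GB block, $10/month unlimited), done in integer cents so plain POSIX arithmetic is enough:

```shell
# Break-even between a $20/500GB block and $10/mo unlimited, in integer
# cents so POSIX shell arithmetic suffices. Prices as quoted in the thread.
block_cents=2000
block_gb=500
unlimited_cents=1000

cents_per_gb=$((block_cents / block_gb))            # 4 cents per GB on the block
breakeven_gb=$((unlimited_cents / cents_per_gb))    # GB/mo where the two cost the same
echo "block: ${cents_per_gb}c/GB, break-even at ${breakeven_gb}GB per month"
```

Anything above that per-month figure and the unlimited account is the cheaper option, nowhere near the 100TB/mo claimed.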
|
# ¿ Oct 18, 2018 01:31 |
|
The Diddler posted:I've started watching everything I can on Netflix because you can skip the intro on pretty much everything. I didn't think I would care about that until I started watching older shows with 60-75 seconds worth of intro. A media player should have no issues skipping in 10-second intervals. Sure, it can get a bit annoying, but paying $10/month to not have to do that is a bit ... strange. That being said, I do still have netflix (even though I don't need it/want it), only because that was the only way the family would let me cancel cable many years ago. I would drop it in a heartbeat if I could.
|
# ¿ Oct 19, 2018 14:42 |
|
Sub Rosa posted:I never used NZBHydra, but I do have 4 indexers I use, so seems like something I should check out. Are people still using NZBHydra or should I start straight away with NZBHydra 2? Seconding NZBHydra2. About docker: sure, if you feel strongly about it. There is nothing to "install" or "configure" about java (just unzip the thing somewhere and be done), so installing the plain version works just as well.
|
# ¿ Nov 13, 2018 22:28 |
|
Heners_UK posted:I'll be honest, I felt the same way about it until I tried docker. The ease was huge once I spent a little time learning how it worked (probably the same amount of time it would take me to do a manual install). Well beyond any specific issues people may have with Java, Mono, Python etc. Oh I tried docker, used it for various purposes: running applications in a set environment, creating a known, stable and ephemeral build environment for applications, etc. It's fine, nothing wrong with it; I just find it a bit over-the-top that whenever a fart needs to be made in a computer, the first thing people say is "do it in docker". Sure sure sure, it may smell and all, but it's your fart so jesus, let it be. It's quite ridiculous. When installing an application requires nothing more than untarring a file ... eh, whatever, it's your poo poo, do whatever you like in it.
|
# ¿ Nov 14, 2018 04:56 |
|
ClassH posted:Anyone have oznzb and hydra2? For some reason it wont work but seems to test fine in radarr or sonarr. hydra2 recently had an update that broke SSL (I'm not familiar with the details of the issue). The developer said the latest update fixes all the troubles, but it could be that you have to update manually.
|
# ¿ Feb 14, 2019 20:04 |
|
frameset posted:People still use Sickbeard? I thought that died years ago. No program ever dies. Just development on it may stop.
|
# ¿ Apr 25, 2019 16:42 |