Greatest Living Man
Jul 22, 2005

ask President Obama
I really like http://www.nzbndx.com/ from what I've seen so far.


Greatest Living Man
Jul 22, 2005

ask President Obama
I've finally revitalized my 10-year-old tower with a new mobo, a new (cheap) Celeron processor, 16 gigs of RAM, 4 x 3 TB NAS HDs, and FreeNAS. I just got CouchPotato, Sonarr, and SAB talking to each other and everything seems hunky dory. Next step, I suppose, is installing rtorrent and ruTorrent.

Greatest Living Man fucked around with this message at 05:03 on Oct 11, 2016

Greatest Living Man
Jul 22, 2005

ask President Obama
NZBCat also has semi-open registrations right now.

https://nzb.cat/invites

Greatest Living Man
Jul 22, 2005

ask President Obama
I also use NZBHydra to replace CouchPotato entirely, by the way. It gives better flexibility when searching for specific releases (I can search for Ubuntu 2009 720p CANONiCAL).

I've modified a script that SABnzbd can use to notify Plex that it's finished a download, so I don't have to go in and refresh the library manually. I'll edit it into this post later tonight.

Greatest Living Man
Jul 22, 2005

ask President Obama

Violator posted:

Almost nothing automatically downloads via CouchPotato. I check the indexers every day and see things available, but CouchPotato never downloads them. I'll see things listed at the top of the CouchPotato main page, but it usually says something like "Releases found before ETA" in the log, and I have to click on them to download manually. Turning on "Always search" to get results before ETA seems like it would be a world of hurt in getting a bunch of crap. But do people generally turn that on to make things work better?

You'll likely get a poo poo ton of CAM releases that are mislabeled.

Greatest Living Man
Jul 22, 2005

ask President Obama
To have Plex autoupdate when you're done with anything in SAB:

Put this as autorun.py in your SAB folder (jails/sabnzbd_1/var/db/sabnzbd/MyScripts if you're on FreeNAS). Under Notifications in the SAB web GUI, select "Notification Script", browse to autorun.py, and set it to run after "Job finished".

The script takes a minute to execute when you hit "Test" because I'm bad at Python, but essentially it just sends a refresh to Plex, waits, refreshes, waits, and then refreshes again. That's maybe not necessary on your system, but for me one refresh wasn't doing the trick for Plex.

code:
#!/usr/local/bin/python2.7

#####################################
#
#   Updated script to refresh Plex library after download
#   update credit: ADS / Diaspar (diaspar8-xm at the provider named yahoo.com)
#   (original credit unknown)
#
#####################################

import urllib
import time
from xml.dom import minidom

host = 'YOURIPADDRESSHERE'
source_type = ['movie', 'show'] # Valid values: artist (for music), movie, show (for tv)
base_url = 'http://%s:32400/library/sections' % host
refresh_url = '%s/%%s/refresh' % base_url

def refresh_sections():
  # Ask Plex for its library sections, then hit the refresh
  # endpoint of every section whose type we care about.
  try:
    xml_sections = minidom.parse(urllib.urlopen(base_url))
    for s in xml_sections.getElementsByTagName('Directory'):
      if s.getAttribute('type') in source_type:
        urllib.urlopen(refresh_url % s.getAttribute('key'))
  except Exception:
    pass # Plex unreachable; nothing useful to do

# One refresh isn't always enough, so refresh three times
# with a 30-second pause between attempts.
for attempt in range(3):
  if attempt:
    time.sleep(30)
  refresh_sections()

Greatest Living Man
Jul 22, 2005

ask President Obama
Is there a way to set up NZBHydra with private trackers? Or is there a similar program that can be set up? I've been looking around and it seems all the different variations have died over the years.

Greatest Living Man
Jul 22, 2005

ask President Obama
Honestly I mostly use cat and nzbnoob. There's a lot of weird poo poo on nzbnoob, especially older stuff.

Greatest Living Man
Jul 22, 2005

ask President Obama

xgalaxy posted:

I'm a little behind the times on the latest stuff.
I was under the apparent false impression that NZBHydra was only useful for manual searches.
But it looks like you can actually hook it up to Sonarr, CP, etc. as the searcher itself and let Hydra do the lifting of figuring out which indexer to use.

Does anyone use Hydra in this way and does it work well?

I do this. The only issues I've had with Sonarr/NZBHydra was when I started adding Jackett torrent trackers -- haven't quite figured that one out yet.

Greatest Living Man
Jul 22, 2005

ask President Obama
I uh... managed to delete my entire TV collection of 10+ years while setting up manually downloaded season packs from Jackett/Transmission to auto-import into Sonarr. Not really sure how it happened, but it had something to do with messing with the Drone Import settings. Anyway, just make sure you're careful and make snapshots before you gently caress around with stuff.

Greatest Living Man
Jul 22, 2005

ask President Obama

Vykk.Draygo posted:

If it drone imported them, wouldn't it have to dump them out somewhere?

Well, the drone import settings in Sonarr say specifically that it may result in data loss if you use the same name for multiple folders. I didn't do that, but it still seems to have overwritten all of my /TV/* folder with a single series that I was trying to import automatically.

Greatest Living Man
Jul 22, 2005

ask President Obama

EL BROMANCE posted:

If the data hasn't been overwritten by something else, you can probably recover a good chunk of it. Hate it when stuff like that happens though, insanely frustrating.

I didn't have snapshots set up correctly to recover it. Now I have each of my data-containing folders set up as separate datasets, so I can snapshot them individually. Do you have an idea of how I'd recover "deleted" files on FreeNAS? From what I can tell, there's really no way without snapshots.

e: There's still a significant amount of space being taken up on my drives, which is ostensibly from the "deleted" shows. Of course, I can't find them.

Greatest Living Man
Jul 22, 2005

ask President Obama

MrCodeDude posted:

Is there a way to do preferred words for releases in Sonarr? The best option I've found for backfilling old releases is to temporarily add the tag "obfuscated" to a series (dramatically increases chance download succeeds), but for new releases, I don't need to wait for an obfuscated release (so I'll remove the tag once fully backfilled).

It would be nice if there was a way to make "obfuscated" a preferred, but not required, word.

And with all these obfuscated/scrambled releases and the supposed relationships between posters and indexers, are there now "premiere" indexers?

Afaik there is an option for this? It's for preferring releases from select groups, but you could put "obfuscated" in there and it would serve the same purpose.

Greatest Living Man
Jul 22, 2005

ask President Obama

MrCodeDude posted:

Looks like it's a low-priority feature request that won't be addressed until Sonarr 3.x versions: https://github.com/Sonarr/Sonarr/milestone/1

:smith:

Ah, my bad. I think this was actually a CouchPotato feature?

Greatest Living Man
Jul 22, 2005

ask President Obama

Greatest Living Man posted:

I uh... managed to delete my entire TV collection of 10+ years while setting up manually downloaded season packs from Jackett/Transmission to auto-import into Sonarr. Not really sure how it happened, but it had something to do with messing with the Drone Import settings. Anyway, just make sure you're careful and make snapshots before you gently caress around with stuff.

I kind of figured out what's going on. I have my Drone Factory folder set to my transmission download folder. When I manually download a season pack torrent to my transmission watched folder using Jackett for a series that's watched in Sonarr, it automatically sees that it's downloading and puts it in Activity. HOWEVER: It then spits out an error message "Download wasn't grabbed by Sonarr and not in a category, Skipping." once it's finished, and deletes the data associated with that torrent. Sometimes it will import the first episode, but still deletes the rest. I'm not sure how I can make Sonarr "accept" these downloads, because right now season pack searching through Sonarr itself doesn't seem to work.

e: https://forums.sonarr.tv/t/search-for-season-packs-torrent/14335/9 I guess there's no way to manually search for season packs?

Greatest Living Man fucked around with this message at 19:18 on Jul 23, 2017

Greatest Living Man
Jul 22, 2005

ask President Obama

Methylethylaldehyde posted:

Just chiming in that the completion check script for NZBget Here is loving amazing with sonarr and radarr. Configure it with a fast unlimited account and a decent set of block accounts for retention, and it's suddenly tons faster.

I have it set to check 2k articles, and if more than 50% aren't on my main provider, to fail that nzb and try another one, my speeds have gone up substantially because it isn't checking 6 servers sequentially to find the article, and my block usage has also gone down substantially.

Is this faster than SAB's completion check?

Greatest Living Man
Jul 22, 2005

ask President Obama
There are also some weird indexing phenomena -- for example, NzbNoob has a ton of old Adult Swim content I can't find on other indexers.

Greatest Living Man
Jul 22, 2005

ask President Obama

Thermopyle posted:

My wife would reward me greatly if I could find an indexer with HGTV shows.

It's amazing how much you can find on Usenet, but there's some obvious gaps.

Lemme know some names of shows you can't find and I'll check my indexers through Hydra.

Greatest Living Man
Jul 22, 2005

ask President Obama
Yeah I would say set quality to Any and make sure to use Sonarr. I am seeing a lot of these on NZBGeek especially. Then maybe Tabula Rasa, NZBNdx, ABNzb... Doesn't look like a lot of these are complete archives though.

And if nothing's popping up, check your Sonarr debug log file.

Worst case scenario, it's available on Sling. Cheapest without ads would be Hulu Plus for $12/mo.

Greatest Living Man fucked around with this message at 19:29 on Sep 5, 2017

Greatest Living Man
Jul 22, 2005

ask President Obama

Thermopyle posted:

Ugh, looks like the house hunters shows (which are indexed on nzbgeek!) don't match up with thetvdb at all, so of course Sonarr doesn't do poo poo with them.

That's another problem I run in to like every time with these shows that aren't exactly mainstream usenet material.

In that case, try using NZBGeek's TVSeek feature. I haven't personally used it, but it seems like it would sidestep the Sonarr/TVDB matching problems.

Greatest Living Man
Jul 22, 2005

ask President Obama

Keito posted:

Make a mapping on XEM to correct that! (or ask someone to do it for you)

http://thexem.de/

Whoa I'll have to check this out.

Greatest Living Man
Jul 22, 2005

ask President Obama

WhiteHowler posted:

I've been using Newshosting.com for over a decade and had few issues, but recently I've been getting tons of missing articles.

Can anyone recommend a good backup provider for me? Is there still a "if your primary provider is on X back-end you should use a backup on Y back-end" guideline?

Sorry if these are dumb questions, but I haven't had to worry about any of this in ages, and the OP hasn't been updated for five years, so I have no idea how much is still relevant.

Newshosting is Highwinds, so I would recommend Supernews (Giganews), Usenet.Farm blocks, or Astraweb. There are deals for all of these: Astraweb is $8/mo, Supernews is $10/mo, and Usenet.Farm has a 500 GB block for EUR 15. More block deals pop up around Black Friday.

Check out this: https://www.reddit.com/r/usenet/wiki/providers

Greatest Living Man
Jul 22, 2005

ask President Obama

Thermopyle posted:

The main problem I have with subs is forced subs.

Watching a movie and then I'm wondering "is this part supposed to be in another language without captions or is it missing forced subs?".

Agreed. I don't really know what a good workaround is. I guess it would help if there were a distinction between closed captions and regional translations? With most sub files you have to choose between two types of 'English' to find the right one.

Greatest Living Man
Jul 22, 2005

ask President Obama
Are iocage jails considered secure? I have my mono programs on one and my perl/python on another.

Greatest Living Man
Jul 22, 2005

ask President Obama

Less Fat Luke posted:

What's the best recommended provider these days? I'm using Supernews but anything over a month or two is ungrabbable from missing posts (even with Blocknews as a backup)

I switched from supernews to fast usenet about a year ago. No ragrets

Greatest Living Man
Jul 22, 2005

ask President Obama

BeastOfExmoor posted:

I've been running Sonarr and Radarr for a few weeks and ran into a few issues.

First, is it possible to figure out what search terms Sonarr is using to find episodes? I followed a new show and it's not finding episodes. If I manually search my tracker, they're all there and correctly named (This.Is.A.Show.S01.E01, etc.), but even a manual search in Sonarr brings zero hits.

Check the debug logs. They say pretty specifically what's hitting and why it's being excluded.

BeastOfExmoor posted:

Secondly, in Radarr I had an issue where I grabbed a 9GB file but ended up with only a one-minute clip (presumably the -sample), and everything else was deleted. Any solution to making sure this doesn't happen again? I use block accounts, so wasting large amounts of space is super annoying.

There is a setting in Radarr to remove extraneous files. There should also be a 'trash' folder that the file may have been moved to.

Greatest Living Man
Jul 22, 2005

ask President Obama

BeastOfExmoor posted:

Hmm, I guess I'll add all the usual video extensions to this list and hope that solves any future issues with "sample" files. I can't find any trash folder for Sonarr, so I guess I'm out of luck there.

IIRC it should be an exclusion list to remove stuff like .nfo files. Those particular files wouldn't be in the trash folder, but files deleted for an upgrade might be. You probably need to click show advanced settings.

Greatest Living Man
Jul 22, 2005

ask President Obama
Are public torrents in any way better than Usenet for content? I have a VPN and am moderately interested in setting up a jail behind the VPN for automated downloading of public torrents for stuff that's not found on private indexers or trackers.

Greatest Living Man
Jul 22, 2005

ask President Obama

EL BROMANCE posted:

Yeah, nobody to my knowledge has ever been busted for solely downloading from usenet.


For content? Maybe. Either stuff that has been DMCA'd or out of retention for your service. Even with a VPN I'd want to get off seeding a public torrent ASAP.

Really? I mean, there would be no incentive to seed a public torrent, but it's not like your true IP would be figured out by DMCA bots if you did. Unless the VPN provider complied with such requests.

Greatest Living Man
Jul 22, 2005

ask President Obama

EL BROMANCE posted:

Some examples. Day 1 all had [Fox at the end, I can’t remember what yesterday’s batches were, today are all Epic. Mix of stuff that’s out and stuff that isn’t. File size is growing, they were all under 1gb on day 1, yesterday all around 1.3. Naturally some of them are obvious fakes, but a 1.8gb HC HDRip is possible from a not so hot source. I’m guessing they’re not gonna stop, and will change again by tomorrow.

Yeah I have gotten a couple of those for new releases. Using geek & cat. Not really an issue so far since it's not getting executed. If it did, it's in a FreeBSD jail. Mostly just a minor annoyance.

Greatest Living Man
Jul 22, 2005

ask President Obama

Dongattack posted:

What kind of extensions should i add to my download blacklist? SabNZB only came with .exe and .com blacklisted so i added .lnk after catching up on the thread.

.bat, .dll, .py, .java, .jar...

Greatest Living Man
Jul 22, 2005

ask President Obama
Are there indexers for southeast Asian content? Like... who would have Masterchef Thailand?

Greatest Living Man
Jul 22, 2005

ask President Obama
i'm in a weird dimension

Greatest Living Man
Jul 22, 2005

ask President Obama

TenementFunster posted:

so best practice for sonarr is to have files download to a general directory, then sort them manually from there? doesn't that defeat the main point of sonarr?

also where/what is the drone folder? google isn't helping.

In my opinion, it's best to have a dedicated download folder but let Sonarr create symbolic links in your media folder. The details depend on the OS you're working with.
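For illustration, here's roughly what the symlink setup looks like in Python (the filenames and folder layout are made up; point them at your own download and media folders):

```python
import os
import tempfile

# Hypothetical layout: torrents land in a download folder, and the
# media folder only holds symlinks pointing back at those files.
root = tempfile.mkdtemp()
download = os.path.join(root, "downloads", "Show.S01E01.720p.mkv")
library = os.path.join(root, "TV", "Show", "Season 01", "Show - S01E01.mkv")

os.makedirs(os.path.dirname(download))
with open(download, "w") as f:
    f.write("video data")

# The symlink lets the torrent keep seeding from the download folder
# while Plex sees a clean, renamed library path.
os.makedirs(os.path.dirname(library))
os.symlink(download, library)

print(os.path.islink(library))  # -> True
```

Hardlinks avoid the broken-link problem if you ever clean out the download folder, but they only work within a single filesystem/dataset.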

Greatest Living Man
Jul 22, 2005

ask President Obama

Horn posted:

Is there a turnkey solution to hosting your own indexer? I already have a server running 24/7 so if I could just pull down some docker container that would be :perfect:

Sort of. A lot of major releases are obfuscated nowadays which makes things more difficult.

See https://nzedb.github.io/ and http://www.newznab.com/

Greatest Living Man
Jul 22, 2005

ask President Obama
I really do recommend nzbgeek. My only gripe with them is that the API is often fairly slow, but since it's automated I don't give a poo poo.
I got onto nzb.cat when they were offering lifetime VIP, so I also keep that around. I would set up a crawler to refresh https://nzb.cat/invites periodically.
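If you want to script that, a minimal sketch (the "closed" needle is a guess at the page's wording when invites are shut; eyeball the page once by hand and adjust it):

```python
import urllib.request

URL = "https://nzb.cat/invites"
NEEDLE = "closed"  # guess at the page's wording when invites are shut; adjust

def invites_maybe_open(fetch=urllib.request.urlopen):
    """Return True when the invites page no longer mentions NEEDLE."""
    try:
        html = fetch(URL, timeout=30).read().decode("utf-8", "replace")
    except OSError:
        return False  # site unreachable; just try again on the next run
    return NEEDLE not in html.lower()

if __name__ == "__main__":
    if invites_maybe_open():
        print("nzb.cat invites may be open -- go look!")
```

Drop it in cron every few hours rather than hammering the site.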

Also as a reminder, http://6box.me/ is still open and free.

Greatest Living Man fucked around with this message at 00:12 on Apr 2, 2019

Greatest Living Man
Jul 22, 2005

ask President Obama

wolrah posted:

Once you have more than 2-3 indexers in more than one application I recommend setting up NZBHydra. Basically what it does is emulates the Newznab API and acts as a proxy, so you set up your indexers in Hydra and then point Sonarr/Radarr/etc at it, then you only have one place to manage everything. You can prioritize indexers and it'll automatically handle throttling requests for sites with API limits.

It also gives you a single central search interface that hits all the connected indexers and shows your stats. There's multi-user functionality as well, no idea how that works though but if you share any usenet/indexer accounts with friends that might be useful.

Newer version:
https://github.com/theotherp/nzbhydra2

Greatest Living Man
Jul 22, 2005

ask President Obama

PitViper posted:

I just set myself up with VIP on DS and renewed my cat VIP, so hopefully between those two and .su I should have decent coverage. Sad day deactivating nzbs.org on my setup though. Any other indexers I should consider over one of those three? Looking back over the last few weeks, almost all my pulls were from .su anyway.

I get most unique stuff from DS and Geek at this point I think. Also NZBNooB has a lot of strange content.
e: and 6box.me is free

Greatest Living Man
Jul 22, 2005

ask President Obama

Duck and Cover posted:

I don't know usenet; this kind of thing is very annoying.


Sonarr's not great at grabbing full season packs in my experience. If you can't find a season NZB manually through Hydra or something, it's probably better to have a dedicated TV tracker.


Greatest Living Man
Jul 22, 2005

ask President Obama
Cat's back
