an actual cat irl
Aug 29, 2004

My SABnzbd+ has stopped picking up RSS bookmarks from NZBMatrix for some reason. It was working absolutely fine as of last weekend, but I just noticed that some stuff I bookmarked a few days ago hasn't even entered my SABnzbd+ download queue. I've tried updating to the newest SABnzbd+ release, but that didn't help. I've also tried completely clearing the bookmarks list in NZBMatrix, and my bookmark history within SABnzbd+, but that didn't help either.

Does anyone have any suggestions?

an actual cat irl
Aug 29, 2004

I set myself up a private install of Newznab and, I must say, it's been pretty straightforward overall. The only hindrance has been that the backfilling process would take a prohibitively long time for any decent number of groups.

I initially only activated two groups (two relatively busy x264 groups), but thought I'd see how long it'd take to backfill 1500 days. Well, just a single one of the groups had a total of 2.1 billion parts to index, and was going 20,000 at a time (with each 20,000 taking about 15-20 seconds), so I quickly came to the conclusion that it wasn't going to work out. Instead I've opted to start indexing from today, building towards an indefinite retention.
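
For anyone curious, the back-of-the-envelope maths looked roughly like this (the figures are just the rates I observed, so treat it as a sketch):

code:
# 2.1 billion headers, fetched 20,000 at a time, at roughly 17.5s per batch
echo "2100000000 / 20000 * 17.5 / 86400" | bc -l
# => ~21 days of non-stop indexing, for ONE group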

Newznab is a bit janky, with its chmod 777 weirdness and command-line PHP scripts to index etc, but it works well enough. I'm running it on a relatively modest VPS and, so far, it's holding up fine. I suspect that, if I was running a quad-core dedicated server with 24GB of RAM and a 1Gb/s connection, then perhaps the backfilling process wouldn't have been such a washout but, as it is, I'm on a 1GB Linode VPS, so I'm having to make do.

So, in conclusion, if you have the resources available, I recommend giving Newznab a go.

an actual cat irl
Aug 29, 2004

ClassH posted:

I set up Newznab on Windows and it was pretty easy. I backfilled 14 days and it wasn't too bad, but there is a way to import NZBs from a collection which is faster than backfilling. You can get the collection by searching or PM me, but you basically import these 40 gigs of NZBs and it's almost like a huge backfill.

Ah right, that probably explains why there are always people asking about NZB torrents in the Newznab chatroom, then? I'll probably just forego the backfilling process tbh. I think it'll be more hassle than it's worth, considering it's only a personal site.

On the subject of Astraweb, I left Newznab indexing overnight and was really disappointed to find that it'd barely managed to index anything compared to what's listed on nzbs.org. As an act of desperation, I registered a Supernews account to try it out with and, holy poo poo, it's incredible how much more stuff I'm getting now. I've used Astraweb for years and never really had an issue, but I can certainly echo the sentiments that they're being pretty lovely at the moment.

an actual cat irl
Aug 29, 2004

I've just installed Spotweb to see how it shapes up compared to Newznab. So far, it seems pretty sweet; it's nice that it can run perfectly happily on a Linux VM with very little CPU load, as opposed to Newznab, which has my VPS host sending me lovely warning emails on a daily basis about CPU usage and disk I/O.

One question - how does one actually contribute to Spotnet, in terms of both comments and spots? I can see existing comments on spots, but don't see any obvious way to add my own.

edit: sorry...ignore this question. I was logged in as the 'admin' user, and apparently these features aren't available with that account. Logging in as a regular user fixed it.

an actual cat irl fucked around with this message at 14:04 on Dec 22, 2012

an actual cat irl
Aug 29, 2004

mcsquared posted:

Hey I got Spotweb running in a VM. Cool. How long is the initial retrieve going to take on a terrible 1.5Mbps connection? We're about to leave for the holidays so... should I just leave this thing running if it's going to take days?

Just need to decide whether or not to remove the 'erotiek' (erotica) spots

My ADSL runs at about 13Mbps, and I managed to download all the spots in about 15-20 mins. It then took about another 20 mins to download all the comments. So, even on a slow DSL connection, I wouldn't expect it to take more than a few hours.

an actual cat irl
Aug 29, 2004

Has anyone set up a Newznab install running from a home server, rather than a VPS? If so, how have you found it running in an environment with relatively lower bandwidth etc? Does it totally saturate your connection for an extended period of time, and take forever to update groups (after the initial group-populating period, I mean)?

an actual cat irl
Aug 29, 2004

A question to those of you who have set up your own Newznab servers - which usenet provider do you use?

So far I've tried both Astraweb and Supernews, and have had mixed results. Astraweb works, but I don't seem to get as much stuff indexed as I'd expect (especially compared to the amount of stuff which gets indexed at nzbs.org or nzb.su). Supernews seemed to get more, but then I found I was getting hundreds upon hundreds of '.Server did not return 1 article(s).' for each group it was trying to index. Even with Supernews I wasn't getting as much stuff indexed as I'd have liked (or expected).

Has anyone else had mixed results, or know how to get around this?

an actual cat irl
Aug 29, 2004

Randuin posted:

Ah, I'm trying to find a solution that would hopefully not require me to run my own, since I don't really want to run something that disk-space intensive right now

I'm only indexing about ten groups with Newznab, but have found that disk space isn't really a massive problem. I'm using just over 5GB, and that includes the OS and everything too. I've focused entirely on HD/x264 groups, so get a fair few releases each day, but it eats a negligible amount of disk space. I'm not bothering to backfill at all, but I'd hope to be able to carry on running this for a couple of years, building up retention, until the virtual drive I installed it onto starts running out of space.
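
For anyone wondering where the space actually goes, I keep an eye on it with a quick du. The paths below are just my setup, and 'nzbfiles' is where my install seems to keep the generated nzbs, so adjust to taste:

code:
# the MySQL data plus the stored nzbs account for nearly all of it
du -sh /var/lib/mysql /home/mike/Public/newznab/nzbfiles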

I think the key is to be selective about which groups you activate. I initially activated alt.binaries.multimedia and alt.binaries.cores, but found it was getting 5,000,000+ new articles every few hours from each group (most of which seemed to be img sets ripped from porn sites) which was completely unnecessary given the relatively small number of releases they carried which interested me. I've got it to the point where I index 90% of the content I want, which is pretty good IMO.

Providing you're not attempting to index loads of ridiculously busy groups, the CPU and bandwidth requirements aren't anything to worry about. Rather than leaving the update script running on a perpetual loop, set it to only run overnight or something.

an actual cat irl
Aug 29, 2004

Has anyone had a crack at writing their own regexes for Newznab?

I'm indexing alt.binaries.u4all, and about 50% of the contents come back with hosed up names and don't categorise properly. The posting might contain a filename like '[U4A]MOVIE.NAME-releasegroup.etc', and Newznab is smart enough to grab the correct metadata and art etc, but then fails to rename the post to anything meaningful and, instead, I get a post titled something like '[U4A] 100113 26 04 11' stuck into my 'misc' category.

I've had a go at modifying a regex to improve things, but haven't got a loving clue what I'm doing really. Obviously it's possible, because sites like nzb.su manage it most of the time.
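
For what it's worth, this is the sort of thing I've been fiddling with so far (a rough sketch based on the filename pattern above; it definitely doesn't catch everything):

code:
/^\[U4A\](?P<name>[\w.-]+-[a-zA-Z0-9]+)/i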

Any ideas?

an actual cat irl
Aug 29, 2004

Madd0g11 posted:

In the misc/testing folder there is a script called update_parsing.php that renames the weird named releases using the NFO or release file of the post. I've been using it to clean up once a day and it's been helpful.

Thanks so much for this. I've now added it to my scheduler, and it seems to pick up a lot of previously missed releases. :woop:

an actual cat irl
Aug 29, 2004

St. Blaize posted:

I am setting up a local NewzNab server for private use and I can't quite figure out how to run the NewzNab update scripts on a schedule, like every 12 hours or once a day. Unfortunately I think I may be missing something about how those scripts work. Any help? I can't find anything I want on the free indexers.

What OS are you using?

I'm using Ubuntu as my host OS, and installed Gnome Schedule to run my update scripts. It's just a GUI to crontab. I modified the newznab_screen_local.sh script to run through once rather than on a constant loop, then scheduled it to run a couple of times a day. It works well!

an actual cat irl
Aug 29, 2004

St. Blaize posted:

I am running an Ubuntu 12.04 headless server. Do you have any further info on modifying newznab_screen_local.sh?

This is more or less what mine looks like...

http://pastebin.com/dx5hKMHR
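
In case that pastebin ever dies: the stock script basically runs the update scripts in an infinite loop inside screen, and my edit just removes the loop so it runs through once and exits. From memory it boils down to something like this (script names as per the stock misc/update_scripts folder, so double-check against your own copy):

code:
#!/bin/sh
# one pass of the usual update cycle, then exit (no 'while true' loop)
cd /path/to/newznab/misc/update_scripts
/usr/bin/php update_binaries.php
/usr/bin/php update_releases.php
exit 0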

I trigger it a couple of times a day, when I'm less likely to notice the effects of the CPU peaking or my broadband getting blasted. I can definitely see the value in running the default script in a loop if you're indexing some of the busier groups (alt.binaries.multimedia, alt.binaries.movies, or some of the MP3 groups, for example), but I narrowed my selection down to a handful of x264 and HD groups which don't get enough posts to really make constant refreshing necessary. I probably index in the region of 2.5-3m headers a day in total.

I dump the output of the update process into a date-coded text file so that I can go back and check it over, should I suspect there are any issues. My scheduled command looks like this...

code:
/path/to/newznab/misc/update_scripts/nix_scripts/newznab_screen_local.sh | tee "/home/moron/Desktop/newznab_logs/nab $(date +\%Y\%m\%d-\%k\%M).txt"; exit 0
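
For completeness, the actual crontab entries look something like this (the \% escaping matters, because cron treats a bare % as a newline; the times are just when I happen to run it):

code:
30 4  * * * /path/to/newznab/misc/update_scripts/nix_scripts/newznab_screen_local.sh | tee "/home/moron/Desktop/newznab_logs/nab $(date +\%Y\%m\%d-\%k\%M).txt"
30 16 * * * /path/to/newznab/misc/update_scripts/nix_scripts/newznab_screen_local.sh | tee "/home/moron/Desktop/newznab_logs/nab $(date +\%Y\%m\%d-\%k\%M).txt"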

an actual cat irl
Aug 29, 2004

Is the flood of releases with hashed titles in groups like alt.binaries.hdtv.x264 an attempt to avoid automated DMCA takedowns?

Newznab seems to completely fail at indexing these releases, and just shoves them in the misc category instead. It's really pretty irritating.

an actual cat irl
Aug 29, 2004

Telex posted:

If I set up a newznab on a VPS, is it gonna need its own usenet account and if so what's the state of reliability these days in terms of which one to use? I've got supernews at home so I suppose it'd be logical to get a second supernews account for the newznab to scrape with?

I scrape using Supernews, and it works great.

I initially tried with Astraweb and, boy, what a pile of poo poo that turned out to be. I would end up with page after page of 'file not found' every time I ran an update, and so much stuff would be missing. It's a shame, because I'd have really liked to be able to use the compressed headers feature. It's strange, because I used Astraweb for years prior to this at home for downloading, and never had a problem with takedowns or whatnot; but for some reason I found them totally hopeless for indexing.

an actual cat irl
Aug 29, 2004

I just noticed that my Newznab install has stopped adding releases. I watched the update script run, and noticed that it fails with this error before it gets a chance to actually add the releases to the database:

code:
PostPrc : Performing additional post processing on last 1 releases ...1.PHP Fatal error:  Allowed memory size of 268435456 bytes exhausted (tried to allocate 98 bytes) in /home/mike/Public/newznab/www/lib/nzbinfo.php on line 60
I have the memory_limit parameter in php.ini set to 512MB, which I thought would be ample. Does anyone know why this might have started happening?

FWIW, I'm using Nginx/FPM instead of Apache.

an actual cat irl
Aug 29, 2004

spathi-wa posted:

The message shows that it's running out of "268435456 bytes" which is 256MB. I think the memory_limit change didn't 'take'.

Did you restart the webserver after making this change?

Turns out I changed /etc/php5/php.ini, but neglected to change /etc/php5/cli/php.ini. Changed memory_limit to '-1' and it's working fine now.
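
If anyone else hits this, a quick way to check which ini file the CLI is actually reading, and what limit is in effect:

code:
php --ini                                      # lists the ini file(s) the CLI loads
php -r 'echo ini_get("memory_limit"), "\n";'   # the limit the update scripts will actually see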


Chick3n posted:

I'm getting the same error. My install was fine until a recent SVN update.

I know mine is only set to 256MB, but it was never an issue before. I've increased it to 512 to see if that fixes it.

I've not updated Newznab since I originally installed it a month ago. Is it worth grabbing new versions from their SVN regularly? I haven't found any sign of a changelog anywhere, and would be interested to see what's being changed/improved.

I must say, this memory error is troubling me a little bit. My Newznab install is almost exactly a month old, and it seems to have outgrown its original 256MB memory limit. Is the memory requirement going to keep on growing over time? I'm only running this from a relatively modest virtual machine with 2GB of memory allocated. I'm concerned that, in six months' time, I'll be getting more failed updates, except I won't be able to expand the memory further to accommodate it.

an actual cat irl
Aug 29, 2004

I've managed to gently caress up my Newznab install somehow. I installed some Ubuntu OS and package updates, and also did an svn update for newznab itself, and now some pages refuse to load.

If I click on a category (say, 'movies' for example) then I just get a totally blank page. No error, no 404, no SQL error or anything. Other pages like the admin page, the upcoming releases page etc work fine, and I'm able to search and view individual releases like normal; I just can't browse categories.

I initially thought it was a mod_rewrite issue but, upon further experimentation, I no longer think it is. newznab.server.com/upcoming still works, whereas newznab.server.com/movies doesn't.

There's absolutely nothing in my error or access logs to indicate what's going wrong. FWIW I'm using nginx and Percona as my hosting platform. I've already tried reapplying all the stupid chmod 777 stuff after updating from svn, thinking it might be a permissions thing, but no.

Can anyone offer any advice? I'd be gutted if I had to nuke my install and start from scratch again!

edit:

Ok, I've just seen the following pop up in my nginx error log:

code:
PHP message: PHP Warning:  require_once(WWW_DIR/lib/movie.php): failed to open stream: No such file or directory in /home/mike/Public/newznab/www/pages/upcoming.php on line 2
PHP message: PHP Fatal error:  require_once(): Failed opening required 'WWW_DIR/lib/movie.php' (include_path='.:/usr/share/php:/usr/share/pear') in /home/mike/Public/newznab/www/pages/upcoming.php on line 2" while reading response header from upstream, client: 192.168.$
It only appears once, and not every time I try and refresh the page, so I'm not sure if it's a red herring. It looks to me like the variable WWW_DIR isn't registering as my actual install location. As I understand it, this variable is defined in automated.config.php which is, in turn, called by config.php. Permissions are all good, so that shouldn't be the problem.
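
To dig into it, something like this at least shows whether the constant is being defined at all (paths per my install; config.php drags in other includes, so treat the second line as a crude check):

code:
grep -n "WWW_DIR" /home/mike/Public/newznab/www/config.php /home/mike/Public/newznab/www/automated.config.php
php -r 'require "/home/mike/Public/newznab/www/config.php"; echo WWW_DIR, "\n";'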

an actual cat irl fucked around with this message at 22:19 on Feb 7, 2013

an actual cat irl
Aug 29, 2004

Chick3n posted:

Did you follow the upgrade instructions?

1. svn update
2. Run the db patches, either manually from /newznab/db/patch/0.2.3 (look at the file modified/created dates) or by running /newznab/misc/update_scripts/update_database_version.php
3. Delete the contents of the cached smarty files in /www/lib/smarty/templates_c/*

What is line 2 of your newznab/www/pages/upcoming.php file? It should be require_once(WWW_DIR."/lib/movie.php");
If it isn't, rename the file to upcoming.php.bak, then update from svn again which will recreate the file.

Thanks so much. I had forgotten to run the database update script, and it's working perfectly again now! :D
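
For anyone else who trips over the same thing, the whole upgrade sequence from Chick3n's post condenses down to something like this (paths relative to the install root):

code:
cd /path/to/newznab
svn update
# apply any pending db patches
php misc/update_scripts/update_database_version.php
# clear the cached smarty templates
rm -rf www/lib/smarty/templates_c/*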

an actual cat irl
Aug 29, 2004

My SABnzbd install doesn't automatically pick up categories from either nzb.su or my own Newznab install. I was under the impression that it was supposed to do this. Can anyone tell me how to make it so?

an actual cat irl
Aug 29, 2004

EC posted:

SAB > Config > Categories. In the "group/indexer tags" column, put the name of the category coming from the indexer. You'll want to use wildcards if you're searching in subcategories, so use something like "Linux*" to catch everything with that tag in the name.

Huh...it was as simple as putting an asterisk at the end of the tag (so 'Movies*' rather than 'Movies'). Working great now....thanks!

an actual cat irl
Aug 29, 2004

Can anyone recommend a Dutch usenet host with reasonably priced blocks, to use as a backup server? I'm not familiar with any of the Dutch providers, so would welcome any recommendations!

an actual cat irl
Aug 29, 2004

Fart of Presto posted:

I've used tweaknews.eu for a couple of months now as a backup server and it has worked out really nicely.

Thanks for the tip! I'm going to give them a go.

an actual cat irl
Aug 29, 2004

Recently, I've found an increasing number of my SABnzbd downloads failing, requiring additional blocks. I have four backup providers but, nonetheless, just assumed that DMCA takedowns were to blame.

However, on a whim, I just tried manually running MacPAR on two of the failed downloads in my SAB temp folder and, to my surprise, all the PAR and RAR files checked out perfectly, didn't need to be repaired, and extracted just fine. These particular nzbs had failed in SAB due to missing blocks (one was apparently 434 short, and the other 1795). Unfortunately, the log files had been rotated, so I was unable to look to those for insight.
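
For reference, the manual check itself is nothing fancy; the command-line equivalent of what MacPAR was doing is roughly this (using par2cmdline and unrar, with made-up filenames):

code:
cd /path/to/sab/incomplete/some.download
par2 verify whatever.par2      # check the set against the recovery blocks
par2 repair whatever.par2      # only needed if verify reports damage
unrar x whatever.part01.rar    # extract from the first volume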

Has anyone else experienced an issue like this? I'm using SAB 0.7.11 on OS X 10.8.3.

an actual cat irl
Aug 29, 2004

ClassH posted:

Anyone have a decent regex for a.b.mom for people running their own newznab setup? The ones provided seem to miss about 90% of the stuff that is worthwhile. The big sites like nzbs.org and dognzb seem to find them fine.

This is the one I use for a.b.mom....

code:
/^(?P<name>\d{1,6})\[(?P<parts>\d{1,3}\/\d{1,3})\] - \".*\.(\d{1,2}|rar|vol\d{1,4}\+\d{0,4}\.par2|par2)\" yEnc/i
Also, are you using the update_parsing.php script from the misc/testing/ folder? IIRC, a.b.mom is one of the groups that benefits greatly from it.
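
Incidentally, if you're tinkering with regexes, it's worth sanity-checking them against a sample subject from the group before pasting them into the admin page. A throwaway one-liner does the job (the sample subject here is made up):

code:
php -r 'var_dump(preg_match(
  "/^(?P<name>\d{1,6})\[(?P<parts>\d{1,3}\/\d{1,3})\] - \".*\.(\d{1,2}|rar|vol\d{1,4}\+\d{0,4}\.par2|par2)\" yEnc/i",
  "123456[01/42] - \"whatever.rar\" yEnc", $m)); print_r($m);'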

Gozinbulx posted:

how helpful would it be to run your own newznab? Like, what am I missing with nzb.su?

If you're a happy nzb.su user, there's not necessarily any immediate benefit to running your own Newznab install. In fact, whilst the initial setup is straightforward enough, the gradual process of messing with regexes, scripts and settings, plus babysitting it to make sure it's picking up the same amount of content as nzb.su, is all a bit of a pain in the arse (although, to be fair, it only occupies a few minutes per day of my time).

However, once you reach a certain point, it's reassuring to know you have a fallback if/when the current round of big nzb indexers gets shot down.

Anyone running Newznab should have a look at http://www.newznabforums.com. Lots of helpful info and regexes there.

an actual cat irl fucked around with this message at 08:09 on Mar 28, 2013

an actual cat irl
Aug 29, 2004

YouTuber posted:

I have Astraweb and a block plan on a different backbone and I still seem to get them. As far as I can tell the blockplan server is active, but I'm still getting those errors. Of the 1TB plan I have, it's downloaded a mere 36MB.

Have a look in your SAB temp folder and try to manually check and extract the files using par2+rar apps. I've found SAB can be a bit poo poo at checksumming some files.

an actual cat irl
Aug 29, 2004

I'm getting lots and lots of failed downloads at the moment. Even when I try to repair manually, I'm finding lots of RARs with invalid checksums and missing PAR files. I have Supernews as my primary host, and backup block accounts from Astraweb, Blocknews, Tweaknews and NewsXS, but still no joy. This seems to mostly affect TV shows.

Is this likely to be due to lovely posts? Is anyone else seeing this? It's only started happening in the past couple of months (I previously almost never had failed downloads), and I'm concerned it might be something to do with my downloading setup.
