|
Laserface posted:is there any fix for sickbeard under lion? I re-installed Cheetah and it worked. Download it here: http://www.cheetahtemplate.org/download.html Once it's extracted, open a terminal in the extracted directory and run: sudo python setup.py install After that finishes, Sickbeard runs fine!
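A quick way to confirm the install actually took (a Python 3 sketch for illustration; on Lion-era Python 2 you'd just try `import Cheetah` in the interpreter):

```python
import importlib.util

def has_module(name):
    """Return True if the named module can be found, without importing it."""
    return importlib.util.find_spec(name) is not None

# After running setup.py install, has_module("Cheetah") should return True.
```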
|
# ¿ Jul 25, 2011 21:52 |
|
Vykk.Draygo posted:Mine is going super slow too, even after updating. I think it might have something to do with TheTVDB being down. Yup, it definitely doesn't like that site being down. Same problem here. Hopefully TVDB will come back up soon! Does anyone know if it's a temporary problem or a long-term one?
|
# ¿ Sep 10, 2011 21:13 |
|
So now that Notifo is shutting down, what should I use for notifications to my phone when shows download from Sickbeard?
|
# ¿ Sep 20, 2011 16:59 |
|
kri kri posted:Which phone? I looked on github the other day and someone added support for some android notifier app. I've got a couple of iOS devices. From the posts above, Prowl looks like it will work well. I don't mind spending the $3 it seems to cost.
|
# ¿ Sep 20, 2011 21:53 |
|
Has anyone had issues with sickbeard corrupting its SQLite database? I used to run sickbeard on OSX and it ran flawlessly, but since I've moved to a Linux box the database has become corrupt three times now, and I keep having to reimport my shows. Not sure what to try now!
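A first diagnostic step worth trying when this happens (a sketch using SQLite's built-in check; the database filename/path for your Sick Beard install is an assumption, adjust as needed):

```python
import sqlite3

def check_db(path):
    """Run SQLite's built-in integrity check; returns the string 'ok' if healthy."""
    con = sqlite3.connect(path)
    try:
        return con.execute("PRAGMA integrity_check;").fetchone()[0]
    finally:
        con.close()

# Sick Beard's database usually lives next to its data dir, e.g.:
# print(check_db("/opt/sickbeard/sickbeard.db"))
```

If this returns anything other than "ok", the file really is corrupt and restoring from a backup beats re-importing shows.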
|
# ¿ Nov 15, 2011 06:42 |
|
FISHMANPET posted:If they're throttling Usenet traffic (NNTP sounds like the right protocol) then SSL will get around that, because the ISP won't see it as NNTP traffic, just unreadable gibberish going to an SSL port. Here in Canada, Rogers throttles all encrypted traffic, regardless of port (this was done to shape encrypted BitTorrent traffic). It sucks because we use SSH a lot for development at our office, and while we're sending a large file, say, a VM image, they'll kill our connection.
|
# ¿ Mar 16, 2012 17:59 |
|
I was curious about writing my own NZB indexer after a few of my friends got boned by NZBMatrix closing. I spent a few hours writing a multithreaded Ruby app to crawl newsgroups and threw it on a VPS overnight. I was quite surprised to see that about 7GB of storage was used for 24 hours of headers. That could definitely be lower, since I'd group the headers into NZBs as they became complete and drop the raw rows, but it still takes quite a bit of disk space. I was kind of hoping to run my own indexer on a VPS, but since most of them come with like 25GB of space total, I'm not sure I could swing good retention without going dedicated. So that kind of sucks!
|
# ¿ Dec 18, 2012 21:13 |
|
Butt Soup Barnes posted:Aren't headers just small text files? I know there's a shitload of stuff on usenet but 7GB/day in just headers is pretty amazing. Yes, the headers are very small. I was simply storing the subject, author, and message_id, plus a couple of integer fields, in a postgres database. In 24 hours, from 89 a.b groups, I collected 25,043,238 headers. So that's what, an average of roughly 280 bytes per message?

Ashex posted:Are you using compression at all? You should consider using something like snappy. Nope, because I was just storing strings in rows in postgres. I'm sure I could do a lot better though. First of all, I'd skip any rows that don't have an xxx/yyy part marker in them (i.e., aren't parts of multipart binaries). Then, instead of storing the entire subject every time, I'd store the subject once and a separate row per part with just the part number and message ID.

Thermopyle posted:I've been thinking of doing the same after looking at newznab. The whole PHP-ness of it just bothers me and it sounds like a fun project. It was surprisingly easy to get a crawler up and running, even maximizing my 15 connections to Supernews. It was a fun little project and I might work on it a little more.
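The subject-deduplication idea above can be sketched roughly like this (Python for illustration; the part-marker regex and the data shapes are my assumptions, not the poster's actual schema):

```python
import re
from collections import defaultdict

# Multipart binary posts typically carry a part marker like "(03/47)" in the subject.
PART_RE = re.compile(r"\((\d+)/(\d+)\)")

def group_parts(headers):
    """Group (subject, message_id) pairs by base subject.

    Returns {base_subject: {part_number: message_id}}, so each subject string
    is stored once instead of once per article; rows without a part marker
    (i.e., non-binary chatter) are skipped entirely.
    """
    groups = defaultdict(dict)
    for subject, message_id in headers:
        m = PART_RE.search(subject)
        if not m:
            continue  # no (xx/yy) marker: not part of a multipart binary
        part = int(m.group(1))
        base = PART_RE.sub("", subject).strip()
        groups[base][part] = message_id
    return groups
```

With ~280 bytes per raw row, collapsing repeated subjects down to one row plus small per-part entries is where most of the savings would come from.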
|
# ¿ Dec 19, 2012 02:25 |
|
neurotech posted:I tried to do something like this with Ruby a few days ago and had serious issues with the few nntp gems available. Care to share how you pulled it off? You're right, those gems are awful. And they don't support the extension commands that make an indexer easier to write. So I rolled my own NNTP class. I'm not fully ready to release my indexer's source code, but here's the NNTP Connection class I used. It's pretty simple but works! https://gist.github.com/4337577
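For anyone curious what the overview handling involves, here's a minimal sketch of parsing one XOVER response line (Python for illustration, not the gist's actual Ruby; field order follows the standard overview defaults from RFC 2980/3977):

```python
def parse_xover_line(line):
    """Parse one tab-separated XOVER response line into a dict.

    Default overview field order: article number, subject, from, date,
    message-id, references, byte count, line count.
    """
    fields = line.rstrip("\r\n").split("\t")
    number, subject, sender, date, msgid, refs, nbytes, nlines = fields[:8]
    return {
        "number": int(number),
        "subject": subject,
        "from": sender,
        "date": date,
        "message_id": msgid,
        "references": refs,
        "bytes": int(nbytes),
        "lines": int(nlines),
    }
```

One XOVER per group range gets you everything an indexer needs without fetching article bodies at all.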
|
# ¿ Dec 19, 2012 16:35 |
|
Lone_Strider posted:I did near the same thing you did, and I incorporated rolling my own yenc decoding so I could use XZVER/XZHDR commands to pull compressed overview & header info. I'll post the code if you're interested. Would absolutely love this!
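For reference, the core of the yEnc decoding mentioned above is tiny; a rough Python sketch (this ignores the =ybegin/=yend framing and CRC checks that a complete decoder also needs):

```python
def yenc_decode(data: bytes) -> bytes:
    """Decode raw yEnc body bytes.

    Each output byte is (input - 42) mod 256; '=' escapes the next byte,
    which gets an extra 64 subtracted before the usual offset.
    """
    out = bytearray()
    escaped = False
    for b in data:
        if escaped:
            out.append((b - 64 - 42) % 256)
            escaped = False
        elif b == 0x3D:  # '=' escape character
            escaped = True
        else:
            out.append((b - 42) % 256)
    return bytes(out)
```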
|
# ¿ Dec 20, 2012 18:12 |
|
Lone_Strider posted:https://github.com/strider-/Nizbel/ Looks awesome. I'll look in more detail later.
|
# ¿ Dec 21, 2012 05:20 |
|
I occasionally have a weird error I haven’t been able to track down. Some shows will start to play fine, but then pause suddenly. I can skip past the bad part of the file, but sometimes it’s a 10-minute hole, sometimes 10 seconds. I’m using Sonarr / Sabnzbd. Is this something wrong with my setup? Anyone seen this before? I figured if a file was corrupt, Sabnzbd would mark it as failed.
|
# ¿ Apr 2, 2018 02:54 |
|
Two downloads failed last night, so I looked closer in SAB. Both seem to have downloaded properly, but when I click for details they both say "No par2 sets" in the repair section and had missing articles. What's odd to me is: shouldn't extracting the file fail if it's corrupt? One thing I've been thinking is that my connection here is very fast, so maybe I should delay before downloading? I've added a 10-minute delay in Sonarr to see.
|
# ¿ Apr 4, 2018 01:06 |
|
wolrah posted:Modern clients don't even try extracting if they don't completely download all the parts and don't have enough PARs to repair. Hell, most will even stop downloading when it becomes clear that there's no recovery. Which is what's surprising to me. It looks like it received the missing articles from my block account and extracted the file; however, the extracted file was still somewhat corrupt. The file I was downloading looks like it got a DMCA-style takedown, so maybe that happened while I was downloading it? I'm still confused as to how SAB was able to extract it if it was corrupt, though. I'm on the latest release.
|
# ¿ Apr 4, 2018 20:00 |