|
I've recently been having problems downloading files above around five gigabytes in size. I get an error message in SABnzbd stating that the transfer failed. Oddly, I downloaded four 4.5-gigabyte files this week with no issues. This just started happening all of a sudden. My first thought was that it was perhaps bad RAM corrupting the files, as I have had that happen before. However, the 4.5-gigabyte files had no issues. I am using Supernews with SSL. Any suggestions on what might be causing this?
|
# ¿ Feb 7, 2012 18:37 |
|
|
I have an issue where files larger than around 25 gigabytes pretty much always fail. I have been able to download some, but they typically fail, which is really frustrating since it uses up a lot of my monthly data cap. I don't seem to have this problem at all with files in the 10-15 gigabyte range or smaller. I am using Supernews as my primary and a thecubenet block account as a backup. All of the files are easily within the servers' retention. The error message is typically "Some files failed to verify against *************.sfv". Perhaps there is a setting I need to change?
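For what it's worth, an .sfv file is just a list of filenames and CRC32 checksums, so you can check for yourself which files are actually bad instead of trusting the blanket error. A minimal sketch in Python (the .sfv path is whatever ended up in your download folder):

```python
import os
import zlib

def crc32_of_file(path, chunk_size=1 << 20):
    """Compute the CRC32 of a file in chunks so big files don't eat RAM."""
    crc = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF

def verify_sfv(sfv_path):
    """Yield (filename, ok) for each entry in an .sfv file.

    SFV lines look like 'somefile.rar 1a2b3c4d'; lines starting
    with ';' are comments.
    """
    base = os.path.dirname(sfv_path)
    with open(sfv_path, "r", encoding="latin-1") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith(";"):
                continue
            name, _, expected = line.rpartition(" ")
            target = os.path.join(base, name)
            if not os.path.exists(target):
                yield name, False
                continue
            yield name, crc32_of_file(target) == int(expected, 16)
```

Anything that comes back False is a file SABnzbd would be trying to rebuild from par2 blocks, which at least tells you whether it's one bad part or the whole set.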
|
# ¿ Oct 11, 2012 00:54 |
|
Meta Ridley posted: "Dognzb looks very nice, and the push feature almost makes sickbeard redundant (though sickbeard is more robust so I will keep using it)."

Unfortunately it does not say how many blocks short the large files are. I have some small 1-2 gig downloads that failed and were around 7-8 blocks short each, but I think that was because they were pretty old, over 600 days or so.
|
# ¿ Oct 11, 2012 03:18 |
|
I am getting a ton of new releases failing. The error message is generally "Repair failed, not enough repair blocks (x short)", though I am also getting "Download failed - Out of your server's retention?" and "Invalid par2 files, cannot verify or repair". At first I thought it was because of DMCA takedowns, but I have supernews.com as my main provider and a thecubenet.com block account as a backup. I can tell the backup block account is being used, because it shows bandwidth being drawn from it every day. Any suggestions? Could this perhaps be a hardware problem?
|
# ¿ Jan 13, 2013 00:50 |
|
torjus posted: "Try downloading without any sort of postprocessing and extract it manually."

Yep, this worked. I used QuickPar to verify it, then extracted the ISO, which was around 45 gigs, with no issues. Are there any options I can set in SABnzbd to fix this?
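If you want to skip post-processing per job rather than turning it off globally, SABnzbd's HTTP API takes a pp parameter when you queue an NZB (0 = skip, 1 = +repair, 2 = +unpack, 3 = +delete, if I'm reading the API docs right). A rough sketch of building such a request; the host, port, and API key here are placeholders for your own instance:

```python
from urllib.parse import urlencode

def build_addurl_request(nzb_url, api_key,
                         host="localhost", port=8080, pp=0):
    """Build a SABnzbd api?mode=addurl URL that queues an NZB.

    pp=0 asks SABnzbd to skip post-processing entirely (no
    verify/repair/unpack), so you can run QuickPar and extract
    by hand.
    """
    query = urlencode({
        "mode": "addurl",
        "name": nzb_url,   # URL of the .nzb to fetch
        "pp": pp,          # 0 skip, 1 +repair, 2 +unpack, 3 +delete
        "apikey": api_key,
        "output": "json",
    })
    return f"http://{host}:{port}/sabnzbd/api?{query}"
```

You'd then fetch that URL with urllib.request.urlopen against your own SABnzbd instance; the same pp values are also selectable per job from the queue in the web UI.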
|
# ¿ Jan 13, 2013 17:35 |
|
I'm trying to download a file that is around 60 days old. There are around 10 different versions of it of varying quality; unfortunately, all are "out of server's retention", which probably means a DMCA takedown notice hit every single one. I have supernews unlimited as my main provider and thecubenet.com as a secondary block account. The files are out of retention on both servers. Any suggestions for an additional block provider that might have the file I am looking for? (I know it's vague, but I don't want to break any rules.)
|
# ¿ Feb 5, 2013 02:31 |
|
|
tattoli posted: "I was actually getting this error a lot due to hitting the 254 character file path limit in Windows, try this (which apparently messes up couchpotato/sickbeard's ability to find your downloads) or shortening the length of the file paths before buying a block plan."

I tried this with no luck, setting it to 64. I also tried getting a Blocknews account. Same issue; I guess they somehow DMCA'd all 10 versions on Blocknews as well. Any other suggestions to try?
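Since the classic Windows MAX_PATH limit is 260 characters (the "254" above is in that ballpark once you account for the drive letter and terminator), a quick way to rule path length in or out is to scan the completed-downloads folder for paths near the limit. A minimal sketch, with the folder name as a placeholder:

```python
import os

# Classic Windows MAX_PATH is 260 characters, counting the drive
# letter, separators, filename, and terminating NUL.
MAX_PATH = 260

def too_long_paths(root, limit=MAX_PATH):
    """Return full file paths under root whose length is >= limit."""
    offenders = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            if len(full) >= limit:
                offenders.append(full)
    return offenders
```

If this reports nothing in the failing job's folder, path length probably isn't the culprit and the takedown theory looks more likely.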
|
# ¿ Feb 5, 2013 21:34 |