
sabnzbd's Introduction

SABnzbd - The automated Usenet download tool

License Join our Discord

SABnzbd is an Open Source Binary Newsreader written in Python.

It's totally free, easy to use, and works practically everywhere. SABnzbd makes Usenet as simple and streamlined as possible by automating everything we can. All you have to do is add an .nzb file; SABnzbd takes over from there, and the download is automatically fetched, verified, repaired, extracted and filed away with zero human interaction. SABnzbd offers an easy setup wizard and has self-analysis tools to verify your setup.

If you want to know more you can head over to our website: https://sabnzbd.org.

Resolving Dependencies

SABnzbd has a few dependencies you'll need before you can get running. If you've previously run SABnzbd from one of the various Linux packages, then you likely already have all the needed dependencies. If not, here's what you're looking for:

  • python (Python 3.8 and above, often called python3)
  • Python modules listed in requirements.txt. Install with python3 -m pip install -r requirements.txt -U
  • par2 (Multi-threaded par2 installation guide can be found here)
  • unrar (make sure you get the "official" non-free version of unrar)

Optional:

  • See requirements.txt

Your package manager should supply these. If not, we've got links in our installation guide.

Running SABnzbd from source

Once you've sorted out all the dependencies, simply run:

python3 -OO SABnzbd.py

Or, if you want to run in the background:

python3 -OO SABnzbd.py -d -f /path/to/sabnzbd.ini

If you want multi-language support, run:

python3 tools/make_mo.py

Our many other command line options are explained in depth here.

About Our Repo

The workflow we use is a simplified form of "GitFlow". Basically:

  • master contains only stable releases (which have been merged to master) and is intended for end-users.
  • develop is the target for integration and is not intended for end-users.
  • 1.1.x is a release and maintenance branch for 1.1.x (1.1.0 -> 1.1.1 -> 1.1.2) and is not intended for end-users.
  • feature/my_feature is a temporary feature branch based on develop.
  • bugfix/my_bugfix is an optional temporary branch for bugfix(es) based on develop.

Conditions:

  • Merging of a stable release into master will be simple: the release branch is always right.
  • master is not merged back to develop.
  • develop is not re-based on master.
  • Release branches branch from develop only.
  • Bugfixes created specifically for a release branch are done there (because they are specific, they're not cherry-picked to develop).
  • Bugfixes done on develop may be cherry-picked to a release branch.
  • We will not release a 1.0.2 if a 1.1.0 has already been released.
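The branch flow above can be sketched with plain git commands; the sandbox repository, branch names and version number below are illustrative only:

```shell
# sandbox demonstration in a throwaway repository
cd "$(mktemp -d)"
git init -q .
git checkout -q -b master
git -c user.name=demo -c user.email=demo@example.org commit -q --allow-empty -m "init"
git branch develop

# feature branch: based on develop, merged back into develop
git checkout -q -b feature/my_feature develop
git -c user.name=demo -c user.email=demo@example.org commit -q --allow-empty -m "feature work"
git checkout -q develop
git -c user.name=demo -c user.email=demo@example.org merge -q --no-ff -m "merge feature" feature/my_feature

# release branch: branched from develop only, merged into master when stable
git checkout -q -b 1.1.x develop
git checkout -q master
git -c user.name=demo -c user.email=demo@example.org merge -q --no-ff -m "release 1.1.0" 1.1.x
git log --oneline master
```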

sabnzbd's People

Contributors

caronc, degville, dependabot[bot], discordianfish, eberkund, fish2, francoism90, gwyden, hellowlol, jcfp, jdfalk, jm3, lparry, manandre, mnightingale, onecdonly, pairofdimes, puzzledsab, pyup-bot, renovate[bot], ricci2511, riksmith, sabnzbd-automation, safihre, sanderjo, savef, severinh, shypike, thezoggy, transifex-integration[bot]


sabnzbd's Issues

feature bug unzip

I noticed many more releases are in 7zip format, and they are not extracted. Is there a way to do that, or is this planned for a future release?
Thanks for the reply.

I use SABnzbd on a Synology DS412+ NAS.

smpl - 0.7.x

I noticed that the table header functions do nothing: click on the sorting arrow for Age and nothing happens.
If you trace the logic, you see it's calling queue/sort_by_avg_age with the arguments limit=5, start=0, dir=desc.
Running this manually you get the results but no sort, just as smpl does. Looking at the console I do see that the function is being run, but I don't see where it's broken:
2012-04-17 19:10:46,901::INFO::[nzbqueue:506] Sorting by average date...(reversed:False)
2012-04-17 19:11:41,427::INFO::[nzbqueue:506] Sorting by average date...(reversed:True)
2012-04-17 19:12:01,415::INFO::[nzbqueue:506] Sorting by average date...(reversed:True)
2012-04-17 19:12:09,654::INFO::[nzbqueue:506] Sorting by average date...(reversed:False)

The data returned is never sorted. This happens for all the table headers in smpl: sort_by_size / sort_by_name / sort_by_avg_age.

Icky Icky let's get busy stopping a newly downloading item

SAB 0.7.x git branch (Plush theme) latest pull bb99c0d
XP

I've noticed this for a while now; perhaps it's me, I dunno...

Add item to queue (via SB)...
as soon as humanly possible hit the (-) "Delete" then "Remove NZB & Delete Files" (item was on auto priority "force") ...
existing connections are not immediately disconnected (monitoring the console output). SAB tries to access the incomplete/folder, the stop process is not run, a traceback occurs (no such folder). In a nutshell, the proper finishing up functions need to be invoked.

Add item to queue (via SB)...
as soon as humanly possible hit the drop down "Stop", (item was on auto priority "force") ...
SAB item doesn't stop, I have to play with a combination of pause/stop a couple of times for the item to actually stop.

So, to clarify, SAB is NOT post processing at the time, the two above actions are attempted as soon as SAB starts to download articles.

I call these types of issues "personality" - no biggy, just a little bothersome when trying to test stuff :-)

Request: download .rar file first instead of .r00 file

I am on Windows using version 0.7.1, and I remember that 0.6.15 used to sort the files to be downloaded so that the .rar file was first. Now it puts the .r00 file first and the .rar file last. Please have the .rar file download first.
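The requested ordering can be sketched with a sort key like the following (an illustration of the idea, not SABnzbd's actual code):

```python
import re

def rar_first_key(filename: str):
    """Sort the .rar archive before its numbered .r00/.r01 volumes."""
    match = re.search(r"\.r(\d+)$", filename, re.IGNORECASE)
    if match:
        return (1, int(match.group(1)))   # .r00, .r01, ... after the .rar
    if filename.lower().endswith(".rar"):
        return (0, 0)                     # the .rar itself comes first
    return (2, 0)                         # everything else last

files = ["x.r01", "x.r00", "x.rar", "x.nfo"]
print(sorted(files, key=rar_first_key))   # the .rar leads the volume set
```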

Fatal Python error: GC object already tracked

I am running the latest SABnzbd from git on FreeBSD 9.1-RELEASE with Python 2.7.2. This also occurred with the official version in ports/packages, which is why I upgraded to a git pull, and it also occurred on FreeBSD 9.0-RELEASE-p3 through -p5.

After a few hours of runtime, SABnzbd just crashes and leaves the "Fatal Python error" in its error log.

Looks to me like there is an object deallocated which still has references on it.

thanks in advance

FR: Show hidden folders in path browser

I'd really like the ability to show "hidden" folders in linux (folders that begin with a '.' character). I'm using unraid with an sabnzbd plugin that's stored on the unraid install's cache drive. The cache drive is a drive that sits outside of the array that's used to cache writes, but also to store data that you don't want on the array itself. The way it works is a mover script is run every few hours that looks for any directory on the root of the cache drive not beginning with a '.' character, and copies that folder into the array.

Because I want all of SAB's downloads to go to the cache drive, then let sickbeard/headphones/couchpotato do the actual moving to the array, I need to have the downloads folder either be a folder beginning with '.', or be inside a folder beginning with '.', and the default location within the SABNZBD directory won't work because I have two other content downloading applications, and they all need to go to the same directory.

So, I'd like to have the ability to set the default download directory to something like /mnt/cache/.downloads/.incomplete, and have the completed directory just be /mnt/cache/.downloads. I looked through the pathbrowser.py file and saw the culprit:

# starting at line 81 in pathbrowser.py
try:
    for filename in os.listdir(path):
        fpath = os.path.join(path, filename)
        try:
            if NT:
                doit = (win32api.GetFileAttributes(fpath) & MASK) == TMASK and filename != 'PerfLogs'
            else:
                doit = not filename.startswith('.')    ## <= This line right here
        except:
            doit = False
        if doit:
            file_list.append({ 'name': unicoder(filename), 'path': unicoder(fpath) })

While I can understand the reason you'd not include hidden folders by default, it would be nice to at least have a checkbox in the path browser that toggles the setting. For the time being, I have to modify the source myself, which is great until the plugin auto-updates and overwrites my changes.
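A toggle like the requested checkbox could be backed by something as simple as this (a hypothetical helper, not the pathbrowser.py implementation):

```python
import os
import tempfile

def list_folders(path: str, show_hidden: bool = False):
    """Return subfolder names, optionally including dot-prefixed 'hidden' ones."""
    result = []
    for name in sorted(os.listdir(path)):
        if not show_hidden and name.startswith("."):
            continue  # mirrors the filtering being discussed above
        if os.path.isdir(os.path.join(path, name)):
            result.append(name)
    return result

# demo on a throwaway directory
demo = tempfile.mkdtemp()
os.mkdir(os.path.join(demo, ".downloads"))
os.mkdir(os.path.join(demo, "movies"))
print(list_folders(demo))                    # hidden folder filtered out
print(list_folders(demo, show_hidden=True))  # both folders listed
```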

I should mention that I've tried manually setting the paths both by editing the .ini file and by typing the correct paths into the text fields associated with the path browser, and it seems to work when I click save, but if I go to a different page in the admin and go back, the folders are back to their original values.

I also did manage to set the proper download location on a per-category basis, and those settings seem to stick, but I'd like to truly make that the default location rather than having to set it per-category.

When queue is paused, and a forced item is downloading, the summary ETA is incorrect, and the forced item has no ETA.

Steps to reproduce:

  1. Add 2 separate items to an empty SABnzbd queue
  2. Pause the queue (note the current ETA)
  3. Set the priority on one item to 'Force' (make sure the other one is not on 'Force')

Result:
4) It should start downloading the Forced item.
5) The ETA on the forced item should be 'Waiting', not the time
6) The overall ETA is incorrectly showing the combined ETA of both items, instead of just whatever is forced (in this case, one item, not both.)

Image with annotations:
http://i.imgur.com/CrD7F.jpg
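The expected behaviour can be expressed as a small calculation (a sketch with made-up numbers, not SABnzbd's internals): when the queue is paused, only force-priority items should count toward the overall ETA.

```python
def overall_eta(items, speed_bps, queue_paused):
    """Seconds remaining, counting only items that will actually download.

    items: list of (remaining_bytes, priority) tuples; 'force' items
    keep downloading even while the queue is paused.
    """
    active = [b for b, prio in items if not queue_paused or prio == "force"]
    if not active or speed_bps <= 0:
        return None  # nothing downloading: show 'Waiting' instead of a time
    return sum(active) / speed_bps

items = [(100_000_000, "force"), (200_000_000, "normal")]
print(overall_eta(items, 1_000_000, queue_paused=True))   # only the forced item counts
```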

IPv6: No proper listening socket

Situation:


C:\Users\Flashover>netstat -an | findstr 9090
TCP 0.0.0.0:9090 0.0.0.0:0 LISTENING
TCP [::1]:9090 [::]:0 LISTENING


These two listening sockets are not equivalent for IPv4 and IPv6 connectivity:

"0.0.0.0" (IPv4) == "::" (IPv6)

::(double colon) != ::1 (double colon and the number one)

The IPv4 socket is global, listening on all NICs for incoming connections.
The IPv6 is bound to "localhost" only, so this is not accessible from other PCs across the network.

The correct binding should be without the number "1" in the address, so "[::]:9090" would be the correct thing.

My "host" parameter in the sabnzbd.ini file can be set to either "0.0.0.0" or "::", but not both at the same time. When only using "::" it does work correctly for IPv6, but then IPv4 isn't bound at all.

Running:
Windows 7 Pro.
SABNzbd 0.7.2.

Please fix.

Same as https://bugs.launchpad.net/sabnzbd/+bug/1029007
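The reporter's point about the two addresses can be verified with Python's ipaddress module: "::" is the unspecified "any" address (the IPv6 counterpart of 0.0.0.0), while "::1" is loopback only.

```python
import ipaddress

any4 = ipaddress.ip_address("0.0.0.0")
any6 = ipaddress.ip_address("::")
loop6 = ipaddress.ip_address("::1")

print(any4.is_unspecified, any6.is_unspecified)  # both mean 'listen on all interfaces'
print(loop6.is_loopback)                         # ::1 only reaches localhost
```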

Very high load while downloading with Mac version 0.7.7

OS: Mac OS X 10.6.8
Processor: 2.66 GHz Intel Core 2 Duo
Memory: 2 GB

When we download a movie the download speed is extremely slow. I have a 100/100 fiber connection and a Usenet account with 30 connections and an unlimited speed limit.

When I test my connection with speedtest.net the results are: download 95 Mbit and upload 90 Mbit.
I checked the load on my CPU and it's around 85%. Maybe it's a bug in SABnzbd?

Speeds Incorrect (0.7.x)

After latest pull the speeds no longer show up correctly. It downloads just fine, just the speed and estimated times are no longer correct.

git pull @ 11/30/2011.

WishList: Login Failure Email Notification

Hi,
Can you add an email notification option to SABnzbd to notify of incorrect logins, please? If SAB is exposed through an external firewall this would help identify that people are trying to hack in; otherwise, how would you know?

Thanks!

Feature Request: Updates for Pause feature

Here are a couple of ideas I've had for the Pause button.

  • Instead of counting down, show "Paused until ..."
  • Provide a way to pause for "n hours" instead of just minutes
  • Provide a way to pause "until xx:xx"
    • This might be able to be done by Pausing and then creating a one-time scheduled event to unpause
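"Pause until xx:xx" reduces to computing the minutes until the next occurrence of a wall-clock time; a minimal sketch (the function name is made up):

```python
from datetime import datetime, timedelta

def minutes_until(target_hhmm: str, now: datetime) -> int:
    """Minutes from 'now' until the next occurrence of HH:MM (may be tomorrow)."""
    hour, minute = map(int, target_hhmm.split(":"))
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += timedelta(days=1)  # time already passed today: use tomorrow
    return int((target - now).total_seconds() // 60)

now = datetime(2024, 1, 1, 12, 0)
print(minutes_until("14:30", now))  # 150
print(minutes_until("09:00", now))  # 1260 (tomorrow morning)
```

Pairing this with the existing "pause for n minutes" logic would implement the one-time scheduled unpause suggested above.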

[REQ] Sort download queue by name

It would be nice if we could sort the queue by name, for example in case you are downloading some episodes and want to download them in ascending order.
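Plain lexicographic sorting would put episode 10 before episode 2, so a queue sort by name would likely want a natural-order key; a sketch:

```python
import re

def natural_key(name: str):
    """Split into text/number chunks so 'E2' sorts before 'E10'."""
    return [int(part) if part.isdigit() else part.lower()
            for part in re.split(r"(\d+)", name)]

queue = ["Show.S01E10", "Show.S01E2", "Show.S01E1"]
print(sorted(queue, key=natural_key))  # episodes in ascending order
```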

Request - In app version update

Like CouchPotato and Sick Beard have: when a new version is available, let it auto-update via the web interface...
Saves me the pain of logging in to my server every once in a while :)

Feature Request: API levels

  • API queries should take an API level parameter
  • Old API levels should remain available for a reasonable amount of time after new ones are released
  • Third-party app compatibility will not break with new releases

Auto refresh causes mouse menus to disappear

While in SAB's webpage, if I click on a menu such as to change the priority of an item, if the page refreshes while the menu is expanded, the menu will disappear which will require me to re-click on it to expand it again. Kind of like pressing escape while in a menu.

sabToSickBeard.py chooses Sample.mkv over Show.mkv

Destination folder for this episode: /data/TV/Show/Season 01

Moving file from /downloads/complete/sickbeard/Show.mkv to /data/TV/Show/Season 01/s01e1.mkv

Processing succeeded for /downloads/complete/sickbeard/Show.mkv

Processing /downloads/complete/sickbeard/Sample.mkv

Found result in history: (248741, 1, [])

Parsed /downloads/complete/sickbeard/Sample.mkv into Show (GroupName) [ABD: False]

File /data/TV/Show/Season 01/s01e1.mkv is larger than /downloads/complete/sickbeard/Sample.mkv

This download is marked a priority download so I'm going to replace an existing file if I find one

Deleting file /data/TV/Show/Season 01/s01e01.mkv

Destination folder for this episode: /data/TV/Show/Season 01

Moving file from /downloads/complete/sickbeard/Sample.mkv to /data/TV/Show/Season 01/s01e01.mkv

Processing succeeded for /downloads/complete/sickbeard/Sample.mkv

Error: During scheduler execution

Hi,

Received the following error today. I have no idea if I did something at the time that could have caused this.

Could be a one-time happening, but I chose to create this issue nevertheless, just in case.

2012-08-24 12:08:34,764 ERROR: ERROR: DURING SCHEDULER EXECUTION Read failed (no details available)
Traceback (most recent call last):
  File "/home/jocke/bin/sabnzbd/sabnzbd/utils/kronos.py", line 305, in __call__
    self.execute()
  File "/home/jocke/bin/sabnzbd/sabnzbd/utils/kronos.py", line 317, in execute
    self.action(*self.args, **self.kw)
  File "/home/jocke/bin/sabnzbd/sabnzbd/rss.py", line 83, in run_method
    return __RSS.run()
  File "/home/jocke/bin/sabnzbd/sabnzbd/rss.py", line 493, in run
    self.run_feed(feed, download=True, ignoreFirst=True)
  File "/home/jocke/bin/sabnzbd/sabnzbd/decorators.py", line 31, in newFunction
    return f(*args, **kw)
  File "/home/jocke/bin/sabnzbd/sabnzbd/rss.py", line 325, in run_feed
    d = feedparser.parse(uri.replace('feed://', 'http://'))
  File "/home/jocke/bin/sabnzbd/sabnzbd/utils/feedparser.py", line 3888, in parse
    saxparser.parse(source)
  File "/usr/lib/python2.7/dist-packages/drv_libxml2.py", line 176, in parse
    SAXException("Read failed (no details available)"))
  File "/home/jocke/bin/sabnzbd/sabnzbd/utils/feedparser.py", line 1828, in fatalError
    raise exc
SAXException: Read failed (no details available)

jocke@filserver:~/bin/sabnzbd$ git log -1
commit e24aedc6acf1e477887a115b9a423838bdb19172
Author: ShyPike <[email protected]>
Date:   Sat Aug 4 11:05:58 2012 +0200
jocke@filserver:~/bin/sabnzbd$ git remote -v
origin  https://github.com/sabnzbd/sabnzbd.git (fetch)
origin  https://github.com/sabnzbd/sabnzbd.git (push)

API addfile completely broken

Hi, I'm one of the FlexGet developers, and we have had SABnzbd API support for a long time. The problem with adding a URL to SABnzbd is that we don't get any feedback on retrieval failures, and for cookies etc. it would make more sense to retrieve the file ourselves and simply post it to SABnzbd.

After toying around for a while I had to take a closer look at the sources, since there wasn't any documentation for this API.

Problem: it does not work at all.

def _api_addfile(name, output, kwargs):

It gets the name just fine, but the later checks screw it up completely:

if name is None or isinstance(name, str) or isinstance(name, unicode):

Since it is a string, it will go and fetch the name from a different argument, okay... (Also, consider isinstance(name, basestring) instead of that str-or-unicode check.)

Later it will crash because

name.filename and name.value:

The variable name is a string and it doesn't have filename or value, so:

AttributeError: 'str' object has no attribute 'filename'

Maybe I will have more luck with more restrictive localfile API.
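The dispatch the reporter expects could look roughly like this (a hypothetical helper, not the actual _api_addfile code): accept either an upload object carrying filename/value or a bare name string.

```python
def extract_upload(name):
    """Return (filename, data) from either an upload object or a bare string.

    Upload objects (e.g. form file fields) expose .filename and .value;
    a bare string is just a name with no file data attached.
    """
    if hasattr(name, "filename") and hasattr(name, "value"):
        return name.filename, name.value
    if isinstance(name, str):
        return name, None
    return None, None

class FakeUpload:          # stand-in for a form file field
    filename = "job.nzb"
    value = b"<nzb/>"

print(extract_upload(FakeUpload()))  # ('job.nzb', b'<nzb/>')
print(extract_upload("job.nzb"))     # ('job.nzb', None)
```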

Crash during finish/unpack.

As you can see below, it's unpacking something and then... it isn't.
I'm not sure exactly what is going on, but I occasionally just find SABnzbd not running any more.

Version: 0.7.9

Python Version: 2.7.1 (r271:86832, Jun 16 2011, 16:59:06) [GCC 4.2.1 (Based on Apple Inc. build 5658) (LLVM build 2335.15.00)]

2013-05-10 04:34:18,546::INFO::[__init__:1592] Decoding /Users/kristoffer/Downloads/incomplete/something.720p.HDTV.X264-DIMENSION/222.720p-dimension.r20 yenc
2013-05-10 04:34:18,708::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:19,334::DEBUG::[__init__:1600] bps: 812220.759933
2013-05-10 04:34:19,651::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:22,130::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:22,448::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:22,974::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:23,132::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:23,502::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:24,236::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:24,290::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:24,341::DEBUG::[__init__:1600] bps: 811064.631139
2013-05-10 04:34:25,240::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:26,346::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:29,348::DEBUG::[__init__:1600] bps: 812513.022512
2013-05-10 04:34:29,717::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:30,088::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:31,091::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:31,303::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:31,567::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:31,780::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:31,887::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:32,781::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:32,793::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:33,096::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:34,358::DEBUG::[__init__:1600] bps: 812416.579367
2013-05-10 04:34:34,675::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:34,833::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:35,412::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:36,574::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:37,471::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:37,680::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:38,578::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:38,589::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:38,685::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:38,789::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:38,844::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:39,268::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:39,373::DEBUG::[__init__:1600] bps: 811282.731421
2013-05-10 04:34:39,427::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:39,534::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:39,544::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:39,585::DEBUG::[__init__:1600] Decoding [email protected]
2013-05-10 04:34:39,595::DEBUG::[__init__:1600] Saving data for SABnzbd_nzo_uP4M_L in /Users/kristoffer/Downloads/incomplete/something.S02E22.720p.HDTV.X264-DIMENSION/__ADMIN__
2013-05-10 04:34:39,633::INFO::[__init__:1592] Saving data for totals9.sab in /Users/kristoffer/Library/Application Support/SABnzbd/admin/totals9.sab
2013-05-10 04:34:39,634::INFO::[__init__:1592] Decoding /Users/kristoffer/Downloads/incomplete/something.S02E22.720p.HDTV.X264-DIMENSION/jc.is.a.badass.222.720p-dimension.sample.mkv yenc
2013-05-10 04:34:40,308::INFO::[__init__:1592] /Users/kristoffer/Downloads/incomplete/something.S02E22.720p.HDTV.X264-DIMENSION/__ADMIN__/SABnzbd_nzo_uP4M_L removed
2013-05-10 04:34:40,308::INFO::[__init__:1592] Saving queue
2013-05-10 04:34:40,308::INFO::[__init__:1592] Saving data for queue9.sab in /Users/kristoffer/Library/Application Support/SABnzbd/admin/queue9.sab
2013-05-10 04:34:40,309::INFO::[__init__:1592] Saving postproc queue
2013-05-10 04:34:40,309::INFO::[__init__:1592] Saving data for postproc1.sab in /Users/kristoffer/Library/Application Support/SABnzbd/admin/postproc1.sab
2013-05-10 04:34:40,586::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:40,688::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:40,790::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:40,892::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:40,994::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:41,096::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:41,198::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:41,299::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:41,401::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:41,503::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:41,605::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:41,707::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:41,809::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:41,911::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:42,013::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:42,115::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:42,217::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:42,319::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:42,421::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:42,523::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:42,625::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:42,726::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:42,828::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:42,930::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:43,032::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:43,134::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:43,236::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:43,337::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:43,439::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 04:34:43,541::INFO::[__init__:1592] Thread [email protected]:563: forcing disconnect
2013-05-10 13:37:38,656::INFO::[__init__:1592] Console logging for OSX App disabled
2013-05-10 13:37:38,657::INFO::[__init__:1592] --------------------------------
2013-05-10 13:37:38,657::INFO::[__init__:1592] SABnzbd.py-0.7.9 (rev=c237ddfef464649ec3713d43c441def6c8656f46)
2013-05-10 13:37:38,658::INFO::[__init__:1592] Platform = posix
2013-05-10 13:37:38,658::INFO::[__init__:1592] Python-version = 2.7.1 (r271:86832, Jun 16 2011, 16:59:06) 
[GCC 4.2.1 (Based on Apple Inc. build 5658) (LLVM build 2335.15.00)]

Init refresh too aggressive

SAB currently begins refreshing automatically. This causes an issue if the credentials have changed as the cached page will begin refreshing while still prompting for credentials. This could be fixed with an AJAX request to a test URL forced to not be cached and only enable refreshing if it succeeds.

"Download" link broken on plush on .7.X

This morning I pulled the latest develop version of 0.7.x, and the download link in Plush now goes to ipaddress// instead of "/", which directs the user to a broken page.

Small issue, but wanted to report it.

Repair hangs when no main packet found

Hello.

My download finished recently and sabnzbd is showing Repair: Starting repair on that job in the history and it's still in the queue. I've tried to repair manually with par2cmdline from linux x86_64 and it shows this:

par2cmdline version 0.5.4
Copyright (C) 2003 Peter Brian Clements.
Copyright (C) 2011-2012 eMPee584.
Copyright (C) 2012 Ike Devolder.

par2cmdline comes with ABSOLUTELY NO WARRANTY.

This is free software, and you are welcome to redistribute it and/or modify
it under the terms of the GNU General Public License as published by the
Free Software Foundation; either version 2 of the License, or (at your
option) any later version. See COPYING for details.

Loading "the.raid.redemption.2011.dubbed.1080p.bluray.x264-geckos.par2".
No new packets found
Loading "the.raid.redemption.2011.dubbed.1080p.bluray.x264-geckos.vol00+01.par2".
No new packets found

Main packet not found.

Relevant lines from log are here

It looks like it detects the lack of main packet, but the behaviour should be different IMO. I restarted the instance and it spawned another job with full progress bar and nothing happening. It still hangs at the Starting repair status in history.

Reproducing

Any job in which .par2 is missing a main packet should do, at least +Repair must be set I believe.

Download might fail math/logic

I just downloaded the recent build as of yesterday, and now some of my files fail, saying:

"Download might fail, only 100.3% of required 100.5% available". I am not sure if this is purely incorrect math (or something like it) or an incorrect calculation of what needs to be there in order to complete the download.

Multi-stage re-fetch attempts

When adding multiple nzbs from the same nzb source service, sabnzbd may end up with many items listed as:
WAIT 3581 sec / Trying to fetch NZB from https://nzbprovider/path/etc/

I believe this is due to failed fetch attempts or ratelimiting. It appears to wait exactly 1 hour (3600 seconds) before the next attempt. A staged approach for each re-attempt would incur less of a penalty for rate-limited services. For example:
3600 (1 hour)
5400 (1.5 hours)
8100 (2.25 hours)
12150 (3.375 hours)

Separately but perhaps in conjunction:
A recognition that the nzbs are all from the same service would also aid in preventing unnecessary fetches. For example, given 100 nzbs but with a ratelimit of 8 per hour, sabnzbd would receive an error on the 9th item. The subsequent 91 items should automatically be delayed for that same interim. In the subsequent fetch attempts, items 9 through 16 may be successful but item 17 may receive an error. The subsequent 83 should then also be delayed.
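The staged schedule proposed above is a geometric backoff with factor 1.5; a sketch:

```python
def refetch_delays(base=3600, factor=1.5, attempts=4):
    """Staged re-fetch schedule: each wait is 1.5x the previous one."""
    return [int(base * factor ** n) for n in range(attempts)]

print(refetch_delays())  # [3600, 5400, 8100, 12150]
```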

Enhancement Request: Purge Visible from history

I'd like to be able to filter the history list (e.g. search for "colbert") and purge all matches with a single button plus confirmation, rather than having to click the button next to each list item.

data load EOFError

2012-05-08 18:08:18,734::INFO::[__init__:814] Traceback:
Traceback (most recent call last):
  File "/usr/local/sabnzbd/sabnzbd/__init__.py", line 805, in load_data
    data = cPickle.load(_f)
EOFError

This has happened to me a few times today; SAB will just die and I have to start it up again.
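A defensive load path would treat a truncated admin file as missing data instead of dying; a sketch (not SABnzbd's actual load_data):

```python
import pickle

def load_data(raw: bytes):
    """Unpickle saved queue data, treating a truncated file as 'no data'."""
    try:
        return pickle.loads(raw)
    except EOFError:
        return None  # file was cut short (e.g. crash mid-write)

good = pickle.dumps({"queue": []})
print(load_data(good))   # {'queue': []}
print(load_data(b""))    # None: empty/truncated file
```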

SABnzbd fails post-processing

Preconditions:

  • Growl 1.3.1 installed via the Mac App Store
  • SABnzbd v0.6.10
  • Mac OS X Lion 10.7.2

Problem

Disabling notifications for post-processing in Growl causes post-processing to fail; re-enabling the notification fixes the problem.

HTTP Server crash

The http server crashes about once a day here. Here is the log dump :

2012-08-21 08:00:00,007::INFO::[downloader:252] Bandwidth limit set to 1024
2012-08-21 14:00:00,005::INFO::[downloader:252] Bandwidth limit set to 512
2012-08-22 02:00:00,000::INFO::[downloader:252] Bandwidth limit set to 0
2012-08-22 08:00:00,002::INFO::[downloader:252] Bandwidth limit set to 1024
2012-08-22 14:00:00,003::INFO::[downloader:252] Bandwidth limit set to 512
2012-08-22 17:50:57,547::ERROR::[_cplogging:55] [22/Aug/2012:17:50:57] ENGINE Error in HTTP server: shutting down
Traceback (most recent call last):
  File "/cfg/sabnzbd/cherrypy/process/servers.py", line 75, in _start_http_thread
  File "/cfg/sabnzbd/cherrypy/wsgiserver/__init__.py", line 1655, in start
    self.tick()
  File "/cfg/sabnzbd/cherrypy/wsgiserver/__init__.py", line 1703, in tick
    s, addr = self.socket.accept()
  File "<string>", line 4, in accept
  File "/usr/lib/python2.7/socket.py", line 202, in accept
error: [Errno 24] Trop de fichiers ouverts

(Translation from french: [Errno 24] Too many open files)

[REQ] Split combined NZBs into separate download tasks for faster processing

A combined NZB which, for example, contains several episodes would be processed faster if it were split into separate download tasks: downloading ep2 could continue while ep1 is post-processed. In addition, it is much clearer to the end user which of those episodes have failed; currently, if one episode fails it appears as if the whole combined NZB failed.

Feature Request: content encoding

parse Accept-Encoding header then set Content-Encoding and compress response body with CherryPy appropriately

Usage scenario:

  • mobile users accessing the HTTP interface
  • the XML/JSON API compresses well too
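
For what it's worth, CherryPy ships a built-in gzip tool (`tools.gzip`) that does exactly this negotiation. The core logic can be sketched with the stdlib alone (function name is illustrative):

```python
import gzip

def negotiate_gzip(accept_encoding, body):
    """Compress body if the client advertises gzip support.

    A simplified sketch of Accept-Encoding negotiation; a real
    implementation would also parse q-values and set Vary headers.
    """
    if "gzip" in (accept_encoding or "").lower():
        return gzip.compress(body), {"Content-Encoding": "gzip"}
    return body, {}
```

In CherryPy itself this would typically be enabled via config, e.g. `tools.gzip.on = True` with a suitable `tools.gzip.mime_types` list, rather than hand-rolled.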

Queuing issue with certain nzb links / UnpicklingError

On the develop branch, commit 944568d (Mar 28) seems to have introduced a queueing issue with certain nzb links, whether delivered by SickBeard or added via the Add NZB/Fetch option in SAB itself (the Mar 26 commit is unaffected).

Example:

Working: http://lolo.sickbeard.com/getnzb/78e9c962b710e88e5bda3799b15cd9ce.nzb&i=0&r=
Failing: http://www.newshost.co.za/nzb/506/CSI.S13E19.HDTV.x264-LOL.nzb

The nzb seems to be fetched correctly (it shows up in admin/future), but the nzb is not queued. Fetching the failing link manually, then uploading the nzb by hand does work correctly.

Log extract for the working link:

2013-04-11 12:25:10,187::INFO::[__init__:462] Fetching http://lolo.sickbeard.com/getnzb/78e9c962b710e88e5bda3799b15cd9ce.nzb&i=0&r=
2013-04-11 12:25:10,188::INFO::[nzbqueue:220] Saving queue
2013-04-11 12:25:10,189::DEBUG::[__init__:801] Saving data for SABnzbd_nzo_RgU5at in /home/edwin/.sabnzbd/admin/future
2013-04-11 12:25:10,190::INFO::[__init__:876] Saving data for queue10.sab in /home/edwin/.sabnzbd/admin/queue10.sab
2013-04-11 12:25:10,192::INFO::[urlgrabber:95] Grabbing URL http://lolo.sickbeard.com/getnzb/78e9c962b710e88e5bda3799b15cd9ce.nzb&i=0&r=
2013-04-11 12:25:10,708::INFO::[misc:770] Creating directories: /home/edwin/Downloads/incomplete/CSI.S13E19.HDTV.x264-LOL.4
2013-04-11 12:25:10,714::INFO::[nzbstuff:335] Skipping sample file [134685]-[FULL]-[#a.b.teevee]-[ CSI.S13E19.HDTV.x264-LOL ]-[01/41] - "csi.1319.hdtv-lol.sample.mp4" yEnc (1/20)
[...etc, fetching the files...]

But the failing link just stops after saving the queue.

2013-04-11 11:14:23,454::INFO::[__init__:462] Fetching http://www.newshost.co.za/nzb/506/CSI.S13E19.HDTV.x264-LOL.nzb
2013-04-11 11:14:23,456::INFO::[nzbqueue:220] Saving queue
2013-04-11 11:14:23,456::DEBUG::[__init__:801] Saving data for SABnzbd_nzo_UHgu2X in /home/edwin/.sabnzbd/admin/future
2013-04-11 11:14:23,456::INFO::[__init__:876] Saving data for queue10.sab in /home/edwin/.sabnzbd/admin/queue10.sab
2013-04-11 11:14:23,457::INFO::[urlgrabber:95] Grabbing URL http://www.newshost.co.za/nzb/506/CSI.S13E19.HDTV.x264-LOL.nzb
2013-04-11 11:14:23,656::INFO::[__init__:865] /home/edwin/.sabnzbd/admin/future/SABnzbd_nzo_UHgu2X removed
2013-04-11 11:14:23,656::INFO::[nzbqueue:220] Saving queue
2013-04-11 11:14:23,657::INFO::[__init__:876] Saving data for queue10.sab in /home/edwin/.sabnzbd/admin/queue10.sab
[...nothing more...]

When trying to repair the queue using Status/Queue Repair, an UnpicklingError is thrown:

2013-04-11 12:52:55,542::INFO::[postproc:89] Loading postproc queue
2013-04-11 12:52:55,542::INFO::[__init__:902] Loading data for postproc2.sab from /home/edwin/.sabnzbd/admin/postproc2.sab
2013-04-11 12:52:55,562::DEBUG::[__init__:837] Loading data for CSI.S13E19.HDTV.x264-LOL.nzb.nzb from /home/edwin/.sabnzbd/admin/future/CSI.S13E19.HDTV.x264-LOL.nzb.nzb
2013-04-11 12:52:55,563::ERROR::[__init__:853] Loading /home/edwin/.sabnzbd/admin/future/CSI.S13E19.HDTV.x264-LOL.nzb.nzb failed
2013-04-11 12:52:55,563::INFO::[__init__:854] Traceback:
Traceback (most recent call last):
  File "/home/edwin/SABnzbd/sabnzbd/__init__.py", line 845, in load_data
    data = cPickle.load(_f)
UnpicklingError: invalid load key, '<'.

Using the alternative pickling method raises a similar error.

But it seems to me the file doesn't need to be unpickled, just loaded, as it is an nzb file -- not a pickled object.

Addition: these seem to be two unrelated issues. I think the failing nzb shouldn't be "futured" in the first place, but now that it has been, load_data seems to be treating the nzb file incorrectly.
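
The `invalid load key, '<'` error supports this reading: the file begins with `<` because it is XML, not a pickle. A hypothetical guard for the loader (the function name is invented for illustration; SABnzbd's actual loader is `load_data`):

```python
import pickle

def load_admin_file(path):
    """Load an admin file, returning raw bytes if it is XML (an .nzb)
    rather than a pickled object.

    Illustrative sketch only: peeks at the first byte, since a pickle
    stream never starts with '<' but an nzb/XML file always does.
    """
    with open(path, "rb") as f:
        head = f.read(1)
        f.seek(0)
        if head == b"<":
            return f.read()  # nzb content: hand it to the XML parser
        return pickle.load(f)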

Bug: single-factor authentication

When a login and an API key are both specified in the config, the API only needs the apikey parameter to authenticate. I propose requiring both factors when the user has configured both. We could use a couple of bitmasks in API responses, like:

{
  "auth": {
    "status": 0,
    "mode": 11
  },
  "queue": {
    ...
  }
}

Here status indicates failures and mode indicates the required auth mechanisms, one bit per mechanism (values in binary):

0b01 = login
0b10 = apikey

For example, if both apikey and login are configured, we'll get a mode of 11 (binary). Continuing with mode 11, if login passed but apikey failed, we'd have a status of 10. A single boolean does not tell the whole story, and third-party apps should not be string-matching error messages as part of their business logic.
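
A minimal Python sketch of the proposed scheme (all names are illustrative, not SABnzbd API):

```python
# Bit assignments for the hypothetical auth bitmask proposed above.
LOGIN = 0b01
APIKEY = 0b10

def auth_fields(login_configured, apikey_configured, login_ok, apikey_ok):
    """Return (mode, status): mode = mechanisms required by the config,
    status = mechanisms that failed on this request (0 means success)."""
    mode = (LOGIN if login_configured else 0) | (APIKEY if apikey_configured else 0)
    status = 0
    if login_configured and not login_ok:
        status |= LOGIN
    if apikey_configured and not apikey_ok:
        status |= APIKEY
    return mode, status
```

With both factors configured, login passing and apikey failing, this yields mode 0b11 and status 0b10, matching the example above.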

Feature Request: Store parsed RSS items locally

SAB should store parsed RSS items locally, say up to xx,xxx items (an advanced config option).

Usage scenario: you realise you want a particular app. You have an RSS feed set up, and from your now-cached items you can see the app was already released. You tweak the SAB filterset so that future versions are auto-downloaded, and grab the latest version at the same time.

  1. A speedy way for the user to grab that download from ages ago.
  2. The parsed RSS data from today is put to good use much later.
  3. Fewer hits on the server (thinking about how many users search as a collective).

Good for all :-)
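
A bounded cache along these lines could be sketched as follows (a hypothetical illustration, not SABnzbd code; class and field names are invented):

```python
from collections import OrderedDict

class RssItemCache:
    """Keep up to max_items parsed RSS items, evicting the oldest first."""

    def __init__(self, max_items=10000):
        self.max_items = max_items
        self._items = OrderedDict()  # guid -> parsed item dict

    def add(self, guid, item):
        self._items[guid] = item
        self._items.move_to_end(guid)  # refresh position on re-add
        while len(self._items) > self.max_items:
            self._items.popitem(last=False)  # drop the oldest entry

    def search(self, term):
        """Case-insensitive title search over the cached items."""
        term = term.lower()
        return [i for i in self._items.values()
                if term in i.get("title", "").lower()]
```

Feed polling would call `add()` for each parsed item, and the UI search box would hit `search()` instead of the indexer.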

Incomplete folder should respect category subfolders

I keep my media on different hard drives depending on type, so .../complete/category1 is a symlink to a different disk than .../complete/category2.

When a download finishes for category 1, the file is moved within the same disk, which is no problem; for category 2, however, it is moved across disks. This could be avoided by having .../incomplete/category[12] folders that are used during download. These could point to different disks, so the data is downloaded directly onto the right disk, making the final move a very fast rename operation.

This should not be too complicated: every category would simply get its own subfolder in the incomplete folder.
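
The payoff is the difference between a rename and a full copy. A minimal sketch of the move step (function name is illustrative; `shutil.move` itself already attempts a rename before falling back to copy):

```python
import os
import shutil

def move_to_complete(src, dst):
    """Move a finished download into its category folder.

    If source and destination are on the same device, os.rename is an
    O(1) metadata update; otherwise fall back to copy-and-delete.
    """
    dst_dir = os.path.dirname(dst) or "."
    if os.stat(src).st_dev == os.stat(dst_dir).st_dev:
        os.rename(src, dst)  # same filesystem: instant
    else:
        shutil.move(src, dst)  # cross-device: copies, then removes src
```

With per-category incomplete folders on the right disks, the same-device branch would always be taken.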

Feature request: movie genre sorting

Would like the ability to sort movies by genre, i.e. pull the genre info from IMDB and use that as part of the eventual location name. This allows browsing by genre on systems that don't support further classification themselves, such as when viewing files via a PS3.

It would be good to support multiple genres, e.g. keep the actual file in Movies\Fred with symlinks at Movies\Comedy\Fred, Movies\Romance\Fred and Movies\Sports\Fred.
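
The multi-genre layout could be sketched like this (a hypothetical illustration, not SABnzbd's sorting code; paths and the function name are invented):

```python
import os

def link_genres(movie_path, genres, library_root):
    """Keep the real file where it is and symlink it into each genre folder.

    E.g. with genres ["Comedy", "Romance"], Movies/Comedy/<name> and
    Movies/Romance/<name> both point at the single real file.
    """
    name = os.path.basename(movie_path)
    for genre in genres:
        genre_dir = os.path.join(library_root, genre)
        os.makedirs(genre_dir, exist_ok=True)
        link = os.path.join(genre_dir, name)
        if not os.path.lexists(link):  # don't clobber an existing link
            os.symlink(movie_path, link)
```

One caveat worth noting: devices like the PS3 browse over network shares, and not every sharing protocol follows symlinks, so hard links or copies might be needed in some setups.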

NZB URLs from 3rd party clients aren't being properly unescaped before fetching

I've noticed that my instance of SABnzbd has started to fail to retrieve NZB URLs from an instance of Newznab. It usually gets an API request to add an NZB via a URL, and then cycles its queue trying to download, failing and retrying forever.

Logs are showing entries like this:

2013-02-03 13:31:45,903::INFO::[urlgrabber:116] Grabbing URL http%3A%2F%2Fexample.com%2Fgetnzb%2F555ca516a2c68351578ee52afc1b2e0c476025fa.nzb%26i%3D0%26r%3D
2013-02-03 13:31:45,903::DEBUG::[urlgrabber:355] No response from indexer, retry after 60 sec
2013-02-03 13:31:45,904::INFO::[urlgrabber:179] Retry URL http%3A%2F%2Fexample.com%2Fgetnzb%2F555ca516a2c68351578ee52afc1b2e0c476025fa.nzb%26i%3D0%26r%3D

Upon manually percent-decoding the URL and fetching it in my browser, it always succeeds, leading me to believe that SABnzbd isn't properly unescaping the URLs before trying to fetch them.
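
The decoding step itself is a one-liner in the stdlib. A sketch in Python 3 (the codebase at the time ran Python 2, where the equivalent is `urllib.unquote`); the URL below is made up for illustration:

```python
from urllib.parse import unquote

# A percent-encoded URL of the shape seen in the log above.
encoded = "http%3A%2F%2Fexample.com%2Fgetnzb%2Fabc.nzb%26i%3D0%26r%3D"
print(unquote(encoded))  # http://example.com/getnzb/abc.nzb&i=0&r=
```

Note that `%26` decodes to `&`, so blindly decoding the whole string also turns encoded query parameters into real separators; a careful fix would decode only URLs that arrive fully encoded.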

Using NZB "download" button on newzbin feeds gives incorrect job name

When clicking on the Download button on the RSS Feed's Filter page, it starts the download and saves the file with a malformed name. The name appears to be some sort of HTML tag surrounding the correct name of the download. I've looked at the source in Chrome and discovered the name coming from the nzbname attribute of the Download button's form data.

Temporary Download Folder reverts to default on 0.7.6Beta1

I'm running sabnzbdplus on Ubuntu 10.04. My /home partition is very small, and I have a much larger LUKS volume mounted elsewhere that is unavailable immediately on boot, until the password is entered. I want both my Temporary Download Folder and Completed Download Folder configured to paths on this larger LUKS volume.

In the past, on version 0.7.5 and prior, the temporary download folder would always revert if the configured path wasn't available upon sab start. As a workaround to that issue, I had simply removed sabnzbdplus from init.d using update-rc.d, and started it manually after the LUKS volume was mounted.

Since the 0.7.6Beta1 update, it seems that, even when the path is available when sabnzbdplus starts, the config gets reverted to default (Downloads/incomplete). It's impossible to even set the new path after sab starts, because now the config value doesn't stay set. Every time you go back to the "Folders" section of the config, the path has reverted. The current condition causes all downloads to stay in PAUSE state until manual action is performed (due to Minimum Free Space for Temporary Download Folder being configured).

In the past, I was able to work around this via a script. Now, the only way is to use a cross-device symlink. This isn't ideal, and takes manual intervention every time the app is restarted.

Previous behavior was annoying, but I had a workaround. Now, 0.7.6Beta1 is unusable without manual intervention. I'm still trying to find a new workaround (for some reason, scripting the symlink to happen before sab start isn't working, either, so I'm investigating alternatives). Ideally, the fix should be in the sab codebase.

Please look into why this is happening and see if a fix can make it to the next release.

I submitted this report on Launchpad, but now see that Launchpad is for translations and this tracker is for actual software bugs. I will delete the Launchpad post.

Thanks.

random cherrypy version

The bundled CherryPy version is arbitrary: it does not match any release version (at least I could not find one), nor does it completely match any of the svn revisions mentioned in VERSION.txt.

This makes it impossible to unbundle, because unbundling causes errors when uploading an nzb: https://gist.github.com/2966949

Being able to unbundle it is a requirement for packaging SABnzbd in Gentoo. We need compatibility with one of these CherryPy versions: 2.3.0, 3.1.2, 3.2.0, 3.2.1 or 3.2.2.

Let me know if I missed something.
