Kodi Community Forum
Sick Beard - Automatic TV Show Episode download/sort/rename, nfo/tbn maker & TV Guide - Printable Version

+- Kodi Community Forum (https://forum.kodi.tv)
+-- Forum: Support (https://forum.kodi.tv/forumdisplay.php?fid=33)
+--- Forum: Supplementary Tools for Kodi (https://forum.kodi.tv/forumdisplay.php?fid=116)
+--- Thread: Sick Beard - Automatic TV Show Episode download/sort/rename, nfo/tbn maker & TV Guide (/showthread.php?tid=63591)



- jons2crazy - 2010-02-07

midgetspy Wrote:Was that a typo? Cause the post process script is sabToSickbeard.py not autoProcessTV.cfg :0)

Duh! It was a mis-selection. Guess I just picked the first thing in the scroll menu. Thanks again.


- jons2crazy - 2010-02-07

..


- midgetspy - 2010-02-07

jons2crazy Wrote:..

Looks like you added Weeds with an invalid dir? I don't know why it let you add it, but once it's in there it's not surprising that it is having problems.


- manifestdestiny - 2010-02-07

midgetspy Wrote:Yeah because unfortunately it's named "CSI: Crime Scene Investigation" on both TVDB and TVRage but everybody just posts it as CSI. I haven't decided what to do about that. The best solution right now is to watch better shows :0)

Maybe what you should do is give users the ability to edit the name of the show that is used to search for the nzb files. So in this scenario I would tell Sick Beard that when it searches nzbmatrix for any episode of CSI, it should use "CSI" rather than "CSI: Crime Scene Investigation". This would save you the time of having to deal with other shows that have the same problem.


- kricker - 2010-02-07

midgetspy Wrote:Are you talking about a tvshow.nfo file? Can you show me a comparison of what XBMC generates and what Sick Beard is generating?
Attached is an export from XBMC for the show "Top Gear". Just remove the .txt extension.

I'll see if I can get one from Sickbeard now.


- xexe - 2010-02-07

I am considering telling SickBeard about all my shows. However, as I have many (hundreds), I want to be sure that I am not going to hammer TVDB and TVRage pointlessly.

Am I correct in saying:

SB only looks for an episode if the air date is now, i.e. it won't try to search for thousands of episodes aired a long time ago
Episodes are only backlogged if I tell them to be
TVDB is only scraped for show updates when I tell it to
TVNZB is only searched via the RSS cache

Essentially I am trying to understand the logic, and more importantly the frequency, of any type of internet lookup. Sorry, I don't speak Python so I cannot work this out on my own.

Lastly, is there any reason why SB wouldn't scale to this number of shows? Currently it takes a couple of minutes to load with < 100 shows; can I expect it to slow linearly as I up the show count?


- blacklist - 2010-02-07

xexe Wrote:I am considering telling SickBeard about all my shows. However as i have many (hundreds) I want to be sure that I am not going to hammer tvdb and tvrage pointlessly.

I actually did add all of mine, and watching the traffic coming from sickbeard, it doesn't seem to do a lot of hammering. I'm curious about the logic as well, though. If a show is tagged as "ended", does sickbeard stop trying to update it?

For instance, let's take The Apprentice. The show hasn't been on for almost a year, but should be starting up again soon. How does sickbeard determine when to start looking for it? By scraping the RSS from nzbmatrix and looking for it at every instance? Or by waiting for thetvdb to be updated with episode 10x1?

Just points of curiosity really! Thanks for this awesome tool.


- midgetspy - 2010-02-08

The logic isn't as good as it could be, but currently:

- providers are searched every 30+ minutes (depending on your search frequency) for episodes that are airing that day or marked as missing. Newzbin, TVNZB, and TVBinz require only a single search; NZBMatrix and NZBs.org require a separate search for each episode (because they don't allow multiple searches in a single query)
- TVNZB and TVBinz only cache the latest RSS feed, no deep searches
- shows are updated every 12 hrs whether they're "ended" or not (this should change)
- updating a show means reading all the TVDB info from the TVDB cache, which is updated once every 24 hours. This could be done better, but I'm waiting on the new TVDB API before I put any work into it
- episodes only go in the backlog if you tell them to, and they're only searched once a week
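The per-provider query-count behaviour described above can be sketched roughly like this (illustrative Python only; the names are hypothetical, not Sick Beard's actual code):

```python
# Providers that accept a single combined query per search run,
# per midgetspy's description above.
SINGLE_QUERY_PROVIDERS = {"Newzbin", "TVNZB", "TVBinz"}

def queries_needed(provider, episodes):
    """Return how many searches a provider requires for a batch of
    airing/missing episodes."""
    if provider in SINGLE_QUERY_PROVIDERS:
        return 1              # one query covers the whole batch
    return len(episodes)      # NZBMatrix / NZBs.org: one query per episode

# Example: two missing episodes cost one query on TVNZB but two on NZBMatrix.
print(queries_needed("TVNZB", ["S01E01", "S01E02"]))      # 1
print(queries_needed("NZBMatrix", ["S01E01", "S01E02"]))  # 2
```

This is why large libraries cost more against NZBMatrix and NZBs.org than against the single-query providers.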


- webmosher - 2010-02-08

Quick followup to issue below:
Changing the urlopen to the following fixed the problem:
Code:
f = urllib2.urlopen(url, None, 60)

Original issue is below:

I am having a bit of an issue with getting backlog results for NZBs.org. Using this example URL (generated by sickbeard):

Code:
http://www.nzbs.org/rss.php?dl=1&catid=1&i=MYUID&h=MYHASHNUMBER&age=300&q=%5EFraggle.Rock.S02E16&action=search

I get XML results in FF, Links2, and wget. However, I have traced this a bit by debugging the provider, and the urllib2.urlopen() call is not returning any data for the URL. This isn't logged as a known issue, and I searched the thread for issues with nzbs.org.

Some other things I tried:
1) Logged out of nzbs.org to ensure it wasn't caching something outside of the UID/hash.
2) Grabbed the URL in external browsers, both from a local machine (Firefox) and from the SABnzbd/sickbeard machine (links2 console), to ensure there were no firewall issues.
3) Thought it might be a user-agent detection issue. I am not sure what UA the Python urllib uses, but I tried a couple of generic agents using the agent switcher in FF. It did not seem to affect the fetch.

Anyhow, hope this can be resolved (or maybe it's just a local problem of my own). Sickbeard is a nice bit of work. Thanks for your efforts in providing it.

Cheers.


- midgetspy - 2010-02-08

So what exactly was the problem? You were getting no data, but adding a timeout results in data from the call? Unfortunately the timeout param for urllib2 isn't supported in Python 2.5.
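For what it's worth, a common workaround on Python 2.5 (where urlopen() lacks the timeout parameter) is to set a process-wide default socket timeout instead; this is a general sketch, not a change taken from Sick Beard itself:

```python
import socket

# Python 2.5's urllib2.urlopen() has no timeout argument, but every
# socket it opens honours the module-level default. Setting it once,
# before any urlopen() call, gives the same 60-second timeout as
# urllib2.urlopen(url, None, 60) does on Python 2.6+.
socket.setdefaulttimeout(60)

# Any new socket created after this point (including those opened by
# urllib2/urllib) will raise socket.timeout if the server stalls for
# more than 60 seconds.
print(socket.getdefaulttimeout())
```

The downside is that the timeout applies to every socket in the process, not just the one request.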


- kricker - 2010-02-08

kricker Wrote:Attached is an export from XBMC for the show "Top Gear". Just remove the .txt extension.

I'll see if I can get one from Sickbeard now.
I turned metadata creation back on in Sickbeard, but now I am not getting any .nfo files from it. It did download images, though. I left it alone all day thinking it would just take some time, but still no .nfo files. Do I need to stop Sickbeard and restart it?


- midgetspy - 2010-02-08

kricker Wrote:I turned metadata creation back on in Sickbeard, but now I am not getting any .nfo files from it. It did download images, though. I left it alone all day thinking it would just take some time, but still no .nfo files. Do I need to stop Sickbeard and restart it?

Did you refresh/update the show? It should generate NFOs for any episode every time the show gets updated/refreshed. As for the tvshow.nfo I think you would have to re-add the show to get it to regenerate that (which is probably not the way it should be).


- kricker - 2010-02-08

Yes I did a forced refresh on a couple of shows. I'll add a new show tomorrow if you still need a Sickbeard generated .nfo file for comparison.


- xexe - 2010-02-08

The database upgrade patch for r302 isn't working on my system:

Code:
[sickbeard-read-only] 10:09 AM>python fixDB-r302.py
Adding tvr_name column to tv_shows table
ALTER TABLE tv_shows ADD tvr_name TEXT
Fatal error executing query 'ALTER TABLE tv_shows ADD tvr_name TEXT': duplicate column name: tvr_name
Traceback (most recent call last):
  File "fixDB-r302.py", line 21, in <module>
    addColumn("tv_shows", "tvr_name", "TEXT", "")
  File "fixDB-r302.py", line 12, in addColumn
    connection.execute(sql)
sqlite3.OperationalError: duplicate column name: tvr_name

duplicate column name: tvr_name << bizarrely, this column doesn't exist
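One way a patch script can avoid this "duplicate column name" failure is to check the table's columns before issuing the ALTER. This is a hypothetical, idempotent version of the addColumn helper, not the actual code from fixDB-r302.py:

```python
import sqlite3

def add_column_if_missing(connection, table, column, col_type):
    """Add a column only if it doesn't already exist, so the patch
    script can safely be run more than once."""
    # PRAGMA table_info rows are (cid, name, type, notnull, dflt, pk);
    # index 1 is the column name.
    existing = [row[1] for row in
                connection.execute("PRAGMA table_info(%s)" % table)]
    if column not in existing:
        connection.execute("ALTER TABLE %s ADD %s %s"
                           % (table, column, col_type))

# Demo against an in-memory database mimicking a tv_shows table that
# already has the tvr_name column (the situation in the traceback).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tv_shows (show_id INTEGER, tvr_name TEXT)")
add_column_if_missing(conn, "tv_shows", "tvr_name", "TEXT")   # no-op
add_column_if_missing(conn, "tv_shows", "tvr_id", "NUMERIC")  # added
cols = [row[1] for row in conn.execute("PRAGMA table_info(tv_shows)")]
print(cols)  # ['show_id', 'tvr_name', 'tvr_id']
```

With a guard like this, re-running the patch on a database that already has the column would succeed instead of raising sqlite3.OperationalError.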


- kricker - 2010-02-08

Okay, here is the .nfo file created by Sickbeard for the same show. I had to delete the show then re-add it to get the .nfo file.