SBS error
Issue #109 (resolved)
I’m getting “Feed fetch error: This feed is limited to 1000 item(s).”

The whole traceback:
Traceback (most recent call last):
  File "./grabber.py", line 62, in <module>
    main()
  File "./grabber.py", line 42, in main
    for n in node.get_children():
  File "/mnt/Video/webdl/common.py", line 53, in get_children
    self.fill_children()
  File "/mnt/Video/webdl/sbs.py", line 78, in fill_children
    all_video_entries = self.load_all_video_entries()
  File "/mnt/Video/webdl/sbs.py", line 91, in load_all_video_entries
    entries = self.fetch_entries_page(offset, page_size)
  File "/mnt/Video/webdl/sbs.py", line 115, in fetch_entries_page
    raise Exception("Missing data in SBS response", data)
Exception: ('Missing data in SBS response', {'f': {'response': {'message': 'Feed fetch error: This feed is limited to 1000 item(s).'}, 'status': 'failed'}})
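For reference, the exception carries the raw JSON envelope returned by the SBS API. A minimal sketch of pulling the error message out of that envelope (`sbs_response_error` is a hypothetical helper, not part of webdl; the dict shape is copied verbatim from the exception above):

```python
def sbs_response_error(data):
    """Return the error message if the SBS JSON envelope reports a
    failure, or None on success.

    Envelope shape (from the exception above):
        {'f': {'response': {'message': ...}, 'status': 'failed'}}
    """
    envelope = data.get("f", {})
    if envelope.get("status") == "failed":
        return envelope.get("response", {}).get("message")
    return None

# The failing response from the traceback:
data = {'f': {'response': {'message': 'Feed fetch error: This feed is limited to 1000 item(s).'}, 'status': 'failed'}}
print(sbs_response_error(data))
```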
Comments (8)
-
Same here. FWIW, I’m using this workaround; it would seem the feed is actually limited to 10000 items, not 1000 as the API error reports:
$ git diff sbs.py
diff --git a/sbs.py b/sbs.py
index e2326ee..2fb077f 100644
--- a/sbs.py
+++ b/sbs.py
@@ -109,6 +109,8 @@ class SbsRootNode(SbsNavNode):
         return list(results.values())

     def fetch_entries_page(self, offset, page_size):
+        if offset > 10000:
+            return []
         url = append_to_qs(FULL_VIDEO_LIST, {"range": "%s-%s" % (offset, offset+page_size-1)})
         data = grab_json(url)
         if "entries" not in data:
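The patch short-circuits `fetch_entries_page` once the offset passes the cap, so the paging loop in `load_all_video_entries` stops on the empty page. A standalone sketch of that loop shape (`load_all_entries` and the fake `fetch_page` are illustrative, not webdl's actual code; the 10000-item cap is the commenter's observation, not a documented API limit):

```python
def load_all_entries(fetch_page, page_size=500, max_items=10000):
    """Collect pages until an empty page or the observed item cap."""
    entries = []
    offset = 0
    while offset < max_items:
        page = fetch_page(offset, page_size)
        if not page:  # empty page: the patched fetch returns [] past the cap
            break
        entries.extend(page)
        offset += page_size
    return entries

# Illustrative fake feed with 1200 items, served in pages.
def fetch_page(offset, page_size):
    total = 1200
    return list(range(offset, min(offset + page_size, total)))

print(len(load_all_entries(fetch_page)))  # prints 1200
```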
-
Thanks, that worked nicely. To call my Python skills pathetic is an understatement.
-
Worked for me too. Thanks.
-
Thanks.
-
repo owner - assigned the issue
repo owner - edited description
repo owner - changed status to resolved
- I’m getting the same error.