2023-07-29 09:40:23 [scrapy.core.scraper] ERROR: Spider error processing <GET https://weibo.com/ajax/statuses/mymblog?uid=xxx&page=79> (referer: https://weibo.com/ajax/statuses/mymblog?uid=xxx&page=78)
Traceback (most recent call last):
    return next(self.data)
  File "/home/lighthouse/.local/lib/python3.10/site-packages/scrapy/utils/python.py", line 336, in __next__
    for r in iterable:
  File "/home/lighthouse/.local/lib/python3.10/site-packages/scrapy/spidermiddlewares/offsite.py", line 28, in <genexpr>
    return (r for r in result or () if self._filter(r, spider))
  File "/home/lighthouse/.local/lib/python3.10/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/home/lighthouse/.local/lib/python3.10/site-packages/scrapy/spidermiddlewares/referer.py", line 352, in <genexpr>
    return (self._set_referer(r, response) for r in result or ())
  File "/home/lighthouse/.local/lib/python3.10/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/home/lighthouse/.local/lib/python3.10/site-packages/scrapy/spidermiddlewares/urllength.py", line 27, in <genexpr>
    return (r for r in result or () if self._filter(r, spider))
  File "/home/lighthouse/.local/lib/python3.10/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/home/lighthouse/.local/lib/python3.10/site-packages/scrapy/spidermiddlewares/depth.py", line 31, in <genexpr>
    return (r for r in result or () if self._filter(r, response, spider))
  File "/home/lighthouse/.local/lib/python3.10/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/home/lighthouse/work/WeiboSpider/weibospider/spiders/tweet_by_user_id.py", line 38, in parse
    item = parse_tweet_info(tweet)
  File "/home/lighthouse/work/WeiboSpider/weibospider/spiders/common.py", line 94, in parse_tweet_info
    "geo": data['geo'],
KeyError: 'geo'
Thanks to the author for this small but elegant tool; it works very well. Could this error be because some historical tweet formats simply don't have a 'geo' key?
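A minimal defensive sketch of one possible fix, assuming `parse_tweet_info` builds its item dict directly from the API JSON as the traceback suggests (this is a hypothetical reduction for illustration, not the project's actual code): replacing the subscript access with `dict.get` lets tweets that lack the `geo` key parse as `None` instead of raising `KeyError`.

```python
def parse_tweet_info(data):
    """Hypothetical reduction of the failing helper in common.py."""
    return {
        # data['geo'] raises KeyError when the key is absent;
        # data.get('geo') returns None instead, so older-format
        # tweets without a 'geo' field no longer crash the spider.
        "geo": data.get('geo'),
    }

new_tweet = {"geo": {"type": "Point"}}
old_tweet = {}  # historical format: no 'geo' key at all

print(parse_tweet_info(new_tweet))  # {'geo': {'type': 'Point'}}
print(parse_tweet_info(old_tweet))  # {'geo': None}
```

The same `data.get(...)` pattern would apply to any other optional field in the response whose presence varies across tweet formats.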