Hi, I hit an error when trying to start my script with Scrapy and scrapy-rabbitmq:
```
2016-04-12 14:34:06 [twisted] CRITICAL: Unhandled error in Deferred:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1274, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
  File "/usr/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1128, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 73, in crawl
    yield self.engine.open_spider(self.spider, start_requests)
  File "/usr/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1274, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "/usr/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1128, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python2.7/site-packages/scrapy/core/engine.py", line 238, in open_spider
    yield scheduler.open(spider)
  File "/app/project/external_apps/scrapy_rabbitmq/scheduler.py", line 56, in open
    if len(self.queue):
  File "/app/project/external_apps/scrapy_rabbitmq/queue.py", line 54, in __len__
    response = self.server.queue_declare(self.key, passive=True)
  File "/usr/local/lib/python2.7/site-packages/pika/adapters/blocking_connection.py", line 2329, in queue_declare
    self._flush_output(declare_ok_result.is_ready)
  File "/usr/local/lib/python2.7/site-packages/pika/adapters/blocking_connection.py", line 1181, in _flush_output
    raise exceptions.ChannelClosed(method.reply_code, method.reply_text)
pika.exceptions.ChannelClosed: (404, "NOT_FOUND - no queue 'my_spider:requests' in vhost 'docker'")
2016-04-12 14:34:06 [twisted] CRITICAL:
```
I set the settings as in the example. The queue named by RABBITMQ_QUEUE_NAME from the settings does get created on my RabbitMQ server, but the queue "my_spider:requests" was never created.
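The 404 comes from the `passive=True` declare in `queue.py`'s `__len__`: a passive `queue_declare` only asks the broker whether the queue exists, and the broker closes the channel with `404 NOT_FOUND` when it doesn't. A minimal sketch of a more defensive `__len__`-style helper, which treats the 404 as "queue empty" instead of crashing the crawl (hypothetical patch; the local `ChannelClosed` class stands in for `pika.exceptions.ChannelClosed`):

```python
class ChannelClosed(Exception):
    """Stand-in for pika.exceptions.ChannelClosed (raised on a 404 passive declare)."""


def queue_len(server, key):
    """Return the message count of queue `key`, or 0 if it does not exist yet.

    `server` is assumed to be a pika BlockingChannel-like object whose
    queue_declare(key, passive=True) returns a frame exposing
    response.method.message_count, as pika's BlockingChannel does.
    """
    try:
        response = server.queue_declare(key, passive=True)
        return response.method.message_count
    except ChannelClosed:
        # Queue not declared yet (404 NOT_FOUND): report it as empty
        # so the scheduler can fall back to the spider's start_requests.
        return 0
```

Alternatively, declaring the `my_spider:requests` queue non-passively (i.e. `passive=False`, which creates it if missing) before the scheduler opens would avoid the 404 entirely.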