TypeError: __init__() got an unexpected keyword argument '_job' #16

@rvogel

Description

I'm running scrapyd and ScrapyKeeper, with spiders scheduled as periodic jobs. Every time they run, I get the error below.

I've inspected my envvars and there isn't any SCRAPY_ variable there.

Any idea what is setting this SCRAPY_JOB variable?

INFO: Scrapy 2.1.0 started (bot: mybot)
CRITICAL: Unhandled error in Deferred:
CRITICAL:
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks
    result = g.send(result)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/scrapy/crawler.py", line 86, in crawl
    self.spider = self._create_spider(*args, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/scrapy/crawler.py", line 98, in _create_spider
    return self.spidercls.from_crawler(self, *args, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/scrapy/spiders/__init__.py", line 49, in from_crawler
    spider = cls(*args, **kwargs)
TypeError: __init__() got an unexpected keyword argument '_job'
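For context, scrapyd passes the job id to the spider as a `_job` keyword argument, so a spider whose `__init__` override doesn't accept extra keyword arguments fails exactly like this at `spider = cls(*args, **kwargs)`. A minimal sketch of the mechanism, using hypothetical spider classes (no Scrapy required):

```python
# Hypothetical classes illustrating the failure mode; scrapyd calls the
# spider with an extra _job=<jobid> keyword argument.

class StrictSpider:
    # Rejects unknown keyword arguments -> TypeError when _job is passed.
    def __init__(self, name=None):
        self.name = name


class TolerantSpider:
    # Accepts and stores extra keyword arguments such as scrapyd's _job.
    def __init__(self, name=None, **kwargs):
        self.name = name
        self.job_id = kwargs.get("_job")


try:
    StrictSpider(name="mybot", _job="2020-05-01T00_00_00")
except TypeError as exc:
    print(exc)  # __init__() got an unexpected keyword argument '_job'

spider = TolerantSpider(name="mybot", _job="2020-05-01T00_00_00")
print(spider.job_id)
```

In a real Scrapy spider, accepting `**kwargs` in the override and forwarding them with `super().__init__(*args, **kwargs)` should avoid the TypeError, since the base `Spider.__init__` accepts arbitrary keyword arguments.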
