
Django Dynamic Scraper: Celery Tasks not executed in Scrapyd #145

Open
benjaminelkrieff opened this issue Nov 19, 2020 · 0 comments

@benjaminelkrieff

I started using Django Dynamic Scraper (DDS) for a personal project.

I created my scrapers as documented, and everything works when I run `scrapy crawl` in a terminal. Now I want to use django-celery to schedule the scraping. I followed every step in the tutorial: created a periodic task, ran celeryd, ran scrapyd, deployed the Scrapy project, and set the scraper status to ACTIVE in the admin UI.
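For reference, my task wiring roughly follows the tutorial's example; the model, field, and spider names below are placeholders for my own project, and the exact `run_spiders` signature may differ slightly between DDS versions:

```python
# tasks.py -- roughly the wiring from the DDS tutorial.
# NewsWebsite, the field names, and 'article_spider' are placeholders.
from celery.task import task
from dynamic_scraper.utils.task_utils import TaskUtils
from open_news.models import NewsWebsite

@task()
def run_spiders():
    t = TaskUtils()
    # Arguments: reference object class, name of its scraper FK field,
    # name of its scheduler_runtime FK field, and the spider name.
    t.run_spiders(NewsWebsite, 'scraper', 'scraper_runtime', 'article_spider')
```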

The very first time the task fires, I can see that a process is spawned on the scrapyd server. It runs once and never runs again, even when I define a new periodic task.

Celery keeps sending tasks, but all I see in the scrapyd log is the following:

```
2020-11-19T12:18:36+0200 [twisted.python.log#info] "127.0.0.1" - - [19/Nov/2020:10:18:36 +0000] "GET /listjobs.json?project=default HTTP/1.1" 200 93 "-" "Python-urllib/2.7"
```
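So the task helper polls `listjobs.json` but apparently never calls `schedule.json`. As a sanity check I can schedule the spider against scrapyd directly, bypassing Celery and DDS, to confirm that scrapyd itself will run the job. This is a sketch using `requests`; the spider name and the DDS spider arguments (`id`, `run_type`, `do_action`) are placeholders matching what a manual `scrapy crawl` would pass:

```python
# Hypothetical manual check: schedule the spider via scrapyd's
# schedule.json endpoint, without Celery/DDS in the loop.
import requests

resp = requests.post('http://127.0.0.1:6800/schedule.json', data={
    'project': 'default',
    'spider': 'article_spider',   # placeholder spider name
    'id': '1',                    # placeholder reference object id
    'run_type': 'TASK',
    'do_action': 'yes',
})
print(resp.json())                # expect {"status": "ok", "jobid": ...}
```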

I tried deactivating dynamic scheduling as explained in the documentation, but it still does not work: the spiders are spawned only once, and I can't work that way.
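My suspicion (unverified) is that dynamic scheduling has pushed the next run far into the future, so the task helper skips the scraper on every Celery beat even though the task itself fires. Assuming the `SchedulerRuntime` model and its `next_action_time` field as in my DDS version, this can be checked from `python manage.py shell`:

```python
# Inspect the scheduler runtimes; a next_action_time far in the
# future would explain why the spiders never run again.
from dynamic_scraper.models import SchedulerRuntime

for rt in SchedulerRuntime.objects.all():
    print(rt.pk, rt.runtime_type, rt.next_action_time)
```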

If someone has already run into this issue, I would highly appreciate the help.
