Scrapy Multiple Spiders executing serially

By: Saheb
Source: Stackoverflow.com

Question

I have a Scrapy project in which I have integrated the spiders with Selenium WebDriver. The code looks something like this:

from scrapy.spider import BaseSpider

def start(i):
    # do some navigation with Selenium WebDriver
    return url

class abc(BaseSpider):
    name = '123'
    x = start(5)
    start_urls = [x]

    def parse(self, response):
        # scrape data
        pass

class fgh(BaseSpider):
    name = '456'
    x = start(8)
    start_urls = [x]

    def parse(self, response):
        # scrape data
        pass

... and so on (there are 20 classes like this).

Now whenever I run scrapy crawl 456, it starts executing from the first spider, '123'; control is never handed to the spider I asked for. I'm new to Selenium and Scrapy. Please help.
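(For reference, one common way to structure this kind of hand-off is to defer the Selenium navigation into start_requests(), so the browser work only runs for the spider that scrapy crawl actually selects; code placed directly in the class body runs as soon as the module is imported. This is only a minimal sketch, not the original project code: the navigate_and_get_url helper, the Firefox driver, and the example URL are assumptions.)

from scrapy.spider import BaseSpider
from scrapy.http import Request
from selenium import webdriver

def navigate_and_get_url(i):
    # Hypothetical stand-in for the original start(i): drive the browser,
    # then hand the resulting URL back to Scrapy.
    driver = webdriver.Firefox()
    try:
        driver.get("http://example.com/page/%d" % i)  # placeholder navigation
        return driver.current_url
    finally:
        driver.quit()

class fgh(BaseSpider):
    name = '456'

    def start_requests(self):
        # Runs only when scrapy crawl 456 selects this spider, unlike code
        # in the class body, which runs whenever the module is imported.
        yield Request(navigate_and_get_url(8), callback=self.parse)

    def parse(self, response):
        # scrape data
        pass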

By: Saheb


Answers

Actually, the problem was with stale .pyc files. I deleted them, and now it runs fine.
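A minimal sketch of that cleanup, assuming it is run from the project root (a plain shell delete of the .pyc files works just as well):

import os

# Walk the project tree and delete compiled bytecode so Python re-imports
# the up-to-date spider modules on the next crawl.
for dirpath, _dirnames, filenames in os.walk("."):
    for filename in filenames:
        if filename.endswith(".pyc"):
            os.remove(os.path.join(dirpath, filename))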

By: Saheb

