Writing a Crawler with Scrapy (Part 6): Running Multiple Spiders

mac · 2025-09-26

1. Create a directory (any name will do) at the same level as spiders, e.g. commands.
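As a sketch, the step above can be done from the shell. One detail the post leaves implicit: the directory must also contain an `__init__.py` so that Scrapy can import it as a package. The project name zhihuuser is taken from the settings example later in the post.

```shell
# From the project root (zhihuuser is the project name used later in this post)
mkdir -p zhihuuser/commands
touch zhihuuser/commands/__init__.py   # makes the directory an importable package
touch zhihuuser/commands/crawlall.py   # the file name becomes the command name
```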

2. Inside it, create a crawlall.py file (the file name becomes the custom command name).

crawlall.py

from scrapy.commands import ScrapyCommand
from scrapy.utils.project import get_project_settings


class Command(ScrapyCommand):
    requires_project = True

    def syntax(self):
        return '[options]'

    def short_desc(self):
        return 'Runs all of the spiders'

    def run(self, args, opts):
        # In Scrapy >= 1.0 the spider list lives on spider_loader;
        # very old versions exposed it as self.crawler_process.spiders.
        spider_list = self.crawler_process.spider_loader.list()
        for name in spider_list:
            # Schedule every spider; opts.__dict__ forwards the parsed
            # command-line options as spider keyword arguments.
            self.crawler_process.crawl(name, **opts.__dict__)
        self.crawler_process.start()

3. That is not quite everything: one more line is needed in the settings.py configuration file.

  COMMANDS_MODULE = 'project_name.directory_name'

COMMANDS_MODULE = 'zhihuuser.commands'

4. Run it:

scrapy crawlall
