updated readme
rmax committed Feb 17, 2013
1 parent 64a9164 commit 4352f85
Showing 1 changed file with 14 additions and 12 deletions.
README → README.rst: 26 changes (14 additions & 12 deletions)
Usage

In your settings.py:

.. code-block:: python

    # enables scheduling by storing the requests queue in redis
    SCHEDULER = "scrapy_redis.scheduler.Scheduler"

    # don't clean up redis queues, allowing crawls to be paused and resumed
    SCHEDULER_PERSIST = True

    # store scraped items in redis for post-processing
    ITEM_PIPELINES = [
        'scrapy_redis.pipelines.RedisPipeline',
    ]
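
With these settings the requests queue and the scraped items are both kept in redis, so any process that can reach the redis server can inspect or consume them. As a rough illustration (not part of the project), the following redis-py snippet lists the keys scrapy-redis creates for a spider; the ``dmoz`` spider name and the localhost connection are assumptions, and the exact key names depend on your scrapy-redis version:

.. code-block:: python

    import redis

    r = redis.Redis()  # assumes redis is running on localhost:6379

    # scrapy-redis keeps per-spider keys (such as the requests queue and the items list);
    # list whatever exists for the 'dmoz' spider and show each key's redis type
    for key in r.keys('dmoz:*'):
        print(key, r.type(key))
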
Running the example project
You can test the functionality by following these steps:

1. Set up the scrapy_redis package in your PYTHONPATH

2. Run the crawler for the first time, then stop it::

    $ cd example-project
    $ scrapy crawl dmoz
    ... [dmoz] ...
    ^C

3. Run the crawler again to resume the stopped crawl::

    $ scrapy crawl dmoz
    ... [dmoz] DEBUG: Resuming crawl (9019 requests scheduled)

4. Start one or more additional scrapy crawlers::

    $ scrapy crawl dmoz
    ... [dmoz] DEBUG: Resuming crawl (8712 requests scheduled)

5. Start one or more post-processing workers::

    $ python process_items.py
    Processing: Kilani Giftware (http://www.dmoz.org/Computers/Shopping/Gifts/)
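
The example project's ``process_items.py`` is not shown in this diff. A minimal sketch of what such a worker could look like, assuming ``RedisPipeline`` pushes JSON-serialized items onto a redis list named ``<spider>:items`` and that each item carries ``name`` and ``url`` fields (the key and field names are assumptions, check the example project's code):

.. code-block:: python

    import json

    import redis

    r = redis.Redis()  # assumes redis is running on localhost:6379

    while True:
        # block until the pipeline pushes the next scraped item
        _key, data = r.blpop('dmoz:items')
        item = json.loads(data)
        # field names are an assumption for this sketch
        print("Processing: %s (%s)" % (item.get('name'), item.get('url')))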
