mirror of https://github.com/scrapy/scrapy.git synced 2025-02-23 14:04:22 +00:00

Merge pull request #2302 from Granitosaurus/pipeline_doc_fix

[MRG+1] Fix JsonWriterPipeline example in docs
Mikhail Korobov 2016-10-18 16:20:59 +02:00 committed by GitHub
commit a5f4450313


@@ -106,9 +106,12 @@ format::

     class JsonWriterPipeline(object):

-        def __init__(self):
+        def open_spider(self, spider):
             self.file = open('items.jl', 'wb')
+
+        def close_spider(self, spider):
+            self.file.close()

         def process_item(self, item, spider):
             line = json.dumps(dict(item)) + "\n"
             self.file.write(line)
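Rendered outside the diff, the corrected pipeline can be exercised standalone. A minimal runnable sketch, with two assumptions not in the original: the file is opened in text mode (the docs snippet uses `'wb'`, which on Python 3 would need bytes), and a plain dict plus `None` stand in for a real Scrapy `Item` and `Spider`:

```python
import json


class JsonWriterPipeline(object):
    """Writes each item as one line of JSON (the JSON Lines format)."""

    def open_spider(self, spider):
        # Open the file when the spider starts, instead of in __init__,
        # so the matching close_spider hook can clean it up properly.
        self.file = open('items.jl', 'w')

    def close_spider(self, spider):
        self.file.close()

    def process_item(self, item, spider):
        line = json.dumps(dict(item)) + "\n"
        self.file.write(line)
        return item


# Minimal usage sketch (None stands in for a real Spider instance):
pipeline = JsonWriterPipeline()
pipeline.open_spider(None)
pipeline.process_item({"title": "example"}, None)
pipeline.close_spider(None)

with open('items.jl') as f:
    print(f.read().strip())  # {"title": "example"}
```

The point of the fix is the spider lifecycle: `open_spider`/`close_spider` are called by Scrapy around the crawl, so the file handle is released even though `__init__` no longer touches it.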
@@ -126,14 +129,7 @@ MongoDB address and database name are specified in Scrapy settings;
 MongoDB collection is named after item class.

 The main point of this example is to show how to use :meth:`from_crawler`
-method and how to clean up the resources properly.
-
-.. note::
-
-    Previous example (JsonWriterPipeline) doesn't clean up resources properly.
-    Fixing it is left as an exercise for the reader.
-
-::
+method and how to clean up the resources properly.::

     import pymongo
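The second hunk trims the prose around the MongoPipeline example, whose main point is the `from_crawler` pattern: pulling configuration from Scrapy settings at construction time. A minimal sketch of that plumbing, with the pymongo calls omitted and a hypothetical `FakeCrawler` standing in for a real crawler object:

```python
class MongoPipeline(object):
    """Sketch of the from_crawler pattern from the MongoPipeline
    docs example (only the configuration plumbing is shown)."""

    def __init__(self, mongo_uri, mongo_db):
        self.mongo_uri = mongo_uri
        self.mongo_db = mongo_db

    @classmethod
    def from_crawler(cls, crawler):
        # Read the MongoDB address and database name from Scrapy
        # settings, as the docs example does.
        return cls(
            mongo_uri=crawler.settings.get('MONGO_URI'),
            mongo_db=crawler.settings.get('MONGO_DATABASE', 'items'),
        )


# FakeCrawler is a hypothetical stand-in: a plain dict mimics
# crawler.settings, whose .get() the real Settings object also has.
class FakeCrawler(object):
    settings = {'MONGO_URI': 'mongodb://localhost:27017'}


pipeline = MongoPipeline.from_crawler(FakeCrawler())
print(pipeline.mongo_uri)  # mongodb://localhost:27017
print(pipeline.mongo_db)   # items
```

Scrapy calls `from_crawler` itself when instantiating the pipeline, so this is where settings-driven resources belong; opening and closing the actual database connection then goes in `open_spider`/`close_spider`, mirroring the JsonWriterPipeline fix above.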