
DOC simplify extension docs

Mikhail Korobov 2014-09-21 00:19:24 +06:00
parent a312ebfb43
commit e435b3e3a3
4 changed files with 13 additions and 18 deletions

View File

@@ -51,8 +51,8 @@ particular setting. See each middleware documentation for more info.
Writing your own downloader middleware
======================================
-Writing your own downloader middleware is easy. Each middleware component is a
-single Python class that defines one or more of the following methods:
+Each middleware component is a Python class that defines one or
+more of the following methods:
.. module:: scrapy.contrib.downloadermiddleware
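Purely as an illustration (not part of this commit), a downloader middleware that defines two of the methods referenced above might look roughly like the sketch below; the class name and the injected header are invented for the example::

    class ExampleHeaderDownloaderMiddleware(object):
        """Hypothetical middleware: stamps every outgoing request with a header."""

        def process_request(self, request, spider):
            # Returning None tells Scrapy to keep processing this request normally.
            request.headers.setdefault('X-Example', 'demo')
            return None

        def process_response(self, request, response, spider):
            # Must return a Response (or a Request to reschedule the download).
            return response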

View File

@@ -5,7 +5,7 @@ Extensions
==========
The extensions framework provides a mechanism for inserting your own
-custom functionality into Scrapy.
+custom functionality into Scrapy.
Extensions are just regular classes that are instantiated at Scrapy startup,
when extensions are initialized.
@@ -75,14 +75,10 @@ included in the :setting:`EXTENSIONS_BASE` setting) you must set its order to
Writing your own extension
==========================
-Writing your own extension is easy. Each extension is a single Python class
-which doesn't need to implement any particular method.
-The main entry point for a Scrapy extension (this also includes middlewares and
-pipelines) is the ``from_crawler`` class method which receives a
-``Crawler`` instance which is the main object controlling the Scrapy crawler.
-Through that object you can access settings, signals, stats, and also control
-the crawler behaviour, if your extension needs to such thing.
+Each extension is a Python class. The main entry point for a Scrapy extension
+(this also includes middlewares and pipelines) is the ``from_crawler``
+class method which receives a ``Crawler`` instance. Through the Crawler object
+you can access settings, signals, stats, and also control the crawling behaviour.
Typically, extensions connect to :ref:`signals <topics-signals>` and perform
tasks triggered by them.
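Since the new paragraph compresses the description of ``from_crawler``, here is a minimal sketch (not part of the diff) of an extension using it to reach settings, stats and signals through the ``Crawler`` object; the class name and the ``MYEXT_ENABLED`` setting are invented for illustration::

    from scrapy import signals
    from scrapy.exceptions import NotConfigured

    class SpiderCountExtension(object):
        """Hypothetical extension: counts spiders opened during the crawl."""

        def __init__(self, stats):
            self.stats = stats

        @classmethod
        def from_crawler(cls, crawler):
            # Settings, signals and stats are all reachable through the crawler.
            if not crawler.settings.getbool('MYEXT_ENABLED', False):
                raise NotConfigured
            ext = cls(crawler.stats)
            crawler.signals.connect(ext.spider_opened, signal=signals.spider_opened)
            return ext

        def spider_opened(self, spider):
            self.stats.inc_value('myext/spiders_opened')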
@@ -133,7 +129,7 @@ Here is the code of such extension::
        crawler.signals.connect(ext.spider_closed, signal=signals.spider_closed)
        crawler.signals.connect(ext.item_scraped, signal=signals.item_scraped)

-        # return the extension object
+        # return the extension object
        return ext

    def spider_opened(self, spider):
@@ -183,12 +179,12 @@ Telnet console extension
~~~~~~~~~~~~~~~~~~~~~~~~
.. module:: scrapy.telnet
-   :synopsis: Telnet console
+   :synopsis: Telnet console
.. class:: scrapy.telnet.TelnetConsole
Provides a telnet console for getting into a Python interpreter inside the
-currently running Scrapy process, which can be very useful for debugging.
+currently running Scrapy process, which can be very useful for debugging.
The telnet console must be enabled by the :setting:`TELNETCONSOLE_ENABLED`
setting, and the server will listen in the port specified in
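The truncated sentence above refers to the console's port setting; for context only (not part of the diff), enabling and pinning the console in a project's ``settings.py`` is a plain settings change, and the port range below is an assumed example value::

    # settings.py -- hypothetical project configuration
    TELNETCONSOLE_ENABLED = True
    TELNETCONSOLE_PORT = [6023, 6073]  # assumed range; the first free port in it is used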

View File

@@ -23,8 +23,7 @@ Typical use for item pipelines are:
Writing your own item pipeline
==============================
-Writing your own item pipeline is easy. Each item pipeline component is a
-single Python class that must implement the following method:
+Each item pipeline component is a Python class that must implement the following method:
.. method:: process_item(item, spider)
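To illustrate the single required method named above (again, not part of the commit), a pipeline implementing ``process_item`` could look roughly like this; the class and field names are made up::

    from scrapy.exceptions import DropItem

    class RequirePricePipeline(object):
        """Hypothetical pipeline: keeps only items that carry a 'price' field."""

        def process_item(self, item, spider):
            if not item.get('price'):
                # Dropping stops the item from reaching later pipeline components.
                raise DropItem("Missing price in %s" % item)
            # Returning the item passes it on to the next component.
            return item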

View File

@@ -52,8 +52,8 @@ particular setting. See each middleware documentation for more info.
Writing your own spider middleware
==================================
-Writing your own spider middleware is easy. Each middleware component is a
-single Python class that defines one or more of the following methods:
+Each middleware component is a Python class that defines one or more of the
+following methods:
.. module:: scrapy.contrib.spidermiddleware
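As with the downloader middleware section, the rewritten sentence defers the method list to the module reference that follows in the docs; a rough sketch (not part of this commit) of a spider middleware defining just ``process_spider_output`` is shown below, with the class name and URL length limit invented::

    class LongUrlFilterMiddleware(object):
        """Hypothetical middleware: silently drops requests with overly long URLs."""

        def process_spider_output(self, response, result, spider):
            # 'result' is an iterable of Requests and items produced by the spider.
            for element in result:
                if hasattr(element, 'url') and len(element.url) > 2083:
                    continue  # assumed limit, chosen only for the example
                yield element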