.. _settings:

Available Settings
==================

Here's a list of all available Scrapy settings, in alphabetical order, along
with their default values and the scope where they apply.

The scope, where available, shows where the setting is being used, if it's tied
to any particular component. In that case the module of that component will be
shown, typically an extension, middleware or pipeline. It also means that the
component must be enabled in order for the setting to have any effect.

.. setting:: ADAPTORS_DEBUG

ADAPTORS_DEBUG
--------------

Default: ``False``

Enable debug mode for adaptors.

See :ref:`topics-adaptors`.

.. setting:: BOT_NAME

BOT_NAME
--------

Default: ``scrapybot``

The name of the bot implemented by this Scrapy project. This will be used to
construct the User-Agent by default, and also for logging.

.. setting:: BOT_VERSION

BOT_VERSION
-----------

Default: ``1.0``

The version of the bot implemented by this Scrapy project. This will be used to
construct the User-Agent by default.

.. setting:: CACHE2_DIR

CACHE2_DIR
----------

Default: ``''`` (empty string)

The directory to use for storing the low-level HTTP cache. If empty, the HTTP
cache will be disabled.

.. setting:: CACHE2_EXPIRATION_SECS

CACHE2_EXPIRATION_SECS
----------------------

Default: ``0``

Number of seconds to use for cache expiration. Requests that were cached before
this time will be re-downloaded. If zero, cached requests will always expire.
Negative numbers mean requests will never expire.

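For example, to consider cached pages fresh for one day::

    CACHE2_EXPIRATION_SECS = 60 * 60 * 24    # one day, in seconds
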
.. setting:: CACHE2_IGNORE_MISSING

CACHE2_IGNORE_MISSING
---------------------

Default: ``False``

If enabled, requests not found in the cache will be ignored instead of downloaded.

.. setting:: CACHE2_SECTORIZE

CACHE2_SECTORIZE
----------------

Default: ``True``

Whether to split the HTTP cache storage into several dirs, for improved
performance.

.. setting:: CLOSEDOMAIN_NOTIFY

CLOSEDOMAIN_NOTIFY
------------------

Default: ``[]``

Scope: ``scrapy.contrib.closedomain``

A list of emails to notify if the domain has been automatically closed by timeout.

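Example::

    CLOSEDOMAIN_NOTIFY = ['user@example.com']
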
.. setting:: CLOSEDOMAIN_TIMEOUT

CLOSEDOMAIN_TIMEOUT
-------------------

Default: ``0``

Scope: ``scrapy.contrib.closedomain``

A timeout (in secs) for automatically closing a spider. Spiders that remain
open for more than this time will be automatically closed. If zero, automatic
closing is disabled.

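For example, to close spiders that stay open for more than one hour::

    CLOSEDOMAIN_TIMEOUT = 3600    # one hour, in seconds
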
.. setting:: CLUSTER_LOGDIR

CLUSTER_LOGDIR
--------------

Default: ``''`` (empty string)

The directory to use for cluster logging.

.. setting:: CLUSTER_MASTER_CACHEFILE

CLUSTER_MASTER_CACHEFILE
------------------------

Default: ``''``

The file to use for storing the state of the cluster master before shutting
down, and also for restoring the state on startup. If not set, state won't be
persisted.

.. setting:: CLUSTER_MASTER_ENABLED

CLUSTER_MASTER_ENABLED
----------------------

Default: ``False``

A boolean which specifies whether to enable the cluster master.

.. setting:: CLUSTER_MASTER_NODES

CLUSTER_MASTER_NODES
--------------------

Default: ``{}``

A dict which defines the nodes of the cluster. The keys are the node/worker
names and the values are the worker URLs.

Example::

    CLUSTER_MASTER_NODES = {
        'local': 'localhost:8789',
        'remote': 'someworker.example.com:8789',
    }

.. setting:: CLUSTER_MASTER_POLL_INTERVAL

CLUSTER_MASTER_POLL_INTERVAL
----------------------------

Default: ``60``

The amount of time (in secs) that the master should wait before polling the
workers.

.. setting:: CLUSTER_MASTER_PORT

CLUSTER_MASTER_PORT
-------------------

Default: ``8790``

The port where the cluster master will listen.

.. setting:: CLUSTER_WORKER_ENABLED

CLUSTER_WORKER_ENABLED
----------------------

Default: ``False``

A boolean which specifies whether to enable the cluster worker.

.. setting:: CLUSTER_WORKER_MAXPROC

CLUSTER_WORKER_MAXPROC
----------------------

Default: ``4``

The maximum number of processes that the cluster worker will be allowed to
spawn.

.. setting:: CLUSTER_WORKER_PORT

CLUSTER_WORKER_PORT
-------------------

Default: ``8789``

The port where the cluster worker will listen.

.. setting:: COMMANDS_MODULE

COMMANDS_MODULE
---------------

Default: ``''`` (empty string)

A module to use for looking up custom Scrapy commands. This is used to add
custom commands for your Scrapy project.

Example::

    COMMANDS_MODULE = 'mybot.commands'

.. setting:: COMMANDS_SETTINGS_MODULE

COMMANDS_SETTINGS_MODULE
------------------------

Default: ``''`` (empty string)

A module to use for looking up custom Scrapy command settings.

Example::

    COMMANDS_SETTINGS_MODULE = 'mybot.conf.commands'

.. setting:: CONCURRENT_DOMAINS

CONCURRENT_DOMAINS
------------------

Default: ``8``

Number of domains to scrape concurrently in one process. This doesn't affect
the number of domains scraped concurrently by the Scrapy cluster, which spawns
a new process per domain.

.. setting:: COOKIES_DEBUG

COOKIES_DEBUG
-------------

Default: ``False``

Enable debugging messages of the cookies downloader middleware.

.. setting:: DEFAULT_ITEM_CLASS

DEFAULT_ITEM_CLASS
------------------

Default: ``'scrapy.item.ScrapedItem'``

The default class that will be used for instantiating items in :ref:`the
Scrapy shell <topics-shell>`.

.. setting:: DEFAULT_SPIDER

DEFAULT_SPIDER
--------------

Default: ``None``

The default spider class that will be instantiated for URLs for which no
specific spider is found. This class must have a constructor which receives the
domain name of the given URL as its only parameter.

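A minimal sketch of a class honoring that constructor contract (the
``FallbackSpider`` name is made up, and a real default spider would also
subclass the Scrapy base spider class)::

    class FallbackSpider(object):
        """Catch-all spider, instantiated with a domain name only."""

        def __init__(self, domain_name):
            self.domain_name = domain_name
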
.. setting:: DEPTH_LIMIT

DEPTH_LIMIT
-----------

Default: ``0``

The maximum depth that will be allowed to crawl for any site. If zero, no limit
will be imposed.

.. setting:: DEPTH_STATS

DEPTH_STATS
-----------

Default: ``True``

Whether to collect depth stats.

.. setting:: DOWNLOADER_DEBUG

DOWNLOADER_DEBUG
----------------

Default: ``False``

Whether to enable the Downloader debugging mode.

.. setting:: DOWNLOADER_MIDDLEWARES

DOWNLOADER_MIDDLEWARES
----------------------

Default::

    [
        'scrapy.contrib.downloadermiddleware.robotstxt.RobotsTxtMiddleware',
        'scrapy.contrib.downloadermiddleware.errorpages.ErrorPagesMiddleware',
        'scrapy.contrib.downloadermiddleware.httpauth.HttpAuthMiddleware',
        'scrapy.contrib.downloadermiddleware.useragent.UserAgentMiddleware',
        'scrapy.contrib.downloadermiddleware.retry.RetryMiddleware',
        'scrapy.contrib.downloadermiddleware.common.CommonMiddleware',
        'scrapy.contrib.downloadermiddleware.redirect.RedirectMiddleware',
        'scrapy.contrib.downloadermiddleware.cookies.CookiesMiddleware',
        'scrapy.contrib.downloadermiddleware.httpcompression.HttpCompressionMiddleware',
        'scrapy.contrib.downloadermiddleware.debug.CrawlDebug',
        'scrapy.contrib.downloadermiddleware.stats.DownloaderStats',
        'scrapy.contrib.downloadermiddleware.cache.CacheMiddleware',
    ]

The list of enabled downloader middlewares. Keep in mind that some may need to
be enabled through a particular setting. The top (first) middleware is closer
to the engine, while the bottom (last) middleware is closer to the downloader.

.. setting:: DOWNLOADER_STATS

DOWNLOADER_STATS
----------------

Default: ``True``

Whether to enable downloader stats collection.

.. setting:: DOWNLOAD_DELAY

DOWNLOAD_DELAY
--------------

Default: ``0``

The amount of time (in secs) that the downloader should wait before downloading
consecutive pages from the same spider. This can be used to throttle the
crawling speed to avoid hitting servers too hard. Decimal numbers are
supported. Example::

    DOWNLOAD_DELAY = 0.25    # 250 ms of delay

.. setting:: DOWNLOAD_TIMEOUT

DOWNLOAD_TIMEOUT
----------------

Default: ``180``

The amount of time (in secs) that the downloader will wait before timing out.

.. setting:: DUPEFILTER_FILTERCLASS

DUPEFILTER_FILTERCLASS
----------------------

Default: ``scrapy.contrib.spidermiddleware.SimplePerDomainFilter``

The class used to detect and filter duplicated requests.

The default ``SimplePerDomainFilter`` filters based on the request fingerprint,
grouping requests per domain.

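To illustrate the idea (this is not the actual ``SimplePerDomainFilter``
implementation; the class and method names below are made up), a
fingerprint-based per-domain filter can be sketched as::

    class PerDomainDupeFilter(object):
        """Track seen request fingerprints, grouped per domain."""

        def __init__(self):
            self.seen = {}    # domain -> set of request fingerprints

        def request_seen(self, domain, fingerprint):
            """Return True if this fingerprint was already seen for the domain."""
            fingerprints = self.seen.setdefault(domain, set())
            if fingerprint in fingerprints:
                return True
            fingerprints.add(fingerprint)
            return False
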
.. setting:: ENGINE_DEBUG

ENGINE_DEBUG
------------

Default: ``False``

Whether to enable the Scrapy Engine debugging mode.

.. setting:: ENABLED_SPIDERS_FILE

ENABLED_SPIDERS_FILE
--------------------

Default: ``''`` (empty string)

The path to a file containing a list of spiders (one domain name per line).
Those spiders will be considered enabled by Scrapy, and will be the spiders
crawled automatically when running ``scrapy-ctl.py crawl`` with no arguments.

If this setting is unset, all spiders to crawl must be passed explicitly in the
``crawl`` command.

Example::

    ENABLED_SPIDERS_FILE = '/etc/mybot/enabled_spiders.list'

.. setting:: EXTENSIONS

EXTENSIONS
----------

Default::

    [
        'scrapy.stats.corestats.CoreStats',
        'scrapy.xpath.extension.ResponseLibxml2',
        'scrapy.management.web.WebConsole',
        'scrapy.management.telnet.TelnetConsole',
        'scrapy.contrib.webconsole.scheduler.SchedulerQueue',
        'scrapy.contrib.webconsole.livestats.LiveStats',
        'scrapy.contrib.webconsole.spiderctl.Spiderctl',
        'scrapy.contrib.webconsole.enginestatus.EngineStatus',
        'scrapy.contrib.webconsole.stats.StatsDump',
        'scrapy.contrib.webconsole.spiderstats.SpiderStats',
        'scrapy.contrib.spider.reloader.SpiderReloader',
        'scrapy.contrib.memusage.MemoryUsage',
        'scrapy.contrib.memdebug.MemoryDebugger',
        'scrapy.contrib.closedomain.CloseDomain',
        'scrapy.contrib.debug.StackTraceDump',
        'scrapy.contrib.response.soup.ResponseSoup',
    ]

The list of available extensions. Keep in mind that some of them need to be
enabled through a setting. By default, this setting contains all stable
built-in extensions.

For more information see the :ref:`extensions user guide <topics-extensions>`
and the :ref:`list of available extensions <ref-extensions>`.

.. setting:: GROUPSETTINGS_ENABLED

GROUPSETTINGS_ENABLED
---------------------

Default: ``False``

Whether to enable group settings, from which spiders pull their settings.

.. setting:: GROUPSETTINGS_MODULE

GROUPSETTINGS_MODULE
--------------------

Default: ``''`` (empty string)

The module to pull settings from, if group settings are enabled.

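Example (the module path is illustrative)::

    GROUPSETTINGS_MODULE = 'mybot.groupsettings'
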
.. setting:: ITEM_PIPELINES

ITEM_PIPELINES
--------------

Default: ``[]``

The item pipelines to use (a list of classes).

Example::

    ITEM_PIPELINES = [
        'mybot.pipeline.validate.ValidateMyItem',
        'mybot.pipeline.validate.StoreMyItem',
    ]

.. setting:: LOG_ENABLED

LOG_ENABLED
-----------

Default: ``True``

Whether to enable logging.

.. setting:: LOG_STDOUT

LOG_STDOUT
----------

Default: ``False``

If enabled, logging will be sent to standard output; otherwise standard error
will be used.

.. setting:: LOGFILE

LOGFILE
-------

Default: ``None``

File name to use for logging output. If ``None``, standard output (or standard
error) will be used, depending on the value of the LOG_STDOUT setting.

.. setting:: LOGLEVEL

LOGLEVEL
--------

Default: ``'DEBUG'``

Minimum level to log. Available levels are: SILENT, CRITICAL, ERROR, WARNING,
INFO, DEBUG, TRACE.

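For example, to silence debug messages::

    LOGLEVEL = 'INFO'
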
.. setting:: MAIL_FROM

MAIL_FROM
---------

Default: ``'scrapy@localhost'``

Email to use as sender address for sending emails using the :ref:`Scrapy e-mail
sending facility <ref-email>`.

.. setting:: MAIL_HOST

MAIL_HOST
---------

Default: ``'localhost'``

Host to use for sending emails using the :ref:`Scrapy e-mail sending facility
<ref-email>`.

.. setting:: MEMDEBUG_ENABLED

MEMDEBUG_ENABLED
----------------

Default: ``False``

Whether to enable memory debugging.

.. setting:: MEMDEBUG_NOTIFY

MEMDEBUG_NOTIFY
---------------

Default: ``[]``

When memory debugging is enabled, a memory report will be sent to the specified
addresses if this setting is not empty; otherwise the report will be written to
the log.

Example::

    MEMDEBUG_NOTIFY = ['user@example.com']

.. setting:: MEMUSAGE_ENABLED

MEMUSAGE_ENABLED
----------------

Default: ``False``

Scope: ``scrapy.contrib.memusage``

Whether to enable the memory usage extension, which will shut down the Scrapy
process when it exceeds a memory limit, and also notify by email when that
happens.

See :ref:`ref-extensions-memusage`.

.. setting:: MEMUSAGE_LIMIT_MB

MEMUSAGE_LIMIT_MB
-----------------

Default: ``0``

Scope: ``scrapy.contrib.memusage``

The maximum amount of memory to allow (in megabytes) before shutting down
Scrapy (if MEMUSAGE_ENABLED is True). If zero, no check will be performed.

See :ref:`ref-extensions-memusage`.

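For example, to shut down Scrapy when it exceeds 2 GB of memory::

    MEMUSAGE_ENABLED = True
    MEMUSAGE_LIMIT_MB = 2048
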
.. setting:: MEMUSAGE_NOTIFY_MAIL

MEMUSAGE_NOTIFY_MAIL
--------------------

Default: ``False``

Scope: ``scrapy.contrib.memusage``

A list of emails to notify if the memory limit has been reached.

Example::

    MEMUSAGE_NOTIFY_MAIL = ['user@example.com']

See :ref:`ref-extensions-memusage`.

.. setting:: MEMUSAGE_REPORT

MEMUSAGE_REPORT
---------------

Default: ``False``

Scope: ``scrapy.contrib.memusage``

Whether to send a memory usage report after each domain has been closed.

See :ref:`ref-extensions-memusage`.

.. setting:: MEMUSAGE_WARNING_MB

MEMUSAGE_WARNING_MB
-------------------

Default: ``0``

Scope: ``scrapy.contrib.memusage``

The maximum amount of memory to allow (in megabytes) before sending a warning
email notifying about it. If zero, no warning will be produced.

.. setting:: MYSQL_CONNECTION_SETTINGS

MYSQL_CONNECTION_SETTINGS
-------------------------

Default: ``{}``

Scope: ``scrapy.utils.db.mysql_connect``

Settings to use for MySQL connections performed through
``scrapy.utils.db.mysql_connect``.

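Example (the values are illustrative; the keys follow the standard MySQLdb
``connect`` keyword arguments)::

    MYSQL_CONNECTION_SETTINGS = {
        'host': 'localhost',
        'user': 'scrapy',
        'passwd': 'secret',
        'db': 'mybot',
    }
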
.. setting:: NEWSPIDER_MODULE

NEWSPIDER_MODULE
----------------

Default: ``''``

The module where new spiders will be created by the ``genspider`` command.

Example::

    NEWSPIDER_MODULE = 'mybot.spiders_dev'

.. setting:: PROJECT_NAME

PROJECT_NAME
------------

Default: ``Not Defined``

The name of the current project. It matches the project module name as created
by the ``startproject`` command, and is only defined by the project settings
file.

.. setting:: REQUEST_HEADER_ACCEPT

REQUEST_HEADER_ACCEPT
---------------------

Default: ``'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8'``

Default value to use for the ``Accept`` request header (if not already set
before).

See :ref:`ref-downloader-middleware-common`.

.. setting:: REQUEST_HEADER_ACCEPT_LANGUAGE

REQUEST_HEADER_ACCEPT_LANGUAGE
------------------------------

Default: ``'en'``

Default value to use for the ``Accept-Language`` request header, if not already
set before.

See :ref:`ref-downloader-middleware-common`.

.. setting:: REQUESTS_PER_DOMAIN

REQUESTS_PER_DOMAIN
-------------------

Default: ``8``

Specifies how many concurrent (i.e. simultaneous) requests will be performed
per open spider.

.. setting:: REQUESTS_QUEUE_SIZE

REQUESTS_QUEUE_SIZE
-------------------

Default: ``0``

Scope: ``scrapy.contrib.spidermiddleware.limit``

If non-zero, it will be used as an upper limit for the amount of requests that
can be scheduled per domain.

.. setting:: ROBOTSTXT_OBEY

ROBOTSTXT_OBEY
--------------

Default: ``False``

Scope: ``scrapy.contrib.downloadermiddleware.robotstxt``

If enabled, Scrapy will respect robots.txt policies. For more information see
:topic:`robotstxt`.

.. setting:: SCHEDULER

SCHEDULER
---------

Default: ``'scrapy.core.scheduler.Scheduler'``

The scheduler to use for crawling.

.. setting:: SCHEDULER_ORDER

SCHEDULER_ORDER
---------------

Default: ``'BFO'``

Scope: ``scrapy.core.scheduler``

The order to use for the crawling scheduler. Available orders are:

* ``'BFO'``: `Breadth-first order`_ - typically consumes more memory but
  reaches most relevant pages earlier.

* ``'DFO'``: `Depth-first order`_ - typically consumes less memory than
  ``'BFO'`` but takes longer to reach most relevant pages.

.. _Breadth-first order: http://en.wikipedia.org/wiki/Breadth-first_search
.. _Depth-first order: http://en.wikipedia.org/wiki/Depth-first_search

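For example, to crawl in depth-first order::

    SCHEDULER_ORDER = 'DFO'
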
.. setting:: SCHEDULER_MIDDLEWARES

SCHEDULER_MIDDLEWARES
---------------------

Default::

    [
        'scrapy.contrib.schedulermiddleware.duplicatesfilter.DuplicatesFilterMiddleware',
    ]

The list of enabled scheduler middlewares. Keep in mind that some may need to
be enabled through a particular setting. The top (first) middleware is closer
to the engine, while the bottom (last) middleware is closer to the scheduler.

.. setting:: SPIDERPROFILER_ENABLED

SPIDERPROFILER_ENABLED
----------------------

Default: ``False``

Enable the spider profiler. Warning: this could have a big impact on
performance.

.. setting:: SPIDER_MIDDLEWARES

SPIDER_MIDDLEWARES
------------------

Default::

    [
        'scrapy.contrib.itemsampler.ItemSamplerMiddleware',
        'scrapy.contrib.spidermiddleware.limit.RequestLimitMiddleware',
        'scrapy.contrib.spidermiddleware.restrict.RestrictMiddleware',
        'scrapy.contrib.spidermiddleware.offsite.OffsiteMiddleware',
        'scrapy.contrib.spidermiddleware.referer.RefererMiddleware',
        'scrapy.contrib.spidermiddleware.urllength.UrlLengthMiddleware',
        'scrapy.contrib.spidermiddleware.depth.DepthMiddleware',
    ]

The list of enabled spider middlewares. Keep in mind that some may need to be
enabled through a particular setting. The top (first) middleware is closer to
the engine, while the bottom (last) middleware is closer to the spider.

.. setting:: SPIDER_MODULES

SPIDER_MODULES
--------------

Default: ``[]``

A list of modules where Scrapy will look for spiders.

Example::

    SPIDER_MODULES = ['mybot.spiders_prod', 'mybot.spiders_dev']

.. setting:: STATS_CLEANUP

STATS_CLEANUP
-------------

Default: ``False``

Whether to clean up (to save memory) the stats for a given domain when the
domain is closed.

.. setting:: STATS_DEBUG

STATS_DEBUG
-----------

Default: ``False``

Enable debugging mode for Scrapy stats. This logs the stats when a domain is
closed.

.. setting:: STATS_ENABLED

STATS_ENABLED
-------------

Default: ``True``

Enable stats collection.

.. setting:: TELNETCONSOLE_ENABLED

TELNETCONSOLE_ENABLED
---------------------

Default: ``True``

Scope: ``scrapy.management.telnet``

A boolean which specifies if the telnet management console will be enabled
(provided its extension is also enabled).

.. setting:: TELNETCONSOLE_PORT

TELNETCONSOLE_PORT
------------------

Default: ``None``

Scope: ``scrapy.management.telnet``

The port to use for the telnet console. If unset, a dynamically assigned port
is used.

.. setting:: TEMPLATES_DIR

TEMPLATES_DIR
-------------

Default: ``templates`` dir inside scrapy module

The directory where to look for templates when creating new projects with the
``scrapy-admin.py newproject`` command.

.. setting:: URLLENGTH_LIMIT

URLLENGTH_LIMIT
---------------

Default: ``2083``

Scope: ``contrib.spidermiddleware.urllength``

The maximum URL length to allow for crawled URLs. For more information about
the default value for this setting see:
http://www.boutell.com/newfaq/misc/urllength.html

.. setting:: USER_AGENT

USER_AGENT
----------

Default: ``"%s/%s" % (BOT_NAME, BOT_VERSION)``

The default User-Agent to use when crawling, unless overridden.

.. setting:: WEBCONSOLE_ENABLED

WEBCONSOLE_ENABLED
------------------

Default: ``True``

A boolean which specifies if the web management console will be enabled
(provided its extension is also enabled).

.. setting:: WEBCONSOLE_LOGFILE

WEBCONSOLE_LOGFILE
------------------

Default: ``None``

A file to use for logging HTTP requests made to the web console. If unset, the
log is sent to the standard Scrapy log.

.. setting:: WEBCONSOLE_PORT

WEBCONSOLE_PORT
---------------

Default: ``None``

The port to use for the web console. If unset, a dynamically assigned port is
used.