.. _settings:

Available Settings
==================

Here's a list of all available Scrapy settings, in alphabetical order, along
with their default values and the scope where they apply.

The scope, where available, shows where the setting is being used, if it's tied
to any particular component. In that case the module of that component will be
shown, typically an extension, middleware or pipeline. It also means that the
component must be enabled in order for the setting to have any effect.

.. setting:: ADAPTORS_DEBUG

ADAPTORS_DEBUG
--------------

Default: ``False``

Enable debug mode for adaptors.

See :ref:`topics-adaptors`.

.. setting:: BOT_NAME

BOT_NAME
--------

Default: ``scrapybot``

The name of the bot implemented by this Scrapy project. This will be used to
construct the User-Agent by default, and also for logging.

.. setting:: BOT_VERSION

BOT_VERSION
-----------

Default: ``1.0``

The version of the bot implemented by this Scrapy project. This will be used
to construct the User-Agent by default.

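For instance, a project settings file could override both (``mybot`` is just a
placeholder name), which would make the default User-Agent ``mybot/2.0``::

    BOT_NAME = 'mybot'
    BOT_VERSION = '2.0'
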
.. setting:: CACHE2_DIR

CACHE2_DIR
----------

Default: ``''`` (empty string)

The directory to use for storing the low-level HTTP cache. If empty, the HTTP
cache will be disabled.

.. setting:: CACHE2_EXPIRATION_SECS

CACHE2_EXPIRATION_SECS
----------------------

Default: ``0``

Number of seconds to use for cache expiration. Requests that were cached
before this time will be re-downloaded. If zero, cached requests will always
expire. Negative numbers mean requests will never expire.

.. setting:: CACHE2_IGNORE_MISSING

CACHE2_IGNORE_MISSING
---------------------

Default: ``False``

If enabled, requests not found in the cache will be ignored instead of
downloaded.

.. setting:: CACHE2_SECTORIZE

CACHE2_SECTORIZE
----------------

Default: ``True``

Whether to split the HTTP cache storage into several dirs for performance
improvements.

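For example, a crawl that wants a persistent HTTP cache whose entries expire
after one hour could use something like this (the directory is just a
placeholder)::

    CACHE2_DIR = '/var/cache/mybot'    # a non-empty value enables the cache
    CACHE2_EXPIRATION_SECS = 3600      # re-download entries older than 1 hour
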
.. setting:: CLOSEDOMAIN_NOTIFY

CLOSEDOMAIN_NOTIFY
------------------

Default: ``[]``

Scope: ``scrapy.contrib.closedomain``

A list of emails to notify if the domain has been automatically closed by
timeout.

.. setting:: CLOSEDOMAIN_TIMEOUT

CLOSEDOMAIN_TIMEOUT
-------------------

Default: ``0``

Scope: ``scrapy.contrib.closedomain``

A timeout (in secs) for automatically closing a spider. Spiders that remain
open for more than this time will be automatically closed. If zero, automatic
closing is disabled.

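For example, to automatically close spiders that remain open for more than one
hour and be notified about it (the address is a placeholder)::

    CLOSEDOMAIN_TIMEOUT = 3600
    CLOSEDOMAIN_NOTIFY = ['staff@example.com']
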
.. setting:: CLUSTER_LOGDIR

CLUSTER_LOGDIR
--------------

Default: ``''`` (empty string)

The directory to use for cluster logging.

.. setting:: CLUSTER_MASTER_CACHEFILE

CLUSTER_MASTER_CACHEFILE
------------------------

Default: ``''``

The file to use for storing the state of the cluster master before shutting
down, and also for restoring the state on startup. If not set, state won't be
persisted.

.. setting:: CLUSTER_MASTER_ENABLED

CLUSTER_MASTER_ENABLED
----------------------

Default: ``False``

A boolean which specifies whether to enable the cluster master.

.. setting:: CLUSTER_MASTER_NODES

CLUSTER_MASTER_NODES
--------------------

Default: ``{}``

A dict which defines the nodes of the cluster. The keys are the node/worker
names and the values are the worker URLs.

Example::

    CLUSTER_MASTER_NODES = {
        'local': 'localhost:8789',
        'remote': 'someworker.example.com:8789',
    }

.. setting:: CLUSTER_MASTER_POLL_INTERVAL

CLUSTER_MASTER_POLL_INTERVAL
----------------------------

Default: ``60``

The amount of time (in secs) that the master should wait before polling the
workers.

.. setting:: CLUSTER_MASTER_PORT

CLUSTER_MASTER_PORT
-------------------

Default: ``8790``

The port where the cluster master will listen.

.. setting:: CLUSTER_WORKER_ENABLED

CLUSTER_WORKER_ENABLED
----------------------

Default: ``False``

A boolean which specifies whether to enable the cluster worker.

.. setting:: CLUSTER_WORKER_MAXPROC

CLUSTER_WORKER_MAXPROC
----------------------

Default: ``4``

The maximum number of processes that the cluster worker will be allowed to
spawn.

.. setting:: CLUSTER_WORKER_PORT

CLUSTER_WORKER_PORT
-------------------

Default: ``8789``

The port where the cluster worker will listen.

.. setting:: COMMANDS_MODULE

COMMANDS_MODULE
---------------

Default: ``''`` (empty string)

A module to use for looking up custom Scrapy commands. This is used to add
custom commands for your Scrapy project.

Example::

    COMMANDS_MODULE = 'mybot.commands'

.. setting:: COMMANDS_SETTINGS_MODULE

COMMANDS_SETTINGS_MODULE
------------------------

Default: ``''`` (empty string)

A module to use for looking up custom Scrapy command settings.

Example::

    COMMANDS_SETTINGS_MODULE = 'mybot.conf.commands'

.. setting:: CONCURRENT_DOMAINS

CONCURRENT_DOMAINS
------------------

Default: ``8``

Number of domains to scrape concurrently in one process. This doesn't affect
the number of domains scraped concurrently by the Scrapy cluster, which spawns
a new process per domain.

.. setting:: COOKIES_DEBUG

COOKIES_DEBUG
-------------

Default: ``False``

Enable debugging messages of the Cookies Downloader Middleware.

.. setting:: DEFAULT_ITEM_CLASS

DEFAULT_ITEM_CLASS
------------------

Default: ``'scrapy.item.ScrapedItem'``

The default class that will be used for instantiating items in :ref:`the
Scrapy shell <topics-shell>`.

.. setting:: DEFAULT_SPIDER

DEFAULT_SPIDER
--------------

Default: ``None``

The default spider class that will be instantiated for URLs for which no
specific spider is found. This class must have a constructor which receives
the domain name of the given URL as its only parameter.

.. setting:: DEPTH_LIMIT

DEPTH_LIMIT
-----------

Default: ``0``

The maximum depth that will be allowed to crawl for any site. If zero, no
limit will be imposed.

.. setting:: DEPTH_STATS

DEPTH_STATS
-----------

Default: ``True``

Whether to collect depth stats.

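For example, to restrict crawls to three levels of depth while still
collecting depth stats (the limit value is merely illustrative)::

    DEPTH_LIMIT = 3
    DEPTH_STATS = True
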
.. setting:: DOWNLOADER_DEBUG

DOWNLOADER_DEBUG
----------------

Default: ``False``

Whether to enable the Downloader debugging mode.

.. setting:: DOWNLOADER_MIDDLEWARES

DOWNLOADER_MIDDLEWARES
----------------------

Default::

    [
        'scrapy.contrib.downloadermiddleware.robotstxt.RobotsTxtMiddleware',
        'scrapy.contrib.downloadermiddleware.errorpages.ErrorPagesMiddleware',
        'scrapy.contrib.downloadermiddleware.httpauth.HttpAuthMiddleware',
        'scrapy.contrib.downloadermiddleware.useragent.UserAgentMiddleware',
        'scrapy.contrib.downloadermiddleware.retry.RetryMiddleware',
        'scrapy.contrib.downloadermiddleware.common.CommonMiddleware',
        'scrapy.contrib.downloadermiddleware.redirect.RedirectMiddleware',
        'scrapy.contrib.downloadermiddleware.cookies.CookiesMiddleware',
        'scrapy.contrib.downloadermiddleware.httpcompression.HttpCompressionMiddleware',
        'scrapy.contrib.downloadermiddleware.debug.CrawlDebug',
        'scrapy.contrib.downloadermiddleware.stats.DownloaderStats',
        'scrapy.contrib.downloadermiddleware.cache.CacheMiddleware',
    ]

The list of enabled downloader middlewares. Keep in mind that some may need to
be enabled through a particular setting. The top (first) middleware is closer
to the engine, while the bottom (last) middleware is closer to the downloader.

.. setting:: DOWNLOADER_STATS

DOWNLOADER_STATS
----------------

Default: ``True``

Whether to enable downloader stats collection.

.. setting:: DOWNLOAD_DELAY

DOWNLOAD_DELAY
--------------

Default: ``0``

The amount of time (in secs) that the downloader should wait before
downloading consecutive pages from the same spider. This can be used to
throttle the crawling speed to avoid hitting servers too hard. Decimal numbers
are supported. Example::

    DOWNLOAD_DELAY = 0.25    # 250 ms of delay

.. setting:: DOWNLOAD_TIMEOUT

DOWNLOAD_TIMEOUT
----------------

Default: ``180``

The amount of time (in secs) that the downloader will wait before timing out.

.. setting:: DUPEFILTER_FILTERCLASS

DUPEFILTER_FILTERCLASS
----------------------

Default: ``scrapy.contrib.spidermiddleware.SimplePerDomainFilter``

The class used to detect and filter duplicate requests.

The default (``SimplePerDomainFilter``) filters based on request fingerprint,
grouping requests per domain.

.. setting:: ENGINE_DEBUG

ENGINE_DEBUG
------------

Default: ``False``

Whether to enable the Scrapy Engine debugging mode.

.. setting:: ENABLED_SPIDERS_FILE

ENABLED_SPIDERS_FILE
--------------------

Default: ``''`` (empty string)

The path to a file containing a list of spiders (one domain name per line).
Those spiders will be considered enabled by Scrapy, and will be the spiders
crawled automatically when running ``scrapy-ctl.py crawl`` with no arguments.

If this setting is unset, all spiders to crawl must be passed explicitly to
the ``crawl`` command.

Example::

    ENABLED_SPIDERS_FILE = '/etc/mybot/enabled_spiders.list'

.. setting:: EXTENSIONS

EXTENSIONS
----------

Default::

    [
        'scrapy.stats.corestats.CoreStats',
        'scrapy.xpath.extension.ResponseLibxml2',
        'scrapy.management.web.WebConsole',
        'scrapy.management.telnet.TelnetConsole',
        'scrapy.contrib.webconsole.scheduler.SchedulerQueue',
        'scrapy.contrib.webconsole.livestats.LiveStats',
        'scrapy.contrib.webconsole.spiderctl.Spiderctl',
        'scrapy.contrib.webconsole.enginestatus.EngineStatus',
        'scrapy.contrib.webconsole.stats.StatsDump',
        'scrapy.contrib.webconsole.spiderstats.SpiderStats',
        'scrapy.contrib.spider.reloader.SpiderReloader',
        'scrapy.contrib.memusage.MemoryUsage',
        'scrapy.contrib.memdebug.MemoryDebugger',
        'scrapy.contrib.closedomain.CloseDomain',
        'scrapy.contrib.debug.StackTraceDump',
        'scrapy.contrib.response.soup.ResponseSoup',
    ]

The list of available extensions. Keep in mind that some of them need to be
enabled through a setting. By default, this setting contains all stable
built-in extensions.

For more information see the :ref:`extensions user guide <topics-extensions>`
and the :ref:`list of available extensions <ref-extensions>`.

.. setting:: GROUPSETTINGS_ENABLED

GROUPSETTINGS_ENABLED
---------------------

Default: ``False``

Whether to enable group settings, from which spiders pull their settings.

.. setting:: GROUPSETTINGS_MODULE

GROUPSETTINGS_MODULE
--------------------

Default: ``''`` (empty string)

The module to pull settings from, if group settings are enabled.

.. setting:: ITEM_PIPELINES

ITEM_PIPELINES
--------------

Default: ``[]``

The item pipelines to use (a list of classes).

Example::

    ITEM_PIPELINES = [
        'mybot.pipeline.validate.ValidateMyItem',
        'mybot.pipeline.validate.StoreMyItem',
    ]

.. setting:: LOG_ENABLED

LOG_ENABLED
-----------

Default: ``True``

Enable logging.

.. setting:: LOG_STDOUT

LOG_STDOUT
----------

Default: ``False``

If enabled, logging will be sent to standard output; otherwise, standard error
will be used.

.. setting:: LOGFILE

LOGFILE
-------

Default: ``None``

File name to use for logging output. If ``None``, standard output (or error)
will be used, depending on the value of the LOG_STDOUT setting.

.. setting:: LOGLEVEL

LOGLEVEL
--------

Default: ``'DEBUG'``

Minimum level to log. Available levels are: SILENT, CRITICAL, ERROR, WARNING,
INFO, DEBUG, TRACE.

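For example, to keep logging enabled but send only warnings and worse to a
file (the path is a placeholder)::

    LOG_ENABLED = True
    LOGFILE = '/var/log/mybot/scrapy.log'
    LOGLEVEL = 'WARNING'
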
.. setting:: MAIL_FROM

MAIL_FROM
---------

Default: ``'scrapy@localhost'``

Email to use as sender address for sending emails using the :ref:`Scrapy
e-mail sending facility <ref-email>`.

.. setting:: MAIL_HOST

MAIL_HOST
---------

Default: ``'localhost'``

Host to use for sending emails using the :ref:`Scrapy e-mail sending facility
<ref-email>`.

.. setting:: MEMDEBUG_ENABLED

MEMDEBUG_ENABLED
----------------

Default: ``False``

Whether to enable memory debugging.

.. setting:: MEMDEBUG_NOTIFY

MEMDEBUG_NOTIFY
---------------

Default: ``[]``

When memory debugging is enabled, a memory report will be sent to the
specified addresses if this setting is not empty; otherwise the report will be
written to the log.

Example::

    MEMDEBUG_NOTIFY = ['user@example.com']

.. setting:: MEMUSAGE_ENABLED

MEMUSAGE_ENABLED
----------------

Default: ``False``

Scope: ``scrapy.contrib.memusage``

Whether to enable the memory usage extension, which shuts down the Scrapy
process when it exceeds a memory limit, and also notifies by email when that
happens.

See :ref:`ref-extensions-memusage`.

.. setting:: MEMUSAGE_LIMIT_MB

MEMUSAGE_LIMIT_MB
-----------------

Default: ``0``

Scope: ``scrapy.contrib.memusage``

The maximum amount of memory to allow (in megabytes) before shutting down
Scrapy (if MEMUSAGE_ENABLED is True). If zero, no check will be performed.

See :ref:`ref-extensions-memusage`.

.. setting:: MEMUSAGE_NOTIFY_MAIL

MEMUSAGE_NOTIFY_MAIL
--------------------

Default: ``False``

Scope: ``scrapy.contrib.memusage``

A list of emails to notify if the memory limit has been reached.

Example::

    MEMUSAGE_NOTIFY_MAIL = ['user@example.com']

See :ref:`ref-extensions-memusage`.

.. setting:: MEMUSAGE_REPORT

MEMUSAGE_REPORT
---------------

Default: ``False``

Scope: ``scrapy.contrib.memusage``

Whether to send a memory usage report after each domain has been closed.

See :ref:`ref-extensions-memusage`.

.. setting:: MEMUSAGE_WARNING_MB

MEMUSAGE_WARNING_MB
-------------------

Default: ``0``

Scope: ``scrapy.contrib.memusage``

The maximum amount of memory to allow (in megabytes) before sending a warning
email notifying about it. If zero, no warning will be produced.

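Putting the memory usage settings together, a long-running crawl could use
something like the following (the limits and address are merely
illustrative)::

    MEMUSAGE_ENABLED = True
    MEMUSAGE_LIMIT_MB = 2048       # shut down if memory exceeds 2 GB
    MEMUSAGE_WARNING_MB = 1536     # warn by email above 1.5 GB
    MEMUSAGE_NOTIFY_MAIL = ['staff@example.com']
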
.. setting:: MYSQL_CONNECTION_SETTINGS

MYSQL_CONNECTION_SETTINGS
-------------------------

Default: ``{}``

Scope: ``scrapy.utils.db.mysql_connect``

Settings to use for MySQL connections performed through
``scrapy.utils.db.mysql_connect``.

.. setting:: NEWSPIDER_MODULE

NEWSPIDER_MODULE
----------------

Default: ``''``

The module where to create new spiders using the ``genspider`` command.

Example::

    NEWSPIDER_MODULE = 'mybot.spiders_dev'

.. setting:: PROJECT_NAME

PROJECT_NAME
------------

Default: ``Not Defined``

The name of the current project. It matches the project module name as created
by the ``startproject`` command, and is only defined by the project settings
file.

.. setting:: REQUEST_HEADER_ACCEPT

REQUEST_HEADER_ACCEPT
---------------------

Default: ``'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8'``

Default value to use for the ``Accept`` request header (if not already set
before).

See :ref:`ref-downloader-middleware-common`.

.. setting:: REQUEST_HEADER_ACCEPT_LANGUAGE

REQUEST_HEADER_ACCEPT_LANGUAGE
------------------------------

Default: ``'en'``

Default value to use for the ``Accept-Language`` request header (if not
already set before).

See :ref:`ref-downloader-middleware-common`.

.. setting:: REQUESTS_PER_DOMAIN

REQUESTS_PER_DOMAIN
-------------------

Default: ``8``

Specifies how many concurrent (i.e. simultaneous) requests will be performed
per open spider.

.. setting:: REQUESTS_QUEUE_SIZE

REQUESTS_QUEUE_SIZE
-------------------

Default: ``0``

Scope: ``scrapy.contrib.spidermiddleware.limit``

If non-zero, it will be used as an upper limit for the amount of requests that
can be scheduled per domain.

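For example, to be gentler on servers while keeping the per-domain request
queue bounded (the values are merely illustrative)::

    REQUESTS_PER_DOMAIN = 2
    REQUESTS_QUEUE_SIZE = 1000
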
.. setting:: ROBOTSTXT_OBEY

ROBOTSTXT_OBEY
--------------

Default: ``False``

Scope: ``scrapy.contrib.downloadermiddleware.robotstxt``

If enabled, Scrapy will respect robots.txt policies. For more information see
:ref:`topics-robotstxt`.

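For example, to make all spiders honor robots.txt (the middleware that
enforces it is enabled by default in DOWNLOADER_MIDDLEWARES)::

    ROBOTSTXT_OBEY = True
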
.. setting:: SCHEDULER

SCHEDULER
---------

Default: ``'scrapy.core.scheduler.Scheduler'``

The scheduler to use for crawling.

.. setting:: SCHEDULER_ORDER

SCHEDULER_ORDER
---------------

Default: ``'BFO'``

Scope: ``scrapy.core.scheduler``

The order to use for the crawling scheduler. Available orders are:

* ``'BFO'``: `Breadth-first order`_ - typically consumes more memory but
  reaches the most relevant pages earlier.

* ``'DFO'``: `Depth-first order`_ - typically consumes less memory than BFO
  but takes longer to reach the most relevant pages.

.. _Breadth-first order: http://en.wikipedia.org/wiki/Breadth-first_search
.. _Depth-first order: http://en.wikipedia.org/wiki/Depth-first_search

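For example, to crawl in depth-first order::

    SCHEDULER_ORDER = 'DFO'
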
.. setting:: SCHEDULER_MIDDLEWARES

SCHEDULER_MIDDLEWARES
---------------------

Default::

    [
        'scrapy.contrib.schedulermiddleware.duplicatesfilter.DuplicatesFilterMiddleware',
    ]

The list of enabled scheduler middlewares. Keep in mind that some may need to
be enabled through a particular setting. The top (first) middleware is closer
to the engine, while the bottom (last) middleware is closer to the scheduler.

.. setting:: SPIDERPROFILER_ENABLED

SPIDERPROFILER_ENABLED
----------------------

Default: ``False``

Enable the spider profiler. Warning: this could have a big impact on
performance.

.. setting:: SPIDER_MIDDLEWARES

SPIDER_MIDDLEWARES
------------------

Default::

    [
        'scrapy.contrib.itemsampler.ItemSamplerMiddleware',
        'scrapy.contrib.spidermiddleware.limit.RequestLimitMiddleware',
        'scrapy.contrib.spidermiddleware.restrict.RestrictMiddleware',
        'scrapy.contrib.spidermiddleware.offsite.OffsiteMiddleware',
        'scrapy.contrib.spidermiddleware.referer.RefererMiddleware',
        'scrapy.contrib.spidermiddleware.urllength.UrlLengthMiddleware',
        'scrapy.contrib.spidermiddleware.depth.DepthMiddleware',
    ]

The list of enabled spider middlewares. Keep in mind that some may need to be
enabled through a particular setting. The top (first) middleware is closer to
the engine, while the bottom (last) middleware is closer to the spider.

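Note that overriding this setting in a project settings file replaces the
default list entirely, so any built-in middlewares you still want must be
listed again. For instance, a stripped-down configuration that keeps only two
of the built-in middlewares and appends a project-specific one
(``mybot.middleware.MyMiddleware`` is a hypothetical path) could look like::

    SPIDER_MIDDLEWARES = [
        'scrapy.contrib.spidermiddleware.offsite.OffsiteMiddleware',
        'scrapy.contrib.spidermiddleware.depth.DepthMiddleware',
        'mybot.middleware.MyMiddleware',  # hypothetical project middleware
    ]
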
.. setting:: SPIDER_MODULES

SPIDER_MODULES
--------------

Default: ``[]``

A list of modules where Scrapy will look for spiders.

Example::

    SPIDER_MODULES = ['mybot.spiders_prod', 'mybot.spiders_dev']

.. setting:: STATS_CLEANUP

STATS_CLEANUP
-------------

Default: ``False``

Whether to clean up the stats (to save memory) for a given domain when the
domain is closed.

.. setting:: STATS_DEBUG

STATS_DEBUG
-----------

Default: ``False``

Enable debugging mode for Scrapy stats. This logs the stats when a domain is
closed.

.. setting:: STATS_ENABLED

STATS_ENABLED
-------------

Default: ``True``

Enable stats collection.

.. setting:: TELNETCONSOLE_ENABLED

TELNETCONSOLE_ENABLED
---------------------

Default: ``True``

Scope: ``scrapy.management.telnet``

A boolean which specifies if the telnet management console will be enabled
(provided its extension is also enabled).

.. setting:: TELNETCONSOLE_PORT

TELNETCONSOLE_PORT
------------------

Default: ``None``

Scope: ``scrapy.management.telnet``

The port to use for the telnet console. If unset, a dynamically assigned port
is used.

.. setting:: TEMPLATES_DIR

TEMPLATES_DIR
-------------

Default: ``templates`` dir inside scrapy module

The directory where to look for templates when creating new projects with
``scrapy-admin.py newproject``.

.. setting:: URLLENGTH_LIMIT

URLLENGTH_LIMIT
---------------

Default: ``2083``

Scope: ``contrib.spidermiddleware.urllength``

The maximum URL length to allow for crawled URLs. For more information about
the default value for this setting see:
http://www.boutell.com/newfaq/misc/urllength.html

.. setting:: USER_AGENT

USER_AGENT
----------

Default: ``"%s/%s" % (BOT_NAME, BOT_VERSION)``

The default User-Agent to use when crawling, unless overridden.

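For example, to identify the bot with a contact URL (the values are
placeholders)::

    USER_AGENT = 'mybot/1.0 (+http://www.example.com)'
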
.. setting:: WEBCONSOLE_ENABLED

WEBCONSOLE_ENABLED
------------------

Default: ``True``

A boolean which specifies if the web management console will be enabled
(provided its extension is also enabled).

.. setting:: WEBCONSOLE_LOGFILE

WEBCONSOLE_LOGFILE
------------------

Default: ``None``

A file to use for logging HTTP requests made to the web console. If unset,
the log is sent to the standard Scrapy log.

.. setting:: WEBCONSOLE_PORT

WEBCONSOLE_PORT
---------------

Default: ``None``

The port to use for the web console. If unset, a dynamically assigned port is
used.