.TH SCRAPY 1 "October 17, 2009"
.SH NAME
scrapy \- the Scrapy command-line tool
.SH SYNOPSIS
.B scrapy
[\fI command\fR ] [\fI OPTIONS\fR ] ...
.SH DESCRIPTION
.PP
Scrapy is controlled through the \fBscrapy\fR command-line tool. The tool provides several commands, each for a different purpose, and each command accepts its own particular set of arguments and options.
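.PP
For example, to print the available commands and their options:
.PP
.nf
    scrapy --help
.fi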
.SH OPTIONS
.SS fetch\fR [\fIOPTION\fR] \fI URL\fR
Fetch a URL using the Scrapy downloader
.TP
.I --headers
Print response HTTP headers instead of body
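.PP
For example, to fetch a page and print only its response HTTP headers (the URL is a placeholder):
.PP
.nf
    scrapy fetch --headers http://www.example.com/
.fi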
.SS runspider\fR [\fIOPTION\fR] \fI spiderfile\fR
Run a spider
.TP
.I --output=FILE
Store scraped items to FILE in XML format
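.PP
For example, to run a spider contained in a single file and store the scraped items as XML (the file names are placeholders):
.PP
.nf
    scrapy runspider --output=items.xml myspider.py
.fi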
.SS settings [\fIOPTION\fR]
Query Scrapy settings
.TP
.I --get=SETTING
Print raw setting value
.TP
.I --getbool=SETTING
Print setting value, interpreted as a boolean
.TP
.I --getint=SETTING
Print setting value, interpreted as an integer
.TP
.I --getfloat=SETTING
Print setting value, interpreted as a float
.TP
.I --getlist=SETTING
Print setting value, interpreted as a list
.TP
.I --init
Print initial setting value (before loading extensions and spiders)
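.PP
For example, to print a raw setting value and another value interpreted as a float (the setting names are only illustrative; any Scrapy setting can be queried):
.PP
.nf
    scrapy settings --get=BOT_NAME
    scrapy settings --getfloat=DOWNLOAD_DELAY
.fi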
.SS shell\fR \fI URL\fR | \fI file\fR
Launch the interactive scraping console
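.PP
For example, to open the interactive console on a downloaded page (the URL is a placeholder):
.PP
.nf
    scrapy shell http://www.example.com/
.fi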
.SS startproject\fR \fI projectname\fR
Create a new project with an initial project template
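.PP
For example, to create a new project named \fImyproject\fR (the name is a placeholder):
.PP
.nf
    scrapy startproject myproject
.fi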
.SS --help, -h
Print command help and options
.SS --logfile=FILE
Log file. If omitted, stderr will be used
.SS --loglevel=LEVEL, -L LEVEL
Log level (default: None)
.SS --nolog
Disable logging completely
.SS --spider=SPIDER
Always use this spider when arguments are URLs
.SS --profile=FILE
Write Python cProfile stats to FILE
.SS --lsprof=FILE
Write lsprof profiling stats to FILE
.SS --pidfile=FILE
Write process ID to FILE
.SS --set=NAME=VALUE, -s NAME=VALUE
Set/override setting (may be repeated)
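.PP
These global options can be combined with any command. For example, to override a setting and write the log to a file while fetching a page (the file name, setting value and URL are placeholders):
.PP
.nf
    scrapy fetch --logfile=scrapy.log --set=DOWNLOAD_DELAY=2 http://www.example.com/
.fi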
.SH AUTHOR
Scrapy was written by the Scrapy Developers.
.PP
This manual page was written by Ignace Mouzannar <mouzannar@gmail.com>,
for the Debian project (but may be used by others).