.TH SCRAPY 1 "October 17, 2009"
.SH NAME
scrapy \- Python Scrapy control script
.SH SYNOPSIS
.B scrapy
[\fIcommand\fR] [\fIOPTIONS\fR] ...
.SH DESCRIPTION
.PP
Scrapy is controlled through the \fBscrapy\fR control script. The script provides several commands for different purposes. Each command supports its own syntax, that is, its own set of arguments and options.
.SH OPTIONS
.SS fetch\fR [\fIOPTION\fR] \fIURL\fR
Fetch a URL using the Scrapy downloader
.TP
.I --headers
Print response HTTP headers instead of body
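.PP
For example, to fetch a page and print only its response headers (the URL below is just a placeholder):
.RS
.nf
scrapy fetch --headers http://www.example.com/
.fi
.RE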
.SS runspider\fR [\fIOPTION\fR] \fIspiderfile\fR
Run a spider
.TP
.I --output=FILE
Store scraped items to FILE in XML format
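.PP
For example, to run a self-contained spider and store the scraped items in XML (the file names below are just placeholders):
.RS
.nf
scrapy runspider --output=items.xml myspider.py
.fi
.RE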
.SS settings [\fIOPTION\fR]
Query Scrapy settings
.TP
.I --get=SETTING
Print raw setting value
.TP
.I --getbool=SETTING
Print setting value, interpreted as a boolean
.TP
.I --getint=SETTING
Print setting value, interpreted as an integer
.TP
.I --getfloat=SETTING
Print setting value, interpreted as a float
.TP
.I --getlist=SETTING
Print setting value, interpreted as a list
.TP
.I --init
Print initial setting value (before loading extensions and spiders)
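.PP
For example, to print the value of individual settings (the setting names below are only illustrative):
.RS
.nf
scrapy settings --get=BOT_NAME
scrapy settings --getbool=LOG_ENABLED
.fi
.RE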
.SS shell\fR \fIURL\fR | \fIfile\fR
Launch the interactive scraping console
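.PP
For example, to open the interactive console on a page (the URL below is just a placeholder):
.RS
.nf
scrapy shell http://www.example.com/
.fi
.RE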
.SS startproject\fR \fIprojectname\fR
Create new project with an initial project template
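.PP
For example, to create a new project named \fImyproject\fR in the current directory:
.RS
.nf
scrapy startproject myproject
.fi
.RE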
.SS --help, -h
Print command help and options
.SS --version
Print Scrapy version and exit
.SS --logfile=FILE
Log file. If omitted, stderr will be used
.SS --loglevel=LEVEL, -L LEVEL
Log level (default: None)
.SS --nolog
Disable logging completely
.SS --spider=SPIDER
Always use this spider when arguments are URLs
.SS --profile=FILE
Write python cProfile stats to FILE
.SS --lsprof=FILE
Write lsprof profiling stats to FILE
.SS --pidfile=FILE
Write process ID to FILE
.SS --set=SET
Set/override setting (may be repeated)
.SS --settings=MODULE
Python path to the Scrapy project settings
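.PP
The options above can be combined with any of the commands. For example (file names and URLs below are just placeholders):
.RS
.nf
scrapy fetch --headers --logfile=fetch.log http://www.example.com/
scrapy runspider --loglevel=INFO myspider.py
.fi
.RE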
.SH AUTHOR
Scrapy was written by the Scrapy Developers
<scrapy-developers@googlegroups.com>.
.PP
This manual page was written by Ignace Mouzannar <mouzannar@gmail.com>,
for the Debian project (but may be used by others).