Mirror of https://github.com/scrapy/scrapy.git, synced 2025-02-21 07:52:49 +00:00
Don’t use follow_all where a single item is expected (#4)
parent dd12f5fdcd
commit 5980b0f284
@@ -616,20 +616,19 @@ instance; you still have to yield this Request.
 You can also pass a selector to ``response.follow`` instead of a string;
 this selector should extract necessary attributes::
 
-    for href in response.css('li.next a::attr(href)'):
-        yield response.follow(href, callback=self.parse)
+    href = response.css('li.next a::attr(href)')[0]
+    yield response.follow(href, callback=self.parse)
 
 For ``<a>`` elements there is a shortcut: ``response.follow`` uses their href
 attribute automatically. So the code can be shortened further::
 
-    for a in response.css('li.next a'):
-        yield response.follow(a, callback=self.parse)
+    a = response.css('li.next a')[0]
+    yield response.follow(a, callback=self.parse)
 
 To create multiple requests from an iterable, you can use
 :meth:`response.follow_all <scrapy.http.TextResponse.follow_all>` instead::
 
-    links = response.css('li.next a')
-    yield from response.follow_all(links, callback=self.parse)
+    yield from response.follow_all(response.css('a'), callback=self.parse)
 
 
 More examples and patterns
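Not part of the diff above: a minimal sketch of a complete spider combining both patterns, assuming the quotes.toscrape.com markup used elsewhere in the Scrapy tutorial (the spider name, the ``.author + a`` selector, and the ``parse_author`` callback are illustrative, not taken from the commit)::

    import scrapy


    class QuotesSpider(scrapy.Spider):
        """Sketch contrasting follow (one request) with follow_all (many)."""

        name = "quotes"
        start_urls = ["https://quotes.toscrape.com/"]

        def parse(self, response):
            # One item per quote on the page.
            for quote in response.css("div.quote"):
                yield {
                    "text": quote.css("span.text::text").get(),
                    "author": quote.css("small.author::text").get(),
                }

            # Single expected link: index the SelectorList and follow one request.
            # response.follow accepts a string URL, a Selector, or an <a> element.
            next_links = response.css("li.next a")
            if next_links:
                yield response.follow(next_links[0], callback=self.parse)

            # Genuinely plural selection: follow_all builds one request per
            # element in the iterable (here, every author link on the page).
            yield from response.follow_all(
                response.css(".author + a"), callback=self.parse_author
            )

        def parse_author(self, response):
            yield {"author_url": response.url}

The distinction the commit draws is that ``response.follow`` schedules exactly one request, so it suits selectors expected to match at most one element (such as the single ``li.next a`` pager link), while ``response.follow_all`` creates a request per element of an iterable and only reads naturally for plural selections.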