Randomly Switching the User-Agent in Scrapy




Use the fake-useragent library. Install it first:

pip install fake-useragent

Usage:

from fake_useragent import UserAgent

ua = UserAgent()

ua.ie
# Mozilla/5.0 (Windows; U; MSIE 9.0; Windows NT 9.0; en-US)
ua.msie
# Mozilla/5.0 (compatible; MSIE 10.0; Macintosh; Intel Mac OS X 10_7_3; Trident/6.0)
ua['Internet Explorer']
# Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; GTB7.4; InfoPath.2; SV1; .NET CLR 3.3.69573; WOW64; en-US)
ua.opera
# Opera/9.80 (X11; Linux i686; U; ru) Presto/2.8.131 Version/11.11
ua.chrome
# Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.2 (KHTML, like Gecko) Chrome/22.0.1216.0 Safari/537.2
ua.google
# Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_4) AppleWebKit/537.13 (KHTML, like Gecko) Chrome/24.0.1290.1 Safari/537.13
ua['google chrome']
# Mozilla/5.0 (X11; CrOS i686 2268.111.0) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.57 Safari/536.11
ua.firefox
# Mozilla/5.0 (Windows NT 6.2; Win64; x64; rv:16.0.1) Gecko/20121011 Firefox/16.0.1
ua.ff
# Mozilla/5.0 (X11; Ubuntu; Linux i686; rv:15.0) Gecko/20100101 Firefox/15.0.1
ua.safari
# Mozilla/5.0 (iPad; CPU OS 6_0 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A5355d Safari/8536.25

# and the best one: random, weighted by real-world browser usage statistics
ua.random
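
Since each UA family is just an attribute name, the type can also be chosen dynamically with getattr; this is the pattern the middleware below relies on. A minimal sketch (the ua_type value is only an illustration):

from fake_useragent import UserAgent

ua = UserAgent()
ua_type = "random"  # could come from a config value such as RANDOM_UA_TYPE
print(getattr(ua, ua_type))  # prints a User-Agent string of the chosen type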

Next, write a Downloader Middleware that randomly switches the User-Agent. The new middlewares.py code is as follows:

from fake_useragent import UserAgent


class RandomUserAgentMiddlware(object):
    """Randomly switch the User-Agent for every outgoing request."""

    def __init__(self, crawler):
        super(RandomUserAgentMiddlware, self).__init__()
        self.ua = UserAgent()
        # Read RANDOM_UA_TYPE from the settings file to decide which UA family
        # to use. The default is "random"; it can also be "ie", "firefox", etc.
        # (see the usage examples above).
        self.ua_type = crawler.settings.get("RANDOM_UA_TYPE", "random")

    @classmethod
    def from_crawler(cls, crawler):
        return cls(crawler)

    def process_request(self, request, spider):
        # The user-agent switching logic lives here.
        ua = getattr(self.ua, self.ua_type)
        print(ua)  # debug output; remove or replace with logging in production
        request.headers.setdefault('User-Agent', ua)
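
The middleware can be exercised without running a full crawl. A quick standalone check, assuming the class above lives in the JobSpider project's middlewares.py and that Scrapy's test helper get_crawler is available (a sketch, not part of the original project):

from scrapy import Request
from scrapy.utils.test import get_crawler

from JobSpider.middlewares import RandomUserAgentMiddlware

# Build a crawler whose settings carry the custom RANDOM_UA_TYPE value.
crawler = get_crawler(settings_dict={"RANDOM_UA_TYPE": "random"})
middleware = RandomUserAgentMiddlware.from_crawler(crawler)

request = Request("https://example.com")
middleware.process_request(request, spider=None)
print(request.headers.get("User-Agent"))  # the randomly chosen UA (as bytes)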

Then enable the middleware in settings.py, and at the same time disable Scrapy's built-in user-agent middleware:

# Enable or disable downloader middlewares
# See the Scrapy documentation on downloader middlewares
DOWNLOADER_MIDDLEWARES = {
    'JobSpider.middlewares.RandomUserAgentMiddlware': 543,
    'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware': None,
}
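
The middleware also reads the custom RANDOM_UA_TYPE setting to decide which UA family to use; if it is not defined, it falls back to "random". For example (the setting name comes from the middleware above; the value "chrome" is only an illustration):

# settings.py
RANDOM_UA_TYPE = "chrome"   # or "random", "firefox", "ie", ...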


