Using a proxy with requests
import requests

proxy = '123.58.10.36:8080'  # local proxy
# proxy = 'username:password@123.58.10.36:8080'  # proxy with authentication
proxies = {
    'http': 'http://' + proxy,
    'https': 'https://' + proxy,
}
try:
    response = requests.get('http://httpbin.org/get', proxies=proxies)
    print(response.text)
except requests.exceptions.ConnectionError as e:
    print('Error:', e.args)
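Since the same `proxies` dict is needed with and without credentials, it can help to build it with a small helper. A minimal sketch (`build_proxies` is a hypothetical name, not part of requests):

```python
def build_proxies(proxy, username=None, password=None):
    """Build a requests-style proxies dict, optionally embedding basic auth.

    `proxy` is a bare 'host:port' string; credentials, if given, are
    inserted in the 'user:pass@host:port' form that requests understands.
    """
    if username and password:
        proxy = f'{username}:{password}@{proxy}'
    return {
        'http': 'http://' + proxy,
        'https': 'https://' + proxy,
    }

# usage: requests.get(url, proxies=build_proxies('123.58.10.36:8080'))
```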
Using a proxy with Selenium
Selenium can also be configured to go through a proxy. Selenium is mainly used to automate operations such as logging in and passing verification steps.
from selenium import webdriver

proxy = '123.58.10.36:8080'
chrome_options = webdriver.ChromeOptions()
chrome_options.add_argument('--proxy-server=http://' + proxy)
browser = webdriver.Chrome(options=chrome_options)
browser.get('http://httpbin.org/get')
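Chrome's `--proxy-server` flag does not accept `username:password@` credentials. A common workaround for an authenticated proxy is to package a tiny Chrome extension (Manifest V2, which newer Chrome versions are phasing out) that sets the proxy and answers the auth challenge, then load it with `chrome_options.add_extension(...)`. A sketch of building that extension zip (`make_proxy_extension` is a hypothetical helper):

```python
import json
import zipfile


def make_proxy_extension(host, port, username, password, path='proxy_auth.zip'):
    """Write a minimal Chrome extension zip that sets an authenticated proxy."""
    manifest = {
        "name": "Proxy Auth",
        "version": "1.0.0",
        "manifest_version": 2,
        "permissions": ["proxy", "webRequest", "webRequestBlocking", "<all_urls>"],
        "background": {"scripts": ["background.js"]},
    }
    background = """
    chrome.proxy.settings.set({
        value: {mode: "fixed_servers",
                rules: {singleProxy: {scheme: "http", host: "%s", port: %d}}},
        scope: "regular"}, function() {});
    chrome.webRequest.onAuthRequired.addListener(
        function(details) {
            return {authCredentials: {username: "%s", password: "%s"}};
        },
        {urls: ["<all_urls>"]}, ["blocking"]);
    """ % (host, port, username, password)
    with zipfile.ZipFile(path, 'w') as zf:
        zf.writestr('manifest.json', json.dumps(manifest))
        zf.writestr('background.js', background)
    return path
```

Usage would then look like `chrome_options.add_extension(make_proxy_extension('123.58.10.36', 8080, 'user', 'pass'))` before creating the driver.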
Adding a proxy with urllib's ProxyHandler
from urllib import request

url = 'http://www.baidu.com'
# configure the proxy ('ip:port' is a placeholder)
handler = request.ProxyHandler({'http': 'http://ip:port'})
opener = request.build_opener(handler)
# send the request
req = request.Request(url=url)
response = opener.open(req)
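Instead of calling `opener.open()` explicitly each time, the opener can be installed globally with `request.install_opener`, after which plain `request.urlopen()` calls also route through the proxy. A minimal sketch ('ip:port' is still a placeholder for a real proxy address):

```python
from urllib import request

# placeholder proxy; replace with a real address before making requests
proxy_handler = request.ProxyHandler({
    'http': 'http://ip:port',
    'https': 'http://ip:port',
})
opener = request.build_opener(proxy_handler)

# install globally: subsequent request.urlopen() calls use this opener
request.install_opener(opener)
```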
Adding a proxy in a Scrapy downloader middleware
In middlewares.py, define a custom proxy middleware class and implement the process_request method:
class MyDaiLi(object):
    """Downloader middleware that attaches a proxy to every request."""

    # called by Scrapy for every outgoing request
    def process_request(self, request, spider):
        request.meta['proxy'] = 'http://ip:port'
In settings.py (around line 55), uncomment the DOWNLOADER_MIDDLEWARES setting and register the custom MyDaiLi class:
DOWNLOADER_MIDDLEWARES = {
    'daili_loginproject.middlewares.MyDaiLi': 543,
}
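The same hook can rotate through a pool of proxies instead of pinning a single one: pick a proxy at random in `process_request`. A sketch under the assumption that you maintain your own proxy list (`RandomProxyMiddleware` and its addresses are hypothetical):

```python
import random


class RandomProxyMiddleware(object):
    """Hypothetical downloader middleware that rotates proxies per request."""

    # example pool; replace with your own working proxies
    PROXIES = [
        'http://123.58.10.36:8080',
        'http://123.58.10.37:8080',
    ]

    def process_request(self, request, spider):
        # Scrapy reads the proxy for each request from request.meta['proxy']
        request.meta['proxy'] = random.choice(self.PROXIES)
```

Register it in DOWNLOADER_MIDDLEWARES the same way as MyDaiLi above.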