Advanced Python 3 Scraping: Building Your Own Proxy Pool

I. Scraping Approach
1. Visit the Xici proxy site at https://www.xicidaili.com/nn/ and locate the tags that hold the IP entries
2. Verify that each proxy actually works
3. Keep the usable proxies and discard the unusable ones
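The three steps above can be sketched as a small pipeline. The helper names here (`fetch_candidates`, `is_usable`, `build_pool`) are placeholders for illustration only, not the functions used later in this article:

```python
import re

def fetch_candidates():
    # Stand-in for step 1; in the real script the candidates come
    # from parsing the xicidaili page with BeautifulSoup.
    return ["61.135.217.7:80", "not-an-ip", "121.31.101.41:8123"]

def is_usable(ip):
    # Stand-in for step 2; the real script issues a proxied HTTP
    # request with a timeout. Here we only sanity-check the format.
    return re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}:\d+", ip) is not None

def build_pool():
    # Step 3: keep the usable proxies, discard the rest.
    return [ip for ip in fetch_candidates() if is_usable(ip)]
```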

II. Hands-On
1. Open the Xici proxy page and find the tags that hold the IPs and ports, as shown in the figure below:
[Figure: the `ip_list` table markup on the xicidaili page]
2. Fetch the IPs
Use BeautifulSoup to extract the rows of the `ip_list` table and build each `ip` string from them

url = 'http://www.xicidaili.com/nn'
response = requests.get(url, headers=self.headers)
html = response.text
soup = BeautifulSoup(html, 'lxml')
ip_list = soup.find(id='ip_list').find_all('tr')
for i in range(1, len(ip_list)):  # start at 1 to skip the table header row
    ip_info = ip_list[i]
    tds = ip_info.find_all('td')
    ip = tds[1].text + ':' + tds[2].text

3. Verify the proxies
Use each IP to request Baidu; if the request takes longer than the timeout, the proxy is considered unusable

proxies = {"http": ip}
url = "http://www.baidu.com/"
try:
    req = requests.get(url, headers=self.headers, proxies=proxies, timeout=3)
    if req.status_code == 200:
        return True
    else:
        return False
except requests.RequestException as e:
    print("IP " + ip + " is unusable:")
    print(e)
    return False
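One detail worth noting: `requests` matches the keys of the `proxies` mapping against the target URL's scheme, so the `{"http": ip}` entry above only routes plain-HTTP requests (like the Baidu test URL) through the proxy. A sketch of a fuller mapping, using a made-up example address:

```python
ip = "121.31.101.41:8123"  # hypothetical proxy address, for illustration only

# requests routes a request through a proxy only when the target
# URL's scheme ("http" or "https") has a matching key here.
proxies = {
    "http": "http://" + ip,
    "https": "http://" + ip,  # HTTPS traffic tunneled through the same HTTP proxy
}
```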

III. Full Source and Results

import requests, time
from bs4 import BeautifulSoup


def loadPage(url, headers):
    req = requests.get(url, headers=headers)
    html = req.text
    soup = BeautifulSoup(html, 'lxml')
    ip_list = soup.find(id='ip_list').find_all('tr')
    for i in range(1, len(ip_list)):  # skip the table header row
        ip_info = ip_list[i]
        tds = ip_info.find_all('td')
        ip = tds[1].text + ':' + tds[2].text
        # check whether the proxy works
        if verify_IP(ip, headers):
            # append the usable proxy to a file
            with open("ip_records.txt", 'a', encoding="utf-8") as dir_file:
                dir_file.write(ip + "\n")
            time.sleep(5)

def verify_IP(ip, headers):
    proxies = {"http": ip}
    url = "http://www.baidu.com/"
    try:
        req = requests.get(url, headers=headers, proxies=proxies, timeout=3)
        if req.status_code == 200:
            return True
        else:
            return False
    except requests.RequestException as e:
        print("IP " + ip + " is unusable:")
        print(e)
        return False


if __name__ == '__main__':
    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 "
                      "(KHTML, like Gecko) Chrome/51.0.2704.63 Safari/537.36"
    }
    url = 'http://www.xicidaili.com/nn'
    loadPage(url, headers)

Run results:
[Screenshot: the usable proxies written to ip_records.txt]
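Once `ip_records.txt` has been populated, the pool can be consumed by picking a random proxy per request. This is a minimal sketch under that assumption; the helper names (`load_pool`, `random_proxies`) are mine, not part of the original script:

```python
import random

def load_pool(path="ip_records.txt"):
    # One "ip:port" entry per line, as written out by loadPage() above.
    with open(path, encoding="utf-8") as f:
        return [line.strip() for line in f if line.strip()]

def random_proxies(pool):
    # Build a requests-style proxies mapping from a random pool entry.
    ip = random.choice(pool)
    return {"http": ip}
```

A caller would then pass the result to `requests.get(url, proxies=random_proxies(pool), timeout=3)`, and ideally drop entries from the pool once they start failing.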


Copyright notice: this is an original article by wwq114, released under the CC 4.0 BY-SA license. When reposting, please include a link to the original source and this notice.