Web crawler - help wanted: Python urlopen IOError: [Errno socket error] [Errno 10060]
大家讲道理 2017-04-17 17:07:40

I wrote a small crawler in Python using only the urllib2, urllib, and re modules. Could any of the experts here help me figure out the error below?

Traceback (most recent call last):
  File "C:/Users/user/Desktop/python ����/mm/mm.py", line 62, in <module>
    urllib.urlretrieve(mat[0], fname)
  File "D:\Python27\lib\urllib.py", line 94, in urlretrieve
    return _urlopener.retrieve(url, filename, reporthook, data)
  File "D:\Python27\lib\urllib.py", line 240, in retrieve
    fp = self.open(url, data)
  File "D:\Python27\lib\urllib.py", line 208, in open
    return getattr(self, name)(url)
  File "D:\Python27\lib\urllib.py", line 345, in open_http
    h.endheaders(data)
  File "D:\Python27\lib\httplib.py", line 991, in endheaders
    self._send_output(message_body)
  File "D:\Python27\lib\httplib.py", line 844, in _send_output
    self.send(msg)
  File "D:\Python27\lib\httplib.py", line 806, in send
    self.connect()
  File "D:\Python27\lib\httplib.py", line 787, in connect
    self.timeout, self.source_address)
  File "D:\Python27\lib\socket.py", line 571, in create_connection
    raise err
IOError: [Errno socket error] [Errno 10060]


Replies (1)
巴扎黑

This behavior is normal. Frequent requests to a website can be mistaken for a DoS attack, and sites with rate limiting will typically stop responding for a period of time. You can catch this exception, sleep for a while, and then retry; better yet, increase the delay with each failed attempt using exponential backoff.

IOError: [Errno socket error] [Errno 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond
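The retry advice above can be sketched as a small helper. This is a hypothetical illustration, not the asker's code: `retry_with_backoff` is an invented name, and the delay parameters are arbitrary. It catches `IOError` (which covers the socket timeout in the traceback), sleeps for `base_delay * 2**attempt` seconds, and retries up to a fixed number of times before re-raising.

```python
import time


def retry_with_backoff(func, max_attempts=5, base_delay=1.0):
    """Call func(); on IOError, wait with exponential backoff and retry.

    Hypothetical helper: sleeps base_delay * 2**attempt seconds between
    attempts (1s, 2s, 4s, ...) and re-raises after max_attempts failures.
    """
    for attempt in range(max_attempts):
        try:
            return func()
        except IOError:
            if attempt == max_attempts - 1:
                raise  # give up after the final attempt
            time.sleep(base_delay * (2 ** attempt))


# With the asker's Python 2 call it might be used like this:
#     retry_with_backoff(lambda: urllib.urlretrieve(mat[0], fname))
```

Wrapping the download in a callable keeps the backoff logic separate from the crawling code, so the same helper can guard every network call the crawler makes.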