I'm scraping some internal pages with Python and requests, having turned off SSL verification and the associated warnings:
import requests

requests.packages.urllib3.disable_warnings()  # silence the InsecureRequestWarning that verify=False triggers
page = requests.get(url, verify=False)
On certain servers, though, I hit an SSL error I can't get past:
Traceback (most recent call last):
  File "scraper.py", line 6, in <module>
    page = requests.get(url, verify=False)
  File "/cygdrive/c/Users/jfeocco/VirtualEnv/scraping/lib/python3.4/site-packages/requests/api.py", line 71, in get
    return request('get', url, params=params, **kwargs)
  File "/cygdrive/c/Users/jfeocco/VirtualEnv/scraping/lib/python3.4/site-packages/requests/api.py", line 57, in request
    return session.request(method=method, url=url, **kwargs)
  File "/cygdrive/c/Users/jfeocco/VirtualEnv/scraping/lib/python3.4/site-packages/requests/sessions.py", line 475, in request
    resp = self.send(prep, **send_kwargs)
  File "/cygdrive/c/Users/jfeocco/VirtualEnv/scraping/lib/python3.4/site-packages/requests/sessions.py", line 585, in send
    r = adapter.send(request, **kwargs)
  File "/cygdrive/c/Users/jfeocco/VirtualEnv/scraping/lib/python3.4/site-packages/requests/adapters.py", line 477, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: [SSL: SSL_NEGATIVE_LENGTH] dh key too small (_ssl.c:600)
This happens both inside and outside of Cygwin, on Windows and on OS X. My research points at outdated OpenSSL on the server: newer client-side OpenSSL builds reject handshakes whose ephemeral Diffie-Hellman key is too short (the post-Logjam hardening), which is exactly what the "dh key too small" error means. I'm ideally looking for a client-side fix.
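For what it's worth, the failure reproduces with the standard-library ssl module alone, so it isn't specific to requests. A minimal sketch (host is a placeholder for one of the affected servers):

import socket
import ssl

host = "internal.example.com"  # placeholder for an affected server

ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # mirror verify=False

with socket.create_connection((host, 443)) as sock:
    # Against the affected servers this raises
    # ssl.SSLError: [SSL: SSL_NEGATIVE_LENGTH] dh key too small
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        print(tls.cipher())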
Edit:
I was able to resolve this by overriding the default cipher list so that Diffie-Hellman key exchange is excluded:
import requests

# Drop Diffie-Hellman key exchange from urllib3's default cipher list
# (note the leading ':' so the string appends as a new cipher token)
requests.packages.urllib3.util.ssl_.DEFAULT_CIPHERS += ':HIGH:!DH:!aNULL'
try:
    # Apply the same override to the pyOpenSSL-backed TLS path, if present
    requests.packages.urllib3.contrib.pyopenssl.DEFAULT_SSL_CIPHER_LIST += ':HIGH:!DH:!aNULL'
except AttributeError:
    # no pyOpenSSL support used / needed / available
    pass

page = requests.get(url, verify=False)
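If mutating the module-level defaults feels too global, the same cipher override can be scoped to a single Session via a custom HTTPAdapter. A sketch, assuming a reasonably recent requests/urllib3 whose pool manager accepts an ssl_context keyword (the adapter name is mine):

import ssl
import requests
from requests.adapters import HTTPAdapter

class NoDHAdapter(HTTPAdapter):
    """Hypothetical adapter: excludes DH key exchange and skips verification."""
    def init_poolmanager(self, *args, **kwargs):
        ctx = ssl.create_default_context()
        ctx.set_ciphers('HIGH:!DH:!aNULL')  # drop Diffie-Hellman suites
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE     # mirror verify=False
        kwargs['ssl_context'] = ctx
        return super().init_poolmanager(*args, **kwargs)

session = requests.Session()
session.mount('https://', NoDHAdapter())
page = session.get(url, verify=False)

This keeps the override away from any other code in the process that also uses requests.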