How to make requests library ignore SSL errors

We have a valid SSL certificate, but the hostnames change, and urllib3.disable_warnings() no longer seems to work.

Any ideas?

Version: 7.0 (Python 3.10.12 on linux)

Hello @teknopaul
From the requests library docs, the session has a verify=False parameter that should do the job for SSL errors.
For other warnings, the disable_warnings parameter should also help.
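For reference, the equivalent in plain Python with the requests library looks roughly like this (a minimal sketch; no actual request is made here):

```python
import requests
import urllib3

# Silence the InsecureRequestWarning that verify=False would otherwise trigger.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

session = requests.Session()
session.verify = False  # disable certificate verification for every request on this session

print(session.verify)  # False
```

Note that verify=False skips certificate checks entirely, not just the hostname match.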

regards


Still get WARNs and failures.

[ WARN ] Certificate did not match expected hostname:

SSLError: HTTPSConnectionPool(host='…

SSLError: HTTPSConnectionPool(host='pr-c9606.{baseurl-redacted}', port=443): Max retries exceeded with url: /sport-results/apidocs/ (Caused by SSLError(SSLCertVerificationError("hostname 'pr-c9606.{baseurl-redacted}' doesn't match either of 'api.{baseurl-redacted}', 'docs.{baseurl-redacted}'")))

Hi @teknopaul,

If I'm reading that error correctly, you have a certificate that is valid for api.{baseurl-redacted} and docs.{baseurl-redacted}, but you are trying to use it on a web server running on another host (pr-c9606.{baseurl-redacted}) where the certificate is not valid. So this is a different case from ignoring a certificate that's simply never valid, or self-signed.

I'm not familiar with the code of the requests library, but I suspect this is why you're encountering this issue.

I guess the questions are,

  • Is this a bug in the requests library? I'm not sure…
  • Is this a bug with your environment?
    • Did you request the api.{baseurl-redacted} host name and get redirected to pr-c9606, indicating an issue with the load balancer?
  • I would think in this situation, if I've understood it correctly, where a valid certificate is being used on the wrong server, this should be an error and fail even with Verify=False. Here your IT people should be using a self-signed cert rather than the cert they have used.

Dave.

Yeah, just the hostname is "wrong". The SSL cert is valid for api.myco.com but for pull requests we have pr123.myco.com with the same cert.

It's actually a Google certificate, signed by them; this is how our development in the cloud works. Previous robot versions did not have this issue.
It would be nice if Google would deploy a wildcard cert.

I doubt it's a bug in requests; it's a problem with Verify=False not being propagated, for some reason, from robot to urllib3.

AFAIK it's possible in urllib3 to turn off hostname validation and still validate the cert; that is common in most SSL libraries. If we could disable hostname validation, that would be better, because the cert is valid and this flow sometimes goes over the Internet.

urllib3 has

assert_hostname=False
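As a sketch of that (assuming urllib3 is available; no request is actually sent here), assert_hostname=False can be passed when building the pool, while cert_reqs keeps its default of CERT_REQUIRED so the certificate chain is still verified:

```python
import urllib3

# Hostname matching off; chain verification still on
# (cert_reqs keeps its default of CERT_REQUIRED).
pool = urllib3.PoolManager(assert_hostname=False)

print(pool.connection_pool_kw["assert_hostname"])  # False
```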

Seems to be a bug in RequestsKeywords.py to me; the session has verify=False.

In the _common_request() function:

    resp = method_function(
        self._merge_url(session, uri),  # session has verify=False but it is not passed on in kwargs
        timeout=self._get_timeout(kwargs.pop('timeout', None)),
        cookies=kwargs.pop('cookies', self.cookies),
        **kwargs)

    log.logger.error(kwargs)  # this log shows kwargs is missing verify=False

    kwargs.update({'verify': False})  # if I add this before the call, it works

This is what's missing in the library: passing the SSL bypass from the session on to the request:

    kwargs.update({'verify': session.verify})
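A minimal sketch of that propagation idea in plain Python (the helper name merged_verify is hypothetical, not from the library):

```python
import requests

session = requests.Session()
session.verify = False

def merged_verify(session, kwargs):
    # A per-request verify kwarg wins; otherwise fall back to the session setting.
    kwargs.setdefault("verify", session.verify)
    return kwargs

print(merged_verify(session, {}))  # {'verify': False}
```

This mirrors how requests itself merges session-level and per-request settings, without overriding an explicit per-request value.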

Hi @teknopaul,

I've not looked at the code for requests, so it could well be a bug; it might be best to raise an issue on the GitHub page with this detail.

Dave.

Issue: Verify=False is not propogated to bypass ssl verification · Issue #386 · MarketSquare/robotframework-requests · GitHub

Pull Request: Propagate verify=False from session to requests by phinds1 · Pull Request #387 · MarketSquare/robotframework-requests · GitHub
