I have written a very small Python client to access the Confluence RESTful API. I am using the HTTPS protocol to connect to Confluence, and I am running into a Connection reset by peer error.
Here is the full stack trace:
/Users/rakesh.kumar/.virtualenvs/wpToConfluence.py/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:318: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#snimissingwarning.
SNIMissingWarning
/Users/rakesh.kumar/.virtualenvs/wpToConfluence.py/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
InsecurePlatformWarning
Traceback (most recent call last):
  File "wpToConfluence.py", line 15, in <module>
    main()
  File "wpToConfluence.py", line 11, in main
    headers={'content-type': 'application/json'})
  File "/Users/rakesh.kumar/.virtualenvs/wpToConfluence.py/lib/python2.7/site-packages/requests/api.py", line 71, in get
    return request('get', url, params=params, **kwargs)
  File "/Users/rakesh.kumar/.virtualenvs/wpToConfluence.py/lib/python2.7/site-packages/requests/api.py", line 57, in request
    return session.request(method=method, url=url, **kwargs)
  File "/Users/rakesh.kumar/.virtualenvs/wpToConfluence.py/lib/python2.7/site-packages/requests/sessions.py", line 475, in request
    resp = self.send(prep, **send_kwargs)
  File "/Users/rakesh.kumar/.virtualenvs/wpToConfluence.py/lib/python2.7/site-packages/requests/sessions.py", line 585, in send
    r = adapter.send(request, **kwargs)
  File "/Users/rakesh.kumar/.virtualenvs/wpToConfluence.py/lib/python2.7/site-packages/requests/adapters.py", line 453, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', error(54, 'Connection reset by peer'))
Here is my client code:
import requests


def main():
    # Read the API credential from a local file and issue a GET against the
    # Confluence content endpoint.
    auth = open('/tmp/confluence', 'r').readline().strip()
    username = 'rakesh.kumar'
    response = requests.get("https://<HOST-NAME>/rest/api/content/",
                            auth=(username, auth),
                            headers={'content-type': 'application/json'})
    print response


if __name__ == "__main__":
    main()
I am running this script in a virtual environment, and the following packages are installed in that environment:
(wpToConfluence.py)➜ Python pip list
You are using pip version 6.1.1, however version 8.1.2 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
appnope (0.1.0)
backports.shutil-get-terminal-size (1.0.0)
decorator (4.0.10)
ipdb (0.10.1)
ipython (5.0.0)
ipython-genutils (0.1.0)
pathlib2 (2.1.0)
pexpect (4.2.0)
pickleshare (0.7.3)
pip (6.1.1)
prompt-toolkit (1.0.5)
ptyprocess (0.5.1)
Pygments (2.1.3)
requests (2.10.0)
setuptools (25.1.6)
simplegeneric (0.8.1)
six (1.10.0)
traitlets (4.2.2)
urllib3 (1.16)
wcwidth (0.1.7)
It does complain about the Python version, but I am not sure how to update the Python used by my Mac/virtual environment.
I have tried both the curl command and Postman, and both work fine with the same parameters.
While installing the requests library, pip skips a few optional security packages ('pyOpenSSL', 'ndg-httpsclient', and 'pyasn1') which are required for SSL/HTTPS connections.
You can fix it by running either
pip install "requests[security]"
or
pip install pyopenssl ndg-httpsclient pyasn1
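As a quick sanity check that the extras were actually picked up, you can try importing the pyOpenSSL shim that requests ships with its vendored urllib3. This is only a minimal sketch; the module path below assumes a requests 2.x install with bundled urllib3, so treat it as an illustration rather than a guaranteed API:

import requests

# requests 2.x injects pyOpenSSL into its vendored urllib3 automatically when
# the security extras are installed; doing the import by hand here simply
# surfaces an ImportError if any of those packages are still missing.
try:
    from requests.packages.urllib3.contrib import pyopenssl
    pyopenssl.inject_into_urllib3()
    print "pyOpenSSL/SNI support is available"
except ImportError:
    print "security extras are still missing; try: pip install requests[security]"

Once the import succeeds, the SNIMissingWarning and InsecurePlatformWarning from the question should no longer appear when the client runs.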
I tried installing all the optional security packages suggested in the answer above, but nothing seemed to work.
One important gotcha: check whether your URL endpoint actively blocks programmatic access.
Take a look at the robots.txt file in the root directory of the website, e.g. http://myweburl.com/robots.txt.
If it contains text that looks like this:
User-agent: *
Disallow: /
then the site does not want to be scraped, and that can produce the same dreaded error 54, connection reset by peer.
Here is a snapshot of https://www.aclibrary.org/robots.txt:
User-agent: discobot
Disallow: /
User-agent: AddThis.com
Disallow: /
User-agent: Yandex
Disallow: /
User-agent: Baiduspider
Disallow: /
User-agent: Baiduspider-video
Disallow: /
User-agent: Baiduspider-image
Disallow: /
User-agent: SemrushBot
Disallow: /
User-agent: SemrushBot-SA
Disallow: /
User-Agent: W3C-checklink
Crawl-delay: 0
User-agent: Twitterbot
Disallow:
User-agent: *
Crawl-delay: 10
Disallow: /er.php
Disallow: /err.php
Disallow: /go.php
Disallow: /friendly.php
Disallow: /ld.php
Disallow: /srch.php
Sitemap: https://aclibrary.org/sitemap.xml
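If you want to check this from the script itself, the Python standard library ships a robots.txt parser. A minimal sketch (the URL below is just the example host from the snapshot above; on Python 3 the module is urllib.robotparser):

import robotparser

# Parse the site's robots.txt and ask whether a generic client ('*') is
# allowed to fetch the page we are about to request.
rp = robotparser.RobotFileParser()
rp.set_url("https://www.aclibrary.org/robots.txt")
rp.read()
print rp.can_fetch("*", "https://www.aclibrary.org/")

Note that robots.txt only expresses the site's policy; servers that reset connections usually enforce it elsewhere, so this check is a hint rather than proof of what caused the error.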