I was testing different Python HTTP libraries today and I realized that the http.client library seems to perform much, much faster than requests.

To test it, you can run the following two code samples:

```python
import http.client

conn = http.client.HTTPConnection("localhost", port=8000)
```

And here is code doing the same thing with python-requests:

```python
import requests
```

If I start SimpleHTTPServer:

```
> python -m http.server
```

and run the above code samples (I'm using Python 3.5.2), I get:

```
Python-requests: 1.76user 0.10system 0:02.17elapsed 85%CPU
```

Are my measurements and tests correct? Can you reproduce them too? If yes, does anyone know what's going on inside http.client that makes it so much faster? Why is there such a big difference in processing time?

---

Based on profiling both, the main difference appears to be that the requests version is doing a DNS lookup for every request, while the http.client version is doing so once. You're providing the hostname to HTTPConnection() once, so it makes sense that it would call gethostbyname once. requests.Session probably could cache hostname lookups, but it apparently does not.

EDIT: After some further research, it's not just a simple matter of caching. Copy-pasting a response posted in the python-requests repo: httplib can be thought of as the bottom layer of the stack: it does the low-level wrangling of sockets. The reason Requests is slower is that it does substantially more than httplib. There's a function for determining whether to bypass proxies which ends up invoking gethostbyname regardless of the actual request itself.
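The answer's central claim — that a reused HTTPConnection resolves the hostname only once, no matter how many requests it sends — can be checked directly with the standard library. The sketch below is not from the original post: the quiet handler class, the background server on an ephemeral port, and the lookup counter are all my own scaffolding. It wraps socket.getaddrinfo (the resolver entry point http.client reaches via socket.create_connection) with a counter, then sends ten requests over a single connection.

```python
import http.client
import socket
import threading
from http.server import HTTPServer, SimpleHTTPRequestHandler

class QuietHandler(SimpleHTTPRequestHandler):
    protocol_version = "HTTP/1.1"   # keep-alive, so the client socket is reused

    def log_message(self, *args):   # silence per-request logging
        pass

# Serve the current directory on an ephemeral port in a background thread.
server = HTTPServer(("localhost", 0), QuietHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Count name resolutions by wrapping socket.getaddrinfo, which
# http.client calls (via socket.create_connection) when it connects.
lookups = 0
_real_getaddrinfo = socket.getaddrinfo

def counting_getaddrinfo(*args, **kwargs):
    global lookups
    lookups += 1
    return _real_getaddrinfo(*args, **kwargs)

socket.getaddrinfo = counting_getaddrinfo

# Ten requests over one HTTPConnection: the host is resolved only when
# the connection is first established, not once per request.
conn = http.client.HTTPConnection("localhost", port=server.server_port)
for _ in range(10):
    conn.request("GET", "/")
    conn.getresponse().read()
conn.close()

server.shutdown()
socket.getaddrinfo = _real_getaddrinfo
print(lookups)
```

This prints a lookup count of 1 for ten requests, which matches the profiling observation above: the per-request cost in requests comes from work (like the proxy-bypass check) repeated on every call, not from anything http.client does per request.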