TIL: benchmarking a single GET request with 5 different Python HTTP libraries

Tom Deneire
2 min read · Oct 6, 2023
Photo by Chris Liverani on Unsplash

Today I needed to figure out which Python HTTP library would give me the best performance for a plain and simple HTTP GET request, so I thought I’d write up my approach and results for future reference.

Libraries

I tested five different libraries:

  • urllib, which is in the standard library
  • urllib3, which is presented as urllib on steroids
  • requests, or “HTTP for Humans™”
  • aiohttp, an asynchronous HTTP Client/Server for asyncio and Python
  • httpx, the only one of the five to offer both HTTP/1.1 and HTTP/2

For the record, I am aware that aiohttp (and to some extent httpx too) is specifically geared towards asynchronous requests, which is actually a disadvantage when testing a single request, but I wanted to get the numbers anyway. The sketch below shows where that overhead comes from.
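
This is a minimal sketch of what a single aiohttp GET involves (not the benchmarked script itself, which is in the repository linked below); even one request has to start and tear down an asyncio event loop, a fixed cost that a synchronous client does not pay:

import asyncio

import aiohttp


async def fetch(url: str) -> bytes:
    # a session is the recommended entry point for aiohttp requests
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.read()


if __name__ == "__main__":
    # asyncio.run() creates and closes an event loop on every call
    asyncio.run(fetch("https://tomdeneire.be/static/homepage/img/profile.png"))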

System

These tests were performed on a machine with the following specifications.

Operating system: Linux Mint 21.2 Cinnamon
Linux kernel: 5.15.0-78-generic
Processor: Intel(R) Core(TM) i7-8550U CPU @ 1.80GHz x 4
RAM: 15.3 GiB
Hard drive: 518.3 GB
System: Dell XPS 13 9370

I used Python 3.11.5 with these packages:

urllib3==1.26.5
requests==2.28.2
aiohttp==3.8.5
httpx==0.24.1

Scripts

This is the actual code that was run, which you can also find in this repository:

urllib

import sys
from urllib.request import urlopen


def fetch(url: str) -> None:
    try:
        # read the full response body into memory
        with urlopen(url) as urlreader:
            response = urlreader.read()
    except Exception as err:
        print(err, file=sys.stderr)
        sys.exit(1)
    # sanity check: the test image is 399,609 bytes
    assert len(response) == 399609


if __name__ == "__main__":
    fetch("https://tomdeneire.be/static/homepage/img/profile.png")
