Python's urllib.request module offers a urlopen function which retrieves a URL's content plus some metadata. Reading the whole response at once loads everything into main memory, which in environments with limited memory can quickly result in a MemoryError.
There is another function, urlretrieve, which seems to do what I am looking for. The official documentation, however, describes it as a legacy interface that might become deprecated in the future.
Is there an "official", built-in and non-legacy way for performing a download directly to the local file system? I am aware that this can easily be achieved with third-party libraries such as requests but I am working under strict computational & memory constraints and therefore would favor a built-in solution.
If you want to limit yourself to Python's standard library: urlopen returns an http.client.HTTPResponse object, which is a file-like object you can read in fixed-size chunks. Only one chunk needs to be held in RAM at a time; each chunk can be written to disk (or wherever else it is headed) before the next one is read.
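A minimal sketch of that approach, using shutil.copyfileobj to do the chunked copy; the URL, destination path, and 64 KiB chunk size are placeholder choices:

```python
import shutil
import urllib.request

# Placeholder URL and destination path, for illustration only.
url = "https://example.com/large-file.bin"
destination = "large-file.bin"

# urlopen returns a file-like HTTPResponse; copyfileobj streams it
# to disk in fixed-size chunks, so only one chunk is in RAM at a time.
with urllib.request.urlopen(url) as response, open(destination, "wb") as out_file:
    shutil.copyfileobj(response, out_file, length=64 * 1024)  # 64 KiB chunks
```

You could equally write an explicit `while` loop over `response.read(chunk_size)`; copyfileobj just packages that loop for you.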
The third-party requests module makes the same streaming pattern more streamlined.
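A comparable sketch with requests, again with a placeholder URL, path, and chunk size:

```python
import requests

# Placeholder URL and destination path, for illustration only.
url = "https://example.com/large-file.bin"
destination = "large-file.bin"

# stream=True defers downloading the body; iter_content yields
# fixed-size chunks so the full response is never held in memory.
with requests.get(url, stream=True, timeout=30) as response:
    response.raise_for_status()
    with open(destination, "wb") as out_file:
        for chunk in response.iter_content(chunk_size=64 * 1024):
            out_file.write(chunk)
```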