
MemoryError exception while trying to read large website file data

I'm trying to read a large website's data, but I'm facing this MemoryError exception:

import requests
requests.urllib3.disable_warnings()
search_page = "http://www.yachtworld.co.uk/core/listing/cache/searchResults.jsp?ps=99999"
y = requests.get(search_page, timeout=999999, stream=True)
result = y.text

I get the MemoryError exception when I try to read from the result variable, which holds the output of the page.

Is there any way to read the whole data without hitting this exception?

Thanks.

asked Aug 31 '25 by hackerman

1 Answer

As far as I know nothing about this has changed: there is no way to hold the whole response in memory at once, but you can load the data in chunks, as presented well here

The accepted answer at the link I provided gives a quite good piece of code for chunking the response:

import requests

def download_file(url):
    local_filename = url.split('/')[-1]
    # NOTE the stream=True parameter: the body is fetched lazily, not all at once
    r = requests.get(url, stream=True)
    with open(local_filename, 'wb') as f:
        for chunk in r.iter_content(chunk_size=1024):
            if chunk:  # filter out keep-alive new chunks
                f.write(chunk)
                # f.flush() commented out on J.F.Sebastian's recommendation
    return local_filename
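
For example, something like this (a minimal sketch that reuses the download_file function above and the searchResults.jsp URL from your question; the processing step is just a placeholder):

search_page = "http://www.yachtworld.co.uk/core/listing/cache/searchResults.jsp?ps=99999"

# Stream the response straight to disk instead of building one huge string.
local_filename = download_file(search_page)

# Read the saved file back incrementally, one line at a time,
# so the whole page never has to sit in memory at once.
with open(local_filename, encoding="utf-8", errors="replace") as f:
    for line in f:
        pass  # parse/process each line here

This keeps peak memory use roughly constant regardless of how large the page is, which is the point of streaming rather than calling y.text.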
answered Sep 02 '25 by trust512