 

How to limit content length response of simplified HTTP request in node?

I would like to set up the request() package (the "Simplified HTTP client") to abort the download of HTTP resources that are too big.

Let's imagine request() is set up to download a URL and the resource size is 5 gigabytes. I would like request() to stop downloading after 10 MB. Usually, when request() gets a response it receives all the HTTP headers and everything that follows; by the time you can manipulate the data, the whole body has already been downloaded.

In axios, there is a parameter called maxContentLength but I can't find anything similar for request().
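For reference, this is roughly the behaviour I mean, as a minimal axios sketch (based on its documented maxContentLength option, which makes the request fail once the body grows past the limit):

const axios = require('axios');

axios.get('http://de.releases.ubuntu.com/xenial/ubuntu-16.04.3-desktop-amd64.iso', {
    maxContentLength: 10 * 1024 * 1024 // reject responses larger than 10 MB
})
    .then(response => console.log('Downloaded', response.data.length, 'bytes'))
    .catch(error => console.error('Rejected:', error.message));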

I must also mention that I don't want it to just catch an error; it should still download at least the headers and the beginning of the resource.

asked Nov 22 '17 by Nicolas Guérinet

1 Answer

const request = require('request');
const URL = 'http://de.releases.ubuntu.com/xenial/ubuntu-16.04.3-desktop-amd64.iso';
const MAX_SIZE = 10 * 1024 * 1024; // 10 MB, maximum size to download
let total_bytes_read = 0;

1 - If the response from the server is gzip-compressed, you should enable the gzip option (see https://github.com/request/request#examples): "For backwards-compatibility, response compression is not supported by default. To accept gzip-compressed responses, set the gzip option to true."

request
    .get({
        uri: URL,
        gzip: true
    })
    .on('error', function (error) {
        //TODO: error handling
        console.error('ERROR::', error);
    })
    .on('data', function (data) {
        // decompressed data
        total_bytes_read += data.length;
        console.log('Decompressed chunk received:', data.length, '| Total downloaded:', total_bytes_read);
        if (total_bytes_read >= MAX_SIZE) {
            //TODO: handle the "exceeds max size" case
            console.error('Request exceeds max size, aborting.');
            this.abort(); // stop downloading the rest of the resource
        }
    })
    .on('response', function (response) {
        response.on('data', function (chunk) {
            // compressed data, as received on the wire
            console.log('Compressed chunk received:', chunk.length, '| Total downloaded:', total_bytes_read);
        });
    })
    .on('end', function () {
        console.log('Request completed! Total size downloaded:', total_bytes_read);
    });

NB: If the server does not compress the response but you still set the gzip option, the decompressed chunk and the original chunk will be equal, so you can do the limit check either way (on the decompressed or the compressed chunks). However, if the response is compressed, you should check the size limit against the decompressed chunks.

2 - If the response is not compressed, you don't need the gzip option to decompress it:

const req = request.get(URL);

req
    .on('error', function (error) {
        //TODO: error handling
        console.error('ERROR::', error);
    })
    .on('response', function (response) {
        response.on('data', function (chunk) {
            // raw (uncompressed) data
            total_bytes_read += chunk.length;
            console.log('Received chunk:', chunk.length, '| Total downloaded:', total_bytes_read);
            if (total_bytes_read >= MAX_SIZE) {
                //TODO: handle the "exceeds max size" case
                console.error('Request exceeds max size, aborting.');
                req.abort(); // stop downloading the rest of the resource
            }
        });
    })
    .on('end', function () {
        console.log('Request completed! Total size downloaded:', total_bytes_read);
    });
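
Additionally, if the server sends a Content-Length header, you can check it in the 'response' handler and abort before any of the body is downloaded. A minimal sketch of that check (it only works when the header is actually present; servers using chunked transfer encoding may omit it):

// reuses request, URL and MAX_SIZE from the constants above
const req2 = request.get(URL);

req2.on('response', function (response) {
    // Content-Length (if present) is the body size in bytes as announced by the server
    const length = parseInt(response.headers['content-length'], 10);
    if (!isNaN(length) && length > MAX_SIZE) {
        console.error('Resource is', length, 'bytes, aborting before downloading the body.');
        req2.abort(); // headers are already received, the body is not downloaded
    }
});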
answered Sep 22 '22 by Mehari Mamo