 

How to synchronously download files from URL in node.js [closed]

I'm trying to download a number of files from Flickr using the Flickr API and http.get() calls in a loop.

I have an array of image URLs, and I use a 'download' function to fetch each picture. When the number of images is large, most of the resulting files end up empty. I found the download code here.
Please advise how to approach this.
Thanks in advance!

for (var i = 1; i < 100; i++) {
    var filename = "./images/file" + i + ".jpg";
    download(photourl[i], filename, {});
} // End of for-loop

.....

var fs = require('fs');
var http = require('http');

var download = function(url, dest, cb) {
  var file = fs.createWriteStream(dest);
  var request = http.get(url, function(response) {
    response.pipe(file);
    file.on('finish', function() {
      file.close();
      //cb();
    });
  });
};

P.S. And then finally there is an error:

    events.js:72
            throw er; // Unhandled 'error' event
                  ^
    Error: socket hang up
        at createHangUpError (http.js:1442:15)
        at Socket.socketOnEnd [as onend] (http.js:1538:23)
        at Socket.g (events.js:175:14)
        at Socket.EventEmitter.emit (events.js:117:20)
        at _stream_readable.js:910:16
        at process._tickCallback (node.js:415:13)

asked Oct 08 '13 by user2013424

1 Answer

I recommend using the async module for this:

var i = 1, threads = 5;
require('async').eachLimit(photourl, threads, function(url, next){
  download(url, "./images/file" + (i++) + ".jpg", next);
}, function(){
  console.log('finished');
});

and uncomment cb(); in the download function, so that async knows when each download has finished and never starts more than 'threads' downloads at once.

answered Sep 29 '22 by vp_arth