I'm trying to make a simple feed reader in Node and I'm running into a problem with multiple requests in Node.js. For example, I have an array of URLs, something like:
urls = [ "http://url1.com/rss.xml", "http://url2.com", "http://url3.com"];
Now I want to fetch the contents of each URL. My first idea was to use for(var i in urls),
but that doesn't deal with the asynchronous responses. The best option would be to do it asynchronously, but I don't know how to go about it.
Any ideas?
EDIT:
I've got this code:
var http = require('http');

var data = [];
for (var i = 0; i < urls.length; i++) {
  http.get(urls[i], function (response) {
    console.log('Response: ', response.statusCode, ' from url: ', urls[i]);
    var body = '';
    response.on('data', function (chunk) {
      body += chunk;
    });
    response.on('end', function () {
      data.push(body);
    });
  }).on('error', function (e) {
    console.log('Error: ', e.message);
  });
}
The problem is that the http.get(...) line runs for every element of the loop first; only afterwards do the response.on('data') events fire, and then response.on('end'), in whatever order the responses happen to arrive. It makes a mess and I don't know how to handle it.
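For reference, here is a minimal sketch of one way that loop could be untangled using only the core http module: the forEach callback captures each url per request, so the closures no longer all see the final value of i, and a counter tracks when every response has ended. The URLs are the placeholders from above, and the "all done" log is just illustrative.

// Sketch: one request per URL, each callback capturing its own `url` and `i`.
var http = require('http');

var urls = ["http://url1.com/rss.xml", "http://url2.com", "http://url3.com"];
var data = [];
var completed = 0;

urls.forEach(function (url, i) {
  http.get(url, function (response) {
    console.log('Response:', response.statusCode, 'from url:', url);
    var body = '';
    response.on('data', function (chunk) {
      body += chunk;
    });
    response.on('end', function () {
      data[i] = body;                 // store by index to keep the original order
      completed += 1;
      if (completed === urls.length) { // every response has ended
        console.log('All done:', data.length, 'responses');
      }
    });
  }).on('error', function (e) {
    // note: a failed request never increments `completed` in this sketch
    console.log('Error:', e.message);
  });
});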
I know this is an old question, but I think a better solution would be to use JavaScript's Promise.all():
const request = require('request-promise');

const urls = ["http://www.google.com", "http://www.example.com"];
const promises = urls.map(url => request(url));

Promise.all(promises).then((data) => {
  // data is an array of response bodies, in the same order as urls
});
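Note that request and request-promise have since been deprecated. A dependency-free variant of the same pattern (a sketch on my part, not part of the original answer) wraps Node's built-in https.get in a Promise and hands the resulting array to Promise.all:

// Sketch: promisify https.get and fan out with Promise.all.
const https = require('https');

function fetchBody(url) {
  return new Promise((resolve, reject) => {
    https.get(url, (response) => {
      let body = '';
      response.on('data', (chunk) => { body += chunk; });
      response.on('end', () => resolve(body));
    }).on('error', reject);
  });
}

const urls = ["https://www.google.com", "https://www.example.com"];

Promise.all(urls.map(fetchBody)).then((bodies) => {
  // bodies[i] is the response body for urls[i]
  console.log(bodies.map((b) => b.length));
});

Promise.all preserves input order, so bodies[i] always corresponds to urls[i] no matter which request finishes first, and a single failed request rejects the whole batch.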