I need to make some concurrent XML feed requests in Perl. What is the fastest way to do this?
I would probably use AnyEvent, perhaps like this:
use AnyEvent;
use AnyEvent::HTTP;

sub get_feeds {
    my @feeds = @_;
    my $done  = AnyEvent->condvar;
    my %results;

    # The outer begin/end pair holds the counter open until every
    # per-feed end() has fired; send() then delivers the results.
    $done->begin( sub { $done->send(\%results) } );

    for my $feed (@feeds) {
        $done->begin;
        http_get $feed, sub { $results{$feed} = \@_; $done->end };
    }

    $done->end;
    return $done;
}

my $done   = get_feeds(...);
my $result = $done->recv; # block until all feeds are fetched
HTTP::Async is pretty fast, and the code is quite easy to write.
Actually, AnyEvent::Curl::Multi is a non-blocking library built on top of libcurl. It is very fast, with tons of concurrency available. Much more powerful than AnyEvent::HTTP, IMO.
I used LWP::Parallel::UserAgent for something similar. An example from the POD:
require LWP::Parallel::UserAgent;
my $ua = LWP::Parallel::UserAgent->new();
...
$ua->redirect(0);   # prevents automatic following of redirects
$ua->max_hosts(5);  # sets maximum number of locations accessed in parallel
$ua->max_req(5);    # sets maximum number of parallel requests per host
...
$ua->register($request); # or
$ua->register($request, '/tmp/sss'); # or
$ua->register($request, \&callback, 4096);
...
$ua->wait($timeout);
...
sub callback { my ($data, $response, $protocol) = @_; ... }