 

What's the fastest way to make concurrent web requests in Perl? [closed]

I need to make some concurrent XML feed requests in Perl. What is the fastest way to do this?

asked Feb 24 '09 by git-noob


4 Answers

I would probably use AnyEvent, perhaps like this:

use AnyEvent;
use AnyEvent::HTTP;


sub get_feeds {
    my @feeds = @_;
    my $done = AnyEvent->condvar;
    my %results;

    # The outer begin holds the condvar open until every request has been
    # registered; its callback fires once all matching end calls have run.
    $done->begin( sub { $done->send(\%results) } );

    for my $feed (@feeds) {
        $done->begin;                 # one begin per outstanding request
        http_get $feed, sub { $results{$feed} = \@_; $done->end };
    }

    $done->end;                       # balances the outer begin
    return $done;
}

my $done = get_feeds(...);
my $result = $done->recv; # block until all feeds are fetched
answered Sep 24 '22 by jrockway


HTTP::Async is pretty fast and quite easy to code with.
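
For reference, a minimal sketch following the pattern in the HTTP::Async SYNOPSIS; the feed URLs and the slots setting are placeholders, not from the original answer:

use strict;
use warnings;
use HTTP::Async;
use HTTP::Request;

# hypothetical feed URLs, for illustration only
my @feed_urls = (
    'http://example.com/feed1.xml',
    'http://example.com/feed2.xml',
);

my $async = HTTP::Async->new( slots => 10 );   # up to 10 requests in flight

# queue all the requests up front...
$async->add( HTTP::Request->new( GET => $_ ) ) for @feed_urls;

# ...then handle each HTTP::Response as it completes
while ( my $response = $async->wait_for_next_response ) {
    print $response->status_line, "\n";
    # parse $response->decoded_content as XML here
}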

answered Sep 24 '22 by EvdB


AnyEvent::Curl::Multi is a non-blocking client library built on top of libcurl. It is very fast and handles a large number of concurrent transfers, and in my opinion it is considerably more powerful than AnyEvent::HTTP.
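
A rough sketch of how that might look, using the module's condvar-style interface (max_concurrency, request, and the per-request handle's cv); the feed URLs are placeholders and the exact signatures should be checked against the POD:

use strict;
use warnings;
use AnyEvent::Curl::Multi;
use HTTP::Request;

# hypothetical feed URLs, for illustration only
my @feeds = (
    'http://example.com/feed1.xml',
    'http://example.com/feed2.xml',
);

my $client = AnyEvent::Curl::Multi->new;
$client->max_concurrency(10);   # cap the number of simultaneous transfers

# request() queues a transfer and returns a handle; the handle's condvar
# delivers the HTTP::Response (and transfer stats) when it finishes
my %handle_for =
    map { $_ => $client->request( HTTP::Request->new( GET => $_ ) ) } @feeds;

for my $feed (@feeds) {
    my ( $response, $stats ) = $handle_for{$feed}->cv->recv;   # croaks on error
    print "$feed: ", $response->code, "\n";
}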

answered Sep 21 '22 by Jay Janssen


I used LWP::Parallel::UserAgent for something similar. An example from the POD:

require LWP::Parallel::UserAgent;
$ua = LWP::Parallel::UserAgent->new();
...

$ua->redirect (0); # prevents automatic following of redirects
$ua->max_hosts(5); # sets maximum number of locations accessed in parallel
$ua->max_req  (5); # sets maximum number of parallel requests per host
...
$ua->register ($request); # or
$ua->register ($request, '/tmp/sss'); # or
$ua->register ($request, \&callback, 4096);
...
$ua->wait ( $timeout ); 
...
sub callback { my($data, $response, $protocol) = @_; .... }
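
Filling in the elided parts, here is a self-contained sketch along the same lines; the feed URLs and the 30-second timeout are placeholders, and the entries returned by wait() expose their result via a response method as shown in the module's POD:

use strict;
use warnings;
use LWP::Parallel::UserAgent;
use HTTP::Request;

# hypothetical feed URLs, for illustration only
my @feeds = (
    'http://example.com/feed1.xml',
    'http://example.com/feed2.xml',
);

my $pua = LWP::Parallel::UserAgent->new;
$pua->redirect(0);     # do not follow redirects automatically
$pua->max_hosts(5);    # maximum locations accessed in parallel
$pua->max_req(5);      # maximum parallel requests per host

$pua->register( HTTP::Request->new( GET => $_ ) ) for @feeds;

# block until every request has finished (or 30 seconds elapse)
my $entries = $pua->wait(30);

for my $entry ( values %$entries ) {
    my $res = $entry->response;   # an HTTP::Response
    print $res->request->url, ": ", $res->code, "\n";
}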
answered Sep 23 '22 by gpojd