I'm required to use wget, because of its ability to work like a crawler, to develop one for my project. But all over my Google searches I see people recommending LWP instead of wget. Can you enlighten me on why this is so?
If you're writing Perl and need to make an HTTP request, you should generally use LWP. It's silly to shell out to do something that is easily supported within the Perl process.
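For instance, a basic GET with LWP::UserAgent looks something like this minimal sketch (the URL is just a placeholder):

    use strict;
    use warnings;
    use LWP::UserAgent;

    # Create a user agent with a sensible timeout.
    my $ua = LWP::UserAgent->new( timeout => 10 );

    # Fetch a page; the URL here is only an example.
    my $response = $ua->get('https://example.com/');

    if ( $response->is_success ) {
        print $response->decoded_content;
    }
    else {
        die "Request failed: ", $response->status_line, "\n";
    }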
If you want to do something more complex, like recursive web crawling, you may want to look at the WWW::Mechanize or Mojolicious modules available from CPAN. But at that point it might also be reasonable to shell out and take advantage of an external tool that already has recursive web-crawling capability.
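As a rough illustration of the WWW::Mechanize route, a minimal one-level crawl might look like the sketch below; the start URL and the same-host restriction are assumptions made for the example, not requirements of the module:

    use strict;
    use warnings;
    use WWW::Mechanize;
    use URI;

    # Hypothetical starting point for the crawl.
    my $start = 'https://example.com/';
    my $host  = URI->new($start)->host;

    my $mech = WWW::Mechanize->new( autocheck => 0 );
    $mech->get($start);
    die "Could not fetch $start\n" unless $mech->success;

    # Collect links from the first page, restricted to the same host.
    my %seen = ( $start => 1 );
    for my $link ( $mech->links ) {
        my $url = $link->url_abs;
        next unless $url->scheme =~ /^https?$/;
        next unless $url->host eq $host;
        next if $seen{ $url->as_string }++;
        print "Found: $url\n";
        # A real crawler would fetch $url here and recurse.
    }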
If you're writing a shell script rather than a Perl program, then you have no choice but to use an external tool. The choice among wget, curl, and the LWP scripts (lwp-request, GET, etc.) really comes down to what's easiest for your use case. They all have approximately the same features, but some things are easier in one tool than in the others. Use whatever is readily available on your system; there's usually more than one option, in which case you should give them all a try: read the docs, try a few use cases, and see which one you like.
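For a plain single-page fetch the three are roughly interchangeable; something like the following (the URL is only a placeholder) would save or print the page with each tool:

    # Fetch one page with each tool; https://example.com/ is a placeholder.
    wget https://example.com/                  # saves to index.html by default
    curl -o index.html https://example.com/    # -o writes the body to a file
    lwp-request https://example.com/ > index.html   # prints the body to stdout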