I use WWW::Mechanize::Shell to test stuff. Since I didn't manage to sign in to a web site I want to scrape, I thought I would reuse the browser's cookie (Chrome or Firefox) for that specific site via the 'cookie' command WWW::Mechanize::Shell has.
The problem is that browsers usually store all their cookies in a single file, which is not what I want. How can I get the cookie for only this specific site?
thanks,
Why isn't storing cookies in a file good?
Since WWW::Mechanize is built on top of LWP::UserAgent, you handle cookies just like you do in LWP::UserAgent. You can make the cookie jar a file or an in-memory hash.
If you don't want to save the cookies in a file, use an empty hash reference when you construct the mech object:
use WWW::Mechanize;
my $mech = WWW::Mechanize->new( cookie_jar => {} );
If you want to use a new file, make a new HTTP::Cookies object:
use WWW::Mechanize;
use HTTP::Cookies;
my $mech = WWW::Mechanize->new(
    # autosave => 1 writes the cookies back to the file when the jar goes away
    cookie_jar => HTTP::Cookies->new( file => "$ENV{HOME}/.cookies.txt", autosave => 1 )
);
If you want to load a browser specific cookies file, use the right module for it:
use WWW::Mechanize;
use HTTP::Cookies::Netscape;
my $mech = WWW::Mechanize->new(
    cookie_jar => HTTP::Cookies::Netscape->new( file => $filename )
);
If you want no cookies at all, use undef explicitly:
use WWW::Mechanize;
my $mech = WWW::Mechanize->new( cookie_jar => undef );
All of this is in the docs.
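To address the original question of isolating one site's cookies: one approach (a sketch, not tested against your setup; the filename and domain below are placeholders) is to load the browser's Netscape-format cookie file, then use the jar's `scan` method to copy only the matching domain's entries into a fresh in-memory jar that you hand to the mech object:

```perl
use strict;
use warnings;
use HTTP::Cookies;
use HTTP::Cookies::Netscape;
use WWW::Mechanize;

# Load the full browser cookie file (path is a placeholder).
my $browser_jar = HTTP::Cookies::Netscape->new( file => 'cookies.txt' );

# Fresh in-memory jar that will hold only the target site's cookies.
my $site_jar = HTTP::Cookies->new;

$browser_jar->scan( sub {
    my ( $version, $key, $val, $path, $domain, $port,
         $path_spec, $secure, $expires, $discard, $rest ) = @_;

    # Keep only cookies for the site we care about (domain is a placeholder).
    return unless $domain =~ /\bexample\.com$/;

    # Note: scan() reports an absolute expiry time, while set_cookie()
    # expects a max-age in seconds from now, so convert (if defined).
    my $maxage = defined $expires ? $expires - time : undef;
    $site_jar->set_cookie( $version, $key, $val, $path, $domain, $port,
                           $path_spec, $secure, $maxage, $discard, $rest );
} );

my $mech = WWW::Mechanize->new( cookie_jar => $site_jar );
```

Since `$site_jar` is in-memory only, nothing is written back to the browser's cookie file, and the mech object never sees cookies for any other site.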