I've written a Dancer web app that uses Net::OpenID::Consumer to consume OpenIDs for authentication. It works well with Google and MyOpenID, but not with Yahoo. When a user tries to authenticate with their Yahoo account, HTML::Parser warns:

Parsing of undecoded UTF-8 will give garbage when decoding entities

and this warning kills my app (rightfully so).
I don't see any existing bug reports against Net::OpenID::Consumer (or Net::OpenID::Common) that relate to this. The HTTP headers and the HTML meta tags both specify UTF-8 for the claimed-id URI. Why would the response not be decoded before being handed to HTML::Parser? Am I missing something obvious?
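For reference, the raw/decoded distinction can be seen without any network access by building a response by hand (a minimal sketch; the "café" body is just an illustration):

```perl
use strict;
use warnings;
use Encode qw(encode);
use HTTP::Response;

# Build a response whose body is UTF-8 octets, like Yahoo's claimed-id page.
my $res = HTTP::Response->new( 200, 'OK' );
$res->header( 'Content-Type' => 'text/html; charset=UTF-8' );
$res->content( encode( 'UTF-8', "caf\x{E9}" ) );    # "café" as raw bytes

# content() returns the raw octets (5 bytes: the e-acute is 2 bytes)...
print length( $res->content ), "\n";            # 5

# ...while decoded_content() honours the declared charset and returns
# Perl characters (4), which HTML::Parser accepts without the warning.
print length( $res->decoded_content ), "\n";    # 4
```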
Here's the relevant code:
get '/openid_landing' => sub {
    my $params = params();
    my $csr    = Net::OpenID::Consumer->new(
        ua              => LWP::UserAgent->new(),
        consumer_secret => $secret,
        params          => $params,
    );
    my $id = $params->{'openid.claimed_id'};

    if ( my $setup_url = $csr->user_setup_url ) {
        redirect $setup_url;
    }
    elsif ( $csr->user_cancel ) {
        redirect uri_for('/');
    }
    elsif ( my $vident = $csr->verified_identity ) {
        # verified identity, log in or register user
        ...
    }
    else {
        die "Error validating identity: " . $csr->err;
    }
};
The bug is in Net/OpenID/URIFetch.pm, lines 122-128 of version 1.14 (the latest). It uses the raw content of the response object instead of the decoded content. Just remove the manual gzip decoding and call the response's decoded_content method instead.
I haven't filed a bug report yet, feel free. :)
Here's a diff you can apply to fix it:
122c122
< my $content = $res->content;
---
> my $content = $res->decoded_content;
126,129d125
< if ($res->content_encoding && $res->content_encoding eq 'gzip') {
< $content = Compress::Zlib::memGunzip($content);
< }
<
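For what it's worth, decoded_content() also undoes gzip Content-Encoding by itself, so dropping the manual memGunzip loses nothing. A quick offline sketch (the HTML string is just a placeholder):

```perl
use strict;
use warnings;
use Compress::Zlib ();
use HTTP::Response;

# A gzip-compressed response body, as a server sending
# Content-Encoding: gzip would produce it.
my $res = HTTP::Response->new( 200, 'OK' );
$res->header( 'Content-Type'     => 'text/html; charset=UTF-8' );
$res->header( 'Content-Encoding' => 'gzip' );
$res->content( Compress::Zlib::memGzip('<html>ok</html>') );

# decoded_content() transparently gunzips (and charset-decodes) the body;
# no manual Compress::Zlib::memGunzip call is needed.
print $res->decoded_content, "\n";    # <html>ok</html>
```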