Curl fails after following 50 redirects but wget works fine

I have an experimental web crawler and I noticed that it cannot read some pages. On certain domains, curl fails after following 50 redirects, while wget reads the same domain just fine:

curl 'netflix.com' -L -o 'output.txt'

Result:

curl: (47) Maximum (50) redirects followed

No data is written to the output.txt file.

While this command works fine:

wget netflix.com

Any ideas what could cause this? I doubt the remote server treats requests differently based on the two user agents.
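As an aside, the redirect chain itself can be inspected by capping the redirect count and printing the response Location headers; this sketch assumes curl's verbose output format, where response headers are prefixed with "< " on stderr:

curl -sv -L --max-redirs 5 'netflix.com' -o /dev/null 2>&1 | grep -i '^< location'

If the same URLs keep appearing, the server is bouncing the client around in a loop rather than through a finite chain.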

asked Dec 16 '13 by adrianTNT

1 Answer

This is probably because you didn't tell curl to use cookies; curl doesn't handle them unless you ask it to, while wget enables them by default.

Use the --cookie or --cookie-jar options to enable cookies.
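For example, a minimal sketch of the fixed command (cookies.txt is just an arbitrary file name for the cookie store):

curl 'netflix.com' -L -o 'output.txt' --cookie-jar cookies.txt

Passing --cookie-jar activates curl's cookie engine, so cookies set by one hop in the redirect chain are sent on the following requests within the same invocation, which is what breaks this kind of cookie-checking redirect loop.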

answered Sep 23 '22 by Daniel Stenberg