 

How to curl or wget a web page?

Tags: http, curl

I would like to make a nightly cron job that fetches my stackoverflow page and diffs it from the previous day's page, so I can see a change summary of my questions, answers, ranking, etc.

Unfortunately, I couldn't get the right set of cookies, etc., to make this work. Any ideas?

Also, when the beta is finished, will my status page be accessible without logging in?
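A minimal sketch of what that nightly job could look like. The profile URL, the snapshot directory, and the cookie value are placeholders (the cookie value is the same stand-in used in the answer below); substitute your own.

```shell
#!/bin/sh
# Nightly snapshot-and-diff sketch. URL, directory, and cookie value
# are placeholders -- substitute your own.
URL="https://stackoverflow.com/users/30/myProfile.html"
DIR="$HOME/so-snapshots"
mkdir -p "$DIR"

TODAY="$DIR/$(date +%Y-%m-%d).html"
YESTERDAY="$DIR/$(date -d 'yesterday' +%Y-%m-%d).html"  # GNU date syntax

# Fetch today's copy (prints a note instead of aborting if the site is unreachable).
curl -s --cookie "soba=(LookItUpYourself)" -o "$TODAY" "$URL" || echo "fetch failed"

# Print a change summary against yesterday's snapshot, if one exists.
if [ -f "$YESTERDAY" ]; then
    diff "$YESTERDAY" "$TODAY"
fi
```

Scheduled from cron, an entry along the lines of `0 3 * * * /path/to/so-diff.sh | mail -s "SO changes" you@example.com` would deliver the diff each morning.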

Mark Harrison asked Aug 05 '08 at 20:08.

People also ask

How do I download a web page using curl?

To download a file you use the basic curl command, adding your username and password, like this: curl --user username:password -o filename.tar.gz ftp://domain.com/directory/filename.tar.gz

How do you curl a URL?

The syntax for the curl command is as follows: curl [options] [URL...] In its simplest form, when invoked without any options, curl writes the specified resource to standard output. For example, curl https://example.com prints the source code of the example.com homepage in your terminal window.
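A few common variations of that basic form (example.com is just a safe demonstration host, and each line falls back to a short note if the network is unavailable):

```shell
# Fetch a page and print its source to stdout:
curl https://example.com || echo "fetch failed"

# Save the output to a file instead of printing it (-s hides the progress meter):
curl -s -o page.html https://example.com || echo "fetch failed"

# Ask for the response headers only (a HEAD request):
curl -sI https://example.com || echo "fetch failed"
```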

Should I use curl or wget?

Wget is a simple transfer utility, while curl offers much more. Curl provides the libcurl library, which can be embedded in GUI applications; wget, on the other hand, is a simple command-line utility. Wget also supports fewer protocols than curl.


1 Answer

Your status page is available now without logging in (click logout and try it). When the beta-cookie is disabled, there will be nothing between you and your status page.

For wget:

wget --no-cookies --header "Cookie: soba=(LookItUpYourself)" https://stackoverflow.com/users/30/myProfile.html
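The curl equivalent of that wget line would look something like this (the cookie value is the same placeholder the answer uses, and the trailing fallback just keeps the command from aborting if the site is unreachable):

```shell
# curl sends only the cookies you pass explicitly, so no --no-cookies flag is needed:
curl -s --cookie "soba=(LookItUpYourself)" \
    -o myProfile.html \
    "https://stackoverflow.com/users/30/myProfile.html" || echo "fetch failed"
```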
Grant answered Oct 16 '22 at 05:10.