I would like to make a nightly cron job that fetches my stackoverflow page and diffs it from the previous day's page, so I can see a change summary of my questions, answers, ranking, etc.
Unfortunately, I couldn't get the right set of cookies, etc., to make this work. Any ideas?
Also, when the beta is finished, will my status page be accessible without logging in?
To download, you just need the basic curl command with your username and password added, like this:

curl --user username:password -o filename.tar.gz ftp://domain.com/directory/filename.tar.gz
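For the Stack Overflow page itself, the beta gate is a login cookie rather than a username and password, and curl can send a cookie directly with its --cookie option. A minimal sketch (the soba value is a placeholder for whatever your browser has stored, and the user URL is just an example):

curl --cookie "soba=(LookItUpYourself)" -o myProfile.html https://stackoverflow.com/users/30/myProfile.html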
The syntax for the curl command is as follows:

curl [options] [URL...]

In its simplest form, when invoked without any options, curl writes the specified resource to standard output.
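For example, this prints the source code of the example.com homepage in your terminal window:

curl https://example.com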
As for the differences between wget and curl: wget is a simple command-line transfer utility, while curl offers much more. Curl also provides the libcurl library, which can be embedded in GUI applications, and it supports more protocols than wget does.
Your status page is available now without logging in (click logout and try it). When the beta cookie is disabled, there will be nothing between you and your status page.
For wget:
wget --no-cookies --header "Cookie: soba=(LookItUpYourself)" https://stackoverflow.com/users/30/myProfile.html
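To turn that fetch into the nightly change summary the question asks for, here is a minimal sketch (the soba cookie value, paths, and schedule are all placeholders; it assumes wget is installed and ~/so-diff already exists):

#!/bin/sh
# so-diff.sh -- fetch today's profile page and diff it against yesterday's.
SOBA="(LookItUpYourself)"   # your beta login cookie value, copied from your browser
DIR="$HOME/so-diff"         # working directory, assumed to exist
URL="https://stackoverflow.com/users/30/myProfile.html"

# Fetch today's copy of the profile page.
wget --quiet --no-cookies --header "Cookie: soba=$SOBA" -O "$DIR/today.html" "$URL"

# If we already have yesterday's copy, print a change summary.
if [ -f "$DIR/yesterday.html" ]; then
    diff -u "$DIR/yesterday.html" "$DIR/today.html"
fi

# Rotate: today's page becomes tomorrow's baseline.
mv "$DIR/today.html" "$DIR/yesterday.html"

A crontab entry then runs it every night at 23:55 (adjust the path to wherever you save the script); since cron mails a job's output to you by default, the diff lands in your inbox:

55 23 * * * /home/you/bin/so-diff.sh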