I know how to use wget to download from FTP, but I couldn't get wget to download from the following link:
http://www.ncbi.nlm.nih.gov/geo/download/?acc=GSE46130&format=file
If you copy and paste it into a browser, it starts to download. But I want to download it directly to our server so I don't need to move it from my desktop to the server. How do I do it?
Thanks!
In general, downloading a file from an HTTP server in the terminal via HTTP GET consists of three steps: build an HTTP GET request for the file's URL, send it to the HTTP server and receive the HTTP response, and save the body of the response to a local file.
Grab a file with curl:
$ curl https://your-domain/file.pdf
Get a file using the FTP or SFTP protocol:
$ curl ftp://ftp-your-domain-name/file.tar.gz
Set the output file name while downloading with the -o option:
$ curl -o file.pdf https://your-domain-name/long-file-name.pdf
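Applied to the GEO link in the question, the same idea works with curl; a minimal sketch, assuming curl is installed on the server (quote the URL so the shell doesn't treat the & as a background operator, and pass -L in case the server redirects):
$ curl -L -o file.tar "http://www.ncbi.nlm.nih.gov/geo/download/?acc=GSE46130&format=file"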
If the file is instead shared from a cloud-storage service, you can right-click it and select "Copy Public Link". This generates a public link for that specific file, but note that such a link usually leads to a page where you can preview the file before downloading it, so it may not work with wget or curl directly.
This is what I did:
wget -O file.tar "http://www.ncbi.nlm.nih.gov/geo/download/?acc=GSE46130&format=file"
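The quotes around the URL are the key detail: without them, the shell treats the & as its background operator, so wget only receives the URL up to ?acc=GSE46130 and the shell then tries to run format=file as a separate command. Once the download finishes, you can unpack it on the server; a minimal sketch, assuming the response really is a tar archive as GEO's format=file implies:
$ tar -xvf file.tar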
Use the -O option with wget to specify where to save the downloaded file, and quote the URL because of the &. For example:
wget -O /path/to/file "http://www.ncbi.nlm.nih.gov/geo/download/?acc=GSE46130&format=file"
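If you would rather let the server pick the filename, wget also has a --content-disposition flag that takes the name from the Content-Disposition response header; a sketch, assuming GEO actually sends that header for this download (you can check with wget --server-response):
$ wget --content-disposition "http://www.ncbi.nlm.nih.gov/geo/download/?acc=GSE46130&format=file"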