Is there a Windows command to copy or download files from an HTTP URL to the filesystem? I've tried copy, xcopy, and robocopy, but they don't seem to support HTTP URLs.
I am not familiar with any commands on Windows that can do that, but I always download GNU wget on Windows for these and similar purposes.
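A typical invocation looks like this (the URL is just a placeholder):
wget -O homepage.html http://www.example.com/
This fetches the page and writes it to homepage.html; without -O, wget derives the file name from the URL.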
cURL comes to mind.
curl -o homepage.html http://www.apptranslator.com/
This command downloads the page and stores it in the file homepage.html. Thousands of options are available.
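If the URL ends in a file name, you can also keep the remote name and follow redirects (example URL only):
curl -L -O http://www.example.com/files/archive.zip
Here -O saves the file under its remote name and -L follows HTTP redirects.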
You can use a PowerShell script to accomplish this.
Get-Web http://www.msn.com/ -toFile www.msn.com.html
function Get-Web($url,
    [switch]$self,
    $credential,
    $toFile,
    [switch]$bytes)
{
    #.Synopsis
    #    Downloads a file from the web
    #.Description
    #    Uses System.Net.WebClient (not the browser) to download data
    #    from the web.
    #.Parameter self
    #    Uses the default credentials when downloading that page (for downloading intranet pages)
    #.Parameter credential
    #    The credentials to use to download the web data
    #.Parameter url
    #    The page to download (e.g. www.msn.com)
    #.Parameter toFile
    #    The file to save the web data to
    #.Parameter bytes
    #    Download the data as bytes
    #.Example
    #    # Downloads www.live.com and outputs it as a string
    #    Get-Web http://www.live.com/
    #.Example
    #    # Downloads www.msn.com and saves it to a file
    #    Get-Web http://www.msn.com/ -toFile www.msn.com.html
    $webClient = New-Object Net.WebClient
    if ($credential) {
        # WebClient exposes a Credentials property (plural)
        $webClient.Credentials = $credential
    }
    if ($self) {
        $webClient.UseDefaultCredentials = $true
    }
    if ($toFile) {
        # Turn a relative file name into a full path, since WebClient resolves
        # relative paths against the process directory, not $pwd
        if (-not "$toFile".Contains(":")) {
            $toFile = Join-Path $pwd $toFile
        }
        $webClient.DownloadFile($url, $toFile)
    } else {
        if ($bytes) {
            $webClient.DownloadData($url)
        } else {
            $webClient.DownloadString($url)
        }
    }
}
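If the target page requires authentication, the -credential parameter can be combined with Get-Credential (the intranet URL below is only a placeholder):
$cred = Get-Credential
Get-Web http://intranet.example.com/report.html -credential $cred -toFile report.html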
Source: http://blogs.msdn.com/mediaandmicrocode/archive/2008/12/01/microcode-powershell-scripting-tricks-scripting-the-web-part-1-get-web.aspx