$ftpServer = "ftp.example.com"
$username = "validUser"
$password = "myPassword"
$localToFTPPath = "C:\ToFTP"
$localFromFTPPath = "C:\FromFTP"
$remotePickupDir = "/Inbox"
$remoteDropDir = "/Outbox"
$SSLMode = [AlexPilotti.FTPS.Client.ESSLSupportMode]::ClearText
$ftp = new-object "AlexPilotti.FTPS.Client.FTPSClient"
$cred = New-Object System.Net.NetworkCredential($username,$password)
$ftp.Connect($ftpServer,$cred,$SSLMode) #Connect
$ftp.SetCurrentDirectory($remotePickupDir)
$ftp.GetFiles($localFromFTPPath, $false) #Get Files
This is the script I was given for importing files from an FTP server.
However, I am not sure what remotePickupDir is, and whether this script is correct.
The AlexFTPS library used in the question seems to be dead (it has not been updated since 2011).
You can try to implement this without any external library. But unfortunately, neither the .NET Framework nor PowerShell have any explicit support for downloading all files in a directory (let alone recursive file downloads).
You have to implement that yourself:
The tricky part is to tell files from subdirectories. There's no way to do that in a portable way with the .NET Framework (FtpWebRequest or WebClient). The .NET Framework unfortunately does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
- If you know that the remote directory contains only files, use the ListDirectory method (NLST FTP command) and simply download all the "names" as files (a flat sketch of this approach follows the recursive example below).
- Use a long directory listing (LIST command = ListDirectoryDetails method) and try to parse the server-specific listing. Many FTP servers use *nix-style listings, where you identify a directory by the d at the very beginning of the entry. But many servers use a different format. The following example uses this approach (assuming the *nix format):
function DownloadFtpDirectory($url, $credentials, $localPath)
{
    # Retrieve the long directory listing (LIST) of the remote directory
    $listRequest = [Net.WebRequest]::Create($url)
    $listRequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
    $listRequest.Credentials = $credentials

    $lines = New-Object System.Collections.ArrayList

    $listResponse = $listRequest.GetResponse()
    $listStream = $listResponse.GetResponseStream()
    $listReader = New-Object System.IO.StreamReader($listStream)
    while (!$listReader.EndOfStream)
    {
        $line = $listReader.ReadLine()
        $lines.Add($line) | Out-Null
    }
    $listReader.Dispose()
    $listStream.Dispose()
    $listResponse.Dispose()

    foreach ($line in $lines)
    {
        # Parse a *nix-style listing entry: permissions are the first token,
        # the name is the ninth
        $tokens = $line.Split(" ", 9, [StringSplitOptions]::RemoveEmptyEntries)
        $name = $tokens[8]
        $permissions = $tokens[0]

        $localFilePath = Join-Path $localPath $name
        $fileUrl = ($url + $name)

        if ($permissions[0] -eq 'd')
        {
            # Directory entry: create the local directory and recurse into it
            if (($name -ne ".") -and ($name -ne ".."))
            {
                if (!(Test-Path $localFilePath -PathType container))
                {
                    Write-Host "Creating directory $localFilePath"
                    New-Item $localFilePath -Type directory | Out-Null
                }

                DownloadFtpDirectory ($fileUrl + "/") $credentials $localFilePath
            }
        }
        else
        {
            # File entry: download it
            Write-Host "Downloading $fileUrl to $localFilePath"

            $downloadRequest = [Net.WebRequest]::Create($fileUrl)
            $downloadRequest.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
            $downloadRequest.Credentials = $credentials

            $downloadResponse = $downloadRequest.GetResponse()
            $sourceStream = $downloadResponse.GetResponseStream()
            $targetStream = [System.IO.File]::Create($localFilePath)
            $buffer = New-Object byte[] 10240
            while (($read = $sourceStream.Read($buffer, 0, $buffer.Length)) -gt 0)
            {
                $targetStream.Write($buffer, 0, $read)
            }
            $targetStream.Dispose()
            $sourceStream.Dispose()
            $downloadResponse.Dispose()
        }
    }
}
Use the function like:
$credentials = New-Object System.Net.NetworkCredential("user", "mypassword")
$url = "ftp://ftp.example.com/directory/to/download/"
DownloadFtpDirectory $url $credentials "C:\target\directory"
The code is translated from my C# example in C# Download all files and subdirectories through FTP.
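For comparison, the first option above (a flat NLST listing) might look like the following sketch. It assumes the remote directory contains only files, so every listed name is downloaded as a file; the function name DownloadFtpDirectoryFlat is just for illustration.
# A sketch of the flat approach, assuming the remote directory contains only
# files (no subdirectories); $url is expected to end with a slash.
function DownloadFtpDirectoryFlat($url, $credentials, $localPath)
{
    # List plain names only (NLST)
    $listRequest = [Net.WebRequest]::Create($url)
    $listRequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
    $listRequest.Credentials = $credentials

    $names = New-Object System.Collections.ArrayList
    $listResponse = $listRequest.GetResponse()
    $listReader = New-Object System.IO.StreamReader($listResponse.GetResponseStream())
    while (!$listReader.EndOfStream)
    {
        $names.Add($listReader.ReadLine()) | Out-Null
    }
    $listReader.Dispose()
    $listResponse.Dispose()

    # Download every listed name as a file
    $webClient = New-Object System.Net.WebClient
    $webClient.Credentials = $credentials
    foreach ($name in $names)
    {
        $webClient.DownloadFile(($url + $name), (Join-Path $localPath $name))
    }
}
It can be called the same way as DownloadFtpDirectory above.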
If you want to avoid the trouble of parsing server-specific directory listing formats, use a third-party library that supports the MLSD command and/or parses various LIST listing formats, ideally one that also supports downloading all files from a directory, or even recursive downloads.
For example, with the WinSCP .NET assembly, you can download a whole directory with a single call to Session.GetFiles:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::Ftp
HostName = "ftp.example.com"
UserName = "user"
Password = "mypassword"
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Download files
$session.GetFiles("/directory/to/download/*", "C:\target\directory\*").Check()
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
The Session.GetFiles method is recursive by default.
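If you only need a subset of the files, Session.GetFiles can also take a WinSCP.TransferOptions whose FileMask restricts what is transferred; the *.txt mask below is just an example:
# Download only *.txt files (still recursive); the mask value is an example
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.FileMask = "*.txt"

$session.GetFiles(
    "/directory/to/download/*", "C:\target\directory\*",
    $False, $transferOptions).Check()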
(I'm the author of WinSCP)
Here is full working code to download all files (matching a wildcard or file extension) from the FTP site to a local directory. Set the variable values first.
#FTP Server Information - SET VARIABLES
$ftp = "ftp://XXX.com/"
$user = 'UserName'
$pass = 'Password'
$folder = 'FTP_Folder'
$target = "C:\Folder\Folder1\"
#SET CREDENTIALS
$credentials = new-object System.Net.NetworkCredential($user, $pass)
#FUNCTION TO LIST THE FILE NAMES IN AN FTP DIRECTORY (NLST)
function Get-FtpDir ($url, $credentials) {
    $request = [Net.WebRequest]::Create($url)
    $request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
    if ($credentials) { $request.Credentials = $credentials }

    $response = $request.GetResponse()
    $reader = New-Object IO.StreamReader $response.GetResponseStream()

    #OUTPUT EACH NAME RETURNED BY THE SERVER
    while (-not $reader.EndOfStream) {
        $reader.ReadLine()
    }
    #$reader.ReadToEnd()

    $reader.Close()
    $response.Close()
}
#SET FOLDER PATH
$folderPath = $ftp + "/" + $folder + "/"

$files = Get-FtpDir -url $folderPath -credentials $credentials
$files

$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user, $pass)

#DOWNLOAD THE MATCHING FILES (*.txt HERE)
$counter = 0
foreach ($file in ($files | where {$_ -like "*.txt"})) {
    $source = $folderPath + $file
    $destination = $target + $file
    $webclient.DownloadFile($source, $destination)

    #PRINT FILE NAME AND COUNTER
    $counter++
    $counter
    $source
}
The remote pickup directory path should be the exact path on the FTP server you are trying to access. Here is a script to download files from the server; you can add or modify the SSL mode as needed.
#ftp server
$ftp = "ftp://example.com/"
$user = "XX"
$pass = "XXX"
$SetType = "bin"
# List the local folder; files with the same names are then downloaded from the FTP server
$remotePickupDir = Get-ChildItem 'c:\test' -recurse

$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user, $pass)

foreach ($item in $remotePickupDir) {
    $uri = New-Object System.Uri($ftp + $item.Name)
    #$webclient.UploadFile($uri, $item.FullName)
    $webclient.DownloadFile($uri, $item.FullName)
}
The remotePickupDir would be the folder you want to read from on the FTP server. As for "is this script correct": does it work? If it works, then it's correct. If it does not work, tell us what error message or unexpected behaviour you're getting and we'll be better able to help you.
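For example, if the files you need sit in /Inbox/Reports on the server (a made-up path), the calls from the question would become:
$remotePickupDir = "/Inbox/Reports"        # example path on the FTP server
$ftp.SetCurrentDirectory($remotePickupDir) # change to that remote folder
$ftp.GetFiles($localFromFTPPath, $false)   # download its files to the local folder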
Based on Why does FtpWebRequest download files from the root directory? Can this cause a 553 error?, I wrote a PowerShell script that downloads a file from an FTP server via explicit FTP over TLS:
# Config
$Username = "USERNAME"
$Password = "PASSWORD"
$LocalFile = "C:\PATH_TO_DIR\FILNAME.EXT"
#e.g. "C:\temp\somefile.txt"
$RemoteFile = "ftp://PATH_TO_REMOTE_FILE"
#e.g. "ftp://ftp.server.com/home/some/path/somefile.txt"
try {
    # Create an FtpWebRequest
    $FTPRequest = [System.Net.FtpWebRequest]::Create($RemoteFile)
    $FTPRequest.Credentials = New-Object System.Net.NetworkCredential($Username, $Password)
    $FTPRequest.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
    $FTPRequest.UseBinary = $true
    $FTPRequest.KeepAlive = $false
    $FTPRequest.EnableSsl = $true

    # Send the FTP request
    $FTPResponse = $FTPRequest.GetResponse()

    # Get a download stream from the server response
    $ResponseStream = $FTPResponse.GetResponseStream()

    # Create the target file on the local system and the download buffer
    $LocalFileFile = New-Object IO.FileStream ($LocalFile, [IO.FileMode]::Create)
    [byte[]]$ReadBuffer = New-Object byte[] 1024

    # Loop through the download
    do {
        $ReadLength = $ResponseStream.Read($ReadBuffer, 0, 1024)
        $LocalFileFile.Write($ReadBuffer, 0, $ReadLength)
    }
    while ($ReadLength -ne 0)

    # Close the file, the download stream and the response
    $LocalFileFile.Close()
    $ResponseStream.Close()
    $FTPResponse.Close()
}
catch [Exception] {
    $Request = $_.Exception
    Write-Host "Exception caught: $Request"
}
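One note: with EnableSsl set, GetResponse() also validates the server's TLS certificate, so the request fails if the machine does not trust it (e.g. a self-signed certificate). For testing only, validation can be bypassed before the request is created; this disables certificate checking entirely and should not be used in production:
# Testing only: accept any server certificate (disables TLS certificate validation)
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }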