I have a script that works great when run manually. However, when I schedule it in Task Scheduler, the script never ends, so the next scheduled run fails because the prior instance is still running. The script itself only takes a few seconds to complete on its first scheduled run or when run manually. Here is the script:
$source = "\\server1\upload"
$destination = "\\server2\upload"
$logfile = "c:\Scripts\fileMover\log.txt"

$table = Get-ChildItem $source -Include *

foreach ($file in $table) {
    $filename = $file.FullName
    #Write-Host $filename
    try
    {
        Move-Item -LiteralPath $filename -Destination $destination -Force
        $body    = "Successfully moved $filename to $destination"
        $subject = "fileMover Succeeded"
    }
    catch
    {
        $body    = "Failed to move $filename to $destination"
        $subject = "fileMover Failed"
    }
    finally
    {
        $body | Out-File $logfile -Append -Width 1000 -Encoding ascii
        Send-MailMessage -To "[email protected]" -From "[email protected]" -Subject $subject -SmtpServer "10.1.10.1" -Body $body
        exit
    }
}
exit
The task is scheduled with the following settings: it needs to run every 2 minutes, and as a workaround I configured it to stop automatically if it runs longer than 1 minute. However, I'm concerned that in certain circumstances, such as a large number of large files, the script may be terminated before it has finished.
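(For context, one common way to keep overlapping runs from piling up is a named mutex; this sketch is not part of the original script, and the name "Global\fileMoverLock" is purely illustrative.)

# Minimal overlap-guard sketch: skip this run if a previous instance still holds the mutex.
$mutex = New-Object System.Threading.Mutex($false, 'Global\fileMoverLock')
if (-not $mutex.WaitOne(0)) {
    # Another instance is still running; exit instead of overlapping with it.
    exit
}
try {
    # ... the existing move/log/mail logic would run here ...
}
finally {
    $mutex.ReleaseMutex()
    $mutex.Dispose()
}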
I had this problem when trying to execute the .ps1 file directly. Execute PowerShell instead and feed it your script as a parameter.
Scheduled Task Actions Tab:
Program\script: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add Arguments (optional): -command & 'E:\PowerShell_scripts\Script_name.ps1'
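If you prefer to create the task from an elevated PowerShell or cmd prompt instead of the GUI, a rough equivalent would be the command below (the task name "fileMover" is illustrative, the 2-minute interval matches the requirement above, and -File with -NoProfile -ExecutionPolicy Bypass is an alternative to the -command form shown in the action settings):

schtasks /Create /TN "fileMover" /SC MINUTE /MO 2 /TR "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -NoProfile -ExecutionPolicy Bypass -File E:\PowerShell_scripts\Script_name.ps1"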