Currently I have a one line batch file to back up a file. I run it manually when I need to back up a file. The only thing I would like to add to it is the current date. Here is what I have:
xcopy /W /Y ACTIVE.DB ACTIVE.DB.BACKUP
The destination file should simply be ACTIVE.DB.BACKUP.YYYYMMDD. How would I go about creating a script that I can double-click in Windows Explorer to make the xcopy happen?
You can run the script in Windows by right-clicking the script file and choosing "Run with PowerShell". You will then see a message telling you to connect your external drive as disk F. Press any key, and the script makes the backup; afterwards you will see a report of how many files were copied.
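A minimal sketch of what such a script might look like (the F: drive letter and both paths are assumptions, not taken from the original script):

# Sketch: prompt for the external drive, copy, then report the count (paths assumed)
Write-Host 'Please connect your external drive as disk F:, then press any key to start the backup.'
$null = $Host.UI.RawUI.ReadKey('NoEcho,IncludeKeyDown')
$copied = Copy-Item 'C:\Data\*' 'F:\Backup\' -Recurse -Force -PassThru
Write-Host "$($copied.Count) file(s) copied."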
Just to point out that you can do this with Copy-Item e.g.:
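# $path is assumed to be the folder that contains ACTIVE.DB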
Set-Location $path
Copy-Item ACTIVE.DB "ACTIVE.DB.$(get-date -f yyyyMMdd)" -Force -Confirm
If you're going for robustness, I'd use robocopy.exe.
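For example, a minimal robocopy sketch (the source and destination paths are assumptions); /R and /W limit the retries if the file is locked:

# Copy ACTIVE.DB into a date-stamped backup folder, retrying 3 times at 5-second intervals
$stamp = Get-Date -Format 'yyyyMMdd'
robocopy 'C:\Data' "C:\Backups\$stamp" 'ACTIVE.DB' /R:3 /W:5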
You can customize the file name by embedding a formatted [datetime]::now in it in PowerShell, like so:
xcopy /W /Y ACTIVE.DB "ACTIVE.DB.BACKUP.$([datetime]::now.ToString('yyyy-MM-dd'))"
If the line feels busy and unmaintainable, you can refactor it to multiple lines:
$now = [datetime]::now.ToString('yyyy-MM-dd')
xcopy /W /Y ACTIVE.DB "ACTIVE.DB.BACKUP.$now"
To get double-click execution, I usually make a batch file that runs the PowerShell command as described here:
Set up PowerShell Script for Automatic Execution
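For instance, the wrapper can be generated from PowerShell itself; this is only a sketch, and the file names and path are assumptions:

# Write a double-clickable .bat wrapper that runs backup.ps1 from the same folder
@'
@echo off
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "%~dp0backup.ps1"
pause
'@ | Set-Content -Path 'C:\Scripts\RunBackup.bat' -Encoding Ascii

Double-clicking RunBackup.bat in Explorer then runs the PowerShell script even when script execution is otherwise restricted.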
I just made a Daily/Weekly/Monthly/Quarterly/Yearly (DWMQY) backup script in PowerShell this month, and I hope it helps.
This DWMQY backup scenario zips a source folder into a date-named zip file, then keeps: the last 7 daily zips, the last 4 weekly zips (Fridays), the last 6 monthly zips (the last Friday of each month), the last 4 quarterly zips (quarter-end months: March, June, September, December), and the last 2 yearly zips (December).
Run as a daily scheduled task, it puts the zips into a target folder that is also a local Microsoft OneDrive folder, so the zips are also synced to the OneDrive server. Outdated zips (non-Friday dailies, or those that are no longer the last of their D/W/M/Q/Y period) are moved to a folder that is not synced.
As an example, it's March 5, 2016 today, so the target folder should hold the zips described above: at most 23 of them (fewer in practice, since some zips satisfy several of the DWMQY rules at once). Our files are 250 text documents, about 0.4 GB after zipping, so the total is at most 23 * 0.4 = 9.2 GB, which is under OneDrive's free 15 GB quota.
For larger source data, 7-Zip can be used; it supports archives up to about 16 EB (16 million TB). I haven't tried backing up folders directly instead of zipping them, but I'd guess the procedure transfers from the current zip-based approach.
# Note: the script uses four paths:
# 1. Source path: the data to be backed up.
# 2. Target path: where the current zips are stored; also the local half of the remote-sync (OneDrive) pair.
# 3. Moved-to path: outdated zips are moved to this non-synced location.
# 4. Temp path: the source files are copied here first, so that compression does not fail when the originals are locked by another process.
# Function declaration
. C:\Source\zipSaveDated\Functions.ps1
# <1> Zip data
$sourcePath = '\\remoteMachine1\c$\SourceDocs\*'
$TempStorage = 'C:\Source\TempStorage'
$enddate = (Get-Date).tostring("yyyyMMdd")
$zipFilename = '\\remoteMachine2\d$\DailyBackupRemote\OneDrive\DailyBackupRemote_OneDrive\' + $enddate + '_CompanyDoc.zip'
# Stage a fresh copy of the source in the temp folder, then zip it
Remove-Item ($TempStorage + '\*') -Recurse -Force
Copy-Item $sourcePath $TempStorage -Recurse -Force
Add-Type -AssemblyName System.IO.Compression.FileSystem
[IO.Compression.ZipFile]::CreateFromDirectory($TempStorage, $zipFilename)
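# Side note: on PowerShell 5.0 or newer, the same zip step can be done with the
# built-in Compress-Archive cmdlet instead of the .NET ZipFile class, e.g.:
#   Compress-Archive -Path "$TempStorage\*" -DestinationPath $zipFilename -Force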
# <2> Move old files
$SourceDir = "\\remoteMachine2\d$\DailyBackupRemote\OneDrive\DailyBackupRemote_OneDrive"
$DestinationDir = "\\remoteMachine2\d$\DailyBackupRemote\bak" # to store files moved out of the working folder (OneDrive)
$KeepDays = 7
$KeepWeeks = 4
$KeepMonths = 6
$KeepQuarters = 4
$KeepYears = 2
# <2.1>: Loop files
$Directory = $DestinationDir
if (!(Test-Path $Directory))
{
    New-Item $Directory -ItemType Directory -Force
}
# Daily removal: move dailies older than $KeepDays days, but keep Fridays (the weekly copies)
$files = Get-ChildItem $SourceDir *.*
foreach ($file in $files)
{
    If ($file.LastWriteTime -lt (Get-Date).AddDays(-$KeepDays).Date `
        -and $file.LastWriteTime.DayOfWeek -NotMatch "Friday")
    {
        Move-Item $file.FullName $Directory -Force
    }
}

# Weekly removal: move weeklies older than $KeepWeeks weeks, but keep the last Friday of each month (the monthly copies)
$files = Get-ChildItem $SourceDir *.*
foreach ($file in $files)
{
    If ($file.LastWriteTime -lt (Get-Date).AddDays(-$KeepWeeks * 7).Date `
        -and (Get-LastFridayOfMonth ($file.LastWriteTime)).Date.ToString("yyyyMMdd") -NotMatch $file.LastWriteTime.Date.ToString("yyyyMMdd"))
    {
        Move-Item $file.FullName $Directory -Force
    }
}

# Monthly removal: move monthlies more than $KeepMonths months old, but keep quarter-end months (the quarterly copies)
$files = Get-ChildItem $SourceDir *.*
foreach ($file in $files)
{
    If ($file.LastWriteTime.Month -lt ((Get-Date).Year - $file.LastWriteTime.Year) * 12 + (Get-Date).Month - $KeepMonths `
        -and $file.LastWriteTime.Month -NotIn 3, 6, 9, 12)
    {
        Move-Item $file.FullName $Directory -Force
    }
}

# Quarterly removal: move quarterlies more than $KeepQuarters quarters old, but keep December (the yearly copies)
$files = Get-ChildItem $SourceDir *.*
foreach ($file in $files)
{
    If ($file.LastWriteTime.Month -lt ((Get-Date).Year - $file.LastWriteTime.Year) * 12 + (Get-Date).Month - $KeepQuarters * 3 `
        -and $file.LastWriteTime.Month -NotIn 12)
    {
        Move-Item $file.FullName $Directory -Force
    }
}

# Yearly removal: move yearlies more than $KeepYears years old
$files = Get-ChildItem $SourceDir *.*
foreach ($file in $files)
{
    If ($file.LastWriteTime.Year -lt (Get-Date).Year - $KeepYears)
    {
        Move-Item $file.FullName $Directory -Force
    }
}
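Since the script is meant to run as a daily scheduled task, it can be registered like this (a sketch; the task name, run time, and script path are assumptions):

# Register the backup script as a daily task (ScheduledTasks module, Windows 8/Server 2012 or later)
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File C:\Source\zipSaveDated\Backup.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'DWMQY-Backup' -Action $action -Trigger $trigger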
<Functions.ps1>
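# Simple multiplication helper (not used by the backup script above)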
function Get-TimesResult3
{
Param ([int]$a,[int]$b)
$c = $a * $b
Write-Output $c
}
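# Returns every date in the given month that falls on the given weekdays
# (defaults: current month and year, Monday through Friday)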
function Get-Weekday {
param(
$Month = $(Get-Date -format 'MM'),
$Year = $(Get-Date -format 'yyyy'),
$Days = 1..5
)
$MaxDays = [System.DateTime]::DaysInMonth($Year, $Month)
1..$MaxDays | ForEach-Object {
Get-Date -day $_ -Month $Month -Year $Year |
Where-Object { $Days -contains $_.DayOfWeek }
}
}
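# Example: list all Fridays of March 2016
#   Get-Weekday -Month 3 -Year 2016 -Days 5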
function Get-LastFridayOfMonth([DateTime] $d) {
    $lastDay = New-Object DateTime($d.Year, $d.Month, [DateTime]::DaysInMonth($d.Year, $d.Month))
    # Walk back from the last day of the month to the nearest Friday (0-6 days),
    # so a month that ends on a Friday returns that Friday itself
    $offset = (([int] $lastDay.DayOfWeek) - ([int] [DayOfWeek]::Friday) + 7) % 7
    return $lastDay.AddDays(-$offset)
}
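# Example: Get-LastFridayOfMonth (Get-Date '2016-03-05') returns Friday, March 25, 2016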