I'm trying to write a PowerShell function that takes an array argument. I want to be able to call it with the array either as an argument or as pipeline input, so calling it looks something like this:
my-function -arg 1,2,3,4
my-function 1,2,3,4
1,2,3,4 | my-function
It's easy enough to get the first two:
function my-function {
    param([string[]]$arg)
    $arg
}
For pipeline input, though, it's harder. It's easy to get the values one at a time in the process block by using ValueFromPipeline, but then $arg holds a single value per iteration when input comes from the pipeline, yet the whole array when -arg is used. I can use $input in the end block, but that doesn't see -arg input at all, and $arg in an end block only holds the final item from the pipeline.
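For example, a minimal version (the name demo is just for illustration) shows the end-block behaviour:
function demo {
    param([Parameter(ValueFromPipeline=$true)][string[]]$arg)
    end { $arg }
}
1,2,3,4 | demo   # outputs only 4 - the last pipeline item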
I suppose that I can do this by explicitly collecting the argument values from the pipeline using begin/process/end blocks, as follows:
function my-function {
    param([Parameter(ValueFromPipeline=$true)][string[]]$arg)
    begin {
        $a = @()
    }
    process {
        $a += $arg
    }
    end {
        # Process array here
        $a -join ':'
    }
}
But that seems very messy. It also seems like a relatively common requirement to me, so I was expecting it to be easy to implement. Is there an easier way that I have missed? Or if not, is there a way to encapsulate the argument handling into a sub-function, so that I don't have to include all that in every function I want to work like this?
My concrete requirement is that I'm writing scripts that take SQL commands as input. Because SQL can be verbose, I want to allow for the possibility of piping in the command (maybe generated by another command, or read from a file with get-content) but also allow an argument form, for a quick SELECT statement. So I get a series of strings, either from the pipeline or as a parameter. If I get an array, I just want to join it with "`n" to make a single string - line-by-line processing is not appropriate.
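To make that concrete, here's a rough sketch of what I have in mind (Invoke-Sql is a made-up name standing in for my real script, and the pipeline half is exactly the part I haven't solved):
function Invoke-Sql {
    param([string[]]$Command)
    # However the lines arrive, join them into one multi-line command;
    # line-by-line processing is not appropriate here
    $Command -join "`n"
}

# A quick SELECT as an argument already works:
Invoke-Sql -Command "SELECT 1"
# What I also want to support (piping the command in, e.g. from a file):
# get-content query.sql | Invoke-Sql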
I guess another question would be, is there a better design for my script that makes getting multi-line input like this cleaner?
Thanks - the trick is NOT to use ValueFromPipeline then...
The reason I was having so much trouble getting things to work the way I wanted was that in my test scripts I was using $args as the name of my argument variable, forgetting that it is an automatic variable. So things were working very oddly - as below, where the pipeline call produces nothing useful:
PS> 1,2,3,4 | ./args
PS> get-content args.ps1
param([string[]]$args)
if ($null -eq $args) { $args = @($input) }
$args -join ':'
Doh :-)
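For completeness, simply renaming the parameter (here to $arg) makes the same script behave as intended:
param([string[]]$arg)
if ($null -eq $arg) { $arg = @($input) }
$arg -join ':'
With that change, both 1,2,3,4 | ./args and ./args 1,2,3,4 print 1:2:3:4.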
Use the automatic variable $input.
If only pipeline input is expected then:
function my-function {
    $arg = @($input)
    $arg
}
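For example, only the pipeline form produces output with this version:
1,2,3,4 | my-function   # outputs 1 2 3 4, one item per line
my-function 1,2,3,4     # outputs nothing, since $input is empty here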
But I often use this combined approach (a function that accepts input either as an argument or via the pipeline):
function my-function {
    param([string[]]$arg)
    # if $arg is $null assume data are piped
    if ($null -eq $arg) {
        $arg = @($input)
    }
    $arg
}
# test
my-function 1,2,3,4
1,2,3,4 | my-function
Here's another example using PowerShell 2.0+.
This example is for when the parameter is not required:
function my-function {
  [cmdletbinding()]
  Param(
    [Parameter(ValueFromPipeline=$True)]
    [string[]]$Names
  )
  End {
    # Detect pipeline input by counting the items in $input
    $list = @($input)
    $Names = if($list.Count) { $list } 
      elseif(!$Names) { @(<InsertDefaultValueHere>) } 
      else { @($Names) }
    $Names -join ':'
  }
}
There's one case where it would error out without the elseif: if no value is supplied for Names, the $Names variable will not exist and there'd be problems. See this link for an explanation.
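For example, with some default filled in for the placeholder, the three ways of calling it behave like this (the strings are just sample values):
'a','b','c' | my-function   # pipeline input   -> a:b:c
my-function -Names a,b,c    # argument input   -> a:b:c
my-function                 # no input at all  -> the default value, joined with ':'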
If it is required, then it doesn't have to be as complicated.
function my-function {
  [cmdletbinding()]
  Param(
    [Parameter(Mandatory=$true,ValueFromPipeline=$True)]
    [string[]]$Names
  )
  End {
    # Detect pipeline input by counting the items in $input
    $list = @($input)
    if($list.Count) { $Names = $list } 
    $Names -join ':'
  }
}
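The same sample calls with the mandatory version:
'a','b','c' | my-function   # a:b:c
my-function -Names a,b,c    # a:b:c
# With no input at all, PowerShell now prompts for Names instead of needing a default.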
It works exactly as expected, and now I always reference that link when writing my piped functions.