A PowerShell workflow is a sequence of predefined, linked steps (activities) that lead to a desired result. The main advantage of using a workflow instead of a regular script is the ability to run tasks on multiple systems at the same time. A PowerShell workflow is usually written as a PowerShell script, which is translated into Extensible Application Markup Language (XAML) and then processed by the Windows Workflow Foundation engine.
Writing and running a workflow is very similar to writing a PowerShell function. The difference is that we use the keyword Workflow, followed by the body of the script in curly brackets.
Workflow Test-Workflow
{
"Hello World"
}
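A workflow is invoked like a function, by its name. Workflows also receive a set of automatic parameters; for instance, -PSComputerName lets us run the same workflow on several machines at once (the computer names below are hypothetical placeholders):

Test-Workflow

# Run the same workflow on two remote machines in parallel
# (Server01 and Server02 are placeholder computer names).
Test-Workflow -PSComputerName Server01, Server02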
The workflow is built around the concept of an activity. Every PowerShell command that we run in a workflow is an independent activity. Because a workflow can be stopped and later resumed, we need to account for this at every step. Variables created by one command won't necessarily be available to the next command. For example:
Workflow Test-Workflow {
$obj = New-Object -TypeName PSObject
$obj | Add-Member -MemberType NoteProperty `
-Name ExampleProperty `
-Value 'Hello!'
$obj | Get-Member
}
Test-Workflow
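Because a workflow can survive interruptions, a long-running workflow can be started as a job and controlled with the standard job cmdlets. A minimal sketch, assuming a long-running workflow named Test-Workflow:

# Start the workflow as a job so it can be suspended.
$job = Test-Workflow -AsJob
# Suspend the workflow; its state is persisted.
Suspend-Job -Job $job
# Resume it later; execution continues where it left off.
Resume-Job -Job $job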
Because the command in which we extend the variable with Add-Member runs in its own memory space, the change is not reflected in the $obj variable seen by the third command. If we want all three commands to share the same variable, we can enclose them in an InlineScript block.
workflow Test-Workflow {
InlineScript {
$obj = New-Object -TypeName PSObject
$obj | Add-Member -MemberType NoteProperty `
-Name ExampleProperty `
-Value 'Hello!'
$obj | Get-Member
}
}
We can use an outside variable inside the InlineScript block, but we must access it through the $Using: scope modifier.
Workflow Stop-MyService
{
$ServiceName = "MyService"
$Output = InlineScript {
$Service = Get-Service -Name $Using:ServiceName
$Service.Stop()
$Service
}
$Output.Name
}
One of the advantages of using workflows is the ability to run commands in parallel instead of the usual sequential order. We achieve this with the Parallel keyword.
Parallel
{
<activity1>
<activity2>
}
<activity3>
In the example above, activity1 and activity2 run in parallel, while activity3 runs only after both of them have finished. A typical use for this parallel method is a script that copies files to multiple computers. Such scripts usually look like this:
Copy-Item -Path C:\file1.txt -Destination \\computer1\file1.txt
Copy-Item -Path C:\file2.txt -Destination \\computer2\file2.txt
Copy-Item -Path C:\file3.txt -Destination \\computer3\file3.txt
By using a parallel workflow, we can do it this way:
Workflow Copy-Files
{
Parallel
{
Copy-Item -Path "C:\file1.txt" -Destination "\\computer1"
Copy-Item -Path "C:\file2.txt" -Destination "\\computer2"
Copy-Item -Path "C:\file3.txt" -Destination "\\computer3"
}
Write-Output "Finished copying."
}
We will receive the message only after all three files have been copied. We can adapt the example above to show the ForEach -Parallel construct.
Workflow Copy-Files
{
$files = @("C:\file1.txt", "C:\file2.txt", "C:\file3.txt")
ForEach -Parallel -ThrottleLimit 10 ($File in $Files)
{
Copy-Item -Path $File -Destination \\computer1
Write-Output "$File has been copied."
}
Write-Output "All files have been copied."
}
We can also use the Checkpoint-Workflow activity after commands we don't want to repeat after interruptions or errors. A checkpoint records the current state of the workflow, including the current values of all variables and all output generated up to that point. After an error, the workflow resumes from the most recent checkpoint.
In the example above, especially if we were copying large files, it would make sense to call Checkpoint-Workflow after the Copy-Item command.
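Applied to the copy example, a sketch might look like this (the loop is kept sequential so that a checkpoint can follow each copy):

Workflow Copy-Files
{
    $Files = @("C:\file1.txt", "C:\file2.txt", "C:\file3.txt")
    ForEach ($File in $Files)
    {
        Copy-Item -Path $File -Destination \\computer1
        # Persist the workflow state so that an interrupted run
        # does not copy this file again on resume.
        Checkpoint-Workflow
    }
    Write-Output "All files have been copied."
}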