Channel: Windows PowerShell - SAPIEN Blog

Updated Online PowerShell Reference


We’ve just updated our online PowerShell Reference by adding two new sections, 1,706 cmdlets (bringing the total to 5,187!), and some minor enhancements to the cmdlets section. PowerShell Reference contains help for Windows PowerShell and PowerShell modules. This tool allows you to search through Cmdlet Help, About Help, Provider Help, Aliases and Modules with just a click of your mouse.

Aliases

We’ve added an Aliases section that contains 369 aliases associated with the cmdlets in our database. This page displays the name of the cmdlet, the module that contains it and the associated MAML help file.

 

Modules

We’ve also added a Modules section that contains the 163 modules currently in our database. This page lists the version number of the module, all the cmdlets and about-help files associated with a given module, the synopsis of each cmdlet (when available) and a link to the cmdlet page itself.

 

Cmdlets Page

The cmdlets page now displays the name of the cmdlet’s module and version, as well as any aliases that are contained in our database. A click on the cmdlet’s module will take you to the module page where you will see the list of all cmdlets in that module.

About PowerShell Reference

This library includes the data from Microsoft’s PowerShell documentation as well as many of the modules found in the PowerShell Gallery. If you’ve written a module for the gallery, and your module includes valid XML help files, it might appear in PowerShell Reference, too. (If you need help writing help, be sure to check out PowerShell HelpWriter!)

We run an updater once a quarter, or more frequently when needed, so check back often for added modules!

 


Microsoft Ignite 2017 Full Conference Pass Award


SAPIEN Technologies, Inc., developer of PowerShell Studio and PrimalScript, will award a full conference pass for Microsoft Ignite 2017 to a woman in the Windows PowerShell community, or any individual who is new to IT or DevOps within the last two years and works with Windows PowerShell in some capacity. SAPIEN encourages other Ignite vendors to match this award and support the attendance of technical women and new talent at IT and DevOps conferences.

The pass provides “All access. All breakout sessions. All social events,” including meals at the conference.

Candidates must be women who are currently learning and/or using Windows PowerShell in some capacity OR any individual who is new to the IT industry using Windows PowerShell in some capacity, and who will be able to attend the conference if awarded this pass. The award consists only of the attendee conference pass. The award recipient is responsible for all other costs including travel, lodging, and food and drink outside of the conference. SAPIEN assumes no liability. The awarded pass must be used by the recipient; it is not transferrable.

To be considered for this award, send an email introducing yourself and describing your work with Windows PowerShell and the PowerShell community to info@sapien.com by August 11th, 2017. Include a statement verifying that you will be able to attend the conference if you receive this award.

SAPIEN will select and announce the recipient soon thereafter, leaving plenty of time for securing hotel and travel arrangements. Selection of the recipient is entirely at the discretion of SAPIEN Technologies, Inc.

PowerShell Write Cmdlets


 

PowerShell has multiple cmdlets that display information, but the question that often arises is: which one should be used? In this post we will look at the differences between some of the more common ways to display information in PowerShell and which one fits best in a given situation. Before we get started, it is important to define a few terms that are going to pop up. The first term is stream: a sequence of data elements made available over time. A stream can be thought of as items on a conveyor belt, processed one at a time rather than in large batches. The second is pipeline, which consists of a chain of processing elements arranged so that the output of each element is the input of the next.

 

Write-Host vs Write-Output

Write-Host outputs text directly to the console/host, bypassing the pipeline; therefore it cannot be used to pipe output to another cmdlet, such as Out-File. However, this cmdlet can change the color of the text’s background and the color of the text itself.

Write-Output places the object in the pipeline, where it can be consumed by other cmdlets. If it is the last cmdlet in the pipeline, it displays the output in the console/host. Unless it is necessary to display differently colored messages or a richer user interface, it is best to use Write-Output, since it allows for easier information passing.
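A minimal sketch makes the difference concrete (the file paths here are just illustrative):

```powershell
# Write-Output places the string in the pipeline, so it can be captured:
Write-Output "Hello from the pipeline" | Out-File -FilePath .\output.txt

# Write-Host bypasses the pipeline; the text goes straight to the console,
# so nothing reaches Out-File here and host.txt ends up empty:
Write-Host "Hello from the host" -ForegroundColor Green | Out-File -FilePath .\host.txt
```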

 

[Screenshot: Write-Host output, piped and not piped]

 

Another major difference between these two cmdlets is that Write-Host will output the text that it is given on the same line unless otherwise specified, such as using the newline character “`n”.

Write-Output is going to insert a new line character between each string that it is given.
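For example, given the same three strings:

```powershell
Write-Host "One" "Two" "Three"     # prints: One Two Three   (a single line)
Write-Output "One" "Two" "Three"   # prints One, Two, Three, each on its own line
```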

 


 

Write-Information

Windows PowerShell 5.0 introduced a new, structured information stream (number 6 in the Windows PowerShell streams) that you can use to transmit structured data between a script and its callers (or hosting environment).

Write-Information lets you add an informational message to the stream and specify how Windows PowerShell handles information stream data for a command. By default, Write-Information does not write anything to the screen; to display the messages, either pass -InformationAction Continue to the command or set $InformationPreference to ‘Continue’ at the top of the script. Write-Information is useful when program information only needs to be displayed under special circumstances or only if the user requests it.
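A small sketch of this opt-in behavior (the function name is made up for the example):

```powershell
function Get-Report {
    [CmdletBinding()]   # required so the function gets the common parameters
    param()
    Write-Information "Starting report generation"
    "report data"
}

Get-Report                              # information message is not displayed
Get-Report -InformationAction Continue  # message is displayed
Get-Report -InformationVariable iv | Out-Null
$iv                                     # message was captured in $iv
```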

 

[Screenshot: Write-Information output]

Write-Error vs Write-Warning

Write-Error and Write-Warning are fairly similar: both are used to show information about a problem that occurred. Where they differ is in what information gets written to the screen, and to which stream.

Starting in PowerShell v3, warnings have their own stream (number 3), which means warning output can be captured and piped to another cmdlet.

Write-Warning writes its message to the screen by default, because $WarningPreference is set to ‘Continue’ out of the box; you can suppress the output by setting $WarningPreference to ‘SilentlyContinue’ at the top of the script, or per call with the -WarningAction parameter.
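A sketch of the stream and preference behavior (note that $WarningPreference defaults to ‘Continue’, so warnings are displayed unless suppressed):

```powershell
Write-Warning "Disk space is low"        # displayed by default

$WarningPreference = 'SilentlyContinue'
Write-Warning "This warning is suppressed"

$WarningPreference = 'Continue'
# Because warnings travel on their own stream (3), they can be redirected:
Write-Warning "Captured warning" 3>&1 | Out-File .\warnings.txt
```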

 

[Screenshot: Write-Warning output]

 

The Write-Error cmdlet declares a non-terminating error. By default, errors are sent in the error stream to the host program to be displayed, along with the output.

Since the messages from the Write-Error cmdlet are sent to the error stream, they can be accessed by indexing the automatic $Error array, with index 0 holding the most recent error that was placed in the stream.
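For example:

```powershell
Write-Error "Could not reach the server"   # non-terminating: the script continues
"still running"

# The message is also recorded in the automatic $Error array;
# index 0 always holds the most recent error:
$Error[0].Exception.Message
```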

 

[Screenshot: Write-Error output]

Write-Debug

Write-Debug is used when it is necessary to output debug messages in a critical section of code that can be turned on and off, or when a script developer does not want the extra information to be displayed when the script is run normally.

Write-Debug is similar to Write-Warning in that nothing is displayed on the screen unless the $DebugPreference variable is set to ‘Continue’ at the beginning of the file or before the cmdlet executes.

The -Debug parameter can also be used if it is necessary for the user to acknowledge that something has occurred.
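A sketch of this opt-in behavior (the function name is made up for the example):

```powershell
function Invoke-CriticalStep {
    [CmdletBinding()]
    param()
    Write-Debug "About to modify the configuration"
    "done"
}

Invoke-CriticalStep          # debug message is hidden
Invoke-CriticalStep -Debug   # message is shown; in Windows PowerShell, -Debug
                             # also prompts the user to confirm before continuing
```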

[Screenshot: debug message output]

Write-Progress

The Write-Progress cmdlet displays a progress bar in a Windows PowerShell command window that depicts the status of a running command or script.

You can select the indicators that the bar reflects and the text that appears above and below the progress bar.

The Write-Progress command includes a status bar heading (“activity”), a status line, and the variable $I (the counter in the For loop), which indicates the relative completeness of the task.
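A minimal loop of this shape might look like the following (the activity and status text are placeholders):

```powershell
for ($I = 1; $I -le 100; $I++)
{
    Write-Progress -Activity "Copying files" `
                   -Status "$I% complete" `
                   -PercentComplete $I
    Start-Sleep -Milliseconds 20
}
```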

 

[Screenshot: Write-Progress progress bar]

 

The cmdlets discussed in this post are not the only ways that PowerShell can display information; PowerShell has many more cmdlets that can display or redirect information. However, the cmdlets discussed here will work for most day-to-day uses: with them it is possible to write to the pipeline and display warnings, debug information, and errors.

Video featuring PowerShell Studio Cache Editor

Custom Actions in PrimalScript 2018’s MSI Builder


Starting with PrimalScript 2018 (version 7.4.111), you can define custom actions for your Windows Installers. A custom action is anything that can be executed from a command line; most commonly you will probably use executable files or a PowerShell script.

Overview: Custom Actions for Windows Installers

If you are not familiar with custom actions for Windows installers, here is a brief overview:

When installing or updating a product, Windows Installer’s main job is to copy the files of a product to the proper place. Likewise, when uninstalling, it uses information created while installing to remove files that have been previously copied to your hard drive. Any files or documents created by the user or created during normal operation of a product are not removed. While Windows Installer has some functionality for creating registry entries—chaining other Windows installers (e.g., for dependencies), or opening ports in your firewall—it has very little functionality to support PowerShell or its modules. This is where custom actions come into play. With custom actions you can install modules you need from the PowerShell Gallery or any other repository. You can create actions which will then remove these modules again during uninstall. Likewise, you can create registry entries and remove them again as needed. All the powerful functionality of Windows PowerShell is now at your disposal during installer operations.

Custom Actions Example

We will use a simple WMI Explorer generated application to illustrate these custom actions.

In addition to the application, we add two PowerShell scripts (setvalue.ps1 and removevalue.ps1) and a readme.txt:


 

On the installer setting dialog, we have a new page “Custom Actions.” As you may have guessed, here is where you define them. Each custom action must have a unique name. The dialog will not allow you to use the same name for more than one action.

PowerShell scripts as custom actions are executed with the following command line:

"C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe" -NoLogo -NonInteractive -InputFormat None -NoProfile -File "<yourcustomactionscript>.ps1"

That means for PowerShell scripts you only need to specify the script file itself, nothing else.
For any other type of script you will need to specify the entire command, including which script engine you want to use (e.g., WScript.exe or CScript.exe for Windows Script Host).
If you package your custom action scripts as executable files, you need to add the .exe file and reference that rather than the script.

 


 

The Properties options are defined as follows:

File: The name of the file to be executed. This file must either be installed by the MSI you are creating or pre-exist on all target machines (e.g., Notepad.exe).
Folder: Defines the current directory for the custom action. In most cases, this will be the INSTALLDIR folder, which is where your application will be installed. The drop-down combo box has a few other common Windows folder shortcuts.
Arguments: Specify any command line parameters you need to pass to your custom actions. The example below illustrates how the readme.txt file included in the sample installer is passed to Notepad.exe. Please note that there is absolutely no checking on the values you enter here, as we do not know what will exist on the target machine. That is for you to check and verify in your specific environment.

 


 

The Execution Time options may be a little bit confusing in their terminology unless you are very familiar with Windows Installer:

Immediately: Indicates that the custom action will run during normal processing time with user privileges.
When the system is being modified (deferred): Indicates that the custom action runs in-script (possibly with elevated privileges). This is the default.
During installation rollback: Indicates that a custom action will run in the rollback sequence when a failure occurs during installation, usually to undo changes made by a deferred custom action.
After the system has been successfully modified (commit): Indicates that the custom action will run after successful completion of the installation script (at the end of the installation).

 

The Execution Options are defined as follows:

Run under the system account (no impersonation): Typically the installer process makes changes to the system impersonating the installing user. Check this option to make sure your custom action runs under the system account rather than the installing user.
Wait for custom action to finish: Indicates that the custom action will run synchronously, but the return code will not be checked.
Check return code: Indicates that the custom action will run synchronously and the return code will be checked for success.

 

For any Windows Installer custom action a non-zero return value is interpreted as an error and the installation stops. If neither ‘Wait’ nor ‘Check’ is specified, the custom action will run asynchronously and execution may continue after the installer terminates.


 

In our sample the Setvalue.ps1 script creates a registry path ‘HKCU:\Software\SAPIEN\ShowServices\Settings’ and sets a value ‘Version’ to 17324. This script is set to be executed during install and maintenance modes. So deleting that registry entry and running the MSI in maintenance mode will restore it:
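The shipped Setvalue.ps1 lives in the downloadable sample rather than in this post, but a script with the described behavior could look roughly like this sketch:

```powershell
# Setvalue.ps1 (sketch): create the registry path and set the Version value
$path = 'HKCU:\Software\SAPIEN\ShowServices\Settings'
if (-not (Test-Path -Path $path))
{
    New-Item -Path $path -Force | Out-Null
}
Set-ItemProperty -Path $path -Name 'Version' -Value 17324
```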


 

The ‘Remove Registry’ custom action will delete the entire ‘ShowServices’ node on uninstall:


 

When you run this installer for installing or uninstalling you will notice the flashing consoles when the custom actions involving PowerShell are active. You might ask why we did not hide these flashing consoles or make them permanent so you can see what happened. Since this is just a sample to get you familiar with the new custom actions option, we kept the actions simple. They are more or less simple one-liners. In a real-world application, these scripts will become much more involved. Since we don’t know what you will do, we left everything as basic as possible. When using PowerShell scripts, you may want to display a progress bar in the console for any lengthy processes or only stop to show results in case of an error. Likewise, you may choose to not bother the user with any errors during uninstall and just leave things in place if they cannot be removed.

Some may prefer custom actions to be silent and invisible, in which case you can use a script during development to see what happens and, as the last step, convert these scripts to an executable file with any of the silent engine options for Windows PowerShell within PrimalScript or PowerShell Studio. You should also note that using PowerShell script files will make these files subject to the PowerShell execution policy on the target computer, so do not forget to sign your script files if so required.

Keep in mind that the choice is yours what to do when, and how to handle errors and user notifications.

You can download the sample files for this post here: https://sapien.s3.amazonaws.com/Blog_Post_Files/ShowServicesSampleApplication.zip

Feedback

If you have any questions, concerns or feedback you are, as always, more than welcome to visit our support forum and post a message in the appropriate section.

New PowerShell Service Template for PrimalScript 2018


PrimalScript 2018 supports advanced functionality for Windows services written in PowerShell and packaged with PrimalScript 2018 or PowerShell Studio 2018. While your old code will still work unmodified with this packager engine (it detects the presence of the new functions), it is a good idea to revamp your code to use the new template.

You can find the new template under File, New, PowerShell Service as shown below:


 

The new template features three preset functions:

Start-MyService: This function is called when your service starts. You should place all initialization code for your service here.
Invoke-MyService: This is your service’s main loop. Take a minute to look at the template to understand the process – sleep – process cycle as your service executes.
Stop-MyService: This function is called when your service is asked to stop by the operating system or when using Stop-Service. You can see some management code in here that allows your main service loop to exit gracefully. Place code to close connections and release resources here.
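Schematically, the three functions fit together something like this; the control flag and sleep interval below are placeholders, not the literal template code:

```powershell
$global:bServiceRunning = $true   # assumed flag; the real template may use a different mechanism

function Start-MyService
{
    # One-time initialization: open log files, connections, etc.
}

function Invoke-MyService
{
    while ($global:bServiceRunning)
    {
        # ... do the service's work here ...
        Start-Sleep -Seconds 10   # the process - sleep - process cycle
    }
}

function Stop-MyService
{
    $global:bServiceRunning = $false   # lets the main loop exit gracefully
    # Close connections, stop jobs and runspaces, release resources here.
}
```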

 

Please do not rename these functions. You can add your own functions as you see fit, but these three functions and their names are essential to make your service execute properly once packaged.


 

Please note that the main service thread is forcefully terminated after you exit the Stop-MyService function. Make sure you terminate running jobs and secondary runspaces here; otherwise the service process may hang and not exit properly.
As some modules create their own threads, jobs, runspaces, etc., it is also a good idea to use Remove-Module for any modules you load implicitly or explicitly. The script itself does not operate as a service, but you could dot-source it from a test script to execute the functions for testing.

Ultimately, to make a real Windows service, you need to package your script with a PowerShell Windows Service engine as shown here:


 

As illustrated in a previous blog article, the <service>.exe /i and <service>.exe /u command line parameters can be used to self-install / uninstall the service on your machine.

Feedback

If you have any questions, concerns or feedback you are, as always, more than welcome to visit our support forum and post a message in the appropriate section.

Installing Windows Services Created with PowerShell


Last week we introduced a new PrimalScript template for Windows PowerShell which makes writing Windows Services much easier. In this article, we will show you how to use MSI Builder to create an installer for your Windows Services.

We created the services packager engine to allow for self-install / uninstall of a service, but that is not always the best option. For example, if your service requires other files, you will most likely want to create an installer. It is quite easy to create an installer using the MSI builder, which is part of the PrimalScript and PowerShell Studio deployment package.

The MSI Builder had no special accommodation for Windows Services—until now. Since a service needs to be registered and started on install rather than just copied as a file, the MSI file needs to contain some extra information. Likewise, when uninstalling, the service needs to be stopped and unregistered before the file can be removed.

But fear not, there is not much for you to do. When you create the installer, simply select Windows Service as the ‘Product Type’, and you are almost done:


 

Almost—because there is one important detail that needs your attention. Since MSI does not know which file in your installer is a service and which is not, you need to make sure the actual service executable is the first file in the ‘Files’ list:


 

In future releases, we will add some more options for your service installers, but for this scenario, you simply need to select Windows Service as the ‘Product Type’ and choose the service executable as the first file. The installer will install your files, register the service, and start it. On uninstall it will stop the service and unregister it before it removes any files.

Note: Uninstalls can take longer for services since Windows Installer will wait for the service to stop. Depending on how you programmed the service, that can mean a 20 to 30-second delay.

Learning from a working example is always a little easier, so we updated last year’s SolitaireKiller service to use the new service template and created an installer for it. Download the code and working installer here: https://sapien.s3.amazonaws.com/Blog_Post_Files/SolitaireKiller2018.zip

Feedback

As always, if you have any ideas, comments, or feedback, please visit our feedback forum and reference this post.

Storing PowerShell Variables in External files


On our PowerShell community forums, we have recently received questions asking if it is possible to store PowerShell variables in external files, and if so, how it is done. It is indeed possible to store variables in external files for PowerShell to use, and these external variables can be stored in a variety of file types. Storing them in a PowerShell file is one of the easiest approaches, because you can just dot-source these files.

We will cover the following methods to store variables:

  • Script Files
  • Text Files
  • JSON Files
  • XML Files

The examples shown in this post are pretty simple, but that doesn’t mean that it isn’t possible to store fairly complex variables in external files. If you want to experiment with storing external variables you can download the sample files for this post here.

Script Files

Dot sourcing may be the easiest way to store external variables—but it isn’t always the safest. When a file is dot-sourced, we are telling PowerShell to execute that script; if there is any malicious code in the file, that code will also run.

In addition, the dot-sourced script is subject to the execution policy on your machine; under a policy such as RemoteSigned or AllSigned, the external variables script may need to be signed before PowerShell will run it.

Dot sourcing can be helpful if we need to get information about something dynamically. For the other options discussed in this post, the data stored in the file types will have to be manually changed.

# Here we dot source the External variables PowerShell File
. "C:\Test\BlogPosts\ExternalVariables.ps1"

Write-Host $External_Variable1 $External_Variable2

Here is what is in the ExternalVariables.ps1 file:

# Declaration of external variables
$External_Variable1 = 'Sapien'
$External_Variable2 = 'Technologies'

Text Files

External variables can also be stored in a number of text-based files and formats, such as plain text in a general .txt file read with the Get-Content cmdlet. When we import variables this way we aren’t running any code, so a file that you don’t constantly monitor poses no execution risk.

The following examples show different ways of storing information in a simple text file:

$ScriptDir = Split-Path $script:MyInvocation.MyCommand.Path
$ScriptDir += "\ExternalVariables.txt"

# Use Get-Content to get all of the lines in the .txt file
$External_Variables = Get-Content -Path $ScriptDir

# The information from ExternalVariables.txt comes in as an array,
# so to print all of the strings in $External_Variables we use a foreach loop
foreach ($string in $External_Variables)
{
	Write-Host $string
}

Here is what is in the ExternalVariables.txt file:

"PowerShell Studio"
"PrimalScript"
"Version Recall"

Just like arrays, hash tables can be stored in text files. To get our hash table back from a text file, we pipe the output of Get-Content to the ConvertFrom-StringData cmdlet, which converts the output into a hash table.

$ScriptDir = Split-Path $script:MyInvocation.MyCommand.Path
$ScriptDir += "\ExternalVariablesHashTable.txt"

# Getting the contents of the external variables text file.
# This file is stored in plain text and is not in any special format.

# We use the -Raw parameter of Get-Content so that the file comes back
# as a single string and our hash table is not converted to an object array
$program = Get-Content -Raw -Path $ScriptDir | ConvertFrom-StringData

Write-Host "`nType of the variable `$program`n"
$program.GetType()

Write-Host "`nPrinting `$program"
$program

Here is what is in ExternalVariablesHashTable.txt:

Company=Sapien Technologies
Product=PowerShell Studio

Storing information in a text file like this is a convenient way to keep information in a human-readable format. Text files also come with the benefit of not being executable, so if there happens to be malicious code stored in a file you don’t regularly manage, it won’t be executed.

JSON File

It is also possible to store external variables in JSON format. The only caveat is that we once again need to pipe the output of Get-Content to another cmdlet; this time it is ConvertFrom-Json rather than ConvertFrom-StringData. For those unfamiliar with JSON, or who need to brush up on the format, please visit www.json.org.

$ScriptDir = Split-Path $script:MyInvocation.MyCommand.Path
$ScriptDir += "\jsonfile.json"

# Getting information from the JSON file:
# we pass the output from Get-Content to the ConvertFrom-Json cmdlet
$JsonObject = Get-Content $ScriptDir | ConvertFrom-Json

# Users is an array, which means that we have to index
# an element to use it
$JsonObject.Users[0]

# Once indexed, we can access the properties of the element
Write-Host "Attributes individually printed"
$JsonObject.Users[0].Name
$JsonObject.Users[0].Age
$JsonObject.Users[0].City
$JsonObject.Users[0].Country
$JsonObject.Users[0].UserId

Here is what is in the JSON File:

{
  "Users": [
    {
      "Name": "John Smith",
      "Age": "35",
      "City": "San Francisco",
      "Country": "USA",
      "UserId": "52917"
    },
    {
      "Name": "Jane Wellington",
      "Age": "28",
      "City": "Seattle",
      "Country": "USA",
      "UserId": "25589"
    },
    {
      "Name": "Samantha Scott",
      "Age": "33",
      "City": "Los Angeles",
      "Country": "USA",
      "UserId": "11564"
    }
  ]
}

XML File

If we store our variables in an XML format, we can add comments to the variable file if necessary. Of the file formats discussed in this post, only XML and PS1 allow comments; JSON and plain TXT files do not. For a concise overview of the XML format, visit w3schools.com/xml.

$ScriptDir = Split-Path $script:MyInvocation.MyCommand.Path
$ScriptDir += "\XMLFile.xml"

# Read in all of the information from our variables XML file.
# We cast the variable as [xml] when we store the file contents in it.
[xml]$XML_Variable = Get-Content -Path $ScriptDir

# Referencing the Food object array stored within the Breakfast_menu object
$XML_Variable.Breakfast_menu.Food[0] | Format-List

Here is what is in the XML file:

<breakfast_menu>
  <food>
    <name>Belgian Waffles</name>
    <price>$5.95</price>
    <description>Two of our famous Belgian Waffles with plenty of real maple syrup</description>
    <calories>650</calories>
  </food>
  <food>
    <name>Strawberry Belgian Waffles</name>
    <price>$7.95</price>
    <description>Light Belgian waffles covered with strawberries and whipped cream</description>
    <calories>900</calories>
  </food>
  <food>
    <name>Berry-Berry Belgian Waffles</name>
    <price>$8.95</price>
    <description>Belgian waffles covered with assorted fresh berries and whipped cream</description>
    <calories>900</calories>
  </food>
</breakfast_menu>

When choosing between the XML and JSON storage formats, it largely comes down to which one is more familiar; since the main practical difference here is that XML allows comments, it is mostly a matter of preference. All of these options are viable ways to store information in external files, to be read either by another program or by the same program at a later time. How complex the information is, whether it changes dynamically, and how much of it needs to be stored will dictate the format to use.

Exporting to Files

Just like importing information with PowerShell, it is also possible to export information and objects to an external file from the program we are using. PowerShell Studio 2018 comes with snippets that make exporting information much easier—simply pass the path of the external file and the object to the corresponding export function and the snippet will take care of everything else. We will cover Exporting to Files using Snippets in a future blog post.
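Outside of the snippets, the standard cmdlets already cover the formats discussed above; for example:

```powershell
$settings = @{ Company = 'SAPIEN Technologies'; Product = 'PowerShell Studio' }

# JSON
$settings | ConvertTo-Json | Set-Content -Path .\settings.json

# XML (PowerShell's own serialized format; read back with Import-Clixml)
$settings | Export-Clixml -Path .\settings.xml

# Plain text, one Key=Value pair per line (readable by ConvertFrom-StringData)
$settings.GetEnumerator() |
    ForEach-Object { "$($_.Key)=$($_.Value)" } |
    Set-Content -Path .\settings.txt
```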

 


PowerShell HelpWriter 2018: Create a Single Function Help File


PowerShell HelpWriter™ is an essential application to have in your toolbox, especially when working with existing scripts, functions, or modules. Eventually, your scripts will be shared or given to someone else to maintain, and no one wants to inherit a script that is hard to follow or difficult to understand.

Including Help documentation with your scripts is a best practice that takes very little time to do, and demonstrates your quality of work to your IT peers. In this blog post we will show you how to create a PowerShell XML Help file and add it to a script.
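As a preview of where this ends up: once the XML help file exists, a script or function points to it with the .ExternalHelp comment keyword. The function body below is a made-up placeholder for illustration; only the comment keyword matters here:

```powershell
function Get-MySQLServices
{
    # .ExternalHelp Get-MySQLServices-Help.xml
    [CmdletBinding()]
    param ([string]$Status)

    # Hypothetical implementation for illustration only
    Get-Service -Name 'MySQL*' |
        Where-Object { -not $Status -or $_.Status -eq $Status }
}
```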

Ease Of Use

PowerShell HelpWriter has the ability to build an XML PowerShell Help file that can be included with a script file.

You can start using the tool as soon as you open the application—it’s as easy as filling in the blanks.

Use the application Menu, or both the Navigation panel and the Designer panel to build the Help file.

When the Help file is complete, the XML file can be validated before being implemented in the script file.

Create a Help File

We are going to create a new Help file for an existing function called Get-MySQLServices.

On the File menu, click on New then select New Help File. This will open an empty New-Cmdlet Help file.

We will change the name from New-Cmdlet to Get-MySQLServices and start to fill in the information in the Synopsis and Detailed Description sections.

The Synopsis is a short description about the function and the Detailed Description explains more about the function.

Adding Syntax and Parameters

In the Syntax and Parameters sections we will configure how the function will work and what parameters are going to be used by the function.

First, we need to add all of the parameters used by this function in the Parameters section. To start adding parameters click on the Add New Parameter button.

Parameter1 will be added first, then we will rename it to Status.

 

Next we will update the properties for each parameter by clicking on the “tool” icon next to the parameter name.

The following parameter properties can be modified in the Parameters tool editor:

  • Description
  • Default Value
  • Aliases
  • Accept Wildcard Characters
  • Dynamic
  • Value Required
  • Accepted Values

In this case we are working with the Status parameter properties.

Keep in mind this is just for the Help file and will not affect the existing function script file.

After adding all of the function parameters we need to go back to the Syntax section and add a parameter set to configure how the function will be used with the assigned parameters. We will add Parameter Set 1.

Creating Parameter Sets

When clicking on the Parameter Set 1 checkbox you’ll notice that the Parameters field is empty. And, No! We haven’t lost any of the parameters we previously added.

Next we will add the parameters that we previously loaded in the Parameters section to the Parameter Set 1 field.

To add a loaded parameter: In the Parameters section under Parameter Set 1, click on Add Parameters or right-click on the parameter field.  In the drop-down menu, select Status.

Remember, you can build a function script that can be executed in a number of ways. This is why we can add a Parameter Set for each way the function can be executed.

We will finish configuring the properties for the Status parameter to be used in Parameter Set 1 by going back into the Parameters tool editor pane. Now we can change some properties that were not enabled when we initially added the parameter:

  • Position
  • Input Pipeline
  • Required

After the changes are done for the parameter set they are displayed in Syntax | (All Parameters)*

*Note: To Add/Update/Delete parameters simply click on the selected “Parameter Set x”, then select either “X” to delete or select “Add New Parameter”.

Adding Examples

In the Examples section we can add examples of how the function is going to be executed with the parameters.

  • Click on Add Example or right-click on the Example field.
  • To edit the new line (Example x), click on the “tool” icon to add the example code.

There are three fields used to add Help information about the function: Introduction, Command, and Remarks.

In the Introduction field, type the prompt the example uses—either a Windows prompt “PS C:\>” or a Linux prompt “PS /home/user1>”.

In the Command field, type the command one-liner example.

In the Remarks field, add any additional comments about the example.*

*Note: Make sure to press Enter at the end of the Remark.

Adding Notes

You can add any additional notes about the function in the Notes section. You can add titles for every Remark you want to add or edit.

 

Identify Input and Output

Input Types and Output Types are where you document which object types the function accepts and which object types it returns when the function completes.

In Input Types we add a “String” type input variable. Click on the “tool” icon next to the name and select Add Type to add the object type “String” in the Name field.  Include additional information about the Input object variable in the Description field.


In Output Types we add a “PSobject” type output variable. Click on the “tool” icon next to the name and select Add Type to add the object type “PSobject” in the Name field.  Include additional information about the Output object variable in the Description field.

Add Resource Link

Links is an optional section where you can provide a URL link to a Microsoft document, or perhaps a link to a blog post related to the custom function.

Validate and Preview

We will save the file as Get-MySQLServices-Help.xml.

After the Help file has been saved we will Validate Help Schema to check for errors and Preview the file.*

The validation result will display in the Output pane at the bottom of the designer. If there are no errors this message will display: Validation Status (Errors: 0, Warnings: 0): PASSED.

*Note: Make sure to save changes and validate the schema before previewing the Help file.

Although the Help file can be previewed at any time, it is a good idea to Save, Validate, and Preview the file while building the file.

Add Help File to the Script

Next we will open the script function Get-MySQLServices in PowerShell Studio to add the Help file Get-MySQLServices-Help.xml.

In order to add the Help XML file to the script, we need to add a “# .EXTERNALHELP <HelpXMLfileLocation>” line inside the function:

function Get-MySQLServices {
    # .EXTERNALHELP C:\Users\user01\Documents\SAPIEN\Help Files\Get-MySQLServices-Help.xml

    Param(..)
    # code here…
}

After this line has been added to the script function we load the function in PowerShell. Then we can view the function Help information by executing the following:

## – Display help detailed information:
Get-Help Get-MySQLServices -Detailed

## – Or, Display Help in popup windows:
Get-Help Get-MySQLServices -ShowWindow

Help Your IT Peers

As you continue to build your PowerShell scripting skills, PowerShell HelpWriter 2018™ is a must-have tool for properly documenting your scripts. In addition to helping your IT peers understand how your scripts, functions, and modules work, taking the time to include a Help file before deploying your script to production is clearly a best practice.

Comment Based Help is always an option but if you want to exponentially speed up the documentation process and make Help file updates on-the-fly—PowerShell HelpWriter is the answer.

Related Articles

Feedback

As always, if you have any ideas, comments, or feedback, please visit our feedback forum and reference this post.

 

Max Trinidad is a Technology Evangelist at SAPIEN Technologies Inc., and a Microsoft PowerShell MVP. You can reach him at maxt@sapien.com

SAPIEN Technologies at the Jacksonville IT Pro Camp 2018


SAPIEN Technologies is proud to have sponsored the Jacksonville IT Pro Camp 2018 on Saturday, June 9th.

This year marked the seventh anniversary for the event which was well attended with over 275 attendees and 21 sponsors. The event was a success due in large part to support from Keiser University, event sponsors, and especially the IT Community.

 

Organizer Sidney Moore, along with his community leaders, has taken this event to the next level of excellence:

 

These community leaders have taken the initiative to assist and grow their IT Community in the Jacksonville area, helping to boost the skills of everyone attending the event.

Most importantly, the volunteers kept the flow of the event going without a glitch! Here are some pictures of the volunteers at work:

 

Along with 49 speakers spanning 12 tracks, I gave two PowerShell Studio presentations: PowerShell Core Building Cloud VM with Azure Resource Manager and PowerShell Core SQL Server Management Object (SMO) Scripting Cross-platform. It was an honor to be with such great speakers including Joshua Corrick and Robert Cain.

 

There were some great moments during the event:

  • Free Microsoft Certifications All Day
  • Boy Scouts Pledge of Allegiance

  • Hovercraft Challenge

 

  • Star Wars Storm Trooper Visit

 

If anyone in the Jacksonville area is interested in the Jacksonville PowerShell User Group and future events, feel free to join their MeetUp group.

Event and User Group Sponsorship

SAPIEN Technologies is proud to support the PowerShell Community. Feel free to reach out and contact us for both events and user group sponsorship information. Send us an email: usergroups@sapien.com

 

Max Trinidad is a Technology Evangelist at SAPIEN Technologies Inc., and a Microsoft PowerShell MVP. You can reach him at: maxt@sapien.com 

New Video – PowerShell HelpWriter: Create a Single Function Help File


The latest in our video series demonstrates how to work with PowerShell HelpWriter 2018™ to create a new help file for a single function.

In this video we show you how to create a fully documented help file for a single function with:

  • Synopsis
  • Detailed Description
  • Syntax
  • Parameters
  • Examples
  • Notes
  • Input / Output Types
  • Resource Link

Follow the steps in the video and be creative!

View the video here: PowerShell HelpWriter 2018 – Building a Single Function Help File

Related Articles

 

Instructional Videos

Learn about other SAPIEN Technologies product features by checking out the videos on our YouTube channel.

Feedback

As always, if you have any ideas, comments, or feedback, please visit our feedback forum and reference this post.

 

Max Trinidad is a Technology Evangelist at SAPIEN Technologies Inc., and a Microsoft PowerShell MVP. You can reach him at maxt@sapien.com

New Video – Creating PowerShell Windows Service with PrimalScript


The latest in our video series demonstrates how to create a Windows Service that kills a Solitaire process using SAPIEN’s PrimalScript 2018.

In this short video we show you how to build and install a PowerShell Windows Service, including:

  • Create a Windows Service from a PowerShell Service Template
  • Add the code from a custom Snippet library
  • Build the script Packager
  • Build the MSI Installer
  • Install / Uninstall the Windows Service

Follow the steps in the video and be creative!

View the video here: Creating PowerShell Windows Service with PrimalScript 2018

*Note: In Packager Settings > Output Settings, make sure the default value for “Manifest Creation” is changed to “Embed a default manifest”, or the service will fail to run.

Related Articles

Instructional Videos

Learn about other SAPIEN Technologies product features by checking out the videos on our YouTube channel.

Feedback

As always, if you have any ideas, comments, or feedback, please visit our feedback forum and reference this post.

 

Max Trinidad is a Technology Evangelist at SAPIEN Technologies Inc., and a Microsoft PowerShell MVP. You can reach him at maxt@sapien.com

PAUSE and CONTINUE added to PowerShell Service Packager Engine


We recently added support for more refined handling of events in PowerShell-based Windows services—allowing the Start, Stop, and Run events to be handled separately. Please see this article for details: https://www.sapien.com/blog/2018/02/27/new-powershell-service-template-for-primalscript-2018/

The latest service builds of PrimalScript 2018 and PowerShell Studio 2018 include the ability to handle Pause and Continue events.

The PowerShell Service Template has been updated to contain functions for these new operations:

image

The new template functions merely set a global flag to indicate that the service is paused. The main loop of the service needs to be adjusted to handle this flag accordingly.

image

 

If you transform an existing service to support Pause and Continue, you will need to incorporate this mechanism similar to what is shown in the screenshot above.

You may ask why Pause does not simply suspend the thread the PowerShell engine is running on, and Continue would resume the thread. It would certainly add Pause and Continue to the service without you having to do anything.
The answer is that it is never a good idea to suspend a thread or a process from the outside. You simply never know what its current state is. Imagine that your service just sent a query to a server for a large amount of WMI data. Right after sending the query, your service’s thread gets paused—or frozen. When the user decides to continue the service, your code will find that the query’s result is no longer there—it is even possible that the server you connected to is no longer online. It would be very difficult to make every single line in your code ‘interruptible’ so it will not crash or cause an exception when this happens.

It is much better to leave the handling of Pause and Continue to your code, so you can decide when is the appropriate time to suspend doing whatever your service is doing. This will also allow your service to handle a Stop request properly while in a paused state.
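The flag-based pattern described above can be sketched in a few lines of PowerShell. This is a minimal illustration, not the actual template code—the names $global:ServicePaused, $global:ServiceRunning, and Invoke-ServiceWork are hypothetical stand-ins:

```powershell
# Hypothetical sketch of a pause-aware service main loop.
# $global:ServicePaused would be set/cleared by the template's
# Pause/Continue handlers; Invoke-ServiceWork stands in for
# whatever work your service actually performs.
$global:ServicePaused  = $false
$global:ServiceRunning = $true

while ($global:ServiceRunning)
{
    if (-not $global:ServicePaused)
    {
        # Do work only while not paused
        Invoke-ServiceWork
    }
    # Sleep briefly either way, so a paused service
    # still cycles and can respond to a Stop request
    Start-Sleep -Seconds 5
}
```

Because the loop keeps cycling while paused, your own code decides when it is safe to suspend work, and a Stop request can still be honored from the paused state.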

Feedback

As always, if you have any ideas, comments, or feedback, please head over to our forum and post in the appropriate section: https://www.sapien.com/forums/index.php

 

PowerShell Studio: How to prevent Powershell cmdlets from hanging in GUI apps


We have seen issues reported in our support forum where Azure commands do not terminate—or, more precisely, their cmdlets do not return—in a GUI environment. This happens when the PowerShell pipeline is held up by the GUI, which can prevent PowerShell from processing internal messages and events, causing a hang while a cmdlet waits for these triggers.

AzureRM cmdlets are meant to be executed from a PowerShell console. The fact that we can extend this to a Windows application is a wonderful idea. Now we are indeed thinking out-of-the-box!

What’s Going On?

Windows applications are synchronous—the user must wait until they complete. Unfortunately, as explained previously, there will be occasions when event-driven tasks fail to complete, without any warning.

When working with Azure commands everything happens in the cloud. There is the possibility that executed commands may have completed on Azure without you knowing.

Fortunately, there is a convenient way to overcome this issue.

Asynchronous – Way To Go!

The asynchronous method involves submitting the task as a background job. When working from the GUI, control returns to the GUI as soon as the job is submitted.

PowerShell provides some cmdlets for submitting jobs in the background:

Start-Job
Get-Job
Receive-Job

A number of the Azure cmdlets have the ‘-AsJob‘ parameter which can be used to submit Azure commands as individual jobs to be processed in the background.

Here are some AzureRM cmdlets with the ‘-AsJob‘ parameter:

New-AzureRmStorageAccount
Remove-AzureRmResourceGroup
New-AzureRmVM
Stop-AzureRmVM
Start-AzureRmVM

One thing to remember—submitting individual jobs will require the necessary logic to track them.
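A minimal, Azure-free sketch of that tracking logic might look like the following; the job name and the scriptblock body are illustrative:

```powershell
# Submit a scriptblock as a background job, poll only that job
# by name, then collect its output and clean up.
$job = Start-Job -Name 'DemoJob' -ScriptBlock {
    Start-Sleep -Seconds 2
    "work finished"
}

# Poll the specific job rather than every job in the session
while ((Get-Job -Name 'DemoJob').State -notin 'Completed', 'Failed', 'Stopped') {
    Start-Sleep -Seconds 1
}

# Collect the output and remove the finished job
$result = Receive-Job -Job $job
Remove-Job -Job $job
$result
```

The same skeleton works for Azure cmdlets: only the scriptblock body changes, while the submit/poll/receive logic stays the same.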

Building Code

The following is a basic console example of how we can accomplish creating a job and tracking its progress:

  • Prepare the scriptblock for the job to stop an AzureRM VM:

## - Create Scriptblock for the job:
$StopVmScript1 = {

    ## - Sign on to Azure:
    Import-AzureRmContext -Path "c:\Temp\WinPS_AsubRMprofile.json";

    ## - Run the Azure cmdlet:
    $AzQryVMStop1 = Stop-AzureRMVM `
        -ResourceGroupName "JaxItProCamp2018Resources" `
        -Name 'Win2K16VM1' `
        -Force;
};
  • Send the job for background execution and track the status:

## - Send/Start job in background with a jobname:
Start-Job -ScriptBlock $StopVmScript1 -Name "bg_StopVM1Script";

## - Loop to check for the background job completion:
do {
    Start-Sleep -Seconds 2;
}
while ((Get-Job -Name "bg_StopVM1Script").State -ne 'Completed');

## - Get background job results:
Get-Job -Name "bg_StopVM1Script" | Format-List | Out-File -FilePath 'C:\Temp\GetBackgroundJobResults.txt';

This code can be tested and debugged at the PowerShell console before being copied to the Windows Button control event. This code is just a starting point and can be enhanced.

Code GUI Implementation

Implementing this code in a GUI application will require more changes depending on the application requirements. In this test we only have two controls in the form:

  1. The Button Control is used to prepare the Azure task and submit the job to execute in the background.
  2. The RichTextBox Control is used to post the job progress and result.

Here is a sample implementation of the PowerShell code in the Click event of a Windows Button control named “$buttonStopVMJob”:

$buttonStopVMJob_Click = {
    #TODO: Place custom script here
    $buttonStopVMJob.Enabled = $false;

    ## - Create Scriptblock for the job:
    $StopVmScript1 = {

        ## - Sign on to Azure:
        Import-AzureRmContext -Path 'c:\Temp\WinPS_AsubRMprofile.json' | Out-Null;

        ## - Run the Azure cmdlet:
        Stop-AzureRMVM -ResourceGroupName "GlobalAzureBootCampResources" `
            -Name 'Win2K16VM1' `
            -Force;
    }

    ## - Send/Start job in background with a jobname:
    $RichTextBoxStatus.AppendText("`r`nStart Job to Stop VM!`r`n");
    Start-Job -ScriptBlock $StopVmScript1 -Name "bgStopVM1Script";

    ## - Loop to check for the background job completion:
    do {
        $RichTextBoxStatus.AppendText("Process Stopping VM!`r`n");
        Start-Sleep -Seconds 20;
    }
    while ((Get-Job -Name "bgStopVM1Script").State -ne 'Completed');

    ## - Enable button when background job is done:
    $buttonStopVMJob.Enabled = $true;
    $RichTextBoxStatus.AppendText("Process Completed - VM Stopped!`r`n");

    ## - Get and save the background job results to a file:
    Get-Job -Name "bgStopVM1Script" | Format-List | Out-File -FilePath 'C:\Temp\GetBackgroundJobResults.txt';
    Start-Sleep -Seconds 2;

    ## - Display the background job results:
    $RichTextBoxStatus.AppendText("`r`nDisplaying Async Job Results:`r`n");
    $RichTextBoxStatus.AppendText((Get-Content -Path 'C:\Temp\GetBackgroundJobResults.txt' | Out-String -Width 1000));
}

This code produces a single job result: the output of the background job created from the submitted scriptblock.

The results of this asynchronous process will be displayed in the GUI form under the RichTextBox object:

Note: PowerShell Studio has a “Job Tracker” control set to manage jobs within your GUI.

Summary

This sample code provides a foundation for preparing any Azure task to take advantage of background processing, so the GUI stays responsive while the Azure task runs. Now you can take it to the next level!

You can download the sample Windows GUI here: AzStopVMJob_sample.

Stay tuned for a future blog article where we will discuss SAPIEN’s JobTracker helper functions in the wizard template “Grid Job” form.

Related Articles

Feedback

As always, if you have any ideas, comments, or feedback, please visit our feedback forum and reference this post.

 

Max Trinidad is a Technology Evangelist at SAPIEN Technologies Inc., and a Microsoft PowerShell MVP. You can reach him at maxt@sapien.com

SAPIEN Technologies on Windows PowerShell and PowerShell Core


SAPIEN Technologies’ PrimalScript™ and PowerShell Studio™ products have always supported Windows PowerShell. With the Generally Available (GA) release of the open-source PowerShell Core, we began integrating and supporting this new technology in our products.

The chart below shows the differences between the Windows PowerShell and PowerShell Core products:

We use this as our guide to support both technologies.

Unique Editor

The editors in our SAPIEN products stand on their own and do not follow the same model of other editors. It is not practical to compare a lightweight editor with a rapid application development editor, and we are not suggesting to stop using other editors—they all have their unique purpose and can complement each other in many ways.

We are continuing to integrate PowerShell Core in our PrimalScript and PowerShell Studio editors—two powerful tools developed to help you be more productive.

We will continue to update and improve these tools as PowerShell Core evolves.

What about Windows PowerShell?

Microsoft has stated that Windows PowerShell is “complete”—this does not mean that it is obsolete, nor will it be deprecated anytime soon. Microsoft has invested a lot in this technology, and it is not being relegated to the trash bin.

Another chapter has opened for PowerShell—all future efforts and improvements will be towards PowerShell Core. There are no plans for another version of Windows PowerShell, as there will be “no feature development.”

Check out Jeffrey Snover’s video “State of the Art” about the state of Windows PowerShell.

There will be Windows PowerShell security fixes and customer “blocking” bug fixes only. Report any Windows PowerShell issues in the Windows PowerShell UserVoice forum, and report PowerShell Core issues on Github.

The last version of Windows PowerShell is 5.1.

Summary

As PowerShell Core keeps evolving, our development team will make the necessary changes to our products, and we will also continue to support Windows PowerShell.

Most importantly we value your suggestions and feedback, which help our products be the best in this industry.

References

Feedback

As always, if you have any ideas, comments, or feedback, please visit our feedback forum and reference this post.

 

Max Trinidad is a Technology Evangelist at SAPIEN Technologies Inc., and a Microsoft PowerShell MVP. You can reach him at maxt@sapien.com


New Video – PowerShell Studio: Working with Azure cmdlets in Background Jobs


Our latest video shows two different ways to execute a console AzureRM script as a job using PowerShell Studio 2018.

In this short video we demonstrate how to use the Azure cmdlet Stop-AzureRmVM to stop an Azure VM by submitting the script code as a background job in two different ways:

1. Create the scriptblock containing the Azure cmdlet to be submitted as a job using the Start-Job cmdlet.
2. Create a more elaborate scriptblock containing the Azure cmdlet to be submitted as a job using the -AsJob parameter.

Both jobs can save the status and any output results from the Get-Job and Receive-Job cmdlets.

What’s involved in this process:

  1. Connect to Azure and execute the command.
  2. Use the Start-Job cmdlet to submit the scriptblock as a background job.
  3. Use the Do/While loop to display the job progress.
  4. Use the Receive-Job cmdlet to save the job results.
  5. Use the Get-Job cmdlet to display the saved job results at the end of the process.

We also show the use of adding an argument by passing a parameter to the scriptblock.

View the video here: SAPIEN PowerShell Studio: Working with Azure Cmdlets In Background Jobs

Takeaway

This example provides a scripting framework that is not limited to Azure, and that can also easily be enhanced. It is better to use background jobs when working with Azure commands in a GUI application.

Most importantly, store the code in a scriptblock and submit it with the Start-Job cmdlet. The Receive-Job cmdlet saves the information generated by the job, and the Get-Job cmdlet provides the status of the background job—so if the job doesn’t generate any output, Receive-Job will not return any data.

Sample Code

Feel free to copy/paste the code against your AzureRM VMs.

Check the AzureRM VM Status:

## - Check for VM status:
Get-AzureRmVM -ResourceGroupName 'GlobalAzureBootCampResources' -status `
| Select-Object Name, PowerState;

Run Stop-AzureRMVM without using the -AsJob parameter:

#region RunStop-AzureRMVM_NoAsJobParam

## 1 - Run Stop-AzureRMVM cmdlet without the -AsJob parameter:
$JobName = "StopVmScriptJob";
$JobScript = {

    ## - Sign on to Azure:
    Import-AzureRmContext -Path 'C:\Temp\WinPS_AsubRMprofile.json' | Out-Null;

    ## - Run the Azure cmdlet:
    Stop-AzureRMVM -ResourceGroupName "GlobalAzureBootCampResources" `
        -Name 'Win2K16VM1' `
        -Force -Verbose;
};

## - Step that submits the scriptblock as a background job - ##
$job = Start-Job -Name $JobName -ScriptBlock $JobScript;

## - Loop to check for the background job completion:
"Processing Started - $($JobName) - $((Get-Date).ToString("MM/dd/yyyy - HH:mm:ss"))"
do {
    Start-Sleep -Seconds 10;
    "Processing! - $($JobName) - $((Get-Date).ToString("MM/dd/yyyy - HH:mm:ss"))"
}
while ((Get-Job -Name $JobName).State -ne 'Completed');
"Process Completed - $($JobName) - $((Get-Date).ToString("MM/dd/yyyy - HH:mm:ss"))"

## - Saving job results:
$ReceiveJobStatus1 = Receive-Job -Job $job -Keep;
$GetJobStatus1 = Get-Job;

## - Displaying job results:
$ReceiveJobStatus1
$GetJobStatus1 | Format-List;

#endregion RunStop-AzureRMVM_NoAsJobParam

Run Stop-AzureRMVM using the -AsJob parameter:

#region RunStop-AzureRMVM_WithAsJobParam

## 2 - Run Stop-AzureRMVM cmdlet with the -AsJob parameter:
$JobName = "StopVmCmdletAsJob";
$JobScript = {
    param ($Name)

    ## - Sign on to Azure:
    Import-AzureRmContext -Path 'C:\Temp\WinPS_AsubRMprofile.json' | Out-Null;
    $TaskName = "StopVM1_$($Name)";

    ## - Run the Azure cmdlet with the -AsJob parameter:
    Stop-AzureRMVM -ResourceGroupName "GlobalAzureBootCampResources" `
        -Name $Name `
        -AsJob `
        -Force -Verbose;

    ## - Inside-job loop to check for the cmdlet job completion:
    "`r`nProcess Started - $TaskName - $((Get-Date).ToString("MM/dd/yyyy - HH:mm:ss"))"
    do {
        Start-Sleep -Seconds 10;
        "Processing! $TaskName - $((Get-Date).ToString("MM/dd/yyyy - HH:mm:ss"))"
    }
    while ((Get-Job).State -ne 'Completed');
    "Process Ended - $TaskName - $((Get-Date).ToString("MM/dd/yyyy - HH:mm:ss"))"

    "`r`nGetting Job General Results:"
    $AzGetJobStatus1 = Get-Job
    $AzGetJobStatus1 | Format-List

    "Getting Receive Results:"
    $AzureReceiveJobStatus2 = Receive-Job * -Keep;
    $AzureReceiveJobStatus2 | Format-List
}

## - Step that submits the scriptblock as a background job - ##
$job = Start-Job -Name $JobName -ScriptBlock $JobScript -ArgumentList 'Win2K16VM1';

## - Loop to check for the background job completion:
"`r`nBackground Process Started - $($JobName) - $((Get-Date).ToString("MM/dd/yyyy - HH:mm:ss"))"
do {
    Start-Sleep -Seconds 10;
    "Processing Job - $($JobName) - $((Get-Date).ToString("MM/dd/yyyy - HH:mm:ss"))"
}
while ((Get-Job -Name $JobName).State -ne 'Completed');
"Background Process Completed - $($JobName) - $((Get-Date).ToString("MM/dd/yyyy - HH:mm:ss"))"

## - Saving job results:
$ReceiveJobStatus2 = Receive-Job -Job $job -Keep;
$GetJobStatus2 = Get-Job

## - Displaying job results:
$ReceiveJobStatus2
$GetJobStatus2 | Format-List

#endregion RunStop-AzureRMVM_WithAsJobParam

Summary

Create and test the script in the PowerShell console before implementing it in a GUI application with PowerShell Studio. This sample script will require some minor changes as you build the GUI application. Refer to the related articles below for more details.

Related Articles

Instructional Videos

Learn about other SAPIEN Technologies product features by checking out the videos on our YouTube channel.

Feedback

As always, if you have any ideas, comments, or feedback, please visit our feedback forum and reference this post.

 

Max Trinidad is a Technology Evangelist at SAPIEN Technologies Inc., and a Microsoft PowerShell MVP. You can reach him at maxt@sapien.com

 

New Video – PowerShell Studio: Working with Azure cmdlets in Background Jobs in a GUI app


This is the latest installment in our video series on working with Azure cmdlets in background jobs. In this example we are executing the PowerShell script in a Windows GUI Application.

View the video here: SAPIEN PowerShell Studio: GUI Working with Azure Cmdlets In Background Jobs

Minor Sample Script Changes

As the sample script evolves to execute the job from a Windows GUI application instead of the console, more changes might be needed to interact with the GUI components, depending on the script’s complexity.

The sample script executes the Stop-AzureRMVM cmdlet using the -AsJob parameter which will capture more information about the cmdlet being sent as a job:

:
## - Run Azure cmdlet with AsJob parameter:
Stop-AzureRMVM -ResourceGroupName "GlobalAzureBootCampResources" `
-Name $Name `
-AsJob `
-Force -Verbose;
:

The date/time format will display the processing information in the RichTextBox “$DisplayText” object:

:
$t = Get-Date;
$DisplayText.AppendText("`r`nBackground Process Started - $($JobName) - $(($t).ToString("MM/dd/yyyy - HH:mm:ss"))")
:

Sample GUI Application

The sample GUI Application has two controls: Button, and RichTextBox.

The Button executes the code that will send the jobs to the background, and the RichTextBox is the area that will display the result of the task submitted by the button.

You can download the updated sample Windows GUI here: AzStopVMJob_GUI_sample_asJob.

Sample Code

This is the modified code used in the button control:

$buttonAzStopVMJob_Click = {
    #TODO: Place custom script here
    $buttonAzStopVMJob.Enabled = $false;

    #region RunStop-AzureRMVM_WithAsJobParam
    ## 2 - Run Stop-AzureRMVM cmdlet with the -AsJob parameter:

    ## - Create Scriptblock for the job:
    $JobName = "StopVmCmdletAsJob";
    $JobScript = {
        param ($Name)

        ## - Sign on to Azure:
        Import-AzureRmContext -Path 'C:\Temp\WinPS_AsubRMprofile.json' | Out-Null;
        $TaskName = "StopVM1_$($Name)";

        ## - Run the Azure cmdlet with the -AsJob parameter:
        Stop-AzureRMVM -ResourceGroupName "GlobalAzureBootCampResources" `
            -Name $Name `
            -AsJob `
            -Force -Verbose;

        ## - Inside-job loop to check for the cmdlet job completion:
        $t = Get-Date;
        "`r`nCmdlet Job Process Started - $TaskName - $(($t).ToString("MM/dd/yyyy - HH:mm:ss"))"
        do {
            Start-Sleep -Seconds 20;
            $t = Get-Date;
            "Cmdlet Job Processing! $TaskName - $(($t).ToString("MM/dd/yyyy - HH:mm:ss"))"
        }
        while ((Get-Job).State -ne 'Completed');
        $t = Get-Date;
        "Cmdlet Job Process Ended - $TaskName - $(($t).ToString("MM/dd/yyyy - HH:mm:ss"))"

        "`r`nScriptblock - Getting Get-Job Results:"
        Get-Job | Format-List

        "Scriptblock - Getting Receive-Job Results:"
        Receive-Job * -Keep | Format-List
    }

    ## - Send/Start job in background with a jobname:
    $DisplayText.AppendText("`r`nStart Job to Stop VM!`r`n");

    ## - Step that submits the scriptblock as a background job - ##
    $job = Start-Job -Name $JobName -ScriptBlock $JobScript -ArgumentList 'Win2K16VM1';

    ## - Loop to check for the background job completion:
    $t = Get-Date;
    $DisplayText.AppendText("`r`nBackground Process Started - $($JobName) - $(($t).ToString("MM/dd/yyyy - HH:mm:ss"))")
    do {
        Start-Sleep -Seconds 20;
        $t = Get-Date;
        $DisplayText.AppendText("`r`nProcessing Job - $($JobName) - $(($t).ToString("MM/dd/yyyy - HH:mm:ss"))")
    }
    while ((Get-Job -Name $JobName).State -ne 'Completed');
    $t = Get-Date;
    $DisplayText.AppendText("`r`nBackground Process Completed - $($JobName) - $(($t).ToString("MM/dd/yyyy - HH:mm:ss"))");

    ## - Enable button when background job is done:
    $buttonAzStopVMJob.Enabled = $true;
    $DisplayText.AppendText("`r`nProcess Completed - VM Stopped!`r`n");

    ## - Display the background job results:
    $DisplayText.AppendText("`r`nDisplaying Receive-Job from Stop-AzureRM AsJob Results:`r`n");
    $DisplayText.AppendText((Receive-Job -Job $job -Keep | Out-String -Width 1000));

    ## - Display Azure Stop-AzureRM cmdlet job results:
    $DisplayText.AppendText("`r`nDisplaying Get-Job Main Task Results:`r`n");
    $DisplayText.AppendText((Get-Job | Format-List | Out-String -Width 1000));

    #endregion RunStop-AzureRMVM_WithAsJobParam
}

Feel free to copy/paste this code and modify as necessary.

Summary

The sample script underwent some minor changes to interact with the GUI application. Please reference the related links below for more details.

Stay tuned for a future blog article where we will discuss SAPIEN’s JobTracker helper functions in the wizard template “Grid Job” form.

Related Links

Feedback

As always, if you have any ideas, comments, or feedback, please visit our feedback forum and reference this post.

 

Max Trinidad is a Technology Evangelist at SAPIEN Technologies Inc., and a Microsoft PowerShell MVP. You can reach him at maxt@sapien.com

 

 

PowerShell Studio: Azure Background Jobs in GUI application with Helper Functions


PowerShell Studio provides application templates and helper functions to help you speed up your development process.

The recent blog post series on creating an Azure application shows the evolution of an Azure solution from a console script to a Windows application. In the previous post I used an empty form with two controls (Button & RichTextBox), and then started building from an existing console script. Then I added code with logic to handle the background job process.

Now I’m going to take it a little further with the help of PowerShell Studio’s ready-to-use templates.

Using “Grid Job” Template

I am going to use the GUI Form “Grid Job” template (File > New > New Form > Forms > Grid Job).

This is a PowerShell Studio single-form file which includes some of SAPIEN’s Helper Functions to assist with your GUI application development.

Below is a list of the Helper Functions available when using the “Grid Job” template:

  • Add-JobTracker
  • Update-JobTracker
  • Stop-JobTracker
  • ConvertTo-DataTable
  • Update-DataGridView
  • SearchGrid

In this article I will cover the JobTracker Add-JobTracker helper function and the DataGridView ConvertTo-DataTable and Update-DataGridView helper functions.

First I will change the Button control text property from “Query” to “StartJob“. Then I will add a Label control and change the label text property to “Time:“. Finally, I will save the file as “JobTracker_AzureRM_Sample.psf“.

Understanding JobTracker Function

Before I start to implement my code, let’s take a look at the main function used for background job processing: Add-JobTracker.

Basic usage syntax:
Add-JobTracker [-Name] <String> [-JobScript] <ScriptBlock> [[-ArgumentList] <Object>] [[-CompletedScript] <ScriptBlock>] [[-UpdateScript] <ScriptBlock>] [<CommonParameters>]

The following parameters can be used:

  • Name – This is the name of the background job.
  • JobScript – This is the code to be executed in the background job.
  • CompletedScript – This is the code to execute at the completion of the background job.
  • UpdateScript – This code executes while the background job is running.
  • ArgumentList – (Optional) Only used if there is a need to pass a value to the ScriptBlock.
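Based on the syntax above, a call to Add-JobTracker might look like the following sketch. The scriptblock bodies, the resource group name, and the $labelStatus control are illustrative only and are not part of the actual template:

```powershell
# Hypothetical Add-JobTracker call; the scriptblock contents and
# $labelStatus are illustrative assumptions, not template code.
Add-JobTracker -Name 'StopVMJob' `
    -ArgumentList 'Win2K16VM1' `
    -JobScript {
        param ($VMName)
        # Long-running work runs inside the background job
        Stop-AzureRmVM -ResourceGroupName 'MyResources' -Name $VMName -Force
    } `
    -UpdateScript {
        param ($Job)
        # Runs periodically while the job is still going
        $labelStatus.Text = "Job state: $($Job.State)"
    } `
    -CompletedScript {
        param ($Job)
        # Runs once when the job finishes; collect the output here
        $labelStatus.Text = 'Job completed'
        Receive-Job -Job $Job
    }
```

Because the Update and Completed scriptblocks run on the GUI thread, they can safely update form controls while the JobScript does the heavy lifting in the background.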

The JobTracker functions can be found in a custom Control Set labeled “Button – Start Job” which is part of the “Grid Job” template.

The JobTracker helper functions are highlighted below:

The “Grid Job” template also has a DataGridView component, with helper functions that convert and display the Receive-Job result in a simple format.

Understanding DataGridView Functions

Here’s where I can use the following helper functions:

ConvertTo-DataTable – This function converts the result object into a data table object.
Update-DataGridView – This function reads the data type object to be displayed in the DataGridView component.

Basic usage syntax:
ConvertTo-DataTable [[-InputObject] <Object>] [[-Table] <DataTable>] [-RetainColumns] [-FilterWMIProperties] [<CommonParameters>]

Update-DataGridView [-DataGridView] <DataGridView> [-Item] <Object> [[-DataMember] <String>] `
[[-AutoSizeColumns] {None | ColumnHeader | AllCellsExceptHeader | AllCells | DisplayedCellsExceptHeader | DisplayedCells | Fill} ] [<CommonParameters>]

Array objects created in PowerShell can be displayed in a DataGridView component. The benefit of using a DataTable is that it enables column sorting. This can be accomplished with the following two lines of code:

## - Sample code to display the converted data table results in a DataGridView component:
$dgvResults = ConvertTo-DataTable -InputObject $results -FilterWMIProperties;
Update-DataGridView -DataGridView $datagridviewResults -Item $dgvResults -AutoSizeColumns DisplayedCells;

Note that “$datagridviewResults” is the name of the DataGridView component in the form:

The DataGridView helper functions are highlighted below:

Some Additional Changes

I also made the following additional changes:

In the $formMain_Load form event, I added a global variable for the job name: $global:JobName. For now, this is a hard-coded constant string.

In the $buttonStartJob_Click button event, I added the Timer code which updates the label “Time:” text property with the recurring execution time.

In the Add-JobTracker function I added the “-ArgumentList” parameter to pass the hard-coded system name value. Inside the “-JobScript” script block, this value is received through Param($Argument1).

Note that you can rename this variable to something that describes the purpose of the argument.

Remember, any hard-coded variable is a prospect for automation. It is up to the developer to improve the application.
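For example, the hard-coded VM name could come from a TextBox control on the form instead. This is only a sketch of that idea; the control name $textboxVMName is hypothetical and not part of the sample:

```powershell
## - Hypothetical improvement: read the VM name from a TextBox control
## - instead of hard-coding it ($textboxVMName is an assumed control name):
$vmName = $textboxVMName.Text
if ([string]::IsNullOrWhiteSpace($vmName))
{
	[System.Windows.Forms.MessageBox]::Show('Please enter a VM name.', 'Input Required')
	return
}

Add-JobTracker -Name $global:JobName -ArgumentList $vmName -JobScript {
	Param ($Argument1)
	## - Same job script as before, now receiving the user-supplied name:
	Write-Host "Stopping Azure VM - $Argument1"
}
```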

Working with the Add-JobTracker Function

Next I start adding the code in the Add-JobTracker function. All of this is happening in the $buttonStartJob_Click button event.

The following image shows the areas where the code can be added and follows the process after the ScriptBlock has been submitted to the background for processing:

In this Azure GUI form sample I am using the following Add-JobTracker function parameters in the $buttonStartJob_Click button event:

1. ‘-Name‘ and the ‘-ArgumentList‘:

 Add-JobTracker -Name $global:JobName -ArgumentList 'Win2K16VM1' `...

2. ‘-JobScript‘:

#region AddScriptBlockJob

#--------------------------------------------------
# Add Script Code below
# -------------------------------------------------

## - Automatic Sign-On to Azure:
Import-AzureRmContext -Path 'c:\Temp\WinPS_AsubRMprofile.json' | Out-Null;

Write-Host "Stopping Azure VM - $Argument1";

## - Run Azure cmdlet with AsJob parameter:
Stop-AzureRMVM -ResourceGroupName "GlobalAzureBootCampResources" `
-Name $Argument1 `
-Force;

#--------------------------------------------------
# End of Script Code
# -------------------------------------------------

#endregion AddScriptBlockJob

3. ‘-CompletedScript‘:

#region EndOfJobScript

#--------------------------------------------------
# Add Script Code below
# -------------------------------------------------
 
## - Popup messagebox:
[System.Windows.Forms.MessageBox]::Show('Background Task Completed!','CompletedScript Event Message')
## - CompletedScript will only show data if there are any:
$results = Receive-Job -Job $Job -Keep;

## - Display the Receive-Job results in the DataGridView:
$dgvResults = ConvertTo-DataTable -InputObject $results -FilterWMIProperties;
Update-DataGridView -DataGridView $datagridviewResults -Item $dgvResults -AutoSizeColumns DisplayedCells;

#--------------------------------------------------
# End of Script Code
# -------------------------------------------------

#endregion EndOfJobScript

4. ‘-UpdateScript‘ – There are two added code blocks: “UpdateJobResults” and the “TimerUpdateCode“.

#region UpdateJobResults

#--------------------------------------------------
# Add Script Code below
# -------------------------------------------------

## - Get the current job status to display in the DataGridView:
$results = Get-Job -Name $global:JobName `
| Select-Object -property State, Name, PSBeginTime, PSEndTime, PSJobTypeName, Command, location;

$dgvResults = ConvertTo-DataTable -InputObject $results -FilterWMIProperties;
Update-DataGridView -DataGridView $datagridviewResults -Item $dgvResults -AutoSizeColumns DisplayedCells

#--------------------------------------------------
# End of Script Code
# -------------------------------------------------

#endregion UpdateJobResults
#region TimerUpdateCode

#--------------------------------------------------
#			Script Code Added for timer
# -------------------------------------------------

## - To Display Time Elapsed and update datagridviewGetJob:
$CurrentTime = $global:Time.Elapsed;
$DisplayTime = $([string]::Format("Time: {0:d2}:{1:d2}:{2:d2}",
	       $CurrentTime.hours,
	       $CurrentTime.minutes,
	       $CurrentTime.seconds))
$labelTime.Text = $DisplayTime;

#--------------------------------------------------
#			End of Script Code
# -------------------------------------------------

#endregion TimerUpdateCode

You can use both the ‘-UpdateScript‘ and ‘-CompletedScript‘ parameter sections to update the information in the DataGridView component.

In this Azure GUI form application I am using the Get-Job cmdlet results to display the progress of the background job in the DataGridView component during the “-UpdateScript” process. When the job completes, the “-CompletedScript” section gets the result from the Receive-Job cmdlet that is sent to the DataGridView component.

Summary

This “Grid Job” sample shows how to integrate Azure scripts into a GUI application to take advantage of background processing.

Remember to submit Azure scripts as background jobs to prevent the GUI form from freezing.

This article covered two main topics:
1. Use the JobTracker Add-JobTracker helper function to submit the Azure code as a background job.
2. Use the DataGridView helper functions (ConvertTo-DataTable and Update-DataGridView) to display the results during and after the background process.

I also covered the parameters used in the Add-JobTracker function, which is enough to get you started.

PowerShell Studio templates can cut down development time, especially once you take the time to understand these helper functions.

You can download the updated sample Windows GUI here: JobTracker_AzureRM_Sample.zip
Check out this short two-minute video showing the GUI form in action: Sample_AzureGUI_HelperFunctions

Related Articles

Feedback

As always, if you have any ideas, comments, or feedback, please visit our feedback forum and reference this post.

 

Max Trinidad is a Technology Evangelist at SAPIEN Technologies Inc., and a Microsoft PowerShell MVP. You can reach him at maxt@sapien.com

 

Module Manager – the indispensable new tool for managing PowerShell modules


SAPIEN Technologies recently released a free Preview version of the new Module Manager tool. This tool will make your module management free of headaches!

How many times have you discovered that you are working with an outdated module? Or perhaps you already have a routine of doing a manual, tedious search for the latest releases of your modules. Module Manager can solve these problems and many more.
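For comparison, the manual check that Module Manager automates looks something like the following with the PowerShellGet cmdlets. This is a simplified sketch; it compares stable version numbers only and does not handle prerelease versions:

```powershell
## - Manually compare installed module versions against the gallery:
Get-InstalledModule | ForEach-Object {
	$latest = Find-Module -Name $_.Name -ErrorAction SilentlyContinue
	if ($latest -and ($latest.Version -gt $_.Version))
	{
		## - Emit one record per outdated module:
		[PSCustomObject]@{
			Name             = $_.Name
			InstalledVersion = $_.Version
			LatestVersion    = $latest.Version
		}
	}
}
```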

What Module Manager brings to the table

1. Add existing repositories (internal or external).

2. Search repositories for module information.

3. Visually identify modules that have an update available.

4. Display information about installed modules.

5. Manage installed modules with options like Update, Disable, and Uninstall.

As you can see, this new tool will give you the ability to keep your modules up-to-date and more. Download the Module Manager preview, and let us know what you think!

Free Preview for a Limited Time

Module Manager is available for a time-limited community preview, which expires on December 1, 2018. Module Manager is currently available in 64-bit and only for Windows PowerShell 5.0.

Download preview here: Download Module Manager (64 bit)

Related Articles

Feedback

As always, if you have any ideas, comments, or feedback, please visit our feedback forum and reference this post.

 

Max Trinidad is a Technology Evangelist at SAPIEN Technologies Inc., and a Microsoft PowerShell MVP. You can reach him at maxt@sapien.com

 

PowerShell Studio: Working with Remote Systems (Part 1)


This is the first post in a blog series about working with remote systems using PowerShell Studio.

The following topics are covered:
Part 1 – Caching PowerShell modules from remote systems
Part 2 – PowerShell remoting

In this first installment I will show you how to use SAPIEN Technologies’ Cache Export tool to collect module information from a remote system, and then use PowerShell Studio to create scripting solutions by using the exported cache.

Part 1 – Caching PowerShell Modules from Remote Systems

The ability to access modules residing on remote systems allows us to create PowerShell solutions that span infrastructure and DevOps environments. PowerShell Studio and PrimalScript include components, such as “CacheExport.exe”, to help you access modules on remote systems.

Cache Export Executable

PowerShell Studio and PrimalScript include a component that will collect the module information from the remote system—”CacheExport.exe“. This program will extract module information into a file which can be accessed from the Cache Editor feature in PowerShell Studio and PrimalScript. The file “CacheExport2018.zip” contains the cache export executable.

CacheExport2018.zip Location

“CacheExport2018.zip” is located in the SAPIEN Technologies products installation folder under “Redistributables“:

PowerShell Studio file location:

C:\Program Files\SAPIEN Technologies, Inc\PowerShell Studio 2018\Redistributables

 

PrimalScript file location:

C:\Program Files\SAPIEN Technologies, Inc\PrimalScript 2018\Redistributables

 

Building the Cache Export File from a Remote System

The files in “CacheExport2018.zip” need to be extracted on the server where you are going to create the Export Cache file.  The steps outlined below are my preferred method because I don’t have to copy over the executable or leave it on the server.

 

Use Shared Folder

  1. Create a shared folder locally.
  2. Copy “CacheExport2018.zip” to the shared folder and extract the files, which will create the “CacheExport2018” folder containing the executable “CacheExport.exe“.
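The two steps above can be scripted from an elevated PowerShell session. The share name and folder paths below are examples only; adjust them to your environment:

```powershell
## - 1. Create a local folder and share it (example names/paths):
New-Item -Path 'C:\CacheShare' -ItemType Directory -Force | Out-Null
New-SmbShare -Name 'CacheShare' -Path 'C:\CacheShare' -FullAccess "$env:USERDOMAIN\$env:USERNAME"

## - 2. Copy the zip into the share and extract it, producing the
## -    CacheExport2018 folder that contains CacheExport.exe:
Copy-Item -Path 'C:\Program Files\SAPIEN Technologies, Inc\PowerShell Studio 2018\Redistributables\CacheExport2018.zip' -Destination 'C:\CacheShare'
Expand-Archive -Path 'C:\CacheShare\CacheExport2018.zip' -DestinationPath 'C:\CacheShare'
```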

Now we can proceed to connect to this shared folder from the remote system.

 

Connect to the Shared Folder from the Remote System

  1. Open a Remote Desktop session to the remote system.
  2. Use PowerShell or File Explorer to map a drive (x:) to the shared folder that contains the Cache Export executable.
  3. Navigate to the attached shared folder (e.g.,  x:\CacheExport2018).
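Step 2 above can be done in PowerShell with New-PSDrive. The UNC path below is an example; replace it with the path to your own shared folder:

```powershell
## - Map the X: drive to the shared folder that holds CacheExport.exe
## - ('\\MyWorkstation\CacheShare' is an example UNC path):
New-PSDrive -Name 'X' -PSProvider FileSystem -Root '\\MyWorkstation\CacheShare' -Persist
Set-Location -Path 'X:\CacheExport2018'
```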

 

Build the Cache File

  1. Run “CacheExport.exe” to build the cache file:
    • If using PowerShell Console, type the program name CacheExport <Tab>, then press <Enter> to execute.
    • If using File Explorer, just double-click on the file name “CacheExport.exe“.
  2. In the Cache Export pop-up window, set the “Export cache files to:” path to the shared folder.
  3. Select Rebuild All Cache Files to build the export file.

At the end of this process, select Save to store the cache file in the shared folder.

Reminder: This folder resides where you will use PowerShell Studio (or PrimalScript) to access the exported cache modules.

 

IMPORTANT: The Cache Export file naming format is “<servername>.CacheExport”. Please do not rename this file.

 

Load the Exported Cached Modules

The “<servername>.CacheExport” file has been created from the remote system. Now we can access the file from PowerShell Studio. On the Home ribbon > in the Platform section > select Import Remote Cache:

Navigate to the “<servername>.CacheExport” file and click Open:

 

In the Import Remote Cache dialog, select the Using Windows PowerShell Remoting checkbox > enter the user credentials > then click OK:

 

Access the Exported Cached Modules

In the Platform section,  change the selection from “Local Machine” to the remote system name (e.g., “SUN“):

After selecting the remote system name, the Object Browser panel will refresh and display the Remote System modules as well as any installed snap-ins*:

*Note: PowerShell snap-ins are not supported in PowerShell versions newer than v5.

 

Now all of the remote system modules are accessible from the PowerShell Studio editor.

You can also use the PowerShell Cache Editor to select/deselect modules from the remote system (in the Platform section > select the desired machine > then click the Edit Cache button):

*Note: When selecting a “Computer” from the list in the PowerShell Cache Editor, all checked modules belong to the selected computer.

 

In our next blog post I will cover “PowerShell Studio: Using PowerShell Remote“.

Related Articles

Feedback

As always, if you have any ideas, comments, or feedback, please visit our feedback forum and reference this post.

 

Max Trinidad is a Technology Evangelist at SAPIEN Technologies Inc., and a Microsoft PowerShell MVP. You can reach him at maxt@sapien.com

 
