Taking PowerShell Code Under Control
Optional introduction
Development in any programming language is inextricably linked with the problem of managing the accumulated codebase. The larger the codebase and the number of people working on it, the more important it is to adhere to common coding standards. And as the number of consumers of the developed solutions grows and more processes are covered by automation, so does the need to ensure those solutions run reliably. This leads us to the well-known practices of continuous integration and continuous delivery (CI/CD).
PowerShell is often used in teams exclusively for one-off admin tasks, ad hoc scenarios – in that situation the problems of software development are obviously not relevant: open a terminal, run the script, forget it. But it so happened that in our team the CI/CD pipelines of regular projects are built precisely on PowerShell scripts. Yes, everything runs on TeamCity; it can clone sources from a monitored repository and even ships a set of template steps that require no programming to configure. Convenient templates really do help in putting together a basic minimal pipeline, but they don’t make it any easier to build the sprawling scenarios typical of large teams with a developed imagination, especially if you have to build something not quite standard. For example, our TeamCity build configuration for assembling a pull request of a T-SQL project has 70 steps and 180 parameters – no template catalogue is that big. Most of those steps are, of course, custom developments.
Initially I was the only one having fun with this entertaining script writing; over time, several dozen scripts accumulated. “Suddenly” it turned out that the design standard that supposedly existed in my head could be observed only within a single script, and only approximately. The next script, written on an odd day of the week without a double portion of coffee first, for some reason ended up all in PascalCase instead of camelCase. A similar story happened with every other aspect of style. The “IDE” called Notepad++ did not exactly offer any assistance in these matters, and periodically poking scripts in the terminal could hardly be called testing.
It is worth noting that I approached the first task of automating work with some API in PowerShell in a state of “OK Google, what does the entity I need to work with look like?” and “by the way, Google, give me something on PowerShell basic syntax, PowerShell quickstart”. The specialists in my immediate environment were much the same. Moreover, intensive googling showed that in the PowerShell world one-off scripts predominate; almost no one tries to build anything like an adult SDLC on top of it, or at least is in no hurry to share the experience. So the process was built up step by step, and some decisions may seem unexpected – there is still not much to compare it to.
When there were several of us active script writers, the problems closely tied to the absence of a normal development process began, let’s say, to sparkle with brighter colors, and we all went looking for solutions to the accumulated issues together.
Development environment
It became clear quite quickly that the most sensible option was VS Code: it has an excellent PowerShell extension that highlights syntax and provides code completion.
Somewhat similar functionality is provided by PowerShell ISE. However, it has always been something of a stand-alone tool, and colleagues who liked this environment ran into several unfixable situations where perfectly unremarkable code worked everywhere except ISE. Those who were used to it had to give it up. Today the main page of the ISE documentation announces that you should use VS Code with the PowerShell extension instead.
This and other extensions can be added to the workspace recommendations in the *.code-workspace file, and the file committed to the repository. After cloning the repository, a developer can open this workspace in VS Code and immediately get recommendations on which extensions to install for comfortable work.
"extensions": {
"recommendations": [
"ms-vscode.powershell",
"pspester.pester-test",
"ms-vscode.test-adapter-converter",
"hbenl.vscode-test-explorer"
]
}
Formatting and Linting
The VS Code extension bundles PSScriptAnalyzer, so the code is not only highlighted but also automatically formatted and linted. There are not many rules in PSScriptAnalyzer, but the project is alive, and something new is occasionally added.
Something funny: this linter has a concept of “dangerous verbs”. If you, for example, name a function Delete-Something, the linter immediately and persistently recommends adding the ShouldProcess attribute, so that during debugging the dangerous action can be run as a dry run, without actually deleting anything or affecting the environment.
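A minimal sketch of what the analyzer nudges you towards (the function name and body here are invented for illustration):
function Remove-StaleArtifact {
    # SupportsShouldProcess adds the -WhatIf and -Confirm common parameters.
    [CmdletBinding(SupportsShouldProcess)]
    param (
        [Parameter(Mandatory)]
        [string]
        $Path
    )

    # Under -WhatIf, ShouldProcess returns $false, so the destructive call is skipped.
    if ($PSCmdlet.ShouldProcess($Path, 'Remove')) {
        Remove-Item -Path $Path -Force
    }
}

# Dry run: reports what would be removed without touching anything.
Remove-StaleArtifact -Path ./build/stale.zip -WhatIf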
PSScriptAnalyzer can also be used outside the editor, that is, not only on the developer’s side but also in the CI pipeline, to enforce the enabled rules. The settings file is the same one used by VS Code.
@{
    Rules = @{
        PSAvoidUsingCmdletAliases = @{
            Whitelist = @('%', '?')
        }
        PSAvoidSemicolonsAsLineTerminators = @{
            Enable = $true
        }
        PSUseCorrectCasing = @{
            Enable = $false # too slow
        }
    }
    ExcludeRules = @(
        'PSAvoidUsingWriteHost',
        'PSAvoidUsingInvokeExpression',
        'PSUseDeclaredVarsMoreThanAssignments',
        'PSUseApprovedVerbs',
        'PSReviewUnusedParameter',
        'PSAvoidUsingPlainTextForPassword',
        'PSAvoidUsingConvertToSecureStringWithPlainText')
}
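On the CI side, the check boils down to a couple of lines; a sketch, assuming the settings file is committed as .vscode/PSScriptAnalyzerSettings.psd1 (the same path the editor uses):
# Lint the whole checkout with the shared settings; fail the build on any finding.
$issues = Invoke-ScriptAnalyzer -Path . -Recurse -Settings ./.vscode/PSScriptAnalyzerSettings.psd1
if ($issues) {
    $issues | Format-Table RuleName, ScriptName, Line, Message -AutoSize
    exit 1
}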
The path to this configuration file and other extension parameters are configured quite transparently:
"[powershell]": {
"editor.tabSize": 4,
"editor.defaultFormatter": "ms-vscode.powershell"
},
"powershell.codeFormatting.useCorrectCasing": true,
"powershell.codeFormatting.whitespaceBetweenParameters": true,
"powershell.integratedConsole.suppressStartupBanner": true,
"powershell.integratedConsole.showOnStartup": false,
"powershell.scriptAnalysis.settingsPath": "./.vscode/PSScriptAnalyzerSettings.psd1",
"powershell.codeFormatting.pipelineIndentationStyle": "IncreaseIndentationForFirstPipeline",
"powershell.developer.editorServicesLogLevel": "Warning",
This is a fragment of the same *.code-workspace file that is committed, and thus the basic settings are synchronized for all developers.
Conventions
The PowerShell language developers and active evangelists offer plenty of coding advice, including:
richly describe function parameters;
specify the return type;
use Pascal-Kebab-Case for naming functions;
for the first word in such names, pick a verb from the approved list (Get-Verb);
name incoming parameters in PascalCase and local variables in camelCase;
format the function body in begin-process-end style, even if the function is not intended for pipeline use;
do not end statements with a semicolon.
There is a lot of good advice, but there are also plenty of ambiguous ideas.
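For illustration, a small function that follows the recommendations above (the name and parameters are made up):
function Get-BuildLabel {
    [CmdletBinding()]
    [OutputType([string])]
    param (
        # Incoming parameters in PascalCase, each attribute on its own line.
        [Parameter(Mandatory, ValueFromPipeline)]
        [string]
        $BranchName,

        [Parameter()]
        [int]
        $BuildNumber = 0
    )

    begin {
        # Local variables in camelCase.
        $labelPrefix = 'build'
    }

    process {
        "$labelPrefix/$BranchName/$BuildNumber"
    }

    end {
    }
}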
In addition to these recommendations, our team has recorded the following agreements:
the main body of a script intended to be called from the command line is always enclosed in a block (function) that we call Main;
immediately after declaring the script parameters, a block sets the basic options:
strict mode (Set-StrictMode), which raises an error on access to, say, an undeclared variable;
any error must stop execution;
output encoding is UTF-8, otherwise Cyrillic text breaks;
When saving files, we always explicitly indicate the encoding;
we pass parameters by name, not by position (except for the most basic functions adapted for working in a pipe);
We write a description for scripts and their parameters; where we missed it, we try to fill it in;
when declaring parameters, we write each attribute on a separate line, and the parameter name on its own line too, otherwise the names get lost off to the right;
in scripts called directly from build steps or the CLI, we first print all input parameter values via Write-Verbose – this helps enormously when debugging CI/CD scripts.
And a few other things. The linter cannot enforce any of this, so it is left to code review. Snippets help keep the essentials in mind.
Snippet for a new PowerShell script
{
"Init new script": {
"scope": "ps1,powershell",
"prefix": "posh-snippet-script",
"description": "Init cmdlet script",
"body": [
"<#",
".SYNOPSIS",
" tbd",
"",
".PARAMETER $1",
" tbd",
"",
".PARAMETER $2",
" tbd",
"",
".EXAMPLE",
" ./$TM_FILENAME -$1 foo -$2 bar",
"",
" Description",
" -----------",
" tbd",
"",
".NOTES",
"Version: 1.0",
"Author: ?",
"Creation Date: $CURRENT_YEAR-$CURRENT_MONTH-$CURRENT_DATE",
"Original name: $TM_FILENAME",
"#>",
"#Requires -Version 5.1",
"[CmdletBinding()]",
"param (",
" [Parameter(Mandatory)]",
" [string]",
" $$1,",
"",
" [Parameter(Mandatory)]",
" [string]",
" $$2",
")",
"",
"Set-StrictMode -Version 3.0",
"\\$ErrorActionPreference="Stop"",
"\\$PSDefaultParameterValues = @{ '*:Encoding' = 'utf8' }",
"[Console]::OutputEncoding = [System.Text.Encoding]::UTF8",
"",
"",
"function Main {",
" [CmdletBinding()]",
" param (",
" [Parameter(Mandatory)]",
" [string]",
" $$1,",
"",
" [Parameter(Mandatory)]",
" [string]",
" $$2",
" )",
"",
" begin {",
" . \"$$PSScriptRoot/../modules/env/output_lib.ps1\"",
"",
" Write-VerboseParam -invocation $MyInvocation",
"",
" }",
"",
" process {",
" # tbd",
" }",
"",
" end {",
" # tbd",
" }",
"",
"}",
"",
"Main `",
" -$1 $$1 `",
" -$2 $$2",
""
]
},
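The snippet dot-sources output_lib.ps1 and calls Write-VerboseParam, which dumps all bound parameters of the calling script to the verbose stream. A minimal version of such a helper might look like this (a sketch, not the exact implementation):
function Write-VerboseParam {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)]
        [System.Management.Automation.InvocationInfo]
        $invocation
    )

    # Print every bound parameter of the caller to the verbose stream.
    foreach ($item in $invocation.BoundParameters.GetEnumerator()) {
        Write-Verbose "PARAM $($item.Key) = $($item.Value)"
    }
}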
Unit tests
It is quite possible to cover PowerShell code with tests – there is a tool for that: Pester. It can be launched in the CI pipeline by a script itself written in PowerShell, or plugged into VS Code.
Tests integrate into VS Code beautifully: test cases are detected automatically, execution progress icons spin like crazy, and errors are displayed in the terminal tab.
Debugging options are the same as in other IDEs. You can set breakpoints both in the test and in the code under test; local variables and their values are displayed; Step Into and Step Over work – just like in the best houses of Paris.
Pester tests consist of standard blocks. To make writing the basic boilerplate easier, you can prepare a snippet.
Snippet for a new script with tests on the Pester framework
"Unit-test file sceleton": {
"scope": "ps1,powershell",
"prefix": "posh-snippet-test",
"body": [
"BeforeAll {",
" . \"$$PSScriptRoot/../../src/testing/expand_testdrive.ps1\"",
" . \"$$PSScriptRoot/../../src/testing/mocks.ps1\"",
"",
" $$cmd = \"$$PSScriptRoot/../../src/${TM_DIRECTORY/^.+[\\\\/\\\\]+(.+)$/$1/gi}/${TM_FILENAME_BASE/^(.+)[.]tests$/$1/gi}.ps1\"",
"}",
"",
"",
"Describe 'tbd' {",
" BeforeAll {",
" #tbd",
" $$params = @{",
" arg1 = val1",
" }",
" }",
"",
" It 'tbd' {",
" #tbd",
" & $$cmd @params",
" $$false | Should -BeTrue",
" }",
"",
" AfterEach {",
" if (Test-Path $$params.outputFile -PathType Leaf) {",
" Remove-Item $$params.outputFile -Force | Out-Null",
" }",
" }",
"}",
""
]
}
}
The code in the repository is organized into semantic folders, and the tests live in a separate tests folder, inside a semantic subfolder with the same name as the one containing the script under test. That is why the snippet’s imports contain the double /../../ climb to the top of the tree, with the name of the innermost directory substituted from the path of the current file.
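For example, with an illustrative layout like this, the snippet resolves $cmd from tests/rest/some_api.tests.ps1 to src/rest/some_api.ps1:
repo-root/
    src/
        rest/
            some_api.ps1
    tests/
        rest/
            some_api.tests.ps1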
The snippets file can be committed and shared with everyone. For the snippets and the PSScriptAnalyzer settings, the natural place is the .vscode folder at the root of the repository.
Continuous Integration
Linting with PSScriptAnalyzer and running tests on the Pester framework are built into our CI pipeline. In every pull request the code is linted and the tests are run. The output of both tools can be converted into a format TeamCity understands, just like for any other tool, so in every build we see how many tests ran, which of them failed, and the computed coverage. If PSScriptAnalyzer complains about anything, we get a broken build, just as with failed tests.
Pester saves coverage information in the JaCoCo format and can emit the test results as an NUnit report. To help TeamCity understand what it is being told, the following minimum is needed:
Import-Module Pester

$cfg = New-PesterConfiguration
$cfg.Run.PassThru = $true  # otherwise Invoke-Pester returns nothing to pipe on
# configure $cfg further here: paths, code coverage, and so on

Invoke-Pester -Configuration $cfg | ConvertTo-NUnitReport -AsString |
    Out-File testResults.xml
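With the NUnit XML saved to a file, it can be handed to TeamCity via a service message (the file name is whatever you chose above):
Write-Host "##teamcity[importData type='nunit' path='testResults.xml']"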
In this case, the coverage percentages for the highlighting shown in the screenshot still have to be extracted separately. To avoid doing that entirely by hand, you can pick from ready-made converters, for example ReportGenerator by Daniel Palme: from the original JaCoCo it can produce both a format for Sonar and a summary for TeamCity. PSScriptAnalyzer likewise returns whatever it pleases, so its output has to be converted too.
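A sketch of such a conversion for PSScriptAnalyzer, reporting each finding as a TeamCity build problem (the escaping covers the characters TeamCity service messages require):
$issues = Invoke-ScriptAnalyzer -Path . -Recurse -Settings ./.vscode/PSScriptAnalyzerSettings.psd1
foreach ($issue in $issues) {
    $text = "$($issue.ScriptName):$($issue.Line) [$($issue.RuleName)] $($issue.Message)"
    # TeamCity requires |, ', [ and ] to be escaped; the pipe must go first.
    $text = $text -replace '\|', '||' -replace "'", "|'" -replace '\[', '|[' -replace '\]', '|]'
    Write-Host "##teamcity[buildProblem description='$text']"
}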
So, we have converted everything for SonarQube and can upload the resulting data there as third-party reports. The minimal pipeline thus includes four steps:
linting using PSScriptAnalyzer
running tests using Pester
converting one and the other into formats for TeamCity and SonarQube
delivery to SonarQube
hi Sonar, the TeamCity step speaking
%SONAR_SCANNER_PATH% ^
-Dsonar.sources=. ^
-Dsonar.tests=tests/ ^
-Dsonar.projectKey=my_proj ^
-Dsonar.host.url=%SONAR_ROOT_URL% ^
-Dsonar.login=%SONAR_TOKEN% ^
-Dsonar.sourceEncoding=UTF-8 ^
-Dsonar.inclusions=**/*.ps1,config/**/*.json ^
-Dsonar.exclusions=**/*.sql,**/*.sln,**/*.sqlproj,**/*.xml,**/*.txt,**/*.md,tests/**/*.* ^
-Dsonar.cpd.exclusions=tests/**/*.*,rest/*_api.ps1 ^
-Dsonar.coverage.exclusions=rest/*_api.ps1 ^
-Dsonar.coverageReportPaths=%teamcity.build.checkoutDir%\%ARTIFACT_FOLDER_NAME%\SonarQube.xml ^
-Dsonar.projectVersion=%SONAR_VERSION_NUMBER% %SONAR_SCAN_EXCLUSIONS% %SONAR_SCAN_PR_PARAMS%
something like this…
On the Sonar side, a Quality Profile is configured, where you can enable or disable a particular rule or change the severity of its violation, and a Quality Gate, which defines the set of conditions for recognizing a pull request as having passed the quality gate or failed it.
The default QG contains these conditions: the changes must be at least 80% covered by tests, code duplication must not exceed 3%, the build and linting must not have found anything serious, and there must not be too much of the minor stuff either. The default values are sufficient for us.
Sonar not only enforces the QG but is also very useful as a single place where information about the codebase accumulates.
Here you can see that 15 thousand lines of PowerShell code need to be covered and the current coverage is 76%. There are 60 thousand lines in total, of which very few look like outright copy-paste. And this is how the coverage percentage has changed over the last year:
Pitfalls
Integration with VS Code did not fully work for everyone who tried it.
After some time, I lost the ability to run all tests at once or a large number of tests, and it’s completely unclear how to get it back.
Coverage is not visible in VS Code; we find out about it only after the CI build.
Running tests with coverage collection takes much longer than without it. The time is tolerable overall, but not great: a neighboring C# project has about two thousand tests that run in a minute and a half, while the PowerShell repository has half as many tests and the run time already approaches twenty minutes.
There are very few rules in PSScriptAnalyzer; the integration with CI and the upload to SonarQube, while obviously useful, do not let us claim the code is truly under control – a lot has to be left to manual review.
Either Pester, or its VS Code wrapper, or PowerShell itself caches something somewhere and starts confusing its readings: intensive work on tests has to be accompanied by periodic restarts of VS Code, because some internal mechanics break down. Running tests in that state may lead nowhere, and the result can differ between debug and non-debug runs. It shows up, for example, as everything supposedly passing locally while CI reveals the tests are broken.
Additional converters of analyzer findings and test coverage into formats for TeamCity, GitLab, and SonarQube are needed, and at least some of them have to be written by hand.
All in all, this set of tools is workable, and PowerShell development becomes a little more like what is considered a normal, decent process. Our script repository currently holds 60K lines of PowerShell code, including test code. Without extra tooling and automation, relying only on eyes and hands, keeping this volume under control would be impossible.