Category Archives: Powershell

Monitoring with PowerShell: Monitoring log on of specific users.

Hi guys, this'll be the last blog before I go on holiday, so enjoy it and see you all in two weeks.

This time we're going to talk about monitoring the logon of specific users. We use named accounts for all our engineers and want to alert if an unnamed account has been logged onto in an interactive session. To do this, we'll use the WMI class Win32_LoggedOnUser.

Just as an extra disclaimer: please remember that in co-managed environments, or environments that belong to others, you won't always be able to perform all best practices. This script will mostly be used to monitor those messy environments and give you a little extra sense of security. My personal advice would always be to disable accounts that are no longer allowed to log in, use managed service accounts without interactive permissions for services, and delete the accounts of ex-employees directly after they leave the company.

The script

Let’s get started on the script. First we’ll have to define which accounts we do not want to have in interactive sessions:

$ForbiddenList = @("Cyberdrain","cyber","migration","administrator","admin","service-QuickBooks*","svc-QB*","ExEmployee1")

In this list we name all accounts that are forbidden; you can add any user you would like. We manage a lot of servers, and sometimes after an engineer leaves our company, servers are still logged in with the account that we've disabled or deleted. With this script we also catch those situations and can log the deleted user out of all servers.

The next step is getting a list of the active users and comparing them:

$ActiveUsers = (Get-CimInstance Win32_LoggedOnUser).antecedent | Select-Object -Unique | Where-Object {$_.name -in $ForbiddenList}

Here we get all users that are currently logged on to the machine and compare them to our forbidden list; the $ActiveUsers variable only gets filled if there is a match.

if(!$ActiveUsers){$ActiveUsers = 'false'}

This last line says that if $ActiveUsers is empty, meaning no logged-in users were found that are on our list, it will report "false". The complete script is just three lines and can be found below.

$ForbiddenList = @("Cyberdrain","cyber","migration","administrator","admin","service-QuickBooks*","svc-QB*","ExEmployee1")
$ActiveUsers = (Get-CimInstance Win32_LoggedOnUser).antecedent | Select-Object -Unique | Where-Object {$_.name -in $ForbiddenList}
if(!$ActiveUsers){$ActiveUsers = 'false'}
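One thing to be aware of: the list contains wildcard entries such as "service-QuickBooks*" and "svc-QB*", and the -in operator only does exact matching, so those entries will never trigger. If you want wildcard support, a small variant (my own adaptation, not part of the original set) that tests every active user against each pattern with -like looks like this:

#Wildcard-aware variant: a user matches when any pattern in $ForbiddenList matches with -like
$ActiveUsers = (Get-CimInstance Win32_LoggedOnUser).Antecedent | Select-Object -Unique |
    Where-Object { $User = $_.Name; $ForbiddenList | Where-Object { $User -like $_ } }
if(!$ActiveUsers){$ActiveUsers = 'false'}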

And that’s it. Some monitoring for situations you do not want to end up in. Remember to always follow security best practices first. Only use these scripts as an early warning system that someone, somewhere has made a mistake. 🙂 As always, Happy PowerShelling!

Documenting with PowerShell: Chapter 2 – Documenting Bitlocker keys

Our RMM system currently does not have support for securely storing the Bitlocker key inside the RMM system itself. I've subscribed to the school of Bitlockering everything that passes through my company, including computers that are never connected to Azure AD or Active Directory where the key could be stored. We also get users that have lost the USB drive or piece of paper that the key was stored on.

As we use a documentation system (IT-Glue) to store all our passwords, I figured why not try to also store our Bitlocker keys there, while tagging the device too so we can always find which device belongs to which key easily.

First, for the non-IT-Glue users, I'll generate an HTML file. With some small adaptation you can upload this to Confluence, ITBoost, or any other system you use. After that example, we'll get onto IT-Glue again. So let's get started!

Base script

The base script is the part that captures the data we want, in our case the Bitlocker key, and outputs it to an HTML file at C:\Temp\Temp.html. You can use this script however you'd like.

$BitlockVolumes = Get-BitLockerVolume
#Some HTML to make the page pretty.
$head = @"
<script>
function myFunction() {
    const filter = document.querySelector('#myInput').value.toUpperCase();
    const trs = document.querySelectorAll('table tr:not(.header)');
    trs.forEach(tr => tr.style.display = [...tr.children].find(td => td.innerHTML.toUpperCase().includes(filter)) ? '' : 'none');
  }</script>
<title>Audit Log Report</title>
<style>
body { background-color:#E5E4E2;
      font-family:Monospace;
      font-size:10pt; }
td, th { border:0px solid black; 
        border-collapse:collapse;
        white-space:pre; }
th { color:white;
    background-color:black; }
table, tr, td, th {
     padding: 2px; 
     margin: 0px;
     white-space:pre; }
tr:nth-child(odd) {background-color: lightgray}
table { width:95%;margin-left:5px; margin-bottom:20px; }
h2 {
font-family:Tahoma;
color:#6D7B8D;
}
.footer 
{ color:green; 
 margin-left:10px; 
 font-family:Tahoma;
 font-size:8pt;
 font-style:italic;
}
#myInput {
  background-image: url('https://www.w3schools.com/css/searchicon.png'); /* Add a search icon to input */
  background-position: 10px 12px; /* Position the search icon */
  background-repeat: no-repeat; /* Do not repeat the icon image */
  width: 50%; /* Full-width */
  font-size: 16px; /* Increase font-size */
  padding: 12px 20px 12px 40px; /* Add some padding */
  border: 1px solid #ddd; /* Add a grey border */
  margin-bottom: 12px; /* Add some space below the input */
}
</style>
"@

foreach($BitlockVolume in $BitlockVolumes) {
$HTMLTop = @"
    <h1>Bitlocker Information</h1>
    <b>Computername: </b>$($BitlockVolume.ComputerName)<br>
    <b>Encryption Method:</b>$($BitlockVolume.EncryptionMethod)<br>
    <b>Volume Type:</b>$($BitlockVolume.VolumeType)<br>
    <b>Volume Status:</b>$($BitlockVolume.VolumeStatus)<br>
"@
$HTML += $BitlockVolume.KeyProtector | convertto-html -Head $head -PreContent "$HTMLTop <br> <h1>Keys for $($ENV:COMPUTERNAME) - $($BitlockVolume.Mountpoint)</h1>"
}
if (!(Test-Path "C:\Temp")) { New-Item "C:\Temp" -ItemType Directory | Out-Null }
$html | Out-File C:\Temp\temp.html

Now, that's cool. This gives us a good ol' HTML file. We now have a choice: use the previous script found here and adapt it to upload the file to IT-Glue as a flexible asset, or upload the key as an embedded password and tag the correct device. The latter sounds cooler to me!

This script looks for a configuration in your IT-Glue database based on the computer's serial number. If it finds a match, it uploads the Bitlocker key as an embedded password with the name "COMPUTERNAME – DRIVE:"; for my computer that would be "DESKTOP-U3984 – C:". We name them this way so that each drive's key is uploaded separately, even if the hostname changes over time.
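Before running the upload you can quickly check locally which serial number will be used for matching and which recovery password will be stored; this little check (not part of the upload script itself) mirrors the lookups the script performs:

#Local sanity check: the serial number used for matching and the recovery password(s) that will be uploaded
$SerialNumber = (Get-CimInstance Win32_BIOS).SerialNumber
$RecoveryKeys = (Get-BitLockerVolume).KeyProtector | Where-Object { $_.KeyProtectorType -eq 'RecoveryPassword' }
Write-Host "Serial number: $SerialNumber"
$RecoveryKeys | ForEach-Object { Write-Host "Recovery password: $($_.RecoveryPassword)" }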

IT-Glue script

#####################################################################
$APIKEy =  "APIKEYHERE"
$APIEndpoint = "https://api.eu.itglue.com"
$orgID = "ORGIDHERE"
#####################################################################
#Grabbing ITGlue Module and installing,etc
If(Get-Module -ListAvailable -Name "ITGlueAPI") {Import-module ITGlueAPI} Else { install-module ITGlueAPI -Force; import-module ITGlueAPI}
#Settings IT-Glue logon information
Add-ITGlueBaseURI -base_uri $APIEndpoint
Add-ITGlueAPIKey $APIKEy
#This is the data we'll be sending to IT-Glue. 
$BitlockVolumes = Get-BitLockerVolume
#The script uses the following line to find the correct asset by serialnumber, match it, and connect it if found. Don't want it to tag at all? Comment it out by adding #
$TaggedResource = (Get-ITGlueConfigurations -organization_id $orgID -filter_serial_number (get-ciminstance win32_bios).serialnumber).data
foreach($BitlockVolume in $BitlockVolumes) {
$PasswordObjectName = "$($Env:COMPUTERNAME) - $($BitlockVolume.MountPoint)"
$PasswordObject = @{
    type = 'passwords'
    attributes = @{
            name = $PasswordObjectName
            #Pick the recovery password protector explicitly instead of relying on its position in the KeyProtector array
            password = ($BitlockVolume.KeyProtector | Where-Object { $_.KeyProtectorType -eq 'RecoveryPassword' } | Select-Object -First 1).RecoveryPassword
            notes = "Bitlocker key for $($Env:COMPUTERNAME)"

    }
}
if($TaggedResource){ 
    $Passwordobject.attributes.Add("resource_id",$TaggedResource.Id)
    $Passwordobject.attributes.Add("resource_type","Configuration")
}

#Now we'll check if it already exists, if not. We'll create a new one.
$ExistingPasswordAsset = (Get-ITGluePasswords -filter_organization_id $orgID -filter_name $PasswordObjectName).data
#If the Asset does not exist, we edit the body to be in the form of a new asset, if not, we just upload.
if(!$ExistingPasswordAsset){
Write-Host "Creating new Bitlocker Password" -ForegroundColor yellow
$ITGNewPassword = New-ITGluePasswords -organization_id $orgID -data $PasswordObject
} else {
Write-Host "Updating Bitlocker Password" -ForegroundColor Yellow
$ITGNewPassword = Set-ITGluePasswords -id $ExistingPasswordAsset.id -data $PasswordObject
}
}

This script can also be found as an AMP file here. That's it! As always, happy PowerShelling!

IT-Glue unofficial backup script.

The last couple of weeks I've been focused on some API efforts for IT-Glue, and a couple of partners asked if I could solve the problem that IT-Glue does not have a backup feature available. This makes it difficult to move away from the product, but also to access your data in case of an emergency.

So to make sure that IT-Glue partners get some form of data portability and backups, I've created the following script. The script connects to the IT-Glue API and creates an HTML and CSV export of all Flexible Assets and Passwords in IT-Glue.

It does not currently back up documents, as those are not exposed by the API. It also does not download attachments included on Flexible Assets. The export always creates two files: one HTML file for quick viewing, and a CSV file with all the information included.

I’ve also decided not to directly download all configurations as we store no data in there. If anyone wants to also download the configurations, let me know and I’ll edit the script a little.
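If you do want configurations in the backup, a rough sketch of how that could look (assuming the ITGlueAPI module supports the same -page_size/-page_number paging on Get-ITGlueConfigurations as the other calls in the script below, and that $ExportDir already exists):

#Sketch: page through all configurations and export them to a single CSV in the backup directory
$i = 0
$Configurations = @()
do {
    $i++
    $Configurations += (Get-ITGlueConfigurations -page_size 1000 -page_number $i).data
} while ($Configurations.count % 1000 -eq 0 -and $Configurations.count -ne 0)
$Configurations.attributes | Export-Csv -Path "$ExportDir\configurations.csv" -NoTypeInformation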

Warning: The HTML files contain all your documentation, in plain-text format. Store this in a safe location or adapt the script to upload the data to your Azure Key Vault or secondary password management tool instead. Do not store these in a public location.
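A small extra step you could bolt onto the end of the script is zipping the export directory, so the loose plain-text files are at least bundled into one archive you can move to secure storage. Compress-Archive does not encrypt, so this is about portability, not protection:

#Sketch: bundle the finished export into a single zip next to the export directory
Compress-Archive -Path "$ExportDir\*" -DestinationPath "$ExportDir.zip" -Force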

All you have to do is change the variables for your environment: the API key, the API endpoint, and the export directory.

The Unofficial IT-Glue Backup Script.

#####################################################################
#The unofficial Kelvin Tegelaar IT-Glue Backup script. Run this script whenever you want to create a backup of the ITGlue database.
#Creates a file called "password.html" with all passwords in Plain-text. please only store in secure location.
#Creates folders per organisation, and copies flexible assets there as HTML table & CSV file for data portability.
$APIKEy =  "APIKEYHERE"
$APIEndpoint = "https://api.eu.itglue.com"
$ExportDir = "C:\Hello\ITGBackup"
#####################################################################
Write-Host "Creating backup directory" -ForegroundColor Green
if (!(Test-Path $ExportDir)) { new-item $ExportDir -ItemType Directory }

#Header for HTML files, to make it look pretty.
$head = @"
<script>
function myFunction() {
    const filter = document.querySelector('#myInput').value.toUpperCase();
    const trs = document.querySelectorAll('table tr:not(.header)');
    trs.forEach(tr => tr.style.display = [...tr.children].find(td => td.innerHTML.toUpperCase().includes(filter)) ? '' : 'none');
  }</script>
<title>Audit Log Report</title>
<style>
body { background-color:#E5E4E2;
      font-family:Monospace;
      font-size:10pt; }
td, th { border:0px solid black; 
        border-collapse:collapse;
        white-space:pre; }
th { color:white;
    background-color:black; }
table, tr, td, th {
     padding: 2px; 
     margin: 0px;
     white-space:pre; }
tr:nth-child(odd) {background-color: lightgray}
table { width:95%;margin-left:5px; margin-bottom:20px; }
h2 {
font-family:Tahoma;
color:#6D7B8D;
}
.footer 
{ color:green; 
 margin-left:10px; 
 font-family:Tahoma;
 font-size:8pt;
 font-style:italic;
}
#myInput {
  background-image: url('https://www.w3schools.com/css/searchicon.png'); /* Add a search icon to input */
  background-position: 10px 12px; /* Position the search icon */
  background-repeat: no-repeat; /* Do not repeat the icon image */
  width: 50%; /* Full-width */
  font-size: 16px; /* Increase font-size */
  padding: 12px 20px 12px 40px; /* Add some padding */
  border: 1px solid #ddd; /* Add a grey border */
  margin-bottom: 12px; /* Add some space below the input */
}
</style>
"@

#ITGlue Download starts here
If(Get-Module -ListAvailable -Name "ITGlueAPI") {Import-module ITGlueAPI} Else { install-module ITGlueAPI -Force; import-module ITGlueAPI}
#Settings IT-Glue logon information
Add-ITGlueBaseURI -base_uri $APIEndpoint
Add-ITGlueAPIKey $APIKEy
$i = 0
#grabbing all orgs for later use.
do {
    $i++
    $orgs += (Get-ITGlueOrganizations -page_size 1000 -page_number $i).data
    Write-Host "Retrieved $($orgs.count) Organisations" -ForegroundColor Yellow
}while ($orgs.count % 1000 -eq 0 -and $orgs.count -ne 0)
#Grabbing all passwords.

Write-Host "Creating backup directory per organisation" -ForegroundColor Green
foreach($org in $orgs){
$org.attributes.name = $org.attributes.name -replace '\W', ' '
if (!(Test-Path "$($ExportDir)\$($org.attributes.name)\")) { 
    new-item "$($ExportDir)\$($org.attributes.name)\" -ItemType Directory | out-null }
}

$i = 0
$Passwords = @()
Write-Host "Getting passwords" -ForegroundColor Green
    do {
        $i++
        $PasswordList += (Get-ITGluePasswords -page_size 1000 -page_number $i).data
        Write-Host "Retrieved $($PasswordList.count) Passwords" -ForegroundColor Yellow
    }while ($PasswordList.count % 1000 -eq 0 -and $PasswordList.count -ne 0)
    Write-Host "Processing Passwords. This might take some time." -ForegroundColor Yellow
foreach($PasswordItem in $passwordlist){
    $Passwords += (Get-ITGluePasswords -show_password $true -id $PasswordItem.id).data
}
Write-Host "Processed Passwords. Moving on." -ForegroundColor Yellow

    Write-Host "Getting Flexible Assets" -ForegroundColor Green
    $FlexAssetTypes = (Get-ITGlueFlexibleAssetTypes -page_size 1000).data
    foreach($FlexAsset in $FlexAssetTypes){
    $i = 0
    do {
        $i++
        Write-Host "Getting FlexibleAssets for $($Flexasset.attributes.name)" -ForegroundColor Yellow
        $FlexibleAssets += (Get-ITGlueFlexibleAssets -filter_flexible_asset_type_id $FlexAsset.id -page_size 1000 -page_number $i).data
        Write-Host "Retrieved $($FlexibleAssets.count) Flexible Assets" -ForegroundColor Yellow
    }while ($FlexibleAssets.count % 1000 -eq 0 -and $FlexibleAssets.count -ne 0)
    }
 
#Generate single HTML file with all passwords.
$Passwords.attributes | select-object 'organization-name',name,username,password,url,created-at,updated-at | convertto-html -head $head -precontent '<h1>Password export from IT-Glue</h1><input type="text" id="myInput" onkeyup="myFunction()" placeholder="Search for content.." title="Type a query">' | out-file "$($ExportDir)\passwords.html"

foreach($FlexibleAsset in $FlexibleAssets.attributes){
$HTMLTop = @"
<h1> Flexible Asset Information </h1>
<b>organization ID: </b>$($FlexibleAsset."organization-id") <br>
<b>organization Name:</b>$($FlexibleAsset."organization-name")<br>
<b>name:</b>$($FlexibleAsset.name)<br>
<b>created-at </b>$($FlexibleAsset."created-at")<br>
<b>updated-at </b>$($FlexibleAsset."updated-at")<br>
<b>resource url: </b>$($FlexibleAsset."resource-url")<br>
<b>flexible-asset-type-id: </b>$($FlexibleAsset."flexible-asset-type-id")<br>
<b>flexible-asset-type-name: </b>$($FlexibleAsset."flexible-asset-type-name")<br>
"@ 

$HTMLTables = $FlexibleAsset.traits | convertto-html -Head $head -PreContent "$HTMLTop <br> <h1> Flexible Asset Traits</h1>"

write-host "Ouputting $outputpath" -ForegroundColor Yellow
$OutputPath = "$($ExportDir)\$($flexibleasset.'organization-name')\"
$outputfilename = "$($Flexibleasset.'flexible-asset-type-name') - $($Flexibleasset.name)"
$outputfilename = $outputfilename -replace '[\W]',''
write-host "Ouputting $outputpath\$outputfilename" -ForegroundColor Yellow
$HTMLTables | out-file "$outputpath\$outputfilename HTML.html"
$FlexibleAsset.traits  | export-csv -path "$outputpath\$outputfilename CSV.csv" -NoTypeInformation
}

Documenting with PowerShell: Chapter 1 – Server overview page

In the previous blog I showed how to upload data to IT-Glue by creating a new flexible asset with two fields: a name field on which we match to see if a document already exists, and a field where we place some data. This time, we're going to upload an entire HTML file that we'll create by gathering data from the server we run the script on. So, let's get started.

The scripts below can also be downloaded for your RMM system, find the AMP file here.

Data gathering

First, we'll make a list of all the data that is important for us to have documented. For me, that would be the following list of information:

  • Server Name
  • Server type (A physical or virtual machine)
  • How much RAM the machine currently has assigned
  • The NIC configuration
  • What applications are installed
  • What server roles are installed
  • What the physical disk layout is
  • What the RAID layout is

Let's get started with each of the data collection parts. In the following script we check if the model contains "Virtual" or "VMware", two signs that the machine is a virtual machine rather than a physical one.

$ComputerSystemInfo = Get-CimInstance -ClassName Win32_ComputerSystem
if($ComputerSystemInfo.model -match "Virtual" -or $ComputerSystemInfo.model -match "VMware") { $MachineType = "Virtual"} Else { $MachineType = "Physical"}

To get the amount of RAM, we run the following one-liner:

$RAM = (systeminfo | Select-String 'Total Physical Memory:').ToString().Split(':')[1].Trim()
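The systeminfo command can be slow because it gathers a full system report; if you only need the RAM figure, a CIM-based alternative (my preference, not what the original one-liner uses) would be:

#Alternative: read total RAM via CIM instead of parsing systeminfo output, formatted in GB
$RAM = "{0:N2} GB" -f ((Get-CimInstance -ClassName Win32_ComputerSystem).TotalPhysicalMemory / 1GB)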

Applications and roles are also quite easy to get:

$ApplicationsFrag = Get-ItemProperty HKLM:\Software\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\* | Select-Object DisplayName, DisplayVersion, Publisher, InstallDate | Convertto-html -Fragment | select -skip 1
$ApplicationsTable = "<br/><table class=`"table table-bordered table-hover`" >" + $ApplicationsFrag

$RolesFrag = Get-WindowsFeature | Where-Object {$_.Installed -eq $True} | Select-Object displayname,name  | convertto-html -Fragment | Select-Object -Skip 1
$RolesTable = "<br/><table class=`"table table-bordered table-hover`" >" + $RolesFrag

For the NIC configuration, we need to do a little more work. We want the NIC configuration to show both IPv4 and IPv6 configurations, and we want to look up the configuration of every NIC, in case a server has multiple.

$networkName = Get-CimInstance -ClassName Win32_NetworkAdapter | Where-Object {$_.PhysicalAdapter -eq "True"} | Sort Index
$networkIP = Get-CimInstance -ClassName Win32_NetworkAdapterConfiguration | Where-Object {$_.MACAddress -gt 0} | Sort Index
$networkSummary = New-Object -TypeName 'System.Collections.ArrayList'

foreach($nic in $networkName) {
    $nic_conf = $networkIP | Where-Object {$_.Index -eq $nic.Index}
 
    $networkDetails = New-Object PSObject -Property @{
        Index                = [int]$nic.Index;
        AdapterName         = [string]$nic.NetConnectionID;
        Manufacturer         = [string]$nic.Manufacturer;
        Description          = [string]$nic.Description;
        MACAddress           = [string]$nic.MACAddress;
        IPEnabled            = [bool]$nic_conf.IPEnabled;
        IPAddress            = [string]$nic_conf.IPAddress;
        IPSubnet             = [string]$nic_conf.IPSubnet;
        DefaultGateway       = [string]$nic_conf.DefaultIPGateway;
        DHCPEnabled          = [string]$nic_conf.DHCPEnabled;
        DHCPServer           = [string]$nic_conf.DHCPServer;
        DNSServerSearchOrder = [string]$nic_conf.DNSServerSearchOrder;
    }
    $networkSummary += $networkDetails
}
$NicRawConf = $networkSummary | select AdapterName,IPaddress,IPSubnet,DefaultGateway,DNSServerSearchOrder,MACAddress | Convertto-html -Fragment | select -Skip 1
$NicConf = "<br/><table class=`"table table-bordered table-hover`" >" + $NicRawConf

If a server is a physical machine and a Dell server, I also want to see the disk configuration. For this, you'll need Dell OpenManage installed.

if($machineType -eq "Physical" -and $ComputerSystemInfo.Manufacturer -matchf "Dell"){
$DiskLayoutRaw = omreport storage pdisk controller=0 -fmt cdv
$DiskLayoutSemi = $DiskLayoutRaw |  select-string -SimpleMatch "ID,Status," -context 0,($DiskLayoutRaw).Length | convertfrom-csv -Delimiter "," | select Name,Status,Capacity,State,"Bus Protocol","Product ID","Serial No.","Part Number",Media | convertto-html -Fragment
$DiskLayoutTable = "<br/><table class=`"table table-bordered table-hover`" >" + $DiskLayoutsemi

#Try to get RAID layout
$RAIDLayoutRaw = omreport storage vdisk controller=0 -fmt cdv
$RAIDLayoutSemi = $RAIDLayoutRaw |  select-string -SimpleMatch "ID,Status," -context 0,($RAIDLayoutRaw).Length | convertfrom-csv -Delimiter "," | select Name,Status,State,Layout,"Device Name","Read Policy","Write Policy",Media |  convertto-html -Fragment
$RAIDLayoutTable = "<br/><table class=`"table table-bordered table-hover`" >" + $RAIDLayoutsemi
}else {
    $RAIDLayoutTable = "Could not get physical disk info"
    $DiskLayoutTable = "Could not get physical disk info"
}

Putting all of that together gives us the following script. I've also added some headers to make the output HTML file look nice. You can use this as a stand-alone script.

#Server documentation script
$ComputerSystemInfo = Get-CimInstance -ClassName Win32_ComputerSystem
if($ComputerSystemInfo.model -match "Virtual" -or $ComputerSystemInfo.model -match "VMware") { $MachineType = "Virtual"} Else { $MachineType = "Physical"}
$networkName = Get-CimInstance -ClassName Win32_NetworkAdapter | Where-Object {$_.PhysicalAdapter -eq "True"} | Sort Index
$networkIP = Get-CimInstance -ClassName Win32_NetworkAdapterConfiguration | Where-Object {$_.MACAddress -gt 0} | Sort Index
$networkSummary = New-Object -TypeName 'System.Collections.ArrayList'

foreach($nic in $networkName) {
    $nic_conf = $networkIP | Where-Object {$_.Index -eq $nic.Index}
 
    $networkDetails = New-Object PSObject -Property @{
        Index                = [int]$nic.Index;
        AdapterName         = [string]$nic.NetConnectionID;
        Manufacturer         = [string]$nic.Manufacturer;
        Description          = [string]$nic.Description;
        MACAddress           = [string]$nic.MACAddress;
        IPEnabled            = [bool]$nic_conf.IPEnabled;
        IPAddress            = [string]$nic_conf.IPAddress;
        IPSubnet             = [string]$nic_conf.IPSubnet;
        DefaultGateway       = [string]$nic_conf.DefaultIPGateway;
        DHCPEnabled          = [string]$nic_conf.DHCPEnabled;
        DHCPServer           = [string]$nic_conf.DHCPServer;
        DNSServerSearchOrder = [string]$nic_conf.DNSServerSearchOrder;
    }
    $networkSummary += $networkDetails
}
$NicRawConf = $networkSummary | select AdapterName,IPaddress,IPSubnet,DefaultGateway,DNSServerSearchOrder,MACAddress | Convertto-html -Fragment | select -Skip 1
$NicConf = "<br/><table class=`"table table-bordered table-hover`" >" + $NicRawConf

$RAM = (systeminfo | Select-String 'Total Physical Memory:').ToString().Split(':')[1].Trim()

$ApplicationsFrag = Get-ItemProperty HKLM:\Software\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\* | Select-Object DisplayName, DisplayVersion, Publisher, InstallDate | Convertto-html -Fragment | select -skip 1
$ApplicationsTable = "<br/><table class=`"table table-bordered table-hover`" >" + $ApplicationsFrag

$RolesFrag = Get-WindowsFeature | Where-Object {$_.Installed -eq $True} | Select-Object displayname,name  | convertto-html -Fragment | Select-Object -Skip 1
$RolesTable = "<br/><table class=`"table table-bordered table-hover`" >" + $RolesFrag

if($machineType -eq "Physical" -and $ComputerSystemInfo.Manufacturer -match "Dell"){
$DiskLayoutRaw = omreport storage pdisk controller=0 -fmt cdv
$DiskLayoutSemi = $DiskLayoutRaw |  select-string -SimpleMatch "ID,Status," -context 0,($DiskLayoutRaw).Length | convertfrom-csv -Delimiter "," | select Name,Status,Capacity,State,"Bus Protocol","Product ID","Serial No.","Part Number",Media | convertto-html -Fragment
$DiskLayoutTable = "<br/><table class=`"table table-bordered table-hover`" >" + $DiskLayoutsemi

#Try to get RAID layout
$RAIDLayoutRaw = omreport storage vdisk controller=0 -fmt cdv
$RAIDLayoutSemi = $RAIDLayoutRaw |  select-string -SimpleMatch "ID,Status," -context 0,($RAIDLayoutRaw).Length | convertfrom-csv -Delimiter "," | select Name,Status,State,Layout,"Device Name","Read Policy","Write Policy",Media |  convertto-html -Fragment
$RAIDLayoutTable = "<br/><table class=`"table table-bordered table-hover`" >" + $RAIDLayoutsemi
}else {
    $RAIDLayoutTable = "Could not get physical disk info"
    $DiskLayoutTable = "Could not get physical disk info"
}
#Head for HTML
$head = @"
<Title>Server Log Report</Title>
<style>
body { background-color:#E5E4E2;
      font-family:Monospace;
      font-size:10pt; }
td, th { border:0px solid black; 
        border-collapse:collapse;
        white-space:pre; }
th { color:white;
    background-color:black; }
table, tr, td, th {
     padding: 2px; 
     margin: 0px;
     white-space:pre; }
tr:nth-child(odd) {background-color: lightgray}
table { width:95%;margin-left:5px; margin-bottom:20px; }
h2 {
font-family:Tahoma;
color:#6D7B8D;
}
.footer 
{ color:green; 
 margin-left:10px; 
 font-family:Tahoma;
 font-size:8pt;
 font-style:italic;
}
</style>
"@

$HTMLFile = @"
$head
<b>Servername</b>: $ENV:COMPUTERNAME <br>
<b>Server Type</b>: $machineType <br>
<b>Amount of RAM</b>: $RAM <br>
<br>
<h1>NIC Configuration</h1> <br>
$NicConf
<br>
<h1>Installed Applications</h1> <br>
$ApplicationsTable
<br>
<h1>Installed Roles</h1> <br>
$RolesTable
<br>
<h1>Physical Disk information</h1>
$DiskLayoutTable
<h1>RAID information</h1>
$RAIDLayoutTable
"@

if (!(Test-Path "C:\Temp")) { New-Item "C:\Temp" -ItemType Directory | Out-Null }
$HTMLFile | out-file C:\Temp\ServerDoc.html

To upload this data to IT-Glue, we can use the exact same script as last time, with two small edits: the title of the flexible asset, and the description.

#####################################################################
$APIKEy =  "YOUR API KEY GOES HERE"
$APIEndpoint = "https://api.eu.itglue.com"
$orgID = "THE ORGANISATIONID YOU WOULD LIKE TO UPDATE GOES HERE"
$FlexAssetName = "ITGLue AutoDoc - Server Overview"
$Description = "a server one-page document that shows the current configuration"
#####################################################################
#This is the object we'll be sending to IT-Glue. 
$ComputerSystemInfo = Get-CimInstance -ClassName Win32_ComputerSystem
if($ComputerSystemInfo.model -match "Virtual" -or $ComputerSystemInfo.model -match "VMware") { $MachineType = "Virtual"} Else { $MachineType = "Physical"}
$networkName = Get-CimInstance -ClassName Win32_NetworkAdapter | Where-Object {$_.PhysicalAdapter -eq "True"} | Sort Index
$networkIP = Get-CimInstance -ClassName Win32_NetworkAdapterConfiguration | Where-Object {$_.MACAddress -gt 0} | Sort Index
$networkSummary = New-Object -TypeName 'System.Collections.ArrayList'

foreach($nic in $networkName) {
    $nic_conf = $networkIP | Where-Object {$_.Index -eq $nic.Index}
 
    $networkDetails = New-Object PSObject -Property @{
        Index                = [int]$nic.Index;
        AdapterName         = [string]$nic.NetConnectionID;
        Manufacturer         = [string]$nic.Manufacturer;
        Description          = [string]$nic.Description;
        MACAddress           = [string]$nic.MACAddress;
        IPEnabled            = [bool]$nic_conf.IPEnabled;
        IPAddress            = [string]$nic_conf.IPAddress;
        IPSubnet             = [string]$nic_conf.IPSubnet;
        DefaultGateway       = [string]$nic_conf.DefaultIPGateway;
        DHCPEnabled          = [string]$nic_conf.DHCPEnabled;
        DHCPServer           = [string]$nic_conf.DHCPServer;
        DNSServerSearchOrder = [string]$nic_conf.DNSServerSearchOrder;
    }
    $networkSummary += $networkDetails
}
$NicRawConf = $networkSummary | select AdapterName,IPaddress,IPSubnet,DefaultGateway,DNSServerSearchOrder,MACAddress | Convertto-html -Fragment | select -Skip 1
$NicConf = "<br/><table class=`"table table-bordered table-hover`" >" + $NicRawConf

$RAM = (systeminfo | Select-String 'Total Physical Memory:').ToString().Split(':')[1].Trim()

$ApplicationsFrag = Get-ItemProperty HKLM:\Software\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\* | Select-Object DisplayName, DisplayVersion, Publisher, InstallDate | Convertto-html -Fragment | select -skip 1
$ApplicationsTable = "<br/><table class=`"table table-bordered table-hover`" >" + $ApplicationsFrag

$RolesFrag = Get-WindowsFeature | Where-Object {$_.Installed -eq $True} | Select-Object displayname,name  | convertto-html -Fragment | Select-Object -Skip 1
$RolesTable = "<br/><table class=`"table table-bordered table-hover`" >" + $RolesFrag

if($machineType -eq "Physical" -and $ComputerSystemInfo.Manufacturer -match "Dell"){
$DiskLayoutRaw = omreport storage pdisk controller=0 -fmt cdv
$DiskLayoutSemi = $DiskLayoutRaw |  select-string -SimpleMatch "ID,Status," -context 0,($DiskLayoutRaw).Length | convertfrom-csv -Delimiter "," | select Name,Status,Capacity,State,"Bus Protocol","Product ID","Serial No.","Part Number",Media | convertto-html -Fragment
$DiskLayoutTable = "<br/><table class=`"table table-bordered table-hover`" >" + $DiskLayoutsemi

#Try to get RAID layout
$RAIDLayoutRaw = omreport storage vdisk controller=0 -fmt cdv
$RAIDLayoutSemi = $RAIDLayoutRaw |  select-string -SimpleMatch "ID,Status," -context 0,($RAIDLayoutRaw).Length | convertfrom-csv -Delimiter "," | select Name,Status,State,Layout,"Device Name","Read Policy","Write Policy",Media |  convertto-html -Fragment
$RAIDLayoutTable = "<br/><table class=`"table table-bordered table-hover`" >" + $RAIDLayoutsemi
}else {
    $RAIDLayoutTable = "Could not get physical disk info"
    $DiskLayoutTable = "Could not get physical disk info"
}

$HTMLFile = @"
<b>Servername</b>: $ENV:COMPUTERNAME <br>
<b>Server Type</b>: $machineType <br>
<b>Amount of RAM</b>: $RAM <br>
<br>
<h1>NIC Configuration</h1> <br>
$NicConf
<br>
<h1>Installed Applications</h1> <br>
$ApplicationsTable
<br>
<h1>Installed Roles</h1> <br>
$RolesTable
<br>
<h1>Physical Disk information</h1>
$DiskLayoutTable
<h1>RAID information</h1>
$RAIDLayoutTable
"@



$FlexAssetBody = 
@{
    type = 'flexible-assets'
    attributes = @{
            name = $FlexAssetName
            traits = @{
                "name" = $ENV:COMPUTERNAME
                "information" = $HTMLFile
            }
    }
}

#ITGlue upload starts here.
If(Get-Module -ListAvailable -Name "ITGlueAPI") {Import-module ITGlueAPI} Else { install-module ITGlueAPI -Force; import-module ITGlueAPI}
#Settings IT-Glue logon information
Add-ITGlueBaseURI -base_uri $APIEndpoint
Add-ITGlueAPIKey $APIKEy
#Checking if the FlexibleAsset exists. If not, create a new one.
$FilterID = (Get-ITGlueFlexibleAssetTypes -filter_name $FlexAssetName).data
if(!$FilterID){ 
    $NewFlexAssetData = 
    @{
        type = 'flexible-asset-types'
        attributes = @{
                name = $FlexAssetName
                icon = 'sitemap'
                description = $description
        }
        relationships = @{
            "flexible-asset-fields" = @{
                data = @(
                    @{
                        type       = "flexible_asset_fields"
                        attributes = @{
                            order           = 1
                            name            = "name"
                            kind            = "Text"
                            required        = $true
                            "show-in-list"  = $true
                            "use-for-title" = $true
                        }
                    },
                    @{
                        type       = "flexible_asset_fields"
                        attributes = @{
                            order          = 2
                            name           = "information"
                            kind           = "Textbox"
                            required       = $false
                            "show-in-list" = $false
                        }
                    }
                )
                }
            }
              
       }
New-ITGlueFlexibleAssetTypes -Data $NewFlexAssetData 
$FilterID = (Get-ITGlueFlexibleAssetTypes -filter_name $FlexAssetName).data
} 

#Upload data to IT-Glue. We try to match the Server name to current computer name.
$ExistingFlexAsset = (Get-ITGlueFlexibleAssets -filter_flexible_asset_type_id $Filterid.id -filter_organization_id $orgID).data | Where-Object {$_.attributes.name -eq $ENV:COMPUTERNAME}

#If the Asset does not exist, we edit the body to be in the form of a new asset, if not, we just upload.
if(!$ExistingFlexAsset){
$FlexAssetBody.attributes.add('organization-id', $orgID)
$FlexAssetBody.attributes.add('flexible-asset-type-id', $FilterID.id)
Write-Host "Creating new flexible asset"
New-ITGlueFlexibleAssets -data $FlexAssetBody

} else {
Write-Host "Updating Flexible Asset"
Set-ITGlueFlexibleAssets -id $ExistingFlexAsset.id  -data $FlexAssetBody}

And that's it! Happy PowerShelling! The next chapter will cover how to upload the Bitlocker key for a client machine to IT-Glue as a password object; it will also tag all related devices, which is pretty cool!

Documenting with PowerShell – New series

Hi All!

Starting this week I’ll be blogging about using PowerShell with your RMM/Automation platform and running scripts to collect valuable documentation. I’ll try to keep it as generic as possible and export the documentation to HTML, but I’ll always include a version to upload it to IT-Glue or Confluence. As requested by some I’ll also include the AMP for N-Central so you can get going with it.

To get started straight away, I’ll share the script that we will be using throughout this series to upload documentation to IT-Glue fully automated. You won’t even need to create flexible assets as the script does this for you.

The Script

For the script you'll need at least Windows 10 or Server 2012 R2. You'll also need your IT-Glue API key and the API URL; generally speaking, that URL is "https://api.itglue.com", or "https://api.eu.itglue.com" for European users. Now let's get started on our uploading script. 🙂

N-Able users can download the AMP for this script here (right click -> Save as). The script can use Custom Device or Organisation Properties as input, and thus you can enter the Organisation ID on each Custom Organisation Property and automate your documentation process completely.

#####################################################################
$APIKEy =  "YOUR API KEY GOES HERE"
$APIEndpoint = "https://api.eu.itglue.com"
$orgID = "THE ORGANISATIONID YOU WOULD LIKE TO UPDATE GOES HERE"
$FlexAssetName = "ITGLue AutoDoc - Quick example"
$Description = "a quick overview of easy it is to upload data to IT-Glue"
#####################################################################
#This is the object we'll be sending to IT-Glue. 
$HTMLStuff = @"
<b>Servername</b>: $ENV:COMPUTERNAME <br>
<b>Number of Processors</b>: $ENV:NUMBER_OF_PROCESSORS <br>

This is a little example of how we upload data to IT-Glue.
"@
$FlexAssetBody = 
@{
    type = 'flexible-assets'
    attributes = @{
            name = $FlexAssetName
            traits = @{
                "name" = $ENV:COMPUTERNAME
                "information" = $HTMLStuff
            }
    }
}

#ITGlue upload starts here.
If(Get-Module -ListAvailable -Name "ITGlueAPI") {Import-module ITGlueAPI} Else { install-module ITGlueAPI -Force; import-module ITGlueAPI}
#Settings IT-Glue logon information
Add-ITGlueBaseURI -base_uri $APIEndpoint
Add-ITGlueAPIKey $APIKEy
#Checking if the FlexibleAsset exists. If not, create a new one.
$FilterID = (Get-ITGlueFlexibleAssetTypes -filter_name $FlexAssetName).data
if(!$FilterID){ 
    $NewFlexAssetData = 
    @{
        type = 'flexible-asset-types'
        attributes = @{
                name = $FlexAssetName
                icon = 'sitemap'
                description = $description
        }
        relationships = @{
            "flexible-asset-fields" = @{
                data = @(
                    @{
                        type       = "flexible_asset_fields"
                        attributes = @{
                            order           = 1
                            name            = "name"
                            kind            = "Text"
                            required        = $true
                            "show-in-list"  = $true
                            "use-for-title" = $true
                        }
                    },
                    @{
                        type       = "flexible_asset_fields"
                        attributes = @{
                            order          = 2
                            name           = "information"
                            kind           = "Textbox"
                            required       = $false
                            "show-in-list" = $false
                        }
                    }
                )
                }
            }
              
       }
New-ITGlueFlexibleAssetTypes -Data $NewFlexAssetData 
$FilterID = (Get-ITGlueFlexibleAssetTypes -filter_name $FlexAssetName).data
} 

#Upload data to IT-Glue. We try to match the Server name to current computer name.
$ExistingFlexAsset = (Get-ITGlueFlexibleAssets -filter_flexible_asset_type_id $Filterid.id -filter_organization_id $orgID).data | Where-Object {$_.attributes.name -eq $ENV:COMPUTERNAME}

#If the Asset does not exist, we edit the body to be in the form of a new asset, if not, we just upload.
if(!$ExistingFlexAsset){
$FlexAssetBody.attributes.add('organization-id', $orgID)
$FlexAssetBody.attributes.add('flexible-asset-type-id', $FilterID.id)
Write-Host "Creating new flexible asset"
New-ITGlueFlexibleAssets -data $FlexAssetBody

} else {
Write-Host "Updating Flexible Asset"
Set-ITGlueFlexibleAssets -id $ExistingFlexAsset.id  -data $FlexAssetBody}

The script does multiple things for you that a lot of other scripts tend to skimp over:

  • We check if a Flexible Asset type with our chosen name is already present; if it's not, we create it.
  • We then check if a Flexible Asset already exists with the same name we've entered; if not, we upload a fresh one, and if it does, we upload an update for that specific item.

In the following series I'll teach you how to get the organisation ID from information we gather on the machine you are running your script on. We'll also tackle how to get the correct devices tagged on your flexible assets, but of course we'll start by taking apart the script above and teaching you how to create fully automated network documentation.
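As a small preview of that, here is a minimal sketch of looking up the organisation ID from the machine itself, assuming the device's serial number is already filled in on its IT-Glue configuration and the module/API key lines from the script above have already run:

#Sketch: find the organisation ID by matching this machine's serial number to an IT-Glue configuration
$SerialNumber = (Get-CimInstance Win32_BIOS).SerialNumber
$Match = (Get-ITGlueConfigurations -filter_serial_number $SerialNumber).data | Select-Object -First 1
$orgID = $Match.attributes.'organization-id'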

As always, Happy Powershelling!

Monitoring with PowerShell Chapter 3: Monitoring network state

Our clients often want us to monitor specific network connections, such as VPN tunnels that need to be online, services that always need to be reachable, or even simply to report on internet connection speeds. To do this, we mostly use our network controller software and default RMM sets. In rare cases, that is not enough, so we’ve developed some monitoring sets for our RMM to help us with this.

To start, we have an RMM monitoring set that uses the Test-Connection cmdlet to ping multiple hosts entered in our RMM system. We define these per client, so their most important resources are checked constantly.

$State = "Healthy"
$IPsToPing = $IPsToPing.split(",")
try{
$ConnectionTest = test-connection $IPsToPing -count 3 -ErrorAction stop -Verbose
}
catch [System.Management.Automation.ActionPreferenceStopException]            
{            
try {            
throw $_.exception            
}                 
catch [System.Net.NetworkInformation.PingException] {            
$state = "$($error[0])"            
}                      
catch {            
$state = "$($error[0])"           
}            
}

$EndResult = $ConnectionTest | measure-Object -Property ResponseTime -Average -sum -Maximum -Minimum

$AVGMS = $EndResult.Average.ToString(00)
$MaxMS = $EndResult.Maximum.ToString(00)
$MinMs = $EndResult.Minimum.ToString(00)

$AVGMS shows the average response time in milliseconds over 3 pings, $MaxMS shows the highest response time reached during those 3 pings, and $MinMs, you've guessed it, shows the fastest pings in the west 😉
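In the RMM we then simply alert on those three values; outside an RMM you could reproduce that with a threshold check, for example (the 150ms threshold is just an assumption, tune it per client):

#Example threshold check: flag the connection as unhealthy when the average ping is too high
if ([int]$AVGMS -gt 150) { $State = "Average response time is $($AVGMS)ms, which is above the threshold" }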

Next to ping monitoring, we also check the health state of the internal network on Windows Servers. When doing takeovers of infrastructure, we see that in a lot of situations Network Location Awareness does not function or start correctly.

The Network Location Awareness service is the service that tells your OS which network profile it should use, such as "public", "private" or "domain". Not having it running can cause a myriad of issues, such as SSPI issues on SQL servers, firewalling issues, and much more!

Most of our environments use $Domainname.com or, when doing takeovers, $domainname.local. The common denominator is that the network profile name contains a period, so we check profiles with a period in the name against the actual NetworkCategory. If these do not match, we alert and see if we need to recover the Network Location Awareness service.

$NetworkProfile = Get-NetConnectionProfile

#If no profile name contains a period, NLA has probably not detected a domain network at all.
if (!($NetworkProfile.Name -match "\."))
{ $NLAState = "Domain Authentication might not work properly."  }else{ $NLAState = "Healthy" }

foreach ($NetProfile in $NetworkProfile | where {$_.Name -match "\."}) {
 if ($NetProfile.NetworkCategory -ne "DomainAuthenticated")
 { $DomainState = "Network is not set to DomainAuthenticated."  }else{ $DomainState = "Healthy" }
}

if(!$NLAState){ $NLAState = "Healthy" }

And that’s the blog for today! enjoy and as always, Happy PowerShelling!

Monitoring with PowerShell Chapter 3: Monitoring SSL certificates on IIS

For some clients that still have on-site servers running IIS, we have to monitor the SSL status to make sure that things such as the Remote Desktop Gateway or SSTP VPN are always available. To make sure that we are able to replace SSL certificates on time, we need to know when specific certificates expire.

To do this, we will use the Get-IISSite cmdlet. This cmdlet exposes all of the IIS configuration in an easy-to-use format. First, we'll get the bindings of all sites like this:

(Get-IISSite).bindings

Now if you run this, you won't see that much information. To get a list of all available properties we have a couple of choices, but the simplest is using Select-Object:

(Get-IISSite).bindings | select *

Here we're telling PowerShell we want to select every property returned by our command. With this you'll see we have a lot more information about the bindings, including the protocol used, which we can filter on. Because we only want information about the sites secured with a certificate, we can run the same command again, but this time filtering on the protocol "https":

(Get-IISSite).bindings | where-object {$_.Protocol -eq "https"}

Now that we have the information that we want, our next step is easy: we will find the bound certificate in the local certificate store and check the expiry date. We like to be warned 14 days before expiry, but you can change the number of days to anything you'd like 🙂

$Days = (Get-Date).AddDays(14)
$CertsBound = (Get-IISSite).bindings | where-object {$_.Protocol -eq "https"}
foreach($Cert in $CertsBound){
$CertFile = Get-ChildItem -path "Cert:\LocalMachine\$($Cert.CertificateStoreName)" | Where-Object -Property ThumbPrint -eq $cert.RawAttributes.certificateHash
if($certfile.NotAfter -lt $Days) { $CertState += "$($certfile.FriendlyName) will expire on $($certfile.NotAfter)" }
}

if(!$certState){$CertState = "Healthy"}

And that's it! If the certificate will expire within 14 days, this set will start alerting. Happy PowerShelling!

Monitoring with PowerShell Chapter 3: Monitoring user creation

We all know that bad actors often create accounts for repeat access when they gain access to your network. To make sure that we are aware of these situations, we use PowerShell monitoring.

For security purposes we monitor four types of user creation:

  • All created domain users
  • All users added to privileged groups (e.g. Domain Admins)
  • All created local users
  • All (temporary) users created that contain specific keywords
Temporary users Monitoring:

The temporary users monitoring set is only used on Active Directory users, not local users; as we're already monitoring all local user creation, that would be redundant.

$ArrayOfNames = @("test", "tmp","skykick","mig", "migwiz","temp","-admin","supervisor")

foreach($name in $ArrayOfNames){
$filter =  'Name -like "*'+ $($name) + '*"'
$Users = Get-ADUser -Filter $filter
if($users -ne $null){
foreach($user in $users){
$TemporaryUser += "$($user.name) has been found, created at $($user.whencreated)`n"
}
}
}
if(!$TemporaryUser){$TemporaryUser = "No Temporary Accounts Found" }
Created Domain Users

For domain users, we alert on all users created in the last 24 hours. After 24 hours, this monitoring set goes back to a healthy state.

$When = ((Get-Date).AddDays(-1)).Date
$GetUsers = Get-ADUser -Filter {whenCreated -ge $When} -Properties whenCreated

foreach($user in $GetUsers){
$userchanges += "$($user.name) has been created at $($user.whencreated) `n"
}

if($UserChanges -eq $Null) { $UserChanges = "No Changes Detected"}
All created local users

Local users do not have a creation date, which makes them a little more difficult to monitor. To solve this we create a file with the results of Get-LocalUser and update that file every 24 hours. From there on, we compare the file to the most recent output of Get-LocalUser. We are alerted if the compare finds different content, meaning a user has been created or removed in the local user database.

$Outputfile = "C:\Windows\temp\LocalUsers.txt"
$Localusers = Get-LocalUser | Select-Object *
#On the first run the baseline file does not exist yet, so create it filled with the current user list.
if (!(Test-Path $Outputfile)) { $Localusers | ConvertTo-Json | Out-File $Outputfile }
$OutPutFileContents = Get-Content $Outputfile | ConvertFrom-Json

If((get-item $Outputfile).LastWriteTime -lt (Get-Date).AddDays((-1))){    $Localusers | ConvertTo-Json |out-file $Outputfile  }

$Compare = Compare-Object -DifferenceObject $OutPutFileContents.name -ReferenceObject $Localusers.name

if(!$Compare) { $Compare = "Healthy"}
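The compare above only tells you that something changed. If you also want to see whether an account was added or removed, the SideIndicator that Compare-Object returns can be translated into a readable message; a small optional variant (my own addition) of the compare lines:

#Optional variant: "<=" means the account exists now but not in the baseline file (added),
#"=>" means it only exists in the baseline file (removed).
$Compare = Compare-Object -DifferenceObject $OutPutFileContents.name -ReferenceObject $Localusers.name | ForEach-Object {
    if ($_.SideIndicator -eq '<=') { "$($_.InputObject) has been added" } else { "$($_.InputObject) has been removed" }
}
if (!$Compare) { $Compare = "Healthy" }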
Privileged Group Changes

We monitor the privileged groups by using Ashley McGlone’s script to get the changes. You can find that full script here.

Function Get-PrivilegedGroupChanges {            
    Param(            
        $Server = "localhost",            
        $Hour = 24            
    )            
                
        $ProtectedGroups = Get-ADGroup -Filter 'AdminCount -eq 1' -Server $Server            
        $Members = @()            
                
        ForEach ($Group in $ProtectedGroups) {            
            $Members += Get-ADReplicationAttributeMetadata -Server $Server `
                -Object $Group.DistinguishedName -ShowAllLinkedValues |            
             Where-Object {$_.IsLinkValue} |            
             Select-Object @{name='GroupDN';expression={$Group.DistinguishedName}}, `
                @{name='GroupName';expression={$Group.Name}}, *            
        }            
                
        $Members |            
            Where-Object {$_.LastOriginatingChangeTime -gt (Get-Date).AddHours(-1 * $Hour)}            
                
    }     
    
    $ListOfChanges = Get-PrivilegedGroupChanges
    
    if($ListOfChanges -eq $Null) 
    {
    $GroupChanges = "No Changes Detected"
    }
    else {
    foreach($item in $ListOfChanges){
    $GroupChanges += "Group $($ListOfChanges.GroupName) has been changed - $($listofchanges.AttributeValue) has been added or removed"
    }
    }
    
    if(!$groupChanges) { $GroupChanges = "No Changes Detected"}

Monitoring with PowerShell Chapter 3: Monitoring and remediating Windows Feature Update status

With the advent of Windows 10, all MSPs are faced with a new challenge: how do we manage the different Windows 10 feature versions, and how do we make sure we can automatically upgrade our clients to the latest version of the Windows 10 OS? Microsoft has not made feature updating very straightforward, and sometimes the automatic updates error out.

A lot of RMM systems claim they do version upgrades perfectly; unfortunately, I have not seen any RMM that gracefully upgrades machines without too many issues and without user intervention. As an MSP it's key to automate as much as possible and prevent engineers from having to bother users to perform machine upgrades, so here is PowerShell to the rescue.

Checking the Windows version and comparing it to your standards.

We've decided we want all of our users on the same feature level. Because of this, we've created a monitoring set that alerts us when Windows 10 computers have a lower version than the one we've centrally set. Quite simply, our monitoring set only returns the current release ID, which we pull from the registry:

$ReleaseID = (Get-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion" -Name releaseid).releaseid

Now we alert on this monitoring set when it is below an expected value; in our case, anything lower than 1903 generates an alert. Our system then sees that this alert has been generated and performs a remediation script during the next maintenance cycle we've agreed with the client.
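The comparison itself is nothing more than checking that value against the release we expect; a minimal sketch of what our RMM does with it (the variable names and the 1903 threshold are just examples):

#Example comparison: alert when the reported release is below the release we expect
$ExpectedReleaseID = 1903
if ([int]$ReleaseID -lt $ExpectedReleaseID) { $State = "Release $ReleaseID is lower than expected release $ExpectedReleaseID" } else { $State = "Healthy" }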

Remediation

Remediation for being on the wrong feature update is easy: run the update and done. The problem is that most RMM systems don't handle this very cleanly. To resolve this we've created a PowerShell script that grabs the ISO from a network share, mounts it, and runs the upgrade from there.

The script also checks whether the machine boots from a GPT partition and, if so, clears unnecessary font files from the EFI partition to free up space.

If the network share is not available, it will download the ISO from a web server you specify, which is great for a mobile workforce. All you have to change in this script are $ISOPath and $WebServer.

$ISOPath = "\\Servername\Netlogon\Windows10.iso"
$WebServer = "http://YOURWEBSERVER/Windows10.iso"
$OSDiskIndex = gwmi -query "Select * from Win32_DiskPartition WHERE Bootable = True" | Select-Object -ExpandProperty DiskIndex
$PartTypeFull = gwmi -query "Select * from Win32_DiskPartition WHERE Index = 0" | Select-Object -ExpandProperty Type
$PartTypeMid = $PartTypeFull.Substring(0,3)
$PartType = Out-String -InputObject $PartTypeMid
if ($PartType -like "*GPT*")
{
write-output ("System has a GPT partition, clearing EFI fonts....");
cmd.exe /c "mountvol b: /s"
Remove-Item b:\efi\Microsoft\Boot\Fonts\*.* -force
cmd.exe /c "mountvol b: /d"
}
if (Test-Path $ISOPath)
{
write-output ("Mounting ISO from: "+$ISOPath);
}
else
{
write-output ("Warn: ISO not found at: "+$ISOPath);
write-output ("Downloading ISO from webserver....");
mkdir c:\temp
$ISOPath = "c:\temp\Windows_10_upgrade.iso";
invoke-webrequest $Webserver -OutFile $ISOPath
}
Mount-DiskImage -ImagePath $ISOPath
$ISODrive = Get-DiskImage -ImagePath $ISOPath | Get-Volume | Select-Object -ExpandProperty DriveLetter
write-output ("Mounted ISO on drive: "+$ISODrive)
$Exe = ":\setup.exe"
$Arguments = "/auto upgrade /quiet /noreboot"
$ExePath = $ISODrive + $Exe
write-output ("Running setup from ISO: " + $ExePath)
Start-Process $ExePath $Arguments

And tada! That's it. The upgrade will run, but the machine will not reboot until the user performs the reboot. We schedule more tasks during our maintenance cycle, so we'd rather have the RMM system handle the reboot.

Happy PowerShelling!

Monitoring with PowerShell Chapter 3: Monitoring MFA-Server and Office365 MFA status

We use both Azure MFA Server to secure our on-site resources, and Office 365 MFA for our clients. To make sure we don't have attackers changing the MFA settings, or simply administrators forgetting to set up MFA for clients, we make sure that we alert on both.

The issue with monitoring the MFA server is that it's a product Microsoft bought later on in its life. As such, it does not have a PowerShell module included, and it does not conform to the current Common Engineering Criteria.

To solve this, we load a DLL that exposes the functionality that we need and use it to get a list of users (in the example below, all users whose name starts with "a").

Add-Type -Path "C:\Program Files\Multi-Factor Authentication Server\pfsvcclientclr.DLL"
$problem = [PfSvcClientClr.ConstructResult]::miscError;
$main = [PfSvcClientClr.PfSvcClient]::construct([PfSvcClientClr.ConstructTarget]::local, [ref] $problem);
$DLLModule = $main.GetType().GetMethod("getInterface").MakeGenericMethod([PfSvcClientClr.IPfMasterComposite]).Invoke($main, $null);
$users = $DLLModule.find_users_startsWith("a")
foreach($username in $users){
if($DLLModule.get_user_disabledBehavior("$username") -eq "succeed") { $UserOutput += "$($username) is set to succeed authentication without MFA `n" }
}
if(!$UserOutput) { $UserOutput = "Healthy" }
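Note that the example above only asks the MFA server for accounts starting with "a". If you want to cover the full A to Z range, the same method can be called once per letter; a rough sketch, assuming find_users_startsWith behaves the same for every letter:

#Sketch: query the MFA server per starting letter instead of only "a"
$users = foreach ($Letter in [char[]](97..122)) { $DLLModule.find_users_startsWith([string]$Letter) }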

Next we will monitor multi-factor authentication on the Office 365 side. For this we will use the MSOL module and the partner credentials you've generated using this blog. The script can be used to get information for all clients, or for a single client; to demonstrate, I've added both.

All Clients script:

$ApplicationId       = "xxxxxx-xxxx-xxxx-xxxx-xxxxxxxx"
$ApplicationSecret   = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx="
$RefreshToken        = "VeryLongRefreshToken"
$Tenantname          = "YourTenant.onmicrosoft.com"

$PasswordToSecureString = $ApplicationSecret  | ConvertTo-SecureString -asPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId,$PasswordToSecureString)
$aadGraphToken = New-PartnerAccessToken -RefreshToken $refreshToken -Resource https://graph.windows.net -Credential $credential -TenantId $Tenantname
$graphToken =  New-PartnerAccessToken -RefreshToken $refreshToken -Resource https://graph.microsoft.com -Credential $credential -TenantId $Tenantname

Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
$AllUSers = Get-MsolPartnerContract |  Get-MsolUser -all | select DisplayName,UserPrincipalName,@{N="MFA Status"; E={ 
if( $_.StrongAuthenticationRequirements.State -ne $null) {$_.StrongAuthenticationRequirements.State} else { "Disabled"}}}
foreach($User in $allusers | Where-Object{ $_.'MFA Status' -eq "Disabled"}){
$DisabledUsers += "$($User.UserPrincipalName) Has MFA disabled `n"
}
if(!$DisabledUsers){ $DisabledUsers = "Healthy"}

Single client script

$ApplicationId       = "xxxx-xxxxx-xxxxx-xxxxx-xxxxxx"
$ApplicationSecret   = "xxxxxxxxxxxxxxxxxxxxxxxxxxxx="
$RefreshToken        = "VeryLongStringHere"
$Tenantname          = "YourTenant.onmicrosoft.com"
$ClientTenantName    = "TheClientsTenant.onmicrosoft.com"
$PasswordToSecureString = $ApplicationSecret  | ConvertTo-SecureString -asPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId,$PasswordToSecureString)
$aadGraphToken = New-PartnerAccessToken -RefreshToken $refreshToken -Resource https://graph.windows.net -Credential $credential -TenantId $Tenantname
$graphToken =  New-PartnerAccessToken -RefreshToken $refreshToken -Resource https://graph.microsoft.com -Credential $credential -TenantId $Tenantname

Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
$AllUSers = Get-MsolPartnerContract -DomainName $ClientTenantName |  Get-MsolUser -all | select DisplayName,UserPrincipalName,@{N="MFA Status"; E={ 
if( $_.StrongAuthenticationRequirements.State -ne $null) {$_.StrongAuthenticationRequirements.State} else { "Disabled"}}}
foreach($User in $allusers | Where-Object{ $_.'MFA Status' -eq "Disabled"}){
$DisabledUsers += "$($User.UserPrincipalName) Has MFA disabled"
}
if(!$DisabledUsers){ $DisabledUsers = "Healthy"}