Category Archives: Automation

IT-Glue unofficial backup script.

For the last couple of weeks I’ve been focused on some API efforts for IT-Glue, and a couple of partners asked me whether I could solve the problem that IT-Glue does not have a backup feature. The lack of one makes it difficult to move away from the product, and also to access your data in case of an emergency.

So, to make sure that IT-Glue partners get some form of data portability and backups, I’ve created the following script. The script connects to the IT-Glue API and creates an HTML and CSV export of all Flexible Assets and Passwords in IT-Glue.

It does not currently back up documents, as those are not exposed by the API. It also does not download attachments included on Flexible Assets. For each asset the export creates two files: an HTML file for quick viewing, and a CSV file with all the information included.

I’ve also decided not to download configurations, as we store no data in there. If anyone wants the configurations included as well, let me know and I’ll edit the script a little.

Warning: The HTML files contain all your documentation in plain text. Store them in a safe location, or adapt the script to upload the data to your Azure Key Vault or a secondary password management tool instead. Do not store these files in a public location.

All you have to do is change the variables for your environment: $APIKEy, $APIEndpoint, and $ExportDir.

The Unofficial IT-Glue Backup Script.

#####################################################################
#The unofficial Kelvin Tegelaar IT-Glue backup script. Run this script whenever you want to create a backup of the IT-Glue database.
#Creates a file called "passwords.html" with all passwords in plain text. Store it in a secure location only.
#Creates a folder per organisation, and copies flexible assets there as an HTML table & CSV file for data portability.
$APIKEy =  "APIKEYHERE"
$APIEndpoint = "https://api.eu.itglue.com"
$ExportDir = "C:\Hello\ITGBackup"
#####################################################################
Write-Host "Creating backup directory" -ForegroundColor Green
if (!(Test-Path $ExportDir)) { new-item $ExportDir -ItemType Directory }

#Header for HTML files, to make it look pretty.
$head = @"
<script>
function myFunction() {
    const filter = document.querySelector('#myInput').value.toUpperCase();
    const trs = document.querySelectorAll('table tr:not(.header)');
    trs.forEach(tr => tr.style.display = [...tr.children].find(td => td.innerHTML.toUpperCase().includes(filter)) ? '' : 'none');
  }</script>
<title>Audit Log Report</title>
<style>
body { background-color:#E5E4E2;
      font-family:Monospace;
      font-size:10pt; }
td, th { border:0px solid black; 
        border-collapse:collapse;
        white-space:pre; }
th { color:white;
    background-color:black; }
table, tr, td, th {
     padding: 2px; 
     margin: 0px;
     white-space:pre; }
tr:nth-child(odd) {background-color: lightgray}
table { width:95%;margin-left:5px; margin-bottom:20px; }
h2 {
font-family:Tahoma;
color:#6D7B8D;
}
.footer 
{ color:green; 
 margin-left:10px; 
 font-family:Tahoma;
 font-size:8pt;
 font-style:italic;
}
#myInput {
  background-image: url('https://www.w3schools.com/css/searchicon.png'); /* Add a search icon to input */
  background-position: 10px 12px; /* Position the search icon */
  background-repeat: no-repeat; /* Do not repeat the icon image */
  width: 50%; /* Full-width */
  font-size: 16px; /* Increase font-size */
  padding: 12px 20px 12px 40px; /* Add some padding */
  border: 1px solid #ddd; /* Add a grey border */
  margin-bottom: 12px; /* Add some space below the input */
}
</style>
"@

#ITGlue Download starts here
If(Get-Module -ListAvailable -Name "ITGlueAPI") {Import-module ITGlueAPI} Else { install-module ITGlueAPI -Force; import-module ITGlueAPI}
#Settings IT-Glue logon information
Add-ITGlueBaseURI -base_uri $APIEndpoint
Add-ITGlueAPIKey $APIKEy
$i = 0
$orgs = @()
#Grabbing all organisations for later use. Pages are 1-based, so increment before fetching.
do {
    $i++
    $orgs += (Get-ITGlueOrganizations -page_size 1000 -page_number $i).data
    Write-Host "Retrieved $($orgs.count) Organisations" -ForegroundColor Yellow
}while ($orgs.count % 1000 -eq 0 -and $orgs.count -ne 0)
#Grabbing all passwords.

Write-Host "Creating backup directory per organisation" -ForegroundColor Green
foreach($org in $orgs){
    #\W is a regex class, so use the -replace operator rather than the literal .Replace() method,
    #and sanitise the name before testing for the path.
    $org.attributes.name = $org.attributes.name -replace '\W',' '
    if (!(Test-Path "$($ExportDir)\$($org.attributes.name)")) {
        new-item "$($ExportDir)\$($org.attributes.name)" -ItemType Directory | out-null }
}

$i = 0
$Passwords = @()
$PasswordList = @()
Write-Host "Getting passwords" -ForegroundColor Green
    do {
        $i++
        $PasswordList += (Get-ITGluePasswords -page_size 1000 -page_number $i).data
        Write-Host "Retrieved $($PasswordList.count) Passwords" -ForegroundColor Yellow
    }while ($PasswordList.count % 1000 -eq 0 -and $PasswordList.count -ne 0)
    Write-Host "Processing Passwords. This might take some time." -ForegroundColor Yellow
foreach($PasswordItem in $passwordlist){
    $Passwords += (Get-ITGluePasswords -show_password $true -id $PasswordItem.id).data
}
Write-Host "Processed Passwords. Moving on." -ForegroundColor Yellow

    Write-Host "Getting Flexible Assets" -ForegroundColor Green
    $FlexAssetTypes = (Get-ITGlueFlexibleAssetTypes -page_size 1000).data
    $FlexibleAssets = @()
    foreach($FlexAsset in $FlexAssetTypes){
    $i = 0
    do {
        $i++
        Write-Host "Getting FlexibleAssets for $($Flexasset.attributes.name)" -ForegroundColor Yellow
        #Track the last page separately, so one asset type cannot break the paging of the next.
        $Page = (Get-ITGlueFlexibleAssets -filter_flexible_asset_type_id $FlexAsset.id -page_size 1000 -page_number $i).data
        $FlexibleAssets += $Page
        Write-Host "Retrieved $($FlexibleAssets.count) Flexible Assets" -ForegroundColor Yellow
    }while ($Page.count -eq 1000)
    }
 
#Generate single HTML file with all passwords.
$Passwords.attributes | select-object 'organization-name',name,username,password,url,created-at,updated-at | convertto-html -head $head -precontent '<h1>Password export from IT-Glue</h1><input type="text" id="myInput" onkeyup="myFunction()" placeholder="Search for content.." title="Type a query">' | out-file "$($ExportDir)\passwords.html"

foreach($FlexibleAsset in $FlexibleAssets.attributes){
$HTMLTop = @"
<h1> Flexible Asset Information </h1>
<b>organization ID: </b>$($FlexibleAsset."organization-id") <br>
<b>organization Name:</b>$($FlexibleAsset."organization-name")<br>
<b>name:</b>$($FlexibleAsset.name)<br>
<b>created-at </b>$($FlexibleAsset."created-at")<br>
<b>updated-at </b>$($FlexibleAsset."updated-at")<br>
<b>resource url: </b>$($FlexibleAsset."resource-url")<br>
<b>flexible-asset-type-id: </b>$($FlexibleAsset."flexible-asset-type-id")<br>
<b>flexible-asset-type-name: </b>$($FlexibleAsset."flexible-asset-type-name")<br>
"@ 

$HTMLTables = $FlexibleAsset.traits | convertto-html -Head $head -PreContent "$HTMLTop <br> <h1> Flexible Asset Traits</h1>"

$OutputPath = "$($ExportDir)\$($FlexibleAsset.'organization-name')"
$OutputFileName = "$($FlexibleAsset.'flexible-asset-type-name') - $($FlexibleAsset.name)" -replace '\W',''
write-host "Outputting $OutputPath\$OutputFileName" -ForegroundColor Yellow
$HTMLTables | out-file "$outputpath\$outputfilename HTML.html"
$FlexibleAsset.traits  | export-csv -path "$outputpath\$outputfilename CSV.csv" -NoTypeInformation
}
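The pagination pattern used in the script ("keep requesting pages while the last one came back full") can be tried in isolation. The sketch below uses a hypothetical Get-FakePage function as a stand-in for a real API call such as Get-ITGlueOrganizations, simulating an endpoint that holds 2300 records at 1000 per page:

```powershell
# Hypothetical stand-in for a paged API call (not part of the ITGlueAPI module).
function Get-FakePage([int]$PageNumber, [int]$PageSize = 1000, [int]$Total = 2300) {
    $Start = ($PageNumber - 1) * $PageSize
    if ($Start -ge $Total) { return @() }
    # One dummy entry per record on this page.
    return 1..([math]::Min($PageSize, $Total - $Start))
}

$Items = @()
$i = 0
do {
    $i++
    $Items += Get-FakePage -PageNumber $i
} while ($Items.Count % 1000 -eq 0 -and $Items.Count -ne 0)
# $Items.Count is now 2300
```

One caveat with the modulus check: if the total happens to be an exact multiple of the page size, the loop keeps requesting empty pages forever. Stopping when the last fetch returns fewer than 1000 items is the more robust condition.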

Documenting with PowerShell: Chapter 1 – Server overview page

In the previous blog I showed how to upload data to IT-Glue by creating a new flexible asset with two fields: a name field, on which we match to check whether a document already exists, and a field where we place some data. This time we’re going to upload an entire HTML file that we create by gathering data from the server we run the script on. So, let’s get started.

The scripts below can also be downloaded for your RMM system, find the AMP file here.

Data gathering

First, we’ll make a list of all the data that is important for us to have documented. For me, that would be the following list of information:

  • Server Name
  • Server type (A physical or virtual machine)
  • How much RAM the machine currently has assigned
  • The NIC configuration
  • What applications are installed
  • What server roles are installed
  • What the physical disk layout is
  • What the RAID layout is

Let’s get started with each of the data collection parts. In the following snippet we check whether the model contains “Virtual” or “VMware”, two indicators that a machine is virtual rather than physical.

$ComputerSystemInfo = Get-CimInstance -ClassName Win32_ComputerSystem
if($ComputerSystemInfo.model -match "Virtual" -or $ComputerSystemInfo.model -match "VMware") { $MachineType = "Virtual"} Else { $MachineType = "Physical"}

To get the amount of RAM, we run the following one-liner:

$RAM = (systeminfo | Select-String 'Total Physical Memory:').ToString().Split(':')[1].Trim()
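A side note on that one-liner: systeminfo is slow and its output is localized, so splitting on ‘Total Physical Memory:’ breaks on non-English systems. A minimal sketch of the formatting you would use with the CIM route instead; the byte count below is a made-up sample value, where on a real server it would come from (Get-CimInstance Win32_ComputerSystem).TotalPhysicalMemory:

```powershell
# On a real server:
#   $TotalBytes = (Get-CimInstance -ClassName Win32_ComputerSystem).TotalPhysicalMemory
# A made-up sample value so the formatting itself is easy to verify:
$TotalBytes = 8GB
$RAM = "{0:N0} MB" -f ($TotalBytes / 1MB)   # "8,192 MB"
```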

Applications and roles are also quite easy to get:

$ApplicationsFrag = Get-ItemProperty HKLM:\Software\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\* | Select-Object DisplayName, DisplayVersion, Publisher, InstallDate | Convertto-html -Fragment | select -skip 1
$ApplicationsTable = "<br/><table class=`"table table-bordered table-hover`" >" + $ApplicationsFrag

$RolesFrag = Get-WindowsFeature | Where-Object {$_.Installed -eq $True} | Select-Object displayname,name  | convertto-html -Fragment | Select-Object -Skip 1
$RolesTable = "<br/><table class=`"table table-bordered table-hover`" >" + $RolesFrag

For the NIC configuration, we need to do a little more work: we want it to show both the IPv4 and IPv6 configuration, and we want to cover all NICs in case a server has more than one.

$networkName = Get-CimInstance -ClassName Win32_NetworkAdapter | Where-Object {$_.PhysicalAdapter -eq "True"} | Sort Index
$networkIP = Get-CimInstance -ClassName Win32_NetworkAdapterConfiguration | Where-Object {$_.MACAddress -gt 0} | Sort Index
$networkSummary = New-Object -TypeName 'System.Collections.ArrayList'

foreach($nic in $networkName) {
    $nic_conf = $networkIP | Where-Object {$_.Index -eq $nic.Index}
 
    $networkDetails = New-Object PSObject -Property @{
        Index                = [int]$nic.Index;
        AdapterName         = [string]$nic.NetConnectionID;
        Manufacturer         = [string]$nic.Manufacturer;
        Description          = [string]$nic.Description;
        MACAddress           = [string]$nic.MACAddress;
        IPEnabled            = [bool]$nic_conf.IPEnabled;
        IPAddress            = [string]$nic_conf.IPAddress;
        IPSubnet             = [string]$nic_conf.IPSubnet;
        DefaultGateway       = [string]$nic_conf.DefaultIPGateway;
        DHCPEnabled          = [string]$nic_conf.DHCPEnabled;
        DHCPServer           = [string]$nic_conf.DHCPServer;
        DNSServerSearchOrder = [string]$nic_conf.DNSServerSearchOrder;
    }
    $networkSummary += $networkDetails
}
$NicRawConf = $networkSummary | select AdapterName,IPaddress,IPSubnet,DefaultGateway,DNSServerSearchOrder,MACAddress | Convertto-html -Fragment | select -Skip 1
$NicConf = "<br/><table class=`"table table-bordered table-hover`" >" + $NicRawConf

If a server is a physical machine and a Dell, I also want to see the disk configuration. For this, you’ll need Dell OpenManage installed.

if($machineType -eq "Physical" -and $ComputerSystemInfo.Manufacturer -match "Dell"){
$DiskLayoutRaw = omreport storage pdisk controller=0 -fmt cdv
$DiskLayoutSemi = $DiskLayoutRaw |  select-string -SimpleMatch "ID,Status," -context 0,($DiskLayoutRaw).Length | convertfrom-csv -Delimiter "," | select Name,Status,Capacity,State,"Bus Protocol","Product ID","Serial No.","Part Number",Media | convertto-html -Fragment
$DiskLayoutTable = "<br/><table class=`"table table-bordered table-hover`" >" + $DiskLayoutsemi

#Try to get RAID layout
$RAIDLayoutRaw = omreport storage vdisk controller=0 -fmt cdv
$RAIDLayoutSemi = $RAIDLayoutRaw |  select-string -SimpleMatch "ID,Status," -context 0,($RAIDLayoutRaw).Length | convertfrom-csv -Delimiter "," | select Name,Status,State,Layout,"Device Name","Read Policy","Write Policy",Media |  convertto-html -Fragment
$RAIDLayoutTable = "<br/><table class=`"table table-bordered table-hover`" >" + $RAIDLayoutsemi
}else {
    $RAIDLayoutTable = "Could not get physical disk info"
    $DiskLayoutTable = "Could not get physical disk info"
}

Putting all of that together gives us the following script. I’ve also added some headers to make the output HTML file look nice. You can use this as a stand-alone script.

#Server documentation script
$ComputerSystemInfo = Get-CimInstance -ClassName Win32_ComputerSystem
if($ComputerSystemInfo.model -match "Virtual" -or $ComputerSystemInfo.model -match "VMware") { $MachineType = "Virtual"} Else { $MachineType = "Physical"}
$networkName = Get-CimInstance -ClassName Win32_NetworkAdapter | Where-Object {$_.PhysicalAdapter -eq "True"} | Sort Index
$networkIP = Get-CimInstance -ClassName Win32_NetworkAdapterConfiguration | Where-Object {$_.MACAddress -gt 0} | Sort Index
$networkSummary = New-Object -TypeName 'System.Collections.ArrayList'

foreach($nic in $networkName) {
    $nic_conf = $networkIP | Where-Object {$_.Index -eq $nic.Index}
 
    $networkDetails = New-Object PSObject -Property @{
        Index                = [int]$nic.Index;
        AdapterName         = [string]$nic.NetConnectionID;
        Manufacturer         = [string]$nic.Manufacturer;
        Description          = [string]$nic.Description;
        MACAddress           = [string]$nic.MACAddress;
        IPEnabled            = [bool]$nic_conf.IPEnabled;
        IPAddress            = [string]$nic_conf.IPAddress;
        IPSubnet             = [string]$nic_conf.IPSubnet;
        DefaultGateway       = [string]$nic_conf.DefaultIPGateway;
        DHCPEnabled          = [string]$nic_conf.DHCPEnabled;
        DHCPServer           = [string]$nic_conf.DHCPServer;
        DNSServerSearchOrder = [string]$nic_conf.DNSServerSearchOrder;
    }
    $networkSummary += $networkDetails
}
$NicRawConf = $networkSummary | select AdapterName,IPaddress,IPSubnet,DefaultGateway,DNSServerSearchOrder,MACAddress | Convertto-html -Fragment | select -Skip 1
$NicConf = "<br/><table class=`"table table-bordered table-hover`" >" + $NicRawConf

$RAM = (systeminfo | Select-String 'Total Physical Memory:').ToString().Split(':')[1].Trim()

$ApplicationsFrag = Get-ItemProperty HKLM:\Software\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\* | Select-Object DisplayName, DisplayVersion, Publisher, InstallDate | Convertto-html -Fragment | select -skip 1
$ApplicationsTable = "<br/><table class=`"table table-bordered table-hover`" >" + $ApplicationsFrag

$RolesFrag = Get-WindowsFeature | Where-Object {$_.Installed -eq $True} | Select-Object displayname,name  | convertto-html -Fragment | Select-Object -Skip 1
$RolesTable = "<br/><table class=`"table table-bordered table-hover`" >" + $RolesFrag

if($machineType -eq "Physical" -and $ComputerSystemInfo.Manufacturer -match "Dell"){
$DiskLayoutRaw = omreport storage pdisk controller=0 -fmt cdv
$DiskLayoutSemi = $DiskLayoutRaw |  select-string -SimpleMatch "ID,Status," -context 0,($DiskLayoutRaw).Length | convertfrom-csv -Delimiter "," | select Name,Status,Capacity,State,"Bus Protocol","Product ID","Serial No.","Part Number",Media | convertto-html -Fragment
$DiskLayoutTable = "<br/><table class=`"table table-bordered table-hover`" >" + $DiskLayoutsemi

#Try to get RAID layout
$RAIDLayoutRaw = omreport storage vdisk controller=0 -fmt cdv
$RAIDLayoutSemi = $RAIDLayoutRaw |  select-string -SimpleMatch "ID,Status," -context 0,($RAIDLayoutRaw).Length | convertfrom-csv -Delimiter "," | select Name,Status,State,Layout,"Device Name","Read Policy","Write Policy",Media |  convertto-html -Fragment
$RAIDLayoutTable = "<br/><table class=`"table table-bordered table-hover`" >" + $RAIDLayoutsemi
}else {
    $RAIDLayoutTable = "Could not get physical disk info"
    $DiskLayoutTable = "Could not get physical disk info"
}
#Head for HTML
$head = @"
<Title>Server Log Report</Title>
<style>
body { background-color:#E5E4E2;
      font-family:Monospace;
      font-size:10pt; }
td, th { border:0px solid black; 
        border-collapse:collapse;
        white-space:pre; }
th { color:white;
    background-color:black; }
table, tr, td, th {
     padding: 2px; 
     margin: 0px;
     white-space:pre; }
tr:nth-child(odd) {background-color: lightgray}
table { width:95%;margin-left:5px; margin-bottom:20px; }
h2 {
font-family:Tahoma;
color:#6D7B8D;
}
.footer 
{ color:green; 
 margin-left:10px; 
 font-family:Tahoma;
 font-size:8pt;
 font-style:italic;
}
</style>
"@

$HTMLFile = @"
$head
<b>Servername</b>: $ENV:COMPUTERNAME <br>
<b>Server Type</b>: $machineType <br>
<b>Amount of RAM</b>: $RAM <br>
<br>
<h1>NIC Configuration</h1> <br>
$NicConf
<br>
<h1>Installed Applications</h1> <br>
$ApplicationsTable
<br>
<h1>Installed Roles</h1> <br>
$RolesTable
<br>
<h1>Physical Disk information</h1>
$DiskLayoutTable
<h1>RAID information</h1>
$RAIDLayoutTable
"@

$HTMLFile | out-file C:\Temp\ServerDoc.html

To upload this data to IT-Glue, we can use the exact same script as we used last time, with two small edits: the title of the flexible asset, and the description.

#####################################################################
$APIKEy =  "YOUR API KEY GOES HERE"
$APIEndpoint = "https://api.eu.itglue.com"
$orgID = "THE ORGANISATIONID YOU WOULD LIKE TO UPDATE GOES HERE"
$FlexAssetName = "ITGLue AutoDoc - Server Overview"
$Description = "a server one-page document that shows the current configuration"
#####################################################################
#This is the object we'll be sending to IT-Glue. 
$ComputerSystemInfo = Get-CimInstance -ClassName Win32_ComputerSystem
if($ComputerSystemInfo.model -match "Virtual" -or $ComputerSystemInfo.model -match "VMware") { $MachineType = "Virtual"} Else { $MachineType = "Physical"}
$networkName = Get-CimInstance -ClassName Win32_NetworkAdapter | Where-Object {$_.PhysicalAdapter -eq "True"} | Sort Index
$networkIP = Get-CimInstance -ClassName Win32_NetworkAdapterConfiguration | Where-Object {$_.MACAddress -gt 0} | Sort Index
$networkSummary = New-Object -TypeName 'System.Collections.ArrayList'

foreach($nic in $networkName) {
    $nic_conf = $networkIP | Where-Object {$_.Index -eq $nic.Index}
 
    $networkDetails = New-Object PSObject -Property @{
        Index                = [int]$nic.Index;
        AdapterName         = [string]$nic.NetConnectionID;
        Manufacturer         = [string]$nic.Manufacturer;
        Description          = [string]$nic.Description;
        MACAddress           = [string]$nic.MACAddress;
        IPEnabled            = [bool]$nic_conf.IPEnabled;
        IPAddress            = [string]$nic_conf.IPAddress;
        IPSubnet             = [string]$nic_conf.IPSubnet;
        DefaultGateway       = [string]$nic_conf.DefaultIPGateway;
        DHCPEnabled          = [string]$nic_conf.DHCPEnabled;
        DHCPServer           = [string]$nic_conf.DHCPServer;
        DNSServerSearchOrder = [string]$nic_conf.DNSServerSearchOrder;
    }
    $networkSummary += $networkDetails
}
$NicRawConf = $networkSummary | select AdapterName,IPaddress,IPSubnet,DefaultGateway,DNSServerSearchOrder,MACAddress | Convertto-html -Fragment | select -Skip 1
$NicConf = "<br/><table class=`"table table-bordered table-hover`" >" + $NicRawConf

$RAM = (systeminfo | Select-String 'Total Physical Memory:').ToString().Split(':')[1].Trim()

$ApplicationsFrag = Get-ItemProperty HKLM:\Software\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\* | Select-Object DisplayName, DisplayVersion, Publisher, InstallDate | Convertto-html -Fragment | select -skip 1
$ApplicationsTable = "<br/><table class=`"table table-bordered table-hover`" >" + $ApplicationsFrag

$RolesFrag = Get-WindowsFeature | Where-Object {$_.Installed -eq $True} | Select-Object displayname,name  | convertto-html -Fragment | Select-Object -Skip 1
$RolesTable = "<br/><table class=`"table table-bordered table-hover`" >" + $RolesFrag

if($machineType -eq "Physical" -and $ComputerSystemInfo.Manufacturer -match "Dell"){
$DiskLayoutRaw = omreport storage pdisk controller=0 -fmt cdv
$DiskLayoutSemi = $DiskLayoutRaw |  select-string -SimpleMatch "ID,Status," -context 0,($DiskLayoutRaw).Length | convertfrom-csv -Delimiter "," | select Name,Status,Capacity,State,"Bus Protocol","Product ID","Serial No.","Part Number",Media | convertto-html -Fragment
$DiskLayoutTable = "<br/><table class=`"table table-bordered table-hover`" >" + $DiskLayoutsemi

#Try to get RAID layout
$RAIDLayoutRaw = omreport storage vdisk controller=0 -fmt cdv
$RAIDLayoutSemi = $RAIDLayoutRaw |  select-string -SimpleMatch "ID,Status," -context 0,($RAIDLayoutRaw).Length | convertfrom-csv -Delimiter "," | select Name,Status,State,Layout,"Device Name","Read Policy","Write Policy",Media |  convertto-html -Fragment
$RAIDLayoutTable = "<br/><table class=`"table table-bordered table-hover`" >" + $RAIDLayoutsemi
}else {
    $RAIDLayoutTable = "Could not get physical disk info"
    $DiskLayoutTable = "Could not get physical disk info"
}

$HTMLFile = @"
<b>Servername</b>: $ENV:COMPUTERNAME <br>
<b>Server Type</b>: $machineType <br>
<b>Amount of RAM</b>: $RAM <br>
<br>
<h1>NIC Configuration</h1> <br>
$NicConf
<br>
<h1>Installed Applications</h1> <br>
$ApplicationsTable
<br>
<h1>Installed Roles</h1> <br>
$RolesTable
<br>
<h1>Physical Disk information</h1>
$DiskLayoutTable
<h1>RAID information</h1>
$RAIDLayoutTable
"@



$FlexAssetBody = 
@{
    type = 'flexible-assets'
    attributes = @{
            name = $FlexAssetName
            traits = @{
                "name" = $ENV:COMPUTERNAME
                "information" = $HTMLFile
            }
    }
}

#ITGlue upload starts here.
If(Get-Module -ListAvailable -Name "ITGlueAPI") {Import-module ITGlueAPI} Else { install-module ITGlueAPI -Force; import-module ITGlueAPI}
#Settings IT-Glue logon information
Add-ITGlueBaseURI -base_uri $APIEndpoint
Add-ITGlueAPIKey $APIKEy
#Checking if the FlexibleAsset exists. If not, create a new one.
$FilterID = (Get-ITGlueFlexibleAssetTypes -filter_name $FlexAssetName).data
if(!$FilterID){ 
    $NewFlexAssetData = 
    @{
        type = 'flexible-asset-types'
        attributes = @{
                name = $FlexAssetName
                icon = 'sitemap'
                description = $description
        }
        relationships = @{
            "flexible-asset-fields" = @{
                data = @(
                    @{
                        type       = "flexible_asset_fields"
                        attributes = @{
                            order           = 1
                            name            = "name"
                            kind            = "Text"
                            required        = $true
                            "show-in-list"  = $true
                            "use-for-title" = $true
                        }
                    },
                    @{
                        type       = "flexible_asset_fields"
                        attributes = @{
                            order          = 2
                            name           = "information"
                            kind           = "Textbox"
                            required       = $false
                            "show-in-list" = $false
                        }
                    }
                )
                }
            }
              
       }
New-ITGlueFlexibleAssetTypes -Data $NewFlexAssetData 
$FilterID = (Get-ITGlueFlexibleAssetTypes -filter_name $FlexAssetName).data
} 

#Upload data to IT-Glue. We try to match the Server name to current computer name.
$ExistingFlexAsset = (Get-ITGlueFlexibleAssets -filter_flexible_asset_type_id $Filterid.id -filter_organization_id $orgID).data | Where-Object {$_.attributes.name -eq $ENV:COMPUTERNAME}

#If the Asset does not exist, we edit the body to be in the form of a new asset, if not, we just upload.
if(!$ExistingFlexAsset){
$FlexAssetBody.attributes.add('organization-id', $orgID)
$FlexAssetBody.attributes.add('flexible-asset-type-id', $FilterID.id)
Write-Host "Creating new flexible asset"
New-ITGlueFlexibleAssets -data $FlexAssetBody

} else {
Write-Host "Updating Flexible Asset"
Set-ITGlueFlexibleAssets -id $ExistingFlexAsset.id  -data $FlexAssetBody}

And that’s it! Happy PowerShelling! The next chapter will be how to upload the BitLocker key of a client machine to IT-Glue as a password object; it will also tag all related devices, something that’s pretty cool!

Documenting with PowerShell – New series

Hi All!

Starting this week I’ll be blogging about using PowerShell with your RMM/automation platform and running scripts to collect valuable documentation. I’ll try to keep it as generic as possible and export the documentation to HTML, but I’ll always include a version to upload it to IT-Glue or Confluence. As requested by some, I’ll also include the AMP for N-Central so you can get going with it.

To get started straight away, I’ll share the script that we will be using throughout this series to upload documentation to IT-Glue fully automated. You won’t even need to create flexible assets, as the script does this for you.

The Script

For the script you’ll need at least Windows 10 or Server 2012 R2+. You’ll also need your IT-Glue API key and the URL; generally speaking that URL is “https://api.itglue.com”, or “https://api.eu.itglue.com” for European users. Now let’s get started on our uploading script. 🙂

N-Able users can download the AMP for this script here (right click -> Save as). The script can use Custom Device or Organisation Properties as input, and thus you can enter the organisation ID on each Custom Organisation Property and automate your documentation process completely.

#####################################################################
$APIKEy =  "YOUR API KEY GOES HERE"
$APIEndpoint = "https://api.eu.itglue.com"
$orgID = "THE ORGANISATIONID YOU WOULD LIKE TO UPDATE GOES HERE"
$FlexAssetName = "ITGLue AutoDoc - Quick example"
$Description = "a quick overview of how easy it is to upload data to IT-Glue"
#####################################################################
#This is the object we'll be sending to IT-Glue. 
$HTMLStuff = @"
<b>Servername</b>: $ENV:COMPUTERNAME <br>
<b>Number of Processors</b>: $ENV:NUMBER_OF_PROCESSORS <br>

This is a little example of how we upload data to IT-Glue.
"@
$FlexAssetBody = 
@{
    type = 'flexible-assets'
    attributes = @{
            name = $FlexAssetName
            traits = @{
                "name" = $ENV:COMPUTERNAME
                "information" = $HTMLStuff
            }
    }
}

#ITGlue upload starts here.
If(Get-Module -ListAvailable -Name "ITGlueAPI") {Import-module ITGlueAPI} Else { install-module ITGlueAPI -Force; import-module ITGlueAPI}
#Settings IT-Glue logon information
Add-ITGlueBaseURI -base_uri $APIEndpoint
Add-ITGlueAPIKey $APIKEy
#Checking if the FlexibleAsset exists. If not, create a new one.
$FilterID = (Get-ITGlueFlexibleAssetTypes -filter_name $FlexAssetName).data
if(!$FilterID){ 
    $NewFlexAssetData = 
    @{
        type = 'flexible-asset-types'
        attributes = @{
                name = $FlexAssetName
                icon = 'sitemap'
                description = $description
        }
        relationships = @{
            "flexible-asset-fields" = @{
                data = @(
                    @{
                        type       = "flexible_asset_fields"
                        attributes = @{
                            order           = 1
                            name            = "name"
                            kind            = "Text"
                            required        = $true
                            "show-in-list"  = $true
                            "use-for-title" = $true
                        }
                    },
                    @{
                        type       = "flexible_asset_fields"
                        attributes = @{
                            order          = 2
                            name           = "information"
                            kind           = "Textbox"
                            required       = $false
                            "show-in-list" = $false
                        }
                    }
                )
                }
            }
              
       }
New-ITGlueFlexibleAssetTypes -Data $NewFlexAssetData 
$FilterID = (Get-ITGlueFlexibleAssetTypes -filter_name $FlexAssetName).data
} 

#Upload data to IT-Glue. We try to match the Server name to current computer name.
$ExistingFlexAsset = (Get-ITGlueFlexibleAssets -filter_flexible_asset_type_id $Filterid.id -filter_organization_id $orgID).data | Where-Object {$_.attributes.name -eq $ENV:COMPUTERNAME}

#If the Asset does not exist, we edit the body to be in the form of a new asset, if not, we just upload.
if(!$ExistingFlexAsset){
$FlexAssetBody.attributes.add('organization-id', $orgID)
$FlexAssetBody.attributes.add('flexible-asset-type-id', $FilterID.id)
Write-Host "Creating new flexible asset"
New-ITGlueFlexibleAssets -data $FlexAssetBody

} else {
Write-Host "Updating Flexible Asset"
Set-ITGlueFlexibleAssets -id $ExistingFlexAsset.id  -data $FlexAssetBody}

The script does multiple things for you that a lot of other scripts tend to skimp on:

  • We check if a Flexible Asset type with our chosen name is already present; if it’s not, we create it.
  • We then check if a Flexible Asset already exists with the name we’ve entered; if not, we upload a fresh one, and if it does, we upload an update for that specific item.
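That second step boils down to plain hashtable editing: a new asset only needs two extra attributes that tell IT-Glue where it belongs. A minimal sketch of the body manipulation (the ID values and traits below are placeholders):

```powershell
$FlexAssetBody = @{
    type       = 'flexible-assets'
    attributes = @{
        name   = 'ITGLue AutoDoc - Quick example'
        traits = @{ 'name' = 'SERVER01'; 'information' = '<b>demo</b>' }
    }
}
# Only when the asset does not exist yet do we add these two keys
# (placeholder IDs; the real values come from the API):
$FlexAssetBody.attributes.Add('organization-id', 1234)
$FlexAssetBody.attributes.Add('flexible-asset-type-id', 5678)
```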

In the following series I’ll teach you how to derive the organisation ID from information we gather on the machine you are running your script on. We’ll also tackle how to get the correct devices tagged on your flexible assets, but of course we’ll start by taking apart the script above and teaching you how to create fully automated network documentation.

As always, Happy Powershelling!

Monitoring with PowerShell Chapter 3: Monitoring network state

Our clients often want us to monitor specific network connections, such as VPN tunnels that need to be online, services that always need to be reachable, or even simply to report on internet connection speeds. To do this, we mostly use our network controller software and default RMM sets. In rare cases, that is not enough, so we’ve developed some monitoring sets for our RMM to help us with this.

To start, we have an RMM monitoring set that uses the Test-Connection cmdlet to ping multiple hosts entered in our RMM system. We define these per client, so their most important resources are checked constantly.

$State = "Healthy"
$IPsToPing = $IPsToPing.split(",")
try {
    $ConnectionTest = test-connection $IPsToPing -count 3 -ErrorAction stop -Verbose
}
catch [System.Management.Automation.ActionPreferenceStopException] {
    try {
        throw $_.exception
    }
    catch [System.Net.NetworkInformation.PingException] {
        $State = "$($error[0])"
    }
    catch {
        $State = "$($error[0])"
    }
}

$EndResult = $ConnectionTest | measure-Object -Property ResponseTime -Average -Sum -Maximum -Minimum

#Format with zero decimals.
$AVGMS = $EndResult.Average.ToString('0')
$MaxMS = $EndResult.Maximum.ToString('0')
$MinMs = $EndResult.Minimum.ToString('0')

$AVGMS shows the average response time in milliseconds over the 3 pings, $MaxMS shows the slowest response during those 3 pings, and $MinMs, you’ve guessed it, shows the fastest ping in the west 😉
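The Measure-Object arithmetic is easy to sanity-check against a hand-made set of response times (the values below are made up for illustration):

```powershell
# Three fake response times in milliseconds.
$Samples = 10, 20, 60
$Result  = $Samples | Measure-Object -Average -Maximum -Minimum
$AVGMS   = $Result.Average.ToString('0')    # "30"
$MaxMS   = $Result.Maximum.ToString('0')    # "60"
$MinMS   = $Result.Minimum.ToString('0')    # "10"
```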

Next to ping monitoring, we also check the health state of the internal network on Windows servers. When taking over infrastructure, we often see that Network Location Awareness does not function or start correctly.

The Network Location Awareness service is the service that tells your OS which network profile it should use: “Public”, “Private” or “Domain”. Not having it running can cause a myriad of issues, such as SSPI issues on SQL servers, firewalling issues, and much more!

Most of our environments use $Domainname.com or, when doing takeovers, $domainname.local. The common denominator is that the network profile name contains a period. From there we compare it to the actual NetworkCategory. If these do not match, we alert and check whether we need to recover the Network Location Awareness service.

$NetworkProfile = Get-NetConnectionProfile

foreach ($NetProfile in $NetworkProfile | Where-Object { $_.Name -match "\." }) {
    if ($NetProfile.NetworkCategory -ne "DomainAuthenticated") {
        $NLAState = "Domain Authentication might not work properly."
        $DomainState = "Network is not set to DomainAuthenticated."
    }
    else {
        $DomainState = "Healthy"
    }
}

if (!$NLAState) { $NLAState = "Healthy" }
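When the set reports an unhealthy state, recovering the Network Location Awareness service is usually the next step. A minimal sketch of that decision is below; the $DomainState value is a hard-coded sample and the actual Restart-Service call is left commented out, since on a live server the state comes from the monitoring set above.

```powershell
# Sample state as the monitoring set above would produce it (placeholder value)
$DomainState = "Network is not set to DomainAuthenticated."

if ($DomainState -ne "Healthy") {
    # On a live server, recover NLA and let the profile re-detect, e.g.:
    # Restart-Service -Name NlaSvc -Force
    $Remediation = "Restart of NlaSvc required"
}
else {
    $Remediation = "No action needed"
}
$Remediation
```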

And that’s the blog for today! Enjoy, and as always, Happy PowerShelling!

Monitoring with PowerShell Chapter 3: Monitoring Modern Authentication

Modern Authentication is turned on by default for new tenants, but if you have legacy tenants, or take over tenants from other MSPs, then sometimes you might have tenants that do not use Modern Authentication yet.

Monitoring and auto-remediation are key here when using Multi-Factor Authentication. We want the best user experience, so we must have Modern Authentication enabled to make sure users get a nice-looking pop-up in Outlook. We also want to avoid using App Passwords.

PowerShell Monitoring script:

This script only monitors the Modern Auth status, and does not auto-remediate.

$creds = get-credential
Connect-MsolService -Credential $creds
$clients = Get-MsolPartnerContract -All
 
foreach ($client in $clients) { 
 $ClientDomain = Get-MsolDomain -TenantId $client.TenantId | Where-Object {$_.IsInitial -eq $true}
 Write-host "Logging into portal for $($client.Name)"
 $DelegatedOrgURL = "https://ps.outlook.com/powershell-liveid?DelegatedOrg=" + $ClientDomain.Name
 $ExchangeOnlineSession = New-PSSession -ConnectionUri $DelegatedOrgURL -Credential $creds -Authentication Basic -ConfigurationName Microsoft.Exchange -AllowRedirection
 Import-PSSession -Session $ExchangeOnlineSession -AllowClobber -DisableNameChecking
 
 $Oauth = Get-OrganizationConfig 
 
 if($Oauth.OAuth2ClientProfileEnabled -eq $false){ $ModernAuthState += "$($ClientDomain.name) has modern auth disabled"}
 
 Remove-PSSession $ExchangeOnlineSession
}

if(!$ModernAuthState){ $ModernAuthState = "Healthy"}

PowerShell auto-remediation script:

$creds = get-credential
Connect-MsolService -Credential $creds 
$clients = Get-MsolPartnerContract -All
 
foreach ($client in $clients) { 
 $ClientDomain = Get-MsolDomain -TenantId $client.TenantId | Where-Object {$_.IsInitial -eq $true}
 Write-host "Logging into portal for $($client.Name)"
 $DelegatedOrgURL = "https://ps.outlook.com/powershell-liveid?DelegatedOrg=" + $ClientDomain.Name
 $ExchangeOnlineSession = New-PSSession -ConnectionUri $DelegatedOrgURL -Credential $creds -Authentication Basic -ConfigurationName Microsoft.Exchange -AllowRedirection
 Import-PSSession -Session $ExchangeOnlineSession -AllowClobber -DisableNameChecking
 
 $Oauth = Get-OrganizationConfig 
 
 if($Oauth.OAuth2ClientProfileEnabled -eq $false){ Set-OrganizationConfig -OAuth2ClientProfileEnabled $true }
 
 Remove-PSSession $ExchangeOnlineSession
}

And that’s it! Hope it helps and as always, Happy PowerShelling.

Connectwise: Quick Time Entry Form

After releasing the Autotask Quick Time Entry Form I’ve gotten so many responses from MSPs that loved it, and quite a few requesting a Connectwise version. Guess what?! I’m making dreams come true today!

This is my first Connectwise integration, and I had some great help from some friends that use Connectwise Manage, as the terminology is slightly different from Autotask’s.

Preparations

Create an API user in Connectwise. That’s it.

Download the files to install here.

Installation:
  • Copy all files to your webhost.
  • Edit the file settings.php. Fill in all fields. Note that you only have to edit the URL portion of $APIEndpoint.
  • Open EnterTime.php, find all entries called “YOURWEBHOST” and replace them with your actual webhost URL.

When all of this is done, you can browse to http://YOURWEBHOST.com/frontend/EnterTime.php and you’ll be able to enter time entries in a couple of quick steps. Feel free to edit the files, change how it looks or even use it in other projects as long as you credit me as the source.

Good luck!

Monitoring with PowerShell Chapter 3: Monitoring SSL certificates on IIS

For some clients that still have on-site servers running IIS, we have to monitor the SSL status to make sure that things such as the Remote Desktop Gateway or SSTP VPN are always available. To make sure that we are able to replace SSL certificates on time, we need to know when specific certificates expire.

To do this, we will use the Get-IISSite cmdlet. This cmdlet exposes all of the IIS configuration in an easy to use format. First, we’ll get all sites with bindings like this;

(Get-IISSite).bindings

Now if you run this, you won’t see that much information. To make sure we get a list of all possible items we have a couple of choices, but the simplest is using the select statement:

(Get-IISSite).bindings | select *

Here we’re telling PowerShell we want to select every possible property returned by our command. With this you’ll see a lot more information about the bindings, including the protocol used. Because we only want information about the sites secured with a certificate, we can run the same command again, but this time filtering on the protocol “https”:

(Get-IISSite).bindings | where-object {$_.Protocol -eq "https"}

Now we have the information that we want! Our next step is easy: we find the bound certificate in the local certificate store and check the expiry date. We like to be warned 14 days before expiry, but you can change the days to anything you’d like 🙂

$Days = (Get-Date).AddDays(14)
$CertsBound = (Get-IISSite).bindings | Where-Object { $_.Protocol -eq "https" }
foreach ($Cert in $CertsBound) {
    $CertFile = Get-ChildItem -Path "Cert:\LocalMachine\$($Cert.CertificateStoreName)" | Where-Object -Property Thumbprint -eq $Cert.RawAttributes.certificateHash
    if ($CertFile.NotAfter -lt $Days) { $CertState += "$($CertFile.FriendlyName) will expire on $($CertFile.NotAfter)`n" }
}

if(!$certState){$CertState = "Healthy"}

And that’s it! If the certificate will expire within 14 days, this set will start alerting. Happy PowerShelling!

Autotask Quick Time Entry

Autotask’s app for iOS and Android isn’t the greatest app ever made. Because of this, our on-site engineers do not always enter time entries, as it’s too much of a hassle.

So to make sure that time entries are added on time, I’ve created a quick web application to enter time entries easily from a phone. Please note that I am not a PHP developer in any way, so if you feel the code is messy or not up to standards, you are free to fix it yourself. This entire blog post comes without guarantees. 😉

Preparations

You have to have an Autotask API user. Create this user and document the username and password. The user should have the role “API User” and have the following extra settings enabled:

  • Note > Contract > ImpersonateForCreate
  • Note > Contract > ImpersonateForModify
  • CrmNote > ImpersonateForCreate
  • CrmNote > ImpersonateForModify
  • Note > Project > ImpersonateForCreate
  • Note > Project > ImpersonateForModify
  • Note > ProjectTask > ImpersonateForCreate
  • Note > ProjectTask > ImpersonateForModify
  • Note > Ticket > ImpersonateForCreate
  • Note > Ticket > ImpersonateForModify
  • TimeEntry > ImpersonateForCreate
  • TimeEntry > ImpersonateForModify

Download the files to install here.

Installation:
  • Copy all files to your webhost.
  • Edit the file bootstrap.php in _libAT. Enter your Autotask API username and password.
  • Enter the correct WSDL location. You can find this in the API documentation here.
  • Browse to https://yourwebhost.com/frontend/GetRoles.php. Copy the ID you want to use as the default role.
  • Open SubmitTime.php and paste the DefaultRoleID in the correct location here.
  • Open EnterTime.php and find “worktype”. Here you’ll see a list of worktypes. Edit this list to match your Autotask configuration. Keep this file open.
  • Find all entries called “YOURWEBHOST” and replace them with your actual webhost URL.

When all of this is done, you can browse to http://YOURWEBHOST.com/frontend/EnterTime.php and you’ll be able to enter time entries in a couple of quick steps. Feel free to edit the files, change how it looks or even use it in other projects as long as you credit me as the source.

Monitoring with PowerShell Chapter 3: Monitoring user creation

We all know that bad actors often create accounts for repeat access when they gain access to your network. To make sure that we are aware of these situations, we use PowerShell monitoring.

For security measures we monitor 4 types of user creation:

  • All created domain users,
  • All users added to privileged groups (e.g. Domain Admins),
  • All created local users,
  • All (temporary) users created that contain specific keywords.

Temporary users monitoring:

The temporary users monitoring set is only used on Active Directory users, not local users, as we’re already monitoring all local user creation and this would be redundant.

$ArrayOfNames = @("test", "tmp", "skykick", "mig", "migwiz", "temp", "-admin", "supervisor")

foreach ($name in $ArrayOfNames) {
    $filter = 'Name -like "*' + $name + '*"'
    $Users = Get-ADUser -Filter $filter -Properties whenCreated
    foreach ($user in $Users) {
        $TemporaryUser += "$($user.name) has been found, created at $($user.whencreated)`n"
    }
}
if (!$TemporaryUser) { $TemporaryUser = "No Temporary Accounts Found" }

Created Domain Users

For domain users, we alert on all users created in the last 24 hours. After 24 hours, this monitoring set goes back to a healthy state.

$When = ((Get-Date).AddDays(-1)).Date
$GetUsers = Get-ADUser -Filter { whenCreated -ge $When } -Properties whenCreated

foreach ($user in $GetUsers) {
    $userchanges += "$($user.name) has been created at $($user.whencreated)`n"
}

if (!$UserChanges) { $UserChanges = "No Changes Detected" }

All created local users

Local users do not have a creation date, which makes them a little more difficult to monitor. To solve this, we create a file with the results of Get-LocalUser and update that file every 24 hours. From there on, we compare the file to the most recent output of Get-LocalUser. We are alerted if the comparison finds newer content, meaning a user has been created or removed in the local user database.

$Outputfile = "C:\Windows\temp\LocalUsers.txt"
$Localusers = Get-LocalUser | Select-Object *
# Seed the baseline file on the first run, so the comparison below has something to compare against.
if (!(Test-Path $Outputfile)) { $Localusers | ConvertTo-Json | Out-File $Outputfile }
$OutPutFileContents = Get-Content $Outputfile | ConvertFrom-Json

# Refresh the baseline every 24 hours; the comparison below still uses the contents read above.
If ((Get-Item $Outputfile).LastWriteTime -lt (Get-Date).AddDays(-1)) { $Localusers | ConvertTo-Json | Out-File $Outputfile }

$Compare = Compare-Object -DifferenceObject $OutPutFileContents.name -ReferenceObject $Localusers.name

if (!$Compare) { $Compare = "Healthy" }

Privileged Group Changes

We monitor the privileged groups by using Ashley McGlone’s script to get the changes. You can find that full script here.

Function Get-PrivilegedGroupChanges {
    Param(
        $Server = "localhost",
        $Hour = 24
    )

    $ProtectedGroups = Get-ADGroup -Filter 'AdminCount -eq 1' -Server $Server
    $Members = @()

    ForEach ($Group in $ProtectedGroups) {
        $Members += Get-ADReplicationAttributeMetadata -Server $Server `
            -Object $Group.DistinguishedName -ShowAllLinkedValues |
            Where-Object { $_.IsLinkValue } |
            Select-Object @{name = 'GroupDN'; expression = { $Group.DistinguishedName } },
                @{name = 'GroupName'; expression = { $Group.Name } }, *
    }

    $Members |
        Where-Object { $_.LastOriginatingChangeTime -gt (Get-Date).AddHours(-1 * $Hour) }
}

$ListOfChanges = Get-PrivilegedGroupChanges

if (!$ListOfChanges) {
    $GroupChanges = "No Changes Detected"
}
else {
    foreach ($item in $ListOfChanges) {
        $GroupChanges += "Group $($item.GroupName) has been changed - $($item.AttributeValue) has been added or removed`n"
    }
}

Monitoring with PowerShell Chapter 3: Monitoring and remediating Windows Feature Update status

With the advent of Windows 10, all MSPs are faced with a new challenge: how do we manage the different Windows 10 feature versions, and how do we make sure we can automatically upgrade our clients to the latest version of the Windows 10 OS? Microsoft has not made feature updating very straightforward, and sometimes the automatic updates error out.

A lot of RMM systems claim they do version upgrades perfectly; unfortunately, I have not seen any RMM that gracefully upgrades machines without too many issues and without user intervention. As an MSP it’s key to automate as much as possible and prevent engineers from having to bother users to perform machine upgrades, so here is PowerShell to the rescue:

Checking the Windows version and comparing it to your standards.

We’ve decided we want all of our users on the same feature level, so we’ve created a monitoring set that alerts us when Windows 10 computers have a lower version than the one we’ve centrally set. Quite simply, our monitoring set only returns the current release ID, which we pull from the registry:

$ReleaseID = (Get-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion" -Name releaseid).releaseid

Now we alert on this monitoring set when it is below an expected value. In our case anything lower than 1903 generates an alert. Our system then sees that this alert has been generated and performs a remediation script during the next maintenance cycle we’ve agreed with the client.
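As a sketch, that comparison could look like the following. The minimum release and the sample $ReleaseID are placeholder values; on a live machine you would read $ReleaseID from the registry as shown above.

```powershell
# Minimum feature release we expect; 1903 is the value used in this post
$MinimumRelease = 1903

# On a live machine, read the value from the registry:
# $ReleaseID = (Get-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion" -Name releaseid).releaseid
# Hard-coded sample value for illustration:
$ReleaseID = "1809"

# Compare numerically, since the registry returns the release ID as a string
if ([int]$ReleaseID -lt $MinimumRelease) {
    $FeatureUpdateState = "Unhealthy - Windows 10 release $ReleaseID is below $MinimumRelease"
}
else {
    $FeatureUpdateState = "Healthy"
}
$FeatureUpdateState
```

Anything other than "Healthy" triggers the remediation script below during the next maintenance cycle.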

Remediation

Remediation for being on the wrong feature update is easy: run the update and done. The problem is that most RMM systems don’t handle this very cleanly. To resolve this we’ve created a PowerShell script that grabs the ISO from a network share, copies it to a temporary location, and runs the upgrade from there.

The script also makes sure there is enough space on the GPT boot partition by deleting the unnecessary font files from it.

If the network share is not available, it will download the ISO from a web server you specify, which is great for a mobile workforce. All you have to change in this script are $ISOPath and $WebServer.

$ISOPath = "\\Servername\Netlogon\Windows10.iso"
$WebServer = "http://YOURWEBSERVER/Windows10.iso"
$OSDiskIndex = gwmi -query "Select * from Win32_DiskPartition WHERE Bootable = True" | Select-Object -ExpandProperty DiskIndex
$PartTypeFull = gwmi -query "Select * from Win32_DiskPartition WHERE Index = 0" | Select-Object -ExpandProperty Type
$PartType = $PartTypeFull.Substring(0, 3)
if ($PartType -like "*GPT*") {
    Write-Output "System has a GPT partition, clearing EFI fonts...."
    cmd.exe /c "mountvol b: /s"
    Remove-Item b:\efi\Microsoft\Boot\Fonts\*.* -Force
    cmd.exe /c "mountvol b: /d"
}
if (Test-Path $ISOPath) {
    Write-Output "Mounting ISO from: $ISOPath"
}
else {
    Write-Output "Warn: ISO not found at: $ISOPath"
    Write-Output "Downloading ISO from webserver...."
    if (!(Test-Path "c:\temp")) { mkdir c:\temp }
    $ISOPath = "c:\temp\Windows_10_upgrade.iso"
    Invoke-WebRequest $WebServer -OutFile $ISOPath
}
Mount-DiskImage -ImagePath $ISOPath
$ISODrive = Get-DiskImage -ImagePath $ISOPath | Get-Volume | Select-Object -ExpandProperty DriveLetter
Write-Output "Mounted ISO on drive: $ISODrive"
$ExePath = $ISODrive + ":\setup.exe"
$Arguments = "/auto upgrade /quiet /noreboot"
Write-Output "Running setup from ISO: $ExePath"
Start-Process $ExePath $Arguments

And tada! That’s it. The upgrade will run, but the machine will not reboot until the user performs the reboot. We schedule more tasks during our maintenance cycle, so we’d rather have the RMM system handle the reboot.

Happy PowerShelling!