Category Archives: Powershell

Monitoring with PowerShell: Monitoring disk speed

Sometimes we get complaints from clients reporting "my machine is slow" without any real leads to work with. The client just experiences slowness. In most cases this is due to disk speed: the client bought a cheap computer with a 5400 RPM spinner in it and expects it to perform just as well as any machine we supplied with an SSD.

To prevent this we could look at things such as the disk queue, explained in an earlier blog here. The problem with that type of monitoring is that it's quite intermittent; you often only find the issue after a user has already complained. Because of this I've created a monitoring set that runs once or twice a (work) day in our RMM system. The script simply does a quick test of how fast it can create and read files, and we use the results for alerting and reporting.

The script

The script uses Diskspd.exe by Microsoft. You can download diskspd.exe here. You have to download the file and host it somewhere yourself. The script then downloads diskspd.exe from that location and executes two commands: reading a 50 MB file for 30 seconds, and writing a 50 MB file for 30 seconds, totaling 1 minute of performance testing.

$DownloadURL = "https://example.com/diskspd.exe"
Invoke-WebRequest -Uri $DownloadURL -OutFile "C:\Windows\Temp\diskspd.exe"
# 30-second read test (-w0 = 0% writes) and 30-second write test (-w100) against a 50 MB test file.
$ReadTest = & "C:\Windows\Temp\diskspd.exe" -b128K -d30 -o32 -t1 -W0 -S -w0 -c50M test.dat
$WriteTest = & "C:\Windows\Temp\diskspd.exe" -b128K -d30 -o32 -t1 -W0 -S -w100 -Z128K -c50M test.dat
# Grab the pipe-delimited totals line from the diskspd output and parse it into an object.
$ReadResults = $ReadTest[-8] | ConvertFrom-Csv -Delimiter "|" -Header Bytes, IO, MiB, IOPS, File | Select-Object IO, MiB, IOPS
$WriteResults = $WriteTest[-1] | ConvertFrom-Csv -Delimiter "|" -Header Bytes, IO, MiB, IOPS, File | Select-Object IO, MiB, IOPS

$ReadResults and $WriteResults will contain the IO, MiB/s, and IOPS. You can alert on each of these values. My personal preference is to alert when the MiB/s drops below 500, because then you can be fairly sure it's either a spinning disk, a performance issue, or an ancient SSD that needs to be replaced.
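As an illustration, a minimal alerting sketch along those lines could look like the following (the 500 MiB/s threshold and the $DiskSpeedHealth variable name are just my own choices; adjust them to your RMM):

# Flag the disk when either the read or the write throughput drops below the threshold.
$Threshold = 500
if ([double]$ReadResults.MiB -lt $Threshold -or [double]$WriteResults.MiB -lt $Threshold) {
    $DiskSpeedHealth = "Read: $($ReadResults.MiB) MiB/s, Write: $($WriteResults.MiB) MiB/s - below the $Threshold MiB/s threshold"
} else {
    $DiskSpeedHealth = "Healthy"
}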

And that’s all for today! as always, happy PowerShelling.

Connect to Exchange Online automated when MFA is enabled (Using the SecureApp Model)

Over the past months Microsoft has been forcing CSPs and MSPs to use MFA, something I strongly encourage and am glad about. The only issue is that Microsoft made this move without accounting for automation and automated jobs that need to run, especially jobs that run unattended and across multiple delegates.

To make sure we could keep automating, I've been advocating and discussing this with Microsoft a lot, including a public part here. After several months, and with great help from Janosch Ulmer at Microsoft, I've been able to compose a script for everyone to use to connect to all Microsoft resources using the Secure Application Model, including Exchange Online.

Compiling all of these scripts took me quite some time. If you have questions or issues just let me know!

The Script

Param
(
    [Parameter(Mandatory = $false)]
    [switch]$ConfigurePreconsent,
    [Parameter(Mandatory = $true)]
    [string]$DisplayName,
    [Parameter(Mandatory = $false)]
    [string]$TenantId
)

$ErrorActionPreference = "Stop"

# Check if the Azure AD PowerShell module has already been loaded.
if ( ! ( Get-Module AzureAD ) ) {
    # Check if the Azure AD PowerShell module is installed.
    if ( Get-Module -ListAvailable -Name AzureAD ) {
        # The Azure AD PowerShell module is installed but not loaded. This module
        # must be loaded for other operations performed by this script.
        Write-Host -ForegroundColor Green "Loading the Azure AD PowerShell module..."
        Import-Module AzureAD
    } else {
        Install-Module AzureAD
        Import-Module AzureAD
    }
}

try {
    Write-Host -ForegroundColor Green "When prompted please enter the appropriate credentials... Warning: Window might have pop-under in VSCode"

    if([string]::IsNullOrEmpty($TenantId)) {
        Connect-AzureAD | Out-Null

        $TenantId = $(Get-AzureADTenantDetail).ObjectId
    } else {
        Connect-AzureAD -TenantId $TenantId | Out-Null
    }
} catch [Microsoft.Azure.Common.Authentication.AadAuthenticationCanceledException] {
    # The authentication attempt was canceled by the end-user. Execution of the script should be halted.
    Write-Host -ForegroundColor Yellow "The authentication attempt was canceled. Execution of the script will be halted..."
    Exit
} catch {
    # An unexpected error has occurred. The end-user should be notified so that the appropriate action can be taken.
    Write-Error "An unexpected error has occurred. Please review the following error message and try again." `
        "$($Error[0].Exception)"
}

$adAppAccess = [Microsoft.Open.AzureAD.Model.RequiredResourceAccess]@{
    ResourceAppId = "00000002-0000-0000-c000-000000000000";
    ResourceAccess =
    [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
        Id = "5778995a-e1bf-45b8-affa-663a9f3f4d04";
        Type = "Role"},
    [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
        Id = "a42657d6-7f20-40e3-b6f0-cee03008a62a";
        Type = "Scope"},
    [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
        Id = "311a71cc-e848-46a1-bdf8-97ff7156d8e6";
        Type = "Scope"}
}

$graphAppAccess = [Microsoft.Open.AzureAD.Model.RequiredResourceAccess]@{
    ResourceAppId = "00000003-0000-0000-c000-000000000000";
    ResourceAccess =
        [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
            Id = "bf394140-e372-4bf9-a898-299cfc7564e5";
            Type = "Role"},
        [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
            Id = "7ab1d382-f21e-4acd-a863-ba3e13f7da61";
            Type = "Role"}
}

$partnerCenterAppAccess = [Microsoft.Open.AzureAD.Model.RequiredResourceAccess]@{
    ResourceAppId = "fa3d9a0c-3fb0-42cc-9193-47c7ecd2edbd";
    ResourceAccess =
        [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
            Id = "1cebfa2a-fb4d-419e-b5f9-839b4383e05a";
            Type = "Scope"}
}

$SessionInfo = Get-AzureADCurrentSessionInfo

Write-Host -ForegroundColor Green "Creating the Azure AD application and related resources..."

$app = New-AzureADApplication -AvailableToOtherTenants $true -DisplayName $DisplayName -IdentifierUris "https://$($SessionInfo.TenantDomain)/$((New-Guid).ToString())" -RequiredResourceAccess $adAppAccess, $graphAppAccess, $partnerCenterAppAccess -ReplyUrls @("urn:ietf:wg:oauth:2.0:oob","https://localhost","http://localhost","http://localhost:8400")
$password = New-AzureADApplicationPasswordCredential -ObjectId $app.ObjectId
$spn = New-AzureADServicePrincipal -AppId $app.AppId -DisplayName $DisplayName


$adminAgentsGroup = Get-AzureADGroup -Filter "DisplayName eq 'AdminAgents'"
Add-AzureADGroupMember -ObjectId $adminAgentsGroup.ObjectId -RefObjectId $spn.ObjectId

write-host "Installing PartnerCenter Module." -ForegroundColor Green
install-module PartnerCenter -Force
write-host "Sleeping for 30 seconds to allow app creation on O365" -foregroundcolor green
start-sleep 30
write-host "Please approve General consent form." -ForegroundColor Green
$PasswordToSecureString = $password.value | ConvertTo-SecureString -asPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($($app.AppId),$PasswordToSecureString)
$token = New-PartnerAccessToken -ApplicationId "$($app.AppId)" -Scopes 'https://api.partnercenter.microsoft.com/user_impersonation' -ServicePrincipal -Credential $credential -Tenant $($spn.AppOwnerTenantID) -UseAuthorizationCode
write-host "Please approve Exchange consent form." -ForegroundColor Green
$Exchangetoken = New-PartnerAccessToken -ApplicationId 'a0c73c16-a7e3-4564-9a95-2bdf47383716' -Scopes 'https://outlook.office365.com/.default' -Tenant $($spn.AppOwnerTenantID) -UseDeviceAuthentication
write-host "Last step required: Please browse to https://login.microsoftonline.com/$($spn.AppOwnerTenantID)/adminConsent?client_id=$($app.AppId)"
write-host "Press any key after auth. An error report about incorrect URIs is expected!"
[void][System.Console]::ReadKey($true)
Write-Host "================ Secrets ================"
Write-Host "`$ApplicationId         = $($app.AppId)"
Write-Host "`$ApplicationSecret     = $($password.Value)"
Write-Host "`$TenantID              = $($spn.AppOwnerTenantID)"
write-host "`$RefreshToken          = $($token.refreshtoken)" -ForegroundColor Blue
write-host "`$ExchangeRefreshToken  = $($ExchangeToken.Refreshtoken)" -ForegroundColor Green
Write-Host "================ Secrets ================"
Write-Host "    SAVE THESE IN A SECURE LOCATION     " 

The script above creates a new application and connects to all resources. At the end you will receive several secrets; store these in a secure location for future use, such as an Azure Key Vault or IT-Glue. With these keys you can connect to all your delegated resources without MFA. There are currently some slight issues on the Azure side with performing consent, so you'll have to click a couple of times. Sorry about that 🙂
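For example, if you go the Azure Key Vault route, storing one of these values could look something like the sketch below. This assumes you have the Az.KeyVault module installed, are already connected with Connect-AzAccount, and have an existing vault; the vault and secret names here are just placeholders.

# Store the Partner Center refresh token in an Azure Key Vault.
Import-Module Az.KeyVault
$SecretValue = ConvertTo-SecureString $token.refreshtoken -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName "MyMSPVault" -Name "PartnerRefreshToken" -SecretValue $SecretValue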

So, now that you have these keys you can use the following scripts to connect to the correct resources:

MSOL-Module
$ApplicationId         = 'xxxx-xxxx-xxxx-xxxx-xxx'
$ApplicationSecret     = 'YOURSECRET' | Convertto-SecureString -AsPlainText -Force
$TenantID              = 'xxxxxx-xxxx-xxx-xxxx--xxx'
$RefreshToken          = 'LongResourcetoken'
$ExchangeRefreshToken  = 'LongExchangeToken'
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)

$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 

Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken

AzureAD Module
$ApplicationId         = 'xxxx-xxxx-xxxx-xxxx-xxx'
$ApplicationSecret     = 'YOURSECRET' | Convertto-SecureString -AsPlainText -Force
$TenantID              = 'xxxxxx-xxxx-xxx-xxxx--xxx'
$RefreshToken          = 'LongResourcetoken'
$ExchangeRefreshToken  = 'LongExchangeToken'
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)

$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 

Connect-AzureAD -AadAccessToken $aadGraphToken.AccessToken -AccountId 'VALIDEMAILADDRESS' -MsAccessToken $graphToken.AccessToken -TenantId $tenantID

Exchange Online

For the Exchange Online module we'll need to put in a little more effort: you will need the tenant ID of the client you are connecting to. I'll show you how to perform a specific action for each client you have delegated access to.

$ApplicationId         = 'xxxx-xxxx-xxx-xxxx-xxxx'
$ApplicationSecret     = 'TheSecretTheSecrey' | Convertto-SecureString -AsPlainText -Force
$TenantID              = 'YourTenantID'
$RefreshToken          = 'RefreshToken'
$ExchangeRefreshToken  = 'ExchangeRefreshToken'
$upn                   = 'UPN-Used-To-Generate-Tokens'
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)

$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 

Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
$customers = Get-MsolPartnerContract -All
foreach($customer in $customers){
    $token = New-PartnerAccessToken -ApplicationId 'a0c73c16-a7e3-4564-9a95-2bdf47383716' -RefreshToken $ExchangeRefreshToken -Scopes 'https://outlook.office365.com/.default' -Tenant $customer.TenantId
    $tokenValue = ConvertTo-SecureString "Bearer $($token.AccessToken)" -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential($upn, $tokenValue)
    $customerId = $customer.DefaultDomainName
    $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell-liveid?DelegatedOrg=$($customerId)&BasicAuthToOAuthConversion=true" -Credential $credential -Authentication Basic -AllowRedirection
    Import-PSSession $session
    # From here you can enter your own commands
    Get-Mailbox
    # End of commands
    Remove-PSSession $session
}

And that’s it! I hope it helps MSPs, CSPs, etc. 🙂 as always, Happy PowerShelling!

Monitoring with PowerShell: Monitoring Event log size

Lately I’ve been getting some questions about how to handle event logs when you do not have a SIEM or log collector in place. I like thinking about these situations as I know a lot of MSPs struggle with log analysis and collection.

As a test I've set our event logs to never overwrite, because a lot of attackers these days simply spam the event log to hide their traces and make sure you can't do forensics using the Windows event logs. To make sure that we are still able to view past events we can use the following monitoring set. This set alerts when there is less than 10% space available in the event log.
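For reference, if you want to replicate that never-overwrite setup, something like the sketch below sets the classic logs to retain events instead of overwriting them. This assumes the built-in Limit-EventLog cmdlet is available (Windows PowerShell), and the 512 MB maximum is just an example value.

# Set the classic event logs to never overwrite events and give them a generous maximum size.
foreach ($LogName in "Application", "System", "Security") {
    Limit-EventLog -LogName $LogName -OverflowAction DoNotOverwrite -MaximumSize 512MB
}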

The script checks the Application, System, and Security log.

$Logs = get-ciminstance -ClassName Win32_NTEventlogFile | Where-Object { $_.LogfileName -eq "Application" -or $_.LogfileName -eq "System" -or $_.LogfileName -eq "Security" }
$FullLogs = @()
foreach ($Log in $Logs) {
    # Alert when the log file has grown to within 10% of its configured maximum size.
    if ($Log.FileSize -gt ($Log.MaxFileSize * 0.9)) { $FullLogs += "$($Log.LogFileName) has less than 10% available." }
}
if (!$FullLogs) {
    $LogStatus = "Healthy"
}
else {
    $Logstatus = $FullLogs
}

Now that we have reporting and alerting available, the next step is to make sure we back up the logs to a secure location when we notice them filling up. To do this we use another script that is triggered as a self-healing action by our RMM system.

$RightNow = Get-Date -Format FileDateTime
$Logs = get-ciminstance -ClassName Win32_NTEventlogFile | Where-Object { $_.LogfileName -eq "Application" -or $_.LogfileName -eq "System" -or $_.LogfileName -eq "Security" }
foreach ($log in $logs) {
    $BackupPath = Join-Path "C:\SecureSecureLocation\$RightNow" "$($log.FileName).evtx"
    New-Item -ItemType File -Path $BackupPath -Force
    Copy-Item -path $($Log.Name) -Destination $BackupPath -Force
    If ($env:ClearLogs -eq "Clear") { Clear-EventLog $log.FileName }
}

There are a couple of cool things here. First, we use the Get-Date format FileDateTime, which gives a nice, file-name-safe timestamp for the folder where we want to store our backups. Second, we use New-Item before Copy-Item to make sure the folder structure exists before we copy the item over.
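To give an idea of what that timestamp looks like (the exact value obviously depends on when you run it):

# Get-Date -Format FileDateTime returns a sortable, file-name-safe timestamp,
# for example something like 20191021T0930454321 (yyyyMMddTHHmmssffff).
Get-Date -Format FileDateTime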

And that’s it! This way you can monitor the size of your event logs, and back them up automatically. As always, Happy PowerShelling!

Monitoring with PowerShell: Monitoring Active Directory replication

I’ve often deployed domain controllers in environments that weren’t the most stable due to connectivity issues. To make sure that the domain controllers keep replicating correctly and we detect issues early we use the Active Directory cmdlets in combination with our RMM system. This makes it so we can monitor the current status of the replication and alert if it does not work for a longer period of time.

The script is suitable for Server 2012 R2 and up. You can use it in your RMM system to detect issues early. I like alerting when replication has not worked for 6 hours, but you can always change this to your own preference.

Active Directory Replication Monitoring

$AlertTime = (get-date).AddHours(-6)
$FailedArr = @()
$RepStatus = Get-ADReplicationPartnerMetadata -Target * -Partition * | Select-Object Server, Partition, Partner, ConsecutiveReplicationFailures, LastReplicationSuccess, LastReplicationResult
foreach ($Report in $RepStatus) {
    $Partner = $Report.partner -split "CN="
    if ($report.LastReplicationSuccess -lt $AlertTime) {
        $FailedArr += "$($Report.Server) could not replicate with partner $($Partner[2]) for 6 hours. Please investigate."
    }
}
if (!$FailedArr) { $FailedArr = "Healthy" } 

And that’s it for today! as always, Happy PowerShelling.

Monitoring with PowerShell: Monitoring RDS UPD size

So our clients have RDS deployments, WVD deployments, and in general VDI-like environments. To make sure their profile can be loaded on each machine without having to set everything up again, we use User Profile Disks (UPDs).

Of course these UPDs have a maximum size defined and need to be monitored. Monitoring the location where you host your UPDs is not enough: a disk can reach its own maximum size without the shared location running out of space.

You can use the following script to monitor the RDS UPD size, measured against the disk's own maximum size. This script only works for disks that are currently mounted, so the user has to be logged in for their disk to be monitored.

UPD Monitor script

$DisksInWarning = @()
$VHDs = Get-Disk | Where-Object { $_.Location -match "VHD" }
foreach ($VHD in $VHDs) {
    $Volume = $VHD | Get-Partition | Get-Volume
    if ($Volume.SizeRemaining -lt $Volume.Size * 0.10) { $DisksInWarning += "$($Volume.FriendlyName) Less than 10% remaining" }
}

This script will alert when less than 10% is remaining. The downside of this version is that it only reports the friendly name of the disk that is in warning, which in the case of UPDs is often a long SID or a generic name. To make the alert actually useful, we also retrieve the SID of the user and translate it to the username.

$DisksInWarning = @()
$VHDs = Get-Disk | Where-Object { $_.Location -match "VHD" }
foreach ($VHD in $VHDs) {
    # UPD files are named UVHD-<SID>.vhdx, so strip the extension and the UVHD- prefix to get the SID.
    $FilePath = [io.path]::GetFileNameWithoutExtension("$($VHD.Location)") -replace "^UVHD-", ""
    $SIDObject = New-Object System.Security.Principal.SecurityIdentifier ($FilePath)
    $Username = $SIDObject.Translate([System.Security.Principal.NTAccount])
    $Volume = $VHD | Get-Partition | Get-Volume
    if ($Volume.SizeRemaining -lt $Volume.Size * 0.10) { $DisksInWarning += "$($Username.Value) UPD Less than 10% remaining. Path: $($VHD.Location)" }
}
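To feed this into your RMM in the same healthy/alert style as the other monitoring sets in this series, you could wrap it up like this (a minimal sketch; $UPDStatus is just an example variable name):

if (!$DisksInWarning) {
    $UPDStatus = "Healthy"
} else {
    $UPDStatus = $DisksInWarning
}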

And that’s it! as always, Happy PowerShelling!

Functional PowerShell for MSPs webinar

Hi all,

I hope you’ve enjoyed the webinar. The recording can be found here. I have to admit I was a little nervous due to over 600 attendees! The scripts used during the presentation can be found attached here.

As I said; I am available for code reviews, personal PowerShell classes, or any other automation assistance you need, just let me know! I love to help.

In any case; I hope I’ll be able to do another webinar around the new year. Thank you all for attending and asking all your questions. Questions that I did not answer will get an e-mail from me directly.

Monitoring with PowerShell: Monitoring RRAS status.

So for my clients I've always relied completely on the Microsoft stack. I do not like most VPN appliances, but I still want to offer a stable SSL VPN to all clients. Enter SSTP; I've blogged about SSTP before when looking at DirectAccess and Always On VPN.

As with all products, appliances, or servers, I always want to know the current state of their availability. For SSTP VPN on anything higher than Windows Server 2012 we have a lot of PowerShell options available, but we only need one: Microsoft's Get-RemoteAccessHealth. When we execute Get-RemoteAccessHealth we get a nice overview of the current state of the RRAS services.

Using the script below you can directly monitor the health of the current RRAS server; it's a simple and short one.

$RRASHealth = Get-RemoteAccessHealth -Refresh | where-object { $_.HealthState -ne "OK"}
if(!$RRASHealth) { $RRASHealth = "Healthy"}

Hope it helps, and as always, Happy PowerShelling!

Monitoring with PowerShell: Monitoring Office365 admin password changes

So when I was at DattoCon I was approached by an MSP that was using his RMM system to alert on changes to the local admin password, as he wanted to be updated every time a local admin got a new password. He did this by using an older script of mine, shown below.

Monitoring Local Admin Password changes

$LastDay = (Get-Date).AddHours(-24)
$AdminGroup = Get-LocalGroupMember -SID "S-1-5-32-544"
# Collect every member of the local Administrators group whose password changed in the last 24 hours.
$ChangedAdmins = foreach ($Admin in $AdminGroup) {
    Get-LocalUser -SID $Admin.SID | Where-Object { $_.PasswordLastSet -gt $LastDay }
}

But he came to me because he recently needed the same kind of alert for Office 365 environments: when a global admin password changes, it needs to be updated in his documentation system to complete his process. I figured I would give him a hand and made the following scripts.

Monitoring Office365 Global Admin Password changes – All tenants

$LastDay = (Get-Date).addhours(-24)
$credential = Get-Credential
Connect-MsolService -Credential $credential
$customers = Get-MsolPartnerContract -All
$ChangedUsers = @()
foreach ($customer in $customers) {
    write-host "Getting users for $($Customer.Name)" -ForegroundColor Green
    $adminemails = Get-MsolRoleMember -TenantId $customer.TenantId -RoleObjectId (Get-MsolRole -RoleName "Company Administrator").ObjectId
    $Users = $adminemails | Get-MsolUser -TenantId $customer.TenantId
    foreach ($User in $Users) {
        if ($User.LastPasswordChangeTimestamp -gt $LastDay) { $ChangedUsers += "$($User.UserPrincipalName) has changed his password in the last 24 hours. Please update documentation to reflect this.`n" }
    }
}


Monitoring Office365 Global Admin Password Changes – Single tenant

$TenantName = "YourTenantName.onmicrosoft.com"
$LastDay = (Get-Date).addhours(-24)
$credential = Get-Credential
Connect-MsolService -Credential $credential
$Customer = Get-MsolPartnerContract -All | Where-Object { $_.DefaultDomainName -eq $TenantName }
$ChangedUsers = @()
write-host "Getting users for $($Customer.Name)" -ForegroundColor Green
$adminemails = Get-MsolRoleMember -TenantId $Customer.TenantId -RoleObjectId (Get-MsolRole -RoleName "Company Administrator").ObjectId
$Users = $adminemails | Get-MsolUser -TenantId $Customer.TenantId
foreach ($User in $Users) {
    if ($User.LastPasswordChangeTimestamp -gt $LastDay) { $ChangedUsers += "$($User.UserPrincipalName) has changed his password in the last 24 hours. Please update documentation to reflect this.`n" }
}


These scripts check if a password has been changed in the last day and alert if so, notifying you that a global admin password has been updated and needs to be changed in the documentation. You can also use this as a warning system if nobody should be changing these passwords at all.

Anyway, hope it helps, and as always, Happy PowerShelling!

Documenting with PowerShell: Bulk edit configurations in IT-Glue

I know last week I said I'd take a break from the monitoring blogs, but an MSP recently asked if I knew a way to mass-edit specific configuration items in IT-Glue. In his case he was going to change the network configuration of devices and wanted a quicker way than clicking through 20 devices; doing that via the interface gets annoying fast.

To make these edits easier for him, I’ve decided to quickly script the following for him:

    #####################################################################
    $APIKEy = "APIKEYHERE"
    $APIEndpoint = "https://api.eu.itglue.com"
    $orgID = "ORGIDHERE"
    $NewGateway = "192.1.1.254"
    #####################################################################
    If (Get-Module -ListAvailable -Name "ITGlueAPI") { Import-Module ITGlueAPI } Else { Install-Module ITGlueAPI -Force; Import-Module ITGlueAPI }
    # Setting IT-Glue logon information
    Add-ITGlueBaseURI -base_uri $APIEndpoint
    Add-ITGlueAPIKey $APIKEy
    $ConfigList = (Get-ITGlueConfigurations -page_size 1000 -organization_id $OrgID).data.attributes | Out-GridView -PassThru
    foreach ($Config in $ConfigList) {
        $ConfigID = ($Config.'resource-url' -split "/")[-1]
        $UpdatedConfig = @{
            type       = 'Configurations'
            attributes = @{
                "default-gateway" = $NewGateway
            }
        }
        Set-ITGlueConfigurations -id $ConfigID -data $UpdatedConfig
    }

This grabs all configurations for the organisation ID you've filled in and then shows a grid of the current configurations. In that grid you select the configurations you want to change, and the new gateway is applied to each of them. It's very easy to modify other fields in bulk too; for that, check the API documentation here and see the sketch below.
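As an illustration only, bulk-updating a different field just means changing the key inside the attributes hashtable of the loop above. The "notes" field name below is an assumption on my part; always verify the exact attribute names against the IT-Glue API documentation.

    # Hypothetical example: update the notes field of each selected configuration instead of the gateway.
    # "notes" is an assumed attribute name - verify it against the IT-Glue API documentation.
    $UpdatedConfig = @{
        type       = 'Configurations'
        attributes = @{
            "notes" = "Default gateway changed to $NewGateway"
        }
    }
    Set-ITGlueConfigurations -id $ConfigID -data $UpdatedConfig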

Anyway, I hope it helps some people struggling with bulk edits, and as always, happy PowerShelling!

Monitoring with PowerShell: External port scanning

So I like knowing exactly which ports are open on my clients' networks, and having the ability to alert on specific ports being opened. The problem with most port-scan utilities, and with the PowerShell Test-NetConnection cmdlet, is that they always scan from inside the network. And even if you do enter the external IP, whitelisting might allow you to connect anyway and give you false positives.

To resolve this I've created a PHP page to be used in conjunction with a PowerShell script. The reason I've created the page myself is that I do not like relying on external web-based port scanning APIs, and I don't want to be stuck in a subscription model for something as simple as a port scan. With this method you are also completely in control of the source.

So let's get started! First off, you'll have to upload the following file as "scan.php" to any PHP host. When you browse to the page it should show some JSON information about the scan it performs against your IP. Scan.php is based on this GitHub script.

[
<?php
$host = $_SERVER['REMOTE_ADDR'];
$ports = array(21, 25,80,3389,1234,3333,3389,33890,3380);
foreach ($ports as $port)
{
    $connection = @fsockopen($host, $port, $errno, $errstr, 2);
    if (is_resource($connection))
    {
        echo '{' . '"Port":' . $port . ',' . '"status" : "open"' . "},";
        fclose($connection);
    }
    else
    {
        echo '{' . '"Port":' . $port . ', "status" : "closed"},';
    }
}
?>
{ "result": "done" }
]

I’ve converted the original github page to return only JSON. The good thing is that we can use the Invoke-restmethod cmdlet straight away, without having to convert anything, The PowerShell script can be edited to alert only on specific ports that are opened, or on all open ports.

$Results = Invoke-RestMethod -Uri "http://YOURWEBHOST.COM/ip/scan.php"
$OpenPorts = $Results | Where-Object { $_.status -eq "open" }
$ClosedPorts = $Results | Where-Object { $_.status -eq "closed" }

if (!$OpenPorts) {
    $PortScanResult = "Healthy"
} else {
    $PortScanResult = $OpenPorts
}
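If you only want to alert on a subset of ports rather than on anything that is open, a minimal sketch could look like this (the port list is just an example; adjust it to the ports you actually care about):

# Only alert when one of these specific ports is reachable from the outside.
$PortsToAlertOn = 21, 3389, 33890
$AlertPorts = $OpenPorts | Where-Object { [int]$_.Port -in $PortsToAlertOn }

if (!$AlertPorts) {
    $PortScanResult = "Healthy"
} else {
    $PortScanResult = $AlertPorts
}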

And that's it! As always, Happy PowerShelling!