Category Archives: Series: PowerShell Monitoring

Monitoring with PowerShell: Monitoring WVD availability

We’re in the middle of a WVD deployment at my MSP. This client has locations all over the world and needed an easy way to manage virtual desktops across many regions. The deployment got me thinking about how monitoring a WVD environment should be done.

We found that the WVD agent on the VMs sometimes did not finish its upgrade correctly, or that session hosts became unavailable at inopportune times, such as during a high logon load. Of course we could have set up some Azure Alerts for this, but that would mean forgoing our RMM system as the single source of alerts. So, as always: in comes PowerShell.

As you can’t retrieve whether a session host is available from the VMs themselves, I’ve made a script that checks their status via the Azure management plane. It picks up whether the agent updates have been installed correctly, whether heartbeats are being missed, and of course whether the session host is actually available. As we’ve also had issues in the past with people forgetting to take a session host out of drain mode, I’ve added a check for that too.

The Script

The script checks all tenants that are in Lighthouse. If you haven’t configured Lighthouse yet, check out my manual here. You can change the alerts to whatever you’d like.

######### Secrets #########
$ApplicationId = 'AppID'
$ApplicationSecret = 'ApplicationSecret' | ConvertTo-SecureString -Force -AsPlainText
$TenantID = 'YourTenantID'
$RefreshToken = 'yourrefreshtoken'
$UPN = "UPN-Used-To-Generate-Tokens"
######### Secrets #########
If (Get-Module -ListAvailable -Name Az.DesktopVirtualization) { 
    Import-module Az.DesktopVirtualization
}
Else { 
    Install-PackageProvider NuGet -Force
    Set-PSRepository PSGallery -InstallationPolicy Trusted
    Install-Module -Name Az.DesktopVirtualization -force
    Import-module Az.DesktopVirtualization
}
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
$azureToken = New-PartnerAccessToken -ApplicationId $ApplicationID -Credential $credential -RefreshToken $refreshToken -Scopes 'https://management.azure.com/user_impersonation' -ServicePrincipal -Tenant $TenantId
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationID -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $TenantId
 
Connect-Azaccount -AccessToken $azureToken.AccessToken -GraphAccessToken $graphToken.AccessToken -AccountId $upn -TenantId $tenantID
$Subscriptions = Get-AzSubscription  | Where-Object { $_.State -eq 'Enabled' } | Sort-Object -Unique -Property Id
$SessionHostState = foreach ($Sub in $Subscriptions) {
    $null = $Sub | Set-AzContext

    $WVDHostPools = Get-AzResource -ResourceType 'Microsoft.DesktopVirtualization/hostpools'
    if (!$WVDHostpools) {
        write-host "No hostpool found for tenant $($Sub.name)"
        continue
    }
    write-host "Found hostpools. Checking current availibility status"
    foreach ($HostPool in $WVDHostPools) {
        $AllHPState = get-AzWvdsessionhost -hostpoolname $Hostpool.name -resourcegroupname $hostpool.resourcegroupname 
        Foreach ($State in $AllHPState) {
            [PSCustomObject]@{
                HostName         = $State.Name
                Available        = $State.Status
                Heartbeat        = $State.LastHeartBeat
                UpdateState      = $State.UpdateState
                LastUpdate       = $State.LastUpdateTime
                AllowNewSessions = $State.AllowNewSession
                Subscription     = $Sub.Name
            
            }
        }
    }

}

#Check for WVD Session hosts that have issues updating the agent:

if (($SessionHostState | Where-Object { $_.UpdateState -ne 'Succeeded' })) {
    write-host "Unhealthy - Some session hosts have not updated the Agent correctly."
    $SessionHostState | Where-Object { $_.UpdateState -ne 'Succeeded' }
}
else {
    write-host "Healthy - Session hosts all updated."   
}


#Check for unavailable hosts
if (($SessionHostState | Where-Object { $_.Available -ne 'Available' })) {
    write-host "Unhealthy - Some session hosts are unavailable."
    $SessionHostState | Where-Object { $_.Available -ne 'Available' }
}
else {
    write-host "Healthy - Session hosts all updated."   
}

#Check for hosts in drain mode
if (($SessionHostState | Where-Object { $_.AllowNewSessions -eq $false })) {
    write-host "Unhealthy - Some session hosts are in drain mode"
    $SessionHostState | Where-Object { $_.AllowNewSessions -eq $false }
}
else {
    write-host "Healthy - No sessionhosts in drain mode."   
}

And that’s it. As always, Happy PowerShelling!

Automating with PowerShell: Automatically following all Sharepoint Sites or Teams for all users

A while back we had a client that uses a lot of SharePoint sites. The client only uses SharePoint Online and found it hard to find all the sites in one place. We pointed them to https://YOURDOMAIN.sharepoint.com/_layouts/15/sharepoint.aspx, which gives a nice overview of sites and teams.

They came back to us saying the overview was a bit of a hassle to use, as it only shows recently used sites or sites that have been followed manually. Of course we wanted to take that hassle away, so we scripted it. The following script grabs all sites the user is a member of and adds each site to that user’s favorites. You can schedule it so each new user gets added automatically.

The Script

The script uses the Secure Application Model, or just a generic Azure application with permissions to read all sites, for your own tenant and all your CSP tenants. The script finds each team the user has joined and adds the team’s site to the favorites for that user.

$TenantID = 'CUSTOMERTENANT.ONMICROSOFT.COM'
$ApplicationId = "YOURPPLICTIONID"
$ApplicationSecret = "YOURAPPLICATIONSECRET"
 
$body = @{
    'resource'      = 'https://graph.microsoft.com'
    'client_id'     = $ApplicationId
    'client_secret' = $ApplicationSecret
    'grant_type'    = "client_credentials"
    'scope'         = "openid"
}

$ClientToken = Invoke-RestMethod -Method post -Uri "https://login.microsoftonline.com/$($tenantid)/oauth2/token" -Body $body -ErrorAction Stop
$headers = @{ "Authorization" = "Bearer $($ClientToken.access_token)" }
$Users = (Invoke-RestMethod -Uri "https://graph.microsoft.com/beta/users" -Headers $Headers -Method Get -ContentType "application/json").value.id
 
foreach ($userid in $users) {
    $AllTeamsURI = "https://graph.microsoft.com/beta/users/$($UserID)/JoinedTeams"
    $Teams = (Invoke-RestMethod -Uri $AllTeamsURI -Headers $Headers -Method Get -ContentType "application/json").value
    foreach ($Team in $teams) {
        $SiteRootReq = Invoke-RestMethod -Uri "https://graph.microsoft.com/beta/groups/$($Team.id)/sites/root" -Headers $Headers -Method Get -ContentType "application/json"
        $AddSitesbody = [PSCustomObject]@{
            value = [array]@{
                "id" = $SiteRootReq.id
            }
        } | convertto-json
        $FavoritedSites = Invoke-RestMethod -Uri "https://graph.microsoft.com/beta/users/$($userid)/followedSites/add" -Headers $Headers -Method POST -body $AddSitesBody -ContentType "application/json"
        write-host "Added $($SiteRootReq.webURL) to favorites for $userid"
    }


}

And that’s it! As always, Happy PowerShelling!

Monitoring with PowerShell: Monitoring potential phishing campaigns

Microsoft offers a lot of cool tools for Microsoft 365 users, even if you aren’t using the complete suite. One of these is potential phishing detection: by default, Microsoft analyses each received e-mail to check whether it is a potential phishing attempt. You can check these via the interface by going to protection.office.com, then Threat Management, and clicking on the Dashboard.

That’s nice for a one-off check, but we like getting alerted whenever the number of phishing attempts rises above a specific threshold. For this, we can of course use PowerShell.

By using the PowerShell cmdlet ‘Get-ATPTotalTrafficReport’ (don’t worry, you don’t need ATP) we can get the same reports as the interface in text form. This allows us to alert whenever a phishing campaign starts and exceeds a threshold we set.

For these scripts you’ll need the Secure Application Model first, to be able to log in securely and according to Microsoft’s partner standards. There are two versions of the script: one for a single tenant, and another for all tenants under your administration.

Single tenant version

All you have to do for the single tenant version is enter your credentials and set what you consider the maximum number of phishing attempts over a 2-week period.

######### Secrets #########
$ApplicationId = 'YourAPPID'
$ApplicationSecret = 'Applicationsecret' | ConvertTo-SecureString -AsPlainText -Force
$TenantID = 'YourTenantID'
$RefreshToken = 'RefreshToken'
$ExchangeRefreshToken = 'ExchangeRefreshToken'
$UPN = "UPN-Used-To-Generate-Tokens"
$TenantToCheck = 'testtenant.onmicrosoft.com'
$MaximumPhishingAttempts = 100
######### Secrets #########

$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)

$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 
$token = New-PartnerAccessToken -ApplicationId 'a0c73c16-a7e3-4564-9a95-2bdf47383716' -RefreshToken $ExchangeRefreshToken -Scopes 'https://outlook.office365.com/.default' -Tenant $TenantToCheck -erroraction SilentlyContinue

$tokenValue = ConvertTo-SecureString "Bearer $($token.AccessToken)" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($upn, $tokenValue)
write-host "Proccessing $TenantToCheck"
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell-liveid?DelegatedOrg=$($TenantToCheck)&BasicAuthToOAuthConversion=true" -Credential $credential -Authentication Basic -AllowRedirection -ErrorAction SilentlyContinue
$null = Import-PSSession $session -AllowClobber -CommandName 'Get-ATPTotalTrafficReport' -ErrorAction SilentlyContinue
$Reports = Get-ATPTotalTrafficReport -ErrorAction SilentlyContinue
$TenantReports = [PSCustomObject]@{
    TenantDomain       = $TenantToCheck
    TotalSafeLinkCount = ($Reports | where-object { $_.EventType -eq 'TotalSafeLinkCount' }).Messagecount
    TotalSpamCount     = ($Reports | where-object { $_.EventType -eq 'TotalSpamCount' }).Messagecount
    TotalBulkCount     = ($Reports | where-object { $_.EventType -eq 'TotalBulkCount' }).Messagecount
    TotalPhishCount    = ($Reports | where-object { $_.EventType -eq 'TotalPhishCount' }).Messagecount
    TotalMalwareCount  = ($Reports | where-object { $_.EventType -eq 'TotalMalwareCount' }).Messagecount
    DateOfReports      = "$($Reports.StartDate | Select-Object -Last 1) - $($Reports.EndDate | Select-Object -Last 1)"
}
#end of commands
Remove-PSSession $session -ErrorAction SilentlyContinue

$TenantReports | Where-Object {$_.TotalPhishCount -gt $MaximumPhishingAttempts}

Multiple tenant version

This version does the same as above, but for all tenants under your administration as a Microsoft Partner.

######### Secrets #########
$ApplicationId = 'YourAPPID'
$ApplicationSecret = 'Applicationsecret' | ConvertTo-SecureString -AsPlainText -Force
$TenantID = 'YourTenantID'
$RefreshToken = 'RefreshToken'
$ExchangeRefreshToken = 'ExchangeRefreshToken'
$UPN = "UPN-Used-To-Generate-Tokens"
$MaximumPhishingAttempts = 100
######### Secrets #########


$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 

Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
$customers = Get-MsolPartnerContract -All
$TenantReports = foreach ($customer in $customers) {
    $token = New-PartnerAccessToken -ApplicationId 'a0c73c16-a7e3-4564-9a95-2bdf47383716' -RefreshToken $ExchangeRefreshToken -Scopes 'https://outlook.office365.com/.default' -Tenant $customer.TenantId -erroraction SilentlyContinue
    $tokenValue = ConvertTo-SecureString "Bearer $($token.AccessToken)" -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential($upn, $tokenValue)
    $customerId = $customer.DefaultDomainName
    write-host "Proccessing $customerid"
    $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell-liveid?DelegatedOrg=$($customerId)&BasicAuthToOAuthConversion=true" -Credential $credential -Authentication Basic -AllowRedirection -ErrorAction SilentlyContinue
    $null = Import-PSSession $session -AllowClobber -CommandName 'Get-ATPTotalTrafficReport' -ErrorAction SilentlyContinue
    #From here you can enter your own commands
    $Reports = Get-ATPTotalTrafficReport -ErrorAction SilentlyContinue
    [PSCustomObject]@{
        TenantID           = $customer.TenantId
        TenantName         = $customer.name
        TenantDomain       = $customer.DefaultDomainName
        TotalSafeLinkCount = ($Reports | where-object { $_.EventType -eq 'TotalSafeLinkCount' }).Messagecount
        TotalSpamCount     = ($Reports | where-object { $_.EventType -eq 'TotalSpamCount' }).Messagecount
        TotalBulkCount     = ($Reports | where-object { $_.EventType -eq 'TotalBulkCount' }).Messagecount
        TotalPhishCount    = ($Reports | where-object { $_.EventType -eq 'TotalPhishCount' }).Messagecount
        TotalMalwareCount  = ($Reports | where-object { $_.EventType -eq 'TotalMalwareCount' }).Messagecount
        DateOfReports      = "$($Reports.StartDate | Select-Object -Last 1) - $($Reports.EndDate | Select-Object -Last 1)"
    }
    #end of commands
    Remove-PSSession $session -ErrorAction SilentlyContinue
}

$TenantReports | Where-Object {$_.TotalPhishCount -gt $MaximumPhishingAttempts}

And that’s it! As always, Happy PowerShelling.

Automating with PowerShell: Deploying StorageSense

This is a follow-up to last week’s blog about monitoring Storage Sense settings; to read about that, check out the previous blog here. Monitoring Storage Sense can ease your maintenance workload, but you do need a way to deploy Storage Sense too.

You can use the script below to deploy Storage Sense; it even gives you the option to automatically add the OneDrive sites to the Storage Sense settings. It’s important to note that Storage Sense does not remove cached files that are marked as ‘Always remain offline’ – it only clears the cache for files that have been untouched for as long as you’ve defined.

Alright, let’s get to deploying. As before, the script uses my RunAsUser module, because the Storage Sense settings are stored per user.

The script

Using the script is straightforward; change the deployment settings to your preference, and run the script below.

#Settings
[PSCustomObject]@{
    PrefSched               = '0' #Options are: 0(Low Diskspace),1,7,30
    ClearTemporaryFiles     = $true
    ClearRecycler           = $true
    ClearDownloads          = $true
    AllowClearOneDriveCache = $true
    AddAllOneDrivelocations = $true
    ClearRecyclerDays       = '60' #Options are: 0(never),1,14,30,60
    ClearDownloadsDays      = '60' #Options are: 0(never),1,14,30,60
    ClearOneDriveCacheDays  = '60' #Options are: 0(never),1,14,30,60

    
} | ConvertTo-Json | Out-File "C:\Windows\Temp\WantedStorageSenseSettings.txt"
#

If (Get-Module -ListAvailable -Name "RunAsUser") { 
    Import-module RunAsUser
}
Else { 
    Install-PackageProvider NuGet -Force
    Set-PSRepository PSGallery -InstallationPolicy Trusted
    Install-Module RunAsUser -force -Repository PSGallery
}

$ScriptBlock = {
    $WantedSettings = Get-Content "C:\Windows\Temp\WantedStorageSenseSettings.txt" | ConvertFrom-Json
    $StorageSenseKeys = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\StorageSense\Parameters\StoragePolicy\'
    Set-ItemProperty -Path $StorageSenseKeys -name '01' -value '1' -Type DWord  -Force
    Set-ItemProperty -Path $StorageSenseKeys -name '04' -value $WantedSettings.ClearTemporaryFiles -Type DWord -Force
    Set-ItemProperty -Path $StorageSenseKeys -name '08' -value $WantedSettings.ClearRecycler -Type DWord -Force
    Set-ItemProperty -Path $StorageSenseKeys -name '32' -value $WantedSettings.ClearDownloads -Type DWord -Force
    Set-ItemProperty -Path $StorageSenseKeys -name '256' -value $WantedSettings.ClearRecyclerDays -Type DWord -Force
    Set-ItemProperty -Path $StorageSenseKeys -name '512' -value $WantedSettings.ClearDownloadsDays -Type DWord -Force
    Set-ItemProperty -Path $StorageSenseKeys -name '2048' -value $WantedSettings.PrefSched -Type DWord -Force
    Set-ItemProperty -Path $StorageSenseKeys -name 'CloudfilePolicyConsent' -value $WantedSettings.AllowClearOneDriveCache -Type DWord -Force
    if ($WantedSettings.AddAllOneDrivelocations) {
        $CurrentUserSID = ([System.Security.Principal.WindowsIdentity]::GetCurrent()).User.Value
        $CurrentSites = Get-ItemProperty 'HKCU:\SOFTWARE\Microsoft\OneDrive\Accounts\Business1\ScopeIdToMountPointPathCache' -ErrorAction SilentlyContinue | Select-Object -Property * -ExcludeProperty PSPath, PsParentPath, PSChildname, PSDrive, PsProvider
        foreach ($OneDriveSite in $CurrentSites.psobject.properties.name) {
            New-Item "$($StorageSenseKeys)/OneDrive!$($CurrentUserSID)!Business1|$($OneDriveSite)" -Force
            New-ItemProperty "$($StorageSenseKeys)/OneDrive!$($CurrentUserSID)!Business1|$($OneDriveSite)" -Name '02' -Value '1' -type DWORD -Force
            New-ItemProperty "$($StorageSenseKeys)/OneDrive!$($CurrentUserSID)!Business1|$($OneDriveSite)" -Name '128' -Value $WantedSettings.ClearOneDriveCacheDays -type DWORD -Force
        }
    }

}

$null = Invoke-AsCurrentUser -ScriptBlock $ScriptBlock -UseWindowsPowerShell -NonElevatedSession

And that’s it! As always, Happy PowerShelling!

Monitoring with PowerShell: Monitoring Storage Sense settings

So let’s talk about Storage Sense. Storage Sense is a newish feature in Windows 10 that is meant to replace the standard disk cleanup utilities. It has a lot more power than plain disk cleanup, as it can detect how long files have gone unused and react based on that age.

Storage Sense is pretty easy to use and can save a lot of disk space on modern clients, especially when they use OneDrive too, as Storage Sense supports OneDrive Files On-Demand. By default it does not clear anything, but it can be set up to clear the local cache when files have not been used for days, weeks, or even months. This helps with those pesky OneDrive database size limits.

This first blog about Storage Sense covers monitoring its settings. We’re using the RunAsUser module for this, because the registry keys are located in the HKEY_CURRENT_USER hive.

Monitoring script

#Settings
$PrefSched = 'Low Diskspace'
$ClearTemporaryFiles = $true
$ClearRecycler = $true
$ClearRecyclerDays = '60'
$ClearDownloads = $true
$ClearDownloadsDays = '60'
$AllowClearOneDriveCache = $true
#

If (Get-Module -ListAvailable -Name "RunAsUser") { 
    Import-module RunAsUser
}
Else { 
    Install-PackageProvider NuGet -Force
    Set-PSRepository PSGallery -InstallationPolicy Trusted
    Install-Module RunAsUser -force -Repository PSGallery
}


$ExpectedSettings = [PSCustomObject]@{
    'Storage Sense Enabled'         = $true
    'Clear Temporary Files'         = $ClearTemporaryFiles
    'Clear Recycler'                = $ClearRecycler
    'Clear Downloads'               = $ClearDownloads
    'Allow Clearing Onedrive Cache' = $AllowClearOneDriveCache
    'Storage Sense schedule'        = $PrefSched
    'Clear Downloads age (Days)'    = $ClearDownloadsDays
    'Clear Recycler age (Days)'     = $ClearRecyclerDays
} 

$ScriptBlock = {
    $StorageSenseKeys = Get-ItemProperty -Path 'HKCU:\Software\Microsoft\Windows\CurrentVersion\StorageSense\Parameters\StoragePolicy\'

    $StorageSenseSched = switch ($StorageSenseKeys.'2048') {
        1 { 'Every day' }
        7 { 'Every week' }
        30 { 'Every Month' }
        0 { 'Low Diskspace' }
        Default { 'Unknown - Could not retrieve.' }
    }

    [PSCustomObject]@{
        'Storage Sense Enabled'         = [boolean]$StorageSenseKeys.'01'
        'Clear Temporary Files'         = [boolean]$StorageSenseKeys.'04'
        'Clear Recycler'                = [boolean]$StorageSenseKeys.'08'
        'Clear Downloads'               = [boolean]$StorageSenseKeys.'32'
        'Allow Clearing Onedrive Cache' = [boolean]$StorageSenseKeys.CloudfilePolicyConsent
        'Storage Sense schedule'        = $StorageSenseSched
        'Clear Downloads age (Days)'    = $StorageSenseKeys.'512'
        'Clear Recycler age (Days)'     = $StorageSenseKeys.'256'
    } | ConvertTo-Json | Out-File "C:\windows\Temp\CurrentStorageSenseSettings.txt" -Force

 
}

$null = Invoke-AsCurrentUser -ScriptBlock $ScriptBlock -UseWindowsPowerShell -NonElevatedSession
$CurrentSettings = Get-Content  "C:\windows\Temp\CurrentStorageSenseSettings.txt" | ConvertFrom-Json

$ComparedObjects = Compare-Object $CurrentSettings $ExpectedSettings -Property $CurrentSettings.PsObject.Properties.name

if ($ComparedObjects) {
    Write-Host "Unhealthy - The Storage Sense settings are not the same as the expected settings."
}
else {
    write-host "Healthy - Storage Sense Settings are set correctly."
}

And that’s it. Next week I’ll be demonstrating how to set up Storage Sense automatically, including adding OneDrive sites. As always, Happy PowerShelling!

Ending the year with PowerShell: Retrospective

I always like December because everyone gets in the mood for these retrospectives, and 2020 was crazy for everyone! For me personally it held a heavy loss, but also a lot of positivity. I enjoy looking at the raw numbers to see how far I’ve gotten.

So, show me the numbers!

  • 1 Microsoft MVP Award (and still amazed at it!)
  • 1 MSP(that I know of) that says I’ve saved their business
  • 2 blogs a week, a total of 100 so far this year. 🙂
  • 4 MSP community and business awards.
  • 5 RMM vendors that are implementing my blogs into their products.
  • 19 webinars I joined/led and had so much fun with, with a special mention for the last ones I did this year with Huntress.
  • 15000+ people that watched these webinars.
  • 1254 e-mails from other MSPs, asking for advice, support, or just thank you messages.
  • 400+ MSPs that installed one or more of my Azure Functions
  • 7000+ upvotes on reddit posts relating to MSP stuff 🙂
  • Almost 300000 downloads of my PowerShell modules on the PowerShell Gallery this year!
  • 1.7+ million unique hits on CyberDrain.com!

So what’s next?

Well, I have 3 weeks of time off planned, so you’ll all have to miss me for a little while. I will still be blogging, but not on my normal schedule of 2 blogs a week. In the coming year I’m going to work on bringing even more value to both my own MSP and others by trying to break open automation for everyone. I also have plans to create a cool educational Capture the Flag competition for managed service providers and/or general system administration, next to my regular blogging and community efforts of course.

I’m also working with Datto to look into a vendor-neutral automation/RMM user group, and of course I have bigger projects like AzPAM that need some love.

I’m far from done blogging, I’m still super excited for what’s to come, and the feedback I get from the MSP community is immense. I try to respond to all of your e-mails and messages but don’t always get the chance. I’d like to take this moment to thank you all, and I hope to see you again next year!

Special end of year thanks

I did the same last year and think it should become a bit of a regular thing. For this entire year I’d like to thank my darling wife for putting up with me of course, my partners at Lime Networks, and the others below, in no particular order:

My friends at Datto: Thank you for our cooperation. I really love seeing vendors that actually love their products and that is so true on the RMM side. 🙂

My friends at SolarWinds: mostly the N-Central/RMM upper-management team; I had so many laughs and so much fun doing the MSP Masterclasses this year. Hopefully next year will be just as much fun! I’m also going to show up on Luis’s Café Con Luis soon!

MSPGeek and MSPRU: thanks to all community members. I love talking to you all 🙂

And a direct and very special thanks to the following people, whom I’m able to call my friends: Jonathan Crowe, Kyle Hanslovan, Gavin Stone, Stan Lee, Maarten, Nick, Aad, and of course everyone else I work closely with. 🙂

And that’s it. As always, Happy PowerShelling!

Monitoring with PowerShell: Typosquat domain checking

One of my team members was following Black Hat today and showed me a pretty cool tool that was demonstrated during the conference. The presenters showed a method of checking whether your O365 domain is being typosquatted. The tool can be found here. The presenters made a Python tool, and I figured I’d create an alternative in PowerShell.

I’ve checked their method and found they use two different typosquatting detection techniques: homoglyphs and bitsquatting. These two techniques are the most common in typosquats; it’s either replacing characters with similar-looking ones, or introducing minor typos in the URL.

In my version I’ve also introduced pluralization and omission, just to generate a few more domain names; I’m not saying this is a 100% exhaustive list. If you have any suggested changes, feel free to make a GitHub PR here.

The script

To run the script, simply change the domain name at the end of the script and execute it. The script contains two functions: New-TypoSquatDomain, which generates a list of typosquatted domains, and Get-O365TypoSquats, which checks whether the .onmicrosoft.com version, the .sharepoint.com version, and the domain itself resolve.

So what can you do with this information? If the .onmicrosoft.com version exists, you can add it to your spam filter to prevent spam. If the .sharepoint.com version exists, people might be phishing you using SharePoint Online URLs. And if the domain itself exists, you could add it to the spam filter too, or check what’s running there and notify your users.

function New-TypoSquatDomain {
    param (
        $DomainName
    )
    $ReplacementGlyph = [pscustomobject]@{
        0  = 'b', 'd' 
        1  = 'b', 'lb' 
        2  = 'c', 'e'
        3  = 'd', 'b'
        4  = 'd', 'cl'
        5  = 'd', 'dl' 
        6  = 'e', 'c' 
        7  = 'g', 'q' 
        8  = 'h', 'lh' 
        9  = 'i', '1'
        10 = 'i', 'l' 
        11 = 'k', 'lk'
        12 = 'k', 'ik'
        13 = 'k', 'lc' 
        14 = 'l', '1'
        15 = 'l', 'i' 
        16 = 'm', 'n'
        17 = 'm', 'nn'
        18 = 'm', 'rn'
        19 = 'm', 'rr' 
        20 = 'n', 'r'
        21 = 'n', 'm'
        22 = 'o', '0'
        23 = 'o', 'q'
        24 = 'q', 'g' 
        25 = 'u', 'v' 
        26 = 'v', 'u'
        27 = 'w', 'vv'
        28 = 'w', 'uu' 
        29 = 'z', 's' 
        30 = 'n', 'r' 
        31 = 'r', 'n'
    }
    $i = 0

    $TLD = $DomainName -split '\.' | Select-Object -last 1
    $DomainName = $DomainName -split '\.' | Select-Object -First 1
    $HomoGlyph = do {
        $NewDomain = $DomainName -replace $ReplacementGlyph.$i
        $NewDomain
        $NewDomain + 's'
        $NewDomain + 'a'
        $NewDomain + 't'
        $NewDomain + 'en'
        $i++
    } while ($i -lt 32)

    $i = 0
    $BitSquatAndOmission = do {
        $($DomainName[0..($i)] -join '') + $($DomainName[($i + 2)..$DomainName.Length] -join '')
        $($DomainName[0..$i] -join '') + $DomainName[$i + 2] + $DomainName[$i + 1] + $($DomainName[($i + 3)..$DomainName.Length] -join '')
        $i++
    } while ($i -lt $DomainName.Length)
    $Plurals = ($DomainName + 's'), ($DomainName + 'a'), ($DomainName + 'en'), ($DomainName + 't')

    $CombinedDomains = $HomoGlyph + $BitSquatAndOmission + $Plurals | ForEach-Object { "$($_).$($TLD)" }
    return ( $CombinedDomains | Sort-Object -Unique | Where-Object { $_ -ne $DomainName })
}

function Get-O365TypoSquats {
    param (
        $TypoSquatedDomain
    )
    $DomainWithoutTLD = $TypoSquatedDomain -split '\.' | Select-Object -First 1
    $DomainTest = Resolve-DnsName -Type A "$($TypoSquatedDomain)" -ErrorAction SilentlyContinue
    $Onmicrosoft = Resolve-DnsName -Type A "$($DomainWithoutTLD).onmicrosoft.com" -ErrorAction SilentlyContinue
    $Sharepoint = Resolve-DnsName -Type A "$($DomainWithoutTLD).sharepoint.com" -ErrorAction SilentlyContinue
    [PSCustomObject]@{
        'Onmicrosoft test' = [boolean]$Onmicrosoft
        'Sharepoint test'  = [boolean]$Sharepoint
        'Domain test'      = [boolean]$DomainTest
        Domain             = $TypoSquatedDomain
    }
}

New-TypoSquatDomain -DomainName 'Google.com' | ForEach-Object { Get-O365TypoSquats -TypoSquatedDomain $_ }

You can load this script into your RMM system and alert whenever results are found.
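For example, a minimal RMM-style wrapper around the two functions above could look like this; the domain name is a placeholder and alerting on any resolving lookup is my own assumption, so adjust it to taste:

$Results = New-TypoSquatDomain -DomainName 'yourdomain.com' | ForEach-Object { Get-O365TypoSquats -TypoSquatedDomain $_ }
#Alert only when one of the DNS lookups actually resolves.
$Hits = $Results | Where-Object { $_.'Domain test' -or $_.'Onmicrosoft test' -or $_.'Sharepoint test' }
if (!$Hits) {
    write-host "Healthy - No registered typosquat domains found."
}
else {
    write-host "Unhealthy - Possible typosquat domains found."
    $Hits
}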

And that’s it! As always, Happy PowerShelling!

Automating with PowerShell: Adding domains to IT-Glue programmatically.

IT-Glue is a great application and has a lot of APIs available. Unfortunately there is no API to add domains or SSL certificates to IT-Glue. Seeing as I have a couple of sources that domains and SSL certificates come from, I’d still like to add them programmatically.

To do this, I’ve created the small function below. It uses your Chrome cookies to log in to the IT-Glue web page instead and submit a new domain. It’s quite hacky, but it works wonders when you need to grab data from a lot of different systems and smush it together in IT-Glue.

You could make some modifications to do this for other endpoints that have no API available, such as documents or SSL certificates.

<#
.SYNOPSIS
    Creates ITG domains using the current ITG cookie
.DESCRIPTION
    A method to create IT-Glue domains by reusing the browser cookie and committing the normal form workflow. This is because no domain API is currently available. The function expects you to be logged into IT-Glue using Chrome as your browser.
.EXAMPLE
    PS C:\> New-ITGDomain -OrgID 12345 -DomainName 'example.com' -ITGURL 'Yourcompany.Itglue.com'
    Creates a new ITGlue domain example.com in organisation 12345 for the ITglue YourCompany.ITGlue.com. Please note to *not* use the API URL in this case.
.INPUTS
    OrgID = Organisation ID in IT-Glue
    DomainName = Domain name you wish to add.
    ITGURL = The normal access URL to your IT-Glue instance.
.OUTPUTS
    Fail/Success
.NOTES
   No notes
#>
function New-ITGDomain {
    param (
        [string]$OrgID,
        [string]$DomainName,
        [string]$ITGURL
    )
    $ChromeCookieviewPath = "$($ENV:TEMP)/Chromecookiesview.zip"
    if (!(Test-Path $ChromeCookieviewPath)) {
        write-host "Downloading ChromeCookieView" -ForegroundColor Green
        Invoke-WebRequest 'https://www.nirsoft.net/utils/chromecookiesview.zip' -UseBasicParsing -OutFile $ChromeCookieviewPath
        Expand-Archive -path $ChromeCookieviewPath -DestinationPath  "$($ENV:TEMP)" -Force
    }
    Start-Process -FilePath "$($ENV:TEMP)/Chromecookiesview.exe" -ArgumentList "/scomma $($ENV:TEMP)/chromecookies.csv"
    start-sleep 1
    $Cookies = import-csv "$($ENV:TEMP)/chromecookies.csv"
    write-host "Grabbing Chrome Cookies" -ForegroundColor Green
    $hosts = $cookies | Where-Object { $_.'host name' -like '*ITGlue*' }

    write-host "Found cookies. Trying to create request" -ForegroundColor Green
    write-host "Grabbing ITGlue session" -ForegroundColor Green
    $null = Invoke-RestMethod -uri "https://$($ITGURL)/$($orgid)/domains/new" -Method GET -SessionVariable WebSessionITG
    foreach ($CookieFile in $hosts) {
        $cookie = New-Object System.Net.Cookie     
        $cookie.Name = $cookiefile.Name
        $cookie.Value = $cookiefile.Value
        $cookie.Domain = $cookiefile.'Host Name'
        $WebSessionITG.Cookies.Add($cookie)
    }
    write-host "Grabbing ITGlue unique token" -ForegroundColor Green
    $AuthToken = (Invoke-RestMethod -uri "https://$($ITGURL)/$($orgid)/domains/new" -Method GET -WebSession $WebSessionITG) -match '.*csrf-token"'
    $Token = $matches.0 -split '"' | Select-Object -Index 1
    write-host "Creating domain" -ForegroundColor Green
    $Result = Invoke-RestMethod -Uri "https://$($ITGURL)/$($orgid)/domains?submit=1" -Method "POST" -ContentType "application/x-www-form-urlencoded"   -Body "utf8=%E2%9C%93&authenticity_token=$($Token)&domain%5Bname%5D=$($DomainName)&domain%5Bnotes%5D=&amp;domain%5Baccessible_option%5D=all&domain%5Bresource_access_group_ids%5D%5B%5D=&domain%5Bresource_access_accounts_user_ids%5D%5B%5D=&commit=Save" -WebSession $WebSessionITG
    if ($Result -like "*Domain has been created successfully.*") {
        write-host "Succesfully created domain $DomainName for $OrgID" -ForegroundColor Green
    }
    else {
        write-host "Failed to create $DomainName for $orgid" -ForegroundColor Red
    }
}
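
As a usage example, you could feed the function a list of domains exported from another system. A quick sketch, assuming a CSV with a Domain column and a known organisation ID (both are placeholders):

$OrgID = '12345'
$Domains = Import-Csv "C:\Temp\Domains.csv"
foreach ($Domain in $Domains) {
    #Creates each domain in the IT-Glue organisation via the function above.
    New-ITGDomain -OrgID $OrgID -DomainName $Domain.Domain -ITGURL 'Yourcompany.itglue.com'
}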

Automating with PowerShell: Joining teams meetings with a code

In one of the MSP communities I’m in, the following question was recently asked:

Is it possible to join Microsoft Teams meeting in the same way as Webex and Zoom meetings; with just a code?

Question asked in https://msp.zone

I was actually surprised this wasn’t possible; you can join meetings via phone with an access code, but there’s no website where you can enter a meeting ID and just join. I decided to make this an option, so I made TeamsCodeJoiner: an Azure-hosted function that allows you to create and join meetings simply by using a code.

So far the functionality is pretty basic and more akin to a URL-shortening tool, but I’m working on more features, such as creating a code for each meeting created and letting the user know their unique access code. You can even put the function behind a custom domain; users could go to Join.yourcompanyname.com and enter the code, making joining meetings a lot easier from any device instead of having to have the full URL.

Deploying the Azure Function is straightforward: either press the Deploy with Azure button below, or check the code yourself on GitHub and create the function by hand. In regards to cost, this will be a couple of cents a month, or a dollar if used a lot. :)
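
If you’d rather script the deployment than use the button, something along these lines should work; this is a minimal sketch assuming the Az module is installed, and the resource group name, location, and template URL are placeholders you’ll need to replace with the ARM template from the TeamsCodeJoiner repository:

Connect-AzAccount
New-AzResourceGroup -Name 'TeamsCodeJoiner' -Location 'westeurope'
#Deploys the ARM template straight from the repository into the new resource group.
New-AzResourceGroupDeployment -ResourceGroupName 'TeamsCodeJoiner' -TemplateUri 'https://raw.githubusercontent.com/<YourFork>/TeamsCodeJoiner/master/azuredeploy.json'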

So how does this look in production? Below are some screenshots:

I hope you’ll enjoy this and as always, Happy PowerShelling!

Automating with PowerShell: Backup Teams Chats

I recently had a small discussion with a friend who is using Teams as his primary collaboration platform, just like our MSP does internally. He told me that the only thing he is really missing is a backup feature for Teams chats. He often deletes entire teams or channels after a project finishes, but his backup product can only back up files and folders inside the team’s SharePoint site.

So to help him out I’ve written the script below. The script goes over all the teams in your tenant and backs up the chat per channel. It makes an HTML file in reverse reading order (so the top-most message is the most recent one).

Permissions

As with most of my Office 365 scripts, you’ll need the Secure Application Model with some added permissions. For the permissions, perform the following:

  • Go to the Azure Portal.
  • Click on Azure Active Directory, now click on “App Registrations”.
  • Find your Secure App Model application. You can search based on the ApplicationID.
  • Go to “API Permissions” and click Add a permission.
  • Choose “Microsoft Graph” and “Application permission”.
  • Search for “Channel” and select “Channel.ReadBasic.All” and “ChannelMessage.Read.All”. Click on Add permissions.
  • Do the same for “Delegated permissions”.
  • Finally, click on “Grant admin consent for Company Name”.

The Script

It’s important to note that the Graph API is pretty limited when it comes to reading Teams messages; getting channel messages is limited to around 5 requests per second, and in my experience the limit is sometimes even lower. If you’re getting rate-limit errors, it’s best to increase the sleep time a bit. 🙂
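
If you keep running into throttling, you could also wrap the Graph calls in a small retry helper instead of only relying on the sleep; a minimal sketch (the function name and the back-off values are mine, not part of the script below):

function Invoke-GraphRequestWithRetry {
    param (
        [string]$Uri,
        [hashtable]$Headers,
        [int]$MaxRetries = 5
    )
    for ($Attempt = 1; $Attempt -le $MaxRetries; $Attempt++) {
        try {
            return Invoke-RestMethod -Uri $Uri -Headers $Headers -Method Get -ContentType "application/json"
        }
        catch {
            #On throttling (HTTP 429) we wait a little longer each attempt; anything else is rethrown.
            if ($_.Exception.Response.StatusCode.value__ -eq 429) {
                write-host "Throttled by Graph, waiting before retry $Attempt of $MaxRetries"
                Start-Sleep -Seconds (10 * $Attempt)
            }
            else {
                throw
            }
        }
    }
    throw "Request to $Uri kept getting throttled after $MaxRetries attempts."
}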

The script writes an HTML file for each backup. It does not use PSWriteHTML like I normally do, because that module doesn’t like having HTML inside its tables, and most Teams messages are HTML.

######### Secrets #########
$ApplicationId = 'YourAppID'
$ApplicationSecret = 'YourAppSecret' | ConvertTo-SecureString -Force -AsPlainText
$TenantIDToBackup = 'TenantToBackup.onmicrosoft.com'
$RefreshToken = 'VeryLongRefreshToken.'
######## Secrets #########
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $TenantIDToBackup
$Header = @{
    Authorization = "Bearer $($graphToken.AccessToken)"
}
$BaseURI = "https://graph.microsoft.com/beta"
$AllMicrosoftTeams = (Invoke-RestMethod -Uri  "$($BaseURI)/groups?`$filter=resourceProvisioningOptions/Any(x:x eq 'Team')" -Headers $Header -Method Get -ContentType "application/json").value


$head = @"
<script>
function myFunction() {
    const filter = document.querySelector('#myInput').value.toUpperCase();
    const trs = document.querySelectorAll('table tr:not(.header)');
    trs.forEach(tr => tr.style.display = [...tr.children].find(td => td.innerHTML.toUpperCase().includes(filter)) ? '' : 'none');
  }</script>
<Title>Teams Chat Backup</Title>
<style>
body { background-color:#E5E4E2;
      font-family:Monospace;
      font-size:10pt; }
td, th { border:0px solid black; 
        border-collapse:collapse;
        white-space:pre; }
th { color:white;
    background-color:black; }
table, tr, td, th {
     padding: 2px; 
     margin: 0px;
     white-space:pre; }
tr:nth-child(odd) {background-color: lightgray}
table { width:95%;margin-left:5px; margin-bottom:20px; }
h2 {
font-family:Tahoma;
color:#6D7B8D;
}
.footer 
{ color:green; 
 margin-left:10px; 
 font-family:Tahoma;
 font-size:8pt;
 font-style:italic;
}
#myInput {
  background-image: url('https://www.w3schools.com/css/searchicon.png'); /* Add a search icon to input */
  background-position: 10px 12px; /* Position the search icon */
  background-repeat: no-repeat; /* Do not repeat the icon image */
  width: 50%; /* Full-width */
  font-size: 16px; /* Increase font-size */
  padding: 12px 20px 12px 40px; /* Add some padding */
  border: 1px solid #ddd; /* Add a grey border */
  margin-bottom: 12px; /* Add some space below the input */
}
</style>
"@
   

foreach ($Team in $AllMicrosoftTeams) {
    $TeamsChannels = (Invoke-RestMethod -Uri "$($BaseURI)/teams/$($Team.id)/channels" -Headers $Header -Method Get -ContentType "application/json").value
    foreach ($Channel in $TeamsChannels) {
        $i = 100

        $MessagesRaw = do {
            if (!$MessageURI) { $MessageURI = "$($BaseURI)/teams/$($Team.id)/channels/$($channel.id)/messages?`$top=100" }
            $MessagePage = (Invoke-RestMethod -Uri $MessageURI -Headers $Header -Method Get -ContentType "application/json" -ErrorAction SilentlyContinue)
            $MessageURI = $Messagepage.'@odata.nextlink'
            $Messagepage.value
            write-host "Got $i messages for $($team.displayName) / $($channel.Displayname)"
            $i = $i + 100
            start-sleep 10
        } while ($Messagepage.'@OData.nextlink')
        $MessagesHTML = $MessagesRaw | Select-Object @{label = 'Created on'; expression = { [datetime]$_.CreatedDateTime } },
        @{label = 'From'; expression = { $_.from.user.displayname } },
        @{label = 'Message'; expression = { $_.body.content } },
        @{label = 'Message URL'; expression = { $_.body.weburl } } | ConvertTo-Html -Head $head
        [System.Web.HttpUtility]::HtmlDecode($MessagesHTML) | out-file "C:\temp\$($Team.displayName) - $($Channel.displayName).html"

    } 

     
}

So how does this look? Well, if all goes well, the HTML files look something like this:

And that’s it! As always, Happy PowerShelling!

Monitoring with PowerShell: Monitoring NAS devices

This was actually requested a couple of times, but I’ve always seemed to hold off on it, mostly because we don’t use that many NAS devices anymore. I did decide to make it a bit more universal than just NAS devices – it’s really about monitoring any device that has SSH access and for which you know the commands you need. I’ve just used NAS devices as the example in this case.

We’re using SSH because most NAS devices support it – e.g. QNAP, Synology, etc. I like this method more than SNMP monitoring because I believe more strongly in agent-based monitoring than in SNMP probing.

So let’s get started.

The Script

The script itself is very straightforward: we download the Posh-SSH module by Carlos Perez, connect to the NAS using SSH, and execute the command defined in $Command. For most NAS devices you can use the command “cat /proc/mdstat”. After executing the command we check whether the output contains any words we don’t like, such as fail, rebuild, recovery, etc. 🙂

$Command = 'cat /proc/mdstat'
$NotHealthyStates = "*recovery*", "*failed*", "*failure*", "*offline*", "*rebuilding*"
$DeviceIP = 'YOURDEVICEIP OR HOSTNAME'
$Creds = Get-Credential #Replace this with your RMMs method to handle credentials.

If (Get-Module -ListAvailable -Name "Posh-SSH") { Import-module "Posh-SSH" } Else { install-module "Posh-SSH" -Force; import-module "Posh-SSH" }

try {
    $null = New-SSHSession -ComputerName $DeviceIP -Credential $Creds -AcceptKey -force
    $SSHOutput = (Invoke-SSHCommand -Index 0 -Command $command).output
    $null = Get-SSHSession | Remove-SSHSession
}
catch {
    write-host "Unhealthy - Could not connect to SSH."
    break
}
if (!$SSHOutput) { write-host "Unhealthy - No SSH output found" ; break }

$HealthState = foreach ($State in $NotHealthyStates) {
    if ($SSHOutput -like $State) {
        "Unhealthy - $State found."
    }
}

if (!$HealthState) { write-host "Healthy - No issues found." } else { write-host $HealthState }
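
The same skeleton works for any other SSH command you trust on your device. For example, here’s a hedged variant that alerts when a volume passes a usage threshold by parsing ‘df -h’ output; the 90% threshold and the column positions are assumptions that may differ per NAS, and it reuses $DeviceIP and $Creds from the script above:

$Command = 'df -h'
$MaxUsedPercentage = 90
$null = New-SSHSession -ComputerName $DeviceIP -Credential $Creds -AcceptKey -Force
$SSHOutput = (Invoke-SSHCommand -Index 0 -Command $Command).Output
$null = Get-SSHSession | Remove-SSHSession

#df output is: Filesystem, Size, Used, Avail, Use%, Mounted on. We grab the Use% column per line.
$FullVolumes = foreach ($Line in ($SSHOutput | Select-Object -Skip 1)) {
    $Columns = $Line -split '\s+'
    if ($Columns.Count -lt 6) { continue }
    $UsedPercentage = [int]($Columns[4] -replace '%')
    if ($UsedPercentage -ge $MaxUsedPercentage) { "Unhealthy - $($Columns[5]) is at $($UsedPercentage)% used" }
}
if (!$FullVolumes) { write-host "Healthy - No volumes above $($MaxUsedPercentage)% used." } else { write-host $FullVolumes }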

And that’s it! As always, Happy PowerShelling. 🙂

Monitoring with PowerShell: Monitoring print queues

So this one is related to my recent blog about documenting printers; right after that blog I got some comments that I had never focused on printer monitoring. I think that’s because I have a mental block about printers, as I see them as pure evil. 🙂

But jokes aside, of course we can monitor printers and print queues. We have some clients for which printing is a key part of the business, so I’ve created the monitoring component below. It monitors both the printer status (e.g. offline, error, no toner, etc.) and the job status. Sometimes jobs get stuck in the queue, blocking all others, so it’s a good one to cover.

The Script

$Printers = get-printer

$PrintStatus = foreach ($Printer in $Printers) {
    $PrintJobs = get-PrintJob -PrinterObject $printer
    $JobStatus = foreach ($job in $PrintJobs) {
        if ($Job.JobStatus -ne "normal") { "Not Healthy - $($Job)" }
    }
    [PSCustomObject]@{
        PrinterName   = $printer.Name
        PrinterStatus = $Printer.PrinterStatus
        PrinterType   = $Printer.Type
        JobStatus     = $JobStatus
    }
}

$PrintersNotNormal = $PrintStatus | Where-Object { $_.PrinterStatus -ne "Normal" }
if (!$PrintersNotNormal) {
    write-host "Healthy - No Printer Errors found" 
}
else {
    Write-Host "Unhealthy - Printers with errors found."
    $PrintersNotNormal
}
$JobsNotNormal = $PrintStatus | Where-Object { $null -ne $_.JobStatus }
if (!$JobsNotNormal) {
    write-host "Healthy - No Job Errors found" 
}
else {
    Write-Host "Unhealthy - JJobs with errors found."
    $JobsNotNormal
}
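
If you want a remediation to pair with this monitor, a minimal sketch that clears a stuck queue could look like this; clearing every job on the printer and restarting the spooler is a deliberately blunt assumption, so adjust it to your own needs:

$PrinterName = 'YourPrinterName'
#Remove every job in the queue and bounce the spooler so the printer can recover.
Get-PrintJob -PrinterName $PrinterName | Remove-PrintJob
Restart-Service -Name Spooler -Force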

And that’s it! A simple and short one. I’ll be posting some more information about my bigger project AzPAM somewhere in December. For next week I have some Azure Function stuff planned – one of them being a method to join a Microsoft Teams meeting using a code instead of a super long URL. 🙂

As always, Happy PowerShelling!

Monitoring with PowerShell: App hangs

I was talking to a friend the other day who has been using my user experience script in his RMM system for a while. He told me he loved having the ability to measure the user’s experience, but he had some clients with in-house applications that constantly write errors to the event log, and other clients with crashing services that could not be prevented.

This caused him to disable the user experience monitoring script for those clients, which is a shame, because it’s what we should be focusing on as MSPs these days. I figured I’d make a lighter version that does not rely on the Windows Reliability Index. That way we can avoid the crashing services and other known issues skewing the results. So let’s get to the script!

The Script

Instead of grabbing the Reliability Index, we’re collecting all application logs for the last 15 minutes and counting how many app hangs have been experienced. An app hang is when an application gives the famous “Not responding” pop-up. We also grab hard application crashes, but filter out those we don’t want to see, such as the LOB application mentioned above.

$IDs = "1002", "1000"
$ExcludedApplications = "*Slack*"
$MaxCount = '1'

$LogFilter = @{
    LogName = 'Application'
    ID      = $IDs
    StartTime = (get-date).AddMinutes(-15)
} 

$Last15Minutes = Get-WinEvent -FilterHashTable $LogFilter -ErrorAction SilentlyContinue | where-object { $_.message -notlike $ExcludedApplications }


if ($Last15Minutes.count -ge $MaxCount) {
    write-host "Unhealthy - The maximum application crash logs are higher than $MaxCount"
}

if (!$Last15Minutes) {
    write-host "Healthy - No app crash logs found"
}

Now, this is really just an example of how you can achieve this – if anything, I’d suggest expanding on it and gathering some more information to compile your own reliability score; a rough starting point for that follows below. I also don’t particularly love reading Windows events instead of monitoring directly, but app hangs aren’t really recorded anywhere else.
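
As a starting point for such a score, a rough sketch could count crashes, hangs, and service failures over the last 24 hours and turn them into a number; the event IDs, the weights, and the 0-10 scale are all assumptions you can tune:

$Since = (Get-Date).AddHours(-24)
#Application crashes (1000) and hangs (1002), plus Service Control Manager failures (7031/7034).
$AppEvents = Get-WinEvent -FilterHashtable @{ LogName = 'Application'; ID = 1000, 1002; StartTime = $Since } -ErrorAction SilentlyContinue
$ServiceEvents = Get-WinEvent -FilterHashtable @{ LogName = 'System'; ID = 7031, 7034; StartTime = $Since } -ErrorAction SilentlyContinue
$Score = [math]::Max(0, 10 - ($AppEvents.Count * 1) - ($ServiceEvents.Count * 0.5))
write-host "Custom reliability score for the last 24 hours: $Score"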

And that’s it! As always, Happy PowerShelling 🙂

Monitoring with PowerShell: Monitoring Domain Admins logon

So this is one I’ve been researching for a new tool I’m creating: AzPAM. AzPAM will be a Privileged Access Management tool that lives in your Azure environment, designed mostly for MSPs. If you want to see how AzPAM looks or want to contribute, check out the GitHub page about it here. I should be pretty close to releasing an alpha version soon! 🙂

To make sure AzPAM can also work with local accounts and domain admin accounts, I figured I’d try to monitor when such an account has logged on. It then dawned on me that this might be something you’d want to monitor in general. We’ve talked about monitoring new admins and groups before, but never directly whether a Domain Admin has logged on.

The Script

This script checks the LastLogonTimestamp attribute in Active Directory and checks whether each Domain Admin account has logged on in the last 24 hours. You can exclude accounts by adding them to the $ExcludedAdmins variable.

$ExludedAdmins = "JamesDoe", "JohnDoe"

$GroupMembers = Get-ADGroupMember -Identity 'Domain Admins'
$LoggedOntoday = foreach ($member in $GroupMembers) {
    if ($member.name -in $ExcludedAdmins) {
        write-host "Skipping $($member.name)" -ForegroundColor Green
        continue
    }
    $ADUser = Get-ADUser -Identity $member.sid -Properties 'LastLogonTimeStamp'
    if ($null -eq $ADUser.LastLogonTimestamp) { continue }
    if ([datetime]::FromFileTime($ADUser.LastLogonTimeStamp) -gt (get-date).AddHours(-24)) { 
        "$($member.name) has logged on in the last 24 hours"
    }

}

if (!$LoggedOntoday) { "Healthy. No Domain Admins have logged on today" }

And that’s it! I know it’s a bit of a short one, but with all the work I’m doing on AzPAM I’ll be sure to make it up to you guys soon! As always, Happy PowerShelling.

Monitoring with PowerShell: Monitoring Outlook offline mode, OST sizes, and active PSTs

As some of you have noticed, I haven’t really been blogging for the past 2 weeks. My father recently died and I had to take some me-time. I’m going to get back to blogging regularly again, starting now 🙂

In today’s blog I’m going to show how to monitor whether Outlook has been set to offline mode by the user and whether the OST size is nearing its maximum, and as a bonus I’m also giving you the option of alerting on active PST files. The offline mode check is just a handy gizmo to notify users that they might have misclicked – it still happens to our users from time to time.

We have a lot of users that work in shared mailboxes. These shared mailboxes get added to the user via automapping, and automapping dumps all the information into a single user’s OST. The official maximum OST size is 100GB, so if you have 10 shared mailboxes of 10GB each, the OST can fill up and the user won’t be able to send or receive e-mail.

Monitoring offline mode

This script uses my RunAsUser module, because Outlook only runs in the user’s context and as such you need to run these commands as the user themselves.

Install-module RunAsUser -Force
$ScriptBlock = { 
    try {
        $outlook = new-object -comobject outlook.application
        $State = if ($outlook.session.offline) { "Unhealthy - Outlook has been set to work offline." } else { "Healthy - Outlook is in online mode." }
        set-content "C:\programdata\Outlookmonitoring.txt" -value $State -force
    }
    catch {
        set-content "C:\programdata\Outlookmonitoring.txt" -Value  "Could not connect to outlook. " -Force
    }
}
Invoke-AsCurrentUser -UseWindowsPowerShell -NonElevatedSession -scriptblock $ScriptBlock
$Errorstate = get-content "C:\programdata\Outlookmonitoring.txt"

$Errorstate

Monitoring OST Sizes

Like I said before, the maximum size of an OST is 100GB; above that you’ll experience a lot of performance loss, so we want to keep it nice and small – let’s say around 60GB. By using this monitoring method you can find exactly which OSTs are in use and how large they are.

Install-module RunAsUser -Force
$ScriptBlock = { 
    try {
        $FileSizeAlert = 60GB
        $outlook = new-object -comobject outlook.application
        $OSTS = ($outlook.session.stores | where-object {$_.filepath -ne ""}).filepath
        $State = foreach ($OST in $OSTS) {
            $File = get-item $OST
            if($File.Length -gt $FileSizeAlert){ "$OST is larger than alert size" }
        }
        if(!$State){ $State = "Healthy - No Large OST found."}
        set-content "C:\programdata\OutlookOSTmonitoring.txt" -value $State -force
    }
    catch {
        set-content "C:\programdata\OutlookOSTmonitoring.txt" -Value  "Could not connect to outlook. " -Force
    }
}
Invoke-AsCurrentUser -UseWindowsPowerShell -NonElevatedSession -scriptblock $ScriptBlock
$Errorstate = get-content "C:\programdata\OutlookOSTmonitoring.txt"

$Errorstate

Finding actively used PST files

Of course we all want to avoid PST files as much as possible; they are prone to data loss and just a pretty fragile format in general. To find out whether users have a PST actively mounted in Outlook, you can use the following script:

Install-module RunAsUser -Force
$ScriptBlock = { 
    try {
        $outlook = new-object -comobject outlook.application
        $PSTS = ($outlook.session.stores | where-object { $_.filepath -like "*pst" }).filepath
        if (!$PSTS) { $PSTS = "Healthy - No active PST found." }
        set-content "C:\programdata\OutlookPSTmonitoring.txt" -value $PSTS -force
    }
    catch {
        set-content "C:\programdata\OutlookPSTmonitoring.txt" -Value  "Could not connect to outlook. " -Force
    }
}
Invoke-AsCurrentUser -UseWindowsPowerShell -NonElevatedSession -scriptblock $ScriptBlock
$Errorstate = get-content "C:\programdata\OutlookPSTmonitoring.txt"

$Errorstate

And that’s all! As always, Happy PowerShelling