Category Archives: Series: PowerShell Monitoring

Monitoring with PowerShell: Monitoring SMART status using SmartCTL.

Some time ago I wrote a blog about monitoring SMART status with CrystalDiskInfo. After bringing this script over to our production RMM environment everything seemed good. But when I looked a little deeper I found that the script failed on NVMe drives. NVMe drives handle SMART status differently from ‘regular’ SATA drives.

This started me on a quest for a solution that also works on NVMe drives. I’ve decided to use smartmontools, as it has the same benefits as CrystalDiskInfo – it’s portable, does not require an installation, and is small enough to be downloaded on demand.

The script is fairly straightforward: it downloads the utility from a host, extracts it, and runs an update of the smartctl drive database so it can interpret the data correctly. After that, for each disk in the system, it compares the values against the thresholds you’ve set up.

I’ve also had a request for disk monitoring specifically on the available spare count. The script can be edited to monitor this too – that way you can decide on your own thresholds instead of relying on the defaults the manufacturer set for the disk. A sketch of such a check follows after the main script below.

The script

############ Thresholds #############
$PowerOnTime = 35063 #about 4 years constant runtime.
$PowerCycles = 4000 #4000 times of turning drive on and off
$Temperature = 60 #60 degrees Celsius
############ End Thresholds #########
$DownloadURL = "https://cyberdrain.com/wp-content/uploads/2020/02/Smartmontools.zip"
$DownloadLocation = "$($Env:ProgramData)\SmartmonTools"
try {
    $TestDownloadLocation = Test-Path $DownloadLocation
    if (!$TestDownloadLocation) { new-item $DownloadLocation -ItemType Directory -force }
    $TestDownloadLocationZip = Test-Path "$DownloadLocation\Smartmontools.zip"
    if (!$TestDownloadLocationZip) { Invoke-WebRequest -UseBasicParsing -Uri $DownloadURL -OutFile "$($DownloadLocation)\Smartmontools.zip" }
    $TestDownloadLocationExe = Test-Path "$DownloadLocation\smartctl.exe"
    if (!$TestDownloadLocationExe) { Expand-Archive "$($DownloadLocation)\Smartmontools.zip" -DestinationPath $DownloadLocation -Force }
}
catch {
    write-host "The download and extraction of SMARTCTL failed. Error: $($_.Exception.Message)"
    exit 1
}
#update the smartmontools database
start-process -filepath "$DownloadLocation\update-smart-drivedb.exe" -ArgumentList "/S" -Wait
#find all connected HDDs
$HDDs = (& "$DownloadLocation\smartctl.exe" --scan -j | ConvertFrom-Json).devices
$HDDInfo = foreach ($HDD in $HDDs) {
    (& "$DownloadLocation\smartctl.exe" -t short -a -j $HDD.name) | convertfrom-json
}
$DiskHealth = @{}
#Checking SMART status
$SmartFailed = $HDDInfo | Where-Object { $_.Smart_Status.Passed -ne $true }
if ($SmartFailed) { $DiskHealth.add('SmartErrors',"Smart Failed for disks: $($SmartFailed.serial_number)") }
#checking Temp Status
$TempFailed = $HDDInfo | Where-Object { $_.temperature.current -ge $Temperature }
if ($TempFailed) { $DiskHealth.add('TempErrors',"Temperature failed for disks: $($TempFailed.serial_number)") }
#Checking Power Cycle Count status
$PCCFailed = $HDDInfo | Where-Object { $_.Power_Cycle_Count -ge $PowerCycles }
if ($PCCFailed ) { $DiskHealth.add('PCCErrors',"Power Cycle Count Failed for disks: $($PCCFailed.serial_number)") }
#Checking Power on Time Status
$POTFailed = $HDDInfo | Where-Object { $_.Power_on_time.hours -ge $PowerOnTime }
if ($POTFailed) { $DiskHealth.add('POTErrors',"Power on Time failed for disks: $($POTFailed.serial_number)") }

if ($DiskHealth.Count -eq 0) { $DiskHealth = "Healthy" } #An empty hashtable evaluates to $true, so check the count instead.
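
As mentioned earlier, the same JSON output can be used to watch the NVMe available spare count. Below is a minimal sketch of such a check, assuming smartctl reports the NVMe health log under nvme_smart_health_information_log (exact field names depend on the drive and smartctl version); it would sit alongside the other threshold checks, before the final "Healthy" assignment:

############ Example: NVMe available spare #############
$MinimumSparePercent = 50 #alert when less than 50% of the spare capacity remains
$SpareFailed = $HDDInfo | Where-Object { $null -ne $_.nvme_smart_health_information_log -and $_.nvme_smart_health_information_log.available_spare -lt $MinimumSparePercent }
if ($SpareFailed) { $DiskHealth.add('SpareErrors', "Available spare is low for disks: $($SpareFailed.serial_number)") }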

And that’s it! As always, Happy PowerShelling.

Monitoring with PowerShell: Monitoring internet speeds

It seems like I’m having a week of requests. This one was requested by my friends at Datto. One of their clients wanted to have the ability to run speed-tests and have their RMM system generate alerts whenever the speed drops. I’ve made the following PowerShell script that uses the CLI utility from speedtest.net. This utility gives us some nice feedback to work with;

  • It returns the external IP & the internal IP for the interface used.
  • The current ISP.
  • The download and upload speed.
  • The jitter, latency, and packet loss of the connection.
  • and the server it uses, plus the actual speedtest.net URL so you can compare the results.

So, I’ve made the script use two different monitoring methods: one is absolute and based on the values you’ve entered. The other is a percentage-based monitor that alerts if the difference between the current speed test and the previous one is more than 20%.

The script

The script downloads the Speedtest utility from the speedtest website. You can always replace this URL with your own host if you’d like.

######### Absolute monitoring values ########## 
$maxpacketloss = 2 #how much % packet loss until we alert.
$MinimumDownloadSpeed = 100 #What is the minimum expected download speed in Mbit/s
$MinimumUploadSpeed = 20 #What is the minimum expected upload speed in Mbit/s
######### End absolute monitoring values ######

#Replace the Download URL to where you've uploaded the ZIP file yourself. We will only download this file once. 
#Latest version can be found at: https://www.speedtest.net/nl/apps/cli
$DownloadURL = "https://bintray.com/ookla/download/download_file?file_path=ookla-speedtest-1.0.0-win64.zip"
$DownloadLocation = "$($Env:ProgramData)\SpeedtestCLI"
try {
    $TestDownloadLocation = Test-Path $DownloadLocation
    if (!$TestDownloadLocation) {
        new-item $DownloadLocation -ItemType Directory -force
        Invoke-WebRequest -Uri $DownloadURL -OutFile "$($DownloadLocation)\speedtest.zip"
        Expand-Archive "$($DownloadLocation)\speedtest.zip" -DestinationPath $DownloadLocation -Force
    } 
}
catch {  
    write-host "The download and extraction of SpeedtestCLI failed. Error: $($_.Exception.Message)"
    exit 1
}
$PreviousResults = if (test-path "$($DownloadLocation)\LastResults.txt") { get-content "$($DownloadLocation)\LastResults.txt" | ConvertFrom-Json }
$SpeedtestResults = & "$($DownloadLocation)\speedtest.exe" --format=json --accept-license --accept-gdpr
$SpeedtestResults | Out-File "$($DownloadLocation)\LastResults.txt" -Force
$SpeedtestResults = $SpeedtestResults | ConvertFrom-Json

#creating object
[PSCustomObject]$SpeedtestObj = @{
    downloadspeed = [math]::Round($SpeedtestResults.download.bandwidth / 1000000 * 8, 2)
    uploadspeed   = [math]::Round($SpeedtestResults.upload.bandwidth / 1000000 * 8, 2)
    packetloss    = [math]::Round($SpeedtestResults.packetLoss)
    isp           = $SpeedtestResults.isp
    ExternalIP    = $SpeedtestResults.interface.externalIp
    InternalIP    = $SpeedtestResults.interface.internalIp
    UsedServer    = $SpeedtestResults.server.host
    ResultsURL    = $SpeedtestResults.result.url
    Jitter        = [math]::Round($SpeedtestResults.ping.jitter)
    Latency       = [math]::Round($SpeedtestResults.ping.latency)
}
$SpeedtestHealth = @()
#Comparing against the previous result. Alerting if download or upload dropped more than 20%.
if ($PreviousResults) {
    if ($SpeedtestResults.download.bandwidth / $PreviousResults.download.bandwidth * 100 -le 80) { $SpeedtestHealth += "Download speed difference is more than 20%" }
    if ($SpeedtestResults.upload.bandwidth / $PreviousResults.upload.bandwidth * 100 -le 80) { $SpeedtestHealth += "Upload speed difference is more than 20%" }
}

#Comparing against preset variables.
if ($SpeedtestObj.downloadspeed -lt $MinimumDownloadSpeed) { $SpeedtestHealth += "Download speed is lower than $MinimumDownloadSpeed Mbit/s" }
if ($SpeedtestObj.uploadspeed -lt $MinimumUploadSpeed) { $SpeedtestHealth += "Upload speed is lower than $MinimumUploadSpeed Mbit/s" }
if ($SpeedtestObj.packetloss -gt $MaxPacketLoss) { $SpeedtestHealth += "Packet loss is higher than $maxpacketloss%" }

if (!$SpeedtestHealth) {
    $SpeedtestHealth = "Healthy"
}
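
The script above leaves its findings in $SpeedtestObj and $SpeedtestHealth without writing them anywhere. A small sketch of one way to surface the result to an RMM (adjust the output and exit code to whatever your RMM expects):

write-host $SpeedtestHealth
if ($SpeedtestHealth -ne "Healthy") {
    #Include the measured values so the alert shows the actual speeds.
    write-host ($SpeedtestObj | Out-String)
    exit 1
}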

And that is it! So this monitoring component will be uploaded to the ComStore soon. As always, Happy PowerShelling!

Monitoring with PowerShell: WAN IP changes and Active Directory ages

I’ve been super swamped the last couple of days, as we’re working on our ISO 27001 audit in our office. This means most of my time is just being swallowed by auditors. I’ve decided not to break my streak of releasing my blogs on time, so this time we’re covering some requests from our readers!

Monitoring WAN IP changes

This was requested by the Reddit user “EqualWorking1”. He wanted the ability to see when the WAN IP changes for one of his servers, as he suspected an ISP kept dropping the link every few minutes. The script needs to run once to create a baseline IP file, and runs the comparison based on that.

$previousIP = get-content "$($env:ProgramData)/LastIP.txt" -ErrorAction SilentlyContinue | Select-Object -first 1
if (!$previousIP) { Write-Host "No previous IP found. Compare will fail." }
$Currentip = (Invoke-RestMethod -Uri "https://ipinfo.io/ip") -replace "`n", ""
$Currentip | out-file "$($env:ProgramData)/LastIP.txt" -Force

if ($Currentip -eq $previousIP) {
    write-host "Healthy"
}
else {
    write-host "External WAN address is incorrect. Expected $PreviousIP but received $Currentip"
    write-host (@{
        CurrentIP  = $Currentip
        PreviousIP = $previousIP
    } | Out-String)
    exit 1
}

Monitoring old computer accounts on Active Directory

This one was requested by Johan, on the N-Central Slack channel. He wants the ability to alert when computer accounts have not logged on for longer than a specific age; in his case, 90 days.

 $ENV:ComputerAge = 90
$age = (get-date).AddDays(-$ENV:ComputerAge)
$DomainCheck = Get-CimInstance -ClassName Win32_OperatingSystem
if ($DomainCheck.ProductType -ne "2") { write-host "Not a domain controller. Soft exiting." ; exit 0 }
$OldComputers = Get-ADComputer -Filter * -properties DNSHostName,Enabled,WhenCreated,LastLogonDate | select DNSHostName,Enabled,WhenCreated,LastLogonDate | Where-Object {$_.LastLogonDate -lt $age}


if (!$OldComputers) {
    write-host "Healthy - No computers older than $ENV:ComputerAge found."
}
else {
    write-host"Not Healthy - Computer accounts found older than $ENV:ComputerAge  days"
    write-host ($OldComputers | Out-String)
}

Monitoring old user accounts on Active Directory

And this one was just added for myself. I like knowing if accounts haven’t been logged onto in some time 🙂

 $ENV:UserAge = 30
$age = (get-date).AddDays(-$ENV:UserAge)
$DomainCheck = Get-CimInstance -ClassName Win32_OperatingSystem
if ($DomainCheck.ProductType -ne "2") { write-host "Not a domain controller. Soft exiting." ; exit 0 }
$OldUsers = Get-ADUser -Filter * -properties UserPrincipalName, Enabled, WhenCreated, LastLogonDate | select UserPrincipalName, Enabled, WhenCreated, LastLogonDate | Where-Object { $_.LastLogonDate -lt $age }


if (!$OldUsers) {
    write-host "Healthy"
}
else {
    write-host "Not Healthy - Users found that havent logged in for $ENV:UserAge days"
    write-host @($OldUsers)
} 

And that’s it this time! Short but sweet. I hope you enjoyed it, and if there are any more requests, let me know! 🙂 As always, Happy PowerShelling.

Monitoring with PowerShell: Monitoring psexec execution

A bunch of bad actors these days use the great PsExec tool by Sysinternals/Microsoft to try to move laterally through the network. PsExec allows you to remotely execute commands on different computers through a very simple command line interface. PsExec also allows you to execute commands or scripts as the SYSTEM account.

We use PsExec professionally to run specific tooling that requires the highest privileges, which means that just flat out blocking PsExec execution on our networks is not possible. We do want to know whenever people execute it, so we can use it as an early warning system. Please note that this does not capture PsExec clones such as CDEXec and PAExec.

Grabbing PsExec usage actually did not look that hard – you either look for the service it creates, or for the currently running file. So I thought the following simple script would solve it.

The Scripts

#Find the PSExec service
$PSExecmon = get-service PSEXESVC -ErrorAction SilentlyContinue
if (!$PSExecmon) {
    $PSExecHealth = "Healthy - no PSExec service found."
}
else {
    $PSExecHealth = "Unhealthy - PSExec service found"
}

 

But after checking some of my older scripts I actually found the -r option for PsExec. The -r parameter allows you to change both the service name and the executable name, making it a little harder to find. To solve this, we can look for any running executable where the file description is set to PsExec.

#Find PSExec by the process file description
$PSExecmon = get-process | Where-Object { $_.description -like "*psexec*" }
if (!$PSExecmon) {
    $PSExecHealth = "Healthy - no PSExec process found."
}
else {
    $PSExecHealth = "Unhealthy - PSExec process found"
}

But now you’re saying: “But Kelvin, what if someone removes the identifying properties?! You won’t find it, but it will still run.” And I’d say you’re absolutely right about that. So let’s try a third option;

$Procs = Get-Process | Where-Object { $_.Path -ne $null }
$PSExecmon = foreach ($Proc in $procs) {
    $Sig = Get-AuthenticodeSignature $proc.path
    if ($Sig.SignerCertificate.Thumbprint -eq "3BDA323E552DB1FDE5F4FBEE75D6D5B2B187EEDC") { $proc }
}
if (!$PSExecmon) {
    $PSExecHealth = "Healthy - no process signed with the PSExec certificate found."
}
else {
    $PSExecHealth = "Unhealthy - PSExec process found"
}

This last option does have a downside too; Microsoft used the same signing certificate for the PSTools as they did for a .NET installer. This might generate one or two false positives, but it does seem the best way to detect PsExec usage right now. We’ve loaded this up in our RMM and run this job to make sure we can see whenever PsExec is executed. With this option you could also stop the process automatically, as shown in the sketch below.
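
For example, a minimal sketch that stops any matching process right after the certificate check – use with care, since the certificate match can produce the false positives mentioned above:

if ($PSExecmon) {
    #Stop every running process that matched the PSExec signing certificate.
    $PSExecmon | Stop-Process -Force
}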

And that’s it! Hopefully it helps prevent lateral movement in your networks too. As always, Happy PowerShelling!

Monitoring with PowerShell: Monitoring O365 / Azure Breakglass account logon.

Microsoft advises keeping a “breakglass” account for environments, in case of a major cell malfunction or other emergency situations. Our worry about these accounts has always been how to check that they have not been compromised or even used in any way.

I’ve created the following two scripts for this. These scripts are for two cases:

  • The first script is to check the status of break glass accounts in general.
  • The second script is to check the logon status of any admin account.

The second script was designed for just-in-time administration. We have partner access to all of our clients, so normally speaking we don’t need admin accounts. The only problem with this is that some items, such as the Security Center, cannot be used by partner administrators or delegated access. To get access to these parts we create a temporary admin. We alert on -ANY- logged-on admin.

Break-glass monitoring

#We match on the following account. prefix all your breakglass accounts with this. e.g. "Breakglass-1234321"
$BreakGlassUser = "Breakglass*"
############################################################
$ApplicationId         = 'xxxx-xxxx-xxx-xxxx-xxxx'
$ApplicationSecret     = 'TheSecretTheSecrey' | Convertto-SecureString -AsPlainText -Force
$TenantID              = 'YourTenantID'
$RefreshToken          = 'RefreshToken'
$ExchangeRefreshToken  = 'ExchangeRefreshToken'
$upn                   = 'UPN-Used-To-Generate-Tokens'
#############################################################
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)

$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 

Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
$customers = Get-MsolPartnerContract -All
foreach ($customer in $customers) {
  $token = New-PartnerAccessToken -ApplicationId 'a0c73c16-a7e3-4564-9a95-2bdf47383716' -RefreshToken $ExchangeRefreshToken -Scopes 'https://outlook.office365.com/.default' -Tenant $customer.TenantId
  $tokenValue = ConvertTo-SecureString "Bearer $($token.AccessToken)" -AsPlainText -Force
  $credential = New-Object System.Management.Automation.PSCredential($upn, $tokenValue)
  $customerId = $customer.DefaultDomainName
  $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell-liveid?DelegatedOrg=$($customerId)&BasicAuthToOAuthConversion=true" -Credential $credential -Authentication Basic -AllowRedirection
  Import-PSSession $session -allowclobber -DisableNameChecking
  $startDate = (Get-Date).AddDays(-1)
  $endDate = (Get-Date)
  $Logs = @()
  Write-Host "Retrieving logs for $($customer.name)" -ForegroundColor Blue
  do {
    $logs += Search-unifiedAuditLog -SessionCommand ReturnLargeSet -SessionId $customer.name  -ResultSize 5000 -StartDate $startDate -EndDate $endDate -Operations UserLoggedIn
    Write-Host "Retrieved $($logs.count) logs" -ForegroundColor Yellow
  }while ($Logs.count % 5000 -eq 0 -and $logs.count -ne 0)
  Write-Host "Finished Retrieving logs" -ForegroundColor Green
  $logs | Select-Object UserIds, Operations, CreationDate | Where-Object {$_.UserIds -like $BreakGlassUser}
}

Just-In-Time Admin logging

 ############################################################
$ApplicationId         = 'xxxx-xxxx-xxx-xxxx-xxxx'
$ApplicationSecret     = 'TheSecretTheSecrey' | Convertto-SecureString -AsPlainText -Force
$TenantID              = 'YourTenantID'
$RefreshToken          = 'RefreshToken'
$ExchangeRefreshToken  = 'ExchangeRefreshToken'
$upn                   = 'UPN-Used-To-Generate-Tokens'
#############################################################
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)

$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 

Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
$customers = Get-MsolPartnerContract -All
foreach ($customer in $customers) {
  $adminaccounts  = (Get-MsolRoleMember -TenantId $customer.tenantid -RoleObjectId (Get-MsolRole -RoleName "Company Administrator").ObjectId).EmailAddress
  $token = New-PartnerAccessToken -ApplicationId 'a0c73c16-a7e3-4564-9a95-2bdf47383716' -RefreshToken $ExchangeRefreshToken -Scopes 'https://outlook.office365.com/.default' -Tenant $customer.TenantId
  $tokenValue = ConvertTo-SecureString "Bearer $($token.AccessToken)" -AsPlainText -Force
  $credential = New-Object System.Management.Automation.PSCredential($upn, $tokenValue)
  $customerId = $customer.DefaultDomainName
  $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell-liveid?DelegatedOrg=$($customerId)&BasicAuthToOAuthConversion=true" -Credential $credential -Authentication Basic -AllowRedirection
  Import-PSSession $session -allowclobber -DisableNameChecking
  $startDate = (Get-Date).AddDays(-1)
  $endDate = (Get-Date)
  $Logs = @()
  Write-Host "Retrieving logs for $($customer.name)" -ForegroundColor Blue
  do {
    $logs += Search-unifiedAuditLog -SessionCommand ReturnLargeSet -SessionId $customer.name  -ResultSize 5000 -StartDate $startDate -EndDate $endDate -Operations UserLoggedIn
    Write-Host "Retrieved $($logs.count) logs" -ForegroundColor Yellow
  }while ($Logs.count % 5000 -eq 0 -and $logs.count -ne 0)
  Write-Host "Finished Retrieving logs" -ForegroundColor Green
  $logs | Select-Object UserIds, Operations, CreationDate | Where-Object {$_.UserIds -in $AdminAccounts}
}

As always my suggestion would be to use your RMM system for monitoring these on a regular basis. We run the scripts every 30 minutes. You’ll have to modify them for your own RMM of course.
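
As a sketch of that kind of modification: inside the per-customer loop, the final filtered line of the break-glass script could be captured and turned into the same Healthy/Unhealthy value used elsewhere in this series, so your RMM has a single status to alert on:

  $BreakGlassLogons = $logs | Select-Object UserIds, Operations, CreationDate | Where-Object { $_.UserIds -like $BreakGlassUser }
  if (!$BreakGlassLogons) {
    write-host "Healthy - no breakglass logons found for $($customer.name)"
  }
  else {
    write-host "Unhealthy - breakglass account logons found for $($customer.name)"
    write-host ($BreakGlassLogons | Out-String)
  }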

And that’s it! As always, Happy PowerShelling.

Monitoring with PowerShell: Monitoring Onedrive Known Folder Move

So my clients are moving to a cloud-only model more and more each day. In most cases this means storing their data in Teams/OneDrive. We use the option for Known Folder Move (KFM). The known folders are the Desktop, Pictures, and Documents folders. To set up KFM we use the guide by Microsoft. Quite simply, we execute the following PowerShell commands via our RMM system.

$tenantID = "529ceae5-f210-42e8-9645-1234567" #TenantID GUID.
new-item -Path "HKLM:\SOFTWARE\Policies\Microsoft\OneDrive" -Force
New-ItemProperty -Path "HKLM:\SOFTWARE\Policies\Microsoft\OneDrive" -Name "KFMSilentOptIn" -Value $tenantID -Force
New-ItemProperty -Path "HKLM:\SOFTWARE\Policies\Microsoft\OneDrive" -Name "KFMSilentOptInWithNotification" -Value "1" -Force

For some reason or another the data does not always get moved and protected. To make sure that we can see what’s going on, I’ve decided to create the following monitoring set. This monitors whether KFM is enabled, and if it is enabled, whether the folders have actually been redirected.

$KFMSet = get-ItemProperty -Path "HKLM:\SOFTWARE\Policies\Microsoft\OneDrive" -Name "KFMSilentOptIn" -ErrorAction SilentlyContinue
if (!$KFMSet) {
    $KFMHealth = "KFM has not been set via registry" 
}
else {
$Desktop = (Get-ItemProperty -path "HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" -name "Desktop").Desktop
$Pictures = (Get-ItemProperty -path "HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" -name "My Pictures").'My pictures'
$Documents = (Get-ItemProperty -path "HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" -name "Personal").Personal

if($Desktop -notlike "$($ENV:OneDrive)*") { $KFMHealth = "Desktop is not set to Onedrive location." }
if($Pictures -notlike "$($ENV:OneDrive)*") { $KFMHealth = "Pictures is not set to Onedrive location." }
if($Documents -notlike "$($ENV:OneDrive)*") { $KFMHealth = "My Documents is not set to Onedrive location." }
}
if($KFMHealth -eq $null) { $KFMHealth = "Healthy" }

Now I can hear you thinking: “But Kelvin, this script runs in my RMM, and that does not have access to the current user.” Well, for those cases, use the following script, which impersonates the user for you.

$Source = @"
using System;  
using System.Runtime.InteropServices;

namespace murrayju.ProcessExtensions  
{
    public static class ProcessExtensions
    {
        #region Win32 Constants

        private const int CREATE_UNICODE_ENVIRONMENT = 0x00000400;
        private const int CREATE_NO_WINDOW = 0x08000000;

        private const int CREATE_NEW_CONSOLE = 0x00000010;

        private const uint INVALID_SESSION_ID = 0xFFFFFFFF;
        private static readonly IntPtr WTS_CURRENT_SERVER_HANDLE = IntPtr.Zero;

        #endregion

        #region DllImports

        [DllImport("advapi32.dll", EntryPoint = "CreateProcessAsUser", SetLastError = true, CharSet = CharSet.Ansi, CallingConvention = CallingConvention.StdCall)]
        private static extern bool CreateProcessAsUser(
            IntPtr hToken,
            String lpApplicationName,
            String lpCommandLine,
            IntPtr lpProcessAttributes,
            IntPtr lpThreadAttributes,
            bool bInheritHandle,
            uint dwCreationFlags,
            IntPtr lpEnvironment,
            String lpCurrentDirectory,
            ref STARTUPINFO lpStartupInfo,
            out PROCESS_INFORMATION lpProcessInformation);

        [DllImport("advapi32.dll", EntryPoint = "DuplicateTokenEx")]
        private static extern bool DuplicateTokenEx(
            IntPtr ExistingTokenHandle,
            uint dwDesiredAccess,
            IntPtr lpThreadAttributes,
            int TokenType,
            int ImpersonationLevel,
            ref IntPtr DuplicateTokenHandle);

        [DllImport("userenv.dll", SetLastError = true)]
        private static extern bool CreateEnvironmentBlock(ref IntPtr lpEnvironment, IntPtr hToken, bool bInherit);

        [DllImport("userenv.dll", SetLastError = true)]
        [return: MarshalAs(UnmanagedType.Bool)]
        private static extern bool DestroyEnvironmentBlock(IntPtr lpEnvironment);

        [DllImport("kernel32.dll", SetLastError = true)]
        private static extern bool CloseHandle(IntPtr hSnapshot);

        [DllImport("kernel32.dll")]
        private static extern uint WTSGetActiveConsoleSessionId();

        [DllImport("Wtsapi32.dll")]
        private static extern uint WTSQueryUserToken(uint SessionId, ref IntPtr phToken);

        [DllImport("wtsapi32.dll", SetLastError = true)]
        private static extern int WTSEnumerateSessions(
            IntPtr hServer,
            int Reserved,
            int Version,
            ref IntPtr ppSessionInfo,
            ref int pCount);

        #endregion

        #region Win32 Structs

        private enum SW
        {
            SW_HIDE = 0,
            SW_SHOWNORMAL = 1,
            SW_NORMAL = 1,
            SW_SHOWMINIMIZED = 2,
            SW_SHOWMAXIMIZED = 3,
            SW_MAXIMIZE = 3,
            SW_SHOWNOACTIVATE = 4,
            SW_SHOW = 5,
            SW_MINIMIZE = 6,
            SW_SHOWMINNOACTIVE = 7,
            SW_SHOWNA = 8,
            SW_RESTORE = 9,
            SW_SHOWDEFAULT = 10,
            SW_MAX = 10
        }

        private enum WTS_CONNECTSTATE_CLASS
        {
            WTSActive,
            WTSConnected,
            WTSConnectQuery,
            WTSShadow,
            WTSDisconnected,
            WTSIdle,
            WTSListen,
            WTSReset,
            WTSDown,
            WTSInit
        }

        [StructLayout(LayoutKind.Sequential)]
        private struct PROCESS_INFORMATION
        {
            public IntPtr hProcess;
            public IntPtr hThread;
            public uint dwProcessId;
            public uint dwThreadId;
        }

        private enum SECURITY_IMPERSONATION_LEVEL
        {
            SecurityAnonymous = 0,
            SecurityIdentification = 1,
            SecurityImpersonation = 2,
            SecurityDelegation = 3,
        }

        [StructLayout(LayoutKind.Sequential)]
        private struct STARTUPINFO
        {
            public int cb;
            public String lpReserved;
            public String lpDesktop;
            public String lpTitle;
            public uint dwX;
            public uint dwY;
            public uint dwXSize;
            public uint dwYSize;
            public uint dwXCountChars;
            public uint dwYCountChars;
            public uint dwFillAttribute;
            public uint dwFlags;
            public short wShowWindow;
            public short cbReserved2;
            public IntPtr lpReserved2;
            public IntPtr hStdInput;
            public IntPtr hStdOutput;
            public IntPtr hStdError;
        }

        private enum TOKEN_TYPE
        {
            TokenPrimary = 1,
            TokenImpersonation = 2
        }

        [StructLayout(LayoutKind.Sequential)]
        private struct WTS_SESSION_INFO
        {
            public readonly UInt32 SessionID;

            [MarshalAs(UnmanagedType.LPStr)]
            public readonly String pWinStationName;

            public readonly WTS_CONNECTSTATE_CLASS State;
        }

        #endregion

        // Gets the user token from the currently active session
        private static bool GetSessionUserToken(ref IntPtr phUserToken)
        {
            var bResult = false;
            var hImpersonationToken = IntPtr.Zero;
            var activeSessionId = INVALID_SESSION_ID;
            var pSessionInfo = IntPtr.Zero;
            var sessionCount = 0;

            // Get a handle to the user access token for the current active session.
            if (WTSEnumerateSessions(WTS_CURRENT_SERVER_HANDLE, 0, 1, ref pSessionInfo, ref sessionCount) != 0)
            {
                var arrayElementSize = Marshal.SizeOf(typeof(WTS_SESSION_INFO));
                var current = pSessionInfo;

                for (var i = 0; i < sessionCount; i++)
                {
                    var si = (WTS_SESSION_INFO)Marshal.PtrToStructure((IntPtr)current, typeof(WTS_SESSION_INFO));
                    current += arrayElementSize;

                    if (si.State == WTS_CONNECTSTATE_CLASS.WTSActive)
                    {
                        activeSessionId = si.SessionID;
                    }
                }
            }

            // If enumerating did not work, fall back to the old method
            if (activeSessionId == INVALID_SESSION_ID)
            {
                activeSessionId = WTSGetActiveConsoleSessionId();
            }

            if (WTSQueryUserToken(activeSessionId, ref hImpersonationToken) != 0)
            {
                // Convert the impersonation token to a primary token
                bResult = DuplicateTokenEx(hImpersonationToken, 0, IntPtr.Zero,
                    (int)SECURITY_IMPERSONATION_LEVEL.SecurityImpersonation, (int)TOKEN_TYPE.TokenPrimary,
                    ref phUserToken);

                CloseHandle(hImpersonationToken);
            }

            return bResult;
        }

        public static bool StartProcessAsCurrentUser(string appPath, string cmdLine = null, string workDir = null, bool visible = true)
        {
            var hUserToken = IntPtr.Zero;
            var startInfo = new STARTUPINFO();
            var procInfo = new PROCESS_INFORMATION();
            var pEnv = IntPtr.Zero;
            int iResultOfCreateProcessAsUser;

            startInfo.cb = Marshal.SizeOf(typeof(STARTUPINFO));

            try
            {
                if (!GetSessionUserToken(ref hUserToken))
                {
                    throw new Exception("StartProcessAsCurrentUser: GetSessionUserToken failed.");
                }

                uint dwCreationFlags = CREATE_UNICODE_ENVIRONMENT | (uint)(visible ? CREATE_NEW_CONSOLE : CREATE_NO_WINDOW);
                startInfo.wShowWindow = (short)(visible ? SW.SW_SHOW : SW.SW_HIDE);
                startInfo.lpDesktop = "winsta0\\default";

                if (!CreateEnvironmentBlock(ref pEnv, hUserToken, false))
                {
                    throw new Exception("StartProcessAsCurrentUser: CreateEnvironmentBlock failed.");
                }

                if (!CreateProcessAsUser(hUserToken,
                    appPath, // Application Name
                    cmdLine, // Command Line
                    IntPtr.Zero,
                    IntPtr.Zero,
                    false,
                    dwCreationFlags,
                    pEnv,
                    workDir, // Working directory
                    ref startInfo,
                    out procInfo))
                {
                    throw new Exception("StartProcessAsCurrentUser: CreateProcessAsUser failed.\n");
                }

                iResultOfCreateProcessAsUser = Marshal.GetLastWin32Error();
            }
            finally
            {
                CloseHandle(hUserToken);
                if (pEnv != IntPtr.Zero)
                {
                    DestroyEnvironmentBlock(pEnv);
                }
                CloseHandle(procInfo.hThread);
                CloseHandle(procInfo.hProcess);
            }
            return true;
        }
    }
}


"@

New-Item "C:\programdata\Microsoft OneDrive" -ItemType directory -Force -ErrorAction SilentlyContinue
Invoke-WebRequest -Uri 'https://raw.githubusercontent.com/rodneyviana/ODSyncService/master/Binaries/PowerShell/OneDriveLib.dll' -OutFile 'C:\programdata\Microsoft OneDrive\OneDriveLib.dll'

Add-Type -ReferencedAssemblies 'System', 'System.Runtime.InteropServices' -TypeDefinition $Source -Language CSharp 
$scriptblock = {
    $KFMSet = get-ItemProperty -Path "HKLM:\SOFTWARE\Policies\Microsoft\OneDrive" -Name "KFMSilentOptIn" -ErrorAction SilentlyContinue
    if (!$KFMSet) {
        $KFMHealth = "KFM has not been set via registry" 
    }
    else {
    $Desktop = (Get-ItemProperty -path "HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" -name "Desktop").Desktop
    $Pictures = (Get-ItemProperty -path "HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" -name "My Pictures").'My pictures'
    $Documents = (Get-ItemProperty -path "HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" -name "Personal").Personal
    
    if($Desktop -notlike "$($ENV:OneDrive)*") { $KFMHealth = "Desktop is not set to Onedrive location." }
    if($Pictures -notlike "$($ENV:OneDrive)*") { $KFMHealth = "Pictures is not set to Onedrive location." }
    if($Documents -notlike "$($ENV:OneDrive)*") { $KFMHealth = "My Documents is not set to Onedrive location." }
    }
    if($KFMHealth -eq $null) { $KFMHealth = "Healthy" }
    $KFMHealth | Out-File "C:\programdata\Microsoft OneDrive\OneDriveKFMLogging.txt" -Force
}

$bytes = [System.Text.Encoding]::Unicode.GetBytes($scriptblock)
$encodedCommand = [Convert]::ToBase64String($bytes)

[murrayju.ProcessExtensions.ProcessExtensions]::StartProcessAsCurrentUser("C:\Windows\System32\WindowsPowershell\v1.0\Powershell.exe", " -Encodedcommand $encodedCommand","C:\Windows\System32\WindowsPowershell\v1.0",$false)
start-sleep 2
$KFMHealth = get-content 'C:\programdata\Microsoft OneDrive\OneDriveKFMLogging.txt'
if($KFMHealth -eq $null) { $KFMHealth = "Script failed." }

And that’s it! An easy way to monitor whether Known Folder Move is working, using your RMM system. As always, Happy PowerShelling.

Documenting with PowerShell: Downloading and storing the Office 365 Audit logs (With search!)

As we’re continuing the Documenting with PowerShell series, I’d like to take a step away from our regular IT-Glue and documentation scripts and look at something that is related to documentation but also to the monitoring side of the house. We’re going to be checking out the Office 365 Unified Audit Log.

The unified audit log is the log to which all actions that you take in the O365 environment are written, which makes it a great resource for compliance, security, and finding those pesky hackers that are trying to break into our cool Office 365 environment. There are a couple of downsides to the Unified Audit Log;

  • Audit logs are only retained for 30 days if you have a business subscription, or 90 days when you have an enterprise subscription. Sometimes you need to go back to what happened more than 3 months ago.
  • Searching the audit log online via the Security and Compliance center is slow and does not show all results.
  • When exporting the results via the web interface, a maximum of 5000 audit log records is exported, meaning you would have to create 10 exports if you have 50,000 items logged.

Introducing the CyberDrain.com Auditlog HTML generator.

These issues are the reason I’ve made the following script – I wanted a way to search the audit log easily and have all records included. My script downloads all of the audit logs of the previous day and saves them as a CSV file. It also creates a completely searchable HTML file for ease of use. I’ve set this up to automatically download all the audit files each day to a secure location. That way, whenever I have to start digging into logs, I can easily find what I’m looking for.

The script

The script uses the secure application model to connect to office365. You can find instructions for the secure application model in this blog.

##########################################
$ApplicationId         = 'xxxx-xxxx-xxx-xxxx-xxxx'
$ApplicationSecret     = 'TheSecretTheSecrey' | Convertto-SecureString -AsPlainText -Force
$TenantID              = 'YourTenantID'
$RefreshToken          = 'RefreshToken'
$ExchangeRefreshToken  = 'ExchangeRefreshToken'
$upn                   = 'UPN-Used-To-Generate-Tokens'
##########################################
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 
Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken

$customers = Get-MsolPartnerContract -All
#Logged in. Moving on to creating folders and getting data.
$folderName = (Get-Date).tostring("dd-MM-yyyy")
$outputfolder = "C:\ScriptOutput"
new-item -Path $outputfolder -ItemType Directory -Name $folderName -Force
foreach ($customer in $customers) {
  $token = New-PartnerAccessToken -ApplicationId 'a0c73c16-a7e3-4564-9a95-2bdf47383716' -RefreshToken $ExchangeRefreshToken -Scopes 'https://outlook.office365.com/.default' -Tenant $customer.TenantId
  $tokenValue = ConvertTo-SecureString "Bearer $($token.AccessToken)" -AsPlainText -Force
  $credential = New-Object System.Management.Automation.PSCredential($upn, $tokenValue)
  $customerId = $customer.DefaultDomainName
  $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell-liveid?DelegatedOrg=$($customerId)&BasicAuthToOAuthConversion=true" -Credential $credential -Authentication Basic -AllowRedirection
  Import-PSSession $session -allowclobber -DisableNameChecking

  $startDate = (Get-Date).AddDays(-1)
  $endDate = (Get-Date)
  $Logs = @()
  Write-Host "Retrieving logs for $($customer.name)" -ForegroundColor Blue
  do {
    $logs += Search-unifiedAuditLog -SessionCommand ReturnLargeSet -SessionId $customer.name -ResultSize 5000 -StartDate $startDate -EndDate $endDate
    Write-Host "Retrieved $($logs.count) logs" -ForegroundColor Yellow
  }while ($Logs.count % 5000 -eq 0 -and $logs.count -ne 0)
  Write-Host "Finished Retrieving logs" -ForegroundColor Green
  $ObjLogs = foreach ($Log in $Logs) {
    $log.auditdata | convertfrom-json
  }
  $PreContent = @"
<H1> $($Customer.Name) - Audit Log from $StartDate until $EndDate </H1><br>

<br> Please note that this log is not complete - it is a representation where fields have been selected that are most commonly filtered on.<br>
To analyze the complete log for this day, please click here for the complete CSV file log: <a href="$($Customer.Name).csv">CSV Logbook</a>
<br/>
<br/>

<input type="text" id="myInput" onkeyup="myFunction()" placeholder="Search...">
"@ 
  $head = @"
<script>
function myFunction() {
    const filter = document.querySelector('#myInput').value.toUpperCase();
    const trs = document.querySelectorAll('table tr:not(.header)');
    trs.forEach(tr => tr.style.display = [...tr.children].find(td => td.innerHTML.toUpperCase().includes(filter)) ? '' : 'none');
  }</script>
<Title>Audit Log Report</Title>
<style>
body { background-color:#E5E4E2;
      font-family:Monospace;
      font-size:10pt; }
td, th { border:0px solid black; 
        border-collapse:collapse;
        white-space:pre; }
th { color:white;
    background-color:black; }
table, tr, td, th {
     padding: 2px; 
     margin: 0px;
     white-space:pre; }
tr:nth-child(odd) {background-color: lightgray}
table { width:95%;margin-left:5px; margin-bottom:20px; }
h2 {
font-family:Tahoma;
color:#6D7B8D;
}
.footer 
{ color:green; 
 margin-left:10px; 
 font-family:Tahoma;
 font-size:8pt;
 font-style:italic;
}
#myInput {
  background-image: url('https://www.w3schools.com/css/searchicon.png'); /* Add a search icon to input */
  background-position: 10px 12px; /* Position the search icon */
  background-repeat: no-repeat; /* Do not repeat the icon image */
  width: 50%; /* Full-width */
  font-size: 16px; /* Increase font-size */
  padding: 12px 20px 12px 40px; /* Add some padding */
  border: 1px solid #ddd; /* Add a grey border */
  margin-bottom: 12px; /* Add some space below the input */
}
</style>
"@
  #$ObjLogs
  $Logs | export-csv "$($outputfolder)\$($FolderName)\$($Customer.Name).csv" -NoTypeInformation
  $ObjLogs | Select-object CreationTime, UserID, Operation, ResultStatus, ClientIP, Workload, ClientInfoString, * -ErrorAction SilentlyContinue | Convertto-html -head $head -PreContent $PreContent | out-file "$($outputfolder)\$($FolderName)\$($customer.Name).html"
}
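
To collect the logs automatically each day, one option is a scheduled task on the machine that stores them. A minimal sketch, assuming the script above is saved as C:\Scripts\Get-AuditLogs.ps1 (the path and schedule are just examples):

$Action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Get-AuditLogs.ps1"
$Trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName "Collect O365 Audit Logs" -Action $Action -Trigger $Trigger -User "SYSTEM" -RunLevel Highest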

Monitoring with PowerShell: Monitoring users that are blocked for login

Hi guys. Today I only have a short blog – I’ve been busy this weekend with non-tech stuff like building a table for Dungeons and Dragons, which is why I’ve only had time to write a somewhat shorter blog than normal.

This one is based on a blog from last week – some users on Reddit asked if I could also create a monitoring set for blocked users. We’ve set up policies to make sure users are blocked after multiple failed logins, or after failing the second-factor authentication a couple of times. It’s best to monitor this preventively, so you can give the users a call and check if everything is functioning as it should.

The following script helps you with this.

##############################
$ApplicationId = 'XXXX-XXXX-XXXX-XXX-XXX'
$ApplicationSecret = 'YourApplicationSecret' | Convertto-SecureString -AsPlainText -Force
$TenantID = 'YourTenantID.Onmicrosoft.com'
$RefreshToken = 'VeryLongRefreshToken'
##############################
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 
Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
$customers = Get-MsolPartnerContract -All

$BlockedUserlist = foreach ($customer in $customers) {
    write-host "Getting Blocked users for $($Customer.name)" -ForegroundColor Green
    $BlockedUsers = Get-MsolUser -TenantId $($customer.TenantID) | Where-Object {$_.BlockCredential -eq $true}
    foreach($User in $BlockedUsers){ "$($user.UserPrincipalName) is blocked from logon." }
}
if(!$BlockedUserlist) {  $BlockedUserlist = "Healthy" } 

And that’s it! As always, Happy PowerShelling.

Monitoring with PowerShell: Monitoring Office 365 deleted users & License usage

I’ve been getting some requests to talk more about monitoring access and license management for Office 365. Some of you have asked how to be notified when users get deleted, or to get a notification right before a user is deleted permanently. Another question was on how to check if all licenses are assigned and you’re not wasting any resources or money on unused licenses. I’ve decided to blog about both 🙂

Monitoring deleted users

So the first one up is monitoring the deleted users – I understand monitoring this for a multitude of reasons. Imagine you’re distributing licenses and when a user gets deleted you need to update your billing systems, or imagine that you have a specific off-boarding procedure that needs to be kicked off when a user is deleted. I’ve created two cases: one to alert as soon as a deleted user has been found, another to alert when the user is about to be permanently deleted.

Monitoring all deleted users

##############################
$ApplicationId = 'XXXX-XXXX-XXXX-XXX-XXX'
$ApplicationSecret = 'YourApplicationSecret' | Convertto-SecureString -AsPlainText -Force
$TenantID = 'YourTenantID.Onmicrosoft.com'
$RefreshToken = 'VeryLongRefreshToken'
##############################
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 
Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
$customers = Get-MsolPartnerContract -All
$DeletedUserlist = foreach ($customer in $customers) {
    write-host "Getting Deleted users for $($Customer.name)" -ForegroundColor Green
    $DeletedUsers = Get-MsolUser -ReturnDeletedUsers -TenantId $($customer.TenantID)
    foreach($User in $DeletedUsers){ "$($user.UserPrincipalName) has been deleted on $($User.SoftDeletionTimestamp)" }
}
if(!$DeletedUserlist) {  $DeletedUserlist = "Healthy" }

Monitoring near permanent delete date.

##############################
$Daystomonitor = (Get-Date).AddDays(-28) #Alert when a user has been deleted for 28 days or more, shortly before permanent deletion (soft-deleted users are purged after 30 days).
##############################
$ApplicationId = 'XXXX-XXXX-XXXX-XXX-XXX'
$ApplicationSecret = 'YourApplicationSecret' | Convertto-SecureString -AsPlainText -Force
$TenantID = 'YourTenantID.Onmicrosoft.com'
$RefreshToken = 'VeryLongRefreshToken'
##############################
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 
Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
$customers = Get-MsolPartnerContract -All
$DeletedUserlist = foreach ($customer in $customers) {
    write-host "Getting Deleted users for $($Customer.name)" -ForegroundColor Green
    $DeletedUsers = Get-MsolUser -ReturnDeletedUsers -TenantId $($customer.TenantID) | Where-Object {$_.SoftDeletionTimestamp -lt $Daystomonitor}
    foreach ($User in $DeletedUsers) { "$($user.UserPrincipalName) has been deleted on $($User.SoftDeletionTimestamp)" }
}
if (!$DeletedUserlist) { $DeletedUserlist= "Healthy" }


 

To explain the scripts: the first script connects to each tenant in your Microsoft Partner portal, grabs all deleted users, and gives you a report of them. The second one does the same, but filters specifically on users that have been deleted for 28 days or more.

Monitoring unused licenses

So this is one that we are going to use internally after getting some requests from all of you – I think it’s a pretty smart idea to monitor whether licenses are in use, and if not, to alert on them. It helps to compare the administrative side of licensing to the actual cost of licensing.

##############################
$ApplicationId = 'XXXX-XXXX-XXXX-XXX-XXX'
$ApplicationSecret = 'YourApplicationSecret' | Convertto-SecureString -AsPlainText -Force
$TenantID = 'YourTenantID.Onmicrosoft.com'
$RefreshToken = 'VeryLongRefreshToken'
##############################
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 
Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
$customers = Get-MsolPartnerContract -All
$UnusedLicensesList = foreach ($customer in $customers) {
    write-host "Getting licenses $($customer.name)" -ForegroundColor Green
    $Licenses = Get-MsolAccountSku -TenantId $($customer.TenantId)
    foreach ($License in $Licenses) { 
        if ($License.ActiveUnits -gt $License.consumedUnits) { "$($customer.name) - $($License.AccountSkuId) has unassigned licenses available." }

    }
}
if (!$UnusedLicensesList) { $UnusedLicensesList = "Healthy" } 
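
If you also want the alert to show how many licenses are sitting unused, the difference between the purchased and consumed units can be included in the message – a small variation on the inner loop above:

foreach ($License in $Licenses) {
    #ActiveUnits is what has been purchased, ConsumedUnits is what has been assigned to users.
    $UnusedUnits = $License.ActiveUnits - $License.ConsumedUnits
    if ($UnusedUnits -gt 0) { "$($customer.name) - $($License.AccountSkuId) has $UnusedUnits unassigned license(s)." }
}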

And that’s it! A somewhat longer blog than normal, with multiple scripts, but I hope you’ve enjoyed it. As always, Happy PowerShelling.

Monitoring with PowerShell: Monitoring SQL server health

Seems like this is the week of SQL Server blogs! This time we’re going to cover monitoring SQL Server health. SQL Server health monitoring is important to keep all line-of-business applications in check and to make sure they perform well. We’ll be focused on monitoring the server, databases, and jobs.

We will use the same trick as we did in the last SQL post. We’re going to be using the SQL Server module called SQLPS, which loads a PSDrive to browse all databases and get the state of each database. So let’s get started!

The script

I’d like to take a moment to point out that this script offers only very basic monitoring. This is often enough for MSPs and non-DBA type administrators. If you want more extensive monitoring you should really look into the amazing dbatools module by Chrissy LeMaire and her team of amazing PowerShell admins/DBAs. 🙂

The script alerts on databases that are not in a normal state, that have a recovery model other than “Simple”, and where a database max size has been set. We also check whether the database is located on C:\. You might want to comment out any of these checks if you do not care about that setting. These are just some of the things that we look out for in our environments. It’s fairly straightforward to edit this script to monitor the backup dates instead (a sketch of that follows after the script), or to check whether the database has the correct collation.

import-module SQLPS
$Instances = Get-ChildItem "SQLSERVER:\SQL\$($ENV:COMPUTERNAME)"
foreach ($Instance in $Instances) {
    $databaseList = get-childitem "SQLSERVER:\SQL\$($ENV:COMPUTERNAME)\$($Instance.Displayname)\Databases"
    $SkipDatabases = @("Master","Model","ReportServer","SLDModel.SLDData")
    $Errors =  foreach ($Database in $databaselist | Where-Object {$_.Name -notin $SkipDatabases}) {
        if ($Database.status -ne "normal") {"$($Database.name) has the status: $($Database.status)" }
        if ($Database.RecoveryModel -ne "Simple") {  "$($Database.name) is in logging mode $($Database.RecoveryModel)" }
        if ($database.filegroups.files.MaxSize -ne "-1") { "$($Database.name) has a Max Size set." }
        if ($database.filegroups.files.filename -like "C:*") { "$($Database.name) is located on the C:\ drive." }
    }
}
if (!$errors) { $HealthState = "Healthy" } else { $HealthState = $Errors }  
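
As mentioned above, the same loop can be extended to watch backup age instead of, or in addition to, these settings. A sketch of a check that could be added inside the database loop, using the SMO LastBackupDate property:

        #Alert when the last full backup is older than 24 hours.
        if ($Database.LastBackupDate -lt (Get-Date).AddHours(-24)) { "$($Database.name) has not had a full backup in the last 24 hours. Last backup: $($Database.LastBackupDate)" }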

And that’s it! Of course, modify these scripts to your own environment and requirements. And as always, Happy PowerShelling.