Category Archives: Series: PowerShell Monitoring

Monitoring with PowerShell: Monitoring O365 alerts

So today we’re tackling two blogs in one again; we’re going to be focusing on two different types of alerting policies.

The first is getting alerts for M365 or Azure P1/P2 users via the Graph API. These alerts are pretty cool, as you can see if users are connecting from Tor nodes or VPN nodes, or from things like malware-linked IPs. Monitoring these alerts actively via your RMM helps you make sure that users are as safe as they can be.

Unfortunately, I know not everyone has the luxury of an M365 or P1 subscription. To help those users out we’re going to get the existing protection alerts for all tenants, and forward them from “TenantAdmins” to an actual e-mail address managed by you. So, let’s get started!

Monitoring Alerts with Graph

This script uses the Secure Application Model. You’ll also need to add some extra permissions to your Secure App. To do that, execute the following steps (a scripted alternative is sketched after the list):

  • Go to the Azure Portal.
  • Click on Azure Active Directory, now click on “App Registrations”.
  • Find your Secure App Model application. You can search based on the ApplicationID.
  • Go to “API Permissions” and click Add a permission.
  • Choose “Microsoft Graph” and “Application permission”.
  • Search for “Security” and click on “SecurityEvents.Read.All”. Click on add permission.
  • Search for “Security” and click on “SecurityEvents.ReadWrite.All”. Click on add permission.
  • Do the same for “Delegate Permissions”.
  • Finally, click on “Grant Admin Consent for Company Name”.
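
If you’d rather script this than click through the portal, the application permissions can also be granted with the Microsoft Graph PowerShell SDK. The sketch below is a hedged alternative, not part of the original walkthrough: it assumes the Microsoft.Graph module is installed, that you connect with rights to manage app role assignments, and that $ApplicationId holds your Secure App Model application ID. It only covers the application permissions; the delegated grant still happens in the portal.

# Sketch: grant the SecurityEvents application permissions to the Secure App Model app
Connect-MgGraph -Scopes 'Application.ReadWrite.All', 'AppRoleAssignment.ReadWrite.All'
$GraphSP = Get-MgServicePrincipal -Filter "appId eq '00000003-0000-0000-c000-000000000000'" # Microsoft Graph
$AppSP   = Get-MgServicePrincipal -Filter "appId eq '$ApplicationId'"                       # Your Secure App Model app
foreach ($Permission in 'SecurityEvents.Read.All', 'SecurityEvents.ReadWrite.All') {
    $AppRole = $GraphSP.AppRoles | Where-Object { $_.Value -eq $Permission }
    # Creating the app role assignment is the scripted equivalent of clicking "Grant Admin Consent"
    New-MgServicePrincipalAppRoleAssignment -ServicePrincipalId $AppSP.Id -PrincipalId $AppSP.Id -ResourceId $GraphSP.Id -AppRoleId $AppRole.Id
}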

After you’ve done this, execute the script below using your RMM; from there you’ll be able to see all open alerts. 🙂

######### Secrets #########
$ApplicationId = 'YourAppID'
$ApplicationSecret = 'YourAppSecret' | ConvertTo-SecureString -Force -AsPlainText
$RefreshToken = 'YourVeryLongRefreshToken'
$SkipList = "Bla.onmicrosoft.com", "Bla2.onmicrosoft.com"
######### Secrets #########


$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)


Try {
    $aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal 
    $graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default'
}
catch {
    write-DRRMAlert "Could not get tokens: $($_.Exception.Message)"
    exit 1
}
Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken

$customers = Get-MsolPartnerContract -All | Where-Object { $_.DefaultDomainName -notin $skiplist }

$OpenAlerts = foreach ($customer in $customers) {
    write-host "Getting started for $($Customer.name)" -foregroundcolor green
    try {
        $graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $customer.TenantID
        $Header = @{
            Authorization = "Bearer $($graphToken.AccessToken)"
        }
    }
    catch {
        "Could not connect to tenant $($customer.name). Error:  $($_.Exception.Message)"
        continue
    }
    (Invoke-RestMethod -Uri "https://graph.microsoft.com/beta/security/Alerts" -Headers $Header -Method Get -ContentType "application/json").value | Select-Object title,category, description, createddatetime,userstates,status, @{label="Username";expression={$_.userstates.userprincipalname}}

}

if(!$OpenAlerts){
    $OpenAlerts = "Healthy - No open alerts found"
}

$OpenAlerts 
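
If you only want your RMM to alert on alerts that are still open, you can filter on the status field before returning the results. A small optional addition; the status values below are the ones the beta security alerts endpoint typically returns, so adjust them to what you actually see in your output:

# Optional: only report alerts that have not been resolved yet
$ActiveAlerts = $OpenAlerts | Where-Object { $_.status -in 'newAlert', 'inProgress', 'unknown' }
if (!$ActiveAlerts) { $ActiveAlerts = "Healthy - No active alerts found" }
$ActiveAlerts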

Now let’s move on to the e-mail alerts.

Setting Protection Alerts

So the biggest problem with the default protection alerts is that they can only be sent to “TenantAdmins”, a group of administrators in the tenant. For Microsoft Partners this group often does not contain a licensed user.

To tackle this issue we’re going to grab all rules that reference the TenantAdmins group, copy each rule, and set the e-mail address to one of our choosing. We’re leaving the original rules intact because those might be used by local administrators or in co-managed environments.

######### Secrets #########
$ApplicationId         = 'ApplicationID'
$ApplicationSecret     = 'AppSecret' | ConvertTo-SecureString -Force -AsPlainText
$TenantID              = 'YourTenantID'
$RefreshToken          = 'RefreshToken'
$ExchangeRefreshToken = 'ExchangeRefreshToken'
$UPN = "YourUPN"
$AlertingEmail = "O365Alerts@YourDomain.com"
######### Secrets #########

$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
 
$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID
 
Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
$customers = Get-MsolPartnerContract -All
foreach ($customer in $customers) {
    $customerId = $customer.DefaultDomainName
    write-host "Processing $customerId"
    try {
        $token = New-PartnerAccessToken -ApplicationId 'a0c73c16-a7e3-4564-9a95-2bdf47383716' -RefreshToken $ExchangeRefreshToken -Scopes 'https://outlook.office365.com/.default'
        $tokenValue = ConvertTo-SecureString "Bearer $($token.AccessToken)" -AsPlainText -Force
        $credential = New-Object System.Management.Automation.PSCredential($upn, $tokenValue)
        $SccSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.compliance.protection.outlook.com/powershell-liveid?BasicAuthToOAuthConversion=true&DelegatedOrg=$($customerId)" -Credential $credential -AllowRedirection -Authentication Basic -erroraction stop -WarningAction SilentlyContinue
        $null = import-pssession $SccSession -disablenamechecking -allowclobber -CommandName "Get-ActivityAlert", "New-ActivityAlert", "Get-ProtectionAlert", "New-protectionalert", "Set-activityAlert", "Set-protectionalert"
    }
    catch {
        Write-Host "Could not get tokens or log in to session for $($customer.DefaultDomainName): $($_.Exception.Message)"
        continue
    }
    
    $ProtectionAlerts = Get-ProtectionAlert | Where-Object { $_.NotifyUser -eq "TenantAdmins" -and $_.disabled -eq $false }
    
    ForEach ($ProtectionAlert in $ProtectionAlerts) {
        $NewName = if ($ProtectionAlert.name.Length -gt 30) { "$($ProtectionAlert.name.substring(0,30)) - $($AlertingEmail)" } else { "$($ProtectionAlert.name) - $($AlertingEmail)" }
        $ExistingRule = Get-ProtectionAlert -id $NewName -ErrorAction "SilentlyContinue"
        if (!$ExistingRule) {
            $splat = @{
                name                = $NewName
                NotifyUser          = $AlertingEmail
                Operation           = $ProtectionAlert.Operation
                NotificationEnabled = $true
                Severity            = $ProtectionAlert.Severity
                Category            = $ProtectionAlert.Category
                Comment             = $ProtectionAlert.Comment
                threattype          = $ProtectionAlert.threattype
                AggregationType     = $ProtectionAlert.AggregationType
                Disabled            = $ProtectionAlert.Disabled
            }
            try {
              $null = New-ProtectionAlert @splat -ErrorAction Stop
            }
            catch {
                write-host "Could not create rule. Most likely no subscription available. Error: $($_.Exception.Message)"
            }
        }
        else {
            write-host "Rule exists, Moving on."
        }
    }

    Remove-PSSession $SccSession
 
}

And that’s it! As always, Happy PowerShelling! 🙂

Monitoring with PowerShell: Monitoring DNS forwarders

One of my friends recently told me that DNS forwarders might be a nice subject to talk about. He had an issue with his forwarders not responding and solved it with PowerShell.

DNS forwarders are often used to decrease the load on a network and to not rely on just root hints. Most of the time I suggest using the provider’s DNS servers as the forwarding addresses. There are cases where you’d like to use another server, for example when using DNS filtering.

In any case, you’ll always have to check if the DNS servers that you are forwarding to are actually resolving. If they aren’t, you’ll get name resolution issues for your clients, and that’s never a good time.

To monitor the DNS forwarders we’ll use two PowerShell commands: one to retrieve the DNS server forwarders and one to test whether those servers are actually available and resolving.

The script

So let’s break this script down, because it’s actually quite multi-functional.

$ForwarderAddresses = (get-dnsserverforwarder).ipaddress.ipaddresstostring
$DNSServerTest = foreach ($forwarder in $ForwarderAddresses) {
    [PSCustomObject]@{
        Forwarder = $forwarder
        ForwarderTest = (Test-DNSServer -context forwarder -IPAddress $Forwarder).result
        TCPTest = (Test-NetConnection -ComputerName $forwarder -Port 53).TcpTestSucceeded
        Resolutiontest = if($null -ne (Resolve-DnsName -server $forwarder -name "Google.com" -DnsOnly).ipaddress){ $true } else { $false }
    }
}

By running this script we’re creating an object that tells us which forwarders are currently configured, whether the internal Windows DNS server test is able to connect to them, whether we are able to connect over TCP, and whether DNS resolution works. We can then use this object to apply different types of monitoring. So let’s combine all of this information into one cool monitoring set.

$ExpectedForwarders = "1.1.1.1", "8.8.8.8", "8.8.4.4","9.9.9.9"

$ForwarderAddresses = (get-dnsserverforwarder).ipaddress.ipaddresstostring
$DNSServerTest = foreach ($forwarder in $ForwarderAddresses) {
    [PSCustomObject]@{
        Forwarder      = $forwarder
        ForwarderTest  = (Test-DNSServer -context forwarder -IPAddress $Forwarder).result
        TCPTest        = (Test-NetConnection -ComputerName $forwarder -Port 53).TcpTestSucceeded
        Resolutiontest = if ($null -ne (Resolve-DnsName -server $forwarder -name "Google.com" -DnsOnly).ipaddress) { $true } else { $false }
    }

}

foreach ($TestedForwarder in $DNSServerTest) {
    if ($TestedForwarder.forwarder -notin $ExpectedForwarders) { write-host "The expected forwarders don't match the configured forwarders: $($TestedForwarder.Forwarder)" }
    if ($TestedForwarder.ForwarderTest -ne "Success") { write-host "$($TestedForwarder.forwarder) has not passed the Forwarder Test" }
    if ($TestedForwarder.TCPTest -ne $true) { write-host "$($TestedForwarder.forwarder) has not passed the TCP Port Test" }
    if ($TestedForwarder.Resolutiontest -ne $true) { write-host "$($TestedForwarder.forwarder) has not passed the Resolution Test" }
}

And that’s it! As always, Happy PowerShelling!

Automating with PowerShell: Increasing the O365 Secure Score

At the start of this week I blogged about reading the Secure Score and documenting it. This is of course just one part of the new beta Secure Score module. The next one is actually the more fun part: applying the correct security settings to a tenant.

So first things first: the module is still rough around the edges and in beta. It’s mostly meant to demonstrate how you can tackle reaching a higher score, and actually help your clients reach a higher level of security.

The Secure Score is just a baseline of added-value items you can apply to each tenant. A description of each item can be found here. Don’t rely on just the Secure Score for your security needs.

So let’s get to increasing the secure score.

Examples

So imagine you want to apply all ‘low impact’ items to all tenants. This is stuff like having a break-glass account, not allowing OAuth approvals by users, not having password expiration policies, and setting up a DLP policy that prevents sending out credit card information. You’ll run:


######### Secrets #########
$ApplicationId         = 'ApplicationID'
$ApplicationSecret     = 'AppSecret'
$TenantID              = 'YourTenantID'
$RefreshToken          = 'RefreshToken'
$ExchangeRefreshToken = 'ExchangeRefreshToken'
$UPN = "YourUPN"
######### Secrets #########
Install-module SecureScore
set-securescore -AllTenants -ControlName LowImpact -upn $upn -ApplicationSecret $ApplicationSecret -ApplicationId $ApplicationId -RefreshToken $RefreshToken -ExchangeToken $ExchangeRefreshToken -Confirmed

This loops through all tenants, sets up all LowImpact items without confirmation, and poof. 🙂 But now imagine you are using external tooling for something like MFA enrollment; Microsoft doesn’t know and hasn’t given you the points for it, so let’s correct that:


######### Secrets #########
$ApplicationId         = 'ApplicationID'
$ApplicationSecret     = 'AppSecret'
$TenantID              = 'YourTenantID'
$RefreshToken          = 'RefreshToken'
$ExchangeRefreshToken = 'ExchangeRefreshToken'
$UPN = "YourUPN"
######### Secrets #########
Install-Module SecureScore
set-securescore -AllTenants -ControlName MFARegistrationV2 -upn $upn -ApplicationSecret $ApplicationSecret -ApplicationId $ApplicationId -RefreshToken $RefreshToken -ExchangeToken $ExchangeRefreshToken -Confirmed -ExternallyResolved

Using the parameter -ExternallyResolved you won’t apply the actual fix, and instead just tell Microsoft “Hey, we’ve solved this using another product. Please give us the points”. Pretty cool when you are using ADFS with your own MFA product, or just DUO or the like.

But imagine you’re not sure what an item does, and what effect it has on users. You can run the following command on a single tenant to get a little explanation:

set-securescore -TenantID "Sometenant.onmicrosoft.com" -ControlName AdminMFAV2 -upn $upn -ApplicationSecret $ApplicationSecret -ApplicationId $ApplicationId -RefreshToken $RefreshToken -ExchangeToken $ExchangeRefreshToken

Result:
WARNING: This will enable multi-factor authentication for all admin users, and prompt them at first logon to configure MFA. Do you want to continue?

So, there’s a lot of stuff to play with in this module and I’ll be adding a lot of functionality in the future for other payloads. I hope you guys enjoy it and if you have any issues, let me know! 🙂

And that’s it! As always, Happy PowerShelling!

Monitoring with PowerShell: Monitoring B-Series VM credits

A lot of MSPs use the B-series VMs for tasks, and why wouldn’t you? They are cheap VMs that allow you to use Azure as a cost-effective solution for clients. The B-series VMs are “burstable” VMs, meaning they don’t get the full CPU performance constantly.

Microsoft’s own description explains it best:

The B-series burstable VMs are ideal for workloads that do not need the full performance of the CPU continuously, like web servers, small databases and development and test environments. These workloads typically have burstable performance requirements. The B-Series provides these customers the ability to purchase a VM size with a price conscious baseline performance that allows the VM instance to build up credits when the VM is utilizing less than its base performance. When the VM has accumulated credit, the VM can burst above the VM’s baseline using up to 100% of the CPU when your application requires the higher CPU performance.

Microsoft – https://azure.microsoft.com/en-us/blog/introducing-b-series-our-new-burstable-vm-size/

This is of course great for smaller servers such as domain controllers, small RemoteApp machines or generic low performance VMs, but you do need to pay attention that you don’t run out of “credits” when performance is required.

So let’s start alerting on B-Series VMs that are running out of steam. Full disclosure and credit where it’s due: A part of this script was shared with me by Andrew Cullen of Lanter Technologies, thanks for that Andrew!

The script

This script checks all the subscriptions for VMs that are in the B-series. From there we check the current credits remaining and alert if those get under 90.

To fix this, you could temporarily upscale the VM to a larger series, which gives you new credits, or move tasks to different VMs if it happens a lot. This will most likely also help in the “why is my VM suddenly so slow” scenarios. 😉
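
As a rough illustration of that remediation, resizing a VM out of the B-series can also be done with the Az module. This is just a sketch with made-up resource names; the resource group, VM name and target size are assumptions, the target size has to be available on the cluster the VM runs on, and the resize restarts the VM:

# Sketch: resize a burstable VM to a non-burstable size (this reboots the VM)
$VM = Get-AzVM -ResourceGroupName 'MyResourceGroup' -Name 'MyBSeriesVM'
$VM.HardwareProfile.VmSize = 'Standard_D2s_v3'
Update-AzVM -VM $VM -ResourceGroupName 'MyResourceGroup'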

We’re using the secure application model and Azure Lighthouse for these tasks, as such you can load these scripts into any RMM system that is able to handle credentials securely.

######### Secrets #########
$ApplicationId = 'ApplicationID'
$ApplicationSecret = 'ApplicationSecret' | ConvertTo-SecureString -Force -AsPlainText
$TenantID = 'YourTenantID'
$RefreshToken = 'YourRefreshToken'
$UPN = "UPN-Used-To-Generate-Tokens"
######### Secrets #########

$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
$azureToken = New-PartnerAccessToken -ApplicationId $ApplicationID -Credential $credential -RefreshToken $refreshToken -Scopes 'https://management.azure.com/user_impersonation' -ServicePrincipal -Tenant $TenantId
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationID -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $TenantId
 
Connect-Azaccount -AccessToken $azureToken.AccessToken -GraphAccessToken $graphToken.AccessToken -AccountId $upn -TenantId $tenantID
$Subscriptions = Get-AzSubscription  | Where-Object { $_.State -eq 'Enabled' } | Sort-Object -Unique -Property Id
$VMCredits = foreach ($Sub in $Subscriptions) {
    write-host "Processing client $($sub.name)"
    $null = $Sub | Set-AzContext
    get-azvm -status | Where-Object { $_.HardwareProfile.VmSize -like "Standard_B*" -and $_.PowerState -like "*Running*" } | ForEach-Object {
        $Credits = (Get-AzMetric -ResourceId $_.Id -MetricName "CPU Credits Remaining").data.average | Where-Object { $_ -ne "" } | select-object -last 1
        [PSCustomObject]@{
            VMName           = $_.Name
            CreditsRemaining = $Credits
            Subscription     = $sub.name
        }
    }
}

foreach ($VMCredit in $VMCredits | where-object { $_.CreditsRemaining -lt "90" }) {
    write-host "$($VMCredit.VMname) has $($VMCredit.CreditsRemaining) credits remaining"
}

And that’s it! I hope this helps in tackling those pesky performance issues when using the cheaper VMs in Azure. As always, Happy PowerShelling

Monitoring with PowerShell: O365 location alerts

A while back someone asked me to convert a script they had to the Secure Application Model. This specific script checked the Office 365 audit log IPs against an online database of locations. I declined at first and suggested looking into Microsoft 365, or a P1 or P2 subscription, which allows you to do this natively.

The reader came back to me recently asking once more, and gave an explanation of why P1/P2 or M365 was not possible in her case. I understand that sometimes you have to make do with what you have, and I figured others might be in the same boat.

So this script is made to monitor the O365 unified audit log and to compare the IP addresses against an online database. The database she used previously was very rate-limited and as such not really suitable. Not a lot of people know that there is actually a great, 100% free online lookup service for IPs at https://ip2c.org/. The script uses that database as there are no API limitations. 🙂
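
If you want to get a feel for the lookup before running the full script, you can query ip2c.org for a single address. The service returns a semicolon-separated string, which is also how the main script below parses it (country code at index 1, country name at index 3):

# Quick test of the ip2c.org lookup used in the script below
$IpResult = (Invoke-RestMethod -Method Get -Uri "https://ip2c.org/8.8.8.8") -split ';'
[PSCustomObject]@{
    CountryCode = $IpResult[1]
    Country     = $IpResult[3]
}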

As always, I’d like to note that this should just be one layer in your entire security model and you should not put all your faith in this. Enable MFA, and keep good security hygiene.

The Script

You’ll need the Secure Application Model for this script. The script currently checks all your tenants except the ones you state in “$Skiplist”.

######### Secrets #########
$ApplicationId = 'ApplicationID'
$ApplicationSecret = 'ApplicationSecret' | ConvertTo-SecureString -Force -AsPlainText
$TenantID = 'TenantID'
$RefreshToken = 'VeryLongRefreshToken'
$ExchangeRefreshToken = 'LongExchangeToken'
$UPN = "UPN-User-To-Generate-IDs"
######### Secrets #########

$AllowedCountries = @('Belgium', 'Netherlands', 'Germany', 'United Kingdom')
$Skiplist = @("bla1.onmicrosoft.com", "bla2.onmicrosoft.com", "bla2.onmicrosoft.com")

$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 
Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken

$customers = Get-MsolPartnerContract -All | Where-Object { $_.DefaultDomainName -notin $SkipList }

$StrangeLocations = foreach ($customer in $customers) {
    Write-Host "Getting logon location details for $($customer.Name)" -ForegroundColor Green
    $token = New-PartnerAccessToken -ApplicationId 'a0c73c16-a7e3-4564-9a95-2bdf47383716' -RefreshToken $ExchangeRefreshToken -Scopes 'https://outlook.office365.com/.default' -Tenant $customer.TenantId
    $tokenValue = ConvertTo-SecureString "Bearer $($token.AccessToken)" -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential($upn, $tokenValue)
    $customerId = $customer.DefaultDomainName
    $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell-liveid?DelegatedOrg=$($customerId)&BasicAuthToOAuthConversion=true" -Credential $credential -Authentication Basic -AllowRedirection
    $null = Import-PSSession $session -CommandName Search-UnifiedAuditLog -AllowClobber
 

    $startDate = (Get-Date).AddDays(-1)
    $endDate = (Get-Date)
    Write-Host "Retrieving logs for $($customer.name)" -ForegroundColor Blue
    $logs = do {
        $log = Search-unifiedAuditLog -SessionCommand ReturnLargeSet -SessionId $customer.name -ResultSize 5000 -StartDate $startDate -EndDate $endDate -Operations UserLoggedIn
        Write-Host "Retrieved $($log.count) logs" -ForegroundColor Yellow
        $log
    } while ($Log.count % 5000 -eq 0 -and $log.count -ne 0)
    Write-Host "Finished Retrieving logs" -ForegroundColor Green
 
    $userIds = $logs.userIds | Sort-Object -Unique

    $LocationMonitoring = foreach ($userId in $userIds) {
 
        $searchResult = ($logs | Where-Object { $_.userIds -contains $userId }).auditdata | ConvertFrom-Json -ErrorAction SilentlyContinue
        $ips = $searchResult.clientip | Sort-Object -Unique
        foreach ($ip in $ips) {
            $IsIp = ($ip -as [ipaddress]) -as [bool]
            if ($IsIp) { $ipresult = (Invoke-restmethod -method get -uri "https://ip2c.org/$($ip)") -split ';' }
            [PSCustomObject]@{
                user              = $userId
                IP                = $ip
                Country           = ($ipresult | Select-Object -index 3)
                CountryCode       = ($ipresult | Select-Object -Index 1)
                Company           = $customer.Name
                TenantID          = $customer.tenantID
                DefaultDomainName = $customer.DefaultDomainName
            }
           
        }

    }
    foreach ($Location in $LocationMonitoring) {
        if ($Location.country -notin $AllowedCountries) { $Location }
    }
}
if (!$StrangeLocations) {
    $StrangeLocations = 'Healthy'
}

$StrangeLocations

And that’s it! Now you’re monitoring the locations from where users are logging on. As always, Happy PowerShelling!

Monitoring with PowerShell: Host isolation

So 2 weeks ago we talked about PowerShell canaries and using them as an early warning system. I got a couple of questions about the follow-up, especially the host isolation. There are a lot of tricks that you can use to isolate a host from others.

The version below allows you to isolate a machine and to de-isolate it, re-integrating the OS into normal operation. To isolate the machine we use the Windows Firewall to block all outbound traffic, and we stop both the ‘Workstation’ and ‘Server’ services. This means the machine can no longer use shares or access network resources.

We also disable DNS by pointing it at a fake server and clearing the DNS cache. To make sure we can still remotely access the machine via our RMM or other tools, we add those hosts to the hosts file and create an allow rule in the firewall.

All of these tasks can be reverted by changing the $isolation variable to $false. Don’t forget to add your own tooling to the list of allowed hosts.

The Script

$AllowedHosts = "google.com", "YourSuperDuperTurboRMM.com", '1.2.3.4'
$isolation = $false


write-host "Checking all IPs for hosts"
$ConvertedHosts = foreach ($Remotehost in $AllowedHosts) {
    $IsIp = ($RemoteHost -as [ipaddress]) -as [bool]
    if ($IsIp) {
        $ipList = $Remotehost
    }
    else {
        $IPList = (Resolve-DnsName $Remotehost).ip4address
    }
    Foreach ($IP in $IPList) {
        [PSCustomObject]@{
            Hostname = $Remotehost
            IP       = $IP
        }
    }
}


if ($isolation) {
    write-host "Checking if Windows firewall is enabled" -ForegroundColor Green
    $DisabledProfiles = Get-NetFirewallProfile | Where-Object { $_.Enabled -ne $true }
    if (!$DisabledProfiles) {
        write-host "Windows firewall is enabled. Moving onto next task" -ForegroundColor Green
    }
    else {
        Write-Host "Windows Firewall is not enabled on all profiles. Enabling for extra isolation" -ForegroundColor Yellow
        $DisabledProfiles | Set-NetFirewallProfile -Enabled:True
    }
    write-host "Preparing Windows Firewall isolation rule" -ForegroundColor Green

    $ExistingRule = Get-NetFirewallRule -DisplayName "ISOLATION: Allowed Hosts" -ErrorAction SilentlyContinue
    if ($ExistingRule) {
        write-host "Setting existing Windows Firewall isolation rule" -ForegroundColor Green
        Get-NetFirewallRule -Direction Outbound | Set-NetFirewallRule -Enabled:False
        set-NetFirewallRule -Direction Outbound -Enabled:True -Action Allow -RemoteAddress $ConvertedHosts.IP -DisplayName "ISOLATION: Allowed Hosts"
        get-netfirewallprofile | Set-NetFirewallProfile -DefaultOutboundAction Block
    }
    else {
        write-host "Creating Firewall isolation rule" -ForegroundColor Green
        Get-NetFirewallRule -Direction Outbound | Set-NetFirewallRule -Enabled:False
        New-NetFirewallRule -Direction Outbound -Enabled:True -Action Allow -RemoteAddress $ConvertedHosts.IP -DisplayName "ISOLATION: Allowed Hosts"
        get-netfirewallprofile | Set-NetFirewallProfile -DefaultOutboundAction Block
    }
    write-host "Adding list of hostnames to host file" -ForegroundColor Green
    foreach ($HostEntry in $ConvertedHosts) {
        Add-Content -Path "$($ENV:windir)/system32/drivers/etc/hosts" -Value "`n$($HostEntry.IP)`t`t$($HostEntry.Hostname)"
        start-sleep -Milliseconds 200
    }
    write-host 'Setting DNS to a static server that does not exist' -ForegroundColor Green
    Get-dnsclientserveraddress | Set-DnsClientServerAddress -ServerAddresses 127.0.0.127
    write-host "Clearing DNS cache" -ForegroundColor Green
    Clear-DnsClientCache
    write-host "Stopping 'client' and 'server' service. and setting to disabled" -ForegroundColor Green

    stop-service -name 'Workstation' -Force
    get-service -name 'Workstation' | Set-Service -StartupType Disabled
    stop-service -name 'Server' -Force 
    get-service -name 'server' | Set-Service -StartupType Disabled

    write-host 'Isolation performed. To undo these actions, please run the script with $Isolation set to false' -ForegroundColor Green
}
else {
    write-host "Undoing isolation process." -ForegroundColor Green
    write-host "Setting existing Windows Firewall isolation rule to allow traffic" -ForegroundColor Green
    Get-NetFirewallRule -Direction Outbound | Set-NetFirewallRule -Enabled:True
    Remove-NetFirewallRule -DisplayName "ISOLATION: Allowed Hosts" -ErrorAction SilentlyContinue
    get-netfirewallprofile | Set-NetFirewallProfile -DefaultOutboundAction Allow

    write-host "Removing list of hostnames from host file" -ForegroundColor Green
    foreach ($HostEntry in $ConvertedHosts) {
        $HostFile = Get-Content "$($ENV:windir)/system32/drivers/etc/hosts"
        $NewHostFile = $HostFile -replace "`n$($HostEntry.IP)`t`t$($HostEntry.Hostname)", ''
        Set-Content -Path "$($ENV:windir)/system32/drivers/etc/hosts" -Value $NewHostFile
        start-sleep -Milliseconds 200
    }
    write-host "Clearing DNS cache" -ForegroundColor Green
    Clear-DnsClientCache
    write-host "Setting DNS back to DHCP" -ForegroundColor Green
    Get-dnsclientserveraddress | Set-DnsClientServerAddress -ResetServerAddresses
    write-host "Starting 'Workstation' and 'server' service. and setting to disabled" -ForegroundColor Green
    get-service -name 'Workstation' | Set-Service -StartupType Automatic
    start-service 'Workstation'
    get-service -name 'server' | Set-Service -StartupType Automatic
    start-service 'Server'
    write-host 'Undo Isolation performed. To re-isolate, run the script with the $Isolation parameter set to true.' -ForegroundColor Green
}

And that’s it! This is of course just an example of how to isolate a host from others in case you want to; you should modify it to fit your environment.

As always, happy PowerShelling!

Monitoring with PowerShell: Notifying users of Windows Updates

With my recently released RunAsUser module there’s been an influx of questions on what it could be used for. I’ve tried to describe as much as possible on the GitHub page and in the previous blog about it. But one thing I wanted to talk about real quick is the ability to create toast notifications.

Toast notifications are those little OS-native notifications you see in the bottom right of your screen when receiving an e-mail. Our RMM system has the ability to create a notification using an application, but to be honest that notification looks like it came straight out of 1990.

To have a bit better user experience, and to also get the ability to do specific things with user input, I’ve decided to use BurntToast. BurntToast is a module that gives you the ability to generate pretty toast messages with just a couple of lines of code. Brilliant, really!
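
To give you an idea of how little code that is, here’s a minimal example (assuming the module is already installed; the message text is just an illustration):

# Minimal BurntToast example: a simple toast with two lines of text
New-BurntToastNotification -Text "Message from IT", "This is what a basic toast notification looks like."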

Combining my RunAsUser module and BurntToast, we’re able to send a script to the currently logged-on user’s session and get full functionality in there. One example is to reboot the computer after updates. So let’s get going!

The script

The following script can be used to create a toast for reboots. It creates a ‘protocol handler’, and then toasts with a nice GIF of my logo to get the user’s attention. The script assumes you have trusted the PSGallery beforehand.

#Checking if ToastReboot:// protocol handler is present
New-PSDrive -Name HKCR -PSProvider Registry -Root HKEY_CLASSES_ROOT -erroraction silentlycontinue | out-null
$ProtocolHandler = get-item 'HKCR:\ToastReboot' -erroraction 'silentlycontinue'
if (!$ProtocolHandler) {
    #create handler for reboot
    New-item 'HKCR:\ToastReboot' -force
    set-itemproperty 'HKCR:\ToastReboot' -name '(DEFAULT)' -value 'url:ToastReboot' -force
    set-itemproperty 'HKCR:\ToastReboot' -name 'URL Protocol' -value '' -force
    new-itemproperty -path 'HKCR:\ToastReboot' -propertytype dword -name 'EditFlags' -value 2162688
    New-item 'HKCR:\ToastReboot\Shell\Open\command' -force
    set-itemproperty 'HKCR:\ToastReboot\Shell\Open\command' -name '(DEFAULT)' -value 'C:\Windows\System32\shutdown.exe -r -t 00' -force
}

Install-Module -Name BurntToast
Install-module -Name RunAsUser
invoke-ascurrentuser -scriptblock {

    $heroimage = New-BTImage -Source 'https://media.giphy.com/media/eiwIMNkeJ2cu5MI2XC/giphy.gif' -HeroImage
    $Text1 = New-BTText -Content  "Message from IT"
    $Text2 = New-BTText -Content "Your IT provider has installed updates on your computer at $(get-date). Please select if you'd like to reboot now, or snooze this message."
    $Button = New-BTButton -Content "Snooze" -snooze -id 'SnoozeTime'
    $Button2 = New-BTButton -Content "Reboot now" -Arguments "ToastReboot:" -ActivationType Protocol
    $5Min = New-BTSelectionBoxItem -Id 5 -Content '5 minutes'
    $10Min = New-BTSelectionBoxItem -Id 10 -Content '10 minutes'
    $1Hour = New-BTSelectionBoxItem -Id 60 -Content '1 hour'
    $4Hour = New-BTSelectionBoxItem -Id 240 -Content '4 hours'
    $1Day = New-BTSelectionBoxItem -Id 1440 -Content '1 day'
    $Items = $5Min, $10Min, $1Hour, $4Hour, $1Day
    $SelectionBox = New-BTInput -Id 'SnoozeTime' -DefaultSelectionBoxItemId 10 -Items $Items
    $action = New-BTAction -Buttons $Button, $Button2 -inputs $SelectionBox
    $Binding = New-BTBinding -Children $text1, $text2 -HeroImage $heroimage
    $Visual = New-BTVisual -BindingGeneric $Binding
    $Content = New-BTContent -Visual $Visual -Actions $action
    Submit-BTNotification -Content $Content
}

And that’s it! You must be wondering how it looks, so let’s show you that too!

And that’s it! As always, Happy PowerShelling! 🙂

Automating with PowerShell: Impersonating users while running as SYSTEM

I’ve demonstrated in a couple of blogs like the OneDrive Sync Monitoring and the OneDrive File Monitoring that it’s possible to impersonate the current user when a script is actually started by the NT AUTHORITY\SYSTEM account.

My friends asked me whether it would be possible for other scripts to use the same approach. In the previous blogs I showed that by loading the component by murrayju we got the ability to impersonate. I’ve converted this into a module, which you can find at https://github.com/KelvinTegelaar/RunAsUser.

This module allows you to take any script that is initiated by SYSTEM and execute it as the currently logged-on user. This gives us a lot of freedom. Most RMM systems (and Intune!) don’t allow monitoring under the currently logged-on user, which often means you have to work around accessing resources directly in their profile.

Some examples would be accessing installers that run in the user’s AppData folder, or registry items created under HKCU. Another would be scripts that need to access shared drives or printers that are only mapped in user space.

This is also super useful for Intune scripts, because sometimes you just need to present things to the user or install things using their credentials directly.

Using the module

So, using the module is very straightforward. To install it, execute the following command:

install-module RunAsUser

After you’ve installed the module you can jump straight into scripting. There are some things to account for: the script requires SYSTEM credentials or the SeDelegateSessionUserImpersonatePrivilege privilege.

The second thing is that the output can’t be captured directly. If you want output from the script, you’ll have to write it to a file and pick that up again in the SYSTEM session. This might sound a little confusing, so I have an example below.
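
Before we get to that example: since the module needs to run from the SYSTEM context (or with that privilege), a quick sanity check like the sketch below can save some head-scratching.

# Quick sanity check before calling Invoke-AsCurrentUser
$Identity = [System.Security.Principal.WindowsIdentity]::GetCurrent()
if (-not $Identity.IsSystem) {
    write-host "Not running as NT AUTHORITY\SYSTEM (current user: $($Identity.Name)). Invoke-AsCurrentUser will most likely fail."
}

With that out of the way, here’s the example: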

$scriptblock = {
    $IniFiles = Get-ChildItem "$ENV:LOCALAPPDATA\Microsoft\OneDrive\settings\Business1" -Filter 'ClientPolicy*' -ErrorAction SilentlyContinue

    if (!$IniFiles) {
        write-host 'No Onedrive configuration files found. Stopping script.'
        exit 1
    }

    $SyncedLibraries = foreach ($inifile in $IniFiles) {
        $IniContent = get-content $inifile.fullname -Encoding Unicode
        [PSCustomObject]@{
            'Item Count' = ($IniContent | Where-Object { $_ -like 'ItemCount*' }) -split '= ' | Select-Object -last 1
            'Site Name'  = ($IniContent | Where-Object { $_ -like 'SiteTitle*' }) -split '= ' | Select-Object -last 1
            'Site URL'   = ($IniContent | Where-Object { $_ -like 'DavUrlNamespace*' }) -split '= ' | Select-Object -last 1
        }
    }
    $SyncedLibraries | ConvertTo-Json | Out-File 'C:\programdata\Microsoft OneDrive\OneDriveLibraries.txt'
}
try {
    Invoke-AsCurrentUser -scriptblock $scriptblock
}
catch {
    write-error "Something went wrong"
}
start-sleep 2 #Sleeping 2 seconds to allow the script block to write to disk.
$SyncedLibraries = (get-content "C:\programdata\Microsoft OneDrive\OneDriveLibraries.txt" | convertfrom-json)
if (($SyncedLibraries.'Item count' | Measure-Object -Sum).sum -gt '280000') {
    write-host "Unhealthy - Currently syncing more than 280k files. Please investigate."
    $SyncedLibraries
}
else {
    write-host "Healthy - Syncing less than 280k files."
}

In the script, we’re executing the script block using the Invoke-AsCurrentUser command. This runs that entire block of code as the currently logged-on user. We then sleep for 2 seconds, allowing the script block to finish writing to disk. After that finishes, we pick up the file again under the SYSTEM account and process the results.
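
Stripped down to its essence, that round trip looks something like the rough sketch below. The file path is just an example, and note that variables from the SYSTEM session are not available inside the script block, because it runs in a separate process:

# Sketch of the SYSTEM <-> user round trip: the user context writes a file, SYSTEM reads it back
$ResultFile = "$ENV:ProgramData\UserContextResult.json"
Invoke-AsCurrentUser -ScriptBlock {
    # Everything in here runs as the logged-on user; persist the results to a known path
    [PSCustomObject]@{ UserProfile = $ENV:USERPROFILE } | ConvertTo-Json | Out-File "$ENV:ProgramData\UserContextResult.json"
}
Start-Sleep -Seconds 2 # give the user-context process time to finish writing
Get-Content $ResultFile | ConvertFrom-Json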

So in short; using this module opens up a lot of user-based monitoring for systems that normally only allow executing under the SYSTEM account. Hopefully this helps people solve some challenges.

As a closing remark I’d like to thank Ben Reader (@Powers_hell) for his help on the module. He assisted in cleaning up the code right after release, making it all look and feel a lot smoother and he assisted in better error handling. Thanks Buddy! 🙂

As always, Happy PowerShelling.

Monitoring with PowerShell: Monitoring the Onedrive client limitations

I wrote about monitoring the OneDrive sync status some time ago. That blog gained a lot of popularity. Implementing it allows you to monitor OneDrive, which runs in user mode, from your RMM, which often runs under SYSTEM.

After a while we noticed that we bumped into issues that weren’t getting captured by monitoring just the sync state. OneDrive tends to get performance issues when syncing more than 300k files; this also counts for Files On-Demand. There are some other limitations, but the best write-up about this is here. Large libraries just tend to make things a little screwy.

To tackle this you can use the following script. It gets all the synced sites, lists the item count, checks whether the sum is higher than 280k, and alerts if that is true. This helped us tackle performance problems and nip them in the bud pretty early on. 🙂

$IniFiles = Get-ChildItem "$ENV:LOCALAPPDATA\Microsoft\OneDrive\settings\Business1" -Filter 'ClientPolicy*' -ErrorAction SilentlyContinue

if (!$IniFiles) {
    write-host "No Onedrive configuration files found. Stopping script."
    exit 1
}

$SyncedLibraries = foreach ($inifile in $IniFiles) {
    $IniContent = get-content $inifile.fullname -Encoding Unicode
    [PSCustomObject]@{
        'Item Count' = ($IniContent | Where-Object { $_ -like 'ItemCount*' }) -split '= ' | Select-Object -last 1
        'Site Name'  = ($IniContent | Where-Object { $_ -like 'SiteTitle*' }) -split '= ' | Select-Object -last 1
        'Site URL'   = ($IniContent | Where-Object { $_ -like 'DavUrlNamespace*' }) -split '= ' | Select-Object -last 1
    }
}

if (($SyncedLibraries.'Item count' | Measure-Object -Sum).sum -gt '280000') { 
    write-host "Unhealthy - Currently syncing more than 280k files. Please investigate." 
    $SyncedLibraries
}
else {
    write-host "Healthy - Syncing less than 280k files."
}

Running this script directly often doesn’t work, because we’ll bump into the same issue we have with the OneDrive monitoring script. To fix that, use the version below. This impersonates the current user. Remember that this should run as NT AUTHORITY\SYSTEM.

$Source = @"
using System;  
using System.Runtime.InteropServices;

namespace murrayju.ProcessExtensions  
{
    public static class ProcessExtensions
    {
        #region Win32 Constants

        private const int CREATE_UNICODE_ENVIRONMENT = 0x00000400;
        private const int CREATE_NO_WINDOW = 0x08000000;

        private const int CREATE_NEW_CONSOLE = 0x00000010;

        private const uint INVALID_SESSION_ID = 0xFFFFFFFF;
        private static readonly IntPtr WTS_CURRENT_SERVER_HANDLE = IntPtr.Zero;

        #endregion

        #region DllImports

        [DllImport("advapi32.dll", EntryPoint = "CreateProcessAsUser", SetLastError = true, CharSet = CharSet.Ansi, CallingConvention = CallingConvention.StdCall)]
        private static extern bool CreateProcessAsUser(
            IntPtr hToken,
            String lpApplicationName,
            String lpCommandLine,
            IntPtr lpProcessAttributes,
            IntPtr lpThreadAttributes,
            bool bInheritHandle,
            uint dwCreationFlags,
            IntPtr lpEnvironment,
            String lpCurrentDirectory,
            ref STARTUPINFO lpStartupInfo,
            out PROCESS_INFORMATION lpProcessInformation);

        [DllImport("advapi32.dll", EntryPoint = "DuplicateTokenEx")]
        private static extern bool DuplicateTokenEx(
            IntPtr ExistingTokenHandle,
            uint dwDesiredAccess,
            IntPtr lpThreadAttributes,
            int TokenType,
            int ImpersonationLevel,
            ref IntPtr DuplicateTokenHandle);

        [DllImport("userenv.dll", SetLastError = true)]
        private static extern bool CreateEnvironmentBlock(ref IntPtr lpEnvironment, IntPtr hToken, bool bInherit);

        [DllImport("userenv.dll", SetLastError = true)]
        [return: MarshalAs(UnmanagedType.Bool)]
        private static extern bool DestroyEnvironmentBlock(IntPtr lpEnvironment);

        [DllImport("kernel32.dll", SetLastError = true)]
        private static extern bool CloseHandle(IntPtr hSnapshot);

        [DllImport("kernel32.dll")]
        private static extern uint WTSGetActiveConsoleSessionId();

        [DllImport("Wtsapi32.dll")]
        private static extern uint WTSQueryUserToken(uint SessionId, ref IntPtr phToken);

        [DllImport("wtsapi32.dll", SetLastError = true)]
        private static extern int WTSEnumerateSessions(
            IntPtr hServer,
            int Reserved,
            int Version,
            ref IntPtr ppSessionInfo,
            ref int pCount);

        #endregion

        #region Win32 Structs

        private enum SW
        {
            SW_HIDE = 0,
            SW_SHOWNORMAL = 1,
            SW_NORMAL = 1,
            SW_SHOWMINIMIZED = 2,
            SW_SHOWMAXIMIZED = 3,
            SW_MAXIMIZE = 3,
            SW_SHOWNOACTIVATE = 4,
            SW_SHOW = 5,
            SW_MINIMIZE = 6,
            SW_SHOWMINNOACTIVE = 7,
            SW_SHOWNA = 8,
            SW_RESTORE = 9,
            SW_SHOWDEFAULT = 10,
            SW_MAX = 10
        }

        private enum WTS_CONNECTSTATE_CLASS
        {
            WTSActive,
            WTSConnected,
            WTSConnectQuery,
            WTSShadow,
            WTSDisconnected,
            WTSIdle,
            WTSListen,
            WTSReset,
            WTSDown,
            WTSInit
        }

        [StructLayout(LayoutKind.Sequential)]
        private struct PROCESS_INFORMATION
        {
            public IntPtr hProcess;
            public IntPtr hThread;
            public uint dwProcessId;
            public uint dwThreadId;
        }

        private enum SECURITY_IMPERSONATION_LEVEL
        {
            SecurityAnonymous = 0,
            SecurityIdentification = 1,
            SecurityImpersonation = 2,
            SecurityDelegation = 3,
        }

        [StructLayout(LayoutKind.Sequential)]
        private struct STARTUPINFO
        {
            public int cb;
            public String lpReserved;
            public String lpDesktop;
            public String lpTitle;
            public uint dwX;
            public uint dwY;
            public uint dwXSize;
            public uint dwYSize;
            public uint dwXCountChars;
            public uint dwYCountChars;
            public uint dwFillAttribute;
            public uint dwFlags;
            public short wShowWindow;
            public short cbReserved2;
            public IntPtr lpReserved2;
            public IntPtr hStdInput;
            public IntPtr hStdOutput;
            public IntPtr hStdError;
        }

        private enum TOKEN_TYPE
        {
            TokenPrimary = 1,
            TokenImpersonation = 2
        }

        [StructLayout(LayoutKind.Sequential)]
        private struct WTS_SESSION_INFO
        {
            public readonly UInt32 SessionID;

            [MarshalAs(UnmanagedType.LPStr)]
            public readonly String pWinStationName;

            public readonly WTS_CONNECTSTATE_CLASS State;
        }

        #endregion

        // Gets the user token from the currently active session
        private static bool GetSessionUserToken(ref IntPtr phUserToken)
        {
            var bResult = false;
            var hImpersonationToken = IntPtr.Zero;
            var activeSessionId = INVALID_SESSION_ID;
            var pSessionInfo = IntPtr.Zero;
            var sessionCount = 0;

            // Get a handle to the user access token for the current active session.
            if (WTSEnumerateSessions(WTS_CURRENT_SERVER_HANDLE, 0, 1, ref pSessionInfo, ref sessionCount) != 0)
            {
                var arrayElementSize = Marshal.SizeOf(typeof(WTS_SESSION_INFO));
                var current = pSessionInfo;

                for (var i = 0; i < sessionCount; i++)
                {
                    var si = (WTS_SESSION_INFO)Marshal.PtrToStructure((IntPtr)current, typeof(WTS_SESSION_INFO));
                    current += arrayElementSize;

                    if (si.State == WTS_CONNECTSTATE_CLASS.WTSActive)
                    {
                        activeSessionId = si.SessionID;
                    }
                }
            }

            // If enumerating did not work, fall back to the old method
            if (activeSessionId == INVALID_SESSION_ID)
            {
                activeSessionId = WTSGetActiveConsoleSessionId();
            }

            if (WTSQueryUserToken(activeSessionId, ref hImpersonationToken) != 0)
            {
                // Convert the impersonation token to a primary token
                bResult = DuplicateTokenEx(hImpersonationToken, 0, IntPtr.Zero,
                    (int)SECURITY_IMPERSONATION_LEVEL.SecurityImpersonation, (int)TOKEN_TYPE.TokenPrimary,
                    ref phUserToken);

                CloseHandle(hImpersonationToken);
            }

            return bResult;
        }

        public static bool StartProcessAsCurrentUser(string appPath, string cmdLine = null, string workDir = null, bool visible = true)
        {
            var hUserToken = IntPtr.Zero;
            var startInfo = new STARTUPINFO();
            var procInfo = new PROCESS_INFORMATION();
            var pEnv = IntPtr.Zero;
            int iResultOfCreateProcessAsUser;

            startInfo.cb = Marshal.SizeOf(typeof(STARTUPINFO));

            try
            {
                if (!GetSessionUserToken(ref hUserToken))
                {
                    throw new Exception("StartProcessAsCurrentUser: GetSessionUserToken failed.");
                }

                uint dwCreationFlags = CREATE_UNICODE_ENVIRONMENT | (uint)(visible ? CREATE_NEW_CONSOLE : CREATE_NO_WINDOW);
                startInfo.wShowWindow = (short)(visible ? SW.SW_SHOW : SW.SW_HIDE);
                startInfo.lpDesktop = "winsta0\\default";

                if (!CreateEnvironmentBlock(ref pEnv, hUserToken, false))
                {
                    throw new Exception("StartProcessAsCurrentUser: CreateEnvironmentBlock failed.");
                }

                if (!CreateProcessAsUser(hUserToken,
                    appPath, // Application Name
                    cmdLine, // Command Line
                    IntPtr.Zero,
                    IntPtr.Zero,
                    false,
                    dwCreationFlags,
                    pEnv,
                    workDir, // Working directory
                    ref startInfo,
                    out procInfo))
                {
                    throw new Exception("StartProcessAsCurrentUser: CreateProcessAsUser failed.\n");
                }

                iResultOfCreateProcessAsUser = Marshal.GetLastWin32Error();
            }
            finally
            {
                CloseHandle(hUserToken);
                if (pEnv != IntPtr.Zero)
                {
                    DestroyEnvironmentBlock(pEnv);
                }
                CloseHandle(procInfo.hThread);
                CloseHandle(procInfo.hProcess);
            }
            return true;
        }
    }
}


"@

Add-Type -ReferencedAssemblies 'System', 'System.Runtime.InteropServices' -TypeDefinition $Source -Language CSharp 

$scriptblock = {
    $IniFiles = Get-ChildItem "$ENV:LOCALAPPDATA\Microsoft\OneDrive\settings\Business1" -Filter 'ClientPolicy*' -ErrorAction SilentlyContinue

    if (!$IniFiles) {
        write-host 'No Onedrive configuration files found. Stopping script.'
        exit 1
    }
    
    $SyncedLibraries = foreach ($inifile in $IniFiles) {
        $IniContent = get-content $inifile.fullname -Encoding Unicode
        [PSCustomObject]@{
            'Item Count' = ($IniContent | Where-Object { $_ -like 'ItemCount*' }) -split '= ' | Select-Object -last 1
            'Site Name'  = ($IniContent | Where-Object { $_ -like 'SiteTitle*' }) -split '= ' | Select-Object -last 1
            'Site URL'   = ($IniContent | Where-Object { $_ -like 'DavUrlNamespace*' }) -split '= ' | Select-Object -last 1
        }
    }
    $SyncedLibraries | ConvertTo-Json | Out-File 'C:\programdata\Microsoft OneDrive\OneDriveLibraries.txt'
}


[murrayju.ProcessExtensions.ProcessExtensions]::StartProcessAsCurrentUser("C:\Windows\System32\WindowsPowershell\v1.0\Powershell.exe", "-command $($scriptblock)", "C:\Windows\System32\WindowsPowershell\v1.0", $false)
start-sleep 5

$SyncedLibraries = (get-content "C:\programdata\Microsoft OneDrive\OneDriveLibraries.txt" | convertfrom-json)
if (($SyncedLibraries.'Item count' | Measure-Object -Sum).sum -gt '280000') { 
    write-host "Unhealthy - Currently syncing more than 280k files. Please investigate." 
    $SyncedLibraries
}
else {
    write-host "Healthy - Syncing less than 280k files."
}

And that’s it! As always, Happy PowerShelling! 🙂

Automating with PowerShell: Teams Automapping

Something like 3 years ago I wrote a blog about using PowerShell to configure OneDrive sites using the odopen protocol. At the time this was pretty much the only method to configure OneDrive to automatically map sites and have a zero-touch configuration.

Of course, over the years the management side has improved and OneDrive usage has exploded. Unfortunately the OneDrive automatic mapping functionality isn’t where it should be yet. For example, the GPO/Intune method for automatic mapping configuration can take up to 8 hours to apply on any client.

During migrations and new deployments this is pretty much unacceptable. To make sure that mapping is instant, I’ve decided to create two scripts: an Azure Function which I’ve called AzMapper, and a client-based script.

AzMapper requires you to create an Azure Function. To do that, follow this manual and replace the script with the one below. This script is compatible with the Secure Application Model, and as such it can check all of your partner tenants too, meaning you’ll only need to host a single version.

Replace $ApplicationID and $ApplicationSecret with your own from the Secure App Model.

You’ll also need to give your Secure Application Model a few more permissions, specifically to read the groups:

  • Go to the Azure Portal.
  • Click on Azure Active Directory, now click on “App Registrations”.
  • Find your Secure App Model application. You can search based on the ApplicationID.
  • Go to “API Permissions” and click Add a permission.
  • Choose “Microsoft Graph” and “Application permission”.
  • Search for “Sites” and click on “Sites.Read.All”. Click on add permission.
  • Search for “Team” and click on “TeamMember.Read.All”. Click on add permission.
  • Search for “Team” and click on “Team.ReadBasic.All”. Click on add permission.
  • Do the same for “Delegate Permissions”.
  • Finally, click on “Grant Admin Consent for Company Name”.

AzMapper

using namespace System.Net
param($Request, $TriggerMetadata)
#
$ApplicationId = 'ApplicationID'
$ApplicationSecret = 'ApplicationSecret' 
#
$TenantID = $Request.Query.TenantID
$user = $Request.Query.Username

$body = @{
    'resource'      = 'https://graph.microsoft.com'
    'client_id'     = $ApplicationId
    'client_secret' = $ApplicationSecret
    'grant_type'    = "client_credentials"
    'scope'         = "openid"
}
$ClientToken = Invoke-RestMethod -Method post -Uri "https://login.microsoftonline.com/$($tenantid)/oauth2/token" -Body $body -ErrorAction Stop
$headers = @{ "Authorization" = "Bearer $($ClientToken.access_token)" }
$UserID = (Invoke-RestMethod -Uri "https://graph.microsoft.com/beta/users/$($user)" -Headers $Headers -Method Get -ContentType "application/json").id

$AllTeamsURI = "https://graph.microsoft.com/beta/users/$($UserID)/JoinedTeams"
$Teams = (Invoke-RestMethod -Uri $AllTeamsURI -Headers $Headers -Method Get -ContentType "application/json").value
$MemberOf = foreach ($Team in $Teams) {
    $SiteRootUri = "https://graph.microsoft.com/beta/groups/$($Team.id)/sites/root"
    $SiteRootReq = Invoke-RestMethod -Uri $SiteRootUri  -Headers $Headers -Method Get -ContentType "application/json"
    $SiteDrivesUri = "https://graph.microsoft.com/beta/groups/$($Team.id)/sites/root/Lists"
    $SitesDrivesReq = (Invoke-RestMethod -Uri $SiteDrivesUri -Headers $Headers -Method Get -ContentType "application/json").value | where-object { $_.Name -eq "Shared Documents" }
    $DriveInfo = $SitesDrivesReq.ParentReference.siteid -split ','
    if($SiteRootReq.description -like "*no-auto-map*"){ continue }
    if ($null -eq [System.Web.HttpUtility]::UrlEncode($SitesDrivesReq.id)) { continue }
    [pscustomobject] @{
        SiteID    = [System.Web.HttpUtility]::UrlEncode("{$($DriveInfo[1])}")
        WebID     = [System.Web.HttpUtility]::UrlEncode("{$($DriveInfo[2])}")
        ListID    = [System.Web.HttpUtility]::UrlEncode($SitesDrivesReq.id)
        WebURL    = [System.Web.HttpUtility]::UrlEncode($SiteRootReq.webUrl)
        Webtitle  = [System.Web.HttpUtility]::UrlEncode($($Team.displayName)).Replace("+", "%20")
        listtitle = [System.Web.HttpUtility]::UrlEncode($SitesDrivesReq.name)
    }

}

# Associate values to output bindings by calling 'Push-OutputBinding'.
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body = $MemberOf
})

If you want sites to be skipped, you can add “no-auto-map” to the description in Teams. This will cause the script to skip that site.

Now we can browse to the AzMapper URL to check if our function is working. To test the AzMapper click on “Get Function URL” in the Azure Portal and copy the URL required. You’ll end up with something like:

https://azMapper.azurewebsites.net/api/AzOneMap?code=verylongapicodehere==

An example of a test URL would be:

https://azMapper.azurewebsites.net/api/AzOneMap?code=verylongapicodehere==&Tenantid=TENANTIDHERE&Username=USERNAMEHERE

This should return all sites for that specific user, and it will contain exactly the information you need to create an odopen:// URL.

Client script

So you can schedule the client script using whatever method you prefer – as a startup script, using your RMM, or just a one-off during migrations. When a site is already configured to be synced, the script will skip that site. We do assume that OneDrive is already configured and just waiting to sync sites. 😉

#########################
$AutoMapURL = "https://azmapper.azurewebsites.net/api/AzOneMap"
$AutomapAPIKey = "TheAPIKeyFromTheAppAlsoKnownAsTheCode"
#########################

write-host "Grabbing OneDrive info from registry" -ForegroundColor Green
$TenantID = (get-itemproperty "HKCU:\SOFTWARE\Microsoft\OneDrive\Accounts\Business1").ConfiguredTenantId
$TenantDisplayName = (get-itemproperty "HKCU:\SOFTWARE\Microsoft\OneDrive\Accounts\Business1").Displayname
$username = (get-itemproperty "HKCU:\SOFTWARE\Microsoft\OneDrive\Accounts\Business1").userEmail
$CurrentlySynced = (get-itemproperty "HKCU:\SOFTWARE\Microsoft\OneDrive\Accounts\Business1\Tenants\$($tenantdisplayname)" -ErrorAction SilentlyContinue)
Write-Host "Retrieving all possible teams."  -ForegroundColor Green
$ListOfTeams = Invoke-RestMethod -Method get -uri "$($AutoMapURL)?Tenantid=$($TenantID)&Username=$($Username)&code=$($AutomapAPIKey)" -UseBasicParsing
$Upn = [System.Web.HttpUtility]::Urldecode($username)
foreach ($Team in $ListOfTeams) {
    write-host "Checking if site is not already synced" -ForegroundColor Green
    $sitename = [System.Web.HttpUtility]::Urldecode($Team.Webtitle)
    if ($CurrentlySynced.psobject.Properties.name -like "*$($sitename) -*") {
        write-host "Site $Sitename is already being synced. Skipping." -ForegroundColor Green 
        continue
    }
    else {
        write-host "Mapping Team: $Sitename" -ForegroundColor Green
        Start-Process "odopen://sync/?siteId=$($team.SiteID)&webId=$($team.webid)&amp;amp;listId=$($team.ListID)&userEmail=$upn&amp;webUrl=$($team.Weburl)&webtitle=$($team.Webtitle)"
        start-sleep 5
    }
}

And that’s it! Running this script will map all the sites this specific user has access to, it won’t give weird pop-ups for users that do not have access, and it should ease your Teams deployments by a lot. As always, Happy PowerShelling!

Monitoring with PowerShell: AD KRBTGT & making your own canaries

I decided to combine two small blogs this time, because they’re both pretty short and easy. Both are somewhat security oriented. In the first part of the blog we’ll tackle monitoring the KRBTGT password, which needs to be reset on a regular schedule to ensure bad actors can’t abuse it.

In the second part we’ll focus on creating our own ‘Canary’ files. These files can be used for a lot of things, but most commonly to detect whether ransomware has touched them in some way. So, let’s get started!

Monitoring KRBTGT Password age

It’s actually straightforward to monitor the KRBTGT account, as it’s just an AD account. We’ll monitor it by grabbing the PasswordLastSet attribute from Active Directory. If you want to automatically resolve this, I’d strongly suggest looking at the script in this Github.

$Days = (Get-Date).AddDays(-31)
$Account = Get-AdUser krbtgt -property passwordlastset
$Setdate = if ($Account.PasswordLastSet -gt $Days) { "Healthy - Password set date $($Account.PasswordLastSet)" } else { "Unhealthy - Password set date $($Account.PasswordLastSet)" }
$Setdate

You can change the number of days to whatever you’re comfortable with. I don’t believe the documentation makes a strong recommendation on how often you should reset it, but as this is a completely automated solution we perform it on a monthly basis. If you do want to script the reset yourself rather than use the linked script, a hedged sketch follows below.
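This is only a minimal sketch of a single reset, with an assumed throwaway password value; the linked GitHub script is the safer route, because Microsoft’s guidance is to reset the account twice, with enough time between resets for replication to finish and existing tickets to expire. Resetting twice in quick succession can break Kerberos authentication.

# Hypothetical sketch: a single krbtgt reset. Do NOT run a second reset until replication
# has completed and existing Kerberos tickets have expired.
Import-Module ActiveDirectory
$NewPassword = ConvertTo-SecureString ([guid]::NewGuid().ToString()) -AsPlainText -Force # throwaway random value
Set-ADAccountPassword -Identity krbtgt -Reset -NewPassword $NewPassword
Get-ADUser krbtgt -Properties PasswordLastSet | Select-Object Name, PasswordLastSet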

Creating and monitoring file canaries

So, canaries are files that you place in strategic locations on a machine to check that they aren’t being touched, corrupted, or encrypted in any way. Primarily they are used to catch ransomware before a computer is fully encrypted, minimizing data loss and lateral movement.

With this script, we create canaries in a couple of locations:

  • The My Documents folder of each user
  • The Desktop Folder of each user
  • The root of each drive on the machine

You can also create them in any location you want by adding it to the $CreateLocations variable. We create the files as hidden, so users should not see them. The file name will be CanaryFile.pdf, even though it’s just a simple text file.

The script creates a file in each location and immediately starts alerting on two properties: whether the file has been edited in the past hour, and whether it still contains the correct string. I’d advise applying the monitoring to the device, waiting an hour, and only then actually alerting or reacting on it.

$CreateLocations = @('AllDesktops', 'AllDocuments', 'AllDrives', 'C:\temp')
$FileContent = "This file is a special file created by your managed services provider. For more information contact the IT Support desk."

$CanaryStatus = foreach ($Locations in $CreateLocations) {
    $AllLocations = switch ($Locations) {
        "AllDesktops" { (Get-ChildItem "C:\Users" -Recurse -Force -Filter 'Desktop' -Depth 3).FullName }
        "AllDocuments" { (Get-ChildItem "C:\Users" -Recurse -Force -Filter 'Documents' -Depth 3).FullName }
        "AllDrives" { ([System.IO.DriveInfo]::getdrives() | Where-Object { $_.DriveType -eq 'Fixed' }).Name }
        default { $Locations }
    }
    # Collect the results of every location set, so an unhealthy canary in any of them is reported.
    foreach ($Location in $AllLocations) {
        if ((test-path "$Location\CanaryFile.pdf") -eq $false) {
            $File = New-Item $Location -Name "CanaryFile.pdf" -Value $FileContent
            $file.Attributes = 'hidden'
        }
        else {
            $ExistingFile = get-item "$Location\CanaryFile.pdf" -Force
            if ($ExistingFile.LastWriteTime -gt (get-date).AddHours(-1)) { "$Location\CanaryFile.pdf is unhealthy. The LastWriteTime was $($ExistingFile.LastWriteTime)" }
            $ExistingFileContents = get-content $ExistingFile -Force
            if ($ExistingFileContents -ne $FileContent) { "$Location\CanaryFile.pdf is unhealthy. The contents do not match. This is a sign the file has most likely been encrypted" }
        }
    }
}
if (!$CanaryStatus) {
    $CanaryStatus = "Healthy"
}

$CanaryStatus

If you feel confident enough about this, you could set up some self-healing, such as disabling network access or shutting the device down before the machine is completely encrypted; a hypothetical sketch of such a reaction follows below.
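This is only a sketch of what that could look like, not something to wire up blindly; Disable-NetAdapter and Stop-Computer are drastic measures, so test them carefully before attaching them to an automated alert.

# Hypothetical self-healing reaction: isolate or stop the machine when a canary is unhealthy.
if ($CanaryStatus -ne "Healthy") {
    Get-NetAdapter | Where-Object { $_.Status -eq 'Up' } | Disable-NetAdapter -Confirm:$false
    # Or, even more drastic, power the machine off entirely:
    # Stop-Computer -Force
}

And that’s it. As always, Happy PowerShelling!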

Monitoring with PowerShell: Monitoring Shodan results (in-depth)

Some time ago I wrote a blog about monitoring your environments using PowerShell and the Shodan API. That blog was well received, but I felt it could use a lot of improvements: the data returned wasn’t all that useful for some, and sometimes you want to exclude specific ports, for example in the case of an actual webserver.

So I’ve made an updated version that returns more info. This version allows you to add exclusions and get more information, like who the ISP is and whether known vulnerabilities have been found. It also keeps a history of the previous result and compares against it, to check if anything has changed.

As always, these scripts are designed to run from your RMM, on environments where you’d expect no open ports or a very limited subset. Monitoring this on all your devices probably isn’t the best plan 🙂

$APIKEY = 'YourShodanAPIKey'
$PortExclusions = @('80', '443')
$CurrentIP = (Invoke-WebRequest -uri "http://ifconfig.me/ip" -UseBasicParsing ).Content
$ListIPs = @($CurrentIP)
$Shodan = foreach ($ip in $ListIPs) {
    try {
        $ReqFull = Invoke-RestMethod -uri "https://api.shodan.io/shodan/host/$($ip)?key=$APIKEY"
    }
    catch {
        write-host "Could not get information for host $IP. Error was:  $($_.Exception.Message)"
        continue
    }
    foreach ($req in $ReqFull.data | Where-Object { $_.port -notin $PortExclusions }) {
        [PSCustomObject]@{
            'IP'                    = $req.ip_str
            'Detected OS'           = $req.os
            'Detected Port'         = $req.port
            'Detected ISP'          = $req.isp
            'Detected Data'         = $req.data
            'Found vulnerabilities' = $req.opts.vulns
        }
    }
}

$previousResult = get-content "$($Env:Programdata)\ShodanScan\LastScan.txt" -ErrorAction SilentlyContinue | ConvertFrom-Json
if($previousResult) {$CompareObject = Compare-Object $previousresult $Shodan}
if ($CompareObject) { Write-Host "There is a different between the previous result and the current result. Please investigate" }
new-item "$($Env:Programdata)\ShodanScan" -ItemType Directory -Force
ConvertTo-Json $Shodan  | Out-File "$($Env:Programdata)\ShodanScan\LastScan.txt"


if (!$Shodan) { write-host "Healthy - Hosts are not found in Shodan." } else { write-host "Hosts are found in Shodan. Information: $($Shodan | out-string)" } 

And that’s it! As always, Happy PowerShelling.

Monitoring with PowerShell: Monitoring Azure AD Devices and users age.

So we’re managing more and more cloud-only clients. This is fantastic, because you no longer have to deal with old worries like keeping a server online and updated. Another cool thing is that it becomes a lot easier to manage devices and endpoints.

The thing is, even with Azure AD you still have maintenance tasks that never seem to disappear. This time, we’re picking up the age-old issue of keeping your Active Directory cleaned up; in this case, the Azure Active Directory.

With the following script we detect a couple of things: any user that has not logged in for 90 days, and any device that has not logged into Azure AD for 90 days. Finding these older objects lets you see whether your off-boarding procedures are working well and you’re not ending up with a total mess.

A good real-life example came up recently: one of our employees had a device stolen, and I logged into the Intune portal to start a remote wipe. The problem was that this user had around 10 devices in the portal and I could not be sure which was the current one. If I had maintained the portal and run this script more often, finding the device would’ve been much easier.

Let’s get to the script! As always, I’ll publish two versions.

Single tenant script

########################## Secure App Model Settings ############################
$ApplicationId = 'YourApplicationID'
$ApplicationSecret = 'YourApplicationSecret' | Convertto-SecureString -AsPlainText -Force
$TenantID = 'YourTenantID'
$RefreshToken = 'YourRefreshToken'
$UPN = "YourUPN"
$CustomerTenant = "Customer.onmicrosoft.com"
########################## Script Settings  ############################
$Date = (get-date).AddDays(-90)
$Baseuri = "https://graph.microsoft.com/beta"
write-host "Generating token to log into Azure AD. Grabbing all tenants" -ForegroundColor Green
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
$CustGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes "https://graph.microsoft.com/.default" -ServicePrincipal -Tenant $CustomerTenant
write-host "$($Tenant.Displayname): Starting process." -ForegroundColor Green
$Header = @{
    Authorization = "Bearer $($CustGraphToken.AccessToken)"
}
write-host " $($Tenant.Displayname): Grabbing all Users that have not logged in for 90 days." -ForegroundColor Green
$UserList = (Invoke-RestMethod -Uri "$baseuri/users/?`$select=displayName,UserPrincipalName,signInActivity" -Headers $Header -Method get -ContentType "application/json").value | select-object DisplayName, UserPrincipalName, @{Name = 'LastLogon'; Expression = { [datetime]::Parse($_.SignInActivity.lastSignInDateTime) } } | Where-Object { $_.LastLogon -lt $Date }
$devicesList = (Invoke-RestMethod -Uri "$baseuri/devices" -Headers $Header -Method get -ContentType "application/json").value | select-object Displayname, @{Name = 'LastLogon'; Expression = { [datetime]::Parse($_.approximateLastSignInDateTime) } }

   
$OldObjects = [PSCustomObject]@{
    Users   = $UserList | where-object { $_.LastLogon -ne $null }
    Devices = $devicesList | Where-Object { $_.LastLogon -lt $Date }
}

if (!$OldObjects) { write-host "No old objects found in any tenant" } else { write-host "Old objects found."; $Oldobjects }

All tenants script

########################## Secure App Model Settings ############################
$ApplicationId = 'YourApplicationID'
$ApplicationSecret = 'YourApplicationSecret' | Convertto-SecureString -AsPlainText -Force
$TenantID = 'YourTenantID'
$RefreshToken = 'YourRefreshToken'
$UPN = "YourUPN"
########################## Script Settings  ############################
$Date = (get-date).AddDays((-90))
$Baseuri = "https://graph.microsoft.com/beta"
write-host "Generating token to log into Azure AD. Grabbing all tenants" -ForegroundColor Green
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID 
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID 
Connect-AzureAD -AadAccessToken $aadGraphToken.AccessToken -AccountId $upn -MsAccessToken $graphToken.AccessToken -TenantId $tenantID | Out-Null
$tenants = Get-AzureAdContract -All:$true
Disconnect-AzureAD
$OldObjects = foreach ($Tenant in $Tenants) {

    $CustGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes "https://graph.microsoft.com/.default" -ServicePrincipal -Tenant $tenant.CustomerContextId
    write-host "$($Tenant.Displayname): Starting process." -ForegroundColor Green
    $Header = @{
        Authorization = "Bearer $($CustGraphToken.AccessToken)"
    }
    write-host " $($Tenant.Displayname): Grabbing all Users that have not logged in for 90 days." -ForegroundColor Green
    $UserList = (Invoke-RestMethod -Uri "$baseuri/users/?`$select=displayName,UserPrincipalName,signInActivity" -Headers $Header -Method get -ContentType "application/json").value | select-object DisplayName,UserPrincipalName,@{Name='LastLogon';Expression={[datetime]::Parse($_.SignInActivity.lastSignInDateTime)}} | Where-Object { $_.LastLogon -lt $Date }
    $devicesList = (Invoke-RestMethod -Uri "$baseuri/devices" -Headers $Header -Method get -ContentType "application/json").value | select-object Displayname,@{Name='LastLogon';Expression={[datetime]::Parse($_.approximateLastSignInDateTime)}}

   
    [PSCustomObject]@{
        Users = $UserList | where-object {$_.LastLogon -ne $null}
        Devices = $devicesList | Where-Object {$_.LastLogon -lt $Date}
    }
}

if(!$OldObjects) { write-host "No old objects found in any tenant"} else { write-host "Old objects found."; $Oldobjects}

And that’s it! As always, Happy PowerShelling!

Monitoring with PowerShell: Monitoring Active Directory Health

Some time ago I wrote a blog about monitoring Active Directory replication. A couple of days ago a friend on Slack asked me if I had anything for monitoring the general health of a domain controller as a whole, not just replication.

So I researched some options and found this blog by Adam, which covered about 90% of what I needed. I just wanted a script that was slightly more complete; not just the general health is important, but specific settings are too.

For example: we want the replication time to be under 30 minutes, we want the recycle bin to always be active, and password complexity should be enabled everywhere. To monitor all of this, I’ve built the script below.

The Script

$DiagInfo = dcdiag
$DCDiagResult = $Diaginfo | select-string -pattern '\. (.*) \b(passed|failed)\b test (.*)' | foreach {
    $obj = @{
        TestName   = $_.Matches.Groups[3].Value
        TestResult = $_.Matches.Groups[2].Value
        Entity     = $_.Matches.Groups[1].Value
    }
    [pscustomobject]$obj
}

$DCDiagStatus = foreach ($FailedResult in $DCDiagResult | Where-Object { $_.Testresult -ne "passed" }) {
    "DC diag test not succesfull on entity $($FailedResult.entity) - $($FailedResult.testname)"
}
if(!$DCDiagStatus){ $DCDiagStatus = "Healthy. No DCDiag Tests failed" }

$ReplicationSchedule = Get-ADReplicationSiteLink -filter *
$ReplicationSchedStatus = foreach ($Replication in $ReplicationSchedule) {
    if ($replication.ReplicationFrequencyInMinutes -gt 30) { "Potential replication schedule issue. $($Replication.name) has a replication frequency of $($replication.ReplicationFrequencyInMinutes) minutes" }
}
if (!$ReplicationSchedStatus) { $ReplicationSchedStatus = "Healthy - Replication schedule for all sites is lower than 30 minutes" }

$PasswordPolicy = Get-ADDefaultDomainPasswordPolicy
if ($PasswordPolicy.ComplexityEnabled -ne $true) { $PasswordComplexityHealthy = "Unhealthy - Password Complexity is disabled" } else {
    $PasswordComplexityHealthy = "Healthy - Password Complexity is enabled"
}

$ADRecycler = Get-ADOptionalFeature -Filter 'name -like "Recycle Bin Feature"'
if (!$ADRecycler.enabledscopes) { $ADRecyclerHealth = "Unhealthy. AD Recycle Bin Feature is disabled" } else { $ADRecyclerHealth = "Healthy - Recycle bin is enabled." }

$DCDiagStatus 
$ReplicationSchedStatus
$PasswordComplexityHealthy
$ADRecyclerHealth

So load this up in your RMM, check the values of the bottom 4 variables and done. 🙂 Active Directory Monitoring made easy.
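If your RMM prefers a single status value or an exit code instead of four separate variables, a small, hypothetical wrapper like this could sit at the end of the script:

# Hypothetical wrapper: collapse the four checks into one alert string and an exit code.
$AllChecks = @() + $DCDiagStatus + $ReplicationSchedStatus + $PasswordComplexityHealthy + $ADRecyclerHealth
$Failures = $AllChecks | Where-Object { $_ -notlike "Healthy*" }
if ($Failures) {
    write-host "Unhealthy - $($Failures -join ' | ')"
    exit 1
}
else {
    write-host "Healthy - All Active Directory checks passed"
    exit 0
}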

As always, Happy PowerShelling!

Automating with PowerShell: Using the new Autotask REST API

So I’m a bit later than normal with blogging; that’s mostly because I was working on this project a little longer than usual. Autotask recently released update 2020.2, and this update includes a new REST API.

This is super cool, because the old API was a SOAP API and terribly inconvenient to use. To help people get started with the new Autotask API I’ve created a module. The module is still in alpha/beta, but you can download it from the PSGallery now.

The project page is here. Feel free to report any issues or open a pull request if you want to help develop the module! Just be prepared for breakage in the first couple of weeks; I’m still working out the best methods 🙂

Installation instructions

The module is published to the PSGallery, download it using:

   install-module AutotaskAPI

Usage

To get items using the Autotask API you’ll first have to add the authentication headers using the `Add-AutotaskAPIAuth` function.

$Creds = Get-Credential
Add-AutotaskAPIAuth -ApiIntegrationcode 'ABCDEFGH00100244MMEEE333' -credentials $Creds

When the command runs, you will be asked for credentials. Using these, the module will try to determine the correct webservices URL for your zone based on the email address. If this fails, you must set the webservices URL manually:

Add-AutotaskBaseURI -BaseURI https://webservices1.autotask.net/atservicesrest

The Base URI value has tab completion to help you find the correct one easily.

To find resources using the API, execute the Get-AutotaskAPIResource function. For Get-AutotaskAPIResource you will need either the ID of the resource you want to retrieve, or the JSON SearchQuery you want to execute.

Examples


To find the company with ID 12345

Get-AutotaskAPIResource -Resource Companies -ID 12345

 To get all companies that are Active:

Get-AutotaskAPIResource -Resource Companies -SearchQuery "{filter='active -eq True'}"


To create a new company, we can either make the entire JSON body ourselves, or use the New-AutotaskBody function.

$Body = New-AutotaskBody -Definitions CompanyModel 


This creates a body for the Company model. Definitions can be tab-completed. The body will contain all expected values. If you want an empty body instead, use:

  $Body = New-AutotaskBody -Definitions CompanyModel -NoContent

After setting the values you want on the body, execute: New-AutotaskAPIResource -Resource Companies -Body $body
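Put together, that could look something like the hedged example below. The field names (companyName, companyType, phone, ownerResourceID) and their values are assumptions for illustration; check the CompanyModel definition for the fields your instance actually requires.

# Hypothetical example: fill in a few fields on the generated body and create the company.
$Body = New-AutotaskBody -Definitions CompanyModel
$Body.companyName = "Contoso B.V."          # assumed field name
$Body.companyType = 1                       # assumed picklist value, e.g. "Customer"
$Body.phone = "555-0100"                    # assumed field name
$Body.ownerResourceID = 12345               # assumed resource ID of the account owner
New-AutotaskAPIResource -Resource Companies -Body $Body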

Contributions

Feel free to send pull requests or file issues when you encounter any.