Category Archives: Automation

Monitoring with PowerShell Chapter 3: Monitoring SSL certificates on IIS

For some clients that still have on-site servers running IIS, we have to monitor the SSL status to make sure that services such as the Remote Desktop Gateway or SSTP VPN are always available. To be able to replace SSL certificates on time, we need to know when specific certificates expire.

To do this, we will use the Get-IISSite cmdlet. This cmdlet exposes all of the IIS configuration in an easy-to-use format. First, we’ll get all sites with bindings like this:

(Get-IISSite).bindings

Now if you run this, you won’t see that much information. To get a list of all possible properties we have a couple of choices, but the simplest is using Select-Object:

(Get-IISSite).bindings | select *

Here we’re telling PowerShell that we want every property of the objects returned by our command. With this you’ll see a lot more information about the bindings, including that we can filter on the protocol used. Because we only want information about the sites secured with a certificate, we can run the same command again, this time filtering on the protocol “https”:

(Get-IISSite).bindings | where-object {$_.Protocol -eq "https"}

Now that we have the information we want, our next step is easy: we find the bound certificate in the local certificate store and check the expiry date. We like to be warned 14 days before expiry, but you can change the number of days to anything you’d like 🙂

$Days = (Get-Date).AddDays(14)
$CertsBound = (Get-IISSite).bindings | Where-Object { $_.Protocol -eq "https" }
foreach ($Cert in $CertsBound) {
    $CertFile = Get-ChildItem -Path "Cert:\LocalMachine\$($Cert.CertificateStoreName)" | Where-Object -Property Thumbprint -eq $Cert.RawAttributes.certificateHash
    if ($CertFile.NotAfter -lt $Days) { $CertState += "$($CertFile.FriendlyName) will expire on $($CertFile.NotAfter)`n" }
}

if(!$certState){$CertState = "Healthy"}

And that’s it! If the certificate will expire within 14 days, this set will start alerting. Happy PowerShelling!

Autotask Quick Time Entry

Autotask’s app for iOS and Android isn’t the greatest app ever made. Because of this, our on-site engineers do not always enter time entries; it’s too much of a hassle.

So, to make sure that time entries are added on time, I’ve created a quick web application to enter them easily from a phone. Please note that I am not a PHP developer in any way, so if you feel the code is messy or not up to standards, you are free to fix it yourself. This entire blog post comes without guarantees. 😉

Preparations

You have to have an Autotask API user. Create this user and document the username and password. The user should have the role “API User” and have the following extra settings enabled:

  • Note > Contract > ImpersonateForCreate
  • Note > Contract > ImpersonateForModify
  • CrmNote > ImpersonateForCreate
  • CrmNote > ImpersonateForModify
  • Note > Project > ImpersonateForCreate
  • Note > Project > ImpersonateForModify
  • Note > ProjectTask > ImpersonateForCreate
  • Note > ProjectTask > ImpersonateForModify
  • Note > Ticket > ImpersonateForCreate
  • Note > Ticket > ImpersonateForModify
  • TimeEntry > ImpersonateForCreate
  • TimeEntry > ImpersonateForModify

Download the files to install here.

Installation:
  • Copy all files to your webhost.
  • Edit the file bootstrap.php in _libAT. Enter your Autotask API username and password.
  • Enter the correct WSDL location. You can find this in the API documentation here
  • Browse to https://yourwebhost.com/frontend/GetRoles.php. Copy the ID you want to use as the default role.
  • Open SubmitTime.php and paste the DefaultRoleID in the correct location here.
  • Open EnterTime.php and find “worktype”. Here you’ll see a list of worktypes. Edit this list to match your Autotask configuration. Keep this file open.
  • Find all entries called “YOURWEBHOST” and replace them with your actual webhost URL.

When all of this is done, you can browse to http://YOURWEBHOST.com/frontend/EnterTime.php and you’ll be able to enter time entries in a couple of quick steps. Feel free to edit the files, change how it looks, or even use it in other projects, as long as you credit me as the source.

Monitoring with PowerShell Chapter 3: Monitoring user creation

We all know that bad actors often create accounts for repeat access when they gain access to your network. To make sure that we are aware of these situations, we use PowerShell monitoring.

For security measures we monitor the following types of user creation:

  • All created domain users
  • All users added to privileged groups (e.g. Domain Admins)
  • All created local users
  • All (temporary) users created that contain specific keywords

Temporary users monitoring:

The temporary users monitoring set is only used on Active Directory users, not local users. As we’re already monitoring all local user creation, that would be redundant.

$ArrayOfNames = @("test", "tmp","skykick","mig", "migwiz","temp","-admin","supervisor")

foreach ($name in $ArrayOfNames) {
    $filter = 'Name -like "*' + $($name) + '*"'
    $Users = Get-ADUser -Filter $filter
    if ($null -ne $Users) {
        foreach ($user in $Users) {
            $TemporaryUser += "$($user.name) has been found, created at $($user.whencreated)`n"
        }
    }
}
if (!$TemporaryUser) { $TemporaryUser = "No Temporary Accounts Found" }

Created Domain Users

For domain users, we alert on all users created in the last 24 hours. After 24 hours, this monitoring set goes back to a healthy state.

$When = ((Get-Date).AddDays(-1)).Date
$GetUsers = Get-ADUser -Filter {whenCreated -ge $When} -Properties whenCreated

foreach ($user in $GetUsers) {
    $UserChanges += "$($user.name) has been created at $($user.whencreated)`n"
}

if ($null -eq $UserChanges) { $UserChanges = "No Changes Detected" }

All created local users

Local users do not have a creation date, which makes them a little more difficult to monitor. To solve this, we create a file with the results of Get-LocalUser and update that file every 24 hours. From there on, we compare the file to the most recent output of Get-LocalUser. We are alerted if the comparison finds newer content, meaning a user has been created or removed in the local user database.

$Outputfile = "C:\Windows\temp\LocalUsers.txt"
$LocalUsers = Get-LocalUser | Select-Object *

# Create the baseline file on first run, and refresh it every 24 hours.
if (-not (Test-Path $Outputfile) -or (Get-Item $Outputfile).LastWriteTime -lt (Get-Date).AddDays(-1)) {
    $LocalUsers | ConvertTo-Json | Out-File $Outputfile
}
$OutputFileContents = Get-Content $Outputfile | ConvertFrom-Json

$Compare = Compare-Object -DifferenceObject $OutputFileContents.Name -ReferenceObject $LocalUsers.Name

if (!$Compare) { $Compare = "Healthy" }

Privileged Group Changes

We monitor the privileged groups by using Ashley McGlone’s script to get the changes. You can find that full script here.

Function Get-PrivilegedGroupChanges {
    Param(
        $Server = "localhost",
        $Hour = 24
    )

    $ProtectedGroups = Get-ADGroup -Filter 'AdminCount -eq 1' -Server $Server
    $Members = @()

    ForEach ($Group in $ProtectedGroups) {
        $Members += Get-ADReplicationAttributeMetadata -Server $Server `
            -Object $Group.DistinguishedName -ShowAllLinkedValues |
            Where-Object { $_.IsLinkValue } |
            Select-Object @{name = 'GroupDN'; expression = { $Group.DistinguishedName } },
                @{name = 'GroupName'; expression = { $Group.Name } }, *
    }

    $Members |
        Where-Object { $_.LastOriginatingChangeTime -gt (Get-Date).AddHours(-1 * $Hour) }
}

$ListOfChanges = Get-PrivilegedGroupChanges

if ($null -eq $ListOfChanges) {
    $GroupChanges = "No Changes Detected"
}
else {
    foreach ($item in $ListOfChanges) {
        $GroupChanges += "Group $($item.GroupName) has been changed - $($item.AttributeValue) has been added or removed`n"
    }
}

if (!$GroupChanges) { $GroupChanges = "No Changes Detected" }

Monitoring with PowerShell Chapter 3: Monitoring and remediating Windows Feature Update status

With the advent of Windows 10, all MSPs are faced with a new challenge: how do we manage the different Windows 10 feature versions, and how do we make sure we can automatically upgrade our clients to the latest version of the Windows 10 OS? Microsoft has not made feature updating very straightforward, and sometimes the automatic updates error out.

A lot of RMM systems claim they do version upgrades perfectly; unfortunately, I have not seen any RMM that gracefully upgrades machines without too many issues and without user intervention. As an MSP it’s key to automate as much as possible and prevent engineers from having to bother users to perform machine upgrades, so here is PowerShell to the rescue:

Checking the Windows version and comparing it to your standards.

We’ve decided we want all of our users on the same feature level. Because of this, we’ve created a monitoring set that alerts us when Windows 10 computers have a lower version than the one we’ve centrally set. Quite simply, our monitoring set only returns the current release ID, which we pull from the registry:

$ReleaseID = (Get-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion" -Name releaseid).releaseid

Now we alert on this monitoring set when it is below an expected value. In our case, anything lower than 1903 generates an alert. Our system then sees that this alert has been generated and performs a remediation script during the next maintenance cycle we’ve agreed with the client.
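If your monitoring system can only run a script rather than compare a returned value, the alert condition boils down to a simple comparison. A minimal sketch (the $MinimumReleaseID variable name and the 1903 threshold are our own choices; adjust them to your standard):

```powershell
# Sketch: alert when the installed release ID is below our centrally set minimum.
$MinimumReleaseID = 1903
$ReleaseID = [int](Get-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion" -Name releaseid).releaseid
if ($ReleaseID -lt $MinimumReleaseID) {
    Write-Output "Alert: Windows 10 release $ReleaseID is lower than the required $MinimumReleaseID"
}
else {
    Write-Output "Healthy"
}
```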

Remediation

Remediation for having the wrong feature update is easy: run the update and done. The problem is that most RMM systems don’t handle this very cleanly. To resolve this, we’ve created a PowerShell script that grabs the ISO from a network share, copies it to a temporary location, and runs the upgrade from there.

The script also checks if there is enough space on the GPT boot partition, and if not deletes the unnecessary font files.

If the network share is not available, it will download the ISO from a web server you specify, which is great for a mobile workforce. All you have to change in this script is $ISOPath and $WebServer.

$ISOPath = "\\Servername\Netlogon\Windows10.iso"
$WebServer = "http://YOURWEBSERVER/Windows10.iso"
$OSDiskIndex = gwmi -query "Select * from Win32_DiskPartition WHERE Bootable = True" | Select-Object -ExpandProperty DiskIndex
$PartTypeFull = gwmi -query "Select * from Win32_DiskPartition WHERE Index = 0" | Select-Object -ExpandProperty Type
$PartTypeMid = $PartTypeFull.Substring(0, 3)
$PartType = Out-String -InputObject $PartTypeMid
if ($PartType -like "*GPT*") {
    Write-Output "System has a GPT partition, clearing EFI fonts...."
    cmd.exe /c "mountvol b: /s"
    Remove-Item b:\efi\Microsoft\Boot\Fonts\*.* -Force
    cmd.exe /c "mountvol b: /d"
}
if (Test-Path $ISOPath) {
    Write-Output ("Mounting ISO from: " + $ISOPath)
}
else {
    Write-Output ("Warn: ISO not found at: " + $ISOPath)
    Write-Output "Downloading ISO from webserver...."
    if (-not (Test-Path "c:\temp")) { mkdir "c:\temp" }
    $ISOPath = "c:\temp\Windows_10_upgrade.iso"
    Invoke-WebRequest $WebServer -OutFile $ISOPath
}
Mount-DiskImage -ImagePath $ISOPath
$ISODrive = Get-DiskImage -ImagePath $ISOPath | Get-Volume | Select-Object -ExpandProperty DriveLetter
Write-Output ("Mounted ISO on drive: " + $ISODrive)
$ExePath = $ISODrive + ":\setup.exe"
$Arguments = "/auto upgrade /quiet /noreboot"
Write-Output ("Running setup from ISO: " + $ExePath)
Start-Process $ExePath $Arguments

And tada, that’s it! The upgrade will run, but the machine will not reboot until the user performs the reboot. We schedule more tasks during our maintenance cycle, so we’d rather have the RMM system handle the reboot.
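If your RMM cannot schedule the reboot for you, one hedged alternative is to let the script queue a one-time reboot itself via shutdown.exe. A minimal sketch (the 03:00 next-day time is an assumption; pick your own maintenance window):

```powershell
# Sketch: queue a reboot for 03:00 tomorrow instead of rebooting immediately.
# The chosen time is an example; align it with your own maintenance cycle.
$RebootAt = (Get-Date).Date.AddDays(1).AddHours(3)
$SecondsUntilReboot = [int]($RebootAt - (Get-Date)).TotalSeconds
shutdown.exe /r /t $SecondsUntilReboot /c "Windows 10 feature update - scheduled reboot"
```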

Happy PowerShelling!

Monitoring with PowerShell Chapter 3: Hyper-v state

We manage a whole lot of Hyper-V servers for our clients, including large clusters but also smaller single-server solutions. This makes it difficult to make sure that everyone creates VMs as they should, and sometimes mistakes are made by engineers or backup software that cause a checkpoint to be left on a production server.

To make sure we don’t get problems along the way we use the following monitoring sets.

Monitoring checkpoints, snapshots, and AVHDs

We monitor each VM for lingering checkpoints and snapshots by running the following script. It checks whether a snapshot is older than 24 hours and creates an alert based on that. If no snapshots are found, it reports that the snapshot state is healthy.

$snapshots = Get-VM | Get-VMSnapshot | Where-Object { $_.CreationTime -lt (Get-Date).AddDays(-1) }
foreach ($Snapshot in $snapshots) {
    $SnapshotState += "A snapshot has been found for VM $($Snapshot.VMName). The snapshot has been created at $($Snapshot.CreationTime)`n"
}
if (!$SnapshotState) { $SnapshotState = "Healthy" }

We used this monitoring set for a while, but then found that we had some servers that were restored from a backup, did not have a snapshot available, but did run on an AVHDX. That can cause issues, as the AVHDX can grow without you noticing since it doesn’t have a complete snapshot available. To also monitor AVHDXs, we’re using the following set:

$VHDs = Get-VM | Get-VMHardDiskDrive
foreach ($VHD in $VHDs) {
    if ($VHD.Path -match "avhd") { $AVHD += "$($VHD.VMName) is running on AVHD: $($VHD.Path)`n" }
}
if (!$AVHD) { $AVHD = "Healthy" }

Version of Integration services

Monitoring the integration services on older or migrated versions of Hyper-V is quite important, as the Hyper-V integration services also provide driver interfaces to the client VMs. To solve this, we use the following monitoring script:

$VMMS = Get-WmiObject -Namespace root\virtualization\v2 -Class Msvm_VirtualSystemManagementService

# 1 == VM friendly name. 123 == Integration State
$RequestedSummaryInformationArray = 1, 123
$vmSummaryInformationArray = $VMMS.GetSummaryInformation($null, $RequestedSummaryInformationArray).SummaryInformation

$outputArray = @()

foreach ($vmSummaryInformation in [array]$vmSummaryInformationArray) {

    switch ($vmSummaryInformation.IntegrationServicesVersionState) {
        1       { $vmIntegrationServicesVersionState = "Up-to-date" }
        2       { $vmIntegrationServicesVersionState = "Version Mismatch" }
        default { $vmIntegrationServicesVersionState = "Unknown" }
    }

    $vmIntegrationServicesVersion = (Get-VM $vmSummaryInformation.ElementName).IntegrationServicesVersion
    if ($null -eq $vmIntegrationServicesVersion) { $vmIntegrationServicesVersion = "Unknown" }

    $output = New-Object psobject
    $output | Add-Member NoteProperty "VM Name" $vmSummaryInformation.ElementName
    $output | Add-Member NoteProperty "Integration Services Version" $vmIntegrationServicesVersion
    $output | Add-Member NoteProperty "Integration Services State" $vmIntegrationServicesVersionState

    # Add the PSObject to the output array
    $outputArray += $output
}

foreach ($VM in $outputArray) {
    if ($VM.'Integration Services State' -contains "Version Mismatch") {
        $ISState += "$($VM.'VM Name') Integration Services state is $($VM.'Integration Services State')`n"
    }
}
if (!$ISState) { $ISState = "Healthy" }

NUMA spanning

The next script is made to monitor the NUMA span of virtual machines. You might notice a decrease in performance when your NUMA spanning is incorrect: not just in assigned memory, but a general performance degradation of up to 80%. For more information, you can check this link and this link.

$VMs = Get-VM
foreach ($VM in $VMs) {
    $GetvCPUCount = Get-VM -Name $VM.Name | select Name, NumaAligned, ProcessorCount, NumaNodesCount, NumaSocketCount
    $CPU = Get-WmiObject Win32_Processor
    $totalCPU = $CPU.NumberOfLogicalProcessors[0] * $CPU.Count
    if ($GetvCPUCount.NumaAligned -eq $False) {
        $vCPUoutput += "NUMA not aligned for; $($VM.Name). vCPU assigned: $($GetvCPUCount.ProcessorCount) of $totalCPU available`n"
    }
}
if (!$vCPUoutput) { $vCPUoutput = "Healthy" }

Monitoring with PowerShell Chapter 3: Monitoring MFA-Server and Office365 MFA status

We use both Azure MFA Server to secure our on-site resources, and Office365 MFA for our clients. To make sure we don’t have aggressors changing the MFA settings, or simply administrators forgetting to set up MFA for clients, we alert on both.

The issue with monitoring the MFA server is that it’s a product Microsoft bought later in its life. As such, it does not have a PowerShell module included, and it does not conform to the current Common Engineering Criteria.

To solve this, we load a DLL that exposes the functionality we need, and then get a list of all users (starting with A, until Z).

Add-Type -Path "C:\Program Files\Multi-Factor Authentication Server\pfsvcclientclr.DLL"
$problem = [PfSvcClientClr.ConstructResult]::miscError
$main = [PfSvcClientClr.PfSvcClient]::construct([PfSvcClientClr.ConstructTarget]::local, [ref] $problem)
$DLLModule = $main.GetType().GetMethod("getInterface").MakeGenericMethod([PfSvcClientClr.IPfMasterComposite]).Invoke($main, $null)
$users = $DLLModule.find_users_startsWith("a")
foreach ($username in $users) {
    if ($DLLModule.get_user_disabledBehavior("$username") -eq "succeed") { $UserOutput += "$($username) is set to succeed authentication without MFA`n" }
}
if (!$UserOutput) { $UserOutput = "Healthy" }

Next, we will monitor multi-factor authentication on the Office365 side. For this we will use the MSOL module and the partner credentials you’ve generated using this blog. The script can be used either to get information for all clients, or for a single client. To demonstrate, I’ve added both:

All Clients script:

$ApplicationId       = "xxxxxx-xxxx-xxxx-xxxx-xxxxxxxx"
$ApplicationSecret   = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx="
$RefreshToken        = "VeryLongRefreshToken"
$Tenantname          = "YourTenant.onmicrosoft.com"

$PasswordToSecureString = $ApplicationSecret  | ConvertTo-SecureString -asPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId,$PasswordToSecureString)
$aadGraphToken = New-PartnerAccessToken -RefreshToken $refreshToken -Resource https://graph.windows.net -Credential $credential -TenantId $Tenantname
$graphToken =  New-PartnerAccessToken -RefreshToken $refreshToken -Resource https://graph.microsoft.com -Credential $credential -TenantId $Tenantname

Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
$AllUSers = Get-MsolPartnerContract |  Get-MsolUser -all | select DisplayName,UserPrincipalName,@{N="MFA Status"; E={ 
if( $_.StrongAuthenticationRequirements.State -ne $null) {$_.StrongAuthenticationRequirements.State} else { "Disabled"}}}
foreach ($User in $AllUsers | Where-Object { $_.'MFA Status' -eq "Disabled" }) {
    $DisabledUsers += "$($User.UserPrincipalName) has MFA disabled`n"
}
if(!$DisabledUsers){ $DisabledUsers = "Healthy"}

Single client script:

$ApplicationId       = "xxxx-xxxxx-xxxxx-xxxxx-xxxxxx"
$ApplicationSecret   = "xxxxxxxxxxxxxxxxxxxxxxxxxxxx="
$RefreshToken        = "VeryLongStringHere"
$Tenantname          = "YourTenant.onmicrosoft.com"
$ClientTenantName    = "TheClientsTenant.onmicrosoft.com"
$PasswordToSecureString = $ApplicationSecret  | ConvertTo-SecureString -asPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId,$PasswordToSecureString)
$aadGraphToken = New-PartnerAccessToken -RefreshToken $refreshToken -Resource https://graph.windows.net -Credential $credential -TenantId $Tenantname
$graphToken =  New-PartnerAccessToken -RefreshToken $refreshToken -Resource https://graph.microsoft.com -Credential $credential -TenantId $Tenantname

Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
$AllUSers = Get-MsolPartnerContract -DomainName $ClientTenantName |  Get-MsolUser -all | select DisplayName,UserPrincipalName,@{N="MFA Status"; E={ 
if( $_.StrongAuthenticationRequirements.State -ne $null) {$_.StrongAuthenticationRequirements.State} else { "Disabled"}}}
foreach ($User in $AllUsers | Where-Object { $_.'MFA Status' -eq "Disabled" }) {
    $DisabledUsers += "$($User.UserPrincipalName) has MFA disabled`n"
}
if(!$DisabledUsers){ $DisabledUsers = "Healthy"}

Monitoring with PowerShell Chapter 3: Monitoring creation of scheduled tasks

Hi All,

After a blog post from Malwarebytes (here) about specific adware and cryptolockers using scheduled tasks to remain undetected, or even to regain control of the system by running a specific task every once in a while, we’ve decided to start monitoring the creation of scheduled tasks. Users generally don’t set up these tasks themselves in normal situations.

We’ve decided to start alerting on any task created in the last 24 hours. Our N-Central monitoring system creates a ticket for this so we can investigate. The great trick here is that the set automatically resets itself after 24 hours, unless another task is created.

The monitoring set

OK, so let’s get started! The ingredients we will need today are:

  • Windows 8.1 or higher
  • PowerShell
  • A monitoring system, or way to run the scripts.

First, we’ll try to get all tasks on our system using the Get-ScheduledTask cmdlet.

Get-ScheduledTask

Now we have a cool list of tasks, but hey, there is no date information returned by this cmdlet?! That’s no fun. It seems like Microsoft had a little oversight in this case. Still, it’s not really a problem for us; Microsoft always stores the XML files for the scheduled tasks in the same location, namely %WINDIR%\System32\Tasks.

Instead of messing with extra modules, or even trying to get date information out of the default cmdlets, we’ll move on to using Get-ChildItem with a filter of 24 hours. Let’s start by grabbing our task path and setting it to a variable for ease of use, and then get the list of tasks created today.

$Taskpath = "$ENV:windir\system32\Tasks"
$TasksToday = Get-ChildItem -Path $TaskPath -Recurse | Where-Object { $_.CreationTime -gt (Get-Date).AddDays(-1) }

OK, great! Now we already have a list of all files created today. We could stop right now, set this up in our monitoring system, and be done with it. The only issue I have with this is that I’d like to know the actual task settings, so let’s try getting those too.

We start looping through all the files, loading them as XML objects and getting the information we care about. In our case we only want to know the name, the author, and what command the task would execute. We put that information in $TaskState. If $TaskState remains blank during the entire script, we simply output “Healthy”.

$Taskpath = "$ENV:windir\system32\Tasks"
$TasksToday = Get-ChildItem -Path $TaskPath -Recurse | Where-Object { $_.CreationTime -gt (Get-Date).AddDays(-1) }

foreach ($task in $TasksToday) {
    [xml]$TaskXML = Get-Content $task.FullName
    $TaskName = $($TaskXML.Task.RegistrationInfo.URI)
    $TaskAuthor = $TaskXML.Task.Principals.Principal.UserId
    $TaskExec = $TaskXML.Task.Actions.Exec.Command + $TaskXML.Task.Actions.Exec.Arguments
    $TaskState += "$TaskName has been created by SID: $TaskAuthor and executes $TaskExec`n"
}

if (!$TaskState) { $TaskState = "Healthy" }

Based on this information, we load this into our monitoring system and alert if $TaskState is anything but “Healthy”. And that’s it: we’re now monitoring the scheduled tasks with PowerShell. Happy PowerShelling!

Update: a reader reported that he had some tasks he did not have permissions on, which made the script fail. My RMM system runs every PowerShell command as SYSTEM, so this wasn’t noticed during testing. To solve this issue you can run the script below instead; it forces ownership to be taken over by the current account.

$Taskpath = "$ENV:windir\system32\Tasks"
takeown /a /r /d Y /f "$TaskPath"
$TasksToday = Get-ChildItem -Path $TaskPath -Recurse | Where-Object { $_.CreationTime -gt (Get-Date).AddDays(-1) }

foreach ($task in $TasksToday) {
    [xml]$TaskXML = Get-Content $task.FullName
    $TaskName = $($TaskXML.Task.RegistrationInfo.URI)
    $TaskAuthor = $TaskXML.Task.Principals.Principal.UserId
    $TaskExec = $TaskXML.Task.Actions.Exec.Command + $TaskXML.Task.Actions.Exec.Arguments
    $TaskState += "$TaskName has been created by SID: $TaskAuthor and executes $TaskExec`n"
}

if (!$TaskState) { $TaskState = "Healthy" }

Using the Secure App Model to connect to Microsoft partner resources

This is a quick and dirty blog, as I am quite busy editing our PowerShell functions to use the Secure App Model before the deadline of August 1st.

Microsoft has announced that every Microsoft partner will need to start enforcing MFA on all accounts. This means you can no longer have an exclusion for accounts that run PowerShell scripts, or even use whitelisting. To learn more about this, I’d suggest checking out the Microsoft announcement here.

Because I know a lot of MSPs and other service providers will see this as a major issue, I’ve hacked together a couple of quick scripts that can help you start using the new authentication model as easily as you currently use credentials.

By running the script below, you will get the three parts of the new credential object: the ClientId, the ClientSecret, and the refresh token. You can use these to log into any of the Secure App Model endpoints. Save the script below and run it as follows:

Create-AzureADApplication.ps1 -ConfigurePreconsent -DisplayName "Partner Center Web App"

This script is originally from the Microsoft support center for partners which you can find here. I just combined the scripts to make sure you only have to enter your credentials twice: once to log into Azure and create the app, and the second time to log in with a user that will be used as the API service account.

<#
    .SYNOPSIS
        This script will create the required Azure AD application.
    .EXAMPLE
        .\Create-AzureADApplication.ps1 -ConfigurePreconsent -DisplayName "Partner Center Web App"

        .\Create-AzureADApplication.ps1 -ConfigurePreconsent -DisplayName "Partner Center Web App" -TenantId eb210c1e-b697-4c06-b4e3-8b104c226b9a

        .\Create-AzureADApplication.ps1 -ConfigurePreconsent -DisplayName "Partner Center Web App" -TenantId tenant01.onmicrosoft.com
    .PARAMETER ConfigurePreconsent
        Flag indicating whether or not the Azure AD application should be configured for preconsent.
    .PARAMETER DisplayName
        Display name for the Azure AD application that will be created.
    .PARAMETER TenantId
        [OPTIONAL] The domain or tenant identifier for the Azure AD tenant that should be utilized to create the various resources.
#>

Param
(
    [Parameter(Mandatory = $true)]
    [switch]$ConfigurePreconsent,
    [Parameter(Mandatory = $true)]
    [string]$DisplayName,
    [Parameter(Mandatory = $false)]
    [string]$TenantId
)

$ErrorActionPreference = "Stop"

# Check if the Azure AD PowerShell module has already been loaded.
if ( ! ( Get-Module AzureAD ) ) {
    # Check if the Azure AD PowerShell module is installed.
    if ( Get-Module -ListAvailable -Name AzureAD ) {
        # The Azure AD PowerShell module is not loaded but it is installed. This module
        # must be loaded for other operations performed by this script.
        Write-Host -ForegroundColor Green "Loading the Azure AD PowerShell module..."
        Import-Module AzureAD
    } else {
        Install-Module AzureAD
    }
}

try {
    Write-Host -ForegroundColor Green "When prompted please enter the appropriate credentials..."

    if([string]::IsNullOrEmpty($TenantId)) {
        Connect-AzureAD | Out-Null

        $TenantId = $(Get-AzureADTenantDetail).ObjectId
    } else {
        Connect-AzureAD -TenantId $TenantId | Out-Null
    }
} catch [Microsoft.Azure.Common.Authentication.AadAuthenticationCanceledException] {
    # The authentication attempt was canceled by the end-user. Execution of the script should be halted.
    Write-Host -ForegroundColor Yellow "The authentication attempt was canceled. Execution of the script will be halted..."
    Exit
} catch {
    # An unexpected error has occurred. The end-user should be notified so that the appropriate action can be taken.
    Write-Error "An unexpected error has occurred. Please review the following error message and try again." `
        "$($Error[0].Exception)"
}

$adAppAccess = [Microsoft.Open.AzureAD.Model.RequiredResourceAccess]@{
    ResourceAppId = "00000002-0000-0000-c000-000000000000";
    ResourceAccess =
    [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
        Id = "5778995a-e1bf-45b8-affa-663a9f3f4d04";
        Type = "Role"},
    [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
        Id = "a42657d6-7f20-40e3-b6f0-cee03008a62a";
        Type = "Scope"},
    [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
        Id = "311a71cc-e848-46a1-bdf8-97ff7156d8e6";
        Type = "Scope"}
}

$graphAppAccess = [Microsoft.Open.AzureAD.Model.RequiredResourceAccess]@{
    ResourceAppId = "00000003-0000-0000-c000-000000000000";
    ResourceAccess =
        [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
            Id = "bf394140-e372-4bf9-a898-299cfc7564e5";
            Type = "Role"},
        [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
            Id = "7ab1d382-f21e-4acd-a863-ba3e13f7da61";
            Type = "Role"}
}

$partnerCenterAppAccess = [Microsoft.Open.AzureAD.Model.RequiredResourceAccess]@{
    ResourceAppId = "fa3d9a0c-3fb0-42cc-9193-47c7ecd2edbd";
    ResourceAccess =
        [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
            Id = "1cebfa2a-fb4d-419e-b5f9-839b4383e05a";
            Type = "Scope"}
}

$SessionInfo = Get-AzureADCurrentSessionInfo

Write-Host -ForegroundColor Green "Creating the Azure AD application and related resources..."

$app = New-AzureADApplication -AvailableToOtherTenants $true -DisplayName $DisplayName -IdentifierUris "https://$($SessionInfo.TenantDomain)/$((New-Guid).ToString())" -RequiredResourceAccess $adAppAccess, $graphAppAccess, $partnerCenterAppAccess -ReplyUrls @("urn:ietf:wg:oauth:2.0:oob")
$password = New-AzureADApplicationPasswordCredential -ObjectId $app.ObjectId
$spn = New-AzureADServicePrincipal -AppId $app.AppId -DisplayName $DisplayName

if($ConfigurePreconsent) {
    $adminAgentsGroup = Get-AzureADGroup -Filter "DisplayName eq 'AdminAgents'"
    Add-AzureADGroupMember -ObjectId $adminAgentsGroup.ObjectId -RefObjectId $spn.ObjectId
}
write-host "Installing PartnerCenter Module." -ForegroundColor Green
install-module PartnerCenter -Force

write-host "Please approve consent form." -ForegroundColor Green
$PasswordToSecureString = $password.value | ConvertTo-SecureString -asPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($($app.AppId),$PasswordToSecureString)
$TenantNameOrID = read-host "Please enter your tenantID or tenantname(e.g. Tenant.onmicrosoft.com)"
$token = New-PartnerAccessToken -Consent -Credential $credential -Resource https://api.partnercenter.microsoft.com -TenantId $TenantNameOrID

Write-Host "================ Secrets ================"
Write-Host "ApplicationId       = $($app.AppId)"
Write-Host "ApplicationSecret   = $($password.Value)"
write-host "RefreshToken        = $($token.refreshtoken)"
Write-Host "================ Secrets ================"
Write-Host "    SAVE THESE IN A SECURE LOCATION     "

To use the tokens in your scripts, use the Application ID as the username, the Application secret as the password, and the RefreshToken as the refreshtoken value requested by the modules.

$credential = Get-Credential
$refreshToken = 'Your-Refresh-Token-Value'

$aadGraphToken = New-PartnerAccessToken -RefreshToken $refreshToken -Resource https://graph.windows.net -Credential $credential
$graphToken =  New-PartnerAccessToken -RefreshToken $refreshToken -Resource https://graph.microsoft.com -Credential $credential

Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken

Update

There has been a small update: Microsoft has advised the following workaround for the Microsoft Exchange module. This workaround creates a user that is excluded from all baseline policies, as long as no interactive logon is performed with the account. For more information check this url

Import-Module AzureAD
Connect-AzureAD
$PasswordProfile = New-Object -TypeName Microsoft.Open.AzureAD.Model.PasswordProfile
$PasswordProfile.Password = "Password"
$PasswordProfile.ForceChangePasswordNextLogin = $false
$user = New-AzureADUser -DisplayName "New User" -PasswordProfile $PasswordProfile -UserPrincipalName "NewUser@contoso.com" -AccountEnabled $true
$adminAgentsGroup = Get-AzureADGroup -Filter "DisplayName eq 'AdminAgents'"
Add-AzureADGroupMember -ObjectId $adminAgentsGroup.ObjectId -RefObjectId $user.ObjectId

Please note that this is a workaround that should only be used for Exchange Access.

Getting all alarms of all Unifi sites with PowerShell

For reporting purposes I sometimes like collecting all the alert logs for my Ubiquiti Unifi sites. Of course, I want this in a nice readable format, so I create an HTML file with all output from the Unifi API.

First we connect to the Unifi controller and create a web session. This web session is used to authenticate us to the Unifi controller.

param(
[string]$URL = 'yourcontroller.controller.tld',
[string]$port = '8443',
[string]$User = 'APIUSER',
[string]$Pass = 'SomeReallyLongPassword'
)
# Force TLS 1.2 and accept the controller's (often self-signed) certificate
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
[string]$controller = "https://$($URL):$($port)"
[string]$credential = "`{`"username`":`"$User`",`"password`":`"$Pass`"`}"
try {
$null = Invoke-Restmethod -Uri "$controller/api/login" -method post -body $credential -ContentType "application/json; charset=utf-8"  -SessionVariable myWebSession
}catch{
$APIerror = "Api Connection Error: $($_.Exception.Message)"
}

Now that we’re connected, we can start making API calls. Our first call collects all sites, so we will be able to loop through them. When doing this, we reuse the data from our previous web session in $myWebSession.

try {
    $SiteList = Invoke-Restmethod -Uri "$controller/api/self/sites" -WebSession $myWebSession
    }catch{
    $APIerror = "Query Failed: $($_.Exception.Message)"
    }

$SiteList.data is now filled with the list of all sites on the controller. The next step will be to process this and gather all the alerts from these sites.

foreach($Site in $SiteList.data) {
$SiteID = $($Site.Name)
$AlertsForSite = Invoke-Restmethod -Uri "$controller/api/s/$($siteID)/stat/alarm" -WebSession $myWebSession 
$AlarmsArray += $AlertsForSite.data | convertto-html -Title "Alarm Report Unifi" -PreContent "<h1>$($site.desc)</h1>" -PostContent "<h5><i>$(get-date)</i></h5>" 
}

Now all we have to do is export $AlarmsArray to an HTML file, which gives us the report in a decent-looking, editable HTML format:

$AlarmsArray | Out-File "C:\temp\AlarmOutput.html"
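Because the try/catch blocks above only store a message in $APIerror, it's easy to write an empty report without noticing a failed call. A small guard around the Out-File line makes that visible; a minimal sketch:

```powershell
# Sketch: surface API errors instead of silently writing an empty report
if ($APIerror) {
    Write-Host $APIerror -ForegroundColor Red
}
else {
    $AlarmsArray | Out-File "C:\temp\AlarmOutput.html"
}
```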

The complete script is included for reference below, or just for easy copy and pasting 😉

<code>param(
[string]$URL = 'yourcontroller.controller.tld',
[string]$port = '8443',
[string]$User = 'APIUSER',
[string]$Pass = 'APIUSERPASS'
)
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
[string]$controller = "https://$($URL):$($port)"
[string]$credential = "`{`"username`":`"$User`",`"password`":`"$Pass`"`}"
try {
$null = Invoke-Restmethod -Uri "$controller/api/login" -method post -body $credential -ContentType "application/json; charset=utf-8"  -SessionVariable myWebSession
}catch{
$APIerror = "Api Connection Error: $($_.Exception.Message)"
}

try {
    $SiteList = Invoke-Restmethod -Uri "$controller/api/self/sites" -WebSession $myWebSession
    }catch{
    $APIerror = "Query Failed: $($_.Exception.Message)"
    }

foreach($Site in $SiteList.data) {
$SiteID = $($Site.Name)
$AlertsForSite = Invoke-Restmethod -Uri "$controller/api/s/$($siteID)/stat/alarm" -WebSession $myWebSession 
$AlarmsArray += $AlertsForSite.data | convertto-html -Title "Alarm Report Unifi" -PreContent "<h1>$($site.desc)</h1>" -PostContent "<h5><i>$(get-date)</i></h5>"
}

$AlarmsArray | Out-File "C:\temp\AlarmOutput.html"</code>

And tada! That's it. As always, be careful and happy PowerShelling!

Deploy MFA to all Administrator accounts in all (Partner) tenants

As an MSP we tend to take over a lot of Microsoft tenants that do not have their security in order. To make sure that we always use MFA for administration purposes, we have an Azure Function running that deploys MFA for our administrator accounts, using our central phone number. That way MFA is always configured, and we notice when an admin is trying to log in. We tend to use our delegated access for normal administration, but some tasks require a local admin account in the customer’s portal.

We currently have the following running as an Azure Function. I’ve edited the script so that anyone can run it directly. If you need help converting this to an Azure Function, please see my other blog here.

The great thing about this script is that we completely negate the need for our helpdesk to configure MFA for the admin accounts. The script makes sure that MFA is completely prepared for them all.

Script breakdown

In the first part of the script we set our phone numbers, connect to the MSOLService, and get all the Microsoft Partner contracts we have permissions on:

$MobileNumber = "+31611111111"
$AltNumber = "+31010101010"
Connect-MsolService
$ClientList = get-msolpartnerContract -All

Now that $ClientList contains all our clients, we loop through them to get all of our admin accounts.

foreach($Client in $ClientList){
$adminemails = Get-MsolRoleMember -TenantId $Client.tenantid -RoleObjectId (Get-MsolRole -RoleName "Company Administrator").ObjectId
$admins = $adminemails | get-msoluser -TenantId $client.tenantid

Now that we have our admin accounts in $Admins, the next step is to create two objects: one for the Strong Authentication Requirement and another for the Strong Authentication Method.

$AuthReq = New-Object -TypeName Microsoft.Online.Administration.StrongAuthenticationRequirement
$AuthReq.RelyingParty = "*"
$AuthReq.State = "Enabled"
$AuthReq.RememberDevicesNotIssuedBefore = (Get-Date)
$AuthMethod = New-Object -TypeName Microsoft.Online.Administration.StrongAuthenticationMethod
$AuthMethod.MethodType = "TwoWayVoiceMobile"
$AuthMethod.IsDefault = $true
$AuthMethodObj = @($AuthMethod)

This makes sure that the user is completely pre-provisioned and does not need to go through the entire process of activating MFA first, which saves us valuable time. To finalize, we run a foreach loop which enables Multi-Factor Authentication for all found admin accounts. It also sets the phone numbers to the ones we entered as variables above.

foreach ($admin in $admins | Where-Object {$_.StrongAuthenticationMethods -eq $Null}) {
        Write-Host "Enabling MFA for $($admin.userprincipalname)" -ForegroundColor Green
        Set-MsolUser -UserPrincipalName $admin.userprincipalname -StrongAuthenticationRequirements $AuthReq -StrongAuthenticationMethods $AuthMethodObj -TenantId $client.tenantid  -MobilePhone $MobileNumber -AlternateMobilePhones $AltNumber
    }
}
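To spot-check a tenant after the script has run, you can list its admin accounts and whether they now have authentication methods registered. A minimal sketch using the same MSOnline cmdlets as above (the tenant ID is a placeholder):

```powershell
# Sketch: verify which admin accounts have MFA methods after provisioning
$TenantId = '00000000-0000-0000-0000-000000000000' # placeholder, use a real tenant ID
$RoleId = (Get-MsolRole -RoleName "Company Administrator").ObjectId
Get-MsolRoleMember -TenantId $TenantId -RoleObjectId $RoleId |
    Get-MsolUser -TenantId $TenantId |
    Select-Object UserPrincipalName,
        @{ Name = 'HasMFA'; Expression = { $_.StrongAuthenticationMethods.Count -gt 0 } }
```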

And that’s it! As a result, the client’s Secure Score will increase by about 50 points. Awesome! The complete script looks like this:

Final Script

$MobileNumber = "+31611111111"
$AltNumber = "+31010101010"
Connect-MsolService
$ClientList = get-msolpartnerContract -All

foreach($Client in $ClientList){
$adminemails = Get-MsolRoleMember -TenantId $Client.tenantid -RoleObjectId (Get-MsolRole -RoleName "Company Administrator").ObjectId
$admins = $adminemails | get-msoluser -TenantId $client.tenantid
$AuthReq = New-Object -TypeName Microsoft.Online.Administration.StrongAuthenticationRequirement
$AuthReq.RelyingParty = "*"
$AuthReq.State = "Enabled"
$AuthReq.RememberDevicesNotIssuedBefore = (Get-Date)
$AuthMethod = New-Object -TypeName Microsoft.Online.Administration.StrongAuthenticationMethod
$AuthMethod.MethodType = "TwoWayVoiceMobile"
$AuthMethod.IsDefault = $true
$AuthMethodObj = @($AuthMethod)
foreach ($admin in $admins | Where-Object {$_.StrongAuthenticationMethods -eq $Null}) {
        Write-Host "Enabling MFA for $($admin.userprincipalname)" -ForegroundColor Green
        Set-MsolUser -UserPrincipalName $admin.userprincipalname -StrongAuthenticationRequirements $AuthReq -StrongAuthenticationMethods $AuthMethodObj -TenantId $client.tenantid  -MobilePhone $MobileNumber -AlternateMobilePhones $AltNumber
    }
}

Tada! As always, be careful, and happy PowerShelling!