
Automating with PowerShell: Storing Office 365 audit logs longer than 90 days

A friend of mine recently ran into an issue: his client wanted to know when a specific user had logged on for the last time. The problem was that he did not have the unified audit log enabled, and even if he had, the time span would have been too long. We got lucky and could answer the question using just the mailbox audit log for the past few days, so he could help his client. He was still worried about the Unified Audit Log only keeping 90 days of history, though.

I told him about a couple of SIEM products, including Azure Sentinel, which can ingest logs from Office 365. The problem is that SIEM tooling requires a lot of setup and a team to keep everything running smoothly, and on top of that most SIEMs aren't really focused on the MSP workflow. It was also a bit of overkill for what he wanted.

So I first helped him set up the unified audit log for all his clients, and afterwards shared a script with him to store the audit logs in a readable format, as both a .CSV file and an HTML representation.

You can run this script in an Azure Runbook, in an Azure Function, or simply as a scheduled task on a secure workstation.
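If you go the scheduled-task route, a minimal sketch could look like this; the script path, task name, and schedule are assumptions you should adjust to your own environment:

# Hypothetical example: run the export script every day at 03:00.
$Action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File "C:\Scripts\Get-AuditLogs.ps1"'
$Trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'O365 Audit Log Export' -Action $Action -Trigger $Trigger -RunLevel Highest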

The script downloads the information over the previous 1.2 days. We do this to cover the gap just in case the script runs a bit longer than expected.
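As a quick sanity check on that window, here is a small sketch (not part of the script itself) showing how much overlap a daily schedule leaves:

# 1.2 days is 28.8 hours of data per run, so a run every 24 hours keeps
# roughly 4.8 hours of overlap between consecutive runs.
$window  = [TimeSpan]::FromDays(1.2)
$overlap = $window - [TimeSpan]::FromDays(1)
"Window: {0:N1} hours, overlap: {1:N1} hours" -f $window.TotalHours, $overlap.TotalHours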

The Script

As always, we’re using the Secure Application model. The script creates the folder structure for you.
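A note on the module path in the variables below: an Azure Function does not have the PartnerCenter and AzureAD modules available by default, so you upload them yourself and load them from that path. A minimal sketch, assuming the folder names under $ModulePath match the module names:

# Assumption: the PartnerCenter and AzureAD module folders were uploaded under $ModulePath.
Import-Module "$ModulePath\PartnerCenter" -ErrorAction Stop
Import-Module "$ModulePath\AzureAD" -ErrorAction Stop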

# Write to the Azure Functions log stream.
Write-Host "Start at $(get-date)"
##########################################
$ApplicationId = 'ApplicationID'
$ApplicationSecret = 'YoursecretySecret' | Convertto-SecureString -AsPlainText -Force
$TenantID = 'YourTenantID'
$RefreshToken = 'verylongtoken'
$ExchangeRefreshToken = 'anotherverylongtoken'
$UPN = "any-valid-partner-upn"
$outputfolder = "D:\home\site\wwwroot\reports"
$ModulePath = "D:\home\site\wwwroot\AuditLogRetrieval\Modules"
##########################################
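# Exchange the Secure Application Model refresh tokens for Azure AD Graph and Microsoft Graph
# access tokens, then sign in to Azure AD as the partner account to enumerate customer tenants.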
$credential = New-Object System.Management.Automation.PSCredential($ApplicationId, $ApplicationSecret)
$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID
Connect-AzureAD -AadAccessToken $aadGraphToken.AccessToken -AccountId $upn -MsAccessToken $graphToken.AccessToken
#Logged in. Moving on to creating folders and getting data.
$folderName = (Get-Date).tostring("yyyy-MM-dd")
New-item -Path $outputfolder -ItemType Directory -Name $folderName -Force
$customers = Get-AzureADContract -All:$true
foreach ($customer in $customers) {
    New-item "$($outputfolder)" -name $Customer.DisplayName -ItemType Directory -Force
    $token = New-PartnerAccessToken -ApplicationId 'a0c73c16-a7e3-4564-9a95-2bdf47383716'-RefreshToken $ExchangeRefreshToken -Scopes 'https://outlook.office365.com/.default' -Tenant $customer.CustomerContextId
    $tokenValue = ConvertTo-SecureString "Bearer $($token.AccessToken)" -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential($upn, $tokenValue)
    $customerId = $customer.DefaultDomainName
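    # Build a delegated Exchange Online session for this customer; the bearer token is used as
    # the credential password and converted from Basic to OAuth by the endpoint.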
    $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell-liveid?DelegatedOrg=$($customerId)&BasicAuthToOAuthConversion=true" -Credential $credential -Authentication Basic -AllowRedirection
    Import-PSSession $session -allowclobber -DisableNameChecking -CommandName "Search-unifiedAuditLog"

    $startDate = (Get-Date).AddDays(-1.2)
    $endDate = (Get-Date)
    $Logs = @()
    Write-Host "Retrieving logs for $($customer.displayname)" -ForegroundColor Blue
    do {
        $batch = Search-UnifiedAuditLog -SessionCommand ReturnLargeSet -SessionId $customer.displayname -ResultSize 5000 -StartDate $startDate -EndDate $endDate
        $Logs += $batch
        Write-Host "Retrieved $($logs.count) logs" -ForegroundColor Yellow
    } while ($batch.count -eq 5000)


    Write-Host "Finished Retrieving logs" -ForegroundColor Green
    $ObjLogs = foreach ($Log in $Logs) {
        $log.auditdata | convertfrom-json
    }


    $PreContent = @"

<H1> $($Customer.DisplayName) - Audit Log from $StartDate until $EndDate </H1><br>

<br> Please note that this log is not complete - it is a representation showing only the fields that are most commonly filtered on.<br>
To analyze the complete log for this day, please click here for the complete CSV log: <a href="$($folderName).csv">CSV Logbook</a>
<br/>
<br/>

<input type="text" id="myInput" onkeyup="myFunction()" placeholder="Search...">
"@
    $head = @"
<script>
function myFunction() {
    const filter = document.querySelector('#myInput').value.toUpperCase();
    const trs = document.querySelectorAll('table tr:not(.header)');
    trs.forEach(tr => tr.style.display = [...tr.children].find(td => td.innerHTML.toUpperCase().includes(filter)) ? '' : 'none');
  }</script>
<Title>Audit Log Report</Title>
<style>
body { background-color:#E5E4E2;
      font-family:Monospace;
      font-size:10pt; }
td, th { border:0px solid black;
        border-collapse:collapse;
        white-space:pre; }
th { color:white;
    background-color:black; }
table, tr, td, th {
     padding: 2px;
     margin: 0px;
     white-space:pre; }
tr:nth-child(odd) {background-color: lightgray}
table { width:95%;margin-left:5px; margin-bottom:20px; }
h2 {
font-family:Tahoma;
color:#6D7B8D;
}
.footer
{ color:green;
 margin-left:10px;
 font-family:Tahoma;
 font-size:8pt;
 font-style:italic;
}
#myInput {
  background-image: url('https://www.w3schools.com/css/searchicon.png'); /* Add a search icon to input */
  background-position: 10px 12px; /* Position the search icon */
  background-repeat: no-repeat; /* Do not repeat the icon image */
  width: 50%; /* Width of the search box */
  font-size: 16px; /* Increase font-size */
  padding: 12px 20px 12px 40px; /* Add some padding */
  border: 1px solid #ddd; /* Add a grey border */
  margin-bottom: 12px; /* Add some space below the input */
}
</style>
"@
    #$ObjLogs
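    # Export the raw log entries to CSV and a filtered, searchable HTML report for this customer.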
    $Logs | export-csv "$($outputfolder)\$($Customer.DisplayName)\$($FolderName).csv" -NoTypeInformation
    $ObjLogs | Select-object CreationTime, UserID, Operation, ResultStatus, ClientIP, Workload, ClientInfoString | Convertto-html -head $head -PreContent $PreContent | out-file "$($outputfolder)\$($Customer.DisplayName)\$($FolderName).html"

    write-host "Generating root HTML file with all dates" -ForegroundColor Green
    $Items = get-childitem "$($outputfolder)\$($Customer.DisplayName)" -Filter "*.html"
    $tableheader = "<table><tr><th>Date</th><th>HTML Report</th><th>CSV Report</th></tr>"
    $URLS = foreach ($item in $Items) {
        @"
        <tr><td>$($item.basename)</td><td><a href=`"$($item.basename).html`">HTML Report</a></td><td><a href=`"$($item.basename).CSV`">CSV Report</a></td></tr>
"@
    }

    $Head, $tableheader, $URLS | out-file "$($outputfolder)\$($Customer.DisplayName)\index.html"

}

write-host "Generating index file for all clients." -ForegroundColor Green
$tableheader = "<table><tr><th>Customer Name</th><th>Onmicrosoft domain</th><th>Reports</th></tr>"
$customerhtml = foreach ($customer in $customers) {
    @"
    <tr><td>$($customer.DisplayName)</td><td>$($customer.DefaultDomainName)</td><td><a href=`"$($customer.DisplayName)\index.html`">Reports</a></td></tr>
"@

}

$Head, $tableheader, $customerhtml | out-file "$($outputfolder)\index.html"
write-host "Ended at $(Get-date)"

And that's it! An easy way to store the log files for longer than 90 days without the investment of a SIEM. I'd still suggest looking into a SIEM for the long run, though. As always, Happy PowerShelling!

All blogs are posted under AGPL3.0 unless stated otherwise