The complete script can be found here: get-exolegacysignins.ps1, but if you want to know what's happening inside it, check the detailed walkthrough below.
Graph API authentication stuff
I'll be using the Active Directory Authentication Library (ADAL.NET) for authentication. The DLLs can be found in the AzureRM or AzureAD PowerShell modules.
# to make code more readable and rows shorter,
# ADAL namespace is introduced (must be the first line of the script)
using namespace Microsoft.IdentityModel.Clients.ActiveDirectory
# Binding AAD dlls, AzureADPreview works as well
$AADModule = Get-Module -Name "AzureAD" -ListAvailable
[System.Reflection.Assembly]::LoadFrom($(Join-Path $AADModule.ModuleBase "Microsoft.IdentityModel.Clients.ActiveDirectory.dll")) | out-null
[System.Reflection.Assembly]::LoadFrom($(Join-Path $AADModule.ModuleBase "Microsoft.IdentityModel.Clients.ActiveDirectory.Platform.dll")) | out-null
Then we need an Azure application registration for Graph API access. Luckily there's a global one that can be used with delegated permissions: the multi-tenant "Azure AD PowerShell" ClientID 1b730954-1685-4b74-9bfd-dac224a7b894 includes delegated permissions for:
- AuditLog.Read.All
- Directory.AccessAsUser.All
- Directory.ReadWrite.All
- Group.ReadWrite.All
$ClientID = "1b730954-1685-4b74-9bfd-dac224a7b894"
#Parameters for Graph API
$resourceURI = "https://graph.microsoft.com"
$authority = "https://login.microsoftonline.com/common"
The Azure app we're using has delegated permissions, so we'll have to pass user credentials to get the token for Graph API access. Reading AAD sign-in logs requires admin permissions, e.g. the Security Reader or Global Reader role.
$userid = read-host "Enter user name (upn)"
$securepwd = read-host "Enter password for $userid" -AsSecureString
$uc = new-object UserPasswordCredential -ArgumentList $userid, $securepwd
Then we can get the token using PowerShell ClientID and previously entered user credentials. Let's authenticate!
$authContext = New-Object AuthenticationContext -ArgumentList $authority
$authResponse = [AuthenticationContextIntegratedAuthExtensions]::AcquireTokenAsync($authContext, $resourceURI, $ClientID, $uc)
$authResult = $authResponse.result
Write-Debug "Auth status: $($authResponse.Status)"
Write-Debug "Auth access token type: $($authResult.AccessTokenType)"
After getting the token, we can build the headers for Graph API requests.
$headers = @{}
$headers.Add('Authorization','Bearer ' + $authResult.AccessToken)
$headers.Add('Content-Type', "application/json")
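Before firing off the heavy sign-in queries, it's worth verifying that the token actually works. A quick call to a lightweight endpoint such as /v1.0/me does the trick (this sanity check is my own addition, not part of the original script):

```powershell
# sanity check: if the token is valid, this returns the signed-in user's profile
$me = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/me" -Headers $headers
Write-Output "Authenticated as $($me.userPrincipalName)"
```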
API Request filters
We only want sign-in events that used basic authentication. Here's a list of all the "clientAppUsed" values we're interested in, i.e. the Exchange Online basic auth client apps.
$legacyClients = @(
"Other clients",
"Exchange Web Services",
"MAPI Over HTTP",
"POP3",
"Outlook Anywhere (RPC over HTTP)",
"IMAP4",
"AutoDiscover",
"Offline Address Book",
"Authenticated SMTP",
"Exchange Online PowerShell",
"Exchange ActiveSync"
)
There are probably thousands of sign-in events, so they should be fetched in smaller chunks. In this script one request pulls 8 hours of events per query URL. If you're querying a very large tenant, you might want to reduce the time frame to an even smaller window. Here I'm getting all possible events, i.e. fetching all legacy sign-ins from the Azure AD logs for the last 30 days. NOTE: query filtering is very sensitive to the datetime format; dates must look like "2020-02-05T14:01:02Z".
$reportingStartDate = (Get-Date).ToUniversalTime().Date.AddDays(-30)
$reportingEndDate = (Get-Date).ToUniversalTime().Date
$timespanMinutes = 480 #8h
We're getting closer to looping it all together. Here we initialize the start time for filtering events by the createdDateTime property.
# set initial "nextstarttime" for web request
$nextstarttime = $reportingStartDate
Do { ...
Finally, we're inside the main loop that keeps going until all time blocks are finished, 8 hours at a time. First thing in the outer do-while loop, we initialize the datetime query filters. Also note that the CSV export file path is defined in this section, which generates one CSV for each day. Of course, by changing the CSV naming you can export everything to a single file. Events are appended to this file later on.
# get log entries for a specified time span
$fromtime = $nextstarttime # set "from" as the ending datetime from the previous time window
$totime = $fromtime.AddMinutes($timespanMinutes)
# ensure totime stays within the specified timeframe; if it passes the end, clamp it to the report end datetime
if ($totime -gt $reportingEndDate) { $totime = $reportingEndDate }
# NOTE: filtering is very sensitive to datetime format, date must be like: 2020-02-05T14:01:02Z
# convert to "sortable" format
$from = $($fromtime.ToString("s")) + "Z"
$to = $($totime.ToString("s")) + "Z"
Write-Debug "Request from $from"
Write-Debug "Request to $to"
#set start for the next round in do-while
$nextstarttime = $totime
# generate only one log file for each day, append results to AAD_LegacySignInReport_yyyyMMdd.csv
$now = "{0:yyyyMMdd}" -f $fromtime
$outputFile = ".\AAD_LegacySignInReport_$now.csv"
Next step: generating the query URLs. We generate a separate query URL for each client app in "legacyClients" for the given 8-hour time frame. So, lots of queries happening soon.
# generate request URLs for each legacy client type, using given time range
$urls = @()
$legacyClients | ForEach-Object {
$urls += "https://graph.microsoft.com/beta/auditLogs/signIns?"+`
"`$filter="+`
"createdDateTime%20ge%20" + $from + "%20"+`
"and%20createdDateTime%20le%20" + $to + "%20"+`
"and%20clientAppUsed%20eq%20'" + $_ +`
"'&`$top=1000"
}
foreach ($url in $urls) { ...
Pulling sign-in events from Graph API
Finally, we're getting something out of the logs. Another do-while loop fetches all rows for a single query, filtered by:
- "createdDateTime" - 8 hours at a time
- "clientAppUsed" - a value within the defined client apps
The JSON payload is converted to a PSObject collection, and the formatted results are exported to the CSV file.
If a query returns more than the 1000 rows requested by $top in the URL, the next query URL is read from the "@odata.nextLink" property. Queries continue in the do-while loop until nextLink is empty.
Do {
Write-Output "Requesting sign-ins: $url"
Try {
# get data and convert json payload
$myReport = (Invoke-WebRequest -UseBasicParsing -Headers $headers -Uri $url)
$results = ($myReport.Content | ConvertFrom-Json).value
# export sign-in data to CSV
$results | `
select `
createdDateTime,
userDisplayName,userPrincipalName,userId,`
appId,appDisplayName,`
ipAddress,`
clientAppUsed,`
userAgent,`
conditionalAccessStatus,`
isInteractive,`
resourceDisplayName,resourceId,`
mfaDetail,`
@{Name='status.errorCode'; Expression={$_.status.errorCode}},`
@{Name='status.failureReason'; Expression={$_.status.failureReason}},`
@{Name='status.additionalDetails'; Expression={$_.status.additionalDetails}},`
@{Name='location.countryOrRegion'; Expression={$_.location.countryOrRegion}},`
@{Name='location.city'; Expression={$_.location.city}},`
@{Name='device.deviceId'; Expression={$_.deviceDetail.deviceId}},`
@{Name='device.displayName'; Expression={$_.DeviceDetail.displayName}},`
@{Name='device.operatingSystem'; Expression={$_.DeviceDetail.operatingSystem}},`
@{Name='device.browser'; Expression={$_.DeviceDetail.browser}},`
@{Name='device.isCompliant'; Expression={$_.DeviceDetail.isCompliant}},`
@{Name='device.isManaged'; Expression={$_.DeviceDetail.isManaged}},`
@{Name='device.trustType'; Expression={$_.DeviceDetail.trustType}}`
| Export-Csv $outputFile -Append -NoTypeInformation -Encoding UTF8
# if request returns over 1000 events, next query url is returned
$url = ($myReport.Content | ConvertFrom-Json).'@odata.nextLink'
$count = $count+$results.Count
Write-Output "$count events returned"
} Catch { # retry logic related stuff here ... }
} while($url -ne $null)
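The retry logic inside the Catch block is elided above. One possible sketch, assuming Graph API throttling (HTTP 429 with a Retry-After header), could look like this; treat it as a starting point rather than production-grade error handling:

```powershell
Catch {
    $response = $_.Exception.Response
    if ($response -and [int]$response.StatusCode -eq 429) {
        # throttled by Graph API: honor Retry-After, default to 30 seconds
        $wait = $response.Headers["Retry-After"]
        if (-not $wait) { $wait = 30 }
        Write-Output "Throttled, sleeping $wait seconds before retrying..."
        Start-Sleep -Seconds $wait
        # $url is left unchanged, so the same request is retried on the next pass
    } else {
        throw # unexpected error, let the script fail
    }
}
```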
And that's it!
Just import the CSVs into Excel or Power BI and start analysing :)
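If you'd rather stay in PowerShell, a quick summary over the generated CSVs works too. A minimal sketch, assuming the file naming used by the script above:

```powershell
# count sign-ins per legacy protocol and user, most active first
Get-ChildItem ".\AAD_LegacySignInReport_*.csv" |
    ForEach-Object { Import-Csv $_ } |
    Group-Object clientAppUsed, userPrincipalName |
    Sort-Object Count -Descending |
    Select-Object Count, Name |
    Format-Table -AutoSize
```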