Previous Post: 2025.08.10 - [Microsoft 365] - Microsoft 365 Log Management (1): Getting Started with Sentinel

In the previous post, we explored how to enable Microsoft Sentinel and start collecting Microsoft 365 logs.
This time, we’ll focus on integrating Microsoft Defender for Identity (MDI) logs into Sentinel and preparing them for Power BI visualization.


▶ Watch on YouTube: Microsoft 365 Log Management (2): Connecting MDI Logs to Sentinel and Power BI

 

 

Step 1. Verify MDI Activation

Navigation Path: System → Settings → Identities

 

Check Sensor Activation:
With the latest MDI v3, activation is much simpler: if your Domain Controller is already onboarded to Microsoft Defender for Endpoint (MDE), MDI can be enabled without additional steps.

(A separate post will cover the new version once it's officially released.)

 

Verify Signals:
Go to Advanced Hunting and confirm that IdentityLogonEvents are being recorded.

→ If signals appear here, the MDI sensor is working; once the connector below is enabled, Sentinel will receive the same logs.
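As a quick sanity check, you can paste a query along these lines into Advanced Hunting (the one-day look-back is just an example):

// Confirm MDI sign-in telemetry is flowing
IdentityLogonEvents
| where Timestamp > ago(1d)
| summarize Events = count() by LogonType
| order by Events desc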

 

Connector Setup:
In Sentinel, go to Data connectors → Microsoft Defender XDR → Open connector page.

 

 

→ Enable Microsoft Defender for Identity and save.

 

After a short delay, you should be able to query MDI logs in Sentinel.
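To verify on the Sentinel side, run a similar query under Logs; note that in Log Analytics the event time column is TimeGenerated rather than Timestamp:

// Count MDI logon events per hour over the last day
IdentityLogonEvents
| where TimeGenerated > ago(1d)
| summarize Events = count() by bin(TimeGenerated, 1h)
| order by TimeGenerated asc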

 


Step 2. Register an Enterprise App for Sentinel Log Export

Currently, Advanced Hunting and Sentinel have result-size limits when running large queries (the Log Analytics API returns at most around 500,000 rows per call).
Our ultimate goal is to visualize data in Power BI, so we’ll first store logs as CSV files in SharePoint.

 

To achieve this, we’ll use the Log Analytics API, which requires Enterprise App registration.

Registration Steps

1. Go to Entra Admin Center → App registrations → New registration

 

2. Name the app → Register

 

3. Navigate to API permissions → Add a permission

 

4. Select APIs my organization uses → Log Analytics API

 

5. Check Data.Read → Add permissions

 

6. Click Grant admin consent

 

7. Go to Certificates & secrets → New client secret → Add

 

8. Copy the generated Value and store it securely

 

9. In Log Analytics Workspaces → Access control (IAM), click Add role assignment

 

10. Assign Log Analytics Reader role

 

11. Grant the role to the newly created app
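Before moving on, you can sanity-check the registration with a minimal token request. This is just a sketch using the same endpoint and scope as the full script in Step 3; replace the placeholders with your own values:

# Minimal check: can the new app obtain a token for the Log Analytics API?
$body = @{
    client_id     = "<CLIENT_ID>"
    client_secret = "<CLIENT_SECRET>"
    grant_type    = "client_credentials"
    scope         = "https://api.loganalytics.io/.default"
}
$resp = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/<TENANT_ID>/oauth2/v2.0/token" -Body $body
if ($resp.access_token) { Write-Host "Token acquired - the app registration works." }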

 


Step 3. Export Logs to CSV

You'll need the following details for the script:

  • Tenant ID & Client ID (from the app registration)
  • Workspace ID (from the Log Analytics workspace)
  • Client Secret (the value copied earlier)

Once these values are ready, you can use a PowerShell script to call the Log Analytics API and export logs in chunks. I created the script below with the help of AI.

# === Authentication (Service Principal) ===
$TenantId = "<TENANT_ID>"
$ClientId = "<CLIENT_ID>"
$Secret   = "<CLIENT_SECRET>"  

# === Workspace ===
$WorkspaceId = "<WORKSPACE_ID>"

# === Extraction target / Period / Output ===
$Table        = "IdentityLogonEvents"
$StartUtc     = ([datetime]"2025-08-12T00:00:00Z").ToUniversalTime()  # keep the start in UTC for correct chunk boundaries
$EndUtc       = [datetime]::UtcNow
$ChunkHours   = 6  
$OutDir       = "F:\sentinel\IdentityLogonEvents"
$FilePrefix   = "IdentityLogonEvents"
$SkipExisting = $true

# === Interval / Retry / Timeout ===
$MinIntervalSeconds = 30 
$HttpTimeoutSeconds = 300
$MaxRetries         = 5
$BaseDelaySeconds   = 5

<# ======================= Utilities ======================= #>

# Create folder
New-Item -ItemType Directory -Force -Path $OutDir | Out-Null

# Token cache
$Script:TokenInfo = $null

function Get-LogAnalyticsToken {
    if ($Script:TokenInfo -and $Script:TokenInfo.ExpiresOn -gt (Get-Date).ToUniversalTime().AddMinutes(5)) {
        return $Script:TokenInfo.AccessToken
    }

    $body = @{
        client_id     = $ClientId
        client_secret = $Secret
        grant_type    = "client_credentials"
        scope         = "https://api.loganalytics.io/.default"
    }

    $tokenResponse = Invoke-RestMethod -Method Post `
        -Uri "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token" `
        -Body $body `
        -TimeoutSec $HttpTimeoutSeconds

    $Script:TokenInfo = [pscustomobject]@{
        AccessToken = $tokenResponse.access_token
        ExpiresOn   = (Get-Date).ToUniversalTime().AddSeconds([int]$tokenResponse.expires_in)
    }
    return $Script:TokenInfo.AccessToken
}

function Invoke-LAQuery {
    param(
        [Parameter(Mandatory=$true)] [string] $Kql,
        [Parameter(Mandatory=$true)] [string] $WorkspaceId
    )

    $attempt = 0
    while ($true) {
        $attempt++
        $token = Get-LogAnalyticsToken
        $headers = @{ Authorization = "Bearer $token" }
        $body    = @{ query = $Kql } | ConvertTo-Json

        try {
            return Invoke-RestMethod -Method Post `
                -Uri "https://api.loganalytics.azure.com/v1/workspaces/$WorkspaceId/query" `
                -Headers $headers -ContentType "application/json" `
                -Body $body -TimeoutSec $HttpTimeoutSeconds
        }
        catch {
            $status = $_.Exception.Response.StatusCode.value__
            $resp   = $null
            try { $resp = [System.IO.StreamReader]::new($_.Exception.Response.GetResponseStream()).ReadToEnd() } catch {}

            # 401: Refresh token
            if ($status -eq 401 -and $attempt -le $MaxRetries) {
                $Script:TokenInfo = $null
                Start-Sleep -Seconds ($BaseDelaySeconds * [math]::Pow(2, $attempt - 1))
                continue
            }

            # 429 or 5xx
            if (($status -eq 429 -or $status -ge 500) -and $attempt -le $MaxRetries) {
                $retryAfter = 0
                try { $retryAfter = [int]$_.Exception.Response.Headers["Retry-After"] } catch {}
                if ($retryAfter -le 0) {
                    $retryAfter = [int]($BaseDelaySeconds * [math]::Pow(2, $attempt - 1))
                }
                Write-Warning "Query throttled/failed (status $status). Retry in $retryAfter sec. Attempt $attempt/$MaxRetries"
                Start-Sleep -Seconds $retryAfter
                continue
            }

            throw "Log Analytics query failed (status $status): $resp"
        }
    }
}

function Convert-RowsToObjects {
    param(
        [Parameter(Mandatory=$true)] $ResultTable
    )
    $cols = $ResultTable.columns.name
    $rows = $ResultTable.rows | ForEach-Object {
        $o = [ordered]@{}
        for ($i=0; $i -lt $cols.Count; $i++) { $o[$cols[$i]] = $_[$i] }
        [pscustomobject]$o
    }

    foreach ($row in $rows) {
        foreach ($p in $row.PSObject.Properties) {
            $v = $p.Value
            if ($v -is [System.Collections.IDictionary] -or
                $v -is [System.Array] -or
                $v -is [PSCustomObject]) {
                $row.($p.Name) = ($v | ConvertTo-Json -Compress -Depth 50)
            }
        }
    }
    return $rows
}

function Wait-ForRateLimit($startedAt, [int]$minSeconds) {
    $elapsed = [int]((Get-Date).ToUniversalTime() - $startedAt).TotalSeconds
    $remain  = $minSeconds - $elapsed
    if ($remain -gt 0) { Start-Sleep -Seconds $remain }
}

<# ======================= Query Loop ======================= #>

$cursor = $StartUtc
while ($cursor -lt $EndUtc) {
    $iterStart = [datetime]::UtcNow

    $chunkStart = $cursor
    $chunkEnd   = $cursor.AddHours($ChunkHours)
    $cursor     = $chunkEnd

    $stamp   = $chunkStart.ToString("yyyyMMddHHmm")
    $outFile = Join-Path $OutDir ("{0}{1}.csv" -f $FilePrefix, $stamp)
    if ($SkipExisting -and (Test-Path $outFile)) {
        Write-Host "Skip: $outFile"
        Wait-ForRateLimit $iterStart $MinIntervalSeconds
        continue
    }

    $startIso = $chunkStart.ToString("yyyy-MM-ddTHH:mm:ssZ")
    $endIso   = $chunkEnd.ToString("yyyy-MM-ddTHH:mm:ssZ")

    $kql = @"
$Table
| where TimeGenerated >= datetime('$startIso')
| where TimeGenerated <  datetime('$endIso')
| order by TimeGenerated asc
"@

    Write-Host ("Query {0}Z ~ {1}Z" -f $chunkStart.ToString("s"), $chunkEnd.ToString("s"))

    try {
        $r = Invoke-LAQuery -Kql $kql -WorkspaceId $WorkspaceId

        if (-not $r.tables -or $r.tables.Count -eq 0 -or -not $r.tables[0]) {
            Write-Host "  -> No result table."
        } else {
            $rows = Convert-RowsToObjects -ResultTable $r.tables[0]
            if ($rows -and $rows.Count -gt 0) {
                $rows | Export-Csv -Path $outFile -NoTypeInformation -Encoding UTF8
                Write-Host ("  -> {0} rows -> {1}" -f $rows.Count, $outFile)

                if ($rows.Count -ge 450000) {
                    Add-Content -Path (Join-Path $OutDir "_oversized.txt") -Value "$startIso~$endIso,$($rows.Count)"
                    Write-Warning "Result very large ($($rows.Count) rows). Consider reducing chunk size for this period."
                }
            } else {
                Write-Host "  -> No rows."
            }
        }
    }
    catch {
        Write-Warning "Error range: $startIso ~ $endIso"
        Write-Warning "Error: $($_.Exception.Message)"
        Add-Content -Path (Join-Path $OutDir "_failed.txt") -Value "$startIso~$endIso"
    }

    Wait-ForRateLimit $iterStart $MinIntervalSeconds
}

Write-Host "Done. Output dir: $OutDir"

 

 

 

 

Tip: Adjust ChunkHours and MinIntervalSeconds to avoid hitting API throttling limits. For example, 30 days of 6-hour chunks is 120 API calls; at a 30-second minimum interval, a full run takes at least an hour.

When everything is configured correctly, the export produces one CSV file per time chunk in the output folder.


Step 4. Connect Power BI (Load CSV from SharePoint)

From my perspective, the ideal approach would be for Sentinel to natively support BI integration.
Sentinel can generate a query for connecting Power BI to a Log Analytics table, but due to API call limitations, a separate storage layer is required for effective use in BI.

 

The Sentinel Data Lake feature is currently available in preview, but it appears that Power BI integration is not yet supported.
For now, we’ll store the data in SharePoint Online, which is a cost-effective option, and then aggregate it in Power BI.

 

 

Upload CSV to SharePoint
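If you prefer to script the upload instead of dragging files into the browser, a minimal sketch with the PnP.PowerShell module could look like this. The site URL and folder match the Power BI query below; depending on your PnP.PowerShell version, Connect-PnPOnline may also require a registered -ClientId:

# Sketch: upload the exported CSVs to the SharePoint library
# Prerequisite: Install-Module PnP.PowerShell
Connect-PnPOnline -Url "https://clim823.sharepoint.com/sites/Sentinel" -Interactive
Get-ChildItem "F:\sentinel\IdentityLogonEvents\*.csv" | ForEach-Object {
    Add-PnPFile -Path $_.FullName -Folder "Shared Documents/IdentityLogonEvents" | Out-Null
}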

 

Power BI Desktop → Get Data → Blank Query

 

Advanced Editor

 

Paste the query below. (This was created with the help of AI.)

let
    // ========== ① User Settings ==========
    SiteUrl         = "https://clim823.sharepoint.com/sites/Sentinel",
    LibraryName     = "Shared Documents",      
    TargetFolder    = "IdentityLogonEvents",     
    FileNamePrefix  = "IdentityLogonEvents",     
    KeepLastNMonths = 6,

    // ========== ② File → Table Conversion Function ==========
    ParseCsv = (fileContent as binary) as table =>
        let
            csv = Csv.Document(
                    fileContent,
                    [Delimiter = ",", Columns = null, Encoding = 65001, QuoteStyle = QuoteStyle.Csv]
                  ),
            promoted = Table.PromoteHeaders(csv, [PromoteAllScalars = true])
        in
            promoted,

    // ========== ③ Navigate to Target Folder ==========
    Source      = SharePoint.Contents(SiteUrl, [ApiVersion = 15]),
    Library     = Source{[Name=LibraryName]}[Content],
    Folder      = Library{[Name=TargetFolder]}[Content],   // navigate into the target folder

    // ========== ④ Filter Files ==========
    FilteredByName = Table.SelectRows(Folder, each Text.StartsWith([Name], FileNamePrefix)),
    FilteredByExt  = Table.SelectRows(FilteredByName, each Text.Lower([Extension]) = ".csv"),

    // ========== ⑤ Load Files → Convert to Tables ==========
    AddedData   = Table.AddColumn(FilteredByExt, "Data", each ParseCsv([Content]), type table),
    TablesList  = List.RemoveNulls(List.Transform(AddedData[Data], each try _ otherwise null)),

    // ========== ⑥ Align Schema & Merge ==========
    AllCols        = if List.Count(TablesList) = 0 
                     then {} 
                     else List.Distinct(List.Combine(List.Transform(TablesList, each Table.ColumnNames(_)))),
    AlignedTables  = List.Transform(TablesList, each Table.ReorderColumns(_, AllCols, MissingField.UseNull)),
    Appended       = if List.Count(AlignedTables) = 0 
                     then #table(AllCols, {}) 
                     else Table.Combine(AlignedTables),

    // ========== ⑦ Filter by Last N Months ==========
    WithTimestampTyped = if List.Contains(Table.ColumnNames(Appended), "Timestamp")
                         then Table.TransformColumnTypes(Appended, {{"Timestamp", type datetime}})
                         else Appended,

    FilteredByDate =
        if List.Contains(Table.ColumnNames(WithTimestampTyped), "Timestamp")
        then Table.SelectRows(WithTimestampTyped, each [Timestamp] >= Date.AddMonths(DateTime.LocalNow(), -KeepLastNMonths))
        else WithTimestampTyped
in
    FilteredByDate

 

Close & Apply

 

Using this data, you can build dashboards that provide valuable insights into identity-related activities, as shown below.


Why This Matters

By connecting MDI logs to Sentinel and then visualizing them in Power BI, you can:

  • Detect suspicious identity activities faster
  • Correlate identity signals with other security data
  • Build interactive dashboards for security insights

 
