
Previously, I covered how to export a Power BI M Query from Microsoft Sentinel and connect it to Power BI Desktop.

2025.08.24 - [Microsoft 365] - Microsoft 365 Log Management (2): Connecting MDI Logs to Sentinel and Power BI

 

While doing a self-study to compare Endpoint DLP logs against Microsoft Defender for Endpoint (MDE) logs, I ran into a practical issue: in Power BI, reorganizing column order can be surprisingly annoying when you just want to quickly compare a few fields side by side.

 

After digging in, I found a very handy trick:

✅ You can take the M Query exported from Sentinel/Log Analytics and paste it directly into Excel Power Query—and it works.

 

If you do analysis primarily in Excel (filters, quick comparisons, pivot tables), this approach is super practical.

So here’s the clean workflow:

“M Query export → Excel connection → analysis”

 

Youtube: https://youtu.be/iuyK1sINfzw

 


TL;DR

  • In Sentinel / Log Analytics, export your query using Export to Power BI (as an M query).
  • In Excel, open Power Query (Blank Query) and paste the M Query into the Advanced Editor.
  • Authenticate using Organizational account, then Close & Load to load it into a worksheet table.
  • From then on, just hit Refresh to update logs—no more re-running the same query in the portal.

Step 1) Export the M Query from Sentinel / Log Analytics

In the Azure Portal, navigate to either:

  • Microsoft Sentinel > Logs

  • Log Analytics Workspace > Logs

 

 

Write or select the query for the table > set the Time range > Share > Export to Power BI (as an M query)


Step 2) Connect to Log Analytics Using M Query in Excel

2-1) Create a Blank Query

In Excel:

  • Data > Get Data > From Other Sources > Blank Query


2-2) Paste the M Query into Advanced Editor

In the Power Query Editor:

Open Advanced Editor

 

 

Paste the entire M Query you downloaded in Step 1 as-is

A typical exported M Query includes things like:

  • The target table
  • The query time range

 

✅ Pro tip: If you need to connect multiple tables, just duplicate the query and update only the table name and time span section. It’s the fastest way to scale your workbook.
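
For orientation, here is a minimal sketch of the shape of an exported M query, assuming a placeholder workspace ID and the IdentityLogonEvents table. The query Sentinel actually generates is longer (it also maps column types), so paste the exported text as-is rather than retyping it:

let
    // Placeholders only: the real export embeds your workspace ID, KQL text, and timespan
    Source = Json.Document(Web.Contents(
        "https://api.loganalytics.io/v1/workspaces/<WORKSPACE_ID>/query",
        [
            Query   = [#"query" = "IdentityLogonEvents | take 1000",   // target table / KQL
                       #"timespan" = "P1D"],                           // query time range
            Timeout = #duration(0, 0, 4, 0)
        ]
    )),
    // The API returns columns and rows separately; rebuild a table from them
    DataTable = Source[tables]{0},
    Columns   = Table.FromRecords(DataTable[columns]),
    Rows      = Table.FromRows(DataTable[rows], Columns[name])
in
    Rows

When duplicating the query for another table, the #"query" string and the #"timespan" value are normally the only parts you need to change.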


2-3) Configure Credentials (Authentication)

On first connection, you may see Edit Credentials.

 

 

Organizational account → sign in → Connect


2-4) Load to Excel and Refresh Anytime

Before loading:

  • Rename the query to something meaningful
  • Then choose Close & Load to load into an Excel worksheet table

Once loaded, use filters, sorting, pivots, conditional formatting, side-by-side comparisons… all the Excel stuff that’s great for fast investigation.

 

And the best part:

Refresh updates the dataset without re-running the whole process in the portal.


Step 3) Bonus: Analyze Logs with Copilot (Excel + OneDrive/SharePoint)

After loading logs into Excel:

  1. Save the workbook to OneDrive or SharePoint
  2. Ask Copilot to analyze the data

If Copilot recognizes your tables (for example, MDE-related tables), it can quickly do things like:

  • Summaries
  • Trend analysis
  • Outlier/anomaly detection
  • Quick insights and narrative explanations

Wrap-up

Using M Query Export from Sentinel/Log Analytics isn’t just for Power BI—you can connect it directly to Excel and build a refreshable log analysis workbook.

If your workflow is centered on:

  • Fast comparison
  • Column reordering
  • Filtering
  • Pivot-based analysis

…then Excel can be the more efficient tool. And once the dataset is in OneDrive/SharePoint, Copilot becomes an extra boost for rapid investigation.


In the previous post, I covered the flow of managing logs from MDI → Sentinel → Log Analytics API → PowerShell → CSV → BI.

 

Previous Post:

2025.08.24 - [Microsoft 365] - Microsoft 365 Log Management (2): Connecting MDI Logs to Sentinel and Power BI

 

While exporting logs with PowerShell, I started to wonder:
As we move toward a more serverless cloud environment, managing logs through scheduled PowerShell scripts still means operating a VM, which increases management overhead.

If you’re only considering cost, scheduling PowerShell scripts on a VM and exporting to SharePoint or OneDrive can be cheaper.
However, from a long-term perspective, I believe it’s time to move away from running scheduled PowerShell scripts on VMs and adopt a serverless approach.

Also, visualizing and managing logs with BI tools can provide valuable insights.
With this in mind, I anticipate that connecting to Microsoft Fabric or similar platforms will eventually become necessary.

In this post, I’ll cover how to export logs to Azure Data Lake Storage (ADLS) Gen2 and connect them to BI.

 

YouTube: Microsoft 365 Log Management (3): How to connect Sentinel logs to Azure Data Lake Storage Gen 2

 


Step 1. Create an ADLS Gen2 Storage Account

1. Go to Azure Portal → Search for Storage Accounts

 

2. Create a Storage Account
In Preferred storage type, select Azure Blob Storage or Azure Data Lake Storage Gen2.

 

 

3. Hierarchical Namespace - Check Enable hierarchical namespace.

Data Lake Storage Gen2 is suitable for big data analytics and other data analysis scenarios.

 

4. Complete the creation and verify the storage account


Step 2. Create an Export Rule

1. Go to Log Analytics Workspace → Settings → Data Export → Create export rule

 

2. Name your rule

 

3. Select the tables to export

 

4. Set the destination to the storage account you created

 

5. Go to Data storage → Containers to check the exported tables

 

6. Navigate through subfolders to see that exports occur every 5 minutes
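
For reference, the export rule writes each table into its own container named after the table (prefixed with am-), organized into date/time folders, with data appended to a PT05M.json blob in roughly 5-minute intervals. A typical path looks like the sketch below; the subscription, resource group, and workspace names are placeholders:

am-identitylogonevents/
  WorkspaceResourceId=/subscriptions/<subscription-id>/resourcegroups/<resource-group>/providers/microsoft.operationalinsights/workspaces/<workspace>/
    y=2025/m=08/d=24/h=10/m=00/PT05M.json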

Step 3. Connect to Power BI

1. In Power BI Desktop, go to Get data → More

 

2. Select Azure → Azure Data Lake Storage Gen2

 

3. You’ll be prompted to enter a URL

 

4. Find the DFS URL using Azure Storage Explorer

Go to Storage Account → Storage browser → Download and install Azure Storage Explorer

 

Connect, navigate to the folder path, and open Properties

 

Copy the DFS URL

 

5. Paste the URL into Power BI

 

6. Enter your credentials (Account Key)

 

You can find the Account Key under Security + networking → Access keys

 

7. Connect and then Combine & Transform Data

 

Unlike saving to SharePoint, where you need to create queries manually, the native connector support makes this process much simpler.
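
If you want to see roughly what happens behind Combine & Transform Data, the following is a minimal hand-written sketch of an equivalent Power Query (M) query. The DFS URL and container name are placeholders, and it assumes the newline-delimited JSON blobs produced by the export rule:

let
    // Placeholder URL: paste the DFS URL copied from Azure Storage Explorer
    Source   = AzureStorage.DataLake("https://<storageaccount>.dfs.core.windows.net/am-identitylogonevents"),
    // Keep only the exported JSON blobs
    JsonOnly = Table.SelectRows(Source, each Text.EndsWith([Name], ".json")),
    // Each blob contains one JSON record per line; split and parse them
    AsLines  = Table.AddColumn(JsonOnly, "Line", each Lines.FromBinary([Content])),
    Expanded = Table.ExpandListColumn(AsLines, "Line"),
    Records  = Table.AddColumn(Expanded, "Record", each Json.Document([Line]))
in
    Records

Expanding the Record column then gives you one column per log field, which is essentially what the connector's helper queries build for you.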


Conclusion

By following these steps, you can export Microsoft 365 logs to Azure Data Lake Storage Gen2 and easily visualize them in Power BI.
If you’re considering a serverless environment and BI integration, this approach offers a more efficient and scalable way to manage your logs in the long run.


Previous Post: 2025.08.10 - [Microsoft 365] - Microsoft 365 Log Management (1): Getting Started with Sentinel

 


 

 

In the previous post, we explored how to enable Microsoft Sentinel and start collecting Microsoft 365 logs.
This time, we’ll focus on integrating Microsoft Defender for Identity (MDI) logs into Sentinel and preparing them for Power BI visualization.


Youtube: Microsoft 365 Log Management (2): Connecting MDI Logs to Sentinel and Power BI

 

 

Step 1. Verify MDI Activation

Navigation Path: System → Settings → Identities

 

Check Sensor Activation:
With the latest MDI v3, activation is much simpler: if your Domain Controller is already onboarded to Microsoft Defender for Endpoint (MDE), MDI can be enabled without additional steps.

(A separate post will cover the new version once it’s officially released.)

 

Verify Signals:
Go to Advanced Hunting and confirm that IdentityLogonEvents are being recorded.

→ If signals appear here, you can confirm that Sentinel is also receiving MDI logs.

 

Connector Setup:
Navigate to Microsoft Defender XDR → Open connector page.

 

 

→ Enable Microsoft Defender for Identity and save.

 

After a short delay, you should be able to query MDI logs in Sentinel.

 


Step 2. Register an Enterprise App for Sentinel Log Export

Currently, Advanced Hunting and Sentinel have limitations when running large queries.
Our ultimate goal is to visualize data in Power BI, so we’ll first store logs as CSV files in SharePoint.

 

To achieve this, we’ll use the Log Analytics API, which requires Enterprise App registration.

Registration Steps

1. Go to Entra Admin Center → App registrations → New registration

 

2. Name the app → Register

 

3. Navigate to API permissions → Add a permission

 

4. Select APIs my organization uses → Log Analytics API

 

5. Check Data.Read → Add permissions

 

6. Click Grant admin consent

 

7. Go to Certificates & secrets → New client secret → Add

 

8. Copy the generated Value and store it securely

 

9. In Log Analytics Workspaces → Access control (IAM), click Add role assignment

 

10. Assign Log Analytics Reader role

 

11. Grant the role to the newly created app

 


Step 3. Export Logs to CSV

Tenant ID & Client ID

 

Workspace ID

 

Client Secret

Once these values are ready, you can use a PowerShell script to call the Log Analytics API and export logs in chunks.

 

I created the following script, with the help of AI, to call the Log Analytics API.

You’ll need the following details for the script:

# === Authentication (Service Principal) ===
$TenantId = "<TENANT_ID>"
$ClientId = "<CLIENT_ID>"
$Secret   = "<CLIENT_SECRET>"  

# === Workspace ===
$WorkspaceId = "<WORKSPACE_ID>"

# === Extraction target / Period / Output ===
$Table        = "IdentityLogonEvents"
$StartUtc     = [datetime]"2025-08-12T00:00:00Z"
$EndUtc       = [datetime]::UtcNow
$ChunkHours   = 6  
$OutDir       = "F:\sentinel\IdentityLogonEvents"
$FilePrefix   = "IdentityLogonEvents"
$SkipExisting = $true

# === Interval / Retry / Timeout ===
$MinIntervalSeconds = 30 
$HttpTimeoutSeconds = 300
$MaxRetries         = 5
$BaseDelaySeconds   = 5

<# ======================= Utilities ======================= #>

# Create folder
New-Item -ItemType Directory -Force -Path $OutDir | Out-Null

# Token cache
$Script:TokenInfo = $null

function Get-LogAnalyticsToken {
    if ($Script:TokenInfo -and $Script:TokenInfo.ExpiresOn -gt (Get-Date).ToUniversalTime().AddMinutes(5)) {
        return $Script:TokenInfo.AccessToken
    }

    $body = @{
        client_id     = $ClientId
        client_secret = $Secret
        grant_type    = "client_credentials"
        scope         = "https://api.loganalytics.io/.default"
    }

    $tokenResponse = Invoke-RestMethod -Method Post `
        -Uri "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token" `
        -Body $body `
        -TimeoutSec $HttpTimeoutSeconds

    $Script:TokenInfo = [pscustomobject]@{
        AccessToken = $tokenResponse.access_token
        ExpiresOn   = (Get-Date).ToUniversalTime().AddSeconds([int]$tokenResponse.expires_in)
    }
    return $Script:TokenInfo.AccessToken
}

function Invoke-LAQuery {
    param(
        [Parameter(Mandatory=$true)] [string] $Kql,
        [Parameter(Mandatory=$true)] [string] $WorkspaceId
    )

    $attempt = 0
    while ($true) {
        $attempt++
        $token = Get-LogAnalyticsToken
        $headers = @{ Authorization = "Bearer $token" }
        $body    = @{ query = $Kql } | ConvertTo-Json

        try {
            return Invoke-RestMethod -Method Post `
                -Uri "https://api.loganalytics.azure.com/v1/workspaces/$WorkspaceId/query" `
                -Headers $headers -ContentType "application/json" `
                -Body $body -TimeoutSec $HttpTimeoutSeconds
        }
        catch {
            $status = $_.Exception.Response.StatusCode.value__
            $resp   = $null
            try { $resp = [System.IO.StreamReader]::new($_.Exception.Response.GetResponseStream()).ReadToEnd() } catch {}

            # 401: Refresh token
            if ($status -eq 401 -and $attempt -le $MaxRetries) {
                $Script:TokenInfo = $null
                Start-Sleep -Seconds ($BaseDelaySeconds * [math]::Pow(2, $attempt - 1))
                continue
            }

            # 429 or 5xx
            if (($status -eq 429 -or $status -ge 500) -and $attempt -le $MaxRetries) {
                $retryAfter = 0
                try { $retryAfter = [int]$_.Exception.Response.Headers["Retry-After"] } catch {}
                if ($retryAfter -le 0) {
                    $retryAfter = [int]($BaseDelaySeconds * [math]::Pow(2, $attempt - 1))
                }
                Write-Warning "Query throttled/failed (status $status). Retry in $retryAfter sec. Attempt $attempt/$MaxRetries"
                Start-Sleep -Seconds $retryAfter
                continue
            }

            throw "Log Analytics query failed (status $status): $resp"
        }
    }
}

function Convert-RowsToObjects {
    param(
        [Parameter(Mandatory=$true)] $ResultTable
    )
    $cols = $ResultTable.columns.name
    $rows = $ResultTable.rows | ForEach-Object {
        $o = [ordered]@{}
        for ($i=0; $i -lt $cols.Count; $i++) { $o[$cols[$i]] = $_[$i] }
        [pscustomobject]$o
    }

    foreach ($row in $rows) {
        foreach ($p in $row.PSObject.Properties) {
            $v = $p.Value
            if ($v -is [System.Collections.IDictionary] -or
                $v -is [System.Array] -or
                $v -is [PSCustomObject]) {
                $row.($p.Name) = ($v | ConvertTo-Json -Compress -Depth 50)
            }
        }
    }
    return $rows
}

function Wait-ForRateLimit($startedAt, [int]$minSeconds) {
    $elapsed = [int]((Get-Date).ToUniversalTime() - $startedAt).TotalSeconds
    $remain  = $minSeconds - $elapsed
    if ($remain -gt 0) { Start-Sleep -Seconds $remain }
}

<# ======================= Query Loop ======================= #>

$cursor = $StartUtc
while ($cursor -lt $EndUtc) {
    $iterStart = [datetime]::UtcNow

    $chunkStart = $cursor
    $chunkEnd   = $cursor.AddHours($ChunkHours)
    $cursor     = $chunkEnd

    $stamp   = $chunkStart.ToString("yyyyMMddHHmm")
    $outFile = Join-Path $OutDir ("{0}{1}.csv" -f $FilePrefix, $stamp)
    if ($SkipExisting -and (Test-Path $outFile)) {
        Write-Host "Skip: $outFile"
        Wait-ForRateLimit $iterStart $MinIntervalSeconds
        continue
    }

    $startIso = $chunkStart.ToString("yyyy-MM-ddTHH:mm:ssZ")
    $endIso   = $chunkEnd.ToString("yyyy-MM-ddTHH:mm:ssZ")

    $kql = @"
$Table
| where TimeGenerated >= datetime('$startIso')
| where TimeGenerated <  datetime('$endIso')
| order by TimeGenerated asc
"@

    Write-Host ("Query {0}Z ~ {1}Z" -f $chunkStart.ToString("s"), $chunkEnd.ToString("s"))

    try {
        $r = Invoke-LAQuery -Kql $kql -WorkspaceId $WorkspaceId

        if (-not $r.tables -or $r.tables.Count -eq 0 -or -not $r.tables[0]) {
            Write-Host "  -> No result table."
        } else {
            $rows = Convert-RowsToObjects -ResultTable $r.tables[0]
            if ($rows -and $rows.Count -gt 0) {
                $rows | Export-Csv -Path $outFile -NoTypeInformation -Encoding UTF8
                Write-Host ("  -> {0} rows -> {1}" -f $rows.Count, $outFile)

                if ($rows.Count -ge 450000) {
                    Add-Content -Path (Join-Path $OutDir "_oversized.txt") -Value "$startIso~$endIso,$($rows.Count)"
                    Write-Warning "Result very large ($($rows.Count) rows). Consider reducing chunk size for this period."
                }
            } else {
                Write-Host "  -> No rows."
            }
        }
    }
    catch {
        Write-Warning "Error range: $startIso ~ $endIso"
        Write-Warning "Error: $($_.Exception.Message)"
        Add-Content -Path (Join-Path $OutDir "_failed.txt") -Value "$startIso~$endIso"
    }

    Wait-ForRateLimit $iterStart $MinIntervalSeconds
}

Write-Host "Done. Output dir: $OutDir"

 

 

 

 

Tip: Adjust ChunkHours and MinIntervalSeconds to avoid hitting API throttling limits.

When everything is configured correctly, the export process will look like this:


Step 4. Connect Power BI (Load CSV from SharePoint)

From my perspective, the ideal approach would be for Sentinel to natively support BI integration.
Although it provides a query export that lets you connect Power BI directly (as shown below), API call limits mean a separate storage layer is required for effective use in BI.

 

The Sentinel Data Lake feature is currently available in preview, but it appears that Power BI integration is not yet supported.
For now, we’ll store the data in SharePoint Online, which is a cost-effective option, and then aggregate it in Power BI.

 

 

Upload CSV to SharePoint

 

Power BI Desktop → Get Data → Blank query

 

Advanced Editor

 

Paste the query below. (This was created with the help of AI.)

let
    // ========== ① User Settings ==========
    SiteUrl         = "https://clim823.sharepoint.com/sites/Sentinel",
    LibraryName     = "Shared Documents",      
    TargetFolder    = "IdentityLogonEvents",     
    FileNamePrefix  = "IdentityLogonEvents",     
    KeepLastNMonths = 6,

    // ========== ② File → Table Conversion Function ==========
    ParseCsv = (fileContent as binary) as table =>
        let
            csv = Csv.Document(
                    fileContent,
                    [Delimiter = ",", Columns = null, Encoding = 65001, QuoteStyle = QuoteStyle.Csv]
                  ),
            promoted = Table.PromoteHeaders(csv, [PromoteAllScalars = true])
        in
            promoted,

    // ========== ③ Navigate to Target Folder ==========
    Source      = SharePoint.Contents(SiteUrl, [ApiVersion = 15]),
    Library     = Source{[Name=LibraryName]}[Content],
    Folder      = Library{[Name=TargetFolder]}[Content],   // the exported CSV folder (e.g., IdentityLogonEvents)

    // ========== ④ Filter Files ==========
    FilteredByName = Table.SelectRows(Folder, each Text.StartsWith([Name], FileNamePrefix)),
    FilteredByExt  = Table.SelectRows(FilteredByName, each Text.Lower([Extension]) = ".csv"),

    // ========== ⑤ Load Files → Convert to Tables ==========
    AddedData   = Table.AddColumn(FilteredByExt, "Data", each ParseCsv([Content]), type table),
    TablesList  = List.RemoveNulls(List.Transform(AddedData[Data], each try _ otherwise null)),

    // ========== ⑥ Align Schema & Merge ==========
    AllCols        = if List.Count(TablesList) = 0 
                     then {} 
                     else List.Distinct(List.Combine(List.Transform(TablesList, each Table.ColumnNames(_)))),
    AlignedTables  = List.Transform(TablesList, each Table.ReorderColumns(_, AllCols, MissingField.UseNull)),
    Appended       = if List.Count(AlignedTables) = 0 
                     then #table(AllCols, {}) 
                     else Table.Combine(AlignedTables),

    // ========== ⑦ Filter by Last N Months ==========
    WithTimestampTyped = if List.Contains(Table.ColumnNames(Appended), "Timestamp")
                         then Table.TransformColumnTypes(Appended, {{"Timestamp", type datetime}})
                         else Appended,

    FilteredByDate =
        if List.Contains(Table.ColumnNames(WithTimestampTyped), "Timestamp")
        then Table.SelectRows(WithTimestampTyped, each [Timestamp] >= Date.AddMonths(DateTime.LocalNow(), -KeepLastNMonths))
        else WithTimestampTyped
in
    FilteredByDate

 

Close & Apply

 

Using this data, you can build dashboards that provide valuable insights into identity-related activities, as shown below.


Why This Matters

By connecting MDI logs to Sentinel and then visualizing them in Power BI, you can:

  • Detect suspicious identity activities faster
  • Correlate identity signals with other security data
  • Build interactive dashboards for security insights

 


▶ Watch on YouTube: Microsoft 365 Log Management (1): Getting Started with Sentinel

 


Why Log Management Matters in Microsoft 365

One of the biggest challenges I faced while managing Microsoft 365 was log management.
Initially, message trace and audit logs were enough. But as I started incorporating security solutions like Microsoft Defender, the amount of data skyrocketed.


How We Used to Do It

Previously, I relied on PowerShell scripts to extract logs, store them in a separate repository, and later manage them via SQL Server for analysis.
While this worked, it had several drawbacks:

  • Required a dedicated VM for log collection
  • Credential management was cumbersome and posed security risks
  • Didn’t align well with the SaaS-first approach
  • Frequent schema changes and new log types increased maintenance overhead

In short, the process became increasingly labor-intensive.


Why I Chose Microsoft Sentinel

To solve these issues, I turned to Microsoft Sentinel.
Although Sentinel is primarily a SIEM solution, my initial goal is centralized log management. Here’s why Sentinel stood out:

  • Native integration with Microsoft 365
  • Automated log collection and schema updates
  • Easy integration with Defender, Entra, Intune, and more

The Role of AI

Thanks to AI, the barrier to entry for these technologies has dropped significantly.
With Copilot, I can leverage the data stored in Sentinel more intelligently.
Once logs are ingested into Sentinel, it’s like having a database ready for advanced analytics—and AI can answer questions based on that data.

This marks the beginning of a shift from manual log management to a more automated and intelligent approach.


What is Microsoft Sentinel?

Microsoft Sentinel is a cloud-native SIEM (Security Information and Event Management) solution that collects and analyzes security logs and events from multiple sources.
It supports threat detection, automated response, and security operations efficiency.

Learn more: What is Microsoft Sentinel? | Microsoft Learn


Microsoft 365 Log Collection Architecture

Here’s the architecture I’m planning for Microsoft 365 → Sentinel:

Microsoft 365 Log Collection Architecture

  • Signals from various Microsoft 365 services are sent to Sentinel via built-in connectors
  • However, not all logs are supported by default
  • Unsupported logs require API calls or custom connectors

Note: In this post, we’ll focus on enabling Sentinel. Detailed configurations for each service will be covered in future posts.


Steps to Enable Microsoft Sentinel

1. Access Azure Portal
https://portal.azure.com → Search for Sentinel


2. Create a Sentinel Resource


- Create a new resource group


- Create a Log Analytics Workspace

 

At this point, it is just a Log Analytics workspace.

 

Move to Sentinel → Create


- Add Microsoft Sentinel to the workspace

 

Adding Microsoft Sentinel

 


3. Add Microsoft 365 Data Connectors
- Go to Content Hub

 


Currently, Sentinel is being integrated into the Defender portal.
If you go to Defender (security.microsoft.com) and click Microsoft Sentinel, you can confirm that it is being provisioned.

 

If you refresh the Content hub within Sentinel on Azure, you will see the available content that can be added, as shown below.

 

For a simple connection test, search for Microsoft Entra ID and proceed with the installation.

 


Data connectors → Microsoft Entra ID → Open connector page

 

Select the logs to import and apply changes.



4. Verify Log Collection
- Wait for logs to populate


- Use KQL mode to query and validate data ingestion


What’s Next?

In the next post, I’ll cover enabling specific Microsoft 365 logs and, if needed, the E5 onboarding process.


Tip: If you’re planning to integrate Sentinel with Microsoft 365, start small—enable core connectors first, then expand gradually.

