Many IT engineers and managers are unaware that if your Hyper-V host server is running Windows Server Datacenter Edition, you can use AVMA (Automatic Virtual Machine Activation) keys to automatically activate guest VMs. Leveraging this feature simplifies the activation process and makes management much easier.

In this post, I’ll walk you through how AVMA works, how to use it, and some practical tips for automating Windows Server VM activation on Hyper-V.

 

Youtube: https://youtu.be/deyWNdW6S-U

 


What is AVMA?

AVMA (Automatic Virtual Machine Activation) allows you to activate Windows Server virtual machines running on a Datacenter edition Hyper-V host without needing to enter a product key for each VM. This is especially useful for environments where you frequently deploy or redeploy VMs.

Reference:
Automatic Virtual Machine Activation in Windows Server | Microsoft Learn

Supported Host and Guest Combinations

  • Host: The Hyper-V host must be running a Datacenter edition of Windows Server.
  • Guest VM: The version of Windows Server you can activate depends on the host OS version.

For example, if your host is Windows Server 2025, you can activate guest VMs running Windows Server 2012 R2 through Windows Server 2025 using AVMA keys.


AVMA Keys for Each Windows Server Version

You can find the official AVMA keys for each guest version and edition (Datacenter, Standard, Essentials) in the Microsoft Learn documentation linked above.


How to Use AVMA Keys During Installation

When installing Windows Server as a VM on your Hyper-V Datacenter host, you can enter the AVMA key during setup:

Choose a licensing method:
Select “Use a product key” and enter the AVMA key for your OS version.

Select the image:
The installer will recognize the OS version that matches the AVMA key.


Post-Installation Activation

After installation, you might notice that Windows is not yet activated. Here’s how to proceed:

Check Activation Status:
Go to Start > Settings > System > Activation. If not activated, you may see an error (e.g., 0xC004F012).

 

Activate via Command Line:
Open PowerShell or Command Prompt as Administrator and run:
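A minimal sketch using the built-in slmgr tool (the key value is a placeholder; use the AVMA key that matches your guest OS version and edition from the Microsoft Learn table):

# Install the AVMA key inside the guest VM (placeholder, not a real key)
slmgr /ipk <AVMA key>

# Trigger activation against the Hyper-V host
slmgr /ato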

 

This will trigger activation using the AVMA key.

 

Verify Activation:
The activation state should now show as “Active”.


Activating an Already Installed VM

If you’ve already installed the OS without entering a key, you can still activate:

 

Go to System Settings:
Start > System > About > Product key and activation.

 

 

Change Product Key:
Enter the appropriate AVMA key and proceed with activation.

 

Click Next, then Activate to complete the process.


Pro Tip: Using Sysprep

After completing activation, running Sysprep is highly recommended for managing test environments efficiently. This avoids repetitive product key entry and ensures your template VMs are ready for rapid deployment.
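For reference, a typical generalize-and-shutdown run looks like the sketch below. The /mode:vm switch is intended for images that will be redeployed as VMs on the same virtualization host; adjust the options to your own template workflow.

# Generalize the image, boot to OOBE on next start, and shut down
C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown /mode:vm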


Conclusion

AVMA is a powerful feature for anyone managing Windows Server VMs on Hyper-V Datacenter hosts. It streamlines activation, reduces manual work, and helps maintain compliance. Make sure to use the correct AVMA key for your guest OS version, and enjoy hassle-free VM deployments!


 


Previously, I covered how to export a Power BI M Query from Microsoft Sentinel and connect it to Power BI Desktop.

2025.08.24 - [Microsoft 365] - Microsoft 365 Log Management (2): Connecting MDI Logs to Sentinel and Power BI

 

While doing a self-study to compare Endpoint DLP logs against Microsoft Defender for Endpoint (MDE) logs, I ran into a practical issue: in Power BI, reorganizing column order can be surprisingly annoying when you just want to quickly compare a few fields side by side.

 

After digging in, I found a very handy trick:

✅ You can take the M Query exported from Sentinel/Log Analytics and paste it directly into Excel Power Query—and it works.

 

If you do analysis primarily in Excel (filters, quick comparisons, pivot tables), this approach is super practical.

So here’s the clean workflow:

“M Query export → Excel connection → analysis”

 

Youtube: https://youtu.be/iuyK1sINfzw

 


TL;DR

  • In Sentinel / Log Analytics, export your query using Export to Power BI (as an M query).
  • In Excel, open Power Query (Blank Query) and paste the M Query into the Advanced Editor.
  • Authenticate using Organizational account, then Close & Load to load it into a worksheet table.
  • From then on, just hit Refresh to update logs—no more re-running the same query in the portal.

Step 1) Export the M Query from Sentinel / Log Analytics

In the Azure Portal, navigate to either:

  • Microsoft Sentinel > Logs

  • Log Analytics Workspace > Logs

 

 

Write or select the query for the target table > set the Time range > Share > Export to Power BI (as an M query)


Step 2) Connect to Log Analytics Using M Query in Excel

2-1) Create a Blank Query

In Excel:

  • Data > Get Data > From Other Sources > Blank Query


2-2) Paste the M Query into Advanced Editor

In the Power Query Editor:

Open Advanced Editor

 

 

Paste the entire M Query you downloaded in Step 1 as-is

A typical exported M Query includes things like:

  • The target table
  • The query time range

 

✅ Pro tip: If you need to connect multiple tables, just duplicate the query and update only the table name and time span section. It’s the fastest way to scale your workbook.


2-3) Configure Credentials (Authentication)

On first connection, you may see Edit Credentials.

 

 

Organizational account → sign in → Connect


2-4) Load to Excel and Refresh Anytime

Before loading:

  • Rename the query to something meaningful
  • Then choose Close & Load to load into an Excel worksheet table

  • Use filters, sorting, pivots, conditional formatting, side-by-side comparisons… all the Excel stuff that’s great for fast investigation.

 

And the best part:

Refresh updates the dataset without re-running the whole process in the portal.


Step 3) Bonus: Analyze Logs with Copilot (Excel + OneDrive/SharePoint)

After loading logs into Excel:

  1. Save the workbook to OneDrive or SharePoint
  2. Ask Copilot to analyze the data

If Copilot recognizes your tables (for example, MDE-related tables), it can quickly do things like:

  • Summaries
  • Trend analysis
  • Outlier/anomaly detection
  • Quick insights and narrative explanations

Wrap-up

Using M Query Export from Sentinel/Log Analytics isn’t just for Power BI—you can connect it directly to Excel and build a refreshable log analysis workbook.

If your workflow is centered on:

  • Fast comparison
  • Column reordering
  • Filtering
  • Pivot-based analysis

…then Excel can be the more efficient tool. And once the dataset is in OneDrive/SharePoint, Copilot becomes an extra boost for rapid investigation.


M365 Log Management (4): Building a Windows Update Dashboard from Update History (Intune + Log Analytics + Power BI)

Recently, I’ve been getting more and more interested in visualizing operational logs and device records in a Power BI dashboard. In the Microsoft ecosystem, one of the biggest advantages is that the reporting and data pipelines are designed by the same vendor that built the platform, which often makes the integration more efficient than many third‑party approaches.

At first, I considered pulling everything with PowerShell, but I found that Intune policies + Log Analytics can load the relevant Windows Update signals with far less friction—and then you can build a dashboard on top of them quickly.

This post walks through how to create a Windows Update dashboard using Windows Update for Business reports, Azure Log Analytics, and a Power BI template.

 

Youtube: https://youtu.be/ToqAFJpoh_g

 


What You’ll Need (Requirements)

To build the dashboard described here, you’ll need:

  • An Azure subscription
  • A Log Analytics workspace
  • Devices enrolled and managed with Microsoft Intune
  • Power BI Desktop (to open the template and customize the report)

Reference Materials (Official/Community)

The key resources used while implementing this solution are the Microsoft Learn documentation for Windows Update for Business reports and the Windows IT Pro Blog post on tailoring the reports with Power BI (both referenced in the steps below).


High-Level Flow (How the Data Gets to Your Dashboard)

At a high level, the process looks like this:

  1. Intune policy enables required diagnostic/telemetry settings on devices
  2. Windows Update for Business reports is enabled and connected to your Log Analytics workspace
  3. Devices upload update status signals → stored in Log Analytics tables (e.g., tables prefixed with UC*)
  4. A Power BI template queries the Log Analytics workspace and visualizes update health

Step 1) Configure Intune Devices for Windows Update for Business Reports

This step ensures that devices can send the required diagnostic data (including the device name, if needed for reporting clarity). I followed the Microsoft Learn guidance and created a configuration policy using the Settings catalog.

1. Create a Configuration Profile

In Intune admin center:

Devices > Windows > Configuration policies > New policy


Platform: Windows 10 and later | Profile type: Settings catalog

 

 

Create the profile and give it a name (example used: AllowDeviceNameInDiagnosticData)

 

2. Add Required Settings

In the Settings catalog, search and add the following:

  • Allow Telemetry
    • Category: System
    • Value: Basic
  • Configure Telemetry Opt In Settings UX
    • Value: Disabled
  • Configure Telemetry Opt In Change Notification
    • Value: Disabled
  • Allow device name to be sent in Windows diagnostic data
    • Value: Allowed

 

3. Assign and Monitor the Policy

  • Assign the profile to the target users/devices

  • Complete Review + create

  • Monitor the deployment status in Intune to confirm devices are checking in successfully 


 

Step 2) Enable Windows Update for Business Reports and Connect Log Analytics

Once devices are ready, you need to enable Windows Update for Business reports and link it to your Azure subscription and Log Analytics workspace.

1. Open the Built-In Workbook in Azure

In Azure Portal:

  • Go to Monitor

  • Select Workbooks > Choose Windows Update for Business reports

  • Click Get started 

2. Configure Enrollment (Subscription + Workspace)

  • Select your Azure subscription & Log Analytics workspace > Save settings

 

 

During this flow, you can see that configuration is handled through Microsoft Graph (the UI surfaces the Graph endpoint being called). 

 

3. Wait for Data to Populate

The UI mentions it may take up to 24 hours, but in my case it took 48+ hours before data appeared.

4. Confirm Data in Log Analytics

In Log Analytics, the data lands in tables that start with UC (for example, multiple UC* tables will appear once ingestion begins). 
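If you prefer to confirm ingestion from PowerShell instead of the portal, here is a small sketch using the Az.OperationalInsights module. The workspace ID is a placeholder and UCClient is used only as an example; any UC* table that has started ingesting will work.

# Requires: Install-Module Az.OperationalInsights; Connect-AzAccount
$workspaceId = "<LOG_ANALYTICS_WORKSPACE_ID>"

# Count rows ingested into UCClient over the last day
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId `
    -Query "UCClient | where TimeGenerated > ago(1d) | count"

$result.Results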

5. Understand Collection / Upload Frequency

Microsoft documentation also lists data types and upload frequency/latency. Practically speaking, you should expect some tables/events to arrive on different cadences (some daily, some per update event, and with latency that can span hours to a day or more). 


Step 3) Tailor the Reports with Power BI

Once data is available in Log Analytics, the easiest path to a polished dashboard is to use the official Power BI template published for Windows Update for Business reports. 

 

1. Download the Power BI Template

From the Tech Community / Windows IT Pro blog post, download the Power BI template referenced in the guide.

Tailor Windows Update for Business reports with Power BI | Windows IT Pro Blog

 

2. Copy the Workspace ID

In Azure Portal:

  • Open Log Analytics workspaces

  • Copy the Workspace ID

3. Open the Template and Load Data

  • Open the Power BI template file
  • When prompted, paste the Workspace ID

  • Click Load 

4. Authenticate

When Power BI prompts for access to the Log Analytics endpoint:

  • Choose Organizational account

  • Click Connect 

5. View Your Windows Update Dashboard

After authentication completes and data is loaded, the dashboard visuals populate and you can begin customizing pages, KPIs, filters, and device group views. 


 

Wrap-Up

With just Intune, Log Analytics, and the Power BI template, you can build a practical Windows Update dashboard without writing custom scripts or maintaining a separate data pipeline. The key is getting device diagnostics configured correctly, enabling WUfB reports, and allowing enough time for ingestion to stabilize. 


While organizing Intune policies, I discovered the existence of the Intune Data Warehouse and realized that it’s possible to build BI dashboards using Power BI.

 

Searching on YouTube, I found that connection methods have been available for quite some time.

 

My goal is to visualize every area of M365, so I decided to take on the challenge right away.

 

Youtube:  M365. Creating an Intune Dashboard

 

1. Import Data

There are two main ways to connect Intune Data Warehouse to Power BI.

Method 1. OData Feed

In Power BI, select Get data > OData feed

 

Enter the Intune Data Warehouse OData feed URL for your tenant

 

Enter your organizational account and click Connect


All available tables will be listed – check all and click Load


Data Loading

 

Import complete
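For reference, the same OData feed can also be queried from PowerShell if you ever need the raw data outside Power BI. This is only a rough sketch: the collection URL and token are placeholders, and the "devices" entity name comes from the Intune Data Warehouse data model; build the URL by inserting the entity name into the feed URL shown for your tenant.

# Assumptions: $collectionUrl is the full OData URL of one entity collection (for example,
# the "devices" entity) and $token is an access token authorized for the Intune Data Warehouse API.
$collectionUrl = "<ODATA_COLLECTION_URL>"
$token         = "<ACCESS_TOKEN>"

$response = Invoke-RestMethod -Uri $collectionUrl -Headers @{ Authorization = "Bearer $token" }

# OData responses return the rows in the "value" property
$response.value | Select-Object -First 5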

Method 2. Connector

In Power BI, select Get Data > More

 

Online Services > Intune Data Warehouse


Specify Period


Select tables and click Load (the following steps are the same)

 

  • The Connector brings in more tables, but the meaningful data is similar
  • The OData feed allows custom queries via Advanced Query
  • The Connector lets you specify the reporting period

This post will proceed using the Connector method.


2. Download Power BI Template

Most Intune dashboard resources are based on the following template:

PowerBiDashboards/Intune Dashboard.pbix at main · JayRHa/PowerBiDashboards · GitHub

 

Dashboard Example

 

Go to Transform data > Data source settings to check the Connector-based connection.

 

Refresh

 

You may encounter an error like the one below:

 

The template creator’s blog suggested checking the technical documentation below and changing the locale, but even after changing it, the issue was not resolved. Therefore, I proceeded by copying the template instead.

 

Supported languages and countries/regions for Power BI

https://learn.microsoft.com/en-us/power-bi/fundamentals/supported-languages-countries-regions

 

In your own Power BI file (the one connected to your data), add pages at the bottom with the same names as the template pages.

 

Copy and paste the three pages as shown below.

 


3. Add Objects and Set Relationships

Since the structure may not match, you might encounter some errors.

 

Adjust the structure to match.

 

This error occurs because the Text Filter object is missing.

 

Go to More visuals > From AppSource.

 

Search for and add the Text Filter.

 

After refreshing or switching pages, you’ll see the issue is resolved.

 

Errors on the Devices page occur because table relationships do not match the template.

 

Open the Model view to check the differences in the number of relationships.

 

When you first import data, Power BI automatically creates relationships.

Since each environment is different, table relationships may vary. Use the following approach as a reference, and match the relationships to the template as needed.

 

Go to Manage relationships.

 

Some relationships in the template are missing in your BI.

 

Match Structure

 

After matching the structure, save the file.

 

Sometimes, relationships are not automatically created because there’s no data on one side.

 

 

Some relationships may also have Active/Inactive reversed; fix these as well.

 

Errors on the Devices page will be resolved.

 

The ConfigProfiles page shows no errors either.

 

4. Conclusion

By leveraging Power BI, you can intuitively manage Intune devices.


In the previous post, I covered the flow of managing logs from MDI → Sentinel → Log Analytics API → PowerShell → CSV → BI.

 

Previous Post:

2025.08.24 - [Microsoft 365] - Microsoft 365 Log Management (2): Connecting MDI Logs to Sentinel and Power BI

 

While exporting logs using PowerShell, I started to wonder:
As we move toward a more serverless cloud environment, managing logs via scheduled PowerShell scripts means I still need to operate a VM, which increases management overhead.

If you’re only considering cost, scheduling PowerShell scripts on a VM and exporting to SharePoint or OneDrive can be cheaper.
However, from a long-term perspective, I believe it’s time to move away from running scheduled PowerShell scripts on VMs and adopt a serverless approach.

Also, visualizing and managing logs with BI tools can provide valuable insights.
With this in mind, I anticipate that connecting to Microsoft Fabric or similar platforms will eventually become necessary.

In this post, I’ll cover how to export logs to Azure Data Lake Storage (ADLS) Gen2 and connect them to BI.

 

Youtube : Microsoft 365 Log Management (3): How to connect Sentinel logs to Azure Data Lake Storage Gen 2

 


Step 1. Create an ADLS Gen2 Storage Account

1. Go to Azure Portal → Search for Storage Accounts

 

2. Create a Storage Account
In Preferred storage type, select Azure Blob Storage or Azure Data Lake Storage Gen2.

 

 

3. Hierarchical Namespace - Check Enable hierarchical namespace.

Data Lake Storage Gen2 is suitable for big data analytics and other data analysis scenarios.

 

4. Complete the creation and verify the storage account


Step 2. Create an Export Rule

1. Go to Log Analytics Workspace → Settings → Data Export → Create export rule

 

2. Name your rule

 

3. Select the tables to export

 

4. Set the destination to the storage account you created

 

5. Go to Data storage → Containers to check the exported tables

 

6. Navigate through subfolders to see that exports occur every 5 minutes
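If you prefer to script the export rule, the same configuration can be created with Az PowerShell. A sketch with placeholder resource names (the table name is also a placeholder; pick any table your workspace supports for export):

# Requires: Install-Module Az.OperationalInsights, Az.Storage; Connect-AzAccount
$storageAccountId = (Get-AzStorageAccount -ResourceGroupName "<RG_NAME>" -Name "<STORAGE_ACCOUNT>").Id

# Create an export rule that sends the selected table to the storage account
New-AzOperationalInsightsDataExport `
    -ResourceGroupName "<RG_NAME>" `
    -WorkspaceName "<WORKSPACE_NAME>" `
    -DataExportName "export-identitylogonevents" `
    -TableName "IdentityLogonEvents" `
    -ResourceId $storageAccountId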

Step 3. Connect to Power BI

1. In Power BI Desktop, go to Get data → More

 

2. Select Azure → Azure Data Lake Storage Gen2

 

3. You’ll be prompted to enter a URL

 

4. Find the DFS URL using Azure Storage Explorer

Go to Storage Account → Storage browser → Download and install Azure Storage Explorer

 

Connect, navigate to the folder path, and open Properties

 

Copy the DFS URL

 

5. Paste the URL into Power BI

 

6. Enter your credentials (Account Key)

 

You can find the Account Key under Security + networking → Access keys

 

7. Connect and then Combine & Transform Data

 

Unlike saving to SharePoint, where you need to create queries manually, the native connector support makes this process much simpler.


Conclusion

By following these steps, you can export Microsoft 365 logs to Azure Data Lake Storage Gen2 and easily visualize them in Power BI.
If you’re considering a serverless environment and BI integration, this approach offers a more efficient and scalable way to manage your logs in the long run.


Previous Post: 2025.08.10 - [Microsoft 365] - Microsoft 365 Log Management (1): Getting Started with Sentinel

 


 

 

In the previous post, we explored how to enable Microsoft Sentinel and start collecting Microsoft 365 logs.
This time, we’ll focus on integrating Microsoft Defender for Identity (MDI) logs into Sentinel and preparing them for Power BI visualization.


Youtube: Microsoft 365 Log Management (2): Connecting MDI Logs to Sentinel and Power BI

 

 

Step 1. Verify MDI Activation

Navigation Path: System → Settings → Identities

 

Check Sensor Activation:
With the latest MDI v3, activation is much simpler—if your Domain Controller is already onboarded to Microsoft Defender for Endpoint (MDE), MDI can be enabled without additional steps.

(A separate post will cover the new version once it’s officially released.)

 

Verify Signals:
Go to Advanced Hunting and confirm that IdentityLogonEvents are being recorded.

→ If signals appear here, you can confirm that Sentinel is also receiving MDI logs.

 

Connector Setup:
Navigate to Microsoft Defender XDR → Open connector page.

 

 

→ Enable Microsoft Defender for Identity and save.

 

After a short delay, you should be able to query MDI logs in Sentinel.

 


Step 2. Register an Enterprise App for Sentinel Log Export

Currently, Advanced Hunting and Sentinel have limitations when running large queries.
Our ultimate goal is to visualize data in Power BI, so we’ll first store logs as CSV files in SharePoint.

 

To achieve this, we’ll use the Log Analytics API, which requires Enterprise App registration.

Registration Steps

1. Go to Entra Admin Center → App registrations → New registration

 

2. Name the app → Register

 

3. Navigate to API permissions → Add a permission

 

4. Select APIs my organization uses → Log Analytics API

 

5. Check Data.Read, then click Add permissions

 

6. Click Grant admin consent

 

7. Go to Certificates & secrets → New client secret → Add

 

8. Copy the generated Value and store it securely

 

9. In Log Analytics Workspaces → Access control (IAM), click Add role assignment

 

10. Assign Log Analytics Reader role

 

11. Grant the role to the newly created app

 


Step 3. Export Logs to CSV

You'll need the following values for the script:

  • Tenant ID & Client ID
  • Workspace ID
  • Client Secret

Once these values are ready, you can use a PowerShell script to call the Log Analytics API and export logs in chunks. I created the following script with the help of AI.

# === Authentication (Service Principal) ===
$TenantId = "<TENANT_ID>"
$ClientId = "<CLIENT_ID>"
$Secret   = "<CLIENT_SECRET>"  

# === Workspace ===
$WorkspaceId = "<WORKSPACE_ID>"

# === Extraction target / Period / Output ===
$Table        = "IdentityLogonEvents"
$StartUtc     = [datetime]"2025-08-12T00:00:00Z"
$EndUtc       = [datetime]::UtcNow
$ChunkHours   = 6  
$OutDir       = "F:\sentinel\IdentityLogonEvents"
$FilePrefix   = "IdentityLogonEvents"
$SkipExisting = $true

# === Interval / Retry / Timeout ===
$MinIntervalSeconds = 30 
$HttpTimeoutSeconds = 300
$MaxRetries         = 5
$BaseDelaySeconds   = 5

<# ======================= Utilities ======================= #>

# Create folder
New-Item -ItemType Directory -Force -Path $OutDir | Out-Null

# Token cache
$Script:TokenInfo = $null

function Get-LogAnalyticsToken {
    if ($Script:TokenInfo -and $Script:TokenInfo.ExpiresOn -gt (Get-Date).ToUniversalTime().AddMinutes(5)) {
        return $Script:TokenInfo.AccessToken
    }

    $body = @{
        client_id     = $ClientId
        client_secret = $Secret
        grant_type    = "client_credentials"
        scope         = "https://api.loganalytics.io/.default"
    }

    $tokenResponse = Invoke-RestMethod -Method Post `
        -Uri "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token" `
        -Body $body `
        -TimeoutSec $HttpTimeoutSeconds

    $Script:TokenInfo = [pscustomobject]@{
        AccessToken = $tokenResponse.access_token
        ExpiresOn   = (Get-Date).ToUniversalTime().AddSeconds([int]$tokenResponse.expires_in)
    }
    return $Script:TokenInfo.AccessToken
}

function Invoke-LAQuery {
    param(
        [Parameter(Mandatory=$true)] [string] $Kql,
        [Parameter(Mandatory=$true)] [string] $WorkspaceId
    )

    $attempt = 0
    while ($true) {
        $attempt++
        $token = Get-LogAnalyticsToken
        $headers = @{ Authorization = "Bearer $token" }
        $body    = @{ query = $Kql } | ConvertTo-Json

        try {
            return Invoke-RestMethod -Method Post `
                -Uri "https://api.loganalytics.azure.com/v1/workspaces/$WorkspaceId/query" `
                -Headers $headers -ContentType "application/json" `
                -Body $body -TimeoutSec $HttpTimeoutSeconds
        }
        catch {
            $status = $_.Exception.Response.StatusCode.value__
            $resp   = $null
            try { $resp = [System.IO.StreamReader]::new($_.Exception.Response.GetResponseStream()).ReadToEnd() } catch {}

            # 401: Refresh token
            if ($status -eq 401 -and $attempt -le $MaxRetries) {
                $Script:TokenInfo = $null
                Start-Sleep -Seconds ($BaseDelaySeconds * [math]::Pow(2, $attempt - 1))
                continue
            }

            # 429 or 5xx
            if (($status -eq 429 -or $status -ge 500) -and $attempt -le $MaxRetries) {
                $retryAfter = 0
                try { $retryAfter = [int]$_.Exception.Response.Headers["Retry-After"] } catch {}
                if ($retryAfter -le 0) {
                    $retryAfter = [int]($BaseDelaySeconds * [math]::Pow(2, $attempt - 1))
                }
                Write-Warning "Query throttled/failed (status $status). Retry in $retryAfter sec. Attempt $attempt/$MaxRetries"
                Start-Sleep -Seconds $retryAfter
                continue
            }

            throw "Log Analytics query failed (status $status): $resp"
        }
    }
}

function Convert-RowsToObjects {
    param(
        [Parameter(Mandatory=$true)] $ResultTable
    )
    $cols = $ResultTable.columns.name
    $rows = $ResultTable.rows | ForEach-Object {
        $o = [ordered]@{}
        for ($i=0; $i -lt $cols.Count; $i++) { $o[$cols[$i]] = $_[$i] }
        [pscustomobject]$o
    }

    foreach ($row in $rows) {
        foreach ($p in $row.PSObject.Properties) {
            $v = $p.Value
            if ($v -is [System.Collections.IDictionary] -or
                $v -is [System.Array] -or
                $v -is [PSCustomObject]) {
                $row.($p.Name) = ($v | ConvertTo-Json -Compress -Depth 50)
            }
        }
    }
    return $rows
}

function Wait-ForRateLimit($startedAt, [int]$minSeconds) {
    $elapsed = [int]((Get-Date).ToUniversalTime() - $startedAt).TotalSeconds
    $remain  = $minSeconds - $elapsed
    if ($remain -gt 0) { Start-Sleep -Seconds $remain }
}

<# ======================= Query Loop ======================= #>

$cursor = $StartUtc
while ($cursor -lt $EndUtc) {
    $iterStart = [datetime]::UtcNow

    $chunkStart = $cursor
    $chunkEnd   = $cursor.AddHours($ChunkHours)
    $cursor     = $chunkEnd

    $stamp   = $chunkStart.ToString("yyyyMMddHHmm")
    $outFile = Join-Path $OutDir ("{0}{1}.csv" -f $FilePrefix, $stamp)
    if ($SkipExisting -and (Test-Path $outFile)) {
        Write-Host "Skip: $outFile"
        Wait-ForRateLimit $iterStart $MinIntervalSeconds
        continue
    }

    $startIso = $chunkStart.ToString("yyyy-MM-ddTHH:mm:ssZ")
    $endIso   = $chunkEnd.ToString("yyyy-MM-ddTHH:mm:ssZ")

    $kql = @"
$Table
| where TimeGenerated >= datetime('$startIso')
| where TimeGenerated <  datetime('$endIso')
| order by TimeGenerated asc
"@

    Write-Host ("Query {0}Z ~ {1}Z" -f $chunkStart.ToString("s"), $chunkEnd.ToString("s"))

    try {
        $r = Invoke-LAQuery -Kql $kql -WorkspaceId $WorkspaceId

        if (-not $r.tables -or $r.tables.Count -eq 0 -or -not $r.tables[0]) {
            Write-Host "  -> No result table."
        } else {
            $rows = Convert-RowsToObjects -ResultTable $r.tables[0]
            if ($rows -and $rows.Count -gt 0) {
                $rows | Export-Csv -Path $outFile -NoTypeInformation -Encoding UTF8
                Write-Host ("  -> {0} rows -> {1}" -f $rows.Count, $outFile)

                if ($rows.Count -ge 450000) {
                    Add-Content -Path (Join-Path $OutDir "_oversized.txt") -Value "$startIso~$endIso,$($rows.Count)"
                    Write-Warning "Result very large ($($rows.Count) rows). Consider reducing chunk size for this period."
                }
            } else {
                Write-Host "  -> No rows."
            }
        }
    }
    catch {
        Write-Warning "Error range: $startIso ~ $endIso"
        Write-Warning "Error: $($_.Exception.Message)"
        Add-Content -Path (Join-Path $OutDir "_failed.txt") -Value "$startIso~$endIso"
    }

    Wait-ForRateLimit $iterStart $MinIntervalSeconds
}

Write-Host "Done. Output dir: $OutDir"

 

 

 

 

Tip: Adjust ChunkHours and MinIntervalSeconds to avoid hitting API throttling limits.

When everything is configured correctly, the export process will look like this:


Step 4. Connect Power BI (Load CSV from SharePoint)

From my perspective, the ideal approach would be for Sentinel to natively support BI integration.
Although it provides queries that allow you to connect Power BI as shown below, due to API call limitations, a separate storage layer is required for effective use in BI.

 

The Sentinel Data Lake feature is currently available in preview, but it appears that Power BI integration is not yet supported.
For now, we’ll store the data in SharePoint Online, which is a cost-effective option, and then aggregate it in Power BI.

 

 

Upload CSV to SharePoint

 

Power BI Desktop > Get Data > Blank query

 

Advanced Editor

 

Paste the query below. (This was created with the help of AI.)

let
    // ========== ① User Settings ==========
    SiteUrl         = "https://clim823.sharepoint.com/sites/Sentinel",
    LibraryName     = "Shared Documents",      
    TargetFolder    = "IdentityLogonEvents",     
    FileNamePrefix  = "IdentityLogonEvents",     
    KeepLastNMonths = 6,

    // ========== ② File → Table Conversion Function ==========
    ParseCsv = (fileContent as binary) as table =>
        let
            csv = Csv.Document(
                    fileContent,
                    [Delimiter = ",", Columns = null, Encoding = 65001, QuoteStyle = QuoteStyle.Csv]
                  ),
            promoted = Table.PromoteHeaders(csv, [PromoteAllScalars = true])
        in
            promoted,

    // ========== ③ Navigate to Target Folder ==========
    Source      = SharePoint.Contents(SiteUrl, [ApiVersion = 15]),
    Library     = Source{[Name=LibraryName]}[Content],
    Folder      = Library{[Name=TargetFolder]}[Content],   // navigate into the target folder (e.g., IdentityLogonEvents)

    // ========== ④ Filter Files ==========
    FilteredByName = Table.SelectRows(Folder, each Text.StartsWith([Name], FileNamePrefix)),
    FilteredByExt  = Table.SelectRows(FilteredByName, each Text.Lower([Extension]) = ".csv"),

    // ========== ⑤ Load Files → Convert to Tables ==========
    AddedData   = Table.AddColumn(FilteredByExt, "Data", each ParseCsv([Content]), type table),
    TablesList  = List.RemoveNulls(List.Transform(AddedData[Data], each try _ otherwise null)),

    // ========== ⑥ Align Schema & Merge ==========
    AllCols        = if List.Count(TablesList) = 0 
                     then {} 
                     else List.Distinct(List.Combine(List.Transform(TablesList, each Table.ColumnNames(_)))),
    AlignedTables  = List.Transform(TablesList, each Table.ReorderColumns(_, AllCols, MissingField.UseNull)),
    Appended       = if List.Count(AlignedTables) = 0 
                     then #table(AllCols, {}) 
                     else Table.Combine(AlignedTables),

    // ========== ⑦ Filter by Last N Months ==========
    WithTimestampTyped = if List.Contains(Table.ColumnNames(Appended), "Timestamp")
                         then Table.TransformColumnTypes(Appended, {{"Timestamp", type datetime}})
                         else Appended,

    FilteredByDate =
        if List.Contains(Table.ColumnNames(WithTimestampTyped), "Timestamp")
        then Table.SelectRows(WithTimestampTyped, each [Timestamp] >= Date.AddMonths(DateTime.LocalNow(), -KeepLastNMonths))
        else WithTimestampTyped
in
    FilteredByDate

 

Close & Apply

 

Using this data, you can build dashboards that provide valuable insights into identity-related activities, as shown below.


Why This Matters

By connecting MDI logs to Sentinel and then visualizing them in Power BI, you can:

  • Detect suspicious identity activities faster
  • Correlate identity signals with other security data
  • Build interactive dashboards for security insights

 


▶ Watch on YouTube: Microsoft 365 Log Management (1): Getting Started with Sentinel

 


Why Log Management Matters in Microsoft 365

One of the biggest challenges I faced while managing Microsoft 365 was log management.
Initially, message trace and audit logs were enough. But as I started incorporating security solutions like Microsoft Defender, the amount of data skyrocketed.


How We Used to Do It

Previously, I relied on PowerShell scripts to extract logs, store them in a separate repository, and later manage them via SQL Server for analysis.
While this worked, it had several drawbacks:

  • Required a dedicated VM for log collection
  • Credential management was cumbersome and posed security risks
  • Didn’t align well with the SaaS-first approach
  • Frequent schema changes and new log types increased maintenance overhead

In short, the process became increasingly labor-intensive.


Why I Chose Microsoft Sentinel

To solve these issues, I turned to Microsoft Sentinel.
Although Sentinel is primarily a SIEM solution, my initial goal is centralized log management. Here’s why Sentinel stood out:

  • Native integration with Microsoft 365
  • Automated log collection and schema updates
  • Easy integration with Defender, Entra, Intune, and more

The Role of AI

Thanks to AI, the barrier to entry for these technologies has dropped significantly.
With Copilot, I can leverage the data stored in Sentinel more intelligently.
Once logs are ingested into Sentinel, it’s like having a database ready for advanced analytics—and AI can answer questions based on that data.

This marks the beginning of a shift from manual log management to a more automated and intelligent approach.


What is Microsoft Sentinel?

Microsoft Sentinel is a cloud-native SIEM (Security Information and Event Management) solution that collects and analyzes security logs and events from multiple sources.
It supports threat detection, automated response, and security operations efficiency.

Learn more: What is Microsoft Sentinel? | Microsoft Learn


Microsoft 365 Log Collection Architecture

Here’s the architecture I’m planning for Microsoft 365 → Sentinel:

Microsoft 365 Log Collection Architecture

  • Signals from various Microsoft 365 services are sent to Sentinel via built-in connectors
  • However, not all logs are supported by default
  • Unsupported logs require API calls or custom connectors

Note: In this post, we’ll focus on enabling Sentinel. Detailed configurations for each service will be covered in future posts.


Steps to Enable Microsoft Sentinel

1. Access Azure Portal
https://portal.azure.com → Search for Sentinel


2. Create a Sentinel Resource


- Create a new resource group


- Create a Log Analytics Workspace

 

This is just a standard Log Analytics workspace.

 

Move to Sentinel → Create


- Add Microsoft Sentinel to the workspace

 

Adding Microsoft Sentinel

 


3. Add Microsoft 365 Data Connectors
- Go to Content Hub

 


Currently, Sentinel is being integrated into the Defender portal.
If you go to Defender (security.microsoft.com) and click Microsoft Sentinel, you can confirm that it is being provisioned.

 

If you refresh the Content hub within Sentinel on Azure, you will see the available content that can be added, as shown below.

 

For a simple connection test, search for Microsoft Entra ID and proceed with the installation.

 


Data connectors > Microsoft Entra ID > Open connector page

 

Select the logs to import and apply changes.



4. Verify Log Collection
- Wait for logs to populate


- Use KQL mode to query and validate data ingestion
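If you want to run the same validation outside the portal, here is a minimal sketch using the Az.OperationalInsights module, assuming the Entra ID connector is sending SigninLogs:

# Requires: Install-Module Az.OperationalInsights; Connect-AzAccount
$workspaceId = "<LOG_ANALYTICS_WORKSPACE_ID>"

# Pull a few recent sign-in records to confirm the connector is flowing
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query "SigninLogs | take 10"

$result.Results | Format-Table TimeGenerated, UserPrincipalName, AppDisplayName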


What’s Next?

In the next post, I’ll cover enabling specific Microsoft 365 logs and, if needed, the E5 onboarding process.


Tip: If you’re planning to integrate Sentinel with Microsoft 365, start small—enable core connectors first, then expand gradually.


This time, we will cover the topic of ADFS & WAP Upgrade & Migration.

As indicated in the title, the upgrade and migration will be performed from Windows Server 2022 to 2025.

For reference, the servers are referred to by role and OS version: the existing Windows Server 2022 servers are ADFS2022 and WAP2022, and the new Windows Server 2025 servers are ADFS2025 and WAP2025.

 

Youtube: https://youtu.be/BYR4fl7o29o

 

 

Step 1. Installing ADFS 2025

 

First, join the server where you will install ADFS to the Active Directory domain.

 

 

Go to Server Manager -> Add Roles and Features.

 

 

Proceed with installing the Active Directory Federation Services role.

 

 

Click Install.

 

 

Next, select Configure the federation service on this server.

 

 

Choose Add a federation server to a federation server farm.

 

 

Click Change and enter the credentials of a Domain Admin account.

 

 

Enter the information of the existing ADFS server.

 

 

Specify the certificate (ensure the certificate installation has been completed beforehand).

 

 

Provide the ADFS service account details.

 

 

Proceed with the installation process.

 

 

Close

 

 

Once the installation is complete, launch AD FS Management.

 

 

You will see that the current server is set as Secondary. A switch between Primary and Secondary needs to be performed.

 

 

On the newly installed 2025 server, run the following command to switch it to Primary:

Set-AdfsSyncProperties -Role PrimaryComputer

 

 

To change the existing ADFS 2022 server to Secondary, run this command on the 2022 server:

Set-AdfsSyncProperties -Role SecondaryComputer -PrimaryComputerName <ADFS2025 server name>

 

 

When you open the management console on ADFS 2022, you will see it is now set as Secondary.

 

 

On ADFS 2025, confirm that it has switched to Primary.

 

 

Finally, update the internal DNS to point the ADFS address to the new server’s IP.

 

 

Step 2. Remove the Existing ADFS 2022

 

 

From the Roles installation menu, start the Remove Roles and Features Wizard.

 

 

Uncheck the Active Directory Federation Services role and proceed with the removal.

 

 

Close

 

 

Once the removal is complete, change the server’s membership from the domain to a Workgroup.

 

 

Step 3. Install WAP2025

 

 

Open the hosts file on the existing WAP2022 server with Notepad, copy its contents, and save it to the WAP2025 server.

 

 

Note that while published configurations are migrated, certificates are not included, so make sure to back up and import each certificate separately.

 

 

On WAP2025, proceed to install the Remote Access Role.

 

 

Check Web Application Proxy and continue with the installation.

 

 

Open the Web Application Proxy Wizard

 

 

Enter the ADFS service URL and credentials.

 

 

Select the pre-installed certificate.

 

 

Configure

 

 

Close

 

 

The interface will display as if a cluster is configured.

 

 

You can verify the current connected servers with the command:
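For example, Get-WebApplicationProxyConfiguration exposes the currently registered WAP servers through its ConnectedServersName property:

# Lists the WAP servers currently registered with the configuration
Get-WebApplicationProxyConfiguration | Select-Object -ExpandProperty ConnectedServersName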

 

 

Similar to 2019 and 2022 versions, the Configuration Version remains as Windows Server 2016.

 

Step 4. Remove WAP2022

 

 

On WAP2022, start the Remove Roles and Features Wizard.

 

 

Uncheck the Remote Access – Web Application Proxy role and proceed with removal.

 

 

Update the currently connected server information using the following command on WAP2025:

Set-WebApplicationProxyConfiguration -ConnectedServersName <WAP2025>

 

 

Confirm that the connection information has been updated correctly.

 

 

Successful login was also confirmed via Office.com, indicating that no additional action is required in Entra ID Connect and no major issues are expected.


Youtube: https://youtu.be/VEyKbmwxoaU

 

 

Exchange Server Subscription Edition (SE) Has Finally Been Released

Exchange Server Subscription Edition (SE) is now available | Microsoft Community Hub

Copilot AI Summary

This page announces the general availability of Exchange Server Subscription Edition (SE). The main points are as follows:

  • Background of the Release: Exchange SE continues Microsoft’s tradition of providing enterprise-grade email services across cloud, on-premises, and hybrid environments.
  • Service and Licensing Changes: Exchange SE follows the Modern Lifecycle Policy, meaning there is no predefined end-of-support date.
  • Upgrade Details: In-place upgrades from Exchange Server 2019 CU14 or CU15 to Exchange SE are recommended.
  • Differences: While Exchange SE RTM is functionally the same as Exchange 2019 CU15, the name and version number have been updated.
  • Future Plans: After October 2025, Exchange SE will be the only supported on-premises version. New features and installation requirements will be added in the future.

The page also mentions the release of Skype for Business Server Subscription Edition.


It’s really convenient to have Copilot summarize the page like this.

AI makes it easy to understand and concisely presents the key points.

As of now, Subscription Edition is more of a version rename than a functional update.

So if you're upgrading from 2019, there's no need to rebuild your environment — an in-place upgrade is enough.

That’s why it feels more like an update rather than a full upgrade.

 

You can download the installation file from the link below:

Exchange Server build numbers and release dates | Microsoft Learn

Over time, the term RTM may be phased out.

 

Let’s walk through what happens when you upgrade from CU15, for comparison.

 

Mount the ISO file and run the Setup file.


You’ll notice the label SUBSCRIPTION EDITION at the top of the installer screen.

 

The installation proceeds the same way as in previous versions.

 

 

After the installation completes, you’ll see the version number has been updated.
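If you want to confirm this from the Exchange Management Shell rather than the installer UI, a quick sketch (the server name is a placeholder):

# AdminDisplayVersion reflects the new Subscription Edition build number
Get-ExchangeServer -Identity "<SERVER_NAME>" | Format-List Name, AdminDisplayVersion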

 

DAG is also maintained without any issues.


Previous Post

2024.12.31 - [Exchange] - Exchange Server 2019. Deployment (2): Configuration (CU14, Nov24SUv2 / Windows Server 2022)

 

While creating a YouTube video, I also decided to write this blog post. I revisited DAG configuration after a long time, thinking it would be useful when setting up a test environment for the upcoming Subscription Edition upgrade.

In Korea, DAG is often referred to as "redundancy." It is a feature in Exchange Server that provides automatic failover in case of database issues. A more detailed explanation involves multiple scenarios, but for now, I will keep it simple and focus on the basic setup.

 

https://youtu.be/oJbLbREw1zA

 

 

 

The environment and specifications remain the same as in the previous post, with three Exchange Servers making up the DAG. The final architecture is as follows:

 

IPLess DAG Configuration

This time, I am using the IPLess configuration approach.

Database availability groups | Microsoft Learn

The IPLess configuration has the following characteristics:

  • No IP address is assigned to the cluster/DAG, so there is no IP resource in the cluster core resource group.
  • No network name is assigned to the cluster, meaning there is no network name resource in the cluster core resource group.
  • The cluster/DAG name is not registered in DNS and cannot be resolved on the network.
  • A Cluster Name Object (CNO) is not created in Active Directory.
  • The cluster cannot be managed using Failover Cluster Manager but must be managed using Windows PowerShell, with cmdlets executed on individual cluster members.

I asked GPT to compare the traditional DAG approach with the IPLess approach, and the results are summarized in the table below:

| Aspect | Traditional DAG | IPLess DAG |
| --- | --- | --- |
| Active Directory Dependency | Requires CNO and AD objects | No AD objects required |
| IP Address | Requires static IP | No IP required |
| DNS Registration | Required | Not required |
| Failover Speed | Relatively slower | Relatively faster |
| Management Complexity | Requires AD and network management | Reduced management burden |
| Security Concerns | Requires AD object management and permissions | No AD objects needed |

If there are no compatibility issues with third-party solutions, IPLess DAG is recommended.

 

Prerequisites

When setting up a DAG, the disk structure must be identical across all servers. If the DB disk is drive D: on one server, all other servers must also use drive D: for their DB disks.

 
 
Step 1. Creating the Witness Directory

Before proceeding, let's understand what a Witness is.

1. What is a Witness Server?

A Witness Server is a server that provides a quorum vote to maintain the cluster quorum within a Database Availability Group (DAG). A DAG requires an odd number of votes (Quorum) to function properly, and the Witness Server helps achieve this.

DAGs operate as Windows Failover Clusters consisting of multiple Mailbox Servers, maintaining a quorum for high availability. If the number of Mailbox Servers in the DAG is even (e.g., 2, 4, 6...), an additional vote is needed, which is provided by the Witness Server.

You might wonder why a Witness is necessary when there are already three servers in the DAG. GPT provided the following explanation:

| Server Count | Total Votes (Including Witness) | Operation Status | Quorum Status |
| --- | --- | --- | --- |
| All 3 servers operational | 4 (3 servers + 1 Witness) | ✅ Running normally | OK (4/2 = 2 or more required) |
| 1 server fails (2 remaining) | 3 (2 servers + 1 Witness) | ✅ Running normally | OK (3/2 = 1.5 → rounded up to 2) |
| 2 servers fail (1 remaining) | 2 (1 server + 1 Witness) | ✅ Running normally | OK (2/2 = 1 or more required) |
| All servers fail (0 remaining) | 1 (Witness only) | ❌ DAG stops | Failed (1/2 = 0.5 → less than 1 required) |

To ensure stable operation, a Witness is essential.

 

2. What is a Witness Directory?

A Witness Directory is a shared folder on the Witness Server used for DAG operations. It stores files that record the cluster state and helps determine quorum status during a failover.

Default Witness Folder Settings:

  • A shared folder must be created on the Witness Server.
  • Typically located at C:\DAGWitness.
  • The Witness Server must be able to communicate with all Mailbox Servers in the DAG.
  • The Exchange Trusted Subsystem group must have Read/Write permissions on the folder.

The Witness Server must be a separate system, and a Witness folder must be created on it. In my setup, I am using the Azure AD Connector server as the Witness Server (recently renamed to Entra ID Connect).

 

 

Creating the Witness Folder on the Witness Server

 

 

Right-click the folder -> Properties

 

 

Navigate to Sharing -> Share

 

 

Click Find People

 

 

Enter Exchange Trusted Subsystem -> Check Names -> OK

 

 

Set Permission Level: Read/Write -> Share

 

Click Done

 

 

Right-click Start Button -> Computer Management

 

 

Go to Local Users and Groups -> Groups -> Administrators

 

 

Click Add

 

 

Enter Exchange Trusted Subsystem -> Check Names -> OK

 

 

The Witness folder is now created and configured with the necessary permissions.

 
Step 2. Configuring the DAG

Next, let's configure the Exchange Servers into a DAG.

 

 

Open Exchange Admin Center (ECP) -> Servers -> Database Availability Groups -> Add

 

 

Specify the DAG name -> Enter Witness Server details -> Click Save

 

 

The DAG is created as shown below.

 

 

Click Manage DAG Membership

 

 

Add one Exchange Server first -> Click Save

 

 

The configuration process starts.

 

 

Add the remaining Exchange Servers using the same steps.
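The same DAG creation and membership steps can also be performed from the Exchange Management Shell. A sketch with placeholder names (the DAG, witness server, and mailbox server names are assumptions for illustration):

# Create the DAG; omitting -DatabaseAvailabilityGroupIpAddresses creates it
# without an administrative access point (the IPLess configuration)
New-DatabaseAvailabilityGroup -Name "DAG01" `
    -WitnessServer "WITNESS01" -WitnessDirectory "C:\DAGWitness"

# Add each mailbox server to the DAG
Add-DatabaseAvailabilityGroupServer -Identity "DAG01" -MailboxServer "EX01"
Add-DatabaseAvailabilityGroupServer -Identity "DAG01" -MailboxServer "EX02"
Add-DatabaseAvailabilityGroupServer -Identity "DAG01" -MailboxServer "EX03"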

 

Step 3. Database Replication

After setting up the DAG, replicate the databases as follows:

 

 

Navigate to Databases -> Select a DB -> Click Add Database Copy

 

 

Add the Exchange Server -> Click Save

 

 

If circular logging is enabled, an error will occur. Disable circular logging before proceeding, then re-enable it later.
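A shell sketch of the same flow, with placeholder database and server names:

# Temporarily disable circular logging on the source database
Set-MailboxDatabase -Identity "DB01" -CircularLoggingEnabled $false

# Add a copy of the database to another DAG member
Add-MailboxDatabaseCopy -Identity "DB01" -MailboxServer "EX02"

# Re-enable circular logging once the copy is added
Set-MailboxDatabase -Identity "DB01" -CircularLoggingEnabled $true

# Check replication health (should report Healthy)
Get-MailboxDatabaseCopyStatus -Identity "DB01"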

 

 

If an error occurs initially, wait a moment and click Update to force replication.

 

 

Once complete, verify that the replication status is Healthy.

 

 

Check the other servers to confirm that replication is functioning correctly.

 

With this setup, your Exchange Server DAG is now fully configured using the IPLess approach, providing high availability and redundancy.
