While organizing Intune policies, I discovered the existence of the Intune Data Warehouse and realized that it’s possible to build BI dashboards using Power BI.
Searching on YouTube, I found that connection methods have been available for quite some time.
My goal is to visualize every area of M365, so I decided to take on the challenge right away.
There are two main ways to connect the Intune Data Warehouse to Power BI.
Method 1. OData Feed
In Power BI, select Get data > OData feed
Enter the feed URL (shown in the Intune admin center under Reports > Intune Data warehouse)
Enter your organizational account and click Connect
All available tables will be listed – check them all and click Load
The data will then load; once it finishes, the import is complete.
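For reference, the M query generated by these steps looks roughly like the following sketch; the feed URL below is a tenant-specific placeholder, so replace it with your own:
let
    // Placeholder feed URL – copy yours from the Intune admin center
    Source = OData.Feed(
        "https://fef.<asu>.manage.microsoft.com/ReportingService/DataWarehouseFEService?api-version=v1.0",
        null,
        [Implementation = "2.0"]
    ),
    // Navigate to one of the listed entities, e.g. the devices table
    devices = Source{[Name = "devices", Signature = "table"]}[Data]
in
    devices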
Method 2. Connector
In Power BI, select Get Data > More
Online Services > Intune Data Warehouse
Specify the period of data to import
Select tables and click Load (the remaining steps are the same as for the OData feed)
The Connector brings in more tables, but the meaningful data is similar. The OData feed supports custom queries through its advanced query option, while the Connector lets you specify the period to import.
This post will proceed using the Connector method.
2. Download Power BI Template
Most Intune dashboard resources are based on the following template:
Go to Transform data > Data source settings to check the Connector-based connection.
When you click Refresh, you may encounter an error like the one below:
The template creator’s blog suggested checking the technical documentation below and changing the locale, but even after changing it, the issue was not resolved. Therefore, I proceeded by copying the template instead.
Supported languages and countries/regions for Power BI
In your Power BI file (the one connected to your data), add pages at the bottom with the same names as in the template.
Then copy and paste the three template pages as shown below.
3. Add Objects and Set Relationships
Since the structure may not match, you might encounter some errors.
Adjust the structure to match.
This error occurs because the Text Filter object is missing.
Go to More visuals > From AppSource.
Search for and add the Text Filter.
After refreshing or switching pages, you’ll see the issue is resolved.
Errors on the Devices page occur because table relationships do not match the template.
Open the Model view to compare the number of relationships.
When data is first imported, Power BI creates relationships automatically.
Since each environment is different, table relationships may vary. Use the following approach as a reference, and match the relationships to the template as needed.
Go to Manage relationships.
Some relationships that exist in the template may be missing in your file.
Match the structure to the template, then save.
Sometimes, relationships are not automatically created because there’s no data on one side.
If any relationships have Active and Inactive reversed, fix them as well.
Errors on the Devices page will be resolved.
There are no errors on the ConfigProfiles page either.
4. Conclusion
By leveraging Power BI, you can intuitively manage Intune devices.
While exporting logs using PowerShell, I started to wonder: as we move toward a more serverless cloud environment, managing logs via scheduled PowerShell scripts still means operating a VM, which increases management overhead.
If you’re only considering cost, scheduling PowerShell scripts on a VM and exporting to SharePoint or OneDrive can be cheaper. However, from a long-term perspective, I believe it’s time to move away from running scheduled PowerShell scripts on VMs and adopt a serverless approach.
Also, visualizing and managing logs with BI tools can provide valuable insights. With this in mind, I anticipate that connecting to Microsoft Fabric or similar platforms will eventually become necessary.
In this post, I’ll cover how to export logs to Azure Data Lake Storage (ADLS) Gen2 and connect them to BI.
Step 1. Create a Storage Account
Data Lake Storage Gen2 is suitable for big data analytics and other data analysis scenarios; enable the hierarchical namespace when creating the account.
4. Complete the creation and verify the storage account
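If you prefer scripting this step, here is a minimal sketch using the Az PowerShell module; all names are placeholders, and the key option is the hierarchical namespace, which is what makes a StorageV2 account ADLS Gen2:
# Placeholder names; requires the Az.Storage module
New-AzStorageAccount `
    -ResourceGroupName "rg-logs" `
    -Name "stm365logs" `
    -Location "koreacentral" `
    -SkuName "Standard_LRS" `
    -Kind "StorageV2" `
    -EnableHierarchicalNamespace $true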
Step 2. Create an Export Rule
1. Go to Log Analytics Workspace → Settings → Data Export → Create export rule
2. Name your rule
3. Select the tables to export
4. Set the destination to the storage account you created
5. Go to Data storage → Containers to check the exported tables
6. Navigate through subfolders to see that exports occur every 5 minutes
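The same export rule can also be created from PowerShell; below is a sketch using Az.OperationalInsights, where the table list and resource IDs are placeholders:
# Placeholder values; requires the Az.OperationalInsights module
New-AzOperationalInsightsDataExport `
    -ResourceGroupName "rg-logs" `
    -WorkspaceName "law-m365" `
    -DataExportName "export-to-adls" `
    -TableName "SigninLogs","AuditLogs" `
    -ResourceId "/subscriptions/<subscription-id>/resourceGroups/rg-logs/providers/Microsoft.Storage/storageAccounts/stm365logs"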
Step 3. Connect to Power BI
1. In Power BI Desktop, go to Get data → More
2. Select Azure → Azure Data Lake Storage Gen2
3. You’ll be prompted to enter a URL
4. Find the DFS URL using Azure Storage Explorer
Go to Storage Account → Storage browser → Download and install Azure Storage Explorer
Connect, navigate to the folder path, and open Properties
Copy the DFS URL
5. Paste the URL into Power BI
6. Enter your credentials (Account Key)
You can find the Account Key under Security + networking → Access keys
7. Connect and then Combine & Transform Data
Unlike saving to SharePoint, where you need to create queries manually, the native connector support makes this process much simpler.
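For reference, a hand-written version of this query in M looks roughly like the sketch below; the URL is a placeholder, and it assumes the export writes JSON-lines blobs (one record per line):
let
    // DFS URL copied from Azure Storage Explorer (placeholder values)
    Source = AzureStorage.DataLake("https://<storage-account>.dfs.core.windows.net/<container>"),
    // Keep only the exported JSON blobs
    JsonFiles = Table.SelectRows(Source, each Text.EndsWith([Name], ".json")),
    // Parse each blob line by line
    Parsed = Table.AddColumn(JsonFiles, "Records",
        each List.Transform(Lines.FromBinary([Content]), Json.Document))
in
    Parsed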
Conclusion
By following these steps, you can export Microsoft 365 logs to Azure Data Lake Storage Gen2 and easily visualize them in Power BI. If you’re considering a serverless environment and BI integration, this approach offers a more efficient and scalable way to manage your logs in the long run.
In the previous post, we explored how to enable Microsoft Sentinel and start collecting Microsoft 365 logs. This time, we’ll focus on integrating Microsoft Defender for Identity (MDI) logs into Sentinel and preparing them for Power BI visualization.
Check Sensor Activation: With the latest MDI v3, activation is much simpler. If your Domain Controller is already onboarded to Microsoft Defender for Endpoint (MDE), MDI can be enabled without additional steps.
(A separate post will cover the new version once it’s officially released.)
Verify Signals: Go to Advanced Hunting and confirm that IdentityLogonEvents are being recorded.
→ If signals appear here, you can confirm that Sentinel is also receiving MDI logs.
Connector Setup: Navigate to Microsoft Defender XDR → Open connector page.
→ Enable Microsoft Defender for Identity and save.
After a short delay, you should be able to query MDI logs in Sentinel.
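For a quick check, a minimal KQL query like the sketch below should return recent events in Advanced Hunting; in the Sentinel workspace, the same table uses TimeGenerated instead of Timestamp:
IdentityLogonEvents
| where Timestamp > ago(1h)
| take 10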
Step 2. Register an Enterprise App for Sentinel Log Export
Currently, Advanced Hunting and Sentinel have limitations when running large queries. Our ultimate goal is to visualize data in Power BI, so we’ll first store logs as CSV files in SharePoint.
To achieve this, we’ll use the Log Analytics API, which requires Enterprise App registration.
Registration Steps
1. Go to Entra Admin Center → App registrations → New registration
2. Name the app → Register
3. Navigate to API permissions → Add a permission
4. Select APIs my organization uses → Log Analytics API
5. Check Data.Read → Add permissions
6. Click Grant admin consent
7. Go to Certificates & secrets → New client secret → Add
8. Copy the generated Value and store it securely
9. In Log Analytics Workspaces → Access control (IAM), click Add role assignment
10. Assign Log Analytics Reader role
11. Grant the role to the newly created app
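Steps 9–11 can also be done in PowerShell; here is a sketch using Az.Resources, where the scope is a placeholder for your workspace's resource ID:
# Placeholder values; requires the Az.Resources module
New-AzRoleAssignment `
    -ApplicationId "<app-client-id>" `
    -RoleDefinitionName "Log Analytics Reader" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/rg-logs/providers/Microsoft.OperationalInsights/workspaces/law-m365"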
Step 3. Export Logs to CSV
To call the API, you'll need the following values:
Tenant ID & Client ID (from the app registration's Overview page)
Workspace ID (from the Log Analytics workspace Overview page)
Client Secret (the Value you copied earlier)
Once these values are ready, you can use a PowerShell script to call the Log Analytics API and export logs in chunks.
Using AI, I created a script to call the Log Analytics API.
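A minimal sketch of the core pattern (token acquisition via the client-credentials flow, then chunked queries against the Log Analytics API) is shown below; ChunkHours and MinIntervalSeconds correspond to the tip that follows, and all IDs are placeholders:
$TenantId = "<tenant-id>"
$ClientId = "<client-id>"
$ClientSecret = "<client-secret>"
$WorkspaceId = "<workspace-id>"
$ChunkHours = 6              # size of each query window
$MinIntervalSeconds = 5      # pause between API calls

# Acquire a token for the Log Analytics API (client-credentials flow)
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token" `
    -Body @{
        client_id = $ClientId; client_secret = $ClientSecret
        scope     = "https://api.loganalytics.io/.default"; grant_type = "client_credentials"
    }).access_token

# Walk the time range in chunks and append each result to a CSV
$cursor = (Get-Date).ToUniversalTime().AddDays(-7)
$end    = (Get-Date).ToUniversalTime()
while ($cursor -lt $end) {
    $chunkEnd = $cursor.AddHours($ChunkHours)
    $body = @{ query    = "IdentityLogonEvents"
               timespan = "$($cursor.ToString('o'))/$($chunkEnd.ToString('o'))" } | ConvertTo-Json
    $result = Invoke-RestMethod -Method Post `
        -Uri "https://api.loganalytics.io/v1/workspaces/$WorkspaceId/query" `
        -Headers @{ Authorization = "Bearer $token" } `
        -ContentType "application/json" -Body $body
    # The API returns tables/columns/rows; flatten into objects for Export-Csv
    $cols = $result.tables[0].columns.name
    foreach ($row in $result.tables[0].rows) {
        $obj = [ordered]@{}
        for ($i = 0; $i -lt $cols.Count; $i++) { $obj[$cols[$i]] = $row[$i] }
        [pscustomobject]$obj | Export-Csv "IdentityLogonEvents.csv" -NoTypeInformation -Append
    }
    $cursor = $chunkEnd
    Start-Sleep -Seconds $MinIntervalSeconds
}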
Tip: Adjust ChunkHours and MinIntervalSeconds to avoid hitting API throttling limits.
When everything is configured correctly, the export process will look like this:
Step 4. Connect Power BI (Load CSV from SharePoint)
From my perspective, the ideal approach would be for Sentinel to natively support BI integration.
Although it provides queries that allow you to connect Power BI as shown below, due to API call limitations, a separate storage layer is required for effective use in BI.
The Sentinel Data Lake feature is currently available in preview, but it appears that Power BI integration is not yet supported. For now, we’ll store the data in SharePoint Online, which is a cost-effective option, and then aggregate it in Power BI.
Upload the CSV files to SharePoint.
In Power BI Desktop, go to Get data → Blank query.
Open the Advanced Editor.
Paste the query below. (This was created with the help of AI.)
let
    // ========== ① User Settings ==========
    SiteUrl = "https://clim823.sharepoint.com/sites/Sentinel",
    LibraryName = "Shared Documents",
    TargetFolder = "IdentityLogonEvents",
    FileNamePrefix = "IdentityLogonEvents",
    KeepLastNMonths = 6,

    // ========== ② File → Table Conversion Function ==========
    ParseCsv = (fileContent as binary) as table =>
        let
            csv = Csv.Document(
                fileContent,
                [Delimiter = ",", Columns = null, Encoding = 65001, QuoteStyle = QuoteStyle.Csv]
            ),
            promoted = Table.PromoteHeaders(csv, [PromoteAllScalars = true])
        in
            promoted,

    // ========== ③ Navigate to Target Folder ==========
    Source = SharePoint.Contents(SiteUrl, [ApiVersion = 15]),
    Library = Source{[Name = LibraryName]}[Content],
    Folder = Library{[Name = TargetFolder]}[Content], // e.g. the IdentityLogonEvents folder

    // ========== ④ Filter Files ==========
    FilteredByName = Table.SelectRows(Folder, each Text.StartsWith([Name], FileNamePrefix)),
    FilteredByExt = Table.SelectRows(FilteredByName, each Text.Lower([Extension]) = ".csv"),

    // ========== ⑤ Load Files → Convert to Tables ==========
    AddedData = Table.AddColumn(FilteredByExt, "Data", each ParseCsv([Content]), type table),
    TablesList = List.RemoveNulls(List.Transform(AddedData[Data], each try _ otherwise null)),

    // ========== ⑥ Align Schema & Merge ==========
    AllCols = if List.Count(TablesList) = 0
        then {}
        else List.Distinct(List.Combine(List.Transform(TablesList, each Table.ColumnNames(_)))),
    AlignedTables = List.Transform(TablesList, each Table.ReorderColumns(_, AllCols, MissingField.UseNull)),
    Appended = if List.Count(AlignedTables) = 0
        then #table(AllCols, {})
        else Table.Combine(AlignedTables),

    // ========== ⑦ Filter by Last N Months ==========
    WithTimestampTyped = if List.Contains(Table.ColumnNames(Appended), "Timestamp")
        then Table.TransformColumnTypes(Appended, {{"Timestamp", type datetime}})
        else Appended,
    FilteredByDate = if List.Contains(Table.ColumnNames(WithTimestampTyped), "Timestamp")
        then Table.SelectRows(WithTimestampTyped, each [Timestamp] >= Date.AddMonths(DateTime.LocalNow(), -KeepLastNMonths))
        else WithTimestampTyped
in
    FilteredByDate
Close & Apply
Using this data, you can build dashboards that provide valuable insights into identity-related activities, as shown below.
Why This Matters
By connecting MDI logs to Sentinel and then visualizing them in Power BI, you can:
Detect suspicious identity activities faster
Correlate identity signals with other security data
Build interactive dashboards for security insights