AzLogDcrIngestPS
AzLogDcrIngestPS - Unleashing the power of Log Ingestion API with Azure LogAnalytics custom table v2, Azure Data Collection Rules and Azure Data Ingestion Pipeline
Install / Use
Introduction
If you are sending data using the HTTP Data Collector API (REST) today, you should continue reading: that API is being deprecated as part of the transition to the Log Ingestion API, which uses Azure Data Collection Rules, the Azure data ingestion pipeline, and Azure LogAnalytics custom tables (v2).

As you can see from the illustrations above, more components (DCE, DCR, pipeline, schema) are added, which increases the complexity and introduces challenges to take into account:
- DCRs and tables must be created before sending data
- Dependency on a DCE, which must exist
- Timing / delays when doing automations
- The schema for the data must be defined in both the DCR and the custom table (v2)
- Naming conventions, limitations and prohibited names
- Dealing with new properties - supporting both merge and overwrite
- Upload limit changes (from 32 MB to 1 MB per JSON call), requiring batching and size calculations
- Data manipulation of source data (filtering, removing columns)
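To illustrate the moving parts listed above, here is a minimal sketch of a raw upload through the Log Ingestion API using Invoke-RestMethod. The DCE endpoint, DCR immutable id, stream name and table layout below are placeholders, not real values - in practice the module handles token acquisition, schema management, batching and the 1 MB payload limit for you:

```powershell
# Sketch only - all ids/endpoints below are placeholders, not real values
$DceEndpoint    = "https://my-dce-abcd.westeurope-1.ingest.monitor.azure.com"  # your DCE ingestion endpoint
$DcrImmutableId = "dcr-00000000000000000000000000000000"                       # immutable id of your DCR
$StreamName     = "Custom-MyTable_CL"                                          # stream declared in the DCR

# Acquire a token for the monitor ingestion scope (assumes Az.Accounts is installed and you are signed in)
$Token = (Get-AzAccessToken -ResourceUrl "https://monitor.azure.com").Token

# The payload must be a JSON array; each record becomes a row in the custom table (v2)
# (ConvertTo-Json -AsArray requires PowerShell 7)
$Payload = @(
    [PSCustomObject]@{
        TimeGenerated = (Get-Date).ToUniversalTime().ToString("o")
        Computer      = $env:COMPUTERNAME
        Message       = "Hello"
    }
) | ConvertTo-Json -AsArray

$Uri = "$DceEndpoint/dataCollectionRules/$DcrImmutableId/streams/$($StreamName)?api-version=2023-01-01"
Invoke-RestMethod -Uri $Uri -Method Post -Body $Payload -ContentType "application/json" `
                  -Headers @{ Authorization = "Bearer $Token" }
```

Note that this raw call will fail unless the DCR, its stream declaration and the destination table already exist with a matching schema - exactly the plumbing the list above describes.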
Your new helper, AzLogDcrIngestPS
I am really happy to announce my new Powershell module, AzLogDcrIngestPS. This module eases the steps needed to send any data to Azure LogAnalytics custom logs (v2) - using the new features of the Azure Log Ingestion Pipeline, Azure Data Collection Rules & Log Ingestion API.
Think of this module as an "AnyConnector", which can be used to send data from any 3rd-party source using a PS script (as an intermediate) or as part of a script collecting data from endpoints. It can be used in Powershell scripts, Azure Functions, Azure Automation, etc.


The 25 functions will help you with:
- data manipulation before sending data in
- table / dcr / schema / transformation management
- data upload using Azure Log Ingestion Pipeline / Log Ingestion API
- support/security


Cool features of AzLogDcrIngestPS are:
- creates/updates the DCRs and tables automatically, based on the source object schema
- validates the schema for naming-convention issues; if issues are found, it mitigates them
- updates the schema of DCRs and tables if the structure of the source object changes
- auto-fixes if something goes wrong with a DCR or table
- can remove data from the source object if there are columns you don't want to send
- can convert source objects based on CIM or PS objects into PSCustomObjects/arrays
- can add relevant information to each record, like UserLoggedOn, Computer, CollectionTime
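As a hedged sketch of how these features combine in practice - the function and parameter names are as I recall them from the module, so verify them with Get-Command -Module AzLogDcrIngestPS; the table/DCR names are made-up samples, and authentication parameters (TenantId/AppId/AppSecret, DCE details) are omitted for brevity:

```powershell
# Sketch only - verify exact function/parameter names with: Get-Command -Module AzLogDcrIngestPS
Import-Module AzLogDcrIngestPS

$TableName = "InvClientComputerInfoTest"          # sample custom table name (assumption)
$DcrName   = "dcr-clt1-InvClientComputerInfoTest" # sample DCR name (assumption)

# Collect source data and normalize the CIM objects into a PSCustomObject array
$DataVariable = Get-CimInstance -ClassName Win32_ComputerSystem
$DataVariable = Convert-CimArrayToObjectFixStructure -Data $DataVariable

# Enrich every record with CollectionTime and extra columns
$DataVariable = Add-CollectionTimeToAllEntriesInArray -Data $DataVariable

# Validate/fix column names against naming conventions, then create or update
# the table + DCR based on the object schema, and finally upload the data
$DataVariable = ValidateFix-AzLogAnalyticsTableSchemaColumnNames -Data $DataVariable
CheckCreateUpdate-TableDcr-Structure -TableName $TableName -DcrName $DcrName -Data $DataVariable
Post-AzLogAnalyticsLogIngestCustomLogDcrDce -DcrName $DcrName -Data $DataVariable
```

The pattern is always the same: collect, normalize, enrich, validate, ensure table/DCR exist, upload.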
See the Powershell module AzLogDcrIngestPS in action
I have provided 4 demos for you to try, along with videos you can check out.
Alternatively, I have built a cool showcase - ClientInspector (v2), free for the community - where you can see how to use the functions from the AzLogDcrIngestPS module.

ClientInspector can bring back data from your clients using Azure Log Ingestion Pipeline, Azure Data Collection Rules and Azure LogAnalytics; view it with Azure Monitor & Azure Dashboards - and get "drift-alerts" using Microsoft Sentinel. It includes tons of great information and dashboards so you can see whether you are in control of your clients - or something is drifting from the desired state.
Videos
If you want to see things in action first, check out these videos:
Video 3m 19s - Running ClientInspector using commandline (normal mode)
Video 1m 40s - Automatic creation of 2 tables & DCRs (verbose mode)
Video 1m 37s - Automatic creation of 2 tables & DCRs (normal mode)
Video 1m 34s - See schema of DCR and table
Video 2m 19s - Data manipulation
Video 1m 58s - Kusto queries against data
Video 3m 01s - Dashboards
Video 0m 48s - Sample usage of data - lookup against Lenovo warranty db
Video 7m 25s - Deployment via ClientInspector DeploymentKit
Migration videos
If you are looking for videos on how to migrate your LogAnalytics tables from Classic (v1) to Data Collection Rule-based (v2) - replacing the HTTP Data Collector API (v1) with the Log Ingestion API (v2) - you can find them here:
- Side-by-Side Migration (new table, new naming convention)
- Re-use existing table – Migrate to DCR based format
Download latest version
You can download the latest version of AzLogDcrIngestPS here - or install it from the Powershell Gallery:
Install AzLogDcrIngestPS from Powershell Gallery
install-module AzLogDcrIngestPS
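A few plain Powershell lines to install for the current user and verify what you got (nothing here assumes anything about the module beyond its name):

```powershell
# Install for the current user only (no admin rights needed)
Install-Module -Name AzLogDcrIngestPS -Scope CurrentUser

# Verify the installed version and browse the exported functions
Get-Module -Name AzLogDcrIngestPS -ListAvailable | Select-Object Name, Version
Get-Command -Module AzLogDcrIngestPS | Sort-Object Name
```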
Download the AzLogDcrIngestPS module from this Github repository
Quick links for more information
How to get started in your own environment (demo)
Background for building this Powershell module
Deep-dive about Azure Data Collection Rules (DCRs)
Deep-dive about Log Ingestion API
Architecture, Schema & Networking
Security
Source data - what data can I use?
Example of how to use the functions
How can I modify the schema of a LogAnalytics table & Data Collection Rule, when the source object schema changes?
Migration of LogAnalytics v1 table to v2-format (keep existing)
How to enable verbose-mode & get more help?
Integration of AzLogDcrIngestPS in your scripts
Function synopsis
Detailed - Data Manipulation
Detailed - Table/DCR/Schema/Transformation management
Detailed - Data Out (upload to Azure LogAnalytics)
Detailed - Support functions (security)
Contact me
Credits & Thank You
Lastly, I would like to give big credit to a few people with whom I have worked on building the AzLogDcrIngestPS Powershell module and in my daily work with the Azure log & viewing capabilities:
|Name|Role|
|:---|:---|
|Ivan Varnitski|Program Manager - Azure Pipeline|
|Evgeny Ternovsky|Program Manager - Azure Pipeline|
|Nick Kiest|Program Manager - Azure Data Collection Rules|
|Oren Salzberg|Program Manager - Azure LogAnalytics|
|Guy Wild|Technical Writer - Azure LogAnalytics|
|John Gardner|Program Manager - Azure Workbooks|
|Shikha Jain|Program Manager - Azure Workbooks|
|Shayoni Seth|Program Manager - Azure Monitor Agent|
|Jeff Wolford|Program Manager - Azure Monitor Agent|
|Xema Pathak|Program Manager - Azure VMInsight (integration to Azure Monitor Agent)|
How to get started?
The 3 steps to get started with sending logs through Azure Pipeline using Log Ingestion API, Data Collection Rules and AzLogDcrIngestPS are:
Step 1 - Get the demo environment up and running
Download the Powershell script Step1-Deployment-DemoEnvironment
Modify the SubscriptionId and TenantId in the header before running the deployment.
The deployment script will perform the following tasks:
- create the Azure Resource Group for the Azure LogAnalytics workspace
- create the Azure LogAnalytics workspace
- create the Azure App registration used for upload of data by the demo-upload script
- create the Azure service principal on the Azure App
- create the needed secret on the Azure App
- create the Azure Resource Group for the Azure Data Collection Endpoint (DCE) in the same region as the Azure LogAnalytics workspace
- create the Azure Resource Group for the Azure Data Collection Rules (DCR) in the same region as the Azure LogAnalytics workspace
- create the Azure Data Collection Endpoint (DCE) in the same region as the Azure LogAnalytics workspace
- delegate permissions for the Azure App on the LogAnalytics workspace
- delegate permissions for the Azure App on the Azure Resource Group for Azure Data Collection Rules (DCR)
- delegate permissions for the Azure App on the Azure Resource Group for Azure Data Collection Endpoints (DCE)
$UseRandomNumber = $true
If ($UseRandomNumber)
{
    $Number = [string](Get-Random -Minimum 1000 -Maximum 10000)
}
Else
{
    $Number = "1"
}
# Azure App
$AzureAppName = "Demo" + $Number + " - Automation - Log-Ingestion"
$AzAppSecretName = "Secret used for Log-Ingestion"
# Azure Active Directory (AAD)
$TenantId = "<xxxxxx>" # "<put in your Azure AD TenantId>"
# Azure LogAnalytics
$LogAnalyticsSubscription = "<xxxxxx>" # "<put in the SubId of where to place environment>"
$LogAnalyticsResourceGroup = "rg-l
