# azure.datafactory.tools

Tools for deploying Data Factory (v2) in Microsoft Azure.
<img style="float: right;" src="./images/logo512.png" width="200px">

A PowerShell module that helps simplify Azure Data Factory CI/CD processes. This module was created to meet the demand for quick and trouble-free deployment of an Azure Data Factory instance to another environment.

The main advantage of the module is the ability to publish all of the Azure Data Factory service code from JSON files by calling one method. The module now supports:
- Creation of Azure Data Factory, if it doesn't exist
- Deployment of all types of objects:
  - Pipelines
  - DataSets
  - Linked Services
  - (mapping & wrangling) Data Flows
  - Triggers
  - Integration Runtimes
  - Managed Virtual Networks
  - Managed Private Endpoints
  - Credentials
- Finding the right order for deploying objects (no more worrying about object names)
- Built-in mechanism to replace, remove or add properties with the indicated values (CSV and JSON file formats supported)
- Stopping/starting triggers
- Dropping objects that no longer exist in the source (code)
  - Optionally, deletion of excluded objects can be skipped
- Filtering (include or exclude) of objects to be deployed by name and/or type and/or folder
  - Filtering supports wildcards
- Publish options that allow you to control:
  - Whether to stop and restart triggers
  - Whether to delete objects that are not in the source
  - Whether to create a new ADF instance if it doesn't exist
- Tokenisation in the config file, allowing any value to be replaced by an Environment Variable or a variable from a DevOps pipeline
- Defining multiple files (objects) by wildcard
- Global Parameters
- Support for Managed VNET and Managed Private Endpoints
- ⭐️ Incremental deployment (NEW!)
- Build function to support validation of files, dependencies and config
- Testing connections (Linked Services)
- Generating a Mermaid dependency diagram for use in Markdown documents
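As a sketch of how the publish options and filtering features above fit together: the option property names below (`StopStartTriggers`, `DeleteNotInSource`, `CreateNewInstance`, `Includes`) follow this module's `AdfPublishOption` object as documented, but verify them against your installed version; the resource names are placeholders.

```powershell
# Sketch: controlling publish behaviour via an options object.
# Cmdlet and property names assume this module's AdfPublishOption API.
$opt = New-AdfPublishOption
$opt.StopStartTriggers = $true          # stop triggers before deploy, restart them after
$opt.DeleteNotInSource = $false         # keep objects even if removed from the code
$opt.CreateNewInstance = $true          # create the ADF instance if it doesn't exist yet
$opt.Includes.Add("pipeline.Copy*", "") # wildcard filter: deploy only matching pipelines

Publish-AdfV2FromJson -RootFolder "c:\GitHub\AdfName\" `
    -ResourceGroupName "rg-devops-factory" `
    -DataFactoryName "SQLPlayerDemo" `
    -Location "NorthEurope" `
    -Option $opt
```

With `Includes` populated, only the matching objects are deployed; everything else is left untouched, which pairs naturally with `DeleteNotInSource = $false`.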
## Table of Contents

<details close>
<summary>Expand/Collapse</summary>

- azure.datafactory.tools
- Table of Contents
- Known issues
- Overview
- How to start
- Publish Azure Data Factory
- Examples
- How it works
- Selective deployment, triggers and logic
- Build/Test Azure Data Factory code
- Test connection of Linked Service (preview)
- Generate dependencies diagram
- Export ADF code to ArmTemplate
- Publish ADF using ArmTemplate file(s) (preview)
- Publish from Azure DevOps
- Release Notes
- Misc

</details>
## 📚 New Structured Documentation Available!

We've reorganized the documentation into focused guides.

Start here: 📖 Structured Documentation

Or jump directly to: Getting Started | Publishing Guide | Cmdlet Reference
## Known issues
- Native CDC objects are not yet supported.
## Overview

This module works with Azure Data Factory V2 only and uses Microsoft's Az.DataFactory PowerShell module to manage objects in the ADF service.
### Support

The module works with Windows PowerShell 5.1, and with PowerShell Core 6.0 and above. This means you can use Linux-based agents in your Azure DevOps pipelines.
## How to start
### Install-Module

To install the module, open a PowerShell command-line window and run the following lines:

```powershell
Install-Module -Name azure.datafactory.tools -Scope CurrentUser
Import-Module -Name azure.datafactory.tools
```
If you want to upgrade the module from a previous version:

```powershell
Update-Module -Name azure.datafactory.tools
```
Check the currently installed version of the module:

```powershell
Get-Module -Name azure.datafactory.tools
```
Source: https://www.powershellgallery.com/packages/azure.datafactory.tools
## Publish Azure Data Factory

This module publishes all objects from the JSON files stored by ADF in a code repository (collaboration branch). Bear in mind that we are talking about the master branch, NOT the adf_publish branch.

If you want to deploy from the adf_publish branch, read this article: Deployment of Azure Data Factory with Azure DevOps.
### Where is my code?

If you have never seen the code of your Azure Data Factory instance, you first need to configure a code repository for your ADF. This article will help you do that: Setting up Code Repository for Azure Data Factory v2.

Once you have set up the code repository, clone the repo and pull (download) it onto your local machine. The folder structure should look like this:
```
SQLPlayerDemo
    dataflow
    dataset
    integrationRuntime
    linkedService
    pipeline
    trigger
```

Some of these folders might not exist if your ADF has no objects of that kind.
### Examples

Publish the (entire) ADF code into the ADF service in Azure:

```powershell
Publish-AdfV2FromJson
    -RootFolder <String>
    -ResourceGroupName <String>
    -DataFactoryName <String>
    -Location <String>
    [-Stage <String>]
    [-Option <AdfPublishOption>]
    [-Method <String>]
    [-DryRun]
```
Assuming your ADF is named SQLPlayerDemo and the code is located in c:\GitHub\AdfName\, replace the values of SubscriptionName, ResourceGroupName and DataFactoryName, and run the following commands in PowerShell:

```powershell
$SubscriptionName = 'Subscription'
Set-AzContext -Subscription $SubscriptionName

$ResourceGroupName = 'rg-devops-factory'
$DataFactoryName = "SQLPlayerDemo"
$Location = "NorthEurope"
$RootFolder = "c:\GitHub\AdfName\"

Publish-AdfV2FromJson -RootFolder "$RootFolder" -ResourceGroupName "$ResourceGroupName" -DataFactoryName "$DataFactoryName" -Location "$Location"
```
### Other environments (stage)

Use the optional [-Stage] parameter to prepare the ADF JSON files with the appropriate property values and deploy them to another environment correctly. See the section How it works / Step: Replacing all properties environment-related for more details.
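For illustration, a per-stage config is a CSV file kept alongside the ADF code (by this module's convention, in a `deployment` folder and named after the stage, e.g. `config-uat.csv` for `-Stage "UAT"`); the object names, paths and values below are hypothetical:

```csv
type,name,path,value
linkedService,LS_AzureKeyVault,typeProperties.baseUrl,"https://kv-uat.vault.azure.net/"
pipeline,PL_CopyData,properties.parameters.Env.defaultValue,"UAT"
```

Each row tells the module which property (by object type, object name and JSON path) should receive which value before the files are published, so the same code can be deployed to DEV, UAT and PROD with environment-specific settings.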
Publi