Manage Azure Data Factory pipelines in Microsoft Fabric

By: Koen Verbeeck   |   Updated: 2024-12-05   |   Related: Microsoft Fabric


Problem

We recently started using Microsoft Fabric for our cloud data platform. However, we already have quite an estate of Azure data services running in our company, including a huge number of Azure Data Factory (ADF) pipelines. It seems cumbersome to migrate all those pipelines to Microsoft Fabric, especially because some features are not supported yet and ADF is the mature choice at the moment. We like the concept of Microsoft Fabric's centralization, where everything is managed in one platform. Is there an option to manage ADF in Fabric?

Solution

Microsoft Fabric is a SaaS cloud data platform that aims to centralize all data analytics experiences under one roof: data engineering, data science, machine learning, data visualization, real-time analytics—it's all possible in Fabric. However, it's still a young product. While it is continuously being updated and expanded each month, some features that are present in other Azure data services are still missing in Fabric.

For example, ADF has the concepts of linked services and datasets, while pipelines in Fabric do not. Instead, connections are created on a per-user basis:

connections in Fabric

If someone creates a connection in a pipeline and they want other people to use that connection as well, they need to give explicit permission to do so:

give permission to use a connection

If permissions are not shared and the person who created the connection leaves the company, pipelines can stop working. These issues are avoided in ADF thanks to the concepts of linked services and datasets. Currently, ADF also offers more mature and easier Azure Key Vault integration.
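To make the contrast concrete, the sketch below shows a hypothetical ADF linked service definition, written as a Python dict that mirrors the JSON you would see in the ADF "Code" view. The connection string is resolved from Azure Key Vault at runtime instead of being stored per user. All names (LS_AzureSqlDb, LS_KeyVault, sql-connection-string) are made up for the example.

# Illustrative ADF linked service definition, expressed as a Python dict.
# The connection string is pulled from Azure Key Vault at runtime, so the secret
# is never tied to an individual user. All names below are placeholders.
azure_sql_linked_service = {
    "name": "LS_AzureSqlDb",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_KeyVault",  # linked service pointing to the vault
                    "type": "LinkedServiceReference",
                },
                "secretName": "sql-connection-string",  # secret holding the connection string
            }
        },
    },
}

Because a linked service lives at the factory level, every pipeline (and every developer) reuses the same definition, which is exactly what per-user connections in Fabric don't give you today.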

These reasons may explain why people are reluctant to move their existing ADF pipelines to Fabric. The downside is that you end up managing your data platform in multiple places. Recently, a new feature was introduced in Fabric that allows you to "mount" an existing ADF environment into the Fabric Data Factory experience. This lets you manage your current ETL infrastructure from within Fabric itself, overcoming some of the growing pains of moving to Fabric.

Note: At the time of writing, this feature is in public preview. The behavior or user interface may change by the time it's released as generally available.

Mount an Azure Data Factory in Fabric

When logged into Fabric, go to your Fabric-enabled workspace (either a trial workspace or a workspace backed by a Fabric capacity). In the workspace, open the dropdown list next to the New button, and select More options at the bottom.

new dropdown list, select more options

In the Data Factory section, choose Azure Data Factory (preview).

choose azure data factory

You will be presented with a list of ADF instances that can be found in your tenant.

list of possible ADF instances to mount

At the moment – and keep in mind this feature is in preview – it is not possible to mount an ADF instance that is integrated into a git repository.

git integrated ADF is not supported yet

In practice, this might mean that your ADF development environment cannot be mounted yet (as the dev environment is typically git integrated). However, your test, UAT, or production environments can, as those are typically not git integrated but are instead updated through CI/CD pipelines after a pull request in the dev environment.
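If you want to check up front which factories are mountable, the git configuration is exposed through the ADF management API. Below is a minimal sketch using the azure-mgmt-datafactory Python SDK; the subscription id is a placeholder, and the check simply looks at whether a repo configuration is set on the factory.

# Minimal sketch: list the data factories in a subscription and flag which ones have
# git integration configured (those cannot be mounted in Fabric while in preview).
# Requires azure-identity and azure-mgmt-datafactory; the subscription id is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

for factory in client.factories.list():
    # repo_configuration is None when the factory runs in "live mode" (no git integration)
    if factory.repo_configuration is None:
        print(f"{factory.name}: live mode, can be mounted")
    else:
        print(f"{factory.name}: git-integrated ({factory.repo_configuration.type}), not mountable yet")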

Select the instance you want to mount and click OK.

select instance to mount

This might take a minute…

mounting ADF in Fabric...

Once it's done, you'll get a notification.

successfully mounted

Click on View Azure Data Factory to go to your mounted instance. You can also find it in the explorer of your workspace:

find mounted adf in workspace

When you open the instance, it will look and feel much like your original ADF instance:

mounted ADF instance, with a pipeline open

The difference here is that the author and manage screens are combined in the same pane. You can edit items, such as triggers:

edit trigger in mounted ADF
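Triggers can also be inspected outside the UI. The short sketch below (azure-mgmt-datafactory SDK, placeholder resource names) lists each trigger in the factory and whether it is started or stopped.

# Sketch: list the triggers of a factory and their runtime state (Started/Stopped)
# with the Azure Data Factory Python SDK. Resource group and factory name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

for trigger in client.triggers.list_by_factory("<resource-group>", "<factory-name>"):
    print("trigger:", trigger.name, "-", trigger.properties.runtime_state)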

It's also possible to add an activity to a pipeline (or update/remove an existing activity).

add activity in mounted ADF

However, keep in mind that you can only add activities that are supported by Azure Data Factory. Activities only available in Fabric Pipelines (such as the Office 365 Outlook activity) cannot be added.
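Because the mounted instance is still a regular ADF factory behind the scenes, you can also keep managing it programmatically. As an illustration only, the sketch below uses the azure-mgmt-datafactory Python SDK to deploy a pipeline containing a plain Wait activity, which ADF supports; all resource names are placeholders.

# Sketch: deploy a pipeline with an ADF-supported activity (a simple Wait) through the
# Azure Data Factory Python SDK. Resource group, factory, and pipeline names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import PipelineResource, WaitActivity

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

pipeline = PipelineResource(
    activities=[WaitActivity(name="WaitBeforeLoad", wait_time_in_seconds=30)]
)
client.pipelines.create_or_update(
    "<resource-group>", "<factory-name>", "PL_MountedDemo", pipeline
)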

It's also possible to create brand new items:

create new items in mounted ADF

In the Manage section, you can view the Linked Services:

linked services in mounted ADF

Or the Integration Runtimes:

integration runtimes in mounted ADF
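The same inventory that the Manage section shows can also be retrieved through the ADF SDK, which is handy if you want to script an overview of what a mounted factory contains. A small sketch with placeholder resource names:

# Sketch: list the linked services and integration runtimes of a factory with the
# azure-mgmt-datafactory Python SDK. Resource group and factory name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

for ls in client.linked_services.list_by_factory("<resource-group>", "<factory-name>"):
    print("linked service:", ls.name, "-", ls.properties.type)

for ir in client.integration_runtimes.list_by_factory("<resource-group>", "<factory-name>"):
    print("integration runtime:", ir.name, "-", ir.properties.type)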

You can edit existing items, or create new ones by hovering over a menu item. This reveals an ellipsis; clicking it lets you add a new item:

add new linked service in mounted ADF

If there are no items on the list, you'll get a message and a button to create a new item:

button to create new item in mounted ADF

There are a couple of things missing, though. This is the Manage pane of the "original" ADF instance:

manage menu in original ADF

It would be helpful if at least Factory Settings and Global Parameters were also available in the mounted version (maybe this will be added later).

You can start a pipeline in debug mode from the Author section.

debug pipeline in mounted ADF

However, the monitoring section is also missing. If you want to view the past execution statuses of your pipelines, you'll need to go to the original ADF instance.

go to monitoring section in original ADF

There you can see the results of all executions, even the ones started from the mounted instance in Fabric:

monitoring section in original ADF, with debug result from pipeline started in mounted ADF
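If clicking through to the original Monitor hub becomes tedious, the run history can also be queried programmatically from the factory itself. The sketch below pulls the last 24 hours of pipeline runs with the azure-mgmt-datafactory SDK; resource names are placeholders.

# Sketch: query the last 24 hours of pipeline runs (what the Monitor hub shows) with the
# Azure Data Factory Python SDK. Resource group and factory name are placeholders.
from datetime import datetime, timedelta, timezone
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

now = datetime.now(timezone.utc)
filter_params = RunFilterParameters(
    last_updated_after=now - timedelta(hours=24),
    last_updated_before=now,
)
runs = client.pipeline_runs.query_by_factory("<resource-group>", "<factory-name>", filter_params)

for run in runs.value:
    print(run.pipeline_name, run.status, run.run_start)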

Conclusion

With the ability to mount an existing ADF instance into a Microsoft Fabric workspace, we're one step closer to a single integrated and centralized data platform in the Microsoft cloud. Most essential features are there, such as the ability to add or modify pipelines, datasets, or data flows, and to view and manage linked services, triggers, and integration runtimes.

However, some key features are missing, notably the monitoring pane and the ability to manage global parameters. Currently, ADF instances linked to a git repository are not supported either. This means that for some actions, you'll still need to go to the actual ADF instance. We'll have to wait and see whether these features make it into the generally available product.


About the author
Koen Verbeeck is a seasoned business intelligence consultant at AE. He has over a decade of experience with the Microsoft Data Platform in numerous industries. He holds several certifications and is a prolific writer contributing content about SSIS, ADF, SSAS, SSRS, MDS, Power BI, Snowflake and Azure services. He has spoken at PASS, SQLBits, dataMinds Connect and delivers webinars on MSSQLTips.com. Koen has been awarded the Microsoft MVP data platform award for many years.

This author pledges the content of this article is based on professional experience and not AI generated.
