Refresh Power BI Dataset using Azure Logic Apps – Part 2

By: Koen Verbeeck   |   Updated: 2019-05-02   |   Comments (3)   |   Related: Azure Data Factory


Problem

When you have created a model in Power BI Desktop using Import Data, you have to refresh the model to load it with new data. If you also have a daily ETL batch process, it's a good idea to refresh your model right after your data warehouse has been loaded with new data. In this tip, we'll take a look at how we can achieve this with Azure Data Factory and Logic Apps.

Solution

In part 1, we created an Azure Logic Apps custom connector. Having this custom connector is a prerequisite for Azure Logic Apps to connect to the Power BI API. If you haven't read part 1 yet, I advise you to do so. In the remainder of this tip, we'll discuss how we can create the Azure Logic App – which is going to be started from an Azure Data Factory (ADF) pipeline – that will refresh the Power BI dataset.

Refreshing a Power BI Dataset from Azure Data Factory

Creating a Sample Power BI Desktop Model

First, we need a model to refresh, of course. It's important that the model uses the Import Data method, not DirectQuery or a live connection (those don't need a refresh). Start up Power BI Desktop and select Get Data. In the menu, choose Blank Query.

blank query

In the Query Editor, choose Enter Data.

enter data

We're going to enter one cell of dummy data. In the model, we'll add a calculated column with a timestamp. That way, we'll know at what time the model was last refreshed.

dummy data

Hit Close & Apply to load this data to the model.

close editor

In the model, hide the dummy column:

hide dummy column

Next, add a calculated column to the table.

add calculated column

The formula of this column uses the NOW() DAX function to get the current timestamp, e.g. Refresh Date = NOW().

refresh date

In the report canvas, use a card visual to display this date:

visualize date

The last step is to publish the model to a workspace in the Power BI service.

publish model

Creating the Azure Logic App

In the Azure Portal, create a new Logic App by selecting Create a resource and then searching for Azure Logic Apps.

create new logic app

Make sure to create the Logic App in the same resource group and location as your custom connector. Once the resource is deployed, go to the Logic Apps designer. As a trigger, choose the HTTP request trigger ("When a HTTP request is received"). This request will be sent by a Web Activity in ADF.

http request as trigger

You don't need to specify a request body. Click on New Step to add an action.

add new step

In the newly added action, search for Power BI. You'll get multiple results. The connector with the Power BI name is the official one, but as mentioned in part 1 this connector has only one action (adding rows to a dataset). Choose the connector with the name of your custom connector.

choose custom connector

There are a lot of actions, so search for refresh and pick the action that refreshes a dataset in a specified workspace.

choose correct refresh action
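Under the hood, this custom connector action wraps the refresh endpoint of the Power BI REST API. Purely as an illustration, here is a minimal Python sketch of that same call; the IDs and the bearer token are placeholders, and acquiring the token from Azure AD (with the Dataset.ReadWrite.All permission) is out of scope for this tip.

import requests

# Placeholders: fill in your own workspace (group) ID, dataset ID
# and an Azure AD access token for the Power BI API.
group_id = "<workspace-guid>"
dataset_id = "<dataset-guid>"
access_token = "<bearer-token>"

# The custom connector action corresponds to this REST endpoint:
# POST /v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes
url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
       f"/datasets/{dataset_id}/refreshes")

response = requests.post(url, headers={"Authorization": f"Bearer {access_token}"})

# The API answers 202 Accepted: the refresh is queued, not yet finished.
print(response.status_code)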

The next step is to sign into your Power BI account. Make sure to use a browser that doesn't block pop-ups, or temporarily disable the pop-up blocker.

sign in

Even though we specified the correct permissions for the app when we registered it on the Power BI website, you as the author might still need certain admin permissions to log in and use the app to perform tasks on the Power BI service. This depends on your organization's policy settings. You might get an error, even if you're an admin of the workspace where the dataset is located. At the very minimum, the user who logs in needs to be able to access the workspace and refresh the dataset.

After you've logged in, you need to specify the group ID (a group is the same as a workspace) and the dataset key. You can find both IDs by browsing to the dataset in the Power BI service. Click on the report icon next to the dataset to create a new report.

create new report

When you're in the blank report canvas, the URL contains both the ID of the workspace and the ID of the dataset; it takes the form https://app.powerbi.com/groups/<workspace-id>/datasets/<dataset-id>:

dataset power bi
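As a side note: if you ever want to grab these IDs programmatically, both GUIDs can be pulled straight out of that URL. A small illustrative Python sketch (the URL below is made up):

import re

# A made-up example of the URL you see on the blank report canvas.
url = ("https://app.powerbi.com/groups/11111111-2222-3333-4444-555555555555"
       "/datasets/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee")

# Both IDs are GUIDs; pick them out of the /groups/ and /datasets/ segments.
match = re.search(r"/groups/([0-9a-f-]{36})/datasets/([0-9a-f-]{36})", url)
if match:
    group_id, dataset_id = match.groups()
    print("Workspace (group) ID:", group_id)
    print("Dataset ID:", dataset_id)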

Copy and paste both IDs and fill them out in the Logic App action:

configure action

The Logic App is now finished and can be saved.

Creating the Azure Data Factory Pipeline

To finalize our setup, we need to create a pipeline in Azure Data Factory.

create new pipeline

Add a Web Activity onto the pipeline canvas. In the settings tab, we need the URL from the HTTP request trigger of the Logic App.

copy URL

Paste the URL and choose POST as the method.

settings

A dummy body can be specified, since the Web Activity expects a body for POST requests; an empty JSON document will do. Everything else can be left as-is. Although we created a new pipeline for this tip, in practice you'll probably add this Web Activity at the end of your existing ETL/ELT pipeline, right after your data warehouse has been refreshed.
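If you want to sanity-check the Logic App before (or without) wiring up the pipeline, you can send the same kind of request yourself. Below is a minimal Python sketch that mimics the Web Activity; the trigger URL is a placeholder for the one you copied from the Logic App:

import requests

# Placeholder: paste the HTTP POST URL copied from the Logic App trigger.
# It contains a SAS signature (the sig=... parameter) that authorizes the call.
logic_app_url = "https://<region>.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke?<query-parameters>"

# Mimic the ADF Web Activity: a POST with a dummy JSON body.
response = requests.post(logic_app_url, json={})

# 202 Accepted means the Logic App run was queued successfully.
print(response.status_code)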

Now we can test the whole setup. Debug the ADF pipeline to trigger the Logic App, which in turn will refresh the Power BI dataset.

debug pipeline

The HTTP call is fire-and-forget: the Web Activity reports success as soon as the Logic App accepts the request, so the pipeline finishes very fast. Let's take a look at the Logic App to see if it ran correctly:

logic app history

In the Power BI service, go to the dataset and click on the ellipsis to go to its settings.

dataset settings

There, we can check the refresh history of the dataset:

refresh history
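Keep in mind that a successful Logic App run only means the refresh was triggered, not that it completed. If you'd like to verify the outcome programmatically instead of through the portal, the Power BI REST API exposes this same refresh history. A sketch, with the same placeholder IDs and token as before:

import requests

group_id = "<workspace-guid>"
dataset_id = "<dataset-guid>"
access_token = "<bearer-token>"

# GET .../refreshes returns the refresh history; $top=1 gives the latest entry.
url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
       f"/datasets/{dataset_id}/refreshes?$top=1")

latest = requests.get(
    url, headers={"Authorization": f"Bearer {access_token}"}
).json()["value"][0]

# The status is "Unknown" while the refresh is still running,
# then "Completed" or "Failed".
print(latest["status"])

Polling this endpoint until the status is no longer Unknown would give you a synchronous wait on the refresh.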

Indeed, when we take a look at the report, we can see the date has changed:

refreshed date

Note that the timestamp differs from local time: the servers hosting the dataset in the Power BI service are in another time zone, so NOW() is evaluated on the service side (which typically runs on UTC).

Conclusion

In this two-part tip, we defined a workflow to refresh a Power BI dataset from Azure Data Factory. First, we created a custom connector that enabled an Azure Logic App to talk to the Power BI API. Then, we created an ADF pipeline to trigger said Logic App.

Next Steps

If you haven't done so already, read part 1 of this tip, where the Azure Logic Apps custom connector for the Power BI API is created.

About the author
Koen Verbeeck is a seasoned business intelligence consultant at AE. He has over a decade of experience with the Microsoft Data Platform in numerous industries. He holds several certifications and is a prolific writer contributing content about SSIS, ADF, SSAS, SSRS, MDS, Power BI, Snowflake and Azure services. He has spoken at PASS, SQLBits, dataMinds Connect and delivers webinars on MSSQLTips.com. Koen has been awarded the Microsoft Data Platform MVP award for many years.

This author pledges the content of this article is based on professional experience and not AI generated.




Comments For This Article




Thursday, May 30, 2019 - 7:45:50 AM - Andy

Hi Koen,

Wow, this has helped me to solve a problem I was struggling with since December! Good work, I really appreciate this instruction!

Regards,

Andy


Wednesday, May 8, 2019 - 9:00:32 AM - Koen Verbeeck

Hi Bill,

excellent question :) I asked the exact same question to someone on the BI team. For the time being, the only "API" that probably supports this is the brand new XMLA endpoint, which is Power BI Premium only. You can take a look at our discussion here: https://twitter.com/Ko_Ver/status/1108648552237019136

Regards,
Koen


Tuesday, May 7, 2019 - 11:42:12 PM - Bill Coleman

Hi,

Firstly, this works great - thanks so much for putting it all together, it's made my life a little easier :)

I've got a follow-up question around error handling - the refresh returns a success as soon as it has been successfully triggered. However, if an error occurs midway through the refresh process, this is not captured. Is there a way to configure the API or change how it is executed to ensure that the pipeline doesn't complete until the refresh is 100% finished?

Thanks

Bill