Using Power Automate to validate Power BI data refreshes

A Power BI dataset can be a powerful tool for validating data refreshes when used correctly. It is free from some of the inherent shortcomings of a manual validation approach. At the same time, the out-of-the-box support for Power BI makes Power Automate a preferable choice for automating Power BI-related activities.
In this article, we will build an automated data validation solution that uses the Power BI dataset to validate data refresh and automate the validation process with Power Automate. We will discuss the requirements, steps, benefits, and limitations with an example.
In a previous article, we discussed the importance of a refresh assessment to validate your data before refreshing your Power BI reports. With the help of an example, we walked through a manual two-step refresh assessment approach and its shortcomings.
If you’re new to this topic, refer to that previous article to get started.
Before we move forward, here is a quick summary of our “Unified Growth Monitor” reporting solution:
- 33 KPIs to track progress across the organization.
- 18 diverse data sources: automated (on-premises SQL database, Synapse) and manual files (CSV and Excel). One fact table per source, so our dataset has 18 fact tables.
- New data becomes available at different intervals: monthly on various workdays (for example, on W1, W7, or W14) and quarterly, depending on the KPI.

Let’s look at how we automate the refresh assessment and validation to free up our team to focus on other essential tasks and retain users’ trust in our reports.
How to Automate Data Refresh Assessment
Our automation strategy can be summarized as follows: we obtain the “now” state of data and KPIs from our Power BI dataset. Then, we replicate the dataset and refresh the replica to obtain the “will-be” state of the data. By comparing the dataset’s “now” and “will-be” states, we can conclude whether the dataset can be refreshed safely or whether the refresh should be delayed until the issues are investigated and addressed.
The best part of our approach is that it doesn’t require any licenses beyond what your organization already has. Standard Power Automate connectors are sufficient to build a simple validation solution, and we don’t need Premium capacity for either Power BI or Power Automate.

Automating data refresh assessment can be divided into four main steps, each containing multiple sub-steps and activities.

1. Prepare the dataset for validation
Our dataset contains all the relationships and calculations we need to validate our data. However, it most likely only contains the calculations that fit our reporting needs, for example, a DAX measure to calculate a KPI value. The first step is to create additional supporting tables and DAX measures in the dataset to produce results we can use for comparison. Use the following points as a starting point:
- Identify your validation requirements: Start with what your team validates with each data refresh. Is it the presence of data in tables, the KPI values, the availability of data for different dimensions, or something else?
- Decide the granularity for the validation: Do you want to validate historical data or only the latest month? Do you want to validate data for only a particular dimension, e.g., Geography or all the dimensions?
- Focus on patterns: Do you have particular data sources that are frequently problematic, e.g., a manual file that needs extra attention? Do you have a fact table that often has incorrect data for a particular dimension, e.g., EBIT numbers are sometimes wrong while everything else is correct?
- Identify expected changes after a data refresh: Can you identify any expected changes in your data after each refresh that can act as a threshold? Maybe some of your fact tables will have a fixed increase in the number of rows, or your KPIs will only increase or decrease within a limit, etc. We don’t need this information at this stage; we will use it to highlight the differences while visualizing them. For example, we expect a 5% increase in the number of rows, a 5% change in the KPI values, and the latest data availability date to increase by one month.
A simple validation for our “Unified Growth Monitor” dataset considers the following for each KPI: the latest date of data availability, the number of rows, and the latest KPI value. For now, we only want to focus on the newest month without considering additional dimensions. We made the following changes to our model:
- Create supporting tables: Add any new tables needed to support your validation requirements. We added a KPI Reference table that contains the KPI IDs, sorting order, type of KPI value (as we have both absolute and percentage KPIs), and the latest data availability date (calculated from the fact tables). You can use DAX to add such a table or source it from an Excel or CSV file; a minimal sketch follows this list.

- Create DAX measures: Create additional measures to support your validation. We created one measure per data source to count the rows in its fact table, plus a parent measure with KPI Reference as context that calls the row-counting measures. We also created a measure, again with KPI Reference as context, that calls the existing measures to calculate the latest growth for each KPI (see the sketch after this list).

- Visualize your validation results and get the DAX query: Use the dimensions, newly added tables, and DAX measures to visualize your validation results. Then, use the Performance Analyzer to capture the DAX query behind the visual. Keep this DAX query handy; we will use it to get the “now” and “will-be” states of the dataset. An illustrative query is shown below.
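To make the first two bullets concrete, here is a minimal DAX sketch. All table, column, and measure names (KPI Reference, Fact Sales, Fact Finance, and so on) are hypothetical placeholders for illustration; your own model and KPI mapping will differ.

// Hypothetical KPI Reference table built with DAX (it could also be sourced from Excel or CSV)
KPI Reference =
DATATABLE (
    "KPI ID", STRING,
    "Sort Order", INTEGER,
    "Value Type", STRING,
    { { "KPI-01", 1, "Absolute" }, { "KPI-02", 2, "Percentage" } }
)

// Calculated column: latest data availability date per KPI, read from the fact tables
Latest Data Date =
SWITCH (
    'KPI Reference'[KPI ID],
    "KPI-01", MAX ( 'Fact Sales'[Date] ),
    "KPI-02", MAX ( 'Fact Finance'[Date] )
)

// One row-counting measure per data source
Rows Fact Sales = COUNTROWS ( 'Fact Sales' )
Rows Fact Finance = COUNTROWS ( 'Fact Finance' )

// Parent measure evaluated in the context of KPI Reference
Fact Row Count =
SWITCH (
    SELECTEDVALUE ( 'KPI Reference'[KPI ID] ),
    "KPI-01", [Rows Fact Sales],
    "KPI-02", [Rows Fact Finance]
)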

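The query captured by Performance Analyzer will look more verbose than the hand-written sketch below, but its shape is similar: an EVALUATE over SUMMARIZECOLUMNS that returns one row per KPI. The names, including the [Latest KPI Value] measure, are the same hypothetical placeholders as above.

EVALUATE
SUMMARIZECOLUMNS (
    'KPI Reference'[KPI ID],
    "Row Count", [Fact Row Count],
    "Latest Data Date", MAX ( 'KPI Reference'[Latest Data Date] ),
    "Latest KPI Value", [Latest KPI Value]
)
ORDER BY 'KPI Reference'[KPI ID]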

2. Create a replica of the dataset
We need the “will-be” state of the dataset after the refresh. To achieve this, we will create a replica of our dataset and publish it to a workspace; later, we will refresh this replica with Power Automate. We appended “Replica” to the dataset name to make it easy to identify.

We don’t need a Premium or Premium-Per-User (PPU) workspace for the replica dataset. However, the choice of workspace to publish the replica should align with your organization’s directives on using environments. For example, if you cannot use production data connections in dev or test Power BI workspaces, you should publish the replica in the production workspace.
At this stage, we need to decide on a SharePoint location that will store our assessment results (“now” and “will-be” states) as JSON files. We will source these files from SharePoint to Power BI and visualize the validation results.

3. Automate validation actions with Power Automate
It’s essential to decide how we want to trigger the flow. We have multiple options, e.g., manually from the Power Automate portal, automatically based on an email, etc. We prefer to trigger the flow from the Power BI report we use to visualize our validation results. That way, the entire solution stays in one file, and we don’t have to go to the Power Automate portal to trigger it separately.
Start with a new Power BI report in Power BI Desktop and save the file as “Refresh Assessment Results.” Add a Power Automate visual on the first page and select More options (…) > Edit to configure the button. In edit mode, create a new flow for the button by selecting New > Instant cloud flow.


If you’re new to Power Automate, refer to the Microsoft Learn article for a quick starter guide.
Did you know that Copilot in Power Automate can simplify your workload and greatly enhance your experience while implementing this flow? Check out our Copilot in Power Automate guidebook for Power BI developers.
Our goal is to implement the following architecture in Power Automate.

To make things easier, we can divide the implementation steps into three main categories: target, source, and assessment.
Target steps
Steps to refresh the replica and get the “will-be” state of the dataset.

We will use the following Connector – Action pairs as steps:
- Power BI – Refresh a dataset: To refresh the replica dataset. Select the workspace and replica dataset in the parameters.
- Schedule – Delay: We can’t determine whether the data refresh has finished with standard connectors, so add an estimated wait time before the next step to allow the refresh to finish. Base the estimate on how long your dataset usually takes to refresh, plus a few extra minutes.
- Power BI – Run a query against a dataset: To execute the DAX query we obtained earlier while preparing the dataset for validation (step 1) against the refreshed replica. Select the workspace and the replica dataset, paste the DAX query, and enable the option to include nulls.
- SharePoint – Create file: To store the DAX query results as a JSON file for later use. Select the SharePoint storage location, provide a file name with a timestamp (to keep it unique) and a .json extension, and use the body of the previous action as the file content. You can use the following expression to generate unique file names that start with Target_ and carry a West Europe timestamp.
concat('Target_',convertFromUtc(utcNow(),'W. Europe Standard Time','yyyyMMdd_HHmmss'),'.json')
Source steps
Steps to get the “now” state of the dataset.

We will use the following Connector – Action pairs as steps:
- Power BI – Run a query against a dataset: To execute the same DAX query against the original, not-yet-refreshed dataset.
- SharePoint – Create file: To store the DAX query results as a JSON file. You can use the following expression to generate unique file names that start with Source_ and contain the West Europe timestamp.
concat('Source_',convertFromUtc(utcNow(),'W. Europe Standard Time','yyyyMMdd_HHmmss'),'.json')
Give your flow a name, and select Save and apply. Next, we need to run the flow successfully at least once to ensure that we have Source_ and Target_ JSON files available in the SharePoint storage. Follow the prompts, fix any errors, and re-run until your flow runs successfully. Verify that your SharePoint has Source_ and Target_ JSON files.

Next, we will load the latest Source_ and Target_ files as two separate tables into the Power BI file. Use Power Query to load the latest files, then transform the JSON data into clean, usable tabular data. Once the data is loaded, create an appropriate relationship between the two tables and publish the Power BI file to a workspace.

How do you load only the latest file? In Power Query, filter the Name column to names beginning with Source_ (or Target_), sort the Date modified column in descending order, and use Keep Top Rows = 1.
Assessment steps
Steps to prepare the assessment results and notify the team.

We will use the following Connector – Action pairs as steps:
- Power BI – Refresh a dataset: To refresh the “Refresh Assessment Results” dataset. Refreshing the dataset will load the latest Source_ and Target_ files to the dataset. Select the workspace and “Refresh Assessment Results” dataset in the parameters.
- Schedule – Delay: Add an estimated wait time to allow the dataset refresh to finish.
- Mail – Send an email notification (V3): To notify the PBI Dev or the team when the results are ready. Specify the recipients and add a subject and body communicating that the assessment results are ready.

That’s all the steps for the flow. Save and apply your changes and try to run the flow. Follow the prompts, fix any errors, and re-run until your flow runs successfully and you receive an email congratulating you on your success.

4. Visualize the validation results in Power BI
The last step gives you the flexibility and complete feature set of Power BI to visualize your validation results. At a minimum, your visualization should help you spot the differences at a glance. Additionally, you can build rules and a summary that describe the differences and potential issues in words. When done, remember to publish your changes. It’s a good habit to test your solution with a few different test cases.

For our “Unified Growth Monitor,” we built a table showing the “now” (Source_) and “will-be” (Target_) states of the dataset side by side and used conditional formatting to highlight the differences. We also built a handful of DAX measures to compare and analyze the differences and briefly summarize potential data issues; a sketch follows below.
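As an illustration, here is a minimal sketch of such comparison measures. It assumes the Source_ and Target_ JSON files were loaded as tables named Source and Target, each with a Row Count column per KPI; these names and the 5% threshold are placeholders for your own model and rules.

// Relative change in row count between the “now” and “will-be” states
Row Count Change % =
DIVIDE (
    SUM ( 'Target'[Row Count] ) - SUM ( 'Source'[Row Count] ),
    SUM ( 'Source'[Row Count] )
)

// Simple rule used for conditional formatting and the verbal summary
Row Count Status =
IF ( ABS ( [Row Count Change %] ) > 0.05, "Investigate", "OK" )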
Before every refresh, our PBI Dev triggers the assessment and waits for the completion email, then reviews the differences and decides whether further investigation is needed or whether it’s safe to refresh the dataset.
Benefits and Limitations
This article would not be complete without mentioning the pros and cons of our automated solution. We will skip restating the benefits of an automated approach over a manual one and focus on the benefits and limitations unique to our scenario.
- Latest data model and calculation changes are always included in the validation: Using the Power BI dataset for validation ensures that we always have the latest data model and KPI calculations considered during validation. Unlike manual approaches, there is no risk of forgetting to update the validation queries.
- Independent of the underlying data sources and their restrictions: The underlying data sources and their associated limits do not affect the validation process or effort. The process works as long as you have access to the Power BI dataset. Similarly, you don’t have to write SQL queries or build pivot tables to validate your data.
- No Premium licenses required: Our automation process doesn’t require any Premium licenses. It works whether you have Power BI Premium or Pro. We only used Power Automate standard connectors to build this automated solution.
- Maintenance effort to keep the replica in sync: The dataset can’t be replicated automatically, so the developer must create the replica manually. Afterward, the PBI Dev has to keep the replica in sync with the original dataset.
That’s all for now. Trigger your refresh assessment from Power BI Service and pat yourself on the back for making this work.
Conclusion
Using the Power BI dataset during validation can overcome several shortcomings of a manual validation approach. With DAX, we can build rules and verbal summaries to help us identify the problem data areas at a glance. Furthermore, packaging the Power Automate flow and assessment results in a single file allows us to complete the assessment efficiently without having to navigate to multiple portals.
Next Steps
Additional features can further improve the automated refresh assessment solution we built in this post. We’re working on a follow-up article to cover the details.

