If you are using the DAX Query View Testing Pattern, you can also automate deployment and testing using Azure DevOps. The following instructions show you how to set up an Azure DevOps pipeline to automate deployment of Power BI reports/semantic models and automate testing.
Figure 1 - High-level diagram of automated deployment of PBIP and automated testing with the DAX Query View Testing Pattern
In the pattern depicted in Figure 1, your team saves their Power BI work in the PBIP extension format and commits those changes to Azure DevOps.
Then an Azure Pipeline is triggered to validate the content of your Power BI semantic models and reports by performing the following:
1. The semantic model changes are identified using the "git diff" command. Semantic models that have changed are published to a premium-backed workspace using Rui Romano's Fabric-PBIP script. The question now is: which workspace do you deploy to? I typically promote to a Build workspace first, which provides an area to validate the content of the semantic model before promoting to a Development workspace that is shared by others on the team. This reduces the chance that a team member introduces an error into the Development workspace that could hinder the work being done by others in that workspace. (A sketch of this change detection appears after this list.)
2. With the semantic models published to a workspace, the report changes are identified using the "git diff" command. Report changes are evaluated based on their "definition.pbir" configuration. If the byConnection property is null (meaning the report is not a thin report), the script identifies the local semantic model (example in Figure 2). If byConnection is not null, we assume the report is a thin report and is already configured appropriately. Each report that has been updated is then published to the same workspace. (The sketch after this list also shows this definition.pbir check.)
3. For the semantic models published in step 1, the script then validates the functionality of each semantic model through a synchronous refresh using Invoke-SemanticModelRefresh. Using the native v1.0 API would be problematic because it is asynchronous: if you issue a refresh, you only know that the semantic model refresh has kicked off, not whether it was successful. To make it synchronous, I've written a module that issues an enhanced refresh request to get a request identifier (a GUID). This request identifier can then be passed as a parameter to the Get Refresh Execution Details endpoint to check that specific request's status and find out whether or not the refresh has completed successfully (see the polling sketch after this list). If the refresh is successful, we move to step 4. Note: the first time a new semantic model is placed in the workspace, the refresh will fail. You have to "prime" the pipeline and set the data source credentials manually. As of April 2024, this is not fully automatable, and the Fabric team at Microsoft has written about it.
4. For each semantic model, Invoke-DQVTesting is called to run the DAX queries that follow the DAX Query View Testing Pattern (see the testing sketch after this list). Results are then logged to the Azure DevOps pipeline (Figure 3). Any failed test will fail the pipeline.
Figure 3 - Example of test results logged by Invoke-DQVTesting
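To make steps 1 and 2 concrete, here is a minimal PowerShell sketch of the change detection and the definition.pbir inspection. It is not Rui Romano's Fabric-PBIP script; the comparison against the previous commit (HEAD~1), the default PBIP folder layout, and running from the repository root are all assumptions.

```powershell
# Identify files changed in the triggering commit (assumes the checkout's
# fetch depth includes the previous commit).
$changedFiles = git diff --name-only HEAD~1 HEAD

# Distinct semantic model folders that contain changes.
$changedModels = $changedFiles |
    Where-Object { $_ -match "\.SemanticModel/" } |
    ForEach-Object { ($_ -split "\.SemanticModel/")[0] + ".SemanticModel" } |
    Select-Object -Unique

# Distinct report folders that contain changes.
$changedReports = $changedFiles |
    Where-Object { $_ -match "\.Report/" } |
    ForEach-Object { ($_ -split "\.Report/")[0] + ".Report" } |
    Select-Object -Unique

foreach ($report in $changedReports) {
    # definition.pbir holds the report's dataset reference.
    $pbir = Get-Content (Join-Path $report "definition.pbir") -Raw | ConvertFrom-Json

    if ($null -eq $pbir.datasetReference.byConnection) {
        Write-Host "$report points at a local semantic model; publish the model first."
    }
    else {
        Write-Host "$report is a thin report (byConnection set); publish as-is."
    }
}
```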
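The synchronous refresh in step 3 boils down to issuing an enhanced refresh and then polling the Get Refresh Execution Details endpoint. The sketch below illustrates the idea rather than Invoke-SemanticModelRefresh's actual implementation; $headers (carrying a valid bearer token), $workspaceId, and $modelId are assumed to already exist.

```powershell
$base = "https://api.powerbi.com/v1.0/myorg/groups/$workspaceId/datasets/$modelId/refreshes"

# Sending a JSON body makes this an enhanced refresh, which returns a
# request identifier we can poll afterwards.
$response = Invoke-WebRequest -Uri $base -Method Post -Headers $headers `
    -ContentType "application/json" -Body (@{ type = "Full" } | ConvertTo-Json)

# The request identifier (a GUID) comes back in the Location header.
$requestId = ($response.Headers["Location"] -split "/")[-1]

# Poll the Get Refresh Execution Details endpoint until the refresh finishes.
do {
    Start-Sleep -Seconds 30
    $status = (Invoke-RestMethod -Uri "$base/$requestId" -Headers $headers).status
} while ($status -eq "Unknown")   # "Unknown" means the refresh is still running

if ($status -ne "Completed") { throw "Refresh $requestId ended with status '$status'." }
```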
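Step 4 is then a single call to Invoke-DQVTesting. The sketch below shows roughly what that call looks like inside the pipeline; the parameter names and the "ADO" log output option are assumptions based on my reading of the module's documentation, so verify them against the module's help. The environment variables are assumed to be mapped from the variable group described in the instructions below.

```powershell
# Hedged sketch: parameter names are assumptions - check Get-Help Invoke-DQVTesting.
$secret     = ConvertTo-SecureString $env:PASSWORD_OR_CLIENTSECRET -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($env:USERNAME_OR_CLIENTID, $secret)

Invoke-DQVTesting -WorkspaceName "Build" `
                  -Credential $credential `
                  -TenantId $env:TENANT_ID `
                  -LogOutput "ADO"   # "ADO" emits Azure DevOps logging commands so failed tests fail the run
```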
Before you set up the pipeline, make sure the following prerequisites are in place:

- You have an Azure DevOps project and have at least Project or Build Administrator rights for that project.
- You have connected a premium-backed capacity workspace to your repository in your Azure DevOps project. Instructions are provided at this link.
- Your Power BI tenant has XMLA Read/Write enabled.
- You have a service principal or an account (username and password) with a Premium Per User license. If you are using a service principal, you will need to make sure the Power BI tenant allows service principals to use the Fabric APIs. The service principal or account will need at least the Member role on the workspace. (A quick way to sanity-check these credentials appears after this list.)
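Before wiring up the pipeline, it can be worth confirming that last prerequisite locally. This optional sketch uses the MicrosoftPowerBIMgmt module to verify the service principal can sign in and see the workspace; the placeholder values are yours to fill in.

```powershell
# Optional sanity check of the credentials (run locally, not in the pipeline).
Install-Module MicrosoftPowerBIMgmt -Scope CurrentUser

$secret     = ConvertTo-SecureString "<client-secret>" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential("<client-id>", $secret)

# -Tenant takes the Azure AD tenant GUID (the same value as TENANT_ID below).
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $credential -Tenant "<tenant-id>"

# Should return the premium-backed workspace if the Member role is in place.
Get-PowerBIWorkspace -Name "<workspace name>"
```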
With the prerequisites in place, set up the pipeline as follows:

- In your project, navigate to the Pipelines->Library section.
- Select the "Add Variable Group" button.
- Create a variable group called "TestingCredentials" and create the following variables (a CLI alternative to these portal steps appears after this list):
- USERNAME_OR_CLIENTID - The service principal's application/client ID or the user principal name for the account.
- PASSWORD_OR_CLIENTSECRET - The client secret or password for the service principal or account, respectively.
- TENANT_ID - The Tenant GUID. You can locate it by following the instructions at this link.
- Save the variable group.
- Navigate to the pipeline interface.
- Select the "New Pipeline" button.
- Select the Azure Repos Git option.
- Select the repository to which you have connected the workspace via Git integration.
- Copy the contents of the template YAML file located at this link into the code editor.
- Update the default workspace name located on line 5 of the YAML file with the workspace you will typically use to conduct testing.
- Select the 'Save and Run' button.
- You will be prompted to commit to the main branch. Select the 'Save and Run' button.
- You will be redirected to the first pipeline run, and you will be asked to authorize the pipeline to access the variable group created previously. Select the 'View' button.
- A pop-up window will appear. Select the 'Permit' button.
- You will be asked to confirm. Select the 'Permit' button.
- This will kick off the automated deployment and testing as described above.
- Select the "Automated Deployment and Testing Job".
- You will see a log of the DAX queries ending in .Tests or .Test running against their respective semantic models in your workspace.
- Any failed tests will be logged to the job, and the pipeline will fail.
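As an aside, the variable group from the earlier steps can also be created from the command line instead of the portal. This is a sketch assuming the Azure DevOps CLI extension (az devops) is installed and you are signed in; the organization, project, and group ID values are placeholders.

```powershell
# Create the variable group with the non-secret values.
az pipelines variable-group create `
    --name "TestingCredentials" `
    --variables USERNAME_OR_CLIENTID="<client-id>" TENANT_ID="<tenant-guid>" `
    --organization "https://dev.azure.com/<org>" `
    --project "<project>"

# Secrets should not be passed in plain text above; add the secret separately,
# using the group id returned by the previous command.
az pipelines variable-group variable create `
    --group-id <group-id> `
    --name "PASSWORD_OR_CLIENTSECRET" `
    --value "<client-secret-or-password>" `
    --secret true `
    --organization "https://dev.azure.com/<org>" `
    --project "<project>"
```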
It's essential to monitor the Azure DevOps pipeline for any failures. I've also written about some best practices for setting that up in this article.
The pipeline leverages two PowerShell modules called Invoke-DQVTesting and Invoke-SemanticModelRefresh. For more information, please see Invoke-DQVTesting and Invoke-SemanticModelRefresh respectively.
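Both modules are published to the PowerShell Gallery, so a pipeline agent (or your local machine) can install them on demand:

```powershell
# Install both modules for the current user (no admin rights required).
Install-Module -Name Invoke-DQVTesting -Scope CurrentUser -Force
Install-Module -Name Invoke-SemanticModelRefresh -Scope CurrentUser -Force
```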
Git Logo provided by Git - Logo Downloads (git-scm.com)