PBIP Deployment & DAX Query View Testing (DQV) Pattern

If you are using the DAX Query View Testing Pattern, you can also automate deployment and testing with Azure DevOps. The following instructions show you how to set up an Azure DevOps pipeline that automates the deployment of Power BI reports/semantic models and the testing of those semantic models.


High-Level Process

Figure 1 - High-level diagram of automated deployment of PBIP and automated testing with the DAX Query View Testing Pattern

In the pattern depicted in Figure 1, your team saves their Power BI work in the PBIP (Power BI Project) format and commits those changes to Azure DevOps.

Then an Azure Pipeline is triggered to validate the content of your Power BI semantic models and reports by performing the following:

  1. The semantic model changes are identified using the "git diff" command. Changed semantic models are published to a premium-backed workspace using Rui Romano's Fabric-PBIP script. The question now is, which workspace do you deploy to? I typically promote to a Build workspace first, which provides an area to validate the content of the semantic model before promoting to a Development workspace shared by others on the team. This reduces the chances that a team member introduces an error into the Development workspace that could hinder the work being done by others in that workspace.

  2. With the semantic models published to a workspace, the report changes are identified using the "git diff" command. Each changed report is evaluated for its "definition.pbir" configuration. If the byConnection property is null (meaning the report is not a thin report), the script identifies the local semantic model (example in Figure 2). If byConnection is not null, we assume the report is a thin report and already configured appropriately. Each report that has been updated is then published to the same workspace. A sketch of this change detection appears after this list.

    Figure 2 - Example of a definition.pbir file

  3. For the semantic models published in step 1, the script then validates each model's functionality through a synchronous refresh using Invoke-SemanticModelRefresh. Using the native v1.0 API would be problematic because it is asynchronous: if you issue a refresh, you only know that the refresh has kicked off, not whether it succeeded. To make the process synchronous, I've written a module that issues an enhanced refresh request and captures the request identifier (a GUID). That identifier can then be passed as a parameter to the Get Refresh Execution Details endpoint to check on that specific request's status and find out whether the refresh completed successfully. A simplified sketch of this polling approach appears after this list.

    If the refresh is successful, we move to step 4. Note: the first time a new semantic model is placed in the workspace, the refresh will fail. You have to "prime" the pipeline by setting the data source credentials manually. As of April 2024, this is not fully automatable, and the Fabric team at Microsoft has written about this limitation.

  4. For each semantic model, Invoke-DQVTesting is called to run the DAX Queries that follow the DAX Query View Testing Pattern. Results are then logged to the Azure DevOps pipeline (Figure 3). Any failed test will fail the pipeline.

Figure 3 - Example of test results logged by Invoke-DQVTesting
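
As a rough illustration of steps 1 and 2, the PowerShell sketch below shows one way the change detection could work: use git diff to find changed semantic model and report folders, then read each report's definition.pbir and check the byConnection property. The folder suffixes, diff range, and variable names are assumptions for illustration only; the actual logic lives in the pipeline's template YAML and Rui Romano's Fabric-PBIP script.

```powershell
# Minimal sketch. Assumptions: PBIP folders end in .SemanticModel/.Report and we
# diff against the previous commit; the real pipeline may use a different range.
$changedFiles = git diff --name-only HEAD~1 HEAD

# Distinct semantic model folders that contain changes; these would be handed
# to the publish step (Rui Romano's Fabric-PBIP script).
$changedModels = $changedFiles |
    Where-Object { $_ -match '\.SemanticModel/' } |
    ForEach-Object { ($_ -split '\.SemanticModel/')[0] + '.SemanticModel' } |
    Select-Object -Unique

# Distinct report folders that contain changes
$changedReports = $changedFiles |
    Where-Object { $_ -match '\.Report/' } |
    ForEach-Object { ($_ -split '\.Report/')[0] + '.Report' } |
    Select-Object -Unique

foreach ($report in $changedReports) {
    # Read the report's definition.pbir to determine how it is bound
    $pbir = Get-Content -Path (Join-Path $report 'definition.pbir') -Raw | ConvertFrom-Json

    if ($null -eq $pbir.datasetReference.byConnection) {
        Write-Host "$report references a local semantic model; publish the model and the report."
    }
    else {
        Write-Host "$report is a thin report (byConnection set); publish the report only."
    }
}
```

In the same spirit, the sketch below illustrates the synchronous-refresh idea behind Invoke-SemanticModelRefresh described in step 3: issue an enhanced refresh, capture the request identifier from the response, and poll the Get Refresh Execution Details endpoint until a terminal status is returned. The token handling, request body, and polling interval are simplified assumptions, not the module's actual implementation.

```powershell
# Sketch only. Assumes $headers already contains a valid bearer token and that
# $workspaceId and $datasetId are known.
$base = "https://api.powerbi.com/v1.0/myorg/groups/$workspaceId/datasets/$datasetId"

# Kick off an enhanced refresh; the request identifier comes back in the Location header.
$response = Invoke-WebRequest -Method Post -Uri "$base/refreshes" -Headers $headers `
    -Body (@{ type = 'full'; commitMode = 'transactional' } | ConvertTo-Json) `
    -ContentType 'application/json'
$requestId = ($response.Headers['Location'] -split '/')[-1]

# Poll the refresh execution details until the status leaves "Unknown" (in progress).
do {
    Start-Sleep -Seconds 30
    $status = (Invoke-RestMethod -Method Get -Uri "$base/refreshes/$requestId" -Headers $headers).status
} while ($status -eq 'Unknown')

if ($status -ne 'Completed') {
    throw "Semantic model refresh finished with status '$status'."
}
```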

Prerequisites

  1. You have an Azure DevOps project and have at least Project or Build Administrator rights for that project.

  2. You have connected a premium-backed capacity workspace to your repository in your Azure DevOps project. Instructions are provided at this link.

  3. Your Power BI tenant has XMLA Read/Write Enabled.

  4. You have a service principal or account (username and password) with a Premium Per User license. If you are using a service principal, you will need to make sure the Power BI tenant allows service principals to use the Fabric APIs. The service principal or account will need at least the Member role on the workspace.

Instructions

Create the Variable Group

  1. In your project, navigate to the Pipelines->Library section.

Variable Groups

  1. Select the "Add Variable Group" button.

Add Variable Group

  3. Create a variable group called "TestingCredentials" and create the following variables:
  • USERNAME_OR_CLIENTID - The service principal's application/client id or the user principal name for the account.
  • PASSWORD_OR_CLIENTSECRET - The client secret or password for the service principal or account, respectively.
  • TENANT_ID - The Tenant GUID. You can locate it by following the instructions at this link.

Create Variable Group

  4. Save the variable group. If you prefer to script these steps, a sketch is provided after this list.

Save Variable Group
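
The variable group can also be created from the command line instead of clicking through the UI. The sketch below uses the Azure DevOps CLI extension; the organization, project, and placeholder values are assumptions, and the credential is added separately so it can be stored as a secret and masked in pipeline logs.

```powershell
# Assumes the Azure CLI with the azure-devops extension is installed and you are signed in.
az devops configure --defaults organization=https://dev.azure.com/YourOrg project=YourProject

# Create the group with the non-secret variables...
$group = az pipelines variable-group create `
    --name "TestingCredentials" `
    --variables TENANT_ID="<tenant-guid>" USERNAME_OR_CLIENTID="<client-id-or-upn>" `
    --authorize true | ConvertFrom-Json

# ...then add the credential as a secret so it is masked in logs.
az pipelines variable-group variable create `
    --group-id $group.id `
    --name "PASSWORD_OR_CLIENTSECRET" `
    --value "<client-secret-or-password>" `
    --secret true
```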

Create the Pipeline

  1. Navigate to the pipeline interface.

Navigate to Pipeline

  1. Select the "New Pipeline" button.

New Pipeline

  3. Select the Azure Repos Git option.

ADO Option

  4. Select the repository to which you have connected the workspace via Git Integration.

Select Repo

  5. Copy the contents of the template YAML file located at this link into the code editor.

Copy YAML

  6. Update the default workspace name located on line 5 of the template with the workspace you will typically use for testing.

Update workspace parameter

  7. Select the 'Save and Run' button.

Save and Run

  8. You will be prompted to commit to the main branch. Select the 'Save and Run' button.

Save and Run again

  9. You will be redirected to the first pipeline run, and you will be asked to authorize the pipeline to access the variable group created previously. Select the 'View' button.

  10. A pop-up window will appear. Select the 'Permit' button.

Permit

  11. You will be asked to confirm. Select the 'Permit' button.

Permit Again

  12. This will kick off the automated deployment and testing as described above.

Automated Job

  1. Select the "Automated Deployment and Testing Job".

Select Job

  14. You will see a log of the DAX queries that end in .Tests or .Test running against their respective semantic models in your workspace.

Log

  15. Any failed tests will be logged to the job, and the pipeline will also fail.

Failed Tests

Monitoring

It's essential to monitor the Azure DevOps pipeline for any failures. I've also written about some best practices for setting that up in this article.

PowerShell Modules

The pipeline leverages two PowerShell modules called Invoke-DQVTesting and Invoke-SemanticModelRefresh. For more information, please see Invoke-DQVTesting and Invoke-SemanticModelRefresh respectively.
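
If you want to run either module locally, for example to debug a failing test outside the pipeline, they can be installed from the PowerShell Gallery; the commands below assume the modules are published under the names used in this article.

```powershell
# Assumes both modules are available on the PowerShell Gallery under these names.
Install-Module -Name Invoke-DQVTesting -Scope CurrentUser
Install-Module -Name Invoke-SemanticModelRefresh -Scope CurrentUser
```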

Git Logo provided by Git - Logo Downloads (git-scm.com)