
Announcing new deployment pipelines capabilities

By Nimrod Shalit

Deployment pipelines helps enterprise BI teams build an efficient and reusable release process by maintaining development, test, and production environments.

BI teams adopting deployment pipelines will enjoy:

  • Improved productivity
  • Faster delivery of content updates
  • Reduced manual work and errors

Deployment pipelines is a Premium feature, and is now also available to users with a Premium Per User (PPU) license, which recently became generally available.

We have just released new features that open up many new ways for BI teams to leverage deployment pipelines.


Paginated reports management

We are excited to announce that paginated reports can now be managed inside deployment pipelines. Like other items managed in deployment pipelines, paginated reports support the following capabilities:

  • Detect updates to a paginated report and highlight them via the ‘compare’ button.
  • Deploy a paginated report across the different stages of the pipeline, so you can create new reports in the production stage or update existing ones.
  • Set rules for a paginated report and connect it to a specific data source in the test or production stages. Once a rule for a data source is set, future updates to that report will not change the data source connection.

Read more about paginated report rules in deployment pipelines.
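For teams that script their releases, deploying a specific paginated report between stages can also be done through the Power BI REST API's selective-deploy endpoint. The sketch below only builds the request URL and JSON body rather than sending it; the pipeline and report ids are hypothetical placeholders, and the assumption that paginated reports travel in the same `reports` collection as regular reports should be verified against the API reference.

```python
import json

def build_selective_deploy_request(pipeline_id: str,
                                   source_stage: int,
                                   report_ids: list[str]) -> tuple[str, dict]:
    """Build the URL and JSON body for a selective deployment of reports.

    source_stage: 0 deploys Development -> Test, 1 deploys Test -> Production.
    """
    url = f"https://api.powerbi.com/v1.0/myorg/pipelines/{pipeline_id}/deploy"
    body = {
        "sourceStageOrder": source_stage,
        # Assumption: paginated reports are listed by item id under the same
        # "reports" collection as regular reports.
        "reports": [{"sourceId": rid} for rid in report_ids],
        "options": {
            "allowCreateArtifact": True,     # create the report if it is new in the target stage
            "allowOverwriteArtifact": True,  # update it if it already exists there
        },
    }
    return url, body

# Hypothetical ids, for illustration only:
url, body = build_selective_deploy_request(
    "00000000-0000-0000-0000-000000000000", 1,
    ["11111111-1111-1111-1111-111111111111"])
print(url)
print(json.dumps(body, indent=2))
```

Sending the request additionally requires an Azure AD bearer token with the appropriate pipeline permissions.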


Protect your data in deployment pipelines

Deployment pipelines can now manage content that includes sensitivity labels, enabling users to leverage all the world-class security capabilities that Power BI offers.

Due to their nature, labels are managed differently than other item properties during deployments. For most properties, one of two rules applies:

  • Always copy and always override – for example, report visuals, dashboard tiles, or model schema updates.
  • Never copy and never override – for example, data, URLs, or permissions.

Sensitivity labels behave differently: users can set and retain different labels for different stages, while ensuring items are always labeled properly. You can read more about sensitivity labels in deployment pipelines here.


BI teams can deploy and update datasets

One of the features most requested by BI teams using deployment pipelines is allowing multiple team members to update the same dataset. Unlike publishing and updating a PBIX file from Power BI Desktop, deploying and updating a dataset in deployment pipelines previously required the user to be the dataset owner. We are happy to announce that we have removed this limitation and aligned the permission model with Power BI Desktop: workspace members can update datasets through deployment pipelines.

This creates a consistent experience for BI teams that manage content together in both Power BI Desktop and deployment pipelines, and streamlines dataset updates to production. Read more about deployment pipelines permissions.


Quick access management (available in the coming weeks)

Permission management in deployment pipelines is split between the pipeline itself and each workspace in the pipeline. While this model offers flexibility, granting all the necessary permissions to a user added to a pipeline requires more manual work and is more error-prone.

To make the sharing of pipelines easier and faster, we added the ability to grant access to all the relevant workspaces directly from the pipeline access pane, instead of going through each workspace access pane.

Using ‘Quick access’ to manage permissions is easy, fast and much harder to get wrong.


Deploying major changes in a dataset

When a dataset schema is updated, the pipeline retains the data as much as possible. This keeps content continuously available to end users and avoids downtime. However, some changes are too big to update the schema while keeping the data.

BI creators who manage a model sometimes need to make a large, breaking change to it. When this happens, they can continue the deployment of the dataset while dropping its existing data. The updated dataset will be empty after the deployment, and a full refresh will be required to make data available to end users again.

Because this operation is sensitive, the pipeline stops the deployment each time such a change is detected, and the deploying user must explicitly confirm continuing the deployment, which results in an empty dataset. Learn more.
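Since a deployment that drops data leaves the dataset empty until it is refreshed, a release script would typically follow such a deployment with a full refresh of the target dataset. A minimal sketch, assuming the standard dataset `refreshes` REST endpoint; the workspace/dataset ids and token are hypothetical placeholders, and the request is only constructed here, not sent:

```python
import json
import urllib.request

def build_refresh_request(group_id: str, dataset_id: str,
                          token: str) -> urllib.request.Request:
    """Build (but do not send) the REST call that triggers a dataset refresh."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/refreshes")
    # Optional body: ask the service to e-mail the owner if the refresh fails.
    body = json.dumps({"notifyOption": "MailOnFailure"}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

req = build_refresh_request("<workspace-guid>", "<dataset-guid>", "<aad-token>")
# To actually send it (requires a valid Azure AD token):
#   urllib.request.urlopen(req)
print(req.get_full_url())
```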


What’s coming up next?

We have exciting features that will be available in the coming semester:

  • Deployment automation – we’ve just launched a limited private preview of deployment automation. This is our first step toward automating deployments and integrating with external DevOps tools. In this release, developers will be able to automate deployments of content in existing pipelines and update the Power BI app content after a deployment completes.
  • Dataflows management in deployment pipelines – now that paginated reports can be managed in pipelines, we will also add the ability to manage dataflows. Users will be able to detect changes to a dataflow, deploy the changes to the test and production stages alongside other items, and set rules to connect to specific data in specific stages.
  • Assign workspaces to all the pipeline stages – once this feature is released, users will be able to assign existing workspaces to each of the stages in a pipeline. This allows more flexibility in planning which content is added to a pipeline, and makes it much easier to move existing workspaces under pipeline management.
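As a rough sketch of what deployment automation could look like: the pipeline `deployAll` REST endpoint starts a deployment operation, and a script then polls the operation's status until it finishes. The endpoint URL follows the public Power BI REST API; the status values and the stubbed status function below are simplifying assumptions for illustration, not the service's exact contract.

```python
import time

API = "https://api.powerbi.com/v1.0/myorg"

def deploy_all_url(pipeline_id: str) -> str:
    # "Deploy all" promotes every supported item from the chosen source stage.
    return f"{API}/pipelines/{pipeline_id}/deployAll"

def wait_for_operation(get_status, poll_seconds: float = 0.0,
                       max_polls: int = 100) -> str:
    """Poll a deployment operation until it leaves the in-progress states.

    `get_status` stands in for a GET on the pipeline-operations endpoint;
    the status names here are assumed for the sketch.
    """
    for _ in range(max_polls):
        status = get_status()
        if status not in ("NotStarted", "Executing"):
            return status
        time.sleep(poll_seconds)
    return "TimedOut"

# Demo with a stubbed status sequence instead of real HTTP calls:
statuses = iter(["NotStarted", "Executing", "Succeeded"])
print(wait_for_operation(lambda: next(statuses)))  # -> Succeeded
```

In a real script, `get_status` would issue an authenticated GET against the pipeline operations endpoint and extract the status field from the JSON response.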

There is more to come. Please follow our release notes to track the latest roadmap updates.

Still missing important features? Please post ideas or vote for existing ones, so we know what your team needs.