Optimising Power BI Dashboards
Lee Hawthorn June 01, 2019 #PowerBI

Power BI is such a usable tool that I've found the data and calculations soon grow after deployment. This isn't a bad thing, as the flexibility allows users to experiment with data sources to understand their value.
The downside is that poorly managed data can lead to slow and unreliable models.
There are best practices you can follow, but before diving in it's worth setting up a simple framework:
Assessment -> Measurement -> Change -> Measurement
Assessment
It's important to know about the source of the data: is it a SQL Server database? An SSAS direct connection? A web service? A data warehouse?
Loading data into the Power BI model is different from querying the source directly. In my experience, using DirectQuery against a relational SQL source requires extensive tuning of the SQL database. That tuning is a project in its own right, as the database uses a different technology than Power BI's engine.
Also, what is the destination? Power BI Pro? Power BI Premium?
When are the Power BI models being used and how are they used?
If you have a Finance App that is hit hard by the Finance department on days -3, -2, -1 and +1, it's important to know this.
The Board App could be used by business leaders for their monthly management meeting. If the Finance leader is telling a story with Power BI, you don't want delays while they navigate through the model.
Data size and growth are important too: the model may work fine with 1 million rows, but with 100 million rows things will be different.
Measurement
Things you can measure:
- Size of the data (see the query sketch after this list)
- Size of the compressed model
- Speed of queries
- Memory consumption
- CPU Utilisation
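A couple of these you can check yourself with a query. As a minimal sketch, the DAX query below (run against the model from DAX Studio) compares row counts with distinct counts; the Sales table and its TransactionID and TransactionTime columns are hypothetical names for illustration. Distinct counts that approach the row count point at columns that compress poorly:

```dax
// Run from DAX Studio against the model.
// Sales, TransactionID and TransactionTime are hypothetical names.
// Columns whose distinct count approaches the row count compress
// poorly in VertiPaq and inflate the model.
EVALUATE
ROW (
    "Total rows", COUNTROWS ( Sales ),
    "Distinct transaction IDs", DISTINCTCOUNT ( Sales[TransactionID] ),
    "Distinct timestamps", DISTINCTCOUNT ( Sales[TransactionTime] )
)
```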
Change
Within the Power BI model there are two areas you can work on, each impacting performance in a different way.

Data Model
The Data Model is important as it determines the level of compression you get from VertiPaq, and with it the model size. The better the compression, the less data VertiPaq has to read, so the model design is a big contributor to both model size and query speed.
- Remove any columns that aren't needed
- Reduce the number of unique values in a column
- Move attribute columns out of the fact table and into a dimension
- Push calculated columns and tables back to the source where possible
- Keep a rolling data retention window if possible. I once had a client storing 5 years of data when most of their queries only touched the last 2 years.
- Ensure you have the correct grain of data in the model. For instance, do you need to see individual transactions, or can they be grouped up to day level? (See the sketch after this list.)
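On the grain point, before changing the source feed you can preview what a day-level fact table would look like. Here's a minimal sketch, again with hypothetical names (Sales, OrderDate, ProductKey, SalesAmount); the actual aggregation should then be pushed back to the source rather than kept in the model:

```dax
// Preview the fact table summarised to day/product grain.
// All table and column names here are hypothetical.
// If this returns far fewer rows than the raw table, moving to
// this grain at the source will shrink the model considerably.
EVALUATE
SUMMARIZECOLUMNS (
    Sales[OrderDate],
    Sales[ProductKey],
    "Total Sales", SUM ( Sales[SalesAmount] )
)
```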
Calculated Tables, Columns and Measures
For DAX it's a case of identifying the most expensive calculations in use and rewriting them to improve performance (too deep a topic to cover fully here).
This is going to need advanced DAX skills as well as some understanding of the VertiPaq engine.
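To give a flavour of the kind of rewrite involved, here's one common pattern as a sketch (Sales, Quantity and SalesAmount are hypothetical names, and this is one pattern, not a universal fix): replacing a FILTER over a whole table with a column predicate the storage engine can apply directly. The two forms aren't always semantically identical, so always verify the results match:

```dax
// Slower pattern: FILTER iterates the whole Sales table
// row by row in the formula engine.
Large Order Sales Slow :=
CALCULATE (
    SUM ( Sales[SalesAmount] ),
    FILTER ( Sales, Sales[Quantity] > 10 )
)

// Usually faster: a boolean column predicate, which DAX expands
// to a filter over just the Quantity column, letting the storage
// engine apply it in a single scan.
Large Order Sales Fast :=
CALCULATE (
    SUM ( Sales[SalesAmount] ),
    Sales[Quantity] > 10
)
```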
Tools
You can use a variety of tools to measure performance: DAX Studio for query timings, VertiPaq Analyzer for model and column sizes, SQL Server Profiler for tracing the engine, and the Performance Analyzer built into Power BI Desktop for visual-level timings.
Optimising Power BI is not an easy task; you can only really learn by doing. I hope the info in this post at least sets you on a good path.