By Antriksh Sharma

Performance penalty of Measures in Filters Pane in Power BI

Updated: Oct 5, 2022

In this blog, let's take a look at the performance of a report when you use a Measure in the Filter pane.


Let's say you have a data model like this one:


Sales - 12K rows

Dates - 2.5K rows

Products - 2.1K rows

Customer - 18K rows
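
The examples below also reference a [Total Sales] base measure, which is not defined in the post. A minimal sketch of such a measure, assuming the Sales table has Quantity and Net Price columns (the column names are my assumption):

Total Sales =
SUMX ( Sales, Sales[Quantity] * Sales[Net Price] )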

And then you try to create a report that computes a running total:


Code -

Running Total = CALCULATE ( [Total Sales], DATESYTD ( Dates[Date] ) )
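
If you want to test the measure outside the report, a simple query like this one (a sketch, runnable in DAX Studio against the model above) returns the running total by date:

EVALUATE
SUMMARIZECOLUMNS (
    Dates[Date],
    "Running Total", [Running Total]
)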

But that's not it: over time you will develop the report further and use slicers and the Filter pane to apply filters at the visual and page level.


For example, I have used the Color column from the Products table and the Gender column from the Customer table to apply permanent filters to the visual using the Filter pane:

Now let's investigate the code generated by the visual as well as the timings. I have used Performance Analyzer to capture the query, then cleaned it up and renamed the variables to make the code easier to read.
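
The cleaned query appears as a screenshot in the original post. As a rough sketch, a matrix query with these Filter pane filters follows a shape like this, where the filtered values "Red" and "Male" are placeholders of my choosing:

EVALUATE
SUMMARIZECOLUMNS (
    Dates[Date],
    TREATAS ( { "Red" }, Products[Color] ),
    TREATAS ( { "Male" }, Customer[Gender] ),
    "Total Sales", [Total Sales],
    "Running Total", [Running Total]
)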

When I run this code, it runs in about 115 milliseconds and executes 14 Storage Engine queries, 3 of which are cached and reused later. Most of the work is done by the Formula Engine; the Storage Engine is quick to retrieve the data and feed it to the Formula Engine.

So far the code is how we want it: it is fast and it works. But let's say you want to hide the rows of the matrix where the Sales Amount is blank, and instead of writing this logic in a measure you use the Filter pane for it.

Let's see the query and performance impact of this:
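
The exact query isn't reproduced here, but when a measure is used in the Filter pane, Power BI injects it into the generated query as a filter table built with FILTER over a SUMMARIZECOLUMNS. A simplified sketch of that pattern (the variable name is mine; the real query uses auto-generated names):

DEFINE
    VAR __ValueFilter =
        FILTER (
            KEEPFILTERS (
                SUMMARIZECOLUMNS ( Dates[Date], "Sales Amount", [Total Sales] )
            ),
            NOT ISBLANK ( [Sales Amount] )
        )
EVALUATE
SUMMARIZECOLUMNS (
    Dates[Date],
    __ValueFilter,
    "Running Total", [Running Total]
)

The important point is that the engine has to materialize this filter table in full before it can evaluate the rest of the query.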

When I run this code, it completes in a similar duration.

So we might think that the code is fast. However, the thing to remember here is that the fact table has only 12K rows, and Analysis Services is a fast engine, so a few thousand rows don't even matter. What if we increase that to 12.5 million? The difference will be clearly visible.


With 12.5 million rows, the version without the measure filter runs slightly slower than before:

And the version with the measure filter takes almost the same time:

But this code wasn't actually complex, because we are computing the running total with an optimized time-intelligence function. So let's see what happens if, instead of DATESYTD, we use vanilla DAX.


This time the running total will not reset for each year; instead it will start on 2007-01-01 and go on till 2009-12-31.


Code -

Running Total =
CALCULATE (
    [Total Sales],
    -- DATESYTD ( Dates[Date] )
    Dates[Date] <= MAX ( Dates[Date] )
)
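
Under the hood, a boolean filter like this is shorthand for an explicit FILTER over the column, with MAX evaluated in the outer filter context. Roughly, the measure expands to this equivalent form (my rewrite, not code from the post):

Running Total =
VAR MaxDate = MAX ( Dates[Date] )
RETURN
    CALCULATE (
        [Total Sales],
        FILTER ( ALL ( Dates[Date] ), Dates[Date] <= MaxDate )
    )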


This time the report without the Measure filter runs in 1.4 seconds. This is not a completely fair comparison, though, because in this scenario we are not hiding the rows where Sales Amount is BLANK; we will fix that in a moment.

The report with the Measure filter runs in 2.1 seconds, so we already have a difference of 700 milliseconds.

Now let's fix the first report to not show running total where Sales Amount is blank.
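
The revised measure isn't shown in the post; one way to move the blank check into the measure itself is a sketch like this, assuming [Total Sales] is the measure displayed as Sales Amount in the visual:

Running Total =
VAR MaxDate = MAX ( Dates[Date] )
RETURN
    IF (
        -- return BLANK where there are no sales, so the matrix hides the row
        NOT ISBLANK ( [Total Sales] ),
        CALCULATE ( [Total Sales], Dates[Date] <= MaxDate )
    )

When [Total Sales] is blank, the measure returns BLANK and the matrix removes those rows on its own, without a separate filter table built from the Filter pane.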

And now you can see that we have reduced the timings from 2.1 seconds to 800 ms.


So what is the learning here? The goal of this blog is not to tell you whether you should use a Measure in the Filter pane or not, because it is a feature that you might legitimately use sometimes. The goal is to help you understand and quantify the performance penalty you pay when you put a complex measure in the Filter pane.


I have seen scenarios where users had multiple nested measure calls with context transition and ALLSELECTED, and their visuals would run seemingly forever before ending in an out-of-resources error. The reason: the engine cannot even resolve the Filter pane part of the query, so it never gets to the final result.
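
As a hypothetical illustration of that pattern (not code from those reports), a Filter pane measure along these lines triggers a context transition for every customer inside an ALLSELECTED scan:

Pct of Selected Customers =
DIVIDE (
    [Total Sales],
    -- context transition: [Total Sales] is re-evaluated once per customer
    SUMX ( ALLSELECTED ( Customer ), [Total Sales] )
)

Used in the Filter pane, this has to be evaluated for every candidate row of the visual before the engine can even start on the visible result.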


When the DAX engine has to compute a query, it scans the compressed columns, then materializes and joins them to read the data. If the table generated by the Filter pane is huge, in the millions of rows, the query will already be slow, because that table is then used to filter the rest of the code in the query. So your machine needs enough RAM for the compressed data, for materialization, and for refreshing the data; if not, the user experience will be poor.


We should also consider that this test was conducted on a personal system with the data in import mode; once the report is in the Power BI service, there are several factors that will degrade performance even further. And if you are on DirectQuery, there is a high chance your code won't even work, due to the DAX limitations of DirectQuery as well as the maximum intermediate rowset limit of 1 million rows, which you can change in the Power BI service if you have Premium capacity, but not if you are on PPU or Pro (the Max Intermediate Rowset Count property).


