Introduction
With the introduction of calendar-based Time Intelligence, the need for custom Time Intelligence logic has decreased dramatically.
Now, we can create custom calendars to meet our Time Intelligence calculation needs.
You might have read my article about advanced Time Intelligence:
https://towardsdatascience.com/advanced-time-intelligence-in-dax-with-performance-in-mind/
Most of the custom logic described there is no longer needed.
But there are still scenarios where we need custom calculations, such as a running average.
Some time ago, SQLBI wrote an article about calculating the running average.
This piece applies the same principles described there, with a slightly different approach.
Let’s see how we can calculate the running average over three months by using the new Calendars.
Using classic Time Intelligence
First, we use the standard Gregorian calendar with the classic Time Intelligence date table.
I use an approach similar to the one described in the SQLBI article linked in the References section below.
Running Average by Month =
// 1. Get the last Date for the current Filter Context
VAR MaxDate = MAX( 'Date'[Date] )
// 2. Generate the Date range needed for the Moving Average (three months)
VAR DateRange =
    DATESINPERIOD( 'Date'[Date]
                    ,MaxDate
                    ,-3
                    ,MONTH
                    )
// 3. Generate a table filtered by the Date range generated at step 2
//    This table contains only three rows
VAR SalesByMonth =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Date'[MonthKey]
            ,"#Sales", [Sum Online Sales]
            )
        ,DateRange
        )
RETURN
    // 4. Calculate the Average over the three values in the table generated at step 3
    AVERAGEX( SalesByMonth, [#Sales] )
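To test the measure in DAX Studio, a query along these lines can be used (a sketch for reproduction purposes; the exact query I used is not shown, but the column and measure names follow the ones above):
// Sketch: a possible DAX Studio query to check the running average by month
EVALUATE
SUMMARIZECOLUMNS(
    'Date'[MonthKey]
    ,"Sales", [Sum Online Sales]
    ,"Running Average", [Running Average by Month]
    )
ORDER BY 'Date'[MonthKey]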
When executing this measure in DAX Studio, I get the expected results:

So far, so good.
Using a standard calendar
Next, I created a Calendar named “Gregorian Calendar” and changed the code to use this calendar.
To make this easier to understand, I copied the date table to a new table named “Gregorian Date Table”.
The only change is in the call to the DATESINPERIOD() function.
Instead of using the date column, I use the newly created calendar:
Running Average by Month =
// 1. Get the last Date for the current Filter Context
VAR MaxDate = MAX( 'Gregorian Date Table'[Date] )
// 2. Generate the Date range needed for the Moving Average (three months)
VAR DateRange =
    DATESINPERIOD( 'Gregorian Calendar'
                    ,MaxDate
                    ,-3
                    ,MONTH
                    )
// 3. Generate a table filtered by the Date range generated at step 2
//    This table contains only three rows
VAR SalesByMonth =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Gregorian Date Table'[MonthKey]
            ,"#Sales", [Sum Online Sales]
            )
        ,DateRange
        )
RETURN
    // 4. Calculate the Average over the three values in the table generated at step 3
    AVERAGEX( SalesByMonth, [#Sales] )
As expected, the results are identical:

The performance is excellent, as this query completes in 150 milliseconds.
Using a custom calendar
But what happens when using a custom calendar?
For example, a calendar with 15 months per year and 31 days for each month?
I created such a calendar for my article about use cases for calendar-based Time Intelligence (see the link at the top and in the References section below).
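To illustrate the idea, such a calendar table could be generated with a calculated table along these lines. This is a minimal, hypothetical sketch: the anchor date, the number of years, and the generation logic are assumptions, and my actual table contains additional columns (such as CalendarYearMonth and CalendarEndOfMonthDate) that are not shown here:
Financial Calendar Sketch =
// Hypothetical generator for a custom calendar with 15 months of 31 days each
VAR DaysPerMonth  = 31
VAR MonthsPerYear = 15
VAR YearsToBuild  = 5                      // assumed number of custom years
VAR FirstDate     = DATE( 2021, 1, 1 )     // assumed anchor date
VAR TotalDays     = DaysPerMonth * MonthsPerYear * YearsToBuild
RETURN
    ADDCOLUMNS(
        GENERATESERIES( 0, TotalDays - 1, 1 )
        ,"ID_Date", [Value] + 1
        ,"Date", FirstDate + [Value]
        ,"ID_Month", INT( DIVIDE( [Value], DaysPerMonth ) ) + 1
        ,"MonthOfYear", MOD( INT( DIVIDE( [Value], DaysPerMonth ) ), MonthsPerYear ) + 1
        ,"CalendarYear", INT( DIVIDE( [Value], DaysPerMonth * MonthsPerYear ) ) + 1
        )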
When you look at the code for the measure, you will notice that it’s different:
Running Average by Month (Custom) =
// 1. Get the end-of-month Date of the last month in the current Filter Context
VAR LastSelDate = MAX( 'Financial Calendar'[CalendarEndOfMonthDate] )
// 2. Get the largest ID_Date of that month
VAR MaxDateID =
    CALCULATE( MAX( 'Financial Calendar'[ID_Date] )
                ,REMOVEFILTERS( 'Financial Calendar' )
                ,'Financial Calendar'[CalendarEndOfMonthDate] = LastSelDate
                )
// 3. Get the smallest ID_Date of the month two months earlier
VAR MinDateID =
    CALCULATE( MIN( 'Financial Calendar'[ID_Date] )
                ,REMOVEFILTERS( 'Financial Calendar' )
                ,'Financial Calendar'[CalendarEndOfMonthDate] = EOMONTH( LastSelDate, -2 )
                )
// 4. Generate a table with Sales by Month, filtered by the ID_Date range
//    This table contains only three rows
VAR SalesByMonth =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Financial Calendar'[CalendarYearMonth]
            ,"#Sales", [Sum Online Sales]
            )
        ,'Financial Calendar'[ID_Date] >= MinDateID
            && 'Financial Calendar'[ID_Date] <= MaxDateID
        )
RETURN
    // 5. Calculate the Average over the three values in the table generated at step 4
    AVERAGEX( SalesByMonth, [#Sales] )
The reason for these changes is that this table lacks a date column usable with the DATESINPERIOD() function. Therefore, I must use custom code to calculate the value range for ID_Date.
These are the results:

As you can see, the results are correct.
Optimizing by using a day index
But when I analyze the performance, it’s not that great.
It takes almost half a second to calculate the results.
We can improve performance by removing the need to retrieve the minimum and maximum ID_Date and performing a more efficient calculation.
I know that each month has 31 days, so to go back three months, I must go back 93 days (3 × 31 = 93).
I can use this to create a faster version of the measure:
Running Average by Month (Financial) =
// 1. Get the last Month (ID) in the current Filter Context
VAR SelMonth = MAX( 'Financial Calendar'[ID_Month] )
// 2. Generate the Date range from the last 93 days
VAR DateRange =
    TOPN( 93
        ,CALCULATETABLE(
            SUMMARIZECOLUMNS( 'Financial Calendar'[ID_Date] )
            ,REMOVEFILTERS( 'Financial Calendar' )
            ,'Financial Calendar'[ID_Month] <= SelMonth
            )
        ,'Financial Calendar'[ID_Date], DESC
        )
// 3. Generate a table filtered by the Date range generated at step 2
//    This table contains only three rows
VAR SalesByMonth =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Financial Calendar'[ID_Month]
            ,"#Sales", [Sum Online Sales]
            )
        ,DateRange
        )
RETURN
    // 4. Calculate the Average over the three values in the table generated at step 3
    AVERAGEX( SalesByMonth, [#Sales] )
This time, I used the TOPN() function to retrieve the 93 previous rows from the Financial Calendar table and used this list as a filter.
The results are identical to the previous version:

This version needs only 118 ms to complete.
But can we go even further with the optimization?
Next, I added a new column to the Financial Calendar table to assign a rank to each row. Now, each date has a unique number that directly corresponds to its position in the date sequence:

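If such a rank column does not already exist in the source data, it could be added as a calculated column, for example with RANKX(). This is a minimal sketch, assuming ID_Date is a unique, ascending number; any gaps in ID_Date are closed by the dense ranking:
ID_Date_RowRank =
// Hypothetical calculated column: position of each row when ordered by ID_Date ascending
RANKX(
    ALL( 'Financial Calendar' )
    ,'Financial Calendar'[ID_Date]
    ,
    ,ASC
    ,DENSE
    )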
The measure using this column is the following:
Running Average by Month (Financial) =
// 1. Get the Rank of the last Date in the current Filter Context
VAR MaxDateRank = MAX( 'Financial Calendar'[ID_Date_RowRank] )
// 2. Generate the Date range from the last 93 days
VAR DateRange =
    CALCULATETABLE(
        SUMMARIZECOLUMNS( 'Financial Calendar'[ID_Date] )
        ,REMOVEFILTERS( 'Financial Calendar' )
        ,'Financial Calendar'[ID_Date_RowRank] <= MaxDateRank
            && 'Financial Calendar'[ID_Date_RowRank] >= MaxDateRank - 92
        )
// 3. Generate a table filtered by the Date range generated at step 2
//    This table contains only three rows
VAR SalesByMonth =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Financial Calendar'[ID_Month]
            ,"#Sales", [Sum Online Sales]
            )
        ,DateRange
        )
RETURN
    // 4. Calculate the Average over the three values in the table generated at step 3
    AVERAGEX( SalesByMonth, [#Sales] )
The result is the same, so I won't show it again.
But here is the comparison from the execution statistics:

As you can see, the version using TOPN() is slightly slower than the one using the RowRank column, but the difference is marginal.
More importantly, the version using the RowRank column requires reading more data to complete the calculation (see the Rows column for details), which means higher RAM usage.
With such a small number of rows, however, these differences are still marginal, so it's your choice which version you prefer.
Using a weekly calendar
Lastly, let’s look at a week-based calculation.
This time, I want to calculate the rolling average over the last three weeks.
As the calendar-based Time Intelligence allows for the creation of a week-based calendar, the measure is very similar to the second one:
Running Average by Week =
// 1. Get the last Date for the current Filter Context
VAR MaxDate = MAX( 'Gregorian Date Table'[Date] )
// 2. Generate the Date range needed for the Moving Average (three weeks)
VAR DateRange =
    DATESINPERIOD( 'Week Calendar'
                    ,MaxDate
                    ,-3
                    ,WEEK
                    )
// 3. Generate a table filtered by the Date range generated at step 2
//    This table contains only three rows
VAR SalesByWeek =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Gregorian Date Table'[WeekKey]
            ,"#Sales", [Sum Online Sales]
            )
        ,DateRange
        )
RETURN
    // 4. Calculate the Average over the three values in the table generated at step 3
    AVERAGEX( SalesByWeek, [#Sales] )
The key part is that I use the “WEEK” parameter in the DATESINPERIOD() call.
That’s all.
This is the result of the query:

The performance is excellent, with execution times below 100 ms.
Be aware that the WEEK interval for DATESINPERIOD() is only available with calendar-based Time Intelligence; classic Time Intelligence does not offer it.
Conclusion
As you have seen, calendar-based Time Intelligence makes life easier when writing custom logic: we only need to pass the calendar instead of a date column to the functions, and we can calculate weekly intervals.
However, the current feature set doesn't include a semester interval. When we must calculate semester-based results, we must either use classic Time Intelligence or write custom code.
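For example, a one-semester window can still be expressed as six months (or two quarters) with the classic functions. Here is a minimal sketch, assuming a standard 'Date' table and the [Sum Online Sales] measure used above; the measure name is hypothetical:
Online Sales (Semester) =
// Sketch: a one-semester (six-month) window with classic Time Intelligence
CALCULATE(
    [Sum Online Sales]
    ,DATESINPERIOD( 'Date'[Date]
                    ,MAX( 'Date'[Date] )
                    ,-6
                    ,MONTH
                    )
    )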
We also still need custom logic when our calendar table doesn't have a date column. In such cases, we can't use the standard Time Intelligence functions, as they still work with date columns.
Remember: the most important task when working with calendar-based Time Intelligence is building a consistent and complete calendar table. In my experience, this is also the most complex part.
As a side note, I found some interesting functions for calculating a running average on daxlib.org.
I added a link to the functions in the References section below.
These functions follow a completely different pattern, but I wanted to include them to create a complete picture of this topic.
References
The SQLBI.com article on calculating the running average mentioned above:
https://www.sqlbi.com/articles/rolling-12-months-average-in-dax
Time Series functions on daxlib.org with a different approach:
https://daxlib.org/package/TimeSeries.MovingAverage
My previous article, where I explain calendar-based Time Intelligence:
https://towardsdatascience.com/use-cases-for-the-new-calendar-based-time-intelligence/
Like in my previous articles, I use the Contoso sample dataset. You can download the ContosoRetailDW Dataset for free from Microsoft here.
The Contoso Data can be used freely under the MIT License, as described in this document. I changed the dataset to shift the data to contemporary dates.