Introduction
With the introduction of calendar-based Time Intelligence, the need for custom Time Intelligence logic has decreased dramatically.
Now we can create custom calendars to fulfill our Time Intelligence calculation needs.
You may have read my article about advanced Time Intelligence:
Much of the custom logic is no longer needed.
But there are still scenarios where we need custom calculations, such as a running average.
A while ago, SQLBI published an article about calculating the running average.
This piece applies the same principles described there, but in a slightly different way.
Let’s see how we can calculate the running average over three months using the new Calendars.
Using classic Time Intelligence
First, we use the standard Gregorian calendar with a classic Time Intelligence date table.
I use the same approach as described in the SQLBI article linked in the References section below.
Running Average by Month =
// 1. Get the last Date for the current Filter Context
VAR MaxDate = MAX( 'Date'[Date] )
// 2. Generate the Date range needed for the Moving average (three months)
VAR DateRange =
    DATESINPERIOD( 'Date'[Date]
        ,MaxDate
        ,-3
        ,MONTH
    )
// 3. Generate a table filtered by the Date Range generated at step 2
//    This table contains only three rows
VAR SalesByMonth =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Date'[MonthKey]
            ,"#Sales", [Sum Online Sales]
        )
        ,DateRange
    )
RETURN
// 4. Calculate the Average over the three values in the table generated in step 3
AVERAGEX( SalesByMonth, [#Sales] )
When executing this measure in DAX Studio, I get the expected results:
So far, so good.
Using a standard calendar
Next, I created a Calendar named “Gregorian Calendar” and changed the code to use this calendar.
To make this easier to understand, I copied the date table to a new table named “Gregorian Date Table”.
The change is in the call to the DATESINPERIOD() function.
Instead of using the date column, I use the newly created calendar:
Running Average by Month =
// 1. Get the last Date for the current Filter Context
VAR MaxDate = MAX( 'Gregorian Date Table'[Date] )
// 2. Generate the Date range needed for the Moving average (three months)
VAR DateRange =
    DATESINPERIOD( 'Gregorian Calendar'
        ,MaxDate
        ,-3
        ,MONTH
    )
// 3. Generate a table filtered by the Date Range generated at step 2
//    This table contains only three rows
VAR SalesByMonth =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Gregorian Date Table'[MonthKey]
            ,"#Sales", [Sum Online Sales]
        )
        ,DateRange
    )
RETURN
// 4. Calculate the Average over the three values in the table generated in step 3
AVERAGEX( SalesByMonth, [#Sales] )
As expected, the results are identical:

The performance is excellent, as this query completes in 150 milliseconds.
Using a custom calendar
But what happens when using a custom calendar?
For instance, a calendar with 15 months per year and 31 days per month?
I created such a calendar for my article describing use cases for calendar-based Time Intelligence (see the link at the top and in the References section).
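The full calendar definition is in the linked article. As a rough sketch of the idea (the table name and the generation logic here are illustrative, not the exact code from my model), such a calendar can be built as a calculated table over a day index:

```dax
// Hypothetical sketch: a calendar with 15 months of 31 days each.
// ID_Date is a sequential day index; ID_Month and YearNumber are
// derived from it with pure integer arithmetic.
Custom Financial Calendar =
VAR DaysPerMonth  = 31
VAR MonthsPerYear = 15
VAR YearsToBuild  = 5
VAR TotalDays     = DaysPerMonth * MonthsPerYear * YearsToBuild
RETURN
    ADDCOLUMNS(
        GENERATESERIES( 0, TotalDays - 1, 1 )
        ,"ID_Date",    [Value] + 1
        ,"ID_Month",   INT( [Value] / DaysPerMonth ) + 1
        ,"DayOfMonth", MOD( [Value], DaysPerMonth ) + 1
        ,"YearNumber", INT( [Value] / ( DaysPerMonth * MonthsPerYear ) ) + 1
    )
```

The key point is that such a calendar has no Gregorian date column at all, which is exactly what forces the custom logic shown next.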
When you look at the code for the measure, you’ll notice that it’s different:
Running Average by Month (Custom) =
VAR LastSelDate = MAX( 'Financial Calendar'[CalendarEndOfMonthDate] )
VAR MaxDateID =
    CALCULATE( MAX( 'Financial Calendar'[ID_Date] )
        ,REMOVEFILTERS( 'Financial Calendar' )
        ,'Financial Calendar'[CalendarEndOfMonthDate] = LastSelDate
    )
VAR MinDateID =
    CALCULATE( MIN( 'Financial Calendar'[ID_Date] )
        ,REMOVEFILTERS( 'Financial Calendar' )
        ,'Financial Calendar'[CalendarEndOfMonthDate] = EOMONTH( LastSelDate, -2 )
    )
VAR SalesByMonth =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Financial Calendar'[CalendarYearMonth]
            ,"#Sales", [Sum Online Sales]
        )
        ,'Financial Calendar'[ID_Date] >= MinDateID
            && 'Financial Calendar'[ID_Date] <= MaxDateID
    )
RETURN
AVERAGEX( SalesByMonth, [#Sales] )
The reason for the changes is that this table lacks a date column usable with the DATESINPERIOD() function. For that reason, I must use custom code to calculate the value range for ID_Date.
These are the results:

As you can check, the results are correct.
Optimizing by using a day index
But when I analyze the performance, it’s not that great.
It takes almost half a second to calculate the results.
We can improve performance by removing the need to retrieve the minimum and maximum ID_Date and performing a more efficient calculation.
I know that every month has 31 days.
To go back three months, I know that I must go back 93 days.
I can use this to create a faster version of the measure:
Running Average by Month (Financial) =
// Step 1: Get the last Month (ID)
VAR SelMonth = MAX( 'Financial Calendar'[ID_Month] )
// Step 2: Generate the Date Range from the last 93 days
VAR DateRange =
    TOPN( 93
        ,CALCULATETABLE(
            SUMMARIZECOLUMNS( 'Financial Calendar'[ID_Date] )
            ,REMOVEFILTERS( 'Financial Calendar' )
            ,'Financial Calendar'[ID_Month] <= SelMonth
        )
        ,'Financial Calendar'[ID_Date], DESC
    )
// Step 3: Generate a table filtered by the Date Range generated at step 2
//         This table contains only three rows
VAR SalesByMonth =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Financial Calendar'[ID_Month]
            ,"#Sales", [Sum Online Sales]
        )
        ,DateRange
    )
RETURN
// Step 4: Calculate the Average over the three values in the table generated in step 3
AVERAGEX( SalesByMonth, [#Sales] )
This time, I used the TOPN() function to retrieve the previous 93 rows from the Financial Calendar table and used this list as a filter.
The results are identical to the previous version:

This version needs only 118 ms to complete.
But can we go even further with the optimization?
Next, I added a new column to the Financial Calendar to assign ranks to the rows. Now, each date has a unique number that directly corresponds to its chronological order:

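I won’t show the exact column definition from my model, but a calculated column along these lines would do it (this is a sketch; it assumes ID_Date is unique and defines the chronological order):

```dax
// Sketch of the rank column: a dense, gap-free index over all rows
// of the calendar, ordered ascending by ID_Date.
ID_Date_RowRank =
RANKX(
    ALL( 'Financial Calendar' )
    ,'Financial Calendar'[ID_Date]
    ,
    ,ASC
    ,DENSE
)
```

Because the rank is dense and gap-free, filtering a window of 93 consecutive rank values always returns exactly 93 rows, regardless of the values stored in ID_Date.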
The measure using this column is the following:
Running Average by Month (Financial) =
// Step 1: Get the rank of the last Date in the Filter Context
VAR MaxDateRank = MAX( 'Financial Calendar'[ID_Date_RowRank] )
// Step 2: Generate the Date Range from the last 93 days
VAR DateRange =
    CALCULATETABLE(
        SUMMARIZECOLUMNS( 'Financial Calendar'[ID_Date] )
        ,REMOVEFILTERS( 'Financial Calendar' )
        ,'Financial Calendar'[ID_Date_RowRank] <= MaxDateRank
            && 'Financial Calendar'[ID_Date_RowRank] >= MaxDateRank - 92
    )
// Step 3: Generate a table filtered by the Date Range generated at step 2
//         This table contains only three rows
VAR SalesByMonth =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Financial Calendar'[ID_Month]
            ,"#Sales", [Sum Online Sales]
        )
        ,DateRange
    )
RETURN
// Step 4: Calculate the Average over the three values in the table generated in step 3
AVERAGEX( SalesByMonth, [#Sales] )
The result is the same, so I won’t show it again.
But here is the comparison from the execution statistics:

As you can see, the version using TOPN() is slightly slower than the one using the RowRank column.
But the differences are marginal.
More importantly, the version using the RowRank column requires more data to complete the calculations. See the Rows column for details.
This means more RAM usage.
But with this small number of rows, the differences are still marginal.
It’s your choice which version you prefer.
Using a weekly calendar
Lastly, let’s look at a week-based calculation.
This time, I want to calculate the rolling average over the past three weeks.
Because calendar-based Time Intelligence allows for the creation of a week-based calendar, the measure is very similar to the second one:
Running Average by Week =
// 1. Get the last Date for the current Filter Context
VAR MaxDate = MAX( 'Gregorian Date Table'[Date] )
// 2. Generate the Date range needed for the Moving average (three weeks)
VAR DateRange =
    DATESINPERIOD( 'Week Calendar'
        ,MaxDate
        ,-3
        ,WEEK
    )
// 3. Generate a table filtered by the Date Range generated at step 2
//    This table contains only three rows
VAR SalesByWeek =
    CALCULATETABLE(
        SUMMARIZECOLUMNS(
            'Gregorian Date Table'[WeekKey]
            ,"#Sales", [Sum Online Sales]
        )
        ,DateRange
    )
RETURN
// 4. Calculate the Average over the three values in the table generated in step 3
AVERAGEX( SalesByWeek, [#Sales] )
The key part is that I use the WEEK parameter in the DATESINPERIOD() call.
That’s all.
This is the result of the query:

The performance is excellent, with execution times below 100 ms.
Remember that weekly calculations are only possible with calendar-based Time Intelligence.
Conclusion
As you have seen, calendar-based Time Intelligence makes life easier with custom logic: we only need to pass the calendar instead of a date column to the functions. And we can calculate weekly intervals.
But the current feature set doesn’t include a semester interval. When we need to calculate semester-based results, we must either use classic Time Intelligence or write custom code.
And we still need custom logic when our calendar table has no date column. In such cases, we can’t use the standard time intelligence functions, as they still work with date columns.
Remember: the most important task when working with calendar-based Time Intelligence is building a consistent and complete calendar table. From my experience, this is the most complex part.
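A quick consistency check pays off here. For example, a measure along these lines (a sketch using the column names from the examples above, not code from my model) returns TRUE only when ID_Date forms a contiguous, duplicate-free sequence, which the optimized measures rely on:

```dax
// Hypothetical validation measure: TRUE when ID_Date is unique
// across the whole calendar and has no gaps between MIN and MAX.
Calendar Is Complete =
VAR TotalRows =
    COUNTROWS( ALL( 'Financial Calendar' ) )
VAR DistinctIDs =
    CALCULATE( DISTINCTCOUNT( 'Financial Calendar'[ID_Date] )
        ,REMOVEFILTERS( 'Financial Calendar' )
    )
VAR MinID =
    CALCULATE( MIN( 'Financial Calendar'[ID_Date] )
        ,REMOVEFILTERS( 'Financial Calendar' )
    )
VAR MaxID =
    CALCULATE( MAX( 'Financial Calendar'[ID_Date] )
        ,REMOVEFILTERS( 'Financial Calendar' )
    )
RETURN
    ( TotalRows = DistinctIDs )
        && ( MaxID - MinID + 1 = DistinctIDs )
```

Running such a check once after building or refreshing the calendar catches gaps and duplicates before they silently skew a rolling window.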
As a side note, I found some interesting functions on daxlib.org for calculating a running average.
I added a link to the functions in the References section below.
These functions follow a very different pattern, but I wanted to include them to give a complete picture of this topic.
References
The SQLBI.com article mentioned above on calculating the running average:
https://www.sqlbi.com/articles/rolling-12-months-average-in-dax
Time Series functions on daxlib.org with a unique approach:
https://daxlib.org/package/TimeSeries.MovingAverage
Here is my last article, where I explain Calendar-based Time-Intelligence:
How to Implement Three Use Cases for the New Calendar-Based Time Intelligence
Like in my previous articles, I use the Contoso sample dataset. You can download the ContosoRetailDW dataset for free from Microsoft here.
The Contoso data can be used freely under the MIT License, as described in this document. I changed the dataset to shift the data to contemporary dates.
