Volatility is such an overused term that we may forget how important the assumptions are that lead to the final number.
Let us start with the basics. When we talk about volatility, we mean the volatility of returns, not of the asset price itself. And because volatility changes over time, we need to estimate it over time rather than treat it as a single constant.
One way to do this is to calculate the moving average variance of returns and then derive volatility from it. For example, let us take the price and return statistics of a stock:

Here, returns are derived from price changes using the following formula:
$$ r_t = \frac{S_t - S_{t-1}}{S_{t-1}} $$
After that, we calculate the squared returns, which will later be used when computing the variance:
$$ r_t^2 $$
Next, depending on the period we choose, we calculate the moving average of the squared returns. For example, the average variance over the last 20 days.

As we move forward in time, the 20-day window also moves forward, allowing us to see what the average variance looked like using rolling 20-day frames. Finally, by taking the square root of the variance, we obtain the corresponding volatility estimates.
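The rolling procedure described above can be sketched in a few lines of Python. This is a minimal illustration with simulated returns (the article's actual stock data is not shown here); the function name and the Gaussian test data are my own choices:

```python
import random

random.seed(42)
# Simulated daily returns standing in for the article's stock data
returns = [random.gauss(0, 0.01) for _ in range(60)]

def rolling_volatility(returns, window=20):
    """Rolling volatility: average the squared returns over each window
    (zero-mean assumption, as in the text), then take the square root."""
    vols = []
    for i in range(window, len(returns) + 1):
        frame = returns[i - window:i]
        variance = sum(r * r for r in frame) / window
        vols.append(variance ** 0.5)
    return vols

vols = rolling_volatility(returns)
print(len(vols))  # one estimate per complete 20-day frame
print(round(vols[-1], 4))
```

Each step forward drops the oldest squared return from the frame and adds the newest one, which is exactly the "rolling 20-day frames" behaviour described above.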
At this point we should answer an important question: why do we take the average instead of calculating variance using the exact formula, which looks like this:
$$ \sigma^2 = \frac{1}{n-1} \sum_{i=1}^{n} \left( r_i - \bar{r} \right)^2 $$
The reason is that we simplify the formula using a practical assumption — that the average return is approximately zero.
In other words, we are not interested in how the sea level changes; we are interested in the movement of the waves. Changes in the level itself are negligible compared to the movement of the waves within a short period of time. Therefore, assuming zero mean does not introduce much error, while it significantly simplifies the calculations. This simplification becomes especially helpful when we move to more advanced volatility estimation methods.
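A quick numerical check makes the "sea level" argument concrete. The returns below are made up for illustration; the point is that dropping the mean (and using $n$ instead of $n-1$, the other common simplification) changes the estimate far less than day-to-day sampling noise does:

```python
# Illustrative daily returns (invented for this example)
returns = [0.012, -0.008, 0.005, -0.011, 0.007, 0.003, -0.004, 0.009]
n = len(returns)
mean = sum(returns) / n

# Exact sample variance: subtract the mean, divide by n - 1
exact = sum((r - mean) ** 2 for r in returns) / (n - 1)

# Simplified estimator: assume zero mean, divide by n
approx = sum(r * r for r in returns) / n

print(exact, approx)  # the two estimates are close
```

For daily data the mean return is tiny relative to the typical return size, so the squared-mean term being dropped is negligible.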
Exponentially Weighted Moving Average (EWMA)
Because the traditional Moving Average approach is quite rigid, in practice the EWMA method has become more widely used. In EWMA, more recent observations receive higher weights than older ones. Past information does not disappear — it simply loses weight over time.
The formula is:
$$ \sigma_t^2 = \lambda\,\sigma_{t-1}^2 + (1-\lambda)\,r_{t-1}^2 $$
This means that today’s variance estimate is mostly yesterday’s variance, slightly adjusted by the most recent squared return, given that \( \lambda = 0.94 \).
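The recursion is simple enough to write out directly. A sketch in Python, with one caveat: the seed (the initial variance) is a modelling choice that the formula itself does not dictate; defaulting to the first squared return is just one pragmatic option:

```python
def ewma_variance(returns, lam=0.94, seed_var=None):
    """EWMA recursion: new_var = lam * old_var + (1 - lam) * r**2.
    The seed is a modelling choice; the first squared return is one
    common pragmatic default."""
    if seed_var is None:
        seed_var = returns[0] ** 2
    var = seed_var
    path = [var]
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r * r
        path.append(var)
    return path

path = ewma_variance([0.01, 0.02, -0.005])
print([round(v, 8) for v in path])
```

Note how each new observation nudges the running variance by only 6% of its squared value, while 94% of yesterday's estimate is carried forward.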
There are several interesting points here.
The number 0.94 does not come from theory. In the 1990s, J.P. Morgan launched a major research project called RiskMetrics. The goal was to standardize risk measurement across financial institutions.
Researchers tested many different models across multiple markets — equities, bonds, and currencies — and asked a very practical question:
Which value of λ forecasts realized volatility most accurately?
After extensive testing, they found that 0.94 worked very well for daily data.
This is why it became an industry standard. Not because it has some special mathematical property — but because it works well in practice.
What Lambda Actually Means
If λ = 0.94, then 1-λ = 0.06.
This means yesterday’s squared return receives a 6% weight in today’s variance estimate.
But the interesting part is how weights decay over time.
| Days ago | Weight |
|---|---|
| 1 | 6.0% |
| 2 | 5.64% |
| 3 | 5.30% |
| 10 | ~3.4% |
| 50 | ~0.3% |
Old observations never fully disappear — they simply lose influence gradually.
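The weights in the table follow directly from unrolling the recursion: the squared return from $k$ days ago carries weight $(1-\lambda)\lambda^{k-1}$. A small sketch to reproduce them:

```python
def ewma_weight(days_ago, lam=0.94):
    """Weight on the squared return from `days_ago` days back:
    (1 - lam) * lam ** (days_ago - 1)."""
    return (1 - lam) * lam ** (days_ago - 1)

for k in (1, 2, 3, 10, 50):
    print(k, f"{100 * ewma_weight(k):.2f}%")
```

The weights form a geometric series that sums to 1, which is why no explicit normalization is needed.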
The “11-Day” Intuition
In EWMA, the concept of half-life is often used. It represents the time after which the influence of information declines by half.
It is calculated using the formula:
$$ h = \frac{\ln(0.5)}{\ln(\lambda)} $$
If λ = 0.94:
$$ h = \frac{\ln(0.5)}{\ln(0.94)} \approx 11.2 $$
This means that the impact of a market shock is cut in half after about 11 trading days.
In other words, if the market experiences a large movement today, the effect of that shock will be roughly half as strong about two weeks later, although its influence can still be felt for roughly 60 days.
This matches real market dynamics quite well.
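The half-life computation takes two lines to verify:

```python
import math

lam = 0.94
half_life = math.log(0.5) / math.log(lam)
print(f"half-life ≈ {half_life:.1f} trading days")

# Sanity check: after `half_life` days the shock's weight has
# decayed to exactly half of its original size
print(round(lam ** half_life, 3))  # 0.5
```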
Time Interval and λ
An important detail is that λ depends on the frequency of the data.
RiskMetrics suggested:
| Data Frequency | λ |
|---|---|
| Daily | 0.94 |
| Monthly | 0.97 |
The logic is simple.
If we observe data less frequently (for example once per month), the model needs a longer memory, so λ must be larger. In other words, when the time interval changes, λ should change as well.
Finally: Moving Average vs EWMA
| Feature | Moving Average (MA) | EWMA |
|---|---|---|
| Weights | All observations receive equal weight | Recent observations receive larger weights |
| Old data influence | Disappears completely outside the window | Gradually declines but never disappears |
| Reaction to market shocks | Can jump suddenly when data leaves the window | Adjusts more smoothly |
| Memory | Fixed window (e.g., 20 days) | Practically infinite but exponentially decaying |
| Model nature | Very simple but somewhat artificial | More realistic for market dynamics |
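The "reaction to market shocks" row in the table is easy to demonstrate. The sketch below runs both estimators over an invented return series (flat, with one large shock) rather than the article's Excel data; once the shock leaves the 20-day window, the MA estimate collapses abruptly to zero, while the EWMA estimate decays smoothly and never quite reaches zero:

```python
# Flat returns with a single large shock at day 10 (illustrative data)
returns = [0.0] * 10 + [0.05] + [0.0] * 40

window, lam = 20, 0.94
ewma = 0.0  # seed at zero so only the shock drives the EWMA
ma_path, ewma_path = [], []

for i, r in enumerate(returns):
    ewma = lam * ewma + (1 - lam) * r * r
    ewma_path.append(ewma)
    frame = returns[max(0, i - window + 1):i + 1]
    ma_path.append(sum(x * x for x in frame) / len(frame))

# Day 35: the shock has left the MA window but still lingers in the EWMA
print(ma_path[35], ewma_path[35] > 0)
```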
Our Excel model produces the following graphs:

Adapted from:
Options, Futures & Other Derivatives, John C. Hull