11 Time Series Methods of Forecasting
Vikas Singla
11.1 OBJECTIVES
This chapter will help students understand:
- Importance of time period in fluctuation of data
- Role of variation of data in selecting a particular forecasting technique
- Methods of forecasting when variation in demand is random: Simple Moving Average, Weighted Moving Average, Exponential Smoothing
11.2 INTRODUCTION
Time series models of forecasting use past data to predict future values. Such data is recorded at regular intervals over a period of time (e.g. hourly, weekly, monthly, annually). The data collected over the selected time period can relate to sales of a product, absenteeism rate, operating time on a machine, profits, etc. For example, sales figures collected over a period of four weeks can be used to predict sales of the fifth week.
The most important aspect to understand is the time interval over which data has to be collected. Should a particular data set be collected hourly, daily, weekly, monthly or at some other rate? The fluctuation visible in the data depends strongly on the time interval selected: it can show a trend, seasonal, cyclical or random variation, and this fluctuation behaviour is instrumental in deciding the proper forecasting technique. Time intervals are always discussed in a relative sense; terms such as short, medium and long term are relative to the context in which they are used. Table 11.1.1 relates the data pattern, and the type of forecasting technique (some of which are discussed in this chapter), to the time period. Short term refers to data collected over a period of less than 3 months, medium term to three months to two years, and long term to more than two years.
Table 11.1.1

| Time period | Data pattern | Forecasting technique |
| --- | --- | --- |
| Short term: less than 3 months | Data does not show any particular pattern or variation. | Simple Moving Average, Weighted Moving Average, Exponential Smoothing |
| Medium term: 3 months to two years | Data either repeats periodically or moves along a particular direction, i.e. upward or downward. | — |
| Long term: more than 2 years | Data shows a repeated pattern but at irregular intervals with irregular intensity. | — |
The selected time period also influences prediction accuracy. If data is collected over a very long period it might provide good results, but the older data may no longer be relevant to the present scenario. Likewise, if only recent data is used, it might not be sufficient to give reliable predictions. Thus, the time period and the availability of data over that period both influence the accuracy of forecasting.
In previous modules we discussed four ways in which data can fluctuate. These patterns are briefly illustrated here:
- Seasonal: The amount of traffic increases twice every day and shows a similar pattern on the next working day. Data regarding traffic will therefore show a repetitive cycle with equal time intervals.
- Cyclical: Fashion keeps changing; a style of clothing that has gone out of fashion will again be considered in vogue. But this happens at irregular time intervals, and it is very difficult to predict when one style will go out of fashion and when it will reappear.
- Trend: A mobile phone company might see an upward trend in sales of its smartphones over the last six to eight months, but with a competitor launching a new variant the company's sales might start to slow down week by week. So the monthly data showed an increasing trend, whereas the weekly data showed a decreasing trend.
- Random: The price of a share in the stock market fluctuates irregularly, and its fluctuation is recorded hourly or at even smaller time intervals.
Thus, which forecasting model a firm should select depends on:
- Time horizon to forecast
- Availability of data
- Accuracy required
Taking these aspects into consideration, the following sections discuss three forecasting techniques for data showing random variation. These methods are:
- Simple Moving Average
- Weighted Moving Average
- Exponential Smoothing
11.3 SIMPLE MOVING AVERAGE
Data is considered to follow random fluctuations if it is neither increasing nor decreasing over a short-term period and also does not show any seasonal pattern over the medium term. The absence of pattern in the data is reflected when it is recorded at short time intervals such as hourly, daily or weekly. The method is called a moving average because the averaging window keeps moving from one time period to the next. For instance, suppose we have data for the last 12 weeks and we need to forecast the 6th week using the past five weeks of data. This can be done by taking the actual data of the first five weeks and calculating its simple average. Similarly, the forecast for the 7th week would require the data of the past five weeks, i.e. from the 2nd to the 6th week, and in the same fashion the window keeps moving by dropping the earliest week and adding the most recent week's data.
This has been illustrated by following example:
Example 11.2.1: For the following weekly demand for 10 weeks (shown in Table 11.2.1), predict demand from the 4th week to the 11th week using the past 3 weeks' data.
Table 11.2.1

| Week | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Actual Demand | 800 | 1400 | 1000 | 1500 | 1500 | 1300 | 1800 | 1700 | 1300 | 1700 |
To predict for the 4th week using past data, calculate the simple average of the actual demand of week 1, week 2 and week 3:
Average (4th week)
= Actual demand of (week 1 + week 2 + week 3) / 3
= (800 + 1400 + 1000)/3
= 1067 units
Average (5th week)
= Actual demand of (week 2 + week 3 + week 4) / 3
= (1400 + 1000 + 1500)/3
= 1300 units
Similarly, forecasted demand can be calculated for each week, as shown in Table 11.2.2.
Table 11.2.2

| Week | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Actual Demand | 800 | 1400 | 1000 | 1500 | 1500 | 1300 | 1800 | 1700 | 1300 | 1700 |
| Forecasted demand | — | — | — | 1067 | 1300 | 1333 | 1433 | 1533 | 1600 | 1600 |
Forecasted demand for week 11
= Actual demand of (week 8 + week 9 + week 10) / 3
= (1700 + 1300 +1700) / 3
= 1567 units
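The same moving average calculation can be expressed compactly in code. The sketch below is a minimal illustration only; the function name `moving_average_forecast` and the surrounding script are assumptions made for this example, not part of the chapter.

```python
# Minimal sketch of a simple moving average forecast (names are illustrative).

def moving_average_forecast(demand, window):
    """Forecast the next period as the average of the last `window` actual values."""
    if len(demand) < window:
        raise ValueError("need at least `window` periods of history")
    return sum(demand[-window:]) / window

# Weekly demand from Table 11.2.1
demand = [800, 1400, 1000, 1500, 1500, 1300, 1800, 1700, 1300, 1700]

# Forecast for week 11 uses weeks 8, 9 and 10
print(round(moving_average_forecast(demand, window=3)))  # ~1567 units
```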
Which time period to select: Sensitivity vs. Stability
Proper selection of the time period is an important aspect of accurate prediction. Different time periods smooth the random fluctuations of past data to different degrees. The longer the moving average period, the more the random variation is smoothed out, which is desirable in most cases. But long intervals run the danger of missing or ignoring variations that show a particular trend (increasing or decreasing) over small time intervals. Thus, long intervals provide stable or consistent results, whereas short time periods are more sensitive to abrupt or unexpected variation in the data. For example, daily data on a stock collected over a week may show random variation every day during that week, so a moving average of the daily data can be used to predict its performance for the first day of the next week. But if the data is recorded hourly, certain increasing or decreasing trends can be observed that were missed in the daily data set. Thus, data collected over a short interval is more sensitive to such variation, whereas the same data recorded over a longer interval is smoother and more stable.
Such a scenario is explained in the following illustration.
Example 11.2.2: For the following weekly data (shown in Table 11.2.3), forecast demand using 3-week and 9-week moving averages.
Table 11.2.3

| Week | Demand | 3-week | 9-week | Week | Demand | 3-week | 9-week | Week | Demand | 3-week | 9-week |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | 800 | — | — | 11 | 1,700 | 1,567 | 1,467 | 21 | 2,400 | 2,167 | 2,011 |
| 2 | 1,400 | — | — | 12 | 1,500 | 1,567 | 1,500 | 22 | 2,600 | 2,233 | 2,111 |
| 3 | 1,000 | — | — | 13 | 2,300 | 1,633 | 1,556 | 23 | 2,000 | 2,467 | 2,144 |
| 4 | 1,500 | 1,067 | — | 14 | 2,300 | 1,833 | 1,644 | 24 | 2,500 | 2,333 | 2,111 |
| 5 | 1,500 | 1,300 | — | 15 | 2,000 | 2,033 | 1,733 | 25 | 2,600 | 2,367 | 2,167 |
| 6 | 1,300 | 1,333 | — | 16 | 1,700 | 2,200 | 1,811 | 26 | 2,200 | 2,367 | 2,267 |
| 7 | 1,800 | 1,433 | — | 17 | 1,800 | 2,000 | 1,800 | 27 | 2,200 | 2,433 | 2,311 |
| 8 | 1,700 | 1,533 | — | 18 | 2,200 | 1,833 | 1,811 | 28 | 2,500 | 2,333 | 2,311 |
| 9 | 1,300 | 1,600 | — | 19 | 1,900 | 1,900 | 1,911 | 29 | 2,400 | 2,300 | 2,378 |
| 10 | 1,700 | 1,600 | 1,367 | 20 | 2,400 | 1,967 | 1,933 | 30 | 2,100 | 2,367 | 2,378 |
Fig. 11.2.1: Moving average forecasts of the demand in Table 11.2.3 with 3-week and 9-week periods
Fig. 11.2.1 shows graphically the effect of different moving average periods. It can be seen from the graph that with a 9-week period the prediction line is much smoother, which makes it easier to predict the next period's demand. The 3-week moving average line, however, clearly shows a sudden and sharp increase in demand after the 13th week, which falls off suddenly after the 16th week and is then followed by a gradual increase. This line also shows that growth levels off after the 23rd week. Thus, the 3-week moving average responds better in following these changes than the 9-week curve. The illustration clearly demonstrates the stability of longer periods and the sensitivity of shorter ones.
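The contrast between the two columns of Table 11.2.3 can also be reproduced with a rolling version of the same average. This is an illustrative sketch only, using the first 10 weeks of the table; the helper `rolling_forecasts` is an assumed name, not a chapter-defined function.

```python
# Rolling moving-average forecasts, used to contrast short and long windows.

def rolling_forecasts(demand, window):
    """Forecast each period that has `window` prior observations available."""
    return [round(sum(demand[t - window:t]) / window)
            for t in range(window, len(demand) + 1)]

# First 10 weeks of Table 11.2.3
demand = [800, 1400, 1000, 1500, 1500, 1300, 1800, 1700, 1300, 1700]

# 3-week window: sensitive, follows week-to-week swings (forecasts for weeks 4-11)
print(rolling_forecasts(demand, 3))  # [1067, 1300, 1333, 1433, 1533, 1600, 1600, 1567]

# 9-week window: stable and much smoother (forecasts for weeks 10-11)
print(rolling_forecasts(demand, 9))  # [1367, 1467]
```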
11.4 WEIGHTED MOVING AVERAGE
The main disadvantage of the simple moving average is that it gives equal weight to each past data value. This shortcoming is not too severe when data is collected over a short period, but if the moving average is calculated over a longer period, say 60 days, the problem of data relevance becomes more acute. The importance of data decreases with the passage of time because of changing business conditions; data that is very old may not be relevant in the present scenario, so its contribution to the prediction should be minimal.
The weighted moving average addresses this issue. In this method the oldest data is given the least weight and the most recent data the maximum weight. The weights should sum to 1.

Example 11.3.1: A manager has collected sales data for the last four weeks and would like to predict sales for the coming week, treating the first week as least important and the most recent week as most important (Table 11.3.1).
Table 11.3.1

| Week | 1 | 2 | 3 | 4 |
| --- | --- | --- | --- | --- |
| Sales | 100 | 90 | 105 | 95 |
| Weight (%) | 10 | 20 | 30 | 40 |
Forecast for 5th week
= 0.10*100 + 0.20*90 + 0.30*105 + 0.40*95
= 97.5 units.
Selection of weights is the prerogative of the manager, guided by a simple principle: the most recent data is the most reliable basis for predicting the next period, and the oldest data is the least reliable.
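The weighted average of Example 11.3.1 translates directly into code. The following is a minimal sketch assuming the data of Table 11.3.1; the helper name `weighted_moving_average` is an assumption made for this illustration.

```python
# Minimal sketch of a weighted moving average forecast (Example 11.3.1).

def weighted_moving_average(values, weights):
    """Weighted average of `values` (oldest first); weights must sum to 1."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(v * w for v, w in zip(values, weights))

sales = [100, 90, 105, 95]          # weeks 1-4 from Table 11.3.1
weights = [0.10, 0.20, 0.30, 0.40]  # oldest week gets the least weight

print(weighted_moving_average(sales, weights))  # 97.5 units, forecast for week 5
```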
11.5 EXPONENTIAL SMOOTHING
Exponential smoothing is another method used for predicting data that varies over a short-term period and does not follow any regular pattern. In the two moving-average methods discussed above, the process of calculating averages becomes tedious if the data set is large and the time period selected is also long. The exponential smoothing formula, discussed below, requires only three values: the forecast for the immediately preceding period, the actual demand for that period, and a smoothing coefficient.
This method uses the following formula for predicting demand or any other data:

Ft+1 = αDt + (1 – α)Ft    ...(1)

Where:
Ft+1 = forecast for the next period
Dt = actual demand for the immediately preceding period
Ft = forecasted demand for the immediately preceding period
α = smoothing coefficient (0 ≤ α ≤ 1)
Selection of the smoothing coefficient 'α':
A proper value of the smoothing coefficient can be determined in the following two ways:
1. An 'α' value that results in the least forecasting error is considered to give the most accurate results. Forecast error is defined as the difference between forecasted and actual demand. Methods of calculating forecasting errors and their application in choosing a proper 'α' are discussed in the next module.
2. Another consideration in choosing 'α' is its role in smoothing irregular data. As with the moving average methods, accurate forecasting depends on how well the data is smoothed by averaging; this approach is adopted because the variation in the data is random or of unknown pattern. In exponential smoothing the random data is smoothed using a smoothing coefficient denoted by α, whose value ranges from 0 to 1. As equation (1) shows, 'α' multiplies the actual demand of the previous period and (1 – α) multiplies the forecasted demand of the previous period. A manager would therefore tend to use a higher value of the smoothing coefficient (such as 0.7 or 0.8) if the actual demand data is considered more important than the forecasted demand; in that case actual demand contributes more to the estimate of the next period's demand. This can also be interpreted as follows: a higher value of the smoothing coefficient suggests that the previous forecast was less accurate or reliable, so the manager gives more importance to the actual demand data. Similarly, a lower value of 'α' (such as 0.3 or 0.4) implies more confidence in the forecasted data, whose contribution to the next period's estimate is then larger than that of the actual demand. This is illustrated by the short sketch below and by the following example.
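Unrolling equation (1) makes the role of α concrete: each successively older demand value receives a weight reduced by the factor (1 – α). The snippet below is an illustrative sketch only; the helper name `implicit_weights` is an assumption, not something defined in the chapter.

```python
# Unrolling equation (1) shows that exponential smoothing implicitly weights
# past demand by alpha, alpha*(1-alpha), alpha*(1-alpha)**2, ... - each older
# period's weight shrinks by a factor of (1 - alpha). Illustrative only.

def implicit_weights(alpha, periods=5):
    """Weight placed on demand from 1, 2, ... `periods` periods ago."""
    return [round(alpha * (1 - alpha) ** k, 3) for k in range(periods)]

print(implicit_weights(0.7))  # [0.7, 0.21, 0.063, 0.019, 0.006] - recent data dominates
print(implicit_weights(0.3))  # [0.3, 0.21, 0.147, 0.103, 0.072] - weights spread out, smoother
```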
Example 11.4.1: Using smoothing coefficients α = 0.3 and α = 0.7, forecast the following time series data (Table 11.4.1), taking the actual demand of week 1 as the forecast for week 2.
Table 11.4.1

| Time (in weeks) | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Actual Demand | 27 | 30 | 32 | 31 | 28 | 27 | 30 | 33 | 31 | 31 |
Exponential smoothing uses the following formula:
Ft+1 = αDt + (1- α)Ft
Using α = 0.3, the forecast for week 2 would be:
F2 = αD1 + (1 – α)F1
= 0.3 × 27 + 0.7 × 27
= 27

Similarly, F3 = αD2 + (1 – α)F2
= 0.3 × 30 + 0.7 × 27
= 27.9
Forecasts for the other weeks can be calculated in a similar fashion. Table 11.4.2 shows the final results with forecasted demand for both α = 0.3 and α = 0.7.
Table 11.4.2

| Time (in weeks) | Actual Demand | Forecast demand (α = 0.3) | Forecast demand (α = 0.7) |
| --- | --- | --- | --- |
| 1 | 27 | — | — |
| 2 | 30 | 27 | 27 |
| 3 | 32 | 27.9 | 29.1 |
| 4 | 31 | 29.13 | 31.13 |
| 5 | 28 | 29.69 | 31.03 |
| 6 | 27 | 29.18 | 28.91 |
| 7 | 30 | 28.52 | 27.57 |
| 8 | 33 | 28.97 | 29.27 |
| 9 | 31 | 30.17 | 31.88 |
| 10 | 31 | 30.42 | 31.26 |
Fig. 11.4.1: Exponential smoothing forecasts of the demand in Table 11.4.1 with α = 0.3 and α = 0.7
Fig. 11.4.1 clearly shows that α = 0.3 provides a stronger smoothing effect on the random demand data than α = 0.7. From this it can be inferred that the forecast for week 11 would be more accurate using α = 0.3.

Forecast for week 11 = 0.3 × 31 + 0.7 × 30.42
= 30.59
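Equation (1) and Example 11.4.1 can be reproduced with a short script. This is a minimal sketch under the example's assumptions (the forecast for week 2 is seeded with the actual demand of week 1); the function name is illustrative, not part of the chapter.

```python
# Minimal sketch of exponential smoothing, following equation (1).

def exponential_smoothing(demand, alpha):
    """Return forecasts for periods 2..n+1 given actual demand for periods 1..n."""
    forecasts = [demand[0]]                     # F2 = D1, as in Example 11.4.1
    for d in demand[1:]:
        forecasts.append(alpha * d + (1 - alpha) * forecasts[-1])
    return forecasts

demand = [27, 30, 32, 31, 28, 27, 30, 33, 31, 31]   # Table 11.4.1

low = exponential_smoothing(demand, alpha=0.3)
high = exponential_smoothing(demand, alpha=0.7)

# Forecast for week 11 with each coefficient; about 30.6 for alpha = 0.3
# (the text's 30.59 comes from rounding intermediate forecasts to two decimals).
print(round(low[-1], 2), round(high[-1], 2))
```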
11.6 SUMMARY
This chapter discusses methods of forecasting when demand, or any other data set, varies over a short-term period. It has been emphasized that over such short periods it is difficult to assign a pattern to the fluctuation. Thus, methods such as the simple moving average, weighted moving average and exponential smoothing, which have been discussed with illustrations, are applied when the fluctuation in the data is random and of unknown pattern. The simple moving average gives equal importance to every data value, but with the passage of time old data has less relevance to present conditions and its importance decreases. The weighted moving average was therefore devised to give different weights to different data values depending on their position in the time series, with the oldest data given the least importance and the most recent the maximum. These two methods share another problem: in both, a manager has to carry all of the data to predict future periods, which becomes quite tedious for large data sets. The exponential smoothing method resolves this problem to a great extent.
11.7 GLOSSARY
- Time Series analysis: A type of forecast in which data relating to past demand are used to predict future demand.
- Simple Moving Average: A forecasting technique in which every past data value is given equal weight.
- Weighted Moving Average: A forecasting technique in which older data is given less importance than recent data.
- Exponential Smoothing: A forecasting technique in which each increment of past demand data is decreased by (1 – α).
11.8 REFERENCES/ SUGGESTED READINGS
- Chase, R.B., Shankar, R., Jacobs, F.R. and Aquilano, N.J., Operations & Supply Chain Management, 12th Edition, McGraw Hill.
- Stevenson, W.J., Operations Management, 9th Edition, Tata McGraw Hill.
- Krajewski, L.J., Operations Management, 8th Edition, Prentice-Hall of India, New Delhi.
11.9 Short Answer Questions
1. Which of the following is true concerning the smoothing coefficient (α) used in exponential smoothing?
a. α = 0.4 means the forecast for the next period gives 40% weight to the most recent demand and 60% weight to the previous forecast.
b. If α = 1, the forecast is equivalent to the naive forecast.
c. The higher the value of α, the less the effect of smoothing.
d. All of the above.
Answer: d
2. Which of the following forecasting methods is not used for short-term forecasting?
a. Seasonal Method b. Simple Moving Average
c. Weighted Moving Average d. Exponential Smoothing
Answer: a
3. Exponential smoothing only includes the most recent demand and ignores earlier demands, so the simple moving average is better.
a. True b. False
Answer: b
4. Data forecasted using short time periods are more ______ whereas those forecasted with longer time intervals are more ______.
Answer: sensitive, stable
11.10 Model Questions
1. What advantages does the exponential smoothing method of forecasting have over moving averages?
2. How does the number of periods in a moving average affect the responsiveness of the forecast?
3. What factors are considered in the choice of the exponential smoothing coefficient?
4. For the following data, make a forecast for this week on the following basis:
   (a) Daily, using a simple four-week moving average.
   (b) Daily, using a weighted average of 0.40, 0.30, 0.20 and 0.10 for the past four weeks.
   (c) Daily, using a smoothing coefficient of 10% for this week. Take last week's Saturday demand as the initial forecast for this week's Monday.
| Day | 4 Weeks Ago | 3 Weeks Ago | 2 Weeks Ago | Last Week |
| --- | --- | --- | --- | --- |
| Monday | 2,200 | 2,400 | 2,300 | 2,400 |
| Tuesday | 2,000 | 2,100 | 2,200 | 2,200 |
| Wednesday | 2,300 | 2,400 | 2,300 | 2,500 |
| Thursday | 1,800 | 1,900 | 1,800 | 2,000 |
| Friday | 1,900 | 1,800 | 2,100 | 2,000 |
| Saturday | 2,800 | 2,700 | 3,000 | 2,900 |