Time Series


Build a machine learning model that can forecast liquor sales

Time series
A time series is a collection of observations of well-defined data items obtained through repeated measurements over time.

Examples of Time Series

A time series can be constructed from any data that is measured over time at evenly spaced intervals. Historical stock prices, earnings, GDP, or other sequences of financial or economic data can be analyzed as a time series.

Linear vs. nonlinear time series data

A linear time series is one where each data point Xt can be viewed as a linear combination of past or future values or differences. Nonlinear time series are generated by nonlinear dynamic equations and have features that cannot be modelled by linear processes: time-changing variance, asymmetric cycles, higher-moment structures, thresholds and breaks. Here are some important considerations when working with linear and
nonlinear time series data:

If a regression equation doesn’t follow the rules for a linear model, then it must be a nonlinear model. Nonlinear regression can fit an enormous variety of curves. The defining characteristic for both types of models is the functional form.

Why Time Series


Time series analysis helps organizations and businesses understand the underlying causes of trends or systemic patterns over time.
Using data visualizations, business users can see seasonal trends and dig deeper into why these trends occur. With modern analytics platforms, these visualizations can go far beyond line graphs. When organizations
analyze data over consistent intervals, they can also use time series forecasting to predict the likelihood of future events.
Time series forecasting is part of predictive analytics. It can show likely changes in the data, like seasonality or cyclic behavior, which provides a better understanding of data variables and helps forecast better.

Real-life examples of time series

Forecasting the closing price of a stock each day.


Forecasting product sales in units sold each day for a store.
Forecasting subscriber rates.
Forecasting the average price of gasoline each day.

Understanding averages in time series

Moving Averages

A moving average model leverages the average of the data points that exist in a specific overlapping subsection of the series. An average is taken from the first subset of the data, and then it is moved forward to the next data
point while dropping out the initial data point. A moving average can give you information about the current trends, and reduce the amount of noise in your data. Often, it is a preprocessing step for forecasting.

Simple Moving Average

A simple moving average (SMA) is calculated by taking the subset of data described above, adding the data points together, and dividing by the number of points in the subset. It can help identify the direction of trends in your data and identify levels of resistance, points in business or trading data where there is a price ceiling that can’t be broken through. For instance, if you’re trying to identify the point beyond which you can’t charge for a product, or why a stock can’t move past a certain price point, you can identify that ceiling with a moving average.

Moving averages are used by investors and traders for analyzing short-term trends in stock market data, while SMAs are used in healthcare to understand current trends in surgeries and even to analyze quality control among healthcare providers.
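As a rough sketch of the calculation (the monthly values below are the first few sales figures from the dataset introduced later, and the window length of 3 is an arbitrary choice):

import pandas as pd

# first few monthly sales values from the dataset used later in this write-up
sales = pd.Series([1509, 1541, 1597, 1675, 1822])

# 3-period simple moving average: the mean of each overlapping window of 3 points
sma_3 = sales.rolling(window=3).mean()
print(sma_3)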

Components that can occur in time series data

Level: When you read about the “level” or the “level index” of time series data, it’s referring to the mean of the series.

Noise: All time series data will have noise or randomness in the data points that aren’t correlated with any explained trends. Noise is unsystematic and is short term.

Seasonality: If there are regular and predictable fluctuations in the series that are correlated with the calendar – quarterly, weekly, or even by day of the week – then the series includes a seasonality component. It’s important to note that seasonality is domain specific; for example, real estate sales are usually higher in the summer months than in the winter months, while regular retail usually peaks at the end of the year. Also, not all time series have a seasonal component; audio or video data, for example, typically does not.

Trend: When referring to the “trend” in time series data, it means that the data has a long term trajectory which can either be trending in the positive or negative direction. An example of a trend would be a long term
increase in a company’s sales data or network usage.

Cycle: Repeating periods that are not related to the calendar. This includes business cycles such as economic downturns or expansions or salmon run cycles, or even audio files which have cycles, but aren’t related to the
calendar in the weekly, monthly, or yearly sense.

About the dataset

We have a dataset that records liquor consumption over time:
Period: time period
Value: amount of liquor consumed

For the dataset, please click here.
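A minimal loading sketch, assuming the downloaded file is a CSV with the two columns above; the file name liquor_sales.csv is only a placeholder:

import pandas as pd

# placeholder file name; substitute the actual path of the downloaded dataset
df = pd.read_csv("liquor_sales.csv", parse_dates=["Period"])

print(df.head())
df.info()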

Period Value
0 1992-01-01 1509
1 1992-02-01 1541
2 1992-03-01 1597
3 1992-04-01 1675
4 1992-05-01 1822
... ... ...
288 2016-01-01 3559
289 2016-02-01 3718
290 2016-03-01 3986
291 2016-04-01 4043
292 2016-05-01 4311

293 rows × 2 columns

<class 'pandas.core.frame.DataFrame'>
RangeIndex: 293 entries, 0 to 292
Data columns (total 2 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 Period 293 non-null datetime64[ns]
1 Value 293 non-null int64
dtypes: datetime64[ns](1), int64(1)
memory usage: 4.7 KB

Value
count 293.000000
mean 2790.494881
std 861.360248
min 1501.000000
25% 2059.000000
50% 2638.000000
75% 3438.000000
max 5834.000000

Period 0
Value 0
dtype: int64

There are no null values, so there is no need to handle missing data.

We have 293 rows in our data.

Let's use 75% of the data for training and see how much data ends up in the training set.

219

Length of test dataset 74
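A sketch of the 75/25 split that produces the counts above, assuming the df frame from the loading sketch:

# take the first 75% of the rows for training and the rest for testing,
# preserving time order (no shuffling for time series)
train_size = int(len(df) * 0.75)          # 219 of 293 rows
train, test = df.iloc[:train_size], df.iloc[train_size:]

print(len(train))
print("Length of test dataset", len(test))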


Period Value
219 2010-04-01 3310
220 2010-05-01 3466
221 2010-06-01 3438
222 2010-07-01 3657
223 2010-08-01 3455

Period Value
0 1992-01-01 1509.0
1 1992-02-01 1541.0
2 1992-03-01 1597.0
3 1992-04-01 1675.0
4 1992-05-01 1822.0

[Line plots of Value against Period]
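The plots summarised above can be reproduced with pandas' matplotlib wrapper; a sketch, assuming the train and test frames from the split above:

import matplotlib.pyplot as plt

# plot the training and test portions of the series on one set of axes
ax = train.plot(x="Period", y="Value", label="train")
test.plot(x="Period", y="Value", label="test", ax=ax)
plt.ylabel("Value")
plt.show()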

Seasonality
Seasonality is a characteristic of a time series in which the data experiences regular and predictable changes that recur every calendar year. Any predictable fluctuation or pattern that recurs or repeats over a one-year
period is said to be seasonal.
Seasonality in time-series data refers to a pattern that occurs at a regular interval. This is different from regular cyclic trends, such as the rise and fall of liquor prices, that re-occur regularly but don’t have a fixed period.
There’s a lot of insight to be gained from understanding seasonality patterns in data and you can even use it as a baseline to compare your time-series machine learning models.

Checking the stationarity

This means checking whether the statistical properties of the time series (such as its mean and variance) remain constant over time.


A linear trend is a straight line.
A linear seasonality has the same frequency (width of cycles) and amplitude (height of cycles).
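One way to look at the trend and seasonal components separately is a seasonal decomposition; a minimal sketch, assuming the training data from above (the additive model and the 12-month period are assumptions for this monthly series):

from statsmodels.tsa.seasonal import seasonal_decompose

# index the training data by date so the decomposition follows the time ordering
ts = train.set_index("Period")["Value"]

# additive decomposition into trend, seasonal and residual components
decomposition = seasonal_decompose(ts, model="additive", period=12)
decomposition.plot()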


It looks like there is an upward trend in the data with recurring seasonality. Below is a function to test the stationarity of the data using the Dickey-Fuller test and to plot the rolling statistics. In the Dickey-Fuller test we check the p-value: if it is less than 5%, the series is considered stationary.

Check stationarity

During the TSA model-preparation workflow, we must assess whether the given dataset is stationary or not, using statistical tests and plots.

a. Augmented Dickey-Fuller (ADF) Test

Augmented Dickey-Fuller (ADF) Test or Unit Root Test: the ADF test is the most popular statistical test for this, with the following hypotheses and decision rule:
Null Hypothesis (H0): the series is non-stationary.
Alternate Hypothesis (HA): the series is stationary.
p-value > 0.05: fail to reject H0 (the series is non-stationary).
p-value <= 0.05: reject H0 (the series is stationary).
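The stationarity-check function mentioned earlier is not reproduced on this page; a minimal sketch of such a function, using statsmodels' adfuller together with rolling statistics (the 12-month window is an assumption):

from statsmodels.tsa.stattools import adfuller
import pandas as pd
import matplotlib.pyplot as plt

def test_stationarity(series, window=12):
    # rolling mean and standard deviation for a visual check
    roll_mean = series.rolling(window).mean()
    roll_std = series.rolling(window).std()

    plt.plot(series, label="original")
    plt.plot(roll_mean, label="rolling mean")
    plt.plot(roll_std, label="rolling std")
    plt.legend()
    plt.show()

    # Augmented Dickey-Fuller test: p-value <= 0.05 suggests stationarity
    result = adfuller(series.dropna(), autolag="AIC")
    print("Results of Dickey-Fuller Test:")
    output = pd.Series(result[0:4],
                       index=["Test Statistic", "p-value",
                              "#Lags Used", "Number of Observations Used"])
    for key, value in result[4].items():
        output[f"Critical Value ({key})"] = value
    print(output)

Calling test_stationarity(ts) on the training series would produce output in the format shown below.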

Results of Dickey-Fuller Test:


p-value = 0.9945. The series is likely non-stationary.
Test Statistic 1.019454
p-value 0.994467
#Lags Used 20.000000
Number of Observations Used 198.000000
Critical Value (1%) -3.463815
Critical Value (5%) -2.876251
Critical Value (10%) -2.574611
dtype: float64

The Augmented Dickey-Fuller (ADF) test is a type of statistical test called a unit root test. Unit roots are a cause of non-stationarity.

Null Hypothesis (H0): Time series has a unit root. (Time series is not stationary).

Alternate Hypothesis (H1): Time series has no unit root (Time series is stationary).

If the null hypothesis can be rejected, we can conclude that the time series is stationary.

There are two ways to reject the null hypothesis:

On the one hand, the null hypothesis can be rejected if the p-value is below a set significance level. The default significance level is 5%.

p-value > significance level (default: 0.05): fail to reject the null hypothesis (H0); the data has a unit root and is non-stationary.
p-value <= significance level (default: 0.05): reject the null hypothesis (H0); the data does not have a unit root and is stationary.

On the other hand, the null hypothesis can be rejected if the test statistic is less than the critical value.

Results of Dickey-Fuller Test:


p-value = 0.0020. The series is likely stationary.
Test Statistic -3.906522
p-value 0.001987
#Lags Used 20.000000
Number of Observations Used 197.000000
Critical Value (1%) -3.463987
Critical Value (5%) -2.876326
Critical Value (10%) -2.574652
dtype: float64
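The second set of results above was obtained on a transformed version of the series; the exact transformation is not shown on this page, but first-order differencing is the usual choice and is what d = 1 refers to later. A sketch, reusing the ts series and test_stationarity function from the sketches above:

# first-order differencing: subtract the previous observation from each value
ts_diff = ts.diff().dropna()

# re-run the stationarity check on the differenced series
test_stationarity(ts_diff)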

ACF and PACF
A partial autocorrelation is a summary of the relationship between an observation in a time series with observations at prior time steps with the relationships of intervening observations removed.

The partial autocorrelation at lag k is the correlation that results after removing the effect of any correlations due to the terms at shorter lags.

The autocorrelation function (ACF) is a statistical technique that we can use to identify how correlated the values in a time series are with each other. The ACF plots the correlation coefficient against the lag, which is measured
in terms of a number of periods or units. A lag corresponds to a certain point in time after which we observe the first value in the time series.

The correlation coefficient can range from -1 (a perfect negative relationship) to +1 (a perfect positive relationship). A coefficient of 0 means that there is no relationship between the variables. Also, most often, it is measured
either by Pearson’s correlation coefficient or by Spearman’s rank correlation coefficient.

The shaded blue region on an ACF plot is the error band: anything within it is not statistically significant, while correlation values outside it are very likely real correlations and not statistical flukes. The confidence interval is set to 95% by default.

Notice that for a lag zero, ACF is always equal to one, which makes sense because the signal is always perfectly correlated with itself.

To summarize, autocorrelation is the correlation between a time series (signal) and a delayed version of itself, while the ACF plots the correlation coefficient against the lag, and it’s a visual representation of autocorrelation.

For an AR(p) time series, the ACF is strong up to a lag of around p and the inertia of that relationship carries over to subsequent lags, trailing off as the effect weakens. The PACF, on the other hand, describes the direct relationship between an observation and its lag, which generally leads to little or no correlation for lag values beyond p.

For an MA(q) process, the ACF shows a strong correlation with recent values up to a lag of q, then an immediate decline to minimal or no correlation. The PACF, in contrast, shows a strong relationship to the lag and then tails off to no correlation from that lag onwards. Above are the ACF and PACF plots for our stationary data.

To summarize, a partial autocorrelation function captures a “direct” correlation between time series and a lagged version of itself.

Partial autocorrelation is a statistical measure that captures the correlation between two variables after controlling for the effects of other variables.
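The ACF and PACF plots referred to above can be produced with statsmodels; a sketch on the differenced (stationary) series, with the 40-lag horizon being an arbitrary choice:

from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

# autocorrelation and partial autocorrelation of the stationary series
plot_acf(ts_diff, lags=40)
plot_pacf(ts_diff, lags=40)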

ARIMA
ARIMA stands for AutoRegressive Integrated Moving Average. It is a generalization of the simpler AutoRegressive Moving Average and adds the notion of integration.

This acronym is descriptive, capturing the key aspects of the model itself. Briefly, they are:

AR: Autoregression. A model that uses the dependent relationship between an observation and some number of lagged observations.
I: Integrated. The use of differencing of raw observations (e.g. subtracting an observation from an observation at the previous time step) in order to make the time series stationary.
MA: Moving Average. A model that uses the dependency between an observation and a residual error from a moving average model applied to lagged observations.

The parameters of the ARIMA model are defined as follows:

p: The number of lag observations included in the model, also called the lag order.
d: The number of times that the raw observations are differenced, also called the degree of differencing.
q: The size of the moving average window, also called the order of moving average.

A linear regression model is constructed including the specified number and type of terms, and the data is prepared by a degree of differencing in order to make it stationary, i.e. to remove trend and seasonal structures that negatively affect the regression model.

A value of 0 can be used for a parameter, which indicates to not use that element of the model. This way, the ARIMA model can be configured to perform the function of an ARMA model, and even a simple AR, I, or MA model.

Adopting an ARIMA model for a time series assumes that the underlying process that generated the observations is an ARIMA process. This may seem obvious, but helps to motivate the need to confirm the assumptions of the
model in the raw observations and in the residual errors of forecasts from the model.

Order of differencing
p is the order of the AR term, q is the order of the MA term, and d is the number of differencing operations required to make the time series stationary.

More formally, ARIMA(1, 1, 1) means an ARIMA model of order (1, 1, 1), where the AR order is 1, the integration (differencing) order is 1, and the moving average order is 1.

How to determine p, d, q: in our case, first-order differencing makes the series stationary, so the integration order is d = 1.

An AR model might be investigated first, with the lag length selected from the PACF or via empirical investigation. In our case, the PACF is clearly significant within the first lag, which means we could use an AR order of up to 2.

To avoid incorrectly specifying the MA order (in the case where an MA term is tried first and its order ends up being set to 0), it may often make sense to extend the lag observed beyond the last significant term in the PACF.
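The summary below appears to come from the older statsmodels arima_model.ARIMA class (note the css-mle method), which has since been removed; a roughly equivalent sketch with the current statsmodels API, fitted on the training series:

from statsmodels.tsa.arima.model import ARIMA

# ARIMA(0, 1, 1): no AR terms, one order of differencing, one MA term
model = ARIMA(ts, order=(0, 1, 1))
fitted = model.fit()
print(fitted.summary())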

ARIMA Model Results


==============================================================================
Dep. Variable: D.Value No. Observations: 218
Model: ARIMA(0, 1, 1) Log Likelihood -1587.963
Method: css-mle S.D. of innovations 348.255
Date: Thu, 06 Jan 2022 AIC 3181.925
Time: 16:26:02 BIC 3192.079
Sample: 1 HQIC 3186.026

=================================================================================
coef std err z P>|z| [0.025 0.975]
---------------------------------------------------------------------------------
const 8.2135 0.372 22.065 0.000 7.484 8.943
ma.L1.D.Value -1.0000 nan nan nan nan nan
Roots
=============================================================================
Real Imaginary Modulus Frequency
-----------------------------------------------------------------------------
MA.1 1.0000 +0.0000j 1.0000 0.0000
-----------------------------------------------------------------------------
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:547: HessianInversionWarning: Inverting hessian failed, no bse or cov_params available
warnings.warn('Inverting hessian failed, no bse or cov_params '

Dep. Variable - What we’re trying to predict.


Model - The type of model we’re using. AR, MA, ARIMA.
Date - The date we ran the model
Time - The time the model finished
Sample - The range of the data
No. Observations - The number of observations

Akaike’s Information Criterion

Akaike’s Information Criterion (AIC) helps determine the strength of the linear regression model. The AIC penalizes a model for adding parameters since adding more parameters will always increase the maximum
likelihood value.

Bayesian Information Criterion

Bayesian Information Criterion (BIC), like the AIC, also punishes a model for complexity, but it also incorporates the number of rows in the data.

Hannan-Quinn Information Criterion

Hannan-Quinn Information Criterion (HQIC), like AIC and BIC is another criterion for model selection; however, it’s not used as often in practice.

Residual plot

A residual value is a measure of how much a regression line vertically misses a data point. Regression lines are the best fit of a set of data. You can think of the lines as averages; a few data points will fit the line and
others will miss. A residual plot has the Residual Values on the vertical axis; the horizontal axis displays the independent variable.
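A sketch of how the residual plot and the residual summary shown below can be obtained from the fitted model:

# residuals of the fitted ARIMA model
residuals = fitted.resid

residuals.plot()                 # residuals over time
print(residuals.describe())      # summary statistics of the residuals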

[Plot of the ARIMA model residuals]

count 218.000000
mean -26.145275
std 350.263249
min -586.716837
25% -216.253124
50% -76.855526
75% 55.899427
max 1320.599770

array([3340.44770213, 3348.66122709, 3356.87475205, 3365.088277 ,
3373.30180196, 3381.51532692, 3389.72885187, 3397.94237683,
3406.15590179, 3414.36942674, 3422.5829517 , 3430.79647666,
3439.01000161, 3447.22352657, 3455.43705153, 3463.65057648,
3471.86410144, 3480.0776264 , 3488.29115135, 3496.50467631,
3504.71820127, 3512.93172622, 3521.14525118, 3529.35877614,
3537.57230109, 3545.78582605, 3553.99935101, 3562.21287596,
3570.42640092, 3578.63992588, 3586.85345083, 3595.06697579,
3603.28050075, 3611.4940257 , 3619.70755066, 3627.92107562,
3636.13460057, 3644.34812553, 3652.56165049, 3660.77517544,
3668.9887004 , 3677.20222536, 3685.41575031, 3693.62927527,
3701.84280023, 3710.05632518, 3718.26985014, 3726.4833751 ,
3734.69690005, 3742.91042501, 3751.12394997, 3759.33747492,
3767.55099988, 3775.76452484, 3783.97804979, 3792.19157475,
3800.40509971, 3808.61862466, 3816.83214962, 3825.04567458,
3833.25919953, 3841.47272449, 3849.68624945, 3857.8997744 ,
3866.11329936, 3874.32682432, 3882.54034927, 3890.75387423,
3898.96739919, 3907.18092414, 3915.3944491 , 3923.60797406,
3931.82149901, 3940.03502397])

From the above plot we can see that the predictions on the test data are reasonably good.

74

74

Mean Squared Error


The mean squared error (MSE) of an estimator (of a procedure for estimating an unobserved quantity) measures the average of the squares of the errors, that is, the average squared difference between the estimated values and the actual values.
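A sketch of the validation RMSE computation, assuming the fitted ARIMA model and the test split from the sketches above:

import numpy as np
from sklearn.metrics import mean_squared_error

# forecast one value for every observation in the test period
forecast = fitted.forecast(steps=len(test))

rmse = np.sqrt(mean_squared_error(test["Value"], forecast))
print("Validation RMS", rmse)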

Validation RMS 527.9737439495028

MAPE Function
The mean absolute percentage error (MAPE) is a statistical measure of the accuracy of a forecast on a particular dataset.
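A minimal sketch of such a MAPE function, applied to the same test-period forecast:

import numpy as np

def mape(actual, predicted):
    # mean absolute percentage error, expressed in percent
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return np.mean(np.abs((actual - predicted) / actual)) * 100

print("The MAPE for Validation is", mape(test["Value"], forecast))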

The MAPE for Validation is 10.554307272570833

[3340.44770213 3348.66122709 3356.87475205 3365.088277 3373.30180196
3381.51532692 3389.72885187 3397.94237683 3406.15590179 3414.36942674
3422.5829517 3430.79647666 3439.01000161 3447.22352657 3455.43705153
3463.65057648 3471.86410144 3480.0776264 3488.29115135 3496.50467631
3504.71820127 3512.93172622 3521.14525118 3529.35877614 3537.57230109
3545.78582605 3553.99935101 3562.21287596 3570.42640092 3578.63992588
3586.85345083 3595.06697579 3603.28050075 3611.4940257 3619.70755066
3627.92107562 3636.13460057 3644.34812553 3652.56165049 3660.77517544
3668.9887004 3677.20222536 3685.41575031 3693.62927527 3701.84280023
3710.05632518 3718.26985014 3726.4833751 3734.69690005 3742.91042501
3751.12394997 3759.33747492 3767.55099988 3775.76452484 3783.97804979
3792.19157475 3800.40509971 3808.61862466 3816.83214962 3825.04567458
3833.25919953 3841.47272449 3849.68624945 3857.8997744 3866.11329936
3874.32682432 3882.54034927 3890.75387423 3898.96739919 3907.18092414
3915.3944491 3923.60797406 3931.82149901 3940.03502397 3948.24854893
3956.46207388 3964.67559884 3972.8891238 3981.10264875 3989.31617371
3997.52969867 4005.74322362 4013.95674858 4022.17027354 4030.3837985
4038.59732345 4046.81084841 4055.02437337 4063.23789832 4071.45142328
4079.66494824 4087.87847319 4096.09199815 4104.30552311 4112.51904806
4120.73257302 4128.94609798 4137.15962293 4145.37314789 4153.58667285]

Predicting sales

From the above graph we can see that the predictions are fairly accurate.

To get the p, d, q and seasonal SARIMA parameters, we iterate over the candidate inputs and compare the AIC of each fit.

Examples of parameter combinations for Seasonal ARIMA...


SARIMAX: (0, 0, 1) x (0, 0, 1, 12)
SARIMAX: (0, 0, 1) x (0, 0, 2, 12)
SARIMAX: (0, 0, 2) x (0, 1, 0, 12)
SARIMAX: (0, 0, 2) x (0, 1, 1, 12)
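A sketch of the grid-search loop that produces the log below, trying every combination of (p, d, q) and seasonal (P, D, Q, 12) orders and recording the AIC of each SARIMAX fit; the 0-2 parameter range is an assumption based on the combinations that appear in the log:

import itertools
import statsmodels.api as sm

# candidate orders for the trend and seasonal parts
p = d = q = range(0, 3)
pdq = list(itertools.product(p, d, q))
seasonal_pdq = [(P, D, Q, 12) for (P, D, Q) in pdq]

best_aic, best_params = float("inf"), None
for i, (order, seasonal_order) in enumerate(itertools.product(pdq, seasonal_pdq), start=1):
    try:
        model = sm.tsa.statespace.SARIMAX(ts, order=order,
                                          seasonal_order=seasonal_order,
                                          enforce_stationarity=False,
                                          enforce_invertibility=False)
        fit = model.fit(disp=False)
        print(f"Current Iter - {i}, ARIMA{order}x{seasonal_order} 12 - AIC:{fit.aic}")
        if fit.aic < best_aic:
            best_aic, best_params = fit.aic, (order, seasonal_order)
    except Exception:
        continue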

Current Iter - 1, ARIMA(0, 0, 0)x(0, 0, 0, 12) 12 - AIC:4035.2332296832456


Current Iter - 2, ARIMA(0, 0, 0)x(0, 0, 1, 12) 12 - AIC:3569.1068568458013
Current Iter - 3, ARIMA(0, 0, 0)x(0, 0, 2, 12) 12 - AIC:3168.5876767683526
Current Iter - 4, ARIMA(0, 0, 0)x(0, 1, 0, 12) 12 - AIC:2593.5475761224934
Current Iter - 5, ARIMA(0, 0, 0)x(0, 1, 1, 12) 12 - AIC:2414.1688957943097
Current Iter - 6, ARIMA(0, 0, 0)x(0, 1, 2, 12) 12 - AIC:2247.757791382374
Current Iter - 7, ARIMA(0, 0, 0)x(0, 2, 0, 12) 12 - AIC:2432.075426472753
Current Iter - 8, ARIMA(0, 0, 0)x(0, 2, 1, 12) 12 - AIC:2203.091555621073
Current Iter - 9, ARIMA(0, 0, 0)x(0, 2, 2, 12) 12 - AIC:2067.143092236951
Current Iter - 10, ARIMA(0, 0, 0)x(1, 0, 0, 12) 12 - AIC:2455.187386930361
Current Iter - 11, ARIMA(0, 0, 0)x(1, 0, 1, 12) 12 - AIC:2444.857227428677
Current Iter - 12, ARIMA(0, 0, 0)x(1, 0, 2, 12) 12 - AIC:2261.1312960654204
Current Iter - 13, ARIMA(0, 0, 0)x(1, 1, 0, 12) 12 - AIC:2398.432067236328
Current Iter - 14, ARIMA(0, 0, 0)x(1, 1, 1, 12) 12 - AIC:2343.637482163157
Current Iter - 15, ARIMA(0, 0, 0)x(1, 1, 2, 12) 12 - AIC:2201.9745563723354
Current Iter - 16, ARIMA(0, 0, 0)x(1, 2, 0, 12) 12 - AIC:2254.480249445054
Current Iter - 17, ARIMA(0, 0, 0)x(1, 2, 1, 12) 12 - AIC:2205.0708700207615
Current Iter - 18, ARIMA(0, 0, 0)x(1, 2, 2, 12) 12 - AIC:2056.8832073750364
Current Iter - 19, ARIMA(0, 0, 0)x(2, 0, 0, 12) 12 - AIC:2308.222071885669
Current Iter - 20, ARIMA(0, 0, 0)x(2, 0, 1, 12) 12 - AIC:2283.3026711010516
Current Iter - 21, ARIMA(0, 0, 0)x(2, 0, 2, 12) 12 - AIC:2261.8415331087617
Current Iter - 22, ARIMA(0, 0, 0)x(2, 1, 0, 12) 12 - AIC:2238.9734730953714
Current Iter - 23, ARIMA(0, 0, 0)x(2, 1, 1, 12) 12 - AIC:2207.360595328489
Current Iter - 24, ARIMA(0, 0, 0)x(2, 1, 2, 12) 12 - AIC:2195.877244927356
Current Iter - 25, ARIMA(0, 0, 0)x(2, 2, 0, 12) 12 - AIC:2110.4354193084437
Current Iter - 26, ARIMA(0, 0, 0)x(2, 2, 1, 12) 12 - AIC:2067.2568472859753
Current Iter - 27, ARIMA(0, 0, 0)x(2, 2, 2, 12) 12 - AIC:2056.7753057920654
Current Iter - 28, ARIMA(0, 0, 1)x(0, 0, 0, 12) 12 - AIC:3787.2704892981974
Current Iter - 29, ARIMA(0, 0, 1)x(0, 0, 1, 12) 12 - AIC:3346.635633141098
Current Iter - 30, ARIMA(0, 0, 1)x(0, 0, 2, 12) 12 - AIC:2976.8656492945147
Current Iter - 31, ARIMA(0, 0, 1)x(0, 1, 0, 12) 12 - AIC:2511.2851323266223
Current Iter - 32, ARIMA(0, 0, 1)x(0, 1, 1, 12) 12 - AIC:2358.596024031044
Current Iter - 33, ARIMA(0, 0, 1)x(0, 1, 2, 12) 12 - AIC:2207.017626119875
Current Iter - 34, ARIMA(0, 0, 1)x(0, 2, 0, 12) 12 - AIC:2408.1142570982934
Current Iter - 35, ARIMA(0, 0, 1)x(0, 2, 1, 12) 12 - AIC:2176.6216970886717
Current Iter - 36, ARIMA(0, 0, 1)x(0, 2, 2, 12) 12 - AIC:2039.8388784019385
Current Iter - 37, ARIMA(0, 0, 1)x(1, 0, 0, 12) 12 - AIC:2439.0741989779717
Current Iter - 38, ARIMA(0, 0, 1)x(1, 0, 1, 12) 12 - AIC:2411.408041311812
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 39, ARIMA(0, 0, 1)x(1, 0, 2, 12) 12 - AIC:2267.9989205224456
Current Iter - 40, ARIMA(0, 0, 1)x(1, 1, 0, 12) 12 - AIC:2371.6516725430997
Current Iter - 41, ARIMA(0, 0, 1)x(1, 1, 1, 12) 12 - AIC:2317.3786649278572
Current Iter - 42, ARIMA(0, 0, 1)x(1, 1, 2, 12) 12 - AIC:2178.2574364952043
Current Iter - 43, ARIMA(0, 0, 1)x(1, 2, 0, 12) 12 - AIC:2247.573704995828
Current Iter - 44, ARIMA(0, 0, 1)x(1, 2, 1, 12) 12 - AIC:2178.6031314732004
Current Iter - 45, ARIMA(0, 0, 1)x(1, 2, 2, 12) 12 - AIC:2032.6236310916713
Current Iter - 46, ARIMA(0, 0, 1)x(2, 0, 0, 12) 12 - AIC:2297.1318559291685
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 47, ARIMA(0, 0, 1)x(2, 0, 1, 12) 12 - AIC:2306.2297538699354
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 48, ARIMA(0, 0, 1)x(2, 0, 2, 12) 12 - AIC:2529.009038979756
Current Iter - 49, ARIMA(0, 0, 1)x(2, 1, 0, 12) 12 - AIC:2225.409076476655
Current Iter - 51, ARIMA(0, 0, 1)x(2, 1, 2, 12) 12 - AIC:2174.376189763166
Current Iter - 52, ARIMA(0, 0, 1)x(2, 2, 0, 12) 12 - AIC:2104.7720539133124
Current Iter - 53, ARIMA(0, 0, 1)x(2, 2, 1, 12) 12 - AIC:2054.7601500901305
Current Iter - 54, ARIMA(0, 0, 1)x(2, 2, 2, 12) 12 - AIC:2017.8698290558636
Current Iter - 55, ARIMA(0, 0, 2)x(0, 0, 0, 12) 12 - AIC:3594.5851997914588
Current Iter - 56, ARIMA(0, 0, 2)x(0, 0, 1, 12) 12 - AIC:3203.095599083
Current Iter - 57, ARIMA(0, 0, 2)x(0, 0, 2, 12) 12 - AIC:2879.6781522281035
Current Iter - 58, ARIMA(0, 0, 2)x(0, 1, 0, 12) 12 - AIC:2463.459756511029
Current Iter - 59, ARIMA(0, 0, 2)x(0, 1, 1, 12) 12 - AIC:2315.752872214537
Current Iter - 60, ARIMA(0, 0, 2)x(0, 1, 2, 12) 12 - AIC:2177.4468236019566
Current Iter - 61, ARIMA(0, 0, 2)x(0, 2, 0, 12) 12 - AIC:2381.344474284118
Current Iter - 62, ARIMA(0, 0, 2)x(0, 2, 1, 12) 12 - AIC:2143.5627046663335
Current Iter - 63, ARIMA(0, 0, 2)x(0, 2, 2, 12) 12 - AIC:2007.4694305909616
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 64, ARIMA(0, 0, 2)x(1, 0, 0, 12) 12 - AIC:2439.8029058330253
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 65, ARIMA(0, 0, 2)x(1, 0, 1, 12) 12 - AIC:2381.1983537436017
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 66, ARIMA(0, 0, 2)x(1, 0, 2, 12) 12 - AIC:2237.4904985682397
Current Iter - 67, ARIMA(0, 0, 2)x(1, 1, 0, 12) 12 - AIC:2346.3315877080436
Current Iter - 68, ARIMA(0, 0, 2)x(1, 1, 1, 12) 12 - AIC:2285.0048093328983
Current Iter - 69, ARIMA(0, 0, 2)x(1, 1, 2, 12) 12 - AIC:2147.071418882394
Current Iter - 70, ARIMA(0, 0, 2)x(1, 2, 0, 12) 12 - AIC:2234.759734182349
Current Iter - 71, ARIMA(0, 0, 2)x(1, 2, 1, 12) 12 - AIC:2145.152916903582
Current Iter - 72, ARIMA(0, 0, 2)x(1, 2, 2, 12) 12 - AIC:2002.342695727737
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 73, ARIMA(0, 0, 2)x(2, 0, 0, 12) 12 - AIC:2335.5991117899393
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 74, ARIMA(0, 0, 2)x(2, 0, 1, 12) 12 - AIC:2573.6278634725286
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 75, ARIMA(0, 0, 2)x(2, 0, 2, 12) 12 - AIC:2242.1924468752622
Current Iter - 76, ARIMA(0, 0, 2)x(2, 1, 0, 12) 12 - AIC:2207.0559530533333
Current Iter - 77, ARIMA(0, 0, 2)x(2, 1, 1, 12) 12 - AIC:2177.327263873781
Current Iter - 78, ARIMA(0, 0, 2)x(2, 1, 2, 12) 12 - AIC:2145.2814624960038
Current Iter - 79, ARIMA(0, 0, 2)x(2, 2, 0, 12) 12 - AIC:2081.5827951080137
Current Iter - 80, ARIMA(0, 0, 2)x(2, 2, 1, 12) 12 - AIC:2033.857963252387
Current Iter - 81, ARIMA(0, 0, 2)x(2, 2, 2, 12) 12 - AIC:1987.6257187196175
Current Iter - 82, ARIMA(0, 1, 0)x(0, 0, 0, 12) 12 - AIC:3305.431839567088
Current Iter - 83, ARIMA(0, 1, 0)x(0, 0, 1, 12) 12 - AIC:2923.7098014713165
Current Iter - 84, ARIMA(0, 1, 0)x(0, 0, 2, 12) 12 - AIC:2636.351675823265
Current Iter - 85, ARIMA(0, 1, 0)x(0, 1, 0, 12) 12 - AIC:2479.2674237932283
Current Iter - 86, ARIMA(0, 1, 0)x(0, 1, 1, 12) 12 - AIC:2343.3747631791075
Current Iter - 87, ARIMA(0, 1, 0)x(0, 1, 2, 12) 12 - AIC:2191.136819793596
Current Iter - 88, ARIMA(0, 1, 0)x(0, 2, 0, 12) 12 - AIC:2475.2244745633707
Current Iter - 89, ARIMA(0, 1, 0)x(0, 2, 1, 12) 12 - AIC:2229.9779134631576
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 90, ARIMA(0, 1, 0)x(0, 2, 2, 12) 12 - AIC:2088.37264887793
Current Iter - 91, ARIMA(0, 1, 0)x(1, 0, 0, 12) 12 - AIC:2492.4101691790065
Current Iter - 92, ARIMA(0, 1, 0)x(1, 0, 1, 12) 12 - AIC:2457.130103911819
Current Iter - 93, ARIMA(0, 1, 0)x(1, 0, 2, 12) 12 - AIC:2287.148554127929
Current Iter - 94, ARIMA(0, 1, 0)x(1, 1, 0, 12) 12 - AIC:2354.864669937763
Current Iter - 95, ARIMA(0, 1, 0)x(1, 1, 1, 12) 12 - AIC:2337.518903302168
Current Iter - 96, ARIMA(0, 1, 0)x(1, 1, 2, 12) 12 - AIC:2192.6279366664453
Current Iter - 97, ARIMA(0, 1, 0)x(1, 2, 0, 12) 12 - AIC:2311.762185869199
Current Iter - 98, ARIMA(0, 1, 0)x(1, 2, 1, 12) 12 - AIC:2231.8486823449093
Current Iter - 99, ARIMA(0, 1, 0)x(1, 2, 2, 12) 12 - AIC:2090.1989906160784
Current Iter - 101, ARIMA(0, 1, 0)x(2, 0, 1, 12) 12 - AIC:2318.1300957667863
Current Iter - 102, ARIMA(0, 1, 0)x(2, 0, 2, 12) 12 - AIC:2289.059270124527
Current Iter - 103, ARIMA(0, 1, 0)x(2, 1, 0, 12) 12 - AIC:2208.462965225043
Current Iter - 104, ARIMA(0, 1, 0)x(2, 1, 1, 12) 12 - AIC:2193.745994657176
Current Iter - 105, ARIMA(0, 1, 0)x(2, 1, 2, 12) 12 - AIC:2134.5167679696783
Current Iter - 106, ARIMA(0, 1, 0)x(2, 2, 0, 12) 12 - AIC:2162.3706157512156
Current Iter - 107, ARIMA(0, 1, 0)x(2, 2, 1, 12) 12 - AIC:2093.528168308972
Current Iter - 108, ARIMA(0, 1, 0)x(2, 2, 2, 12) 12 - AIC:2037.4793960090074
Current Iter - 109, ARIMA(0, 1, 1)x(0, 0, 0, 12) 12 - AIC:3165.6144740425643
Current Iter - 110, ARIMA(0, 1, 1)x(0, 0, 1, 12) 12 - AIC:2799.816865930882
Current Iter - 111, ARIMA(0, 1, 1)x(0, 0, 2, 12) 12 - AIC:2523.0694901166835
Current Iter - 112, ARIMA(0, 1, 1)x(0, 1, 0, 12) 12 - AIC:2363.379706257214
Current Iter - 113, ARIMA(0, 1, 1)x(0, 1, 1, 12) 12 - AIC:2221.934062436474
Current Iter - 114, ARIMA(0, 1, 1)x(0, 1, 2, 12) 12 - AIC:2083.432527103766
Current Iter - 115, ARIMA(0, 1, 1)x(0, 2, 0, 12) 12 - AIC:2383.265132881969
Current Iter - 116, ARIMA(0, 1, 1)x(0, 2, 1, 12) 12 - AIC:2121.4385550044212
Current Iter - 117, ARIMA(0, 1, 1)x(0, 2, 2, 12) 12 - AIC:1971.6033173176234
Current Iter - 118, ARIMA(0, 1, 1)x(1, 0, 0, 12) 12 - AIC:2388.3476606992876
Current Iter - 119, ARIMA(0, 1, 1)x(1, 0, 1, 12) 12 - AIC:2333.931301919083
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 120, ARIMA(0, 1, 1)x(1, 0, 2, 12) 12 - AIC:2226.856315893648
Current Iter - 121, ARIMA(0, 1, 1)x(1, 1, 0, 12) 12 - AIC:2249.004949598494
Current Iter - 122, ARIMA(0, 1, 1)x(1, 1, 1, 12) 12 - AIC:2221.2732021873917
Current Iter - 123, ARIMA(0, 1, 1)x(1, 1, 2, 12) 12 - AIC:2085.432473106221
Current Iter - 124, ARIMA(0, 1, 1)x(1, 2, 0, 12) 12 - AIC:2215.092095909772
Current Iter - 125, ARIMA(0, 1, 1)x(1, 2, 1, 12) 12 - AIC:2119.8696193362453
Current Iter - 126, ARIMA(0, 1, 1)x(1, 2, 2, 12) 12 - AIC:1969.5769578808763
Current Iter - 127, ARIMA(0, 1, 1)x(2, 0, 0, 12) 12 - AIC:2246.9441428425953
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 128, ARIMA(0, 1, 1)x(2, 0, 1, 12) 12 - AIC:2218.392220702721
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 129, ARIMA(0, 1, 1)x(2, 0, 2, 12) 12 - AIC:2179.2329821280446
Current Iter - 130, ARIMA(0, 1, 1)x(2, 1, 0, 12) 12 - AIC:2112.3739411591814
Current Iter - 131, ARIMA(0, 1, 1)x(2, 1, 1, 12) 12 - AIC:2107.3699394153086
Current Iter - 132, ARIMA(0, 1, 1)x(2, 1, 2, 12) 12 - AIC:2063.7018593133007
Current Iter - 133, ARIMA(0, 1, 1)x(2, 2, 0, 12) 12 - AIC:2063.1700350603924
Current Iter - 134, ARIMA(0, 1, 1)x(2, 2, 1, 12) 12 - AIC:2003.6713947488936
Current Iter - 135, ARIMA(0, 1, 1)x(2, 2, 2, 12) 12 - AIC:1955.6671059163982
Current Iter - 136, ARIMA(0, 1, 2)x(0, 0, 0, 12) 12 - AIC:3153.4354771249614
Current Iter - 137, ARIMA(0, 1, 2)x(0, 0, 1, 12) 12 - AIC:2788.129042278566
Current Iter - 138, ARIMA(0, 1, 2)x(0, 0, 2, 12) 12 - AIC:2512.932947479674
Current Iter - 139, ARIMA(0, 1, 2)x(0, 1, 0, 12) 12 - AIC:2337.6392415590094
Current Iter - 140, ARIMA(0, 1, 2)x(0, 1, 1, 12) 12 - AIC:2195.073094459969
Current Iter - 141, ARIMA(0, 1, 2)x(0, 1, 2, 12) 12 - AIC:2062.4775609117687
Current Iter - 142, ARIMA(0, 1, 2)x(0, 2, 0, 12) 12 - AIC:2360.4558804908647
Current Iter - 143, ARIMA(0, 1, 2)x(0, 2, 1, 12) 12 - AIC:2093.8715412519928
Current Iter - 144, ARIMA(0, 1, 2)x(0, 2, 2, 12) 12 - AIC:1944.7629025940755
Current Iter - 145, ARIMA(0, 1, 2)x(1, 0, 0, 12) 12 - AIC:2371.763261514335
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 146, ARIMA(0, 1, 2)x(1, 0, 1, 12) 12 - AIC:2345.874081487991
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 147, ARIMA(0, 1, 2)x(1, 0, 2, 12) 12 - AIC:2159.3955764906586
Current Iter - 148, ARIMA(0, 1, 2)x(1, 1, 0, 12) 12 - AIC:2233.2875751322035
Current Iter - 149, ARIMA(0, 1, 2)x(1, 1, 1, 12) 12 - AIC:2196.2790639131026
Current Iter - 151, ARIMA(0, 1, 2)x(1, 2, 0, 12) 12 - AIC:2200.053518224978
Current Iter - 152, ARIMA(0, 1, 2)x(1, 2, 1, 12) 12 - AIC:2090.079319864082
Current Iter - 153, ARIMA(0, 1, 2)x(1, 2, 2, 12) 12 - AIC:1946.6728776933758
Current Iter - 154, ARIMA(0, 1, 2)x(2, 0, 0, 12) 12 - AIC:2228.641393925763
Current Iter - 155, ARIMA(0, 1, 2)x(2, 0, 1, 12) 12 - AIC:2206.653379393032
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 156, ARIMA(0, 1, 2)x(2, 0, 2, 12) 12 - AIC:2169.151479286578
Current Iter - 157, ARIMA(0, 1, 2)x(2, 1, 0, 12) 12 - AIC:2099.638050922922
Current Iter - 158, ARIMA(0, 1, 2)x(2, 1, 1, 12) 12 - AIC:2099.47757046855
Current Iter - 159, ARIMA(0, 1, 2)x(2, 1, 2, 12) 12 - AIC:2050.1861590573126
Current Iter - 160, ARIMA(0, 1, 2)x(2, 2, 0, 12) 12 - AIC:2041.8032962559096
Current Iter - 161, ARIMA(0, 1, 2)x(2, 2, 1, 12) 12 - AIC:1987.4132034543127
Current Iter - 162, ARIMA(0, 1, 2)x(2, 2, 2, 12) 12 - AIC:1936.4551474929535
Current Iter - 163, ARIMA(0, 2, 0)x(0, 0, 0, 12) 12 - AIC:3515.929652568969
Current Iter - 164, ARIMA(0, 2, 0)x(0, 0, 1, 12) 12 - AIC:3123.856964690955
Current Iter - 165, ARIMA(0, 2, 0)x(0, 0, 2, 12) 12 - AIC:2817.65844053013
Current Iter - 166, ARIMA(0, 2, 0)x(0, 1, 0, 12) 12 - AIC:2700.3731629001572
Current Iter - 167, ARIMA(0, 2, 0)x(0, 1, 1, 12) 12 - AIC:2551.831945383417
Current Iter - 168, ARIMA(0, 2, 0)x(0, 1, 2, 12) 12 - AIC:2386.2923380975208
Current Iter - 169, ARIMA(0, 2, 0)x(0, 2, 0, 12) 12 - AIC:2680.526454672513
Current Iter - 170, ARIMA(0, 2, 0)x(0, 2, 1, 12) 12 - AIC:2424.573611800454
Current Iter - 171, ARIMA(0, 2, 0)x(0, 2, 2, 12) 12 - AIC:2264.546678977159
Current Iter - 172, ARIMA(0, 2, 0)x(1, 0, 0, 12) 12 - AIC:2714.8126590762113
Current Iter - 173, ARIMA(0, 2, 0)x(1, 0, 1, 12) 12 - AIC:2685.2908569785927
Current Iter - 174, ARIMA(0, 2, 0)x(1, 0, 2, 12) 12 - AIC:2494.051526274946
Current Iter - 175, ARIMA(0, 2, 0)x(1, 1, 0, 12) 12 - AIC:2564.23173239845
Current Iter - 176, ARIMA(0, 2, 0)x(1, 1, 1, 12) 12 - AIC:2544.6051659068994
Current Iter - 177, ARIMA(0, 2, 0)x(1, 1, 2, 12) 12 - AIC:2387.391302997052
Current Iter - 178, ARIMA(0, 2, 0)x(1, 2, 0, 12) 12 - AIC:2505.6313631059957
Current Iter - 179, ARIMA(0, 2, 0)x(1, 2, 1, 12) 12 - AIC:2426.5257403215296
Current Iter - 180, ARIMA(0, 2, 0)x(1, 2, 2, 12) 12 - AIC:2253.412909912667
Current Iter - 181, ARIMA(0, 2, 0)x(2, 0, 0, 12) 12 - AIC:2564.6944227285603
Current Iter - 182, ARIMA(0, 2, 0)x(2, 0, 1, 12) 12 - AIC:2535.363165163397
Current Iter - 183, ARIMA(0, 2, 0)x(2, 0, 2, 12) 12 - AIC:2496.0514100946934
Current Iter - 184, ARIMA(0, 2, 0)x(2, 1, 0, 12) 12 - AIC:2405.5816972246284
Current Iter - 185, ARIMA(0, 2, 0)x(2, 1, 1, 12) 12 - AIC:2387.1821457920832
Current Iter - 186, ARIMA(0, 2, 0)x(2, 1, 2, 12) 12 - AIC:2316.562579432628
Current Iter - 187, ARIMA(0, 2, 0)x(2, 2, 0, 12) 12 - AIC:2346.099126181565
Current Iter - 188, ARIMA(0, 2, 0)x(2, 2, 1, 12) 12 - AIC:2274.6391826977474
Current Iter - 189, ARIMA(0, 2, 0)x(2, 2, 2, 12) 12 - AIC:2207.6051154685383
Current Iter - 190, ARIMA(0, 2, 1)x(0, 0, 0, 12) 12 - AIC:3283.2403738883067
Current Iter - 191, ARIMA(0, 2, 1)x(0, 0, 1, 12) 12 - AIC:2899.498114657188
Current Iter - 192, ARIMA(0, 2, 1)x(0, 0, 2, 12) 12 - AIC:2609.255776748101
Current Iter - 193, ARIMA(0, 2, 1)x(0, 1, 0, 12) 12 - AIC:2462.5567276954334
Current Iter - 194, ARIMA(0, 2, 1)x(0, 1, 1, 12) 12 - AIC:2325.472326314769
Current Iter - 195, ARIMA(0, 2, 1)x(0, 1, 2, 12) 12 - AIC:2173.9386028006975
Current Iter - 196, ARIMA(0, 2, 1)x(0, 2, 0, 12) 12 - AIC:2458.3848489354214
Current Iter - 197, ARIMA(0, 2, 1)x(0, 2, 1, 12) 12 - AIC:2214.442650386156
Current Iter - 198, ARIMA(0, 2, 1)x(0, 2, 2, 12) 12 - AIC:2068.9918532698957
Current Iter - 199, ARIMA(0, 2, 1)x(1, 0, 0, 12) 12 - AIC:2489.227788002615
Current Iter - 201, ARIMA(0, 2, 1)x(1, 0, 2, 12) 12 - AIC:2274.2358861214607
Current Iter - 202, ARIMA(0, 2, 1)x(1, 1, 0, 12) 12 - AIC:2348.757037629991
Current Iter - 203, ARIMA(0, 2, 1)x(1, 1, 1, 12) 12 - AIC:2327.4502599440275
Current Iter - 204, ARIMA(0, 2, 1)x(1, 1, 2, 12) 12 - AIC:2175.6355920000733
Current Iter - 205, ARIMA(0, 2, 1)x(1, 2, 0, 12) 12 - AIC:2306.5807956493645
Current Iter - 206, ARIMA(0, 2, 1)x(1, 2, 1, 12) 12 - AIC:2216.357962935538
Current Iter - 207, ARIMA(0, 2, 1)x(1, 2, 2, 12) 12 - AIC:2073.8470221009575
Current Iter - 208, ARIMA(0, 2, 1)x(2, 0, 0, 12) 12 - AIC:2350.111491571483
Current Iter - 209, ARIMA(0, 2, 1)x(2, 0, 1, 12) 12 - AIC:2318.0272073676897
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 210, ARIMA(0, 2, 1)x(2, 0, 2, 12) 12 - AIC:2341.1894955185653
Current Iter - 211, ARIMA(0, 2, 1)x(2, 1, 0, 12) 12 - AIC:2205.233192321346
Current Iter - 212, ARIMA(0, 2, 1)x(2, 1, 1, 12) 12 - AIC:2191.7312852926075
Current Iter - 213, ARIMA(0, 2, 1)x(2, 1, 2, 12) 12 - AIC:2117.8617007489465
Current Iter - 214, ARIMA(0, 2, 1)x(2, 2, 0, 12) 12 - AIC:2158.0822334012655
Current Iter - 215, ARIMA(0, 2, 1)x(2, 2, 1, 12) 12 - AIC:2091.872844297912
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 216, ARIMA(0, 2, 1)x(2, 2, 2, 12) 12 - AIC:2023.2186282227883
Current Iter - 217, ARIMA(0, 2, 2)x(0, 0, 0, 12) 12 - AIC:3129.502529693858
Current Iter - 218, ARIMA(0, 2, 2)x(0, 0, 1, 12) 12 - AIC:2765.9970062715383
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 219, ARIMA(0, 2, 2)x(0, 0, 2, 12) 12 - AIC:2495.759444046209
Current Iter - 220, ARIMA(0, 2, 2)x(0, 1, 0, 12) 12 - AIC:2371.2156787650056
Current Iter - 221, ARIMA(0, 2, 2)x(0, 1, 1, 12) 12 - AIC:2205.6032283547875
Current Iter - 222, ARIMA(0, 2, 2)x(0, 1, 2, 12) 12 - AIC:2066.8847872744195
Current Iter - 223, ARIMA(0, 2, 2)x(0, 2, 0, 12) 12 - AIC:2372.8319260193434
Current Iter - 224, ARIMA(0, 2, 2)x(0, 2, 1, 12) 12 - AIC:2106.643255055006
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 225, ARIMA(0, 2, 2)x(0, 2, 2, 12) 12 - AIC:1957.2217690857865
Current Iter - 226, ARIMA(0, 2, 2)x(1, 0, 0, 12) 12 - AIC:2388.3151086450343
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 227, ARIMA(0, 2, 2)x(1, 0, 1, 12) 12 - AIC:2322.6845392869
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 228, ARIMA(0, 2, 2)x(1, 0, 2, 12) 12 - AIC:2167.79720365918
Current Iter - 229, ARIMA(0, 2, 2)x(1, 1, 0, 12) 12 - AIC:2249.8602686794293
Current Iter - 230, ARIMA(0, 2, 2)x(1, 1, 1, 12) 12 - AIC:2207.5979385509636
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 231, ARIMA(0, 2, 2)x(1, 1, 2, 12) 12 - AIC:2068.882657368497
Current Iter - 232, ARIMA(0, 2, 2)x(1, 2, 0, 12) 12 - AIC:2213.1142426939614
Current Iter - 233, ARIMA(0, 2, 2)x(1, 2, 1, 12) 12 - AIC:2125.8440221748083
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 234, ARIMA(0, 2, 2)x(1, 2, 2, 12) 12 - AIC:1955.509800205042
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 235, ARIMA(0, 2, 2)x(2, 0, 0, 12) 12 - AIC:2245.9472913439868
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 236, ARIMA(0, 2, 2)x(2, 0, 1, 12) 12 - AIC:2220.143655839998
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 237, ARIMA(0, 2, 2)x(2, 0, 2, 12) 12 - AIC:2208.963795252596
Current Iter - 238, ARIMA(0, 2, 2)x(2, 1, 0, 12) 12 - AIC:2112.043209778445
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 239, ARIMA(0, 2, 2)x(2, 1, 1, 12) 12 - AIC:2118.6536361548433
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 240, ARIMA(0, 2, 2)x(2, 1, 2, 12) 12 - AIC:2051.9037433061585
Current Iter - 241, ARIMA(0, 2, 2)x(2, 2, 0, 12) 12 - AIC:2062.3486926272685
Current Iter - 242, ARIMA(0, 2, 2)x(2, 2, 1, 12) 12 - AIC:2000.0132734294768
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 243, ARIMA(0, 2, 2)x(2, 2, 2, 12) 12 - AIC:2009.4809217835623
Current Iter - 244, ARIMA(1, 0, 0)x(0, 0, 0, 12) 12 - AIC:3320.183148196841
Current Iter - 245, ARIMA(1, 0, 0)x(0, 0, 1, 12) 12 - AIC:2937.8933721988437
Current Iter - 246, ARIMA(1, 0, 0)x(0, 0, 2, 12) 12 - AIC:2649.884172199204
Current Iter - 247, ARIMA(1, 0, 0)x(0, 1, 0, 12) 12 - AIC:2460.197594004937
Current Iter - 248, ARIMA(1, 0, 0)x(0, 1, 1, 12) 12 - AIC:2324.7567161015986
Current Iter - 249, ARIMA(1, 0, 0)x(0, 1, 2, 12) 12 - AIC:2190.411881100975
Current Iter - 251, ARIMA(1, 0, 0)x(0, 2, 1, 12) 12 - AIC:2175.9660275213837
Current Iter - 252, ARIMA(1, 0, 0)x(0, 2, 2, 12) 12 - AIC:2041.8685577817841
Current Iter - 253, ARIMA(1, 0, 0)x(1, 0, 0, 12) 12 - AIC:2417.341701773808
Current Iter - 254, ARIMA(1, 0, 0)x(1, 0, 1, 12) 12 - AIC:2407.0305958086683
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 255, ARIMA(1, 0, 0)x(1, 0, 2, 12) 12 - AIC:2223.8798057636386
Current Iter - 256, ARIMA(1, 0, 0)x(1, 1, 0, 12) 12 - AIC:2324.954876751712
Current Iter - 257, ARIMA(1, 0, 0)x(1, 1, 1, 12) 12 - AIC:2318.3599143420843
Current Iter - 258, ARIMA(1, 0, 0)x(1, 1, 2, 12) 12 - AIC:2181.5880171143035
Current Iter - 259, ARIMA(1, 0, 0)x(1, 2, 0, 12) 12 - AIC:2231.861906189104
Current Iter - 260, ARIMA(1, 0, 0)x(1, 2, 1, 12) 12 - AIC:2177.890548647365
Current Iter - 261, ARIMA(1, 0, 0)x(1, 2, 2, 12) 12 - AIC:2035.6590876379942
Current Iter - 262, ARIMA(1, 0, 0)x(2, 0, 0, 12) 12 - AIC:2276.9786023993274
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 263, ARIMA(1, 0, 0)x(2, 0, 1, 12) 12 - AIC:2282.574120220891
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 264, ARIMA(1, 0, 0)x(2, 0, 2, 12) 12 - AIC:2225.74152976998
Current Iter - 265, ARIMA(1, 0, 0)x(2, 1, 0, 12) 12 - AIC:2189.624649263508
Current Iter - 266, ARIMA(1, 0, 0)x(2, 1, 1, 12) 12 - AIC:2185.195792299228
Current Iter - 267, ARIMA(1, 0, 0)x(2, 1, 2, 12) 12 - AIC:2177.525516455904
Current Iter - 268, ARIMA(1, 0, 0)x(2, 2, 0, 12) 12 - AIC:2087.39674991018
Current Iter - 269, ARIMA(1, 0, 0)x(2, 2, 1, 12) 12 - AIC:2031.9202396624323
Current Iter - 270, ARIMA(1, 0, 0)x(2, 2, 2, 12) 12 - AIC:2013.3265044050904
Current Iter - 271, ARIMA(1, 0, 1)x(0, 0, 0, 12) 12 - AIC:3176.546463222805
Current Iter - 272, ARIMA(1, 0, 1)x(0, 0, 1, 12) 12 - AIC:2926.705480143677
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 273, ARIMA(1, 0, 1)x(0, 0, 2, 12) 12 - AIC:2551.16369666649
Current Iter - 274, ARIMA(1, 0, 1)x(0, 1, 0, 12) 12 - AIC:2374.9504152791005
Current Iter - 275, ARIMA(1, 0, 1)x(0, 1, 1, 12) 12 - AIC:2235.948148886623
Current Iter - 276, ARIMA(1, 0, 1)x(0, 1, 2, 12) 12 - AIC:2095.9757909813625
Current Iter - 277, ARIMA(1, 0, 1)x(0, 2, 0, 12) 12 - AIC:2381.2059850534697
Current Iter - 278, ARIMA(1, 0, 1)x(0, 2, 1, 12) 12 - AIC:2132.15741440898
Current Iter - 279, ARIMA(1, 0, 1)x(0, 2, 2, 12) 12 - AIC:1981.1057064473084
Current Iter - 280, ARIMA(1, 0, 1)x(1, 0, 0, 12) 12 - AIC:2383.553547930519
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 281, ARIMA(1, 0, 1)x(1, 0, 1, 12) 12 - AIC:2343.655609635264
C:\Users\shyam\anaconda3\lib\site-packages\statsmodels\base\model.py:566: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals
warnings.warn("Maximum Likelihood optimization failed to "
Current Iter - 282, ARIMA(1, 0, 1)x(1, 0, 2, 12) 12 - AIC:2251.8901626469155
Current Iter - 283, ARIMA(1, 0, 1)x(1, 1, 0, 12) 12 - AIC:2248.8261366042

The best parameter combination is the one with the minimum AIC.

SARIMAX

SARIMAX (Seasonal Auto-Regressive Integrated Moving Average with eXogenous factors) is an updated version of the ARIMA model; we can say SARIMAX is a seasonal equivalent model, like SARIMA and Auto ARIMA.

The X refers to exogenous factors: external variables whose values are determined outside the model and which can be supplied as additional regressors when they influence the series. SARIMAX therefore combines the seasonal terms of SARIMA with the ability to account for such external effects.

Why not ARIMA?
AutoRegressive Integrated Moving Average (ARIMA) is a forecasting method for univariate time series data.

As its name suggests, it supports both autoregressive and moving average elements, and the integrated element refers to differencing, which allows the method to handle time series data with a trend.

A problem with ARIMA is that it does not support seasonal data, that is, a time series with a repeating cycle.

ARIMA expects data that is either not seasonal or has had the seasonal component removed, e.g. seasonally adjusted via methods such as seasonal differencing.

Why SARIMA?
Seasonal Autoregressive Integrated Moving Average (SARIMA), or Seasonal ARIMA, is an extension of ARIMA that explicitly supports univariate time series data with a seasonal component.

It adds three new hyperparameters to specify the autoregression (AR), differencing (I) and moving average (MA) for the seasonal component of the series, as well as an additional parameter for the period of the seasonality.

A seasonal ARIMA model is formed by including additional seasonal terms in the ARIMA model; these seasonal terms mirror the non-seasonal components of the model.

How to Configure SARIMA: configuring a SARIMA requires selecting hyperparameters for both the trend and seasonal elements of the series.

Trend Elements: there are three trend elements that require configuration.

They are the same as in the ARIMA model, specifically:

p: Trend autoregression order.
d: Trend difference order.
q: Trend moving average order.

Seasonal Elements: there are four seasonal elements that are not part of ARIMA and must be configured:

P: Seasonal autoregressive order.

D: Seasonal difference order.

Q: Seasonal moving average order.

m: The number of time steps for a single seasonal period.

For example, an m of 12 for monthly data suggests a yearly seasonal cycle.
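A sketch of fitting a seasonal model with statsmodels' SARIMAX once the grid search has suggested an order; the (0, 1, 2)x(2, 2, 2, 12) values below are an example only, since the order actually chosen in the original notebook is not shown on this page:

import statsmodels.api as sm

# example order only; substitute the best order found by the grid search
sarimax_model = sm.tsa.statespace.SARIMAX(ts, order=(0, 1, 2),
                                          seasonal_order=(2, 2, 2, 12),
                                          enforce_stationarity=False,
                                          enforce_invertibility=False)
sarimax_fit = sarimax_model.fit(disp=False)
print(sarimax_fit.summary())

# forecast the test period so RMSE and MAPE can be compared with the ARIMA forecast
sarimax_forecast = sarimax_fit.forecast(steps=len(test))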

We have used both the ARIMA and SARIMAX models.

For the ARIMA model we got the values:
RMSE (ARIMA) =
MAPE (ARIMA) =
For the SARIMAX model we got the values:
RMSE (SARIMAX) =
MAPE (SARIMAX) =

Based on reading both values, we can say that the SARIMAX results are better than the ARIMA results.

Limitations of Time Series Analysis
Time series analysis also has some limitations that we have to keep in mind:
Similar to other models, missing values are not supported by TSA.
The data points must be linear in their relationship.
Data transformations are mandatory, so they can be a little expensive.
Models mostly work on univariate data.
