import warnings
import itertools
import numpy as np
import matplotlib.pyplot as plt
warnings.filterwarnings("ignore")
plt.style.use('fivethirtyeight')
import pandas as pd
import statsmodels.api as sm
import matplotlib
matplotlib.rcParams['axes.labelsize'] = 14
matplotlib.rcParams['xtick.labelsize'] = 12
matplotlib.rcParams['ytick.labelsize'] = 12
matplotlib.rcParams['text.color'] = 'k'
We are going to do time series analysis and forecasting for furniture sales.
df = pd.read_excel("Superstore.xls")
furniture = df.loc[df['Category'] == 'Furniture']
We have four years of furniture sales data, from January 2014 through December 2017.
furniture['Order Date'].min()
Timestamp('2014-01-06 00:00:00')
furniture['Order Date'].max()
Timestamp('2017-12-30 00:00:00')
Data Preprocessing
This step includes removing the columns we do not need, checking for missing values, and aggregating sales by date.
cols = ['Row ID', 'Order ID', 'Ship Date', 'Ship Mode', 'Customer ID', 'Customer Name', 'Segment', 'Country', 'City', 'State', 'Postal Code', 'Region', 'Product ID', 'Category', 'Sub-Category', 'Product Name', 'Quantity', 'Discount', 'Profit']
furniture.drop(cols, axis=1, inplace=True)
furniture = furniture.sort_values('Order Date')
furniture.isnull().sum()
Order Date    0
Sales         0
dtype: int64
furniture = furniture.groupby('Order Date')['Sales'].sum().reset_index()
furniture.head()
|   | Order Date | Sales    |
|---|------------|----------|
| 0 | 2014-01-06 | 2573.820 |
| 1 | 2014-01-07 | 76.728   |
| 2 | 2014-01-10 | 51.940   |
| 3 | 2014-01-11 | 9.940    |
| 4 | 2014-01-13 | 879.939  |
furniture = furniture.set_index('Order Date')
furniture.index
DatetimeIndex(['2014-01-06', '2014-01-07', '2014-01-10', '2014-01-11',
               '2014-01-13', '2014-01-14', '2014-01-16', '2014-01-19',
               '2014-01-20', '2014-01-21',
               ...
               '2017-12-18', '2017-12-19', '2017-12-21', '2017-12-22',
               '2017-12-23', '2017-12-24', '2017-12-25', '2017-12-28',
               '2017-12-29', '2017-12-30'],
              dtype='datetime64[ns]', name='Order Date', length=889, freq=None)
Our current datetime index is irregular (889 order dates over four years, with freq=None), which makes it tricky to work with. We will therefore use the average daily sales value for each month instead, using the start of each month as the timestamp.
y = furniture['Sales'].resample('MS').mean()
Have a quick peek at the 2017 sales data.
y['2017':]
Order Date
2017-01-01     397.602133
2017-02-01     528.179800
2017-03-01     544.672240
2017-04-01     453.297905
2017-05-01     678.302328
2017-06-01     826.460291
2017-07-01     562.524857
2017-08-01     857.881889
2017-09-01    1209.508583
2017-10-01     875.362728
2017-11-01    1277.817759
2017-12-01    1256.298672
Freq: MS, Name: Sales, dtype: float64
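To see exactly what this resampling computes, we can check one month by hand. A minimal sanity-check sketch (assuming the furniture frame built above, still indexed by Order Date): the mean of the daily totals that fall in January 2017 should reproduce the first value in the output.
# Average of the January 2017 daily sales totals; should match y['2017-01-01'].
furniture.loc['2017-01', 'Sales'].mean()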
y.plot(figsize=(15, 6))
plt.show()
Some distinguishable patterns appear when we plot the data. The time series has an obvious seasonal pattern: sales are always low at the beginning of the year and high at the end of the year, with a strong upward trend within any single year and a couple of low months in the middle of the year.
We can also visualize our data using a method called time-series decomposition that allows us to decompose our time series into three distinct components: trend, seasonality, and noise.
from pylab import rcParams
rcParams['figure.figsize'] = 18, 8
decomposition = sm.tsa.seasonal_decompose(y, model='additive')
fig = decomposition.plot()
plt.show()
The plot above clearly shows that furniture sales are unstable, along with the obvious seasonality.
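The object returned by seasonal_decompose also exposes the estimated components as series, in case we want them numerically rather than only as a plot. A minimal sketch:
# Each component of the additive decomposition is itself a time series.
trend = decomposition.trend        # centered moving-average trend (NaN at the edges)
seasonal = decomposition.seasonal  # repeating 12-month pattern
residual = decomposition.resid     # what remains after removing trend and seasonality
# For the additive model, trend + seasonal + residual reconstructs y
# wherever the trend is defined.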
We are going to apply one of the most commonly used methods for time-series forecasting, known as ARIMA, which stands for Autoregressive Integrated Moving Average. ARIMA models are denoted ARIMA(p, d, q): p is the autoregressive order, d the degree of differencing, and q the moving-average order. The seasonal variant, SARIMA, adds a second set of parameters (P, D, Q, s), where s is the length of the seasonal cycle (12 for monthly data).
Parameter Selection for the ARIMA Time Series Model
We will use a grid search to explore combinations of parameters: each of p, d, q and the seasonal P, D, Q is allowed to be 0 or 1, every combination is fit to the data, and we compare models by their AIC (Akaike Information Criterion), preferring the lowest value.
p = d = q = range(0, 2)
pdq = list(itertools.product(p, d, q))
seasonal_pdq = [(x[0], x[1], x[2], 12) for x in list(itertools.product(p, d, q))]
print('Examples of parameter combinations for Seasonal ARIMA...')
print('SARIMAX: {} x {}'.format(pdq[1], seasonal_pdq[1]))
print('SARIMAX: {} x {}'.format(pdq[1], seasonal_pdq[2]))
print('SARIMAX: {} x {}'.format(pdq[2], seasonal_pdq[3]))
print('SARIMAX: {} x {}'.format(pdq[2], seasonal_pdq[4]))
Examples of parameter combinations for Seasonal ARIMA...
SARIMAX: (0, 0, 1) x (0, 0, 1, 12)
SARIMAX: (0, 0, 1) x (0, 1, 0, 12)
SARIMAX: (0, 1, 0) x (0, 1, 1, 12)
SARIMAX: (0, 1, 0) x (1, 0, 0, 12)
for param in pdq:
    for param_seasonal in seasonal_pdq:
        try:
            mod = sm.tsa.statespace.SARIMAX(y,
                                            order=param,
                                            seasonal_order=param_seasonal,
                                            enforce_invertibility=False)
            # disp=False silences the optimizer's iteration log; some parameter
            # combinations may still fail to fit, so we simply skip those.
            results = mod.fit(disp=False)
            print('ARIMA{}x{}12 - AIC:{}'.format(param, param_seasonal, results.aic))
        except Exception:
            continue
ARIMA(0, 0, 0)x(0, 0, 0, 12)12 - AIC:784.7193402038779
ARIMA(0, 0, 0)x(0, 0, 1, 12)12 - AIC:1902.5456079381986
ARIMA(0, 0, 0)x(0, 1, 0, 12)12 - AIC:495.37090274829427
ARIMA(0, 0, 0)x(0, 1, 1, 12)12 - AIC:489.8305326463162
ARIMA(0, 0, 0)x(1, 0, 0, 12)12 - AIC:691.7786646579052
ARIMA(0, 0, 0)x(1, 0, 1, 12)12 - AIC:1910.454201820771
ARIMA(0, 0, 0)x(1, 1, 0, 12)12 - AIC:491.25398194270207
ARIMA(0, 0, 0)x(1, 1, 1, 12)12 - AIC:491.80956304929055
ARIMA(0, 0, 1)x(0, 0, 0, 12)12 - AIC:751.063546276295
ARIMA(0, 0, 1)x(0, 0, 1, 12)12 - AIC:1836.256935139399
ARIMA(0, 0, 1)x(0, 1, 0, 12)12 - AIC:497.0445558719426
ARIMA(0, 0, 1)x(0, 1, 1, 12)12 - AIC:491.66407712869625
ARIMA(0, 0, 1)x(1, 0, 0, 12)12 - AIC:689.6572888739806
ARIMA(0, 0, 1)x(1, 0, 1, 12)12 - AIC:1872.6205615336607
ARIMA(0, 0, 1)x(1, 1, 0, 12)12 - AIC:493.1979865785892
ARIMA(0, 0, 1)x(1, 1, 1, 12)12 - AIC:493.62935223517286
ARIMA(0, 1, 0)x(0, 0, 0, 12)12 - AIC:691.6686053842182
ARIMA(0, 1, 0)x(0, 0, 1, 12)12 - AIC:1840.9119302049864
ARIMA(0, 1, 0)x(0, 1, 0, 12)12 - AIC:501.19171493471
ARIMA(0, 1, 0)x(0, 1, 1, 12)12 - AIC:498.2211835724025
ARIMA(0, 1, 0)x(1, 0, 0, 12)12 - AIC:672.790589808306
ARIMA(0, 1, 0)x(1, 0, 1, 12)12 - AIC:2048.4203823353773
ARIMA(0, 1, 0)x(1, 1, 0, 12)12 - AIC:500.1070474246497
ARIMA(0, 1, 0)x(1, 1, 1, 12)12 - AIC:500.0205212320573
ARIMA(0, 1, 1)x(0, 0, 0, 12)12 - AIC:679.3515257502534
ARIMA(0, 1, 1)x(0, 0, 1, 12)12 - AIC:2003.8112126856263
ARIMA(0, 1, 1)x(0, 1, 0, 12)12 - AIC:489.63451139733695
ARIMA(0, 1, 1)x(0, 1, 1, 12)12 - AIC:482.831952790226
ARIMA(0, 1, 1)x(1, 0, 0, 12)12 - AIC:656.517459267076
ARIMA(0, 1, 1)x(1, 0, 1, 12)12 - AIC:2035.9100269373114
ARIMA(0, 1, 1)x(1, 1, 0, 12)12 - AIC:484.711674769929
ARIMA(0, 1, 1)x(1, 1, 1, 12)12 - AIC:484.827902540775
ARIMA(1, 0, 0)x(0, 0, 0, 12)12 - AIC:707.8812143782119
ARIMA(1, 0, 0)x(0, 0, 1, 12)12 - AIC:1715.7465322066853
ARIMA(1, 0, 0)x(0, 1, 0, 12)12 - AIC:496.96369022312945
ARIMA(1, 0, 0)x(0, 1, 1, 12)12 - AIC:491.6434591679682
ARIMA(1, 0, 0)x(1, 0, 0, 12)12 - AIC:682.5313938730986
ARIMA(1, 0, 0)x(1, 0, 1, 12)12 - AIC:1472.5014128371877
ARIMA(1, 0, 0)x(1, 1, 0, 12)12 - AIC:493.1879623942243
ARIMA(1, 0, 0)x(1, 1, 1, 12)12 - AIC:493.6074307437069
ARIMA(1, 0, 1)x(0, 0, 0, 12)12 - AIC:697.3491224687206
ARIMA(1, 0, 1)x(0, 0, 1, 12)12 - AIC:2068.447000241521
ARIMA(1, 0, 1)x(0, 1, 0, 12)12 - AIC:498.8527554530867
ARIMA(1, 0, 1)x(0, 1, 1, 12)12 - AIC:493.5574965424775
ARIMA(1, 0, 1)x(1, 0, 0, 12)12 - AIC:673.3007926966135
ARIMA(1, 0, 1)x(1, 0, 1, 12)12 - AIC:1878.6479250452435
ARIMA(1, 0, 1)x(1, 1, 0, 12)12 - AIC:495.08266711807704
ARIMA(1, 0, 1)x(1, 1, 1, 12)12 - AIC:495.3239053904089
ARIMA(1, 1, 0)x(0, 0, 0, 12)12 - AIC:684.9765838986987
ARIMA(1, 1, 0)x(0, 0, 1, 12)12 - AIC:1859.305467746638
ARIMA(1, 1, 0)x(0, 1, 0, 12)12 - AIC:494.31121719371606
ARIMA(1, 1, 0)x(0, 1, 1, 12)12 - AIC:490.62427521155433
ARIMA(1, 1, 0)x(1, 0, 0, 12)12 - AIC:665.2664123034787
ARIMA(1, 1, 0)x(1, 0, 1, 12)12 - AIC:2395.971601892641
ARIMA(1, 1, 0)x(1, 1, 0, 12)12 - AIC:491.8542011684047
ARIMA(1, 1, 0)x(1, 1, 1, 12)12 - AIC:492.5754415767637
ARIMA(1, 1, 1)x(0, 0, 0, 12)12 - AIC:678.4136280158871
ARIMA(1, 1, 1)x(0, 0, 1, 12)12 - AIC:2071.6855703589845
Bad direction in the line search; refresh the lbfgs memory and restart the iteration. Warning: more than 10 function and gradient evaluations in the last line search. Termination may possibly be caused by a bad search direction. This problem is unconstrained. This problem is unconstrained. This problem is unconstrained.
RUNNING THE L-BFGS-B CODE * * * Machine precision = 2.220D-16 N = 3 M = 10 At X0 0 variables are exactly at the bounds At iterate 0 f= 5.09331D+00 |proj g|= 2.47995D-02 At iterate 5 f= 5.05664D+00 |proj g|= 1.52350D-03 At iterate 10 f= 5.05662D+00 |proj g|= 5.37550D-03 At iterate 15 f= 5.05447D+00 |proj g|= 5.00822D-02 At iterate 20 f= 5.05085D+00 |proj g|= 3.93014D-03 * * * Tit = total number of iterations Tnf = total number of function evaluations Tnint = total number of segments explored during Cauchy searches Skip = number of BFGS updates skipped Nact = number of active bounds at final generalized Cauchy point Projg = norm of the final projected gradient F = final function value * * * N Tit Tnf Tnint Skip Nact Projg F 3 23 29 1 0 0 6.559D-06 5.051D+00 F = 5.0508240419213113 CONVERGENCE: NORM_OF_PROJECTED_GRADIENT_<=_PGTOL ARIMA(1, 1, 1)x(0, 1, 0, 12)12 - AIC:490.8791080244459 RUNNING THE L-BFGS-B CODE * * * Machine precision = 2.220D-16 N = 4 M = 10 At X0 0 variables are exactly at the bounds At iterate 0 f= 5.09331D+00 |proj g|= 2.33947D-01 At iterate 5 f= 5.01261D+00 |proj g|= 3.71120D-02 At iterate 10 f= 5.00873D+00 |proj g|= 9.79364D-04 At iterate 15 f= 5.00848D+00 |proj g|= 1.97599D-02 At iterate 20 f= 4.97490D+00 |proj g|= 8.32389D-02 At iterate 25 f= 4.96474D+00 |proj g|= 1.01137D-02 At iterate 30 f= 4.96452D+00 |proj g|= 2.43306D-03 * * * Tit = total number of iterations Tnf = total number of function evaluations Tnint = total number of segments explored during Cauchy searches Skip = number of BFGS updates skipped Nact = number of active bounds at final generalized Cauchy point Projg = norm of the final projected gradient F = final function value * * * N Tit Tnf Tnint Skip Nact Projg F 4 33 37 1 0 0 2.307D-06 4.965D+00 F = 4.9645173729603469 CONVERGENCE: NORM_OF_PROJECTED_GRADIENT_<=_PGTOL ARIMA(1, 1, 1)x(0, 1, 1, 12)12 - AIC:484.5936678041933 RUNNING THE L-BFGS-B CODE * * * Machine precision = 2.220D-16 N = 4 M = 10 At X0 0 variables are exactly at the bounds At iterate 0 f= 6.92770D+00 |proj g|= 7.02084D-02 At iterate 5 f= 6.89149D+00 |proj g|= 2.04355D-02 At iterate 10 f= 6.89071D+00 |proj g|= 4.02459D-03 At iterate 15 f= 6.87719D+00 |proj g|= 6.57989D-02 At iterate 20 f= 6.78349D+00 |proj g|= 1.21812D-01 At iterate 25 f= 6.75567D+00 |proj g|= 3.19136D-02 * * * Tit = total number of iterations Tnf = total number of function evaluations Tnint = total number of segments explored during Cauchy searches Skip = number of BFGS updates skipped Nact = number of active bounds at final generalized Cauchy point Projg = norm of the final projected gradient F = final function value * * * N Tit Tnf Tnint Skip Nact Projg F 4 29 43 1 0 0 9.195D-06 6.756D+00 F = 6.7555667174494269 CONVERGENCE: NORM_OF_PROJECTED_GRADIENT_<=_PGTOL ARIMA(1, 1, 1)x(1, 0, 0, 12)12 - AIC:656.534404875145 RUNNING THE L-BFGS-B CODE * * * Machine precision = 2.220D-16 N = 5 M = 10 At X0 0 variables are exactly at the bounds At iterate 0 f= 3.88252D+01 |proj g|= 1.09515D-01
This problem is unconstrained.
At iterate 5 f= 3.87734D+01 |proj g|= 2.91145D-03 At iterate 10 f= 3.84988D+01 |proj g|= 8.68885D-01 ys=-3.759E+02 -gs= 6.769E-01 BFGS update SKIPPED ys=-3.922E+01 -gs= 9.551E-01 BFGS update SKIPPED
Nonpositive definiteness in Cholesky factorization in formt; refresh the lbfgs memory and restart the iteration. /Users/thomas/miniforge3/lib/python3.9/site-packages/statsmodels/base/model.py:604: ConvergenceWarning: Maximum Likelihood optimization failed to converge. Check mle_retvals warnings.warn("Maximum Likelihood optimization failed to " Line search cannot locate an adequate point after MAXLS function and gradient evaluations. Previous x, f and g restored. Possible causes: 1 error in function or gradient evaluation; 2 rounding error dominate computation. This problem is unconstrained. This problem is unconstrained.
* * * Tit = total number of iterations Tnf = total number of function evaluations Tnint = total number of segments explored during Cauchy searches Skip = number of BFGS updates skipped Nact = number of active bounds at final generalized Cauchy point Projg = norm of the final projected gradient F = final function value * * * N Tit Tnf Tnint Skip Nact Projg F 5 15 96 2 2 0 3.681D+05 1.854D+01 F = 18.541180475004293 ABNORMAL_TERMINATION_IN_LNSRCH ARIMA(1, 1, 1)x(1, 0, 1, 12)12 - AIC:1789.9533256004122 RUNNING THE L-BFGS-B CODE * * * Machine precision = 2.220D-16 N = 4 M = 10 At X0 0 variables are exactly at the bounds At iterate 0 f= 5.04808D+00 |proj g|= 8.88357D-02 At iterate 5 f= 5.01487D+00 |proj g|= 2.85906D-02 At iterate 10 f= 5.01305D+00 |proj g|= 8.83301D-04 At iterate 15 f= 5.01275D+00 |proj g|= 1.18778D-02 At iterate 20 f= 4.99531D+00 |proj g|= 9.50058D-02 At iterate 25 f= 4.98503D+00 |proj g|= 1.23486D-03 * * * Tit = total number of iterations Tnf = total number of function evaluations Tnint = total number of segments explored during Cauchy searches Skip = number of BFGS updates skipped Nact = number of active bounds at final generalized Cauchy point Projg = norm of the final projected gradient F = final function value * * * N Tit Tnf Tnint Skip Nact Projg F 4 28 37 1 0 0 8.216D-07 4.985D+00 F = 4.9850333137500700 CONVERGENCE: NORM_OF_PROJECTED_GRADIENT_<=_PGTOL ARIMA(1, 1, 1)x(1, 1, 0, 12)12 - AIC:486.5631981200067 RUNNING THE L-BFGS-B CODE * * * Machine precision = 2.220D-16 N = 5 M = 10 At X0 0 variables are exactly at the bounds At iterate 0 f= 5.09331D+00 |proj g|= 2.33947D-01 At iterate 5 f= 5.01345D+00 |proj g|= 6.35790D-02 At iterate 10 f= 5.00824D+00 |proj g|= 2.96564D-03 At iterate 15 f= 5.00814D+00 |proj g|= 3.18523D-03 At iterate 20 f= 5.00509D+00 |proj g|= 4.18692D-02 At iterate 25 f= 4.96998D+00 |proj g|= 4.48102D-02 At iterate 30 f= 4.96445D+00 |proj g|= 3.97815D-03 At iterate 35 f= 4.96440D+00 |proj g|= 3.04858D-06 * * * Tit = total number of iterations Tnf = total number of function evaluations Tnint = total number of segments explored during Cauchy searches Skip = number of BFGS updates skipped Nact = number of active bounds at final generalized Cauchy point Projg = norm of the final projected gradient F = final function value * * * N Tit Tnf Tnint Skip Nact Projg F 5 35 40 1 0 0 3.049D-06 4.964D+00 F = 4.9643973998915065 CONVERGENCE: NORM_OF_PROJECTED_GRADIENT_<=_PGTOL ARIMA(1, 1, 1)x(1, 1, 1, 12)12 - AIC:486.5821503895846
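To collect the scores programmatically instead of scanning the printout, here is a minimal sketch. It assumes the seasonal grid seasonal_pdq defined alongside pdq earlier; passing disp=False to fit() suppresses the verbose L-BFGS-B log.
# Re-run the grid search, collecting (order, seasonal_order, AIC) triples
# and skipping parameter combinations that fail to fit.
aic_results = []
for param in pdq:
    for param_seasonal in seasonal_pdq:
        try:
            mod = sm.tsa.statespace.SARIMAX(y,
                                            order=param,
                                            seasonal_order=param_seasonal,
                                            enforce_invertibility=False)
            res = mod.fit(disp=False)
            aic_results.append((param, param_seasonal, res.aic))
        except Exception:
            continue
best_order, best_seasonal, best_aic = min(aic_results, key=lambda r: r[2])
print('Lowest AIC: ARIMA{}x{} - AIC:{:.2f}'.format(best_order, best_seasonal, best_aic))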
mod = sm.tsa.statespace.SARIMAX(y,
order=(1, 1, 1),
seasonal_order=(1, 1, 0, 12),
enforce_invertibility=False)
results = mod.fit()
print(results.summary().tables[1])
==============================================================================
                 coef    std err          z      P>|z|      [0.025      0.975]
------------------------------------------------------------------------------
ar.L1          0.0676      0.226      0.299      0.765      -0.376       0.511
ma.L1         -1.0000      0.279     -3.590      0.000      -1.546      -0.454
ar.S.L12      -0.4807      0.147     -3.260      0.001      -0.770      -0.192
sigma2      4.108e+04   6.78e-06   6.06e+09      0.000    4.11e+04    4.11e+04
==============================================================================
results.plot_diagnostics(figsize=(16, 8))
plt.show()
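As a numeric companion to the visual diagnostics, one could also run a Ljung-Box test on the residuals (an addition, not part of the original walkthrough):
from statsmodels.stats.diagnostic import acorr_ljungbox
# Test whether the residuals are autocorrelated up to lag 12;
# large p-values are consistent with white-noise residuals.
print(acorr_ljungbox(results.resid, lags=[12]))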
To help us understand the accuracy of our forecasts, we compare predicted sales to real sales of the time series, setting the forecasts to start at 2017-01-01 and run to the end of the data.
pred = results.get_prediction(start=pd.to_datetime('2017-01-01'), dynamic=False)
pred_ci = pred.conf_int()
ax = y['2014':].plot(label='observed')
pred.predicted_mean.plot(ax=ax, label='One-step ahead Forecast', alpha=.7, figsize=(14, 7))
ax.fill_between(pred_ci.index,
pred_ci.iloc[:, 0],
pred_ci.iloc[:, 1], color='k', alpha=.2)
ax.set_xlabel('Date')
ax.set_ylabel('Furniture Sales')
plt.legend()
plt.show()
The line plot shows the observed values compared with the one-step-ahead forecasts. Overall, our forecasts align with the true values very well, capturing the upward trend that starts at the beginning of the year.
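As an aside (not part of the original flow), a stricter check is a dynamic forecast, in which the model must rely on its own predictions after the start date instead of the observed values:
# Dynamic forecast: predictions after 2017-01-01 are generated from the
# model's own earlier predictions rather than from the observed series.
pred_dynamic = results.get_prediction(start=pd.to_datetime('2017-01-01'), dynamic=True)
pred_dynamic_ci = pred_dynamic.conf_int()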
y_forecasted = pred.predicted_mean
y_truth = y['2017-01-01':]
# Compute the mean square error
mse = ((y_forecasted - y_truth) ** 2).mean()
print('The Mean Squared Error of our forecasts is {}'.format(round(mse, 2)))
The Mean Squared Error of our forecasts is 39996.01
print('The Root Mean Squared Error of our forecasts is {}'.format(round(np.sqrt(mse), 2)))
The Root Mean Squared Error of our forecasts is 199.99
In statistics, the mean squared error (MSE) of an estimator measures the average of the squares of the errors, that is, the average squared difference between the estimated values and the actual values. The MSE is a measure of the quality of an estimator: it is always non-negative, and the smaller the MSE, the closer the fitted values are to the data.
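Written out, for observed values $y_t$ and forecasts $\hat{y}_t$ over $n$ periods:
$$\mathrm{MSE} = \frac{1}{n}\sum_{t=1}^{n}\left(y_t - \hat{y}_t\right)^2, \qquad \mathrm{RMSE} = \sqrt{\mathrm{MSE}}$$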
Root Mean Square Error (RMSE) tells us that our model was able to forecast the average daily furniture sales in the test set within 199.99 of the real sales. Our daily furniture sales range from around 400 to over 1200, so in my opinion this is a pretty good model so far.
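Because RMSE is on the scale of the data, a scale-free companion metric such as MAPE can also help (an addition, not part of the original analysis):
# Mean absolute percentage error of the one-step-ahead forecasts.
mape = (np.abs(y_forecasted - y_truth) / y_truth).mean() * 100
print('The MAPE of our forecasts is {:.2f}%'.format(mape))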
pred_uc = results.get_forecast(steps=100)
pred_ci = pred_uc.conf_int()
ax = y.plot(label='observed', figsize=(14, 7))
pred_uc.predicted_mean.plot(ax=ax, label='Forecast')
ax.fill_between(pred_ci.index,
pred_ci.iloc[:, 0],
pred_ci.iloc[:, 1], color='k', alpha=.25)
ax.set_xlabel('Date')
ax.set_ylabel('Furniture Sales')
plt.legend()
plt.show()
Our model clearly captured furniture sales seasonality. As we forecast further out into the future, it is natural for us to become less confident in our values. This is reflected by the confidence intervals generated by our model, which grow larger as we move further out into the future.
The above time series analysis for furniture makes me curious about other categories, and how they compare with each other over time. Therefore, we are going to compare the time series of furniture and office supplies.
furniture = df.loc[df['Category'] == 'Furniture']
office = df.loc[df['Category'] == 'Office Supplies']
According to our data, there were far more sales records for Office Supplies than for Furniture over the years.
furniture.shape, office.shape
((2121, 21), (6026, 21))
cols = ['Row ID', 'Order ID', 'Ship Date', 'Ship Mode', 'Customer ID', 'Customer Name', 'Segment', 'Country', 'City', 'State', 'Postal Code', 'Region', 'Product ID', 'Category', 'Sub-Category', 'Product Name', 'Quantity', 'Discount', 'Profit']
furniture.drop(cols, axis=1, inplace=True)
office.drop(cols, axis=1, inplace=True)
furniture = furniture.sort_values('Order Date')
office = office.sort_values('Order Date')
furniture = furniture.groupby('Order Date')['Sales'].sum().reset_index()
office = office.groupby('Order Date')['Sales'].sum().reset_index()
Have a quick peek, perfect!
furniture.head()
Order Date | Sales | |
---|---|---|
0 | 2014-01-06 | 2573.820 |
1 | 2014-01-07 | 76.728 |
2 | 2014-01-10 | 51.940 |
3 | 2014-01-11 | 9.940 |
4 | 2014-01-13 | 879.939 |
office.head()
Order Date | Sales | |
---|---|---|
0 | 2014-01-03 | 16.448 |
1 | 2014-01-04 | 288.060 |
2 | 2014-01-05 | 19.536 |
3 | 2014-01-06 | 685.340 |
4 | 2014-01-07 | 10.430 |
We are going to compare the two categories' sales over the same time period. This means combining the two data frames into one and plotting both time series in a single plot.
furniture = furniture.set_index('Order Date')
office = office.set_index('Order Date')
y_furniture = furniture['Sales'].resample('MS').mean()
y_office = office['Sales'].resample('MS').mean()
furniture = pd.DataFrame({'Order Date':y_furniture.index, 'Sales':y_furniture.values})
office = pd.DataFrame({'Order Date': y_office.index, 'Sales': y_office.values})
store = furniture.merge(office, how='inner', on='Order Date')
store.rename(columns={'Sales_x': 'furniture_sales', 'Sales_y': 'office_sales'}, inplace=True)
store.head()
Order Date | furniture_sales | office_sales | |
---|---|---|---|
0 | 2014-01-01 | 480.194231 | 285.357647 |
1 | 2014-02-01 | 367.931600 | 63.042588 |
2 | 2014-03-01 | 857.291529 | 391.176318 |
3 | 2014-04-01 | 567.488357 | 464.794750 |
4 | 2014-05-01 | 432.049188 | 324.346545 |
plt.figure(figsize=(20, 8))
plt.plot(store['Order Date'], store['furniture_sales'], 'b-', label = 'furniture')
plt.plot(store['Order Date'], store['office_sales'], 'r-', label = 'office supplies')
plt.xlabel('Date'); plt.ylabel('Sales'); plt.title('Sales of Furniture and Office Supplies')
plt.legend();
We observe that sales of furniture and office supplies share a similar seasonal pattern. Early in the year is the off season for both categories, and summer seems quiet for office supplies as well. In addition, average daily sales of furniture are higher than those of office supplies in most months. This is understandable, as the unit value of furniture is much higher than that of office supplies. Occasionally, office supplies surpassed furniture in average daily sales. Let's find out the first time office supplies' sales surpassed those of furniture.
first_date = store.loc[np.min(list(np.where(store['office_sales'] > store['furniture_sales'])[0])), 'Order Date']
print("Office supplies first time produced higher sales than furniture is {}.".format(first_date.date()))
Office supplies first time produced higher sales than furniture is 2014-07-01.
It was July 2014.
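As a quick follow-up (an addition to the original analysis), we can count how often this happened over the whole period:
# Number of months in which office supplies outsold furniture on
# average daily sales.
n_months = (store['office_sales'] > store['furniture_sales']).sum()
print('{} out of {} months'.format(n_months, len(store)))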
Released by Facebook in 2017, the forecasting tool Prophet is designed for analyzing time series that display patterns on different time scales, such as yearly, weekly, and daily. It also has advanced capabilities for modeling the effects of holidays on a time series and for implementing custom changepoints. Therefore, we will use Prophet to get a model up and running.
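For illustration only (none of this is used in the fits below), holidays and custom changepoints are supplied when the model is constructed; the changepoint date here is hypothetical.
from prophet import Prophet
# Hypothetical configuration: one manual changepoint plus built-in US holidays.
m = Prophet(interval_width=0.95, changepoints=['2015-07-01'])
m.add_country_holidays(country_name='US')  # must be called before fit()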
from prophet import Prophet
furniture = furniture.rename(columns={'Order Date': 'ds', 'Sales': 'y'})
furniture_model = Prophet(interval_width=0.95)
furniture_model.fit(furniture)
office = office.rename(columns={'Order Date': 'ds', 'Sales': 'y'})
office_model = Prophet(interval_width=0.95)
office_model.fit(office)
furniture_forecast = furniture_model.make_future_dataframe(periods=36, freq='MS')
furniture_forecast = furniture_model.predict(furniture_forecast)
office_forecast = office_model.make_future_dataframe(periods=36, freq='MS')
office_forecast = office_model.predict(office_forecast)
# Prophet's plot() creates its own figure, so pass figsize directly
# rather than opening a separate (empty) figure first.
furniture_model.plot(furniture_forecast, xlabel = 'Date', ylabel = 'Sales', figsize = (18, 6))
plt.title('Furniture Sales');
office_model.plot(office_forecast, xlabel = 'Date', ylabel = 'Sales', figsize = (18, 6))
plt.title('Office Supplies Sales');
We now have three years of future forecasts for each of the two categories. Let's join them together to compare.
furniture_names = ['furniture_%s' % column for column in furniture_forecast.columns]
office_names = ['office_%s' % column for column in office_forecast.columns]
merge_furniture_forecast = furniture_forecast.copy()
merge_office_forecast = office_forecast.copy()
merge_furniture_forecast.columns = furniture_names
merge_office_forecast.columns = office_names
forecast = pd.merge(merge_furniture_forecast, merge_office_forecast, how = 'inner', left_on = 'furniture_ds', right_on = 'office_ds')
forecast = forecast.rename(columns={'furniture_ds': 'Date'}).drop('office_ds', axis=1)
forecast.head()
Date | furniture_trend | furniture_yhat_lower | furniture_yhat_upper | furniture_trend_lower | furniture_trend_upper | furniture_additive_terms | furniture_additive_terms_lower | furniture_additive_terms_upper | furniture_yearly | ... | office_additive_terms | office_additive_terms_lower | office_additive_terms_upper | office_yearly | office_yearly_lower | office_yearly_upper | office_multiplicative_terms | office_multiplicative_terms_lower | office_multiplicative_terms_upper | office_yhat | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 2014-01-01 | 726.057713 | 308.847618 | 766.893401 | 726.057713 | 726.057713 | -190.685662 | -190.685662 | -190.685662 | -190.685662 | ... | -140.040481 | -140.040481 | -140.040481 | -140.040481 | -140.040481 | -140.040481 | 0.0 | 0.0 | 0.0 | 347.490278 |
1 | 2014-02-01 | 727.494023 | 205.973005 | 688.341122 | 727.494023 | 727.494023 | -276.377703 | -276.377703 | -276.377703 | -276.377703 | ... | -385.678283 | -385.678283 | -385.678283 | -385.678283 | -385.678283 | -385.678283 | 0.0 | 0.0 | 0.0 | 109.240162 |
2 | 2014-03-01 | 728.791335 | 464.177460 | 936.271600 | 728.791335 | 728.791335 | -22.389755 | -22.389755 | -22.389755 | -22.389755 | ... | -31.379844 | -31.379844 | -31.379844 | -31.379844 | -31.379844 | -31.379844 | 0.0 | 0.0 | 0.0 | 470.211349 |
3 | 2014-04-01 | 730.227645 | 386.124561 | 873.643884 | 730.227645 | 730.227645 | -100.141158 | -100.141158 | -100.141158 | -100.141158 | ... | -134.291690 | -134.291690 | -134.291690 | -134.291690 | -134.291690 | -134.291690 | 0.0 | 0.0 | 0.0 | 374.687188 |
4 | 2014-05-01 | 731.617622 | 331.461015 | 819.125115 | 731.617622 | 731.617622 | -160.815662 | -160.815662 | -160.815662 | -160.815662 | ... | -263.821569 | -263.821569 | -263.821569 | -263.821569 | -263.821569 | -263.821569 | 0.0 | 0.0 | 0.0 | 252.306682 |
5 rows × 31 columns
plt.figure(figsize=(10, 7))
plt.plot(forecast['Date'], forecast['furniture_trend'], 'b-', label = 'furniture')
plt.plot(forecast['Date'], forecast['office_trend'], 'r-', label = 'office supplies')
plt.legend(); plt.xlabel('Date'); plt.ylabel('Sales')
plt.title('Furniture vs. Office Supplies Sales Trend');
plt.figure(figsize=(10, 7))
plt.plot(forecast['Date'], forecast['furniture_yhat'], 'b-', label = 'furniture')
plt.plot(forecast['Date'], forecast['office_yhat'], 'r-', label = 'office supplies')
plt.legend(); plt.xlabel('Date'); plt.ylabel('Sales')
plt.title('Furniture vs. Office Supplies Estimate');
Now we can use the Prophet models to inspect the trend and yearly seasonality components of these two categories.
furniture_model.plot_components(furniture_forecast);
office_model.plot_components(office_forecast);
It is good to see that sales for both furniture and office supplies have been increasing linearly over time, although office supplies' growth looks slightly stronger.
The worst month for furniture is April, the worst month for office supplies is February. The best month for furniture is December, and the best month for office supplies is November.
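To read those months off programmatically rather than from the component plots, here is a quick sketch (an addition to the original analysis) that averages the fitted yearly seasonal effect in furniture_forecast by calendar month:
# Average the fitted yearly seasonal effect by calendar month; the most
# negative months are the worst and the most positive the best.
monthly_effect = furniture_forecast.groupby(furniture_forecast['ds'].dt.month)['yearly'].mean()
print(monthly_effect.sort_values())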
There are many time-series analyses we could explore from here, such as forecasting with uncertainty bounds, changepoint and anomaly detection, and forecasting a time series with external data sources. We have only scratched the surface. Stay tuned for future work on time-series analysis.