# Tutorial - Time series forecasting¶

## Introduction¶

Time series are a ubiquitous type of data across all kinds of processes. Producing forecasts for them can be highly valuable in domains like retail or industrial manufacturing, among many others.

Lightwood supports time series forecasting (both univariate and multivariate inputs), handling many of the pain points commonly associated with setting up a manual time series predictive pipeline.

In this tutorial, we will train a Lightwood predictor and analyze its forecasts for the task of predicting monthly sunspot counts.

[1]:

import pandas as pd

# load the monthly sunspots dataset (file name assumed; point this at your local copy)
df = pd.read_csv('sunspots.csv')
df

[1]:

Month Sunspots
0 1749-01 58.0
1 1749-02 62.6
2 1749-03 70.0
3 1749-04 55.7
4 1749-05 85.0
... ... ...
2815 1983-08 71.8
2816 1983-09 50.3
2817 1983-10 55.8
2818 1983-11 33.3
2819 1983-12 33.4

2820 rows × 2 columns

This is a very simple dataset. It has two columns: ‘Month’ specifies when each measurement was taken, and ‘Sunspots’ holds the actual quantity we are interested in forecasting. As such, we can characterize this as a univariate time series problem.
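Before handing the data to Lightwood, a quick sanity check of the time column can save headaches later. A minimal sketch using plain pandas, with a few toy rows copied from the table above (in practice you would run this on the full `df`):

```python
import pandas as pd

# Toy rows mirroring the dataset above
df = pd.DataFrame({
    "Month": ["1749-01", "1749-02", "1749-03", "1749-04"],
    "Sunspots": [58.0, 62.6, 70.0, 55.7],
})

# Parse the month strings into datetimes and verify the series is ordered
df["Month"] = pd.to_datetime(df["Month"], format="%Y-%m")
assert df["Month"].is_monotonic_increasing

# Check the series is contiguous at monthly frequency (no missing months)
expected = pd.date_range(df["Month"].iloc[0], periods=len(df), freq="MS")
assert (df["Month"] == expected).all()
```

Lightwood will infer the `date` type for ‘Month’ on its own, but confirming ordering and gaps up front makes later forecasts easier to interpret.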

We will use Lightwood’s high-level methods to state what we want to predict. As this is a time series task (we want to leverage the notion of time to predict), we need to specify a set of arguments that will activate Lightwood’s time series pipeline:

[2]:

from lightwood.api.high_level import ProblemDefinition

INFO:lightwood-2759:No torchvision detected, image helpers not supported.
INFO:lightwood-2759:No torchvision/pillow detected, image encoder not supported

[3]:

tss = {
    'horizon': 6,          # the predictor will learn to forecast the next semester (6 data points at monthly intervals -> 6 months)
    'order_by': ['Month'], # the column used to order the entire dataset
    'window': 12           # how many past values to consider when emitting predictions
}

pdef = ProblemDefinition.from_dict({
    'target': 'Sunspots',         # specify the column to forecast
    'timeseries_settings': tss    # pass along all time series specific parameters
})
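To build intuition for `window` and `horizon`: the predictor looks at the last 12 values to emit the next 6. A rough sketch of that sliding view, using NumPy on a stand-in series (this illustrates the idea only, not Lightwood’s actual internals):

```python
import numpy as np

window, horizon = 12, 6
series = np.arange(30, dtype=float)  # stand-in for the monthly sunspot counts

# Each training example: `window` past values in, `horizon` future values out
n = len(series) - window - horizon + 1
inputs = np.stack([series[i:i + window] for i in range(n)])
targets = np.stack([series[i + window:i + window + horizon] for i in range(n)])

print(inputs.shape, targets.shape)  # (13, 12) (13, 6)
```

Lightwood performs an analogous reshaping internally during the “Transforming timeseries data” step you will see in the training logs below.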


Now, let’s do a very simple train-test split, leaving 10% of the data to check the forecasts that our predictor will produce:

[4]:

cutoff = int(len(df)*0.9)

train = df[:cutoff]
test = df[cutoff:]

print(train.shape, test.shape)

(2538, 2) (282, 2)


## Generate the predictor object¶

Now, we can generate code for a machine learning model by using our problem definition and the data:

[5]:

from lightwood.api.high_level import (
json_ai_from_problem,
code_from_json_ai,
predictor_from_code
)

json_ai = json_ai_from_problem(df, problem_definition=pdef)
code = code_from_json_ai(json_ai)
predictor = predictor_from_code(code)

# uncomment this to see the generated code:
# print(code)

INFO:lightwood-2759:Analyzing a sample of 2467
INFO:lightwood-2759:from a total population of 2820, this is equivalent to 87.5% of your data.
INFO:lightwood-2759:Infering type for: Month
INFO:lightwood-2759:Column Month has data type date
INFO:lightwood-2759:Infering type for: Sunspots
INFO:lightwood-2759:Column Sunspots has data type float
/home/runner/work/lightwood/lightwood/lightwood/helpers/text.py:245: RuntimeWarning: invalid value encountered in double_scalars
randomness_per_index.append(S / np.log(N))
INFO:lightwood-2759:Starting statistical analysis
INFO:lightwood-2759:Finished statistical analysis
INFO:lightwood-2759:Unable to import black formatter, predictor code might be a bit ugly.


## Train¶

Okay, everything is now ready for our predictor to learn from the training data we provide.

Internally, Lightwood cleans and reshapes the data, featurizes measurements and timestamps, and trains a handful of different models, keeping the one that produces the best forecasts.

Let’s train the predictor. This should take a couple of minutes, at most:

[6]:

predictor.learn(train)

INFO:lightwood-2759:[Learn phase 1/8] - Statistical analysis
INFO:lightwood-2759:Starting statistical analysis
INFO:lightwood-2759:Finished statistical analysis
DEBUG:lightwood-2759: analyze_data runtime: 0.32 seconds
INFO:lightwood-2759:[Learn phase 2/8] - Data preprocessing
INFO:lightwood-2759:Cleaning the data
INFO:lightwood-2759:Transforming timeseries data
INFO:lightwood-2759:Using 1 processes to reshape.
DEBUG:lightwood-2759: preprocess runtime: 11.5 seconds
INFO:lightwood-2759:[Learn phase 3/8] - Data splitting
INFO:lightwood-2759:Splitting the data into train/test
DEBUG:lightwood-2759: split runtime: 0.0 seconds
INFO:lightwood-2759:[Learn phase 4/8] - Preparing encoders
DEBUG:lightwood-2759: prepare runtime: 0.3 seconds
INFO:lightwood-2759:[Learn phase 5/8] - Feature generation
INFO:lightwood-2759:Featurizing the data
DEBUG:lightwood-2759: featurize runtime: 0.0 seconds
INFO:lightwood-2759:[Learn phase 6/8] - Mixer training
INFO:lightwood-2759:Training the mixers
/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/lightgbm/engine.py:177: UserWarning: Found num_iterations in params. Will use it instead of argument
_log_warning(f"Found {alias} in params. Will use it instead of argument")
[LightGBM] [Fatal] GPU Tree Learner was not enabled in this build.
Please recompile with CMake option -DUSE_GPU=1
WARNING:lightwood-2759:LightGBM running on CPU, this somewhat slower than the GPU version, consider using a GPU instead
warnings.warn("torch.cuda.amp.GradScaler is enabled, but CUDA is not available.  Disabling.")
INFO:lightwood-2759:Loss of 0.5851023197174072 with learning rate 0.0001
INFO:lightwood-2759:Loss of 0.714859127998352 with learning rate 0.00014
INFO:lightwood-2759:Found learning rate of: 0.0001
INFO:lightwood-2759:Loss @ epoch 1: 0.6456677317619324
INFO:lightwood-2759:Loss @ epoch 2: 0.6410343199968338
INFO:lightwood-2759:Loss @ epoch 3: 0.6354705095291138
INFO:lightwood-2759:Loss @ epoch 4: 0.6291152238845825
INFO:lightwood-2759:Loss @ epoch 5: 0.6220875233411789
INFO:lightwood-2759:Loss @ epoch 6: 0.6145200580358505
INFO:lightwood-2759:Loss @ epoch 7: 0.6013987064361572
INFO:lightwood-2759:Loss @ epoch 8: 0.5925577580928802
INFO:lightwood-2759:Loss @ epoch 9: 0.583435520529747
INFO:lightwood-2759:Loss @ epoch 10: 0.5741045027971268
INFO:lightwood-2759:Loss @ epoch 11: 0.5645481199026108
INFO:lightwood-2759:Loss @ epoch 12: 0.5548548251390457
INFO:lightwood-2759:Loss @ epoch 13: 0.5391151160001755
INFO:lightwood-2759:Loss @ epoch 14: 0.5290990322828293
INFO:lightwood-2759:Loss @ epoch 15: 0.5191140025854111
INFO:lightwood-2759:Loss @ epoch 16: 0.5091893970966339
INFO:lightwood-2759:Loss @ epoch 17: 0.49928827583789825
INFO:lightwood-2759:Loss @ epoch 18: 0.4894482493400574
INFO:lightwood-2759:Loss @ epoch 19: 0.47377973794937134
INFO:lightwood-2759:Loss @ epoch 20: 0.46406178176403046
INFO:lightwood-2759:Loss @ epoch 21: 0.45449112355709076
INFO:lightwood-2759:Loss @ epoch 22: 0.44506753981113434
INFO:lightwood-2759:Loss @ epoch 23: 0.43571387231349945
INFO:lightwood-2759:Loss @ epoch 24: 0.42658860981464386
INFO:lightwood-2759:Loss @ epoch 25: 0.41221652925014496
INFO:lightwood-2759:Loss @ epoch 26: 0.4034354239702225
INFO:lightwood-2759:Loss @ epoch 27: 0.394923135638237
INFO:lightwood-2759:Loss @ epoch 28: 0.3865997791290283
INFO:lightwood-2759:Loss @ epoch 29: 0.37845784425735474
INFO:lightwood-2759:Loss @ epoch 30: 0.37062760442495346
INFO:lightwood-2759:Loss @ epoch 31: 0.35856904089450836
INFO:lightwood-2759:Loss @ epoch 32: 0.3514312580227852
INFO:lightwood-2759:Loss @ epoch 33: 0.3445674926042557
INFO:lightwood-2759:Loss @ epoch 34: 0.33791903406381607
INFO:lightwood-2759:Loss @ epoch 35: 0.33144131302833557
INFO:lightwood-2759:Loss @ epoch 36: 0.3252798691391945
INFO:lightwood-2759:Loss @ epoch 37: 0.31595368683338165
INFO:lightwood-2759:Loss @ epoch 38: 0.31045476347208023
INFO:lightwood-2759:Loss @ epoch 39: 0.30521363765001297
INFO:lightwood-2759:Loss @ epoch 40: 0.3001897260546684
INFO:lightwood-2759:Loss @ epoch 41: 0.2954142168164253
INFO:lightwood-2759:Loss @ epoch 42: 0.29095470905303955
INFO:lightwood-2759:Loss @ epoch 43: 0.28426560014486313
INFO:lightwood-2759:Loss @ epoch 44: 0.2803715690970421
INFO:lightwood-2759:Loss @ epoch 45: 0.27670086175203323
INFO:lightwood-2759:Loss @ epoch 46: 0.27328842878341675
INFO:lightwood-2759:Loss @ epoch 47: 0.2699623182415962
INFO:lightwood-2759:Loss @ epoch 48: 0.2668989822268486
INFO:lightwood-2759:Loss @ epoch 49: 0.26228372007608414
INFO:lightwood-2759:Loss @ epoch 50: 0.259634830057621
INFO:lightwood-2759:Loss @ epoch 51: 0.25711600482463837
INFO:lightwood-2759:Loss @ epoch 52: 0.2546936720609665
INFO:lightwood-2759:Loss @ epoch 53: 0.25234681367874146
INFO:lightwood-2759:Loss @ epoch 54: 0.2503012418746948
INFO:lightwood-2759:Loss @ epoch 55: 0.24729573726654053
INFO:lightwood-2759:Loss @ epoch 56: 0.2454727664589882
INFO:lightwood-2759:Loss @ epoch 57: 0.24366793036460876
INFO:lightwood-2759:Loss @ epoch 58: 0.24181383103132248
INFO:lightwood-2759:Loss @ epoch 59: 0.239965058863163
INFO:lightwood-2759:Loss @ epoch 60: 0.23839683085680008
INFO:lightwood-2759:Loss @ epoch 61: 0.23602312058210373
INFO:lightwood-2759:Loss @ epoch 62: 0.23456629365682602
INFO:lightwood-2759:Loss @ epoch 63: 0.23311389982700348
INFO:lightwood-2759:Loss @ epoch 64: 0.2316449210047722
INFO:lightwood-2759:Loss @ epoch 65: 0.23032782971858978
INFO:lightwood-2759:Loss @ epoch 66: 0.22926552593708038
INFO:lightwood-2759:Loss @ epoch 67: 0.22766253352165222
INFO:lightwood-2759:Loss @ epoch 68: 0.22664203494787216
INFO:lightwood-2759:Loss @ epoch 69: 0.22564993798732758
INFO:lightwood-2759:Loss @ epoch 70: 0.22471969574689865
INFO:lightwood-2759:Loss @ epoch 71: 0.22381772845983505
INFO:lightwood-2759:Loss @ epoch 72: 0.22311770170927048
INFO:lightwood-2759:Loss @ epoch 73: 0.2221347540616989
INFO:lightwood-2759:Loss @ epoch 74: 0.22143390774726868
INFO:lightwood-2759:Loss @ epoch 75: 0.22074797749519348
INFO:lightwood-2759:Loss @ epoch 76: 0.2201334834098816
INFO:lightwood-2759:Loss @ epoch 77: 0.21961654722690582
INFO:lightwood-2759:Loss @ epoch 78: 0.21918465197086334
INFO:lightwood-2759:Loss @ epoch 79: 0.21856776624917984
INFO:lightwood-2759:Loss @ epoch 80: 0.21822092682123184
INFO:lightwood-2759:Loss @ epoch 81: 0.2179049775004387
INFO:lightwood-2759:Loss @ epoch 82: 0.21762075275182724
INFO:lightwood-2759:Loss @ epoch 83: 0.2173527553677559
INFO:lightwood-2759:Loss @ epoch 84: 0.21712396293878555
INFO:lightwood-2759:Loss @ epoch 85: 0.21683821082115173
INFO:lightwood-2759:Loss @ epoch 86: 0.2166566252708435
INFO:lightwood-2759:Loss @ epoch 87: 0.21646138280630112
INFO:lightwood-2759:Loss @ epoch 88: 0.21623966097831726
INFO:lightwood-2759:Loss @ epoch 89: 0.21599581092596054
INFO:lightwood-2759:Loss @ epoch 90: 0.21577569097280502
INFO:lightwood-2759:Loss @ epoch 91: 0.2154984548687935
INFO:lightwood-2759:Loss @ epoch 92: 0.21529610455036163
INFO:lightwood-2759:Loss @ epoch 93: 0.21512483805418015
INFO:lightwood-2759:Loss @ epoch 94: 0.21490700542926788
INFO:lightwood-2759:Loss @ epoch 95: 0.21468307822942734
INFO:lightwood-2759:Loss @ epoch 96: 0.21451719105243683
INFO:lightwood-2759:Loss @ epoch 97: 0.21431321650743484
INFO:lightwood-2759:Loss @ epoch 98: 0.2141672745347023
INFO:lightwood-2759:Loss @ epoch 99: 0.21403365582227707
INFO:lightwood-2759:Loss @ epoch 100: 0.21386437118053436
INFO:lightwood-2759:Loss @ epoch 101: 0.213673397898674
INFO:lightwood-2759:Loss @ epoch 102: 0.21355227380990982
INFO:lightwood-2759:Loss @ epoch 103: 0.21344823390245438
INFO:lightwood-2759:Loss @ epoch 104: 0.21333791315555573
INFO:lightwood-2759:Loss @ epoch 105: 0.21318786591291428
INFO:lightwood-2759:Loss @ epoch 106: 0.2130158394575119
INFO:lightwood-2759:Loss @ epoch 107: 0.2127927914261818
INFO:lightwood-2759:Loss @ epoch 108: 0.21261801570653915
INFO:lightwood-2759:Loss @ epoch 109: 0.21250233799219131
INFO:lightwood-2759:Loss @ epoch 110: 0.21234575659036636
INFO:lightwood-2759:Loss @ epoch 111: 0.21219460666179657
INFO:lightwood-2759:Loss @ epoch 112: 0.21191512793302536
INFO:lightwood-2759:Loss @ epoch 113: 0.2116353064775467
INFO:lightwood-2759:Loss @ epoch 114: 0.2114258110523224
INFO:lightwood-2759:Loss @ epoch 115: 0.21123968064785004
INFO:lightwood-2759:Loss @ epoch 116: 0.21106985211372375
INFO:lightwood-2759:Loss @ epoch 117: 0.21091864258050919
INFO:lightwood-2759:Loss @ epoch 118: 0.2106359377503395
INFO:lightwood-2759:Loss @ epoch 119: 0.21038822084665298
INFO:lightwood-2759:Loss @ epoch 120: 0.21020638942718506
INFO:lightwood-2759:Loss @ epoch 121: 0.20994381606578827
INFO:lightwood-2759:Loss @ epoch 122: 0.20987889915704727
INFO:lightwood-2759:Loss @ epoch 123: 0.209790401160717
INFO:lightwood-2759:Loss @ epoch 124: 0.20942267775535583
INFO:lightwood-2759:Loss @ epoch 125: 0.20917751640081406
INFO:lightwood-2759:Loss @ epoch 126: 0.20900005847215652
INFO:lightwood-2759:Loss @ epoch 127: 0.20878595858812332
INFO:lightwood-2759:Loss @ epoch 128: 0.20859776437282562
INFO:lightwood-2759:Loss @ epoch 129: 0.20846472680568695
INFO:lightwood-2759:Loss @ epoch 130: 0.2082698568701744
INFO:lightwood-2759:Loss @ epoch 131: 0.20808269083499908
INFO:lightwood-2759:Loss @ epoch 132: 0.20790109783411026
INFO:lightwood-2759:Loss @ epoch 133: 0.20763012021780014
INFO:lightwood-2759:Loss @ epoch 134: 0.20749341696500778
INFO:lightwood-2759:Loss @ epoch 135: 0.20773321390151978
INFO:lightwood-2759:Loss @ epoch 1: 0.15921965376897293
INFO:lightwood-2759:Loss @ epoch 2: 0.16152001240036704
INFO:lightwood-2759:Loss @ epoch 3: 0.1614339527758685
INFO:lightwood-2759:Loss @ epoch 4: 0.1622994515028867
INFO:lightwood-2759:Loss @ epoch 5: 0.16224708340384744
DEBUG:lightwood-2759: fit_mixer runtime: 14.22 seconds
INFO:lightwood-2759:Started fitting LGBM models for array prediction
INFO:lightwood-2759:Started fitting LGBM model
/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/lightgbm/engine.py:177: UserWarning: Found num_iterations in params. Will use it instead of argument
_log_warning(f"Found {alias} in params. Will use it instead of argument")
/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/lightgbm/engine.py:240: UserWarning: 'verbose_eval' argument is deprecated and will be removed in a future release of LightGBM. Pass 'log_evaluation()' callback via 'callbacks' argument instead.
_log_warning("'verbose_eval' argument is deprecated and will be removed in a future release of LightGBM. "
INFO:lightwood-2759:A single GBM iteration takes 0.1 seconds
INFO:lightwood-2759:Training GBM (<module 'lightgbm' from '/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/lightgbm/__init__.py'>) with 38015 iterations given 4751.87961602211 seconds constraint
INFO:lightwood-2759:Lightgbm model contains 25 weak estimators
INFO:lightwood-2759:Updating lightgbm model with 1.5 iterations
INFO:lightwood-2759:Model now has a total of 26 weak estimators
/home/runner/work/lightwood/lightwood/lightwood/mixer/lightgbm_array.py:59: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
train_data.data_frame[self.target] = train_data.data_frame[f'{self.target}_timestep_{timestep}']
/home/runner/work/lightwood/lightwood/lightwood/mixer/lightgbm_array.py:60: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
dev_data.data_frame[self.target] = dev_data.data_frame[f'{self.target}_timestep_{timestep}']
INFO:lightwood-2759:Started fitting LGBM model
INFO:lightwood-2759:A single GBM iteration takes 0.1 seconds
INFO:lightwood-2759:Training GBM (<module 'lightgbm' from '/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/lightgbm/__init__.py'>) with 38015 iterations given 4751.888533830643 seconds constraint
INFO:lightwood-2759:Lightgbm model contains 19 weak estimators
INFO:lightwood-2759:Updating lightgbm model with 1 iterations
INFO:lightwood-2759:Model now has a total of 20 weak estimators
INFO:lightwood-2759:Started fitting LGBM model
INFO:lightwood-2759:A single GBM iteration takes 0.1 seconds
INFO:lightwood-2759:Training GBM (<module 'lightgbm' from '/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/lightgbm/__init__.py'>) with 38015 iterations given 4751.887797355652 seconds constraint
INFO:lightwood-2759:Lightgbm model contains 15 weak estimators
INFO:lightwood-2759:Updating lightgbm model with 1 iterations
INFO:lightwood-2759:Model now has a total of 16 weak estimators
INFO:lightwood-2759:Started fitting LGBM model
INFO:lightwood-2759:A single GBM iteration takes 0.1 seconds
INFO:lightwood-2759:Training GBM (<module 'lightgbm' from '/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/lightgbm/__init__.py'>) with 38015 iterations given 4751.886675834656 seconds constraint
INFO:lightwood-2759:Lightgbm model contains 15 weak estimators
INFO:lightwood-2759:Updating lightgbm model with 1 iterations
INFO:lightwood-2759:Model now has a total of 16 weak estimators
INFO:lightwood-2759:Started fitting LGBM model
INFO:lightwood-2759:A single GBM iteration takes 0.1 seconds
INFO:lightwood-2759:Training GBM (<module 'lightgbm' from '/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/lightgbm/__init__.py'>) with 38015 iterations given 4751.885835170746 seconds constraint
INFO:lightwood-2759:Lightgbm model contains 14 weak estimators
INFO:lightwood-2759:Updating lightgbm model with 1 iterations
INFO:lightwood-2759:Model now has a total of 15 weak estimators
INFO:lightwood-2759:Started fitting LGBM model
INFO:lightwood-2759:A single GBM iteration takes 0.1 seconds
INFO:lightwood-2759:Training GBM (<module 'lightgbm' from '/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/lightgbm/__init__.py'>) with 38015 iterations given 4751.886430501938 seconds constraint
INFO:lightwood-2759:Lightgbm model contains 11 weak estimators
INFO:lightwood-2759:Updating lightgbm model with 1 iterations
INFO:lightwood-2759:Model now has a total of 12 weak estimators
/home/runner/work/lightwood/lightwood/lightwood/mixer/lightgbm_array.py:65: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
train_data.data_frame[self.target] = original_target_train
/home/runner/work/lightwood/lightwood/lightwood/mixer/lightgbm_array.py:66: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
dev_data.data_frame[self.target] = original_target_dev
DEBUG:lightwood-2759: fit_mixer runtime: 1.93 seconds
INFO:lightwood-2759:Started fitting sktime forecaster for array prediction
/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/statsforecast/arima.py:878: UserWarning: possible convergence problem: minimize gave code 2]
warnings.warn(
/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/scipy/optimize/_numdiff.py:557: RuntimeWarning: invalid value encountered in subtract
df = fun(x) - f0
DEBUG:lightwood-2759: fit_mixer runtime: 18.05 seconds
INFO:lightwood-2759:Ensembling the mixer
INFO:lightwood-2759:Mixer: Neural got accuracy: 0.4888339079586589
WARNING:lightwood-2759:This model does not output probability estimates
INFO:lightwood-2759:Mixer: LightGBMArray got accuracy: 0.3777388112765242
WARNING:lightwood-2759:This mixer does not output probability estimates
INFO:lightwood-2759:Mixer: SkTime got accuracy: 0.20665332782785042
INFO:lightwood-2759:Picked best mixer: Neural
DEBUG:lightwood-2759: fit runtime: 37.13 seconds
INFO:lightwood-2759:[Learn phase 7/8] - Ensemble analysis
INFO:lightwood-2759:Analyzing the ensemble of mixers
INFO:lightwood-2759:The block ICP is now running its analyze() method
INFO:lightwood-2759:The block AccStats is now running its analyze() method
INFO:lightwood-2759:The block ConfStats is now running its analyze() method
DEBUG:lightwood-2759: analyze_ensemble runtime: 0.31 seconds
INFO:lightwood-2759:[Learn phase 8/8] - Adjustment on validation requested
INFO:lightwood-2759:Updating the mixers
warnings.warn("torch.cuda.amp.GradScaler is enabled, but CUDA is not available.  Disabling.")
INFO:lightwood-2759:Loss @ epoch 1: 0.16297179087996483
INFO:lightwood-2759:Loss @ epoch 2: 0.16293908779819807
INFO:lightwood-2759:Loss @ epoch 3: 0.16309100513656935
INFO:lightwood-2759:Loss @ epoch 4: 0.16299042478203773
INFO:lightwood-2759:Loss @ epoch 5: 0.16329201807578406
INFO:lightwood-2759:Updating array of LGBM models...
INFO:lightwood-2759:Updating lightgbm model with 1.5 iterations
/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/lightgbm/engine.py:177: UserWarning: Found num_iterations in params. Will use it instead of argument
_log_warning(f"Found {alias} in params. Will use it instead of argument")
/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/lightgbm/engine.py:240: UserWarning: 'verbose_eval' argument is deprecated and will be removed in a future release of LightGBM. Pass 'log_evaluation()' callback via 'callbacks' argument instead.
_log_warning("'verbose_eval' argument is deprecated and will be removed in a future release of LightGBM. "
INFO:lightwood-2759:Model now has a total of 27 weak estimators
/home/runner/work/lightwood/lightwood/lightwood/mixer/lightgbm_array.py:75: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
train_data.data_frame[self.target] = train_data.data_frame[f'{self.target}_timestep_{timestep}']
INFO:lightwood-2759:Updating lightgbm model with 1 iterations
INFO:lightwood-2759:Model now has a total of 21 weak estimators
INFO:lightwood-2759:Updating lightgbm model with 1 iterations
INFO:lightwood-2759:Model now has a total of 17 weak estimators
INFO:lightwood-2759:Updating lightgbm model with 1 iterations
INFO:lightwood-2759:Model now has a total of 17 weak estimators
INFO:lightwood-2759:Updating lightgbm model with 1 iterations
INFO:lightwood-2759:Model now has a total of 16 weak estimators
INFO:lightwood-2759:Updating lightgbm model with 1 iterations
INFO:lightwood-2759:Model now has a total of 13 weak estimators
/home/runner/work/lightwood/lightwood/lightwood/mixer/lightgbm_array.py:81: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
train_data.data_frame[self.target] = original_target_train
INFO:lightwood-2759:Started fitting sktime forecaster for array prediction
/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/statsforecast/arima.py:878: UserWarning: possible convergence problem: minimize gave code 2]
warnings.warn(
/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/scipy/optimize/_numdiff.py:557: RuntimeWarning: invalid value encountered in subtract
df = fun(x) - f0
DEBUG:lightwood-2759: adjust runtime: 2.3 seconds
DEBUG:lightwood-2759: learn runtime: 51.88 seconds


## Predict¶

Once the predictor has finished training, we can use it to generate a 6-month forecast for each row in the test set:

[7]:

forecasts = predictor.predict(test)

INFO:lightwood-2759:[Predict phase 1/4] - Data preprocessing
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
data[col] = [None] * len(data)
INFO:lightwood-2759:Cleaning the data
INFO:lightwood-2759:Transforming timeseries data
DEBUG:lightwood-2759: preprocess runtime: 1.27 seconds
INFO:lightwood-2759:[Predict phase 2/4] - Feature generation
INFO:lightwood-2759:Featurizing the data
DEBUG:lightwood-2759: featurize runtime: 0.0 seconds
INFO:lightwood-2759:[Predict phase 3/4] - Calling ensemble
INFO:lightwood-2759:[Predict phase 4/4] - Analyzing output
/opt/hostedtoolcache/Python/3.9.13/x64/lib/python3.9/site-packages/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)
INFO:lightwood-2759:The block ICP is now running its explain() method
INFO:lightwood-2759:The block AccStats is now running its explain() method
INFO:lightwood-2759:AccStats.explain() has not been implemented, no modifications will be done to the data insights.
INFO:lightwood-2759:The block ConfStats is now running its explain() method
INFO:lightwood-2759:ConfStats.explain() has not been implemented, no modifications will be done to the data insights.
DEBUG:lightwood-2759: predict runtime: 3.17 seconds


Let’s inspect a single forecast row:

[8]:

forecasts.iloc[[10]]

[8]:

original_index prediction order_Month confidence lower upper anomaly
10 10 [63.586027429849636, 65.84454074193691, 69.263... [-271641600.0, -268963200.0, -266284800.0, -26... [0.59, 0.59, 0.59, 0.59, 0.59, 0.59] [51.32310951724612, 51.51105828263548, 53.1587... [75.84894534245316, 80.17802320123833, 85.3675... False

You’ll note that each point prediction comes with associated lower and upper bounds, whose width is a function of the confidence the model estimates for its own output. Additionally, order_Month contains the timestamp of each forecasted step, and the anomaly flag tells you whether the observed value falls outside of the predicted region.
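The order_Month timestamps are emitted as Unix epoch seconds (negative here, since these dates predate 1970). As a quick aside — not part of the Lightwood API, just standard-library Python — here is one way you might turn them into readable year-month labels, using the first few values from the row above:

```python
from datetime import datetime, timedelta, timezone

# First three 'order_Month' timestamps from the forecast row shown above
timestamps = [-271641600.0, -268963200.0, -266284800.0]

# Unix timestamps count seconds from the 1970-01-01 epoch
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
dates = [(epoch + timedelta(seconds=ts)).strftime('%Y-%m') for ts in timestamps]
print(dates)  # ['1961-05', '1961-06', '1961-07']
```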

## Visualizing a forecast¶

Time series are much easier to appreciate through plots, so let’s make one:

NOTE: We will use matplotlib to generate a simple plot of these forecasts. If you want to run this notebook locally, you will need to pip install matplotlib for the following code to work.

[9]:

import matplotlib.pyplot as plt

[10]:

plt.figure(figsize=(12, 8))

# matplotlib skips None values, so padding the forecast with one None per
# historical row shifts the forecast so it starts where the observed data ends
plt.plot([None for _ in range(forecasts.shape[0])] + forecasts.iloc[-1]['prediction'], color='purple', label='point prediction')
plt.plot([None for _ in range(forecasts.shape[0])] + forecasts.iloc[-1]['lower'], color='grey', label='confidence bounds')
plt.plot([None for _ in range(forecasts.shape[0])] + forecasts.iloc[-1]['upper'], color='grey')
plt.xlabel('timestep')
plt.ylabel('# sunspots')
plt.title("Forecasted amount of sunspots for the next semester")
plt.legend()
plt.show()
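The None-padding used when plotting deserves a brief note: matplotlib simply skips None entries, so prefixing the forecast list with as many Nones as there are historical rows shifts the forecast curve to begin right where the observed series ends. Seen in isolation (with made-up numbers in place of the real forecast values):

```python
# Illustrative values only, standing in for the observed series and the forecast
history = [55.8, 33.3, 33.4]   # last observed sunspot counts (hypothetical)
forecast = [63.6, 65.8, 69.3]  # a 3-step forecast (hypothetical)

# One None per historical point: matplotlib ignores them, so the forecast
# would be drawn starting at x = len(history)
padded = [None] * len(history) + forecast
print(padded)  # [None, None, None, 63.6, 65.8, 69.3]
```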


## Conclusion¶

In this tutorial, we covered how to train a machine learning model with Lightwood to produce forecasts for a univariate time series task.

There are additional parameters to further customize your time series settings and/or prediction insights, so be sure to check the rest of the documentation.
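As one illustration, Lightwood's timeseries settings accept a group_by argument for datasets that contain several related series in one table (e.g. one series per region). A minimal sketch of such a settings dict, assuming a hypothetical 'country' column that our sunspots dataset does not actually have:

```python
# Sketch of a richer timeseries_settings dict (values are illustrative).
# 'group_by' tells Lightwood the data holds multiple independent series,
# identified by the listed column(s) -- here a hypothetical 'country' column.
tss = {
    'horizon': 6,            # forecast 6 steps ahead, as in this tutorial
    'order_by': ['Month'],   # column that orders each series in time
    'window': 12,            # past values considered for each prediction
    'group_by': ['country'], # hypothetical grouping column for panel data
}

# It would then be passed exactly as in the setup above, e.g.:
# pdef = ProblemDefinition.from_dict({'target': 'Sunspots', 'timeseries_settings': tss})
```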