MLPModel#
- class MLPModel(input_size: int, decoder_length: int, hidden_size: List, encoder_length: int = 0, lr: float = 0.001, loss: Module | None = None, train_batch_size: int = 16, test_batch_size: int = 16, optimizer_params: dict | None = None, trainer_params: dict | None = None, train_dataloader_params: dict | None = None, test_dataloader_params: dict | None = None, val_dataloader_params: dict | None = None, split_params: dict | None = None)[source]#
Bases:
DeepBaseModel
MLPModel.
Note

This model requires torch extension to be installed. Read more about this at installation page.

Init MLP model.
- Parameters:
input_size (int) – size of the input feature space: target plus extra features
decoder_length (int) – decoder length
hidden_size (List) – List of sizes of the hidden states
encoder_length (int) – encoder length
lr (float) – learning rate
loss (torch.nn.Module | None) – loss function, MSELoss by default
train_batch_size (int) – batch size for training
test_batch_size (int) – batch size for testing
optimizer_params (dict | None) – parameters for Adam optimizer (api reference torch.optim.Adam)
trainer_params (dict | None) – PyTorch Lightning trainer parameters (api reference pytorch_lightning.trainer.trainer.Trainer)
train_dataloader_params (dict | None) – parameters for train dataloader, e.g. sampler (api reference torch.utils.data.DataLoader)
test_dataloader_params (dict | None) – parameters for test dataloader
val_dataloader_params (dict | None) – parameters for validation dataloader
split_params (dict | None) – dictionary with parameters for torch.utils.data.random_split() for train-test splitting
train_size: (float) value from 0 to 1 – fraction of samples to use for training
generator: (Optional[torch.Generator]) – generator for reproducible train-test splitting
torch_dataset_size: (Optional[int]) – number of samples in the dataset, in case the dataset does not implement __len__
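As a minimal sketch of two of these parameters (the values below are illustrative, not library defaults): input_size counts the target series plus any extra features, and split_params is a plain dictionary with the keys described above.

```python
# Illustrative values only, not library defaults.

# input_size: the target series plus extra features
n_extra_features = 2                # e.g. two exogenous regressors
input_size = 1 + n_extra_features   # target + extras -> 3

# split_params: plain dict with the keys described above;
# "generator" and "torch_dataset_size" are optional.
split_params = {
    "train_size": 0.8,           # fraction of samples used for training
    "torch_dataset_size": 1000,  # only needed if the dataset lacks __len__
}
```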
Methods
fit(ts) – Fit model.
forecast(ts, prediction_size[, ...]) – Make predictions.
get_model() – Get model.
load(path) – Load an object.
params_to_tune() – Get default grid for tuning hyperparameters.
predict(ts, prediction_size[, return_components]) – Make predictions.
raw_fit(torch_dataset) – Fit model on torch like Dataset.
raw_predict(torch_dataset) – Make inference on torch like Dataset.
save(path) – Save the object.
set_params(**params) – Return new object instance with modified parameters.
to_dict() – Collect all information about etna object in dict.
Attributes
This class stores its __init__ parameters as attributes.

context_size – Context size of the model.
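As a toy illustration (not etna internals) of how encoder_length and decoder_length carve a training window out of a series: the first encoder_length points serve as context and the next decoder_length points are the prediction targets.

```python
# Toy sketch, assuming the usual encoder/decoder window split;
# values are illustrative.
encoder_length, decoder_length = 4, 2
series = list(range(10))

window = series[: encoder_length + decoder_length]
encoder_part = window[:encoder_length]  # context fed to the model
decoder_part = window[encoder_length:]  # targets to predict

print(encoder_part)  # [0, 1, 2, 3]
print(decoder_part)  # [4, 5]
```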
- fit(ts: TSDataset) DeepBaseModel [source]#
Fit model.
- Parameters:
ts (TSDataset) – TSDataset with features
- Returns:
Model after fit
- Return type:
DeepBaseModel
- forecast(ts: TSDataset, prediction_size: int, return_components: bool = False) TSDataset [source]#
Make predictions.
This method will make autoregressive predictions.
- Parameters:
ts (TSDataset) – Dataset with features
prediction_size (int) – Number of last timestamps to leave after making prediction. Previous timestamps will be used as a context for models that require it.
return_components (bool) – If True additionally returns forecast components
- Returns:
Dataset with predictions
- Return type:
TSDataset
- classmethod load(path: Path) Self [source]#
Load an object.
- Parameters:
path (Path) – Path to load object from.
- Returns:
Loaded object.
- Return type:
Self
- params_to_tune() Dict[str, BaseDistribution] [source]#
Get default grid for tuning hyperparameters.
This grid tunes parameters: lr and hidden_size.i, where i is from 0 to len(hidden_size) - 1. Other parameters are expected to be set by the user.
- Returns:
Grid to tune.
- Return type:
Dict[str, BaseDistribution]
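Assuming the naming scheme in the docstring above, the keys of the returned grid can be sketched for a concrete hidden_size (this reconstructs the names only, without calling etna itself):

```python
# Illustrative reconstruction of the grid's parameter names,
# based on the docstring's naming scheme (an assumption).
hidden_size = [64, 32]
tunable = ["lr"] + [f"hidden_size.{i}" for i in range(len(hidden_size))]
print(tunable)  # ['lr', 'hidden_size.0', 'hidden_size.1']
```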
- predict(ts: TSDataset, prediction_size: int, return_components: bool = False) TSDataset [source]#
Make predictions.
This method will make predictions using true values instead of predicted on a previous step. It can be useful for making in-sample forecasts.
- Parameters:
ts (TSDataset) – Dataset with features
prediction_size (int) – Number of last timestamps to leave after making prediction. Previous timestamps will be used as a context.
return_components (bool) – If True additionally returns prediction components
- Returns:
Dataset with predictions
- Return type:
TSDataset
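The difference between forecast() and predict() can be shown with a toy one-step model (this is not the etna implementation, just an illustration of autoregressive versus in-sample prediction): forecast() feeds each prediction back in as input, while predict() uses the true observed value at every step.

```python
# Toy stand-in model: predicts "last value + 1".
truth = [1, 5, 2, 8]

def step(last):
    return last + 1

# forecast(): autoregressive -- each prediction feeds the next step
ar = []
last = truth[0]
for _ in range(3):
    last = step(last)
    ar.append(last)

# predict(): uses the true value at each step (in-sample)
insample = [step(truth[i]) for i in range(3)]

print(ar)        # [2, 3, 4]
print(insample)  # [2, 6, 3]
```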
- raw_fit(torch_dataset: Dataset) DeepBaseModel [source]#
Fit model on torch like Dataset.
- Parameters:
torch_dataset (Dataset) – Torch like dataset for model fit
- Returns:
Model after fit
- Return type:
DeepBaseModel
- raw_predict(torch_dataset: Dataset) Dict[Tuple[str, str], ndarray] [source]#
Make inference on torch like Dataset.
- Parameters:
torch_dataset (Dataset) – Torch like dataset for model inference
- Returns:
Dictionary with predictions
- Return type:
Dict[Tuple[str, str], ndarray]
- set_params(**params: dict) Self [source]#
Return new object instance with modified parameters.
The method also allows changing parameters of nested objects within the current object. For example, it is possible to change parameters of a model in a Pipeline.
Nested parameters are expected to be in a <component_1>.<...>.<parameter> form, where components are separated by dots.
- Parameters:
**params (dict) – Estimator parameters
- Returns:
New instance with changed parameters
- Return type:
Self
Examples
>>> from etna.pipeline import Pipeline
>>> from etna.models import NaiveModel
>>> from etna.transforms import AddConstTransform
>>> model = NaiveModel(lag=1)
>>> transforms = [AddConstTransform(in_column="target", value=1)]
>>> pipeline = Pipeline(model, transforms=transforms, horizon=3)
>>> pipeline.set_params(**{"model.lag": 3, "transforms.0.value": 2})
Pipeline(model = NaiveModel(lag = 3, ), transforms = [AddConstTransform(in_column = 'target', value = 2, inplace = True, out_column = None, )], horizon = 3, )