Models

PyTorch Models for Sequential Data
# Assumes the library's top-level imports are in scope (e.g. from tsfast.basics import *).
from nbdev.config import get_config

# Locate the bundled Wiener-Hammerstein test data relative to the project root.
project_root = get_config().config_file.parent
f_path = project_root / 'test_data/WienerHammerstein'

# Build windowed sequence dataloaders: inputs are the 'u' and shifted 'y' columns,
# targets the shifted 'y' column; windows are 101 samples long with a stride of 100.
seq = DataBlock(blocks=(SequenceBlock.from_hdf(['u','y'],TensorSequencesInput,clm_shift=[0,-1]),
                        SequenceBlock.from_hdf(['y'],TensorSequencesOutput,clm_shift=[-1])),
                 get_items=CreateDict([DfHDFCreateWindows(win_sz=100+1,stp_sz=100,clm='u')]),
                 splitter=ApplyToDict(ParentSplitter()))
db = seq.dataloaders(get_hdf_files(f_path))
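
A quick way to check what the loaders yield; the exact batch size and channel counts depend on the window settings above:

xb, yb = db.one_batch()
xb.shape, yb.shape  # e.g. (batch, seq_len, 2) inputs and (batch, seq_len, 1) targets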

Batchnorm


source

BatchNorm_1D_Stateful

 BatchNorm_1D_Stateful (hidden_size, seq_len=None, stateful=False,
                        batch_first=True, eps=1e-07, momentum=0.1,
                        affine=True, track_running_stats=True)

Batchnorm for stateful models. Stores batch statistics for every timestep separately to mitigate transient effects.

                     Type      Default  Details
hidden_size                             num_features
seq_len              NoneType  None
stateful             bool      False
batch_first          bool      True
eps                  float     1e-07
momentum             float     0.1
affine               bool      True
track_running_stats  bool      True
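
A minimal sketch of applying the layer to batch-first sequence data; the tensor layout follows the signature above, and the concrete sizes are illustrative:

import torch

bn = BatchNorm_1D_Stateful(hidden_size=8, seq_len=100, stateful=True)
x = torch.randn(16, 100, 8)  # (batch, seq_len, hidden_size) since batch_first=True
out = bn(x)                  # same shape as x, normalized with per-timestep statistics
assert out.shape == x.shape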

Linear


source

SeqLinear

 SeqLinear (input_size, output_size, hidden_size=100, hidden_layer=1,
            act=<class 'torch.nn.modules.activation.Mish'>,
            batch_first=True)

Fully connected network applied independently to every timestep of a sequence: linear layers of width hidden_size with act activations map the input_size inputs to output_size outputs (batch-first by default).
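
A sketch of the expected tensor interface, assuming the module maps every timestep independently (sizes are illustrative):

import torch

lin = SeqLinear(input_size=2, output_size=1, hidden_size=100, hidden_layer=1)
x = torch.randn(16, 100, 2)  # (batch, seq_len, input_size) since batch_first=True
y = lin(x)                   # expected shape: (16, 100, 1)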

Autoregressive Models


source

Normalizer1D

 Normalizer1D (mean, std)

Module that scales sequence data with fixed, precomputed mean and std statistics.
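
Normalizer1D is constructed from fixed statistics; in this notebook it is normally set up for you by AR_Model.init_normalize below. A constructor-only sketch with placeholder statistics:

import torch

norm = Normalizer1D(mean=torch.zeros(1), std=torch.ones(1))  # placeholder per-channel stats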


source

AR_Model

 AR_Model (model, ar=True, stateful=False, model_has_state=False,
           return_state=False, out_sz=None)

Autoregressive model container which works autoregressively if the sequence y is not provided; otherwise it works as a normal model. This way it can be trained either with teacher forcing or with autoregression.

model = AR_Model(SeqLinear(3,1),model_has_state=False,ar=True,out_sz=1)
model.init_normalize(db.one_batch())  # initialize output normalization from a batch
lrn = Learner(db,model,loss_func=nn.MSELoss())
lrn.fit(1)
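
Once fitted, the same container runs autoregressively when called without a target sequence; a hedged sketch (argument handling follows the docstring above, details may differ):

xb, _ = db.one_batch()
preds = model(xb)  # no y provided, so the model feeds back its own predictions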