from fastai.gen_doc.nbdoc import *
from fastai.tabular.models import TabularModel
show_doc(TabularModel)
`emb_szs` matches each categorical variable's cardinality with an embedding size, and `n_cont` is the number of continuous variables. The model consists of `Embedding` layers for the categorical variables, followed by a `Dropout` of `emb_drop`, and a `BatchNorm` for the continuous variables. The results are concatenated and followed by blocks of `BatchNorm`, `Dropout`, `Linear` and `ReLU` (the first block skips `BatchNorm` and `Dropout`, the last block skips the `ReLU`).

The sizes of the blocks are given in `layers` and the dropout probabilities in `ps`. The last size is `out_sz`, and we add a final activation that is a sigmoid rescaled to cover `y_range` (if it's not `None`). Lastly, if `use_bn` is set to `False`, all `BatchNorm` layers are skipped except the one applied to the continuous variables.

Generally it's easiest to just create a learner with `tabular_learner`, which will automatically create a `TabularModel` for you, but the model can also be instantiated directly, as sketched below.
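A minimal sketch of direct construction, following the fastai v1 signature; the cardinalities, embedding sizes and layer widths here are made up for illustration:

```python
from fastai.tabular.models import TabularModel

# Two categorical variables with cardinalities 10 and 7, embedded into
# 5- and 4-dimensional vectors, plus 3 continuous variables.
emb_szs = [(10, 5), (7, 4)]
model = TabularModel(emb_szs=emb_szs, n_cont=3, out_sz=2,
                     layers=[200, 100], ps=[0.1, 0.1], emb_drop=0.05,
                     y_range=None, use_bn=True)
print(model)  # shows the Embedding/Dropout/BatchNorm/Linear stack described above
```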
show_doc(TabularModel.forward)
`forward`(`x_cat`:`Tensor`, `x_cont`:`Tensor`) → `Tensor` [source]
Defines the computation performed at every call. Should be overridden by all subclasses.

Note: although the recipe for the forward pass needs to be defined within this function, one should call the `Module` instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
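For example, continuing with the hypothetical model above (the batch size and index ranges are assumptions for illustration), calling the module itself runs `forward` plus any registered hooks:

```python
import torch

# Batch of 8 rows: two categorical columns holding embedding indices
# (each index must be smaller than that column's cardinality) and
# three continuous columns.
x_cat = torch.randint(0, 7, (8, 2))
x_cont = torch.randn(8, 3)

out = model(x_cat, x_cont)  # call the Module itself, not model.forward
print(out.shape)            # torch.Size([8, 2]) -- out_sz was 2
```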
show_doc(TabularModel.get_sizes)
`get_sizes`(`layers`, `out_sz`) [source]
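`get_sizes` has no docstring; as far as the v1 source goes, it simply builds the list of block widths by prepending the concatenated embedding-plus-continuous width to `layers` and appending `out_sz`. A quick check with the hypothetical model above:

```python
# n_emb = 5 + 4 = 9 embedding outputs, n_cont = 3 continuous inputs.
sizes = model.get_sizes([200, 100], out_sz=2)
print(sizes)  # [12, 200, 100, 2]
```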