LCWALitModule
- class LCWALitModule(dataset='nations', dataset_kwargs=None, mode=None, model='distmult', model_kwargs=None, batch_size=32, learning_rate=0.001, label_smoothing=0.0, optimizer=None, optimizer_kwargs=None)[source]
Bases:
LitModule
A PyTorch Lightning module for training a model with the LCWA training loop.
Create the lightning module.
- Parameters:
  - dataset (Union[str, Dataset, Type[Dataset], None]) – the dataset, or a hint thereof
  - dataset_kwargs (Optional[Mapping[str, Any]]) – additional keyword-based parameters passed to the dataset
  - mode (Optional[Literal['training', 'validation', 'testing']]) – the inductive mode; defaults to transductive training
  - model (Union[str, Model, Type[Model], None]) – the model, or a hint thereof
  - model_kwargs (Optional[Mapping[str, Any]]) – additional keyword-based parameters passed to the model
  - batch_size (int) – the training batch size
  - learning_rate (float) – the learning rate
  - label_smoothing (float) – the label smoothing
  - optimizer (Union[str, Optimizer, Type[Optimizer], None]) – the optimizer, or a hint thereof
  - optimizer_kwargs (Optional[Mapping[str, Any]]) – additional keyword-based parameters passed to the optimizer; should not contain lr or params.