PyTorch Lightning Integration
PyTorch Lightning offers an alternative way to implement the training and evaluation loops for knowledge graph embedding models, and it brings some useful features (see the trainer configurations below):

- mixed precision training
- multi-GPU training
import pytorch_lightning

from pykeen.contrib.lightning import LitLCWAModule  # the Lightning wrapper documented on this page

# configure the dataset, the interaction model, and the training approach
model = LitLCWAModule(
    dataset="fb15k237",
    dataset_kwargs=dict(create_inverse_triples=True),
    model="mure",
    model_kwargs=dict(embedding_dim=128, loss="bcewithlogits"),
    batch_size=128,
)

# configure the Lightning trainer
trainer = pytorch_lightning.Trainer(
    accelerator="auto",  # automatically choose accelerator
    logger=False,  # defaults to TensorBoard; explicitly disabled here
    precision=16,  # mixed precision training
)

# run the training loop
trainer.fit(model=model)
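The two features listed above are enabled through the trainer configuration rather than through the module itself. The following sketch shows one way such a configuration might look; the accelerator, device count, strategy, and epoch budget are illustrative assumptions, not requirements of the integration.

import pytorch_lightning

# hypothetical multi-GPU, mixed-precision trainer configuration;
# adjust devices/strategy to the hardware that is actually available
trainer = pytorch_lightning.Trainer(
    accelerator="gpu",  # assumes CUDA-capable GPUs are available
    devices=2,          # assumption: two GPUs; "auto" uses all visible devices
    strategy="ddp",     # distributed data parallel across the GPUs
    precision=16,       # mixed precision training, as in the example above
    max_epochs=100,     # assumption: training budget chosen for illustration
)
trainer.fit(model=model)  # `model` is the LitLCWAModule configured above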
Classes

- A base module for training models with PyTorch Lightning.
- A PyTorch Lightning module for training a model with the LCWA training loop.
- A PyTorch Lightning module for training a model with the sLCWA training loop.
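For orientation, the sketch below shows the generic pytorch_lightning.LightningModule contract that such training modules implement: a training_step that returns the loss for one batch and a configure_optimizers hook. It is a minimal, hypothetical illustration of the Lightning interface, not PyKEEN's actual implementation; the class name, layer, and hyperparameters are made up for the example.

import torch
import pytorch_lightning


class ToyLitModule(pytorch_lightning.LightningModule):
    """A minimal, hypothetical Lightning module illustrating the training contract."""

    def __init__(self, dim: int = 8):
        super().__init__()
        # stand-in for an embedding-based scoring model
        self.linear = torch.nn.Linear(dim, 1)

    def training_step(self, batch, batch_idx):
        # compute and return the loss for one batch; Lightning drives the loop
        features, labels = batch
        loss = torch.nn.functional.binary_cross_entropy_with_logits(
            self.linear(features).squeeze(-1), labels
        )
        self.log("loss", loss)
        return loss

    def configure_optimizers(self):
        # choose the optimizer (and, optionally, learning rate schedulers)
        return torch.optim.Adam(self.parameters(), lr=1.0e-03)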