class LabelBasedInitializer(labels, encoder=None, encoder_kwargs=None, batch_size=None)[source]

Bases: PretrainedInitializer

An initializer using pretrained models from the transformers library to encode labels.

Example Usage:

Initialize entity representations as Transformer encodings of their labels. Afterwards, the parameters are detached from the labels and trained on the KGE task without any further connection to the Transformer model.

from pykeen.datasets import get_dataset
from pykeen.nn.init import LabelBasedInitializer
from pykeen.models import ERMLPE

dataset = get_dataset(dataset="nations")
model = ERMLPE(
    embedding_dim=768,  # for BERT base
    entity_initializer=LabelBasedInitializer.from_triples_factory(
        triples_factory=dataset.training,
    ),
    triples_factory=dataset.training,
)

Initialize the initializer.

Parameters:

  • labels (Sequence[str]) – the labels

  • encoder (Union[str, TextEncoder, Type[TextEncoder], None]) – the text encoder to use, cf. text_encoder_resolver

  • encoder_kwargs (Optional[Mapping[str, Any]]) – additional keyword-based parameters passed to the encoder

  • batch_size (Optional[int]) – the (maximum) batch size to use while encoding; must be positive. If None, use len(labels), i.e., encode all labels in a single batch.
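To make the `batch_size` semantics concrete, here is an illustrative, stdlib-only sketch of the documented behaviour (chunked encoding, with `None` falling back to one batch of `len(labels)`); it is not PyKEEN's implementation, and the toy `encode` callable merely stands in for a real text encoder:

```python
from typing import Callable, List, Optional, Sequence


def batched_encode(
    labels: Sequence[str],
    encode: Callable[[Sequence[str]], List[int]],
    batch_size: Optional[int] = None,
) -> List[int]:
    """Encode labels in chunks of at most ``batch_size``.

    ``batch_size=None`` means a single batch covering all labels.
    """
    if batch_size is None:
        batch_size = len(labels)
    encoded: List[int] = []
    for start in range(0, len(labels), batch_size):
        # each chunk is encoded independently, bounding peak memory use
        encoded.extend(encode(labels[start : start + batch_size]))
    return encoded


# toy "encoder": the length of each label stands in for a real embedding
vectors = batched_encode(
    ["brazil", "china", "cuba"],
    lambda batch: [len(s) for s in batch],
    batch_size=2,
)
```

Smaller batch sizes trade encoding speed for lower peak memory, which matters when the underlying Transformer is large.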

Methods Summary

from_triples_factory(triples_factory[, ...])

Prepare a label-based initializer with labels from a triples factory.

Methods Documentation

classmethod from_triples_factory(triples_factory, for_entities=True, **kwargs)[source]

Prepare a label-based initializer with labels from a triples factory.

Parameters:

  • triples_factory (TriplesFactory) – the triples factory

  • for_entities (bool) – whether to create the initializer for entities (or relations)

  • kwargs – additional keyword-based arguments passed to LabelBasedInitializer.__init__()
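Conceptually, this classmethod looks up the entity (or relation) labels stored in the triples factory, ordered by their integer IDs, and forwards them to the constructor. A minimal sketch of that label extraction, assuming the factory exposes `entity_id_to_label` / `relation_id_to_label` mappings (a mock object stands in for a real `pykeen.triples.TriplesFactory` here):

```python
from types import SimpleNamespace
from typing import List


def labels_from_factory(factory, for_entities: bool = True) -> List[str]:
    """Collect entity or relation labels from a factory, ordered by ID."""
    mapping = (
        factory.entity_id_to_label if for_entities else factory.relation_id_to_label
    )
    return [mapping[i] for i in range(len(mapping))]


# mock factory with ID-to-label mappings, standing in for a TriplesFactory
factory = SimpleNamespace(
    entity_id_to_label={0: "brazil", 1: "china"},
    relation_id_to_label={0: "exports"},
)
entity_labels = labels_from_factory(factory)  # labels for entities
relation_labels = labels_from_factory(factory, for_entities=False)
```

The resulting label list would then be passed as the `labels` argument of `LabelBasedInitializer.__init__()`, together with any additional `kwargs`.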

Return type:

LabelBasedInitializer

Returns:

A label-based initializer

Raises:

ImportError – if the transformers library could not be imported