Regularizers

Regularization in PyKEEN.

Functions

get_regularizer_cls(query)

Get the regularizer class.

Classes

LpRegularizer(device[, weight, dim, …])

A simple L_p-norm-based regularizer.

NoRegularizer(device[, weight, apply_only_once])

A regularizer which does not perform any regularization.

CombinedRegularizer(regularizers, device[, …])

A convex combination of regularizers.

PowerSumRegularizer(device[, weight, dim, …])

A simple x^p-based (power sum) regularizer.

TransHRegularizer(device[, weight, epsilon])

A regularizer for the soft constraints in TransH.
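To make the differences between the listed regularizers concrete, here is a plain-Python sketch of the per-tensor terms they are described as computing. The function names (`lp_term`, `power_sum_term`, `combined_term`) are illustrative, not PyKEEN API, and the convex-combination normalization is one plausible reading of "convex combination of regularizers"; the real classes operate on torch.FloatTensor values and carry state (weight, device).

```python
# Illustrative stand-ins for the regularization terms described above.
# Plain Python floats are used instead of torch tensors; these function
# names are hypothetical and are NOT part of the PyKEEN API.

def lp_term(x, p=2.0):
    """L_p-norm-based term: ||x||_p = (sum_i |x_i|^p)^(1/p)."""
    return sum(abs(v) ** p for v in x) ** (1.0 / p)

def power_sum_term(x, p=2.0):
    """Power-sum term: sum_i |x_i|^p (no final root, so it is cheaper
    to compute than the L_p norm while keeping the same minimizers)."""
    return sum(abs(v) ** p for v in x)

def combined_term(terms, weights):
    """Convex combination of individual regularization terms:
    weights are normalized to sum to one before mixing."""
    total = sum(weights)
    return sum(w * t for w, t in zip(weights, terms)) / total
```

For example, for x = (3, 4) and p = 2, `lp_term` yields the norm 5.0 while `power_sum_term` yields 25.0, which illustrates why the two classes exist separately despite both being "p-th power" style penalties.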

Class Inheritance Diagram

Inheritance diagram of pykeen.regularizers.LpRegularizer, pykeen.regularizers.NoRegularizer, pykeen.regularizers.CombinedRegularizer, pykeen.regularizers.PowerSumRegularizer, pykeen.regularizers.TransHRegularizer

Base Classes

class Regularizer(device, weight=1.0, apply_only_once=False)[source]

A base class for all regularizers.

Initialize the regularizer with the given device, weight, and apply_only_once setting. As a subclass of torch.nn.Module, this also initializes the internal Module state shared by both nn.Module and ScriptModule.

apply_only_once: bool

Whether the regularization should be applied only once (i.e., subsequent update() calls are no-ops until reset). This behavior was introduced for ConvKB; defaults to False.

abstract forward(x)[source]

Compute the regularization term for one tensor.

Return type

FloatTensor

classmethod get_normalized_name()[source]

Get the normalized name of the regularizer class.

Return type

str

hpo_default: ClassVar[Mapping[str, Any]]

The default strategy for optimizing the regularizer’s hyper-parameters

regularization_term: torch.FloatTensor

The current regularization term (a scalar)

reset()[source]

Reset the regularization term to zero.

Return type

None

property term: torch.FloatTensor

Return the weighted regularization term.

Return type

FloatTensor

update(*tensors)[source]

Update the regularization term based on the passed tensors.

Return type

None

weight: torch.FloatTensor

The overall regularization weight
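The members above describe an accumulate-then-read lifecycle: update() adds per-tensor terms into regularization_term, the term property returns the weighted total, and reset() zeroes the accumulator. The following minimal plain-Python stand-in sketches that lifecycle under the assumptions stated in its comments; it is not the PyKEEN implementation (which subclasses torch.nn.Module and works on torch.FloatTensor), and the class and attribute choices beyond those documented here are illustrative.

```python
# Hypothetical stand-in illustrating the documented Regularizer lifecycle.
# Assumptions: plain floats instead of torch.FloatTensor, a squared-L2
# forward() as an example term, and an `updated` flag (not documented
# above) to realize the apply_only_once behavior.

class SketchRegularizer:
    def __init__(self, weight=1.0, apply_only_once=False):
        self.weight = weight                  # overall regularization weight
        self.apply_only_once = apply_only_once
        self.regularization_term = 0.0        # current (unweighted) term
        self.updated = False

    def forward(self, x):
        # Example term for one tensor: squared L2 norm (a power sum, p=2).
        return sum(v * v for v in x)

    def update(self, *tensors):
        # Accumulate one term per tensor; skip entirely if the
        # regularization should only be applied once and already was.
        if self.apply_only_once and self.updated:
            return
        self.regularization_term += sum(self.forward(t) for t in tensors)
        self.updated = True

    @property
    def term(self):
        # The weighted regularization term.
        return self.weight * self.regularization_term

    def reset(self):
        # Reset the regularization term to zero.
        self.regularization_term = 0.0
        self.updated = False
```

A typical usage pattern is: call update() during the forward pass of the model, read term when assembling the loss, then call reset() before the next batch so terms do not leak across steps.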