PyKEEN
PyKEEN is a Python package for reproducible, easy-to-use training and evaluation of knowledge graph embedding models.
The fastest way to get up and running is to use the pykeen.pipeline.pipeline() function.
It provides a high-level entry point into the extensible functionality of this package.
The following example shows how to train and evaluate the TransE model (pykeen.models.TransE)
on the Nations dataset (pykeen.datasets.Nations) by referring to them by name.
By default, the training loop uses the stochastic local closed world assumption (sLCWA) training approach
(pykeen.training.SLCWATrainingLoop) and evaluates with rank-based evaluation
(pykeen.evaluation.RankBasedEvaluator).
>>> from pykeen.pipeline import pipeline
>>> result = pipeline(
...     model='TransE',
...     dataset='Nations',
... )
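These defaults can also be stated explicitly. The following is a minimal sketch in which the string names 'sLCWA' and 'rankbased' are assumed to resolve to the default training loop and evaluator; the classes pykeen.training.SLCWATrainingLoop and pykeen.evaluation.RankBasedEvaluator can be passed instead.
>>> from pykeen.pipeline import pipeline
>>> result = pipeline(
...     model='TransE',
...     dataset='Nations',
...     training_loop='sLCWA',   # assumed string spelling; the default training approach
...     evaluator='rankbased',   # assumed string spelling; the default evaluator
... )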
The results are returned in a pykeen.pipeline.PipelineResult instance, which has attributes for the trained model, the training loop, and the evaluation.
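For example, the trained model can be accessed directly and the artifacts can be written to disk. A minimal sketch; the directory name 'nations_transe' is an arbitrary example.
>>> from pykeen.pipeline import pipeline
>>> result = pipeline(model='TransE', dataset='Nations')
>>> model = result.model                        # the trained pykeen.models.TransE instance
>>> result.save_to_directory('nations_transe')  # writes the trained model and results to this directory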
PyKEEN has a function, pykeen.env(), that prints relevant version information about PyTorch, CUDA, and your operating system for use in debugging.
If you’re in a Jupyter notebook, the output is pretty-printed as an HTML table.
>>> import pykeen
>>> pykeen.env()
- Installation
- First Steps
- Knowledge Graph Embedding Models
- Representations
- Tracking Results during Training
- Saving Checkpoints during Training
- A Toy Example with Translational Distance Models
- Understanding the Evaluation
- Optimizing a Model’s Hyper-parameters
- Running an Ablation Study
- Performance Tricks
- Getting Started with NodePiece
- Basic Usage
- Anchor Selection and Searching
- How many total anchors num_anchors and anchors & relations num_tokens do I need for my graph?
- Using NodePiece with pykeen.pipeline.pipeline()
- Pre-Computed Vocabularies
- Configuring the Interaction Function
- Configuring the Aggregation Function
- NodePiece + GNN
- Tokenizing Large Graphs with METIS
- Inductive Link Prediction
- Splitting
- PyTorch Lightning Integration
- Using Resolvers
- Normalizer, Constrainer & Regularizer
- Troubleshooting
- Bring Your Own Data
- Bring Your Own Interaction
- Implementing your first Interaction Module
- Interactions with Hyper-Parameters
- Interactions with Trainable Parameters
- Interactions with Different Shaped Vectors
- Interactions with Multiple Representations
- Interactions with Different Dimension Vectors
- Differences between pykeen.nn.modules.Interaction and pykeen.models.Model
- Ad hoc Models from Interactions
- Interaction Pipeline
- Pipeline
- Models
- Datasets
- Inductive Datasets
- Entity Alignment
- Triples
- Triples Workflows
- Training
- Stoppers
- Loss Functions
- Regularizers
- Result Trackers
- Negative Sampling
- Filtering
- Optimizers
- Evaluation
- Metrics
- Hyper-parameter Optimization
- Ablation
- Prediction
- Uncertainty
- Sealant
- Constants
- PYKEEN_BENCHMARKS
- PYKEEN_CHECKPOINTS
- PYKEEN_DATASETS
- PYKEEN_EXPERIMENTS
- PYKEEN_HOME
- PYKEEN_LOGS
- Constrainer
- DeviceHint
- GaussianDistribution
- HeadRepresentation
- InductiveMode
- Initializer
- LabeledTriples
- Mutation
- Normalizer
- RelationRepresentation
- TailRepresentation
- Target
- TargetColumn
- TorchRandomHint
- cast_constrainer()
- normalize_rank_type()
- normalize_target()
- Flexible Weight Checkpoints
- pykeen.nn
- Utilities
- Bias
- ExtraReprMixin
- NoRandomSeedNecessary
- Result
- all_in_bounds()
- at_least_eps()
- batched_dot()
- broadcast_upgrade_to_sequences()
- calculate_broadcasted_elementwise_result_shape()
- check_shapes()
- clamp_norm()
- combine_complex()
- compact_mapping()
- complex_normalize()
- compose
- create_relation_to_entity_set_mapping()
- einsum()
- ensure_complex()
- ensure_ftp_directory()
- ensure_torch_random_state()
- ensure_tuple()
- estimate_cost_of_sequence()
- extend_batch()
- fix_dataclass_init_docs()
- flatten_dictionary()
- format_relative_comparison()
- get_batchnorm_modules()
- get_benchmark()
- get_connected_components()
- get_devices()
- get_df_io()
- get_dropout_modules()
- get_edge_index()
- get_expected_norm()
- get_json_bytes_io()
- get_model_io()
- get_optimal_sequence()
- get_preferred_device()
- get_until_first_blank()
- invert_mapping()
- is_triple_tensor_subset()
- isin_many_dim()
- logcumsumexp()
- lp_norm()
- negative_norm()
- negative_norm_of_sum()
- nested_get()
- normalize_path()
- normalize_string()
- powersum_norm()
- prepare_filter_triples()
- project_entity()
- random_non_negative_int()
- rate_limited()
- resolve_device()
- set_random_seed()
- split_complex()
- split_workload()
- tensor_product()
- tensor_sum()
- triple_tensor_to_set()
- unpack_singletons()
- upgrade_to_sequence()
- view_complex()
- env()
- get_git_branch()
- get_git_hash()
- get_version()