:py:mod:`real_models`
=====================

.. py:module:: real_models


Module Contents
---------------

Classes
~~~~~~~

.. autoapisummary::

   real_models.Distmult
   real_models.TransE
   real_models.Tucker


Attributes
~~~~~~~~~~

.. autoapisummary::

   real_models.seed


.. py:data:: seed
   :value: 1


.. py:class:: Distmult(param)

   Bases: :py:obj:`torch.nn.Module`

   Base class for all neural network modules.

   Your models should also subclass this class.

   Modules can also contain other Modules, allowing to nest them in
   a tree structure. You can assign the submodules as regular attributes::

       import torch.nn as nn
       import torch.nn.functional as F

       class Model(nn.Module):
           def __init__(self):
               super().__init__()
               self.conv1 = nn.Conv2d(1, 20, 5)
               self.conv2 = nn.Conv2d(20, 20, 5)

           def forward(self, x):
               x = F.relu(self.conv1(x))
               return F.relu(self.conv2(x))

   Submodules assigned in this way will be registered, and will have their
   parameters converted too when you call :meth:`to`, etc.

   .. note::
       As per the example above, an ``__init__()`` call to the parent class
       must be made before assignment on the child.

   :ivar training: Boolean representing whether this module is in training or
                   evaluation mode.
   :vartype training: bool

   .. py:method:: forward_head_batch(*, e1_idx, rel_idx)


   .. py:method:: forward_head_and_loss(e1_idx, rel_idx, targets)


   .. py:method:: init()


   .. py:method:: get_embeddings()


   .. py:method:: forward_triples(*, e1_idx, rel_idx, e2_idx)


   .. py:method:: forward_triples_and_loss(e1_idx, rel_idx, e2_idx, targets)


.. py:class:: TransE(param)

   Bases: :py:obj:`torch.nn.Module`

   TransE trained with binary cross entropy.

   .. py:method:: init()


   .. py:method:: get_embeddings()


   .. py:method:: forward_triples(*, e1_idx, rel_idx, e2_idx)


   .. py:method:: forward_triples_and_loss(e1_idx, rel_idx, e2_idx, target)


   .. py:method:: forward_head_and_loss(*args, **kwargs)
      :abstractmethod:


.. py:class:: Tucker(param)

   Bases: :py:obj:`torch.nn.Module`

   Base class for all neural network modules.

   Your models should also subclass this class.

   Modules can also contain other Modules, allowing to nest them in
   a tree structure. You can assign the submodules as regular attributes::

       import torch.nn as nn
       import torch.nn.functional as F

       class Model(nn.Module):
           def __init__(self):
               super().__init__()
               self.conv1 = nn.Conv2d(1, 20, 5)
               self.conv2 = nn.Conv2d(20, 20, 5)

           def forward(self, x):
               x = F.relu(self.conv1(x))
               return F.relu(self.conv2(x))

   Submodules assigned in this way will be registered, and will have their
   parameters converted too when you call :meth:`to`, etc.

   .. note::
       As per the example above, an ``__init__()`` call to the parent class
       must be made before assignment on the child.

   :ivar training: Boolean representing whether this module is in training or
                   evaluation mode.
   :vartype training: bool

   .. py:method:: init()


   .. py:method:: forward_head_batch(e1_idx, rel_idx)


   .. py:method:: forward_head_and_loss(e1_idx, rel_idx, targets)


   .. py:method:: get_embeddings()


   .. py:method:: forward_triples(*args, **kwargs)
      :abstractmethod:


   .. py:method:: forward_triples_and_loss(*args, **kwargs)
      :abstractmethod:
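The three classes above implement scoring functions that are standard in the
knowledge-graph-embedding literature. As an illustrative sketch only (the
actual module works on index batches through learned ``torch.nn.Embedding``
tables, whereas this NumPy version scores a single triple from explicit
head/relation/tail vectors), the per-triple scores can be written as:

```python
import numpy as np

def distmult_score(e_h, w_r, e_t):
    # DistMult: trilinear product <e_h, w_r, e_t>
    return float(np.sum(e_h * w_r * e_t))

def transe_score(e_h, w_r, e_t):
    # TransE: negative translation distance -||e_h + w_r - e_t||
    return -float(np.linalg.norm(e_h + w_r - e_t))

def tucker_score(e_h, w_r, e_t, W):
    # TuckER: contract a core tensor W (d_e x d_r x d_e) with all
    # three embeddings: (W x1 e_h x2 w_r) . e_t
    return float(np.einsum("i,ijk,j,k->", e_h, W, w_r, e_t))

# Toy 2-d entity / relation embeddings.
h = np.array([1.0, 0.0])
r = np.array([0.5, 1.0])
t = np.array([1.5, 1.0])

distmult_score(h, r, t)  # 0.75
transe_score(h, r, t)    # -0.0 (h + r lands exactly on t)
```

Higher scores mean the model considers the triple more plausible; in
training these raw scores are passed through a sigmoid and compared against
0/1 targets with binary cross entropy, as the ``TransE`` docstring notes.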