model.deeplearn.loss package
Submodules
model.deeplearn.loss.class_weighted_binary_crossentropy module
- model.deeplearn.loss.class_weighted_binary_crossentropy.vl3d_class_weighted_binary_crossentropy(class_weight)
Function to compute a weighted binary cross-entropy loss.
Let \(\mathcal{L}(\pmb{y}, \pmb{\hat{y}}) \in \mathbb{R}^{m}\) be a binary cross-entropy loss on \(m\) samples, and let \(\pmb{w} \in \mathbb{R}^2\) be a vector of class weights for binary classification, i.e., two classes. It is then possible to define a vector \(\pmb{\omega} \in \mathbb{R}^{m}\) such that \(\omega_{i} = y_i w_2 + (1-y_i) w_1\), where each \(y_i\) must be either zero or one. The class-weighted binary cross-entropy is then obtained by computing the following Hadamard product, where \(\pmb{\hat{y}}\) is the vector of binary predictions:
\[\mathcal{L}(\pmb{y}, \pmb{\hat{y}}) \odot \pmb{\omega}\]
- Parameters:
class_weight – The vector of class weights. The component i of this vector (\(\pmb{w}\)) is the weight for class i.
- Returns:
The weighted binary cross-entropy loss.
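The definition above can be sketched directly in NumPy. This is an illustrative reimplementation of the math, not the library's actual Keras-based code; the function name and the `class_weight` indexing (component 0 weights class 0) are assumptions for the example.

```python
import numpy as np

def class_weighted_binary_crossentropy(y_true, y_pred, class_weight, eps=1e-7):
    """NumPy sketch of the class-weighted binary cross-entropy above.

    class_weight[0] weights class 0 and class_weight[1] weights class 1
    (illustrative convention, not necessarily the library's API).
    """
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    # Per-sample binary cross-entropy L(y, y_hat).
    bce = -(y_true * np.log(y_pred) + (1.0 - y_true) * np.log(1.0 - y_pred))
    # omega_i = y_i * w_1 + (1 - y_i) * w_0 (0-indexed weights here).
    omega = y_true * class_weight[1] + (1.0 - y_true) * class_weight[0]
    # Hadamard product of the loss vector with the weight vector.
    return bce * omega

y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.4])
loss = class_weighted_binary_crossentropy(y_true, y_pred, class_weight=[1.0, 2.0])
```

Here samples of class 1 contribute twice as much to the loss, which is the usual way to counter class imbalance.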
model.deeplearn.loss.class_weighted_categorical_crossentropy module
- model.deeplearn.loss.class_weighted_categorical_crossentropy.vl3d_class_weighted_categorical_crossentropy(class_weight)
Function to compute a weighted categorical cross-entropy loss.
Let \(\mathcal{L}(\pmb{y}, \pmb{\hat{y}}) \in \mathbb{R}^{m}\) be a categorical cross-entropy loss on \(m\) samples, and let \(\pmb{w} \in \mathbb{R}^n\) be a vector of class weights for multiclass classification, i.e., potentially many classes. It is then possible to define a vector \(\pmb{\omega} \in \mathbb{R}^{m}\) such that \(\omega_i = \langle{\pmb{w}, \pmb{y}_i}\rangle\), where \(\pmb{y}_i \in \{0, 1\}^{n}\) is the one-hot encoding of sample \(i\). The class-weighted categorical cross-entropy is then obtained by computing the following Hadamard product, where \(\pmb{\hat{y}}\) is the vector of one-hot-encoded multiclass predictions:
\[\mathcal{L}(\pmb{y}, \pmb{\hat{y}}) \odot \pmb{\omega}\]
- Parameters:
class_weight – The vector of class weights. The component i of this vector (\(\pmb{w}\)) is the weight for class i.
- Returns:
The weighted categorical cross-entropy loss.
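The multiclass case works the same way: the inner product \(\langle{\pmb{w}, \pmb{y}_i}\rangle\) simply selects the weight of sample \(i\)'s true class. A NumPy sketch of this math (illustrative names and conventions, not the library's actual Keras code):

```python
import numpy as np

def class_weighted_categorical_crossentropy(y_true, y_pred, class_weight, eps=1e-7):
    """NumPy sketch of the class-weighted categorical cross-entropy above.

    y_true holds one-hot rows; class_weight has one weight per class
    (illustrative convention, not necessarily the library's API).
    """
    y_pred = np.clip(y_pred, eps, 1.0)
    # Per-sample categorical cross-entropy L(y, y_hat).
    cce = -np.sum(y_true * np.log(y_pred), axis=-1)
    # omega_i = <w, y_i>: picks out the weight of sample i's true class.
    omega = y_true @ np.asarray(class_weight)
    # Hadamard product of the loss vector with the weight vector.
    return cce * omega

y_true = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
y_pred = np.array([[0.7, 0.2, 0.1], [0.1, 0.2, 0.7]])
loss = class_weighted_categorical_crossentropy(y_true, y_pred, [1.0, 1.0, 3.0])
```

With these weights, errors on class 2 are penalized three times as much as errors on classes 0 and 1.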
model.deeplearn.loss.multiloss_linear_superposition module
- model.deeplearn.loss.multiloss_linear_superposition.vl3d_multiloss_linear_superposition(loss)
Function to compute the linear superposition of an underlying loss function on many output heads that are assumed to correspond to a single set of reference samples.
\[\mathcal{L}'(\pmb{y}, \pmb{\hat{y}_1}, \ldots, \pmb{\hat{y}_l}) = \sum_{k=1}^{l}{\mathcal{L}(\pmb{y}, \pmb{\hat{y}_k})}\]
- Parameters:
loss – The underlying loss function \(\mathcal{L}\). It must be callable as L(y_true, y_pred[k]) for any output head k.
- Returns:
The linear superposition of the underlying loss function for each output head.
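The superposition is just a sum of the underlying loss evaluated once per output head. A minimal sketch, using a toy squared-error loss so the sum is easy to see (the wrapper's name and calling convention are assumptions, not the library's exact API):

```python
def multiloss_linear_superposition(loss):
    """Sketch: wrap a loss L so it sums L over l output heads.

    The returned callable evaluates L(y_true, y_preds[k]) for each
    head k and adds the results (illustrative, not the exact API).
    """
    def superposed(y_true, y_preds):
        total = loss(y_true, y_preds[0])
        for y_pred in y_preds[1:]:
            total = total + loss(y_true, y_pred)
        return total
    return superposed

# Toy underlying loss: squared error, to make the superposition visible.
squared_error = lambda y, y_hat: (y - y_hat) ** 2
multi = multiloss_linear_superposition(squared_error)

# Two heads predicting the same reference sample y = 1.0.
total = multi(1.0, [0.5, 2.0])  # (1 - 0.5)^2 + (1 - 2)^2 = 1.25
```

Because the superposition is linear, gradients flow back to every head with equal weight, which matches the assumption that all heads share one set of reference samples.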
model.deeplearn.loss.ragged_binary_crossentropy module
- model.deeplearn.loss.ragged_binary_crossentropy.vl3d_ragged_binary_crossentropy()
Version of keras.ops.binary_crossentropy() that works with ragged tensors.
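Ragged tensors hold rows of different lengths, e.g., point clouds with varying numbers of points. The sketch below models ragged data as a plain list of variable-length NumPy rows and applies the binary cross-entropy per row; the real function operates on Keras/TensorFlow ragged tensors, and this name and signature are illustrative.

```python
import numpy as np

def ragged_binary_crossentropy(y_true_rows, y_pred_rows, eps=1e-7):
    """Sketch of binary cross-entropy over ragged data, modeled here
    as lists of variable-length NumPy rows (illustrative only)."""
    out = []
    for yt, yp in zip(y_true_rows, y_pred_rows):
        yp = np.clip(yp, eps, 1.0 - eps)
        # Standard per-element binary cross-entropy on each row.
        out.append(-(yt * np.log(yp) + (1.0 - yt) * np.log(1.0 - yp)))
    return out

# Rows of different lengths, as in point clouds of varying size.
y_true = [np.array([1.0, 0.0]), np.array([1.0, 1.0, 0.0])]
y_pred = [np.array([0.8, 0.1]), np.array([0.6, 0.9, 0.3])]
losses = ragged_binary_crossentropy(y_true, y_pred)
```

The same row-wise idea underlies the ragged variants of the other losses in this package.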
model.deeplearn.loss.ragged_categorical_crossentropy module
- model.deeplearn.loss.ragged_categorical_crossentropy.vl3d_ragged_categorical_crossentropy()
Version of keras.ops.categorical_crossentropy() that works with ragged tensors.
model.deeplearn.loss.ragged_class_weighted_binary_crossentropy module
- model.deeplearn.loss.ragged_class_weighted_binary_crossentropy.vl3d_ragged_class_weighted_binary_crossentropy(class_weight)
Version of class_weighted_binary_crossentropy.vl3d_class_weighted_binary_crossentropy() that works with ragged tensors.
model.deeplearn.loss.ragged_class_weighted_categorical_crossentropy module
- model.deeplearn.loss.ragged_class_weighted_categorical_crossentropy.vl3d_ragged_class_weighted_categorical_crossentropy(class_weight)
Version of class_weighted_categorical_crossentropy.vl3d_class_weighted_categorical_crossentropy() that works with ragged tensors.
Module contents
- author:
Alberto M. Esmoris Pena
The loss package contains the logic to handle custom loss functions in deep learning models.