loss
Loss functions for neural network training.
This module contains implementations of common loss functions, such as Mean Squared Error and Cross Entropy.
- Classes:
  MeanSquaredError: Calculates the Mean Squared Error loss.
  CrossEntropy: Calculates the Cross Entropy loss.
- class CrossEntropy
Bases: Op
Calculates Cross Entropy loss.
This class implements the Cross Entropy loss function, which is commonly used for classification tasks. It computes the loss from raw logits and integer target indices (as opposed to one-hot encoded tensors); a sketch of the computation follows the attribute list below.
- _y_true
The true labels (cached for the backward pass).
- _log_softmax_pred
The log softmax of the predictions (cached for the backward pass).
- _out
The computed loss (cached for the backward pass).
- Type:
numpy.typing.ArrayLike | None
- _grad
The computed gradients (cached for the backward pass).
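As a minimal, standalone sketch of the forward and backward passes this op performs (plain numpy functions rather than the actual Op subclass; the function names and the mean reduction over the batch are assumptions, not the library's API):

```python
import numpy as np

def cross_entropy_forward(logits, y_true):
    # Numerically stable log-softmax: shift each row by its max
    # before exponentiating.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_softmax = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    n = logits.shape[0]
    # Mean negative log-likelihood of the true classes, indexed by
    # integer labels rather than one-hot tensors.
    loss = -log_softmax[np.arange(n), y_true].mean()
    return loss, log_softmax  # log_softmax is cached for backward

def cross_entropy_backward(log_softmax, y_true):
    # Gradient w.r.t. the logits: softmax(logits) - one_hot(y_true),
    # divided by the batch size because the forward pass takes a mean.
    n = log_softmax.shape[0]
    grad = np.exp(log_softmax)          # recover softmax probabilities
    grad[np.arange(n), y_true] -= 1.0   # subtract the one-hot targets
    return grad / n

logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 1.5,  0.3]])
targets = np.array([0, 1])
loss, log_sm = cross_entropy_forward(logits, targets)
grad = cross_entropy_backward(log_sm, targets)
```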
- class MeanSquaredError
Bases: Op
Calculates Mean Squared Error loss.
This class implements the Mean Squared Error (MSE) loss function, which measures the average squared difference between the predicted and true values; a sketch follows the attribute list below.
- diff
The difference between the predicted and true values.
- divisor
A scaling factor for the loss calculation.
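A minimal numpy sketch of the same forward/backward pattern; the function names are illustrative, and treating divisor as the total element count is an assumption about the documented scaling factor:

```python
import numpy as np

def mse_forward(y_pred, y_true):
    diff = y_pred - y_true   # cached for the backward pass
    # Assumption: the scaling factor is the total number of elements,
    # which turns the summed squared error into a mean.
    divisor = diff.size
    loss = (diff ** 2).sum() / divisor
    return loss, diff, divisor

def mse_backward(diff, divisor):
    # d(loss)/d(y_pred) = 2 * diff / divisor
    return 2.0 * diff / divisor

y_pred = np.array([[0.9, 0.2], [0.4, 0.8]])
y_true = np.array([[1.0, 0.0], [0.5, 1.0]])
loss, diff, divisor = mse_forward(y_pred, y_true)
grad = mse_backward(diff, divisor)
```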