functions
Functions for neural network activation operations.
This module provides implementations of common activation functions used in neural networks, including Sigmoid and Softmax.
- class Sigmoid
Bases: Op
Applies the sigmoid function elementwise to the input tensor; a minimal sketch follows the attribute list.
- _out
The output of the forward pass.
- Type:
numpy.typing.ArrayLike | None
- _grad
The gradient computed during the backward pass.
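A minimal NumPy sketch of what Sigmoid computes, assuming the Op interface pairs a forward pass (producing `_out`) with a backward pass (producing `_grad`). The names `sigmoid_forward`, `sigmoid_backward`, and `upstream` are illustrative, not part of this module:

```python
import numpy as np
import numpy.typing as npt

def sigmoid_forward(x: npt.ArrayLike) -> np.ndarray:
    """Elementwise sigmoid: 1 / (1 + exp(-x)), computed stably."""
    x = np.asarray(x, dtype=float)
    # exp(-|x|) never overflows; the two branches are algebraically
    # equal to 1 / (1 + exp(-x)) for x >= 0 and x < 0 respectively.
    e = np.exp(-np.abs(x))
    return np.where(x >= 0, 1.0 / (1.0 + e), e / (1.0 + e))

def sigmoid_backward(out: np.ndarray, upstream: np.ndarray) -> np.ndarray:
    """Chain rule with sigma'(x) = sigma(x) * (1 - sigma(x))."""
    return upstream * out * (1.0 - out)
```

Caching the forward result (as `_out` here) is the usual design choice: the sigmoid's derivative is expressible in terms of its output alone, so the backward pass can reuse it without recomputing the exponentials.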
- class Softmax
Bases: Op
Applies the softmax function to the input tensor.
The softmax is applied only along the final dimension of the tensor, and the input is normalized for numeric stability (typically by subtracting the maximum along that dimension before exponentiation); a minimal sketch follows the attribute list.
- _out
The output of the forward pass.
- Type:
numpy.typing.ArrayLike | None
- _grad
The gradient computed during the backward pass.
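A minimal NumPy sketch of softmax over the last axis, assuming the normalization mentioned above is the standard max-subtraction trick; the names `softmax_forward`, `softmax_backward`, and `upstream` are illustrative, not part of this module:

```python
import numpy as np

def softmax_forward(x: np.ndarray) -> np.ndarray:
    """Softmax over the final axis, with max-subtraction for stability."""
    # Subtracting the per-row maximum leaves the result unchanged
    # (softmax is shift-invariant) but keeps exp() from overflowing.
    shifted = x - x.max(axis=-1, keepdims=True)
    exps = np.exp(shifted)
    return exps / exps.sum(axis=-1, keepdims=True)

def softmax_backward(out: np.ndarray, upstream: np.ndarray) -> np.ndarray:
    """Jacobian-vector product of softmax along the final axis.

    With y = softmax(x) and upstream gradient g:
    dL/dx_j = y_j * (g_j - sum_i g_i * y_i).
    """
    dot = (upstream * out).sum(axis=-1, keepdims=True)
    return out * (upstream - dot)
```

For example, `softmax_forward(np.array([[1.0, 2.0, 3.0]]))` returns a row that sums to 1; as with Sigmoid, the backward pass needs only the cached forward output, not the original input.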