activations#
- datacheese.activations.sigmoid(x)#
Sigmoid function, \(\sigma(x)\).
\[\sigma(x) = \frac{1}{1 + e^{-x}}\]
- Parameters:
x (numpy.ndarray) – Input values.
- Returns:
f – Output values.
- Return type:
numpy.ndarray
- datacheese.activations.sigmoid_derivative(x=None, f=None)#
Sigmoid derivative function, \(\sigma'(x)\).
If x is provided, the output is computed directly:
\[\sigma'(x) = \frac{e^{-x}}{(1 + e^{-x})^2}\]
If the output f of the antiderivative is provided, the output is computed using the antiderivative:
\[\sigma'(x) = \sigma(x) \bigg(1 - \sigma(x)\bigg)\]
- Parameters:
x (numpy.ndarray, default None) – Input values.
f (numpy.ndarray, default None) – Output values of the antiderivative function.
- Returns:
f_prime – Output values.
- Return type:
numpy.ndarray
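The two entries above can be sketched in NumPy as follows. This is an illustrative reimplementation of the documented formulas, not the library's actual source; it mirrors the documented signatures, where the derivative accepts either the raw input x or the antiderivative output f:

```python
import numpy as np

def sigmoid(x):
    # sigma(x) = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x=None, f=None):
    # Direct form when x is given: exp(-x) / (1 + exp(-x))^2
    if x is not None:
        return np.exp(-x) / (1.0 + np.exp(-x)) ** 2
    # Antiderivative form when f = sigmoid(x) is given: f * (1 - f)
    return f * (1.0 - f)
```

Both paths yield identical values, since \(\sigma(x)(1 - \sigma(x)) = e^{-x}/(1 + e^{-x})^2\); passing f is cheaper when the forward pass has already been computed.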
- datacheese.activations.tanh(x)#
Hyperbolic tangent function, \(\tanh(x)\).
\[\tanh(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}}\]
- Parameters:
x (numpy.ndarray) – Input values.
- Returns:
f – Output values.
- Return type:
numpy.ndarray
- datacheese.activations.tanh_derivative(x=None, f=None)#
Hyperbolic tangent derivative function, \(\tanh'(x)\).
If x is provided, the output is computed directly:
\[\tanh'(x) = 1 - \bigg(\frac{e^x - e^{-x}}{e^x + e^{-x}}\bigg)^2\]
If the output f of the antiderivative is provided, the output is computed using the antiderivative:
\[\tanh'(x) = 1 - \tanh^2(x)\]
- Parameters:
x (numpy.ndarray, default None) – Input values.
f (numpy.ndarray, default None) – Output values of the antiderivative function.
- Returns:
f_prime – Output values.
- Return type:
numpy.ndarray
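A minimal NumPy sketch of the two entries above (an illustration of the documented formulas, not the library's actual source):

```python
import numpy as np

def tanh(x):
    # tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x)); np.tanh is equivalent
    return np.tanh(x)

def tanh_derivative(x=None, f=None):
    # Direct form when x is given
    if x is not None:
        t = (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))
        return 1.0 - t ** 2
    # Antiderivative form when f = tanh(x) is given: 1 - f^2
    return 1.0 - f ** 2
```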
- datacheese.activations.relu(x)#
ReLU (rectified linear unit) function.
\[\text{relu}(x) = \max(0, x)\]
- Parameters:
x (numpy.ndarray) – Input values.
- Returns:
f – Output values.
- Return type:
numpy.ndarray
- datacheese.activations.relu_derivative(x=None, f=None)#
ReLU (rectified linear unit) derivative function.
If x is provided, the output is computed directly:
\[\begin{split}\text{relu}'(x) = \begin{cases} 0 & \text{ if } x \le 0 \\ 1 & \text{ if } x > 0 \end{cases}\end{split}\]
If the output f of the antiderivative is provided, the output is computed using the antiderivative:
\[\begin{split}\text{relu}'(x) = \begin{cases} 0 & \text{ if } \text{relu}(x) = 0 \\ 1 & \text{ if } \text{relu}(x) \ne 0 \end{cases}\end{split}\]
- Parameters:
x (numpy.ndarray, default None) – Input values.
f (numpy.ndarray, default None) – Output values of the antiderivative function.
- Returns:
f_prime – Output values.
- Return type:
numpy.ndarray
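The piecewise definitions above can be sketched in NumPy with np.where (an illustration of the documented formulas, not the library's actual source); note the derivative is taken to be 0 at \(x = 0\), matching the case split above:

```python
import numpy as np

def relu(x):
    # relu(x) = max(0, x), elementwise
    return np.maximum(0.0, x)

def relu_derivative(x=None, f=None):
    # Direct form: 1 where x > 0, else 0
    if x is not None:
        return np.where(x > 0, 1.0, 0.0)
    # Antiderivative form: 1 where relu(x) != 0, else 0
    return np.where(f != 0, 1.0, 0.0)
```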
- datacheese.activations.leaky_relu(x, alpha=0.01)#
Leaky ReLU (rectified linear unit) function.
\[\text{leakyrelu}(x) = \max(\alpha x, x)\]
- Parameters:
x (numpy.ndarray) – Input values.
alpha (float) – Negative slope \(\alpha\), where \(0 \le \alpha \le 1\).
- Returns:
f – Output values.
- Return type:
numpy.ndarray
- datacheese.activations.leaky_relu_derivative(x=None, f=None, alpha=0.01)#
Leaky ReLU (rectified linear unit) derivative function.
If x is provided, the output is computed directly:
\[\begin{split}\text{leakyrelu}'(x) = \begin{cases} \alpha & \text{ if } x \le 0 \\ 1 & \text{ if } x > 0 \end{cases}\end{split}\]
If the output f of the antiderivative is provided, the output is computed using the antiderivative:
\[\begin{split}\text{leakyrelu}'(x) = \begin{cases} \alpha & \text{ if } \text{leakyrelu}(x) \le 0 \\ 1 & \text{ if } \text{leakyrelu}(x) > 0 \end{cases}\end{split}\]
- Parameters:
x (numpy.ndarray, default None) – Input values.
f (numpy.ndarray, default None) – Output values of the antiderivative function.
alpha (float) – Negative slope \(\alpha\), where \(0 \le \alpha \le 1\).
- Returns:
f_prime – Output values.
- Return type:
numpy.ndarray
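The leaky variant can be sketched the same way (an illustration of the documented formulas, not the library's actual source); since \(0 \le \alpha \le 1\), the condition \(\text{leakyrelu}(x) \le 0\) coincides with \(x \le 0\), so the two derivative paths agree:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # leakyrelu(x) = max(alpha * x, x), elementwise, with 0 <= alpha <= 1
    return np.maximum(alpha * x, x)

def leaky_relu_derivative(x=None, f=None, alpha=0.01):
    # Direct form: 1 where x > 0, else alpha
    if x is not None:
        return np.where(x > 0, 1.0, alpha)
    # Antiderivative form: 1 where leakyrelu(x) > 0, else alpha
    return np.where(f > 0, 1.0, alpha)
```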