# Activation functions

statinf.ml.activations.elu(x, alpha=1.0)[source]

Exponential Linear Unit activation function.

Parameters

• x (float or numpy.array) – Input value

• alpha (float) – Coefficient applied to negative inputs, defaults to 1.0

Formula
$$\mathrm{elu}(x) = \begin{cases} x, & x > 0\\ \alpha \left(e^{x} - 1\right), & x \le 0 \end{cases}$$
Returns

Activated value.

Return type

float or numpy.array
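The piecewise formula above can be sketched in NumPy as follows; this is a minimal illustration of the definition, not the library's actual implementation:

```python
import numpy as np

def elu(x, alpha=1.0):
    # Identity for positive inputs; alpha * (exp(x) - 1) for x <= 0
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))
```

Note that ELU is continuous at 0 and approaches -alpha as x goes to negative infinity.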

statinf.ml.activations.logit(x, weights, bias=0)[source]

Logistic function.

Parameters
• x (numpy.array) – Input value

• weights (numpy.array) – Vector of weights $$\beta$$

• bias (numpy.array) – Vector of bias $$\epsilon$$, defaults to 0.

Returns

Logistic transformation: $$\mathrm{logit}(x, \beta) = \dfrac{1}{1 + e^{-x \beta}}$$

Return type

float or numpy.array
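A minimal NumPy sketch of the transformation above, assuming `x` and `weights` are conformable for a dot product (this is an illustration of the formula, not the library's code):

```python
import numpy as np

def logit(x, weights, bias=0):
    # Logistic transformation of the linear combination x @ beta + bias
    z = np.dot(x, weights) + bias
    return 1.0 / (1.0 + np.exp(-z))
```

With a zero linear combination the output is exactly 0.5, the midpoint of the logistic curve.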

statinf.ml.activations.relu(x)[source]

Rectified Linear Unit activation function.

Parameters

x (float or numpy.array) – Input value

Returns

Activated value: $$\mathrm{relu}(x) = \max(0, x)$$

Return type

float or numpy.array
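The formula $\max(0, x)$ can be sketched in one line of NumPy (an equivalent illustration, not necessarily the library's implementation):

```python
import numpy as np

def relu(x):
    # Elementwise max(0, x): zeroes out negative inputs
    return np.maximum(0, x)
```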

statinf.ml.activations.sigmoid(x)[source]

Sigmoid activation function.

Parameters

x (float or numpy.array) – Input value

Returns

Sigmoid activated value: $$\mathrm{sigmoid}(x) = \dfrac{1}{1 + e^{-x}}$$

Return type

float or numpy.array
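The sigmoid formula above translates directly to NumPy; a minimal sketch for illustration:

```python
import numpy as np

def sigmoid(x):
    # Maps any real input into (0, 1); sigmoid(0) = 0.5
    return 1.0 / (1.0 + np.exp(-x))
```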

statinf.ml.activations.softmax(x, axis=-1)[source]

Softmax activation function.

Parameters

• x (float or numpy.array) – Input value

• axis (int) – Axis along which the softmax is computed, defaults to -1

Returns

Activated value: $$\mathrm{softmax}(x)_i = \dfrac{\exp(x_i)}{\sum_j \exp(x_j)}$$

Return type

float or numpy.array
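A common NumPy sketch of the softmax formula above subtracts the maximum before exponentiating, which leaves the result unchanged but avoids overflow (an illustration under that standard trick, not necessarily the library's implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtracting the max is a numerical-stability trick: it cancels in the ratio
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    # Normalise so the entries along `axis` sum to 1
    return e / e.sum(axis=axis, keepdims=True)
```

The output is a probability distribution: non-negative entries summing to 1 along the chosen axis.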

statinf.ml.activations.softplus(x)[source]

Softplus activation function.

Parameters

x (float or numpy.array) – Input value

Returns

Activated value: $$\mathrm{softplus}(x) = \log(1 + e^{x})$$

Return type

float or numpy.array
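The softplus formula $\log(1 + e^{x})$ can be sketched with NumPy's `logaddexp`, which computes $\log(e^{a} + e^{b})$ stably; this is an illustration, not necessarily how the library computes it:

```python
import numpy as np

def softplus(x):
    # log(1 + e^x) == log(e^0 + e^x), computed without overflow for large x
    return np.logaddexp(0.0, x)
```

Softplus is a smooth approximation of ReLU: it is close to 0 for large negative inputs and close to x for large positive inputs.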

statinf.ml.activations.tanh(x)[source]

Hyperbolic tangent activation function.

Parameters

x (float or numpy.array) – Input value

Returns

Activated value: $$\tanh(x)$$

Return type

float or numpy.array
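A quick NumPy illustration: tanh squashes inputs into (-1, 1) and is odd, so $\tanh(-x) = -\tanh(x)$:

```python
import numpy as np

x = np.array([-2.0, 0.0, 2.0])
y = np.tanh(x)  # symmetric about 0, values strictly inside (-1, 1)
```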