Loss Functions

class slugnet.loss.Objective[source]

Bases: object

An objective function (or loss function, or optimization score function) is one of the two parameters required to compile a model.

backward(outputs, targets)[source]

Compute the derivative of the loss with respect to the outputs.

Parameters
outputs, targets : numpy.array
    The predicted outputs and the ground-truth targets used to compute the derivative.

Returns
numpy.array
    The derivative of the loss with respect to each output.
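Subclasses implement the loss-specific math against this interface. As a minimal sketch, a hypothetical mean squared error objective might look like the following; the forward method and its signature are assumptions, since only backward is documented here:

    import numpy as np

    from slugnet.loss import Objective


    class MeanSquaredError(Objective):
        # Hypothetical subclass for illustration; slugnet does not
        # necessarily ship this class, and `forward` is assumed to
        # mirror `backward`'s (outputs, targets) signature.
        def forward(self, outputs, targets):
            # Mean of the squared elementwise differences.
            return np.mean((outputs - targets) ** 2)

        def backward(self, outputs, targets):
            # Elementwise derivative of the mean squared error
            # with respect to the outputs.
            return 2 * (outputs - targets) / outputs.size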
class slugnet.loss.BinaryCrossEntropy(epsilon=1e-11)[source]

Bases: slugnet.loss.Objective

Standard binary cross-entropy loss function.

Binary cross-entropy is given by

\bm{\ell}(\bm{\hat{y}}, \bm{y}) = - \frac{1}{N} \sum_{i=1}^N
    \left[\bm{y}_i \log(\bm{\hat{y}}_i) + (1 - \bm{y}_i) \log(1 - \bm{\hat{y}}_i)\right]

backward(outputs, targets)[source]

Compute the derivative of the loss with respect to the outputs.

Parameters
outputs, targets : numpy.array
    The predicted outputs and the ground-truth targets used to compute the derivative.

Returns
numpy.array
    The derivative of the loss with respect to each output.
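The epsilon constructor argument above suggests predictions are clipped away from 0 and 1 before the logarithms are taken. A minimal numpy sketch of the loss and its derivative under that assumption (whether slugnet's backward includes the 1/N averaging factor is also an assumption):

    import numpy as np

    def binary_cross_entropy(outputs, targets, epsilon=1e-11):
        # Clip predictions away from 0 and 1 so the logarithms
        # stay finite.
        outputs = np.clip(outputs, epsilon, 1 - epsilon)
        return -np.mean(targets * np.log(outputs)
                        + (1 - targets) * np.log(1 - outputs))

    def binary_cross_entropy_grad(outputs, targets, epsilon=1e-11):
        # Derivative of the loss above with respect to each output.
        # The 1/N factor matches the documented formula; some
        # implementations return the unaveraged elementwise derivative.
        outputs = np.clip(outputs, epsilon, 1 - epsilon)
        n = outputs.shape[0]
        return -(targets / outputs - (1 - targets) / (1 - outputs)) / n

    y_hat = np.array([0.9, 0.2, 0.7])
    y = np.array([1.0, 0.0, 1.0])
    print(binary_cross_entropy(y_hat, y))  # ~0.2284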
class slugnet.loss.SoftmaxCategoricalCrossEntropy(epsilon=1e-11)[source]

Bases: slugnet.loss.Objective

Categorical cross-entropy loss applied to softmax-activated outputs.

backward(outputs, targets)[source]

Compute the derivative of the loss with respect to the outputs.

Parameters
outputs, targets : numpy.array
    The predicted outputs and the ground-truth targets used to compute the derivative.

Returns
numpy.array
    The derivative of the loss with respect to each output.
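The class name indicates the softmax activation is fused with categorical cross-entropy, a common pairing because the gradient with respect to the pre-softmax outputs then collapses to \hat{y} - y. A minimal numpy sketch under that reading; whether slugnet's backward expects pre-softmax logits, and whether it averages the gradient over the batch, are assumptions (the gradient below is unaveraged):

    import numpy as np

    def softmax(z):
        # Subtract the rowwise max before exponentiating for
        # numerical stability.
        e = np.exp(z - z.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    def softmax_categorical_cross_entropy(outputs, targets, epsilon=1e-11):
        # Cross-entropy of softmaxed outputs against one-hot targets,
        # averaged over the batch; epsilon clipping keeps the log finite.
        probs = np.clip(softmax(outputs), epsilon, 1 - epsilon)
        return -np.mean(np.sum(targets * np.log(probs), axis=-1))

    def softmax_categorical_cross_entropy_grad(outputs, targets):
        # Fusing softmax with cross-entropy collapses the gradient
        # with respect to the pre-softmax outputs to probs - targets.
        return softmax(outputs) - targets

Fusing the two steps avoids dividing by the softmax probabilities in the backward pass, which keeps the gradient numerically stable when probabilities approach zero.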