weights (NUMERIC) - Weights array. May be null. If null, a weight of 1.0 is used
lossReduce - Reduction type for the loss. See LossReduce for more details. Default: LossReduce.MEAN_BY_NONZERO_WEIGHT_COUNT
Cosine distance loss: 1 - cosineSimilarity(x,y), or equivalently 1 - sum_i label[i] * prediction[i], which is
equal to the cosine distance when both the predictions and labels are normalized.
Note: this loss function assumes that both the predictions and labels are normalized to have unit L2 norm.
If this is not the case, normalize them first by dividing by norm2(String, SDVariable, boolean, int...)
along the cosine distance dimension (with keepDims=true).
label (NUMERIC) - Label array
predictions (NUMERIC) - Predictions array
weights (NUMERIC) - Weights array. May be null. If null, a weight of 1.0 is used
lossReduce - Reduction type for the loss. See LossReduce for more details. Default: LossReduce.MEAN_BY_NONZERO_WEIGHT_COUNT
dimension - Dimension to perform the cosine distance over
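As a minimal plain-Java sketch of the formula above (not the SameDiff/ND4J API; class and method names here are illustrative), the cosine distance loss for a pair of unit-L2-norm vectors reduces to one minus their dot product:

```java
public class CosineDistanceLossDemo {
    // loss = 1 - sum_i label[i] * prediction[i]; equals 1 - cosineSimilarity
    // only when both inputs already have unit L2 norm, as the docs require.
    static double cosineDistanceLoss(double[] label, double[] prediction) {
        double dot = 0.0;
        for (int i = 0; i < label.length; i++) {
            dot += label[i] * prediction[i];
        }
        return 1.0 - dot;
    }

    public static void main(String[] args) {
        double[] label = {1.0, 0.0};       // unit L2 norm
        double[] prediction = {0.6, 0.8};  // unit L2 norm: sqrt(0.36 + 0.64) = 1
        System.out.println(cosineDistanceLoss(label, prediction)); // 1 - 0.6 = 0.4
    }
}
```

Identical unit vectors give a loss of 0; orthogonal vectors give 1.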
Hinge loss: a loss function used for training classifiers.
Implements L = max(0, 1 - t * predictions), where t is the label values after internally converting to {-1,1}
from the user-specified {0,1}. Note that labels should be provided with values in {0,1}.
label (NUMERIC) - Label array. Each value should be 0.0 or 1.0 (internally -1 to 1 is used)
predictions (NUMERIC) - Predictions array
weights (NUMERIC) - Weights array. May be null. If null, a weight of 1.0 is used
lossReduce - Reduction type for the loss. See LossReduce for more details. Default: LossReduce.MEAN_BY_NONZERO_WEIGHT_COUNT
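The label conversion and margin formula above can be sketched in plain Java (illustrative only, not the SameDiff API):

```java
public class HingeLossDemo {
    // Labels arrive in {0,1} and are mapped internally to t in {-1,1};
    // the per-element loss is L = max(0, 1 - t * prediction).
    static double hingeLoss(double label01, double prediction) {
        double t = 2.0 * label01 - 1.0; // 0 -> -1, 1 -> +1
        return Math.max(0.0, 1.0 - t * prediction);
    }

    public static void main(String[] args) {
        System.out.println(hingeLoss(1.0, 2.0)); // confidently correct: 0.0
        System.out.println(hingeLoss(1.0, 0.3)); // correct but inside the margin: 0.7
        System.out.println(hingeLoss(0.0, 0.5)); // wrong side of the margin: 1.5
    }
}
```

Predictions on the correct side of the margin (t * prediction >= 1) contribute zero loss; everything else is penalized linearly.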
weights (NUMERIC) - Weights array. May be null. If null, a weight of 1.0 is used
lossReduce - Reduction type for the loss. See LossReduce for more details. Default: LossReduce.MEAN_BY_NONZERO_WEIGHT_COUNT
weights (NUMERIC) - Weights array. May be null. If null, a weight of 1.0 is used
lossReduce - Reduction type for the loss. See LossReduce for more details. Default: LossReduce.MEAN_BY_NONZERO_WEIGHT_COUNT
Log Poisson loss: a loss function used for training classifiers.
Implements L = exp(c) - z * c, where c is the predictions (supplied already in log space, i.e., c = log(x) of the actual predictions) and z is the labels.
label (NUMERIC) - Label array. Each value should be 0.0 or 1.0
predictions (NUMERIC) - Predictions array (has to be log(x) of actual predictions)
weights (NUMERIC) - Weights array. May be null. If null, a weight of 1.0 is used
lossReduce - Reduction type for the loss. See LossReduce for more details. Default: LossReduce.MEAN_BY_NONZERO_WEIGHT_COUNT
full - Boolean flag. true for logPoissonFull, false for logPoisson
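A minimal plain-Java sketch of the simple (full=false) variant, assuming the predictions are already log-rates as the docs require (illustrative code, not the library API; the full variant additionally adds a Stirling-approximation term for the label, per the usual definition):

```java
public class LogPoissonLossDemo {
    // Simple log Poisson loss: L = exp(c) - z * c, where c is the prediction
    // already in log space and z is the label.
    static double logPoissonLoss(double z, double c) {
        return Math.exp(c) - z * c;
    }

    public static void main(String[] args) {
        double rate = 3.0;            // hypothetical predicted rate x
        double c = Math.log(rate);    // predictions must be supplied as log(x)
        System.out.println(logPoissonLoss(2.0, c)); // exp(log 3) - 2 * log 3
    }
}
```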
weights (NUMERIC) - Weights array. May be null. If null, a weight of 1.0 is used. Must be either null, scalar, or have shape [batchSize]
lossReduce - Reduction type for the loss. See LossReduce for more details. Default: LossReduce.MEAN_BY_NONZERO_WEIGHT_COUNT
Mean squared error loss function. Implements (label[i] - prediction[i])^2 - i.e., squared error on a per-element basis.
When averaged (using LossReduce#MEAN_BY_WEIGHT or LossReduce#MEAN_BY_NONZERO_WEIGHT_COUNT (the default))
this is the mean squared error loss function.
label (NUMERIC) - Label array
predictions (NUMERIC) - Predictions array
weights (NUMERIC) - Weights array. May be null. If null, a weight of 1.0 is used
lossReduce - Reduction type for the loss. See LossReduce for more details. Default: LossReduce.MEAN_BY_NONZERO_WEIGHT_COUNT
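The per-element squared error and its averaged form can be sketched in plain Java (illustrative only; with all weights equal to 1.0, MEAN_BY_NONZERO_WEIGHT_COUNT reduces to a plain mean):

```java
public class MeanSquaredErrorDemo {
    // Per-element squared error (label[i] - prediction[i])^2, averaged over
    // all elements; with unit weights this is the mean squared error.
    static double meanSquaredError(double[] label, double[] prediction) {
        double sum = 0.0;
        for (int i = 0; i < label.length; i++) {
            double d = label[i] - prediction[i];
            sum += d * d;
        }
        return sum / label.length;
    }

    public static void main(String[] args) {
        double[] label = {1.0, 2.0, 3.0};
        double[] prediction = {1.0, 2.5, 2.0};
        System.out.println(meanSquaredError(label, prediction)); // (0 + 0.25 + 1) / 3
    }
}
```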
weights (NUMERIC) - Weights array. May be null. If null, a weight of 1.0 is used
lossReduce - Reduction type for the loss. See LossReduce for more details. Default: LossReduce.MEAN_BY_NONZERO_WEIGHT_COUNT
Applies the softmax activation function to the input, then implements multi-class cross entropy:
{@code -sum_c label[c] * log(p[c])} where {@code p = softmax(logits)}
If LossReduce#NONE is used, the output shape is [numExamples] for [numExamples, numClasses] predictions/labels;
otherwise, the output is a scalar.
When label smoothing is > 0, the following label smoothing is used:
label = (1.0 - labelSmoothing) * label + labelSmoothing / numClasses
weights (NUMERIC) - Weights array. May be null. If null, a weight of 1.0 is used
lossReduce - Reduction type for the loss. See LossReduce for more details. Default: LossReduce.MEAN_BY_NONZERO_WEIGHT_COUNT
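A plain-Java sketch of the softmax-then-cross-entropy computation for a single example (illustrative, not the SameDiff API; the label smoothing here is the standard formula label = (1 - s) * label + s / numClasses, which is an assumption to verify against the library's Javadoc):

```java
public class SoftmaxCrossEntropyDemo {
    // Numerically stable softmax: subtract the max logit before exponentiating.
    static double[] softmax(double[] logits) {
        double max = Double.NEGATIVE_INFINITY;
        for (double l : logits) max = Math.max(max, l);
        double[] p = new double[logits.length];
        double sum = 0.0;
        for (int i = 0; i < logits.length; i++) {
            p[i] = Math.exp(logits[i] - max);
            sum += p[i];
        }
        for (int i = 0; i < p.length; i++) p[i] /= sum;
        return p;
    }

    // Cross entropy -sum_c label[c] * log(p[c]) with optional label smoothing
    // (assumed form: label = (1 - s) * label + s / numClasses).
    static double softmaxCrossEntropy(double[] logits, double[] label, double labelSmoothing) {
        int numClasses = label.length;
        double[] p = softmax(logits);
        double loss = 0.0;
        for (int c = 0; c < numClasses; c++) {
            double smoothed = (1.0 - labelSmoothing) * label[c] + labelSmoothing / numClasses;
            loss -= smoothed * Math.log(p[c]);
        }
        return loss;
    }

    public static void main(String[] args) {
        double[] logits = {2.0, 1.0, 0.1};
        double[] oneHot = {1.0, 0.0, 0.0};
        System.out.println(softmaxCrossEntropy(logits, oneHot, 0.0)); // = -log(p[0])
        System.out.println(softmaxCrossEntropy(logits, oneHot, 0.1)); // smoothed labels
    }
}
```

With labelSmoothing = 0 and a one-hot label, the loss is just the negative log-probability of the true class.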