Activations
Non-linear functions applied to the outputs of neural network layers.
Note that the activations below do not cover every available choice. If you need more than what is listed here, consider using SameDiff, which offers a much wider array of operations; SameDiff code can be embedded in a DL4J network through the SameDiff layer classes (for example, SameDiffLayer).
What are activations?

An activation function is a non-linear transformation applied element-wise to a layer's output. Without it, a stack of layers would collapse into a single linear transformation, so the non-linearity is what lets a network model complex relationships. In DL4J, an activation is usually selected per layer via the Activation enum on the layer's builder, or applied as a standalone ActivationLayer as shown under Usage below.
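As a minimal sketch of the per-layer approach (the layer sizes and the specific Activation choices here are illustrative assumptions, not values from this page), a simple MultiLayerNetwork configuration might set activations like this:

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
    .list()
    // hidden layer whose outputs pass through a ReLU non-linearity
    .layer(new DenseLayer.Builder().nIn(784).nOut(128)
        .activation(Activation.RELU).build())
    // output layer using softmax to turn scores into class probabilities
    .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
        .nIn(128).nOut(10).activation(Activation.SOFTMAX).build())
    .build();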
Usage
ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder()
    // add hyperparameters (updater, weight init, ...) here
    .graphBuilder()
    // add inputs and any preceding layers here
    .addLayer("softmax", new ActivationLayer(Activation.SOFTMAX), "previous_input")
    // add more layers and set the network outputs
    .build();
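The snippet above only produces a configuration. A minimal follow-up sketch (assuming the standard ComputationGraph API, and that the configuration declares its inputs and outputs) builds and initializes the network:

import org.deeplearning4j.nn.graph.ComputationGraph;

// create the network from the configuration and initialize its parameters
ComputationGraph net = new ComputationGraph(conf);
net.init();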
Available activations

ActivationRectifiedTanh
ActivationELU
ActivationReLU
ActivationRationalTanh
ActivationThresholdedReLU
ActivationReLU6
ActivationHardTanH
ActivationSigmoid
ActivationGELU
ActivationPReLU
ActivationIdentity
ActivationSoftSign
ActivationHardSigmoid
ActivationSoftmax
ActivationCube
ActivationRReLU
ActivationTanH
ActivationSELU
ActivationLReLU
ActivationSwish
ActivationSoftPlus
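Each name above is an IActivation implementation in the ND4J activations package. The Activation enum covers the default-parameter cases, while a parameterized instance can also be passed to a layer builder directly. A minimal sketch, where the leaky-ReLU slope of 0.1 is an illustrative assumption:

import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.nd4j.linalg.activations.impl.ActivationLReLU;

// leaky ReLU with an explicit negative-slope coefficient instead of the enum's default
DenseLayer layer = new DenseLayer.Builder()
    .nIn(64).nOut(64)
    .activation(new ActivationLReLU(0.1))
    .build();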