Supported Keras loss functions.
DL4J supports all available Keras losses (except for logcosh), namely:
mean_squared_error
mean_absolute_error
mean_absolute_percentage_error
mean_squared_logarithmic_error
squared_hinge
hinge
categorical_hinge
categorical_crossentropy
sparse_categorical_crossentropy
binary_crossentropy
kullback_leibler_divergence
poisson
cosine_proximity
The mapping of Keras loss functions to DL4J loss functions can be found in KerasLossUtils.
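To give a feel for the DL4J side of this mapping, here is a minimal sketch that configures DL4J output layers with the loss functions we assume correspond to mean_squared_error and categorical_crossentropy (LossFunction.MSE and LossFunction.MCXENT); the layer sizes are arbitrary, and KerasLossUtils remains the authoritative reference.

```java
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class LossMappingSketch {
    public static void main(String[] args) {
        // Keras mean_squared_error -> DL4J LossFunction.MSE (assumed equivalent)
        OutputLayer regressionOutput = new OutputLayer.Builder(LossFunction.MSE)
                .nIn(32).nOut(1)                  // arbitrary example sizes
                .activation(Activation.IDENTITY)  // linear output, as in a Keras regression head
                .build();

        // Keras categorical_crossentropy -> DL4J LossFunction.MCXENT (assumed equivalent)
        OutputLayer classificationOutput = new OutputLayer.Builder(LossFunction.MCXENT)
                .nIn(32).nOut(10)
                .activation(Activation.SOFTMAX)
                .build();

        System.out.println(regressionOutput);
        System.out.println(classificationOutput);
    }
}
```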
Supported Keras weight initializers.
DL4J supports all available Keras initializers, namely:
Zeros
Ones
Constant
RandomNormal
RandomUniform
TruncatedNormal
VarianceScaling
Orthogonal
Identity
lecun_uniform
lecun_normal
glorot_normal
glorot_uniform
he_normal
he_uniform
The mapping of Keras to DL4J initializers can be found in KerasInitilizationUtils.
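As an illustration of the DL4J side, the sketch below sets two weight initializers on plain DL4J layers. The pairings shown (glorot_uniform with WeightInit.XAVIER_UNIFORM, lecun_normal with WeightInit.LECUN_NORMAL) and the layer sizes are illustrative assumptions; KerasInitilizationUtils holds the exact mapping.

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.weights.WeightInit;

public class InitializerMappingSketch {
    public static void main(String[] args) {
        // Keras glorot_uniform -> DL4J WeightInit.XAVIER_UNIFORM (assumed equivalent)
        DenseLayer glorotLayer = new DenseLayer.Builder()
                .nIn(64).nOut(32)                      // arbitrary example sizes
                .weightInit(WeightInit.XAVIER_UNIFORM)
                .build();

        // Keras lecun_normal -> DL4J WeightInit.LECUN_NORMAL (assumed equivalent)
        DenseLayer lecunLayer = new DenseLayer.Builder()
                .nIn(64).nOut(32)
                .weightInit(WeightInit.LECUN_NORMAL)
                .build();

        System.out.println(glorotLayer);
        System.out.println(lecunLayer);
    }
}
```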
Supported Keras constraints.
All Keras constraints are supported:
max_norm
non_neg
unit_norm
min_max_norm
Mapping Keras to DL4J constraints happens in KerasConstraintUtils.
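DL4J's own constraint classes live in org.deeplearning4j.nn.conf.constraint. The sketch below is a hedged illustration that attaches a max-norm and a non-negativity constraint to the weights of a small configuration; the constraint value and the dimension argument are assumptions made for the example, and KerasConstraintUtils defines the actual mapping.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.constraint.MaxNormConstraint;
import org.deeplearning4j.nn.conf.constraint.NonNegativeConstraint;
import org.deeplearning4j.nn.conf.layers.DenseLayer;

public class ConstraintMappingSketch {
    public static void main(String[] args) {
        // Keras max_norm(2.0) is assumed to correspond to MaxNormConstraint(2.0, 1)
        // (dimension 1 for dense weight matrices), and non_neg to NonNegativeConstraint.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .constrainWeights(new MaxNormConstraint(2.0, 1), new NonNegativeConstraint())
                .list()
                .layer(0, new DenseLayer.Builder().nIn(8).nOut(4).build())
                .build();

        System.out.println(conf.toJson());
    }
}
```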
Supported Keras optimizers.
All standard Keras optimizers are supported, but importing custom TensorFlow optimizers won't work:
SGD
RMSprop
Adagrad
Adadelta
Adam
Adamax
Nadam
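On the DL4J side, the optimizers listed above become Nd4j updaters from org.nd4j.linalg.learning.config. The sketch below assumes a Keras model compiled with Adam(learning_rate=1e-3) and shows the updater we take to be its counterpart; the hyperparameter value and layer sizes are placeholders.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class UpdaterMappingSketch {
    public static void main(String[] args) {
        // Keras Adam(learning_rate=1e-3) -> Nd4j Adam updater (assumed equivalent);
        // Sgd, RmsProp, AdaGrad, AdaDelta, AdaMax and Nadam can be swapped in analogously.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .updater(new Adam(1e-3))
                .list()
                .layer(0, new OutputLayer.Builder(LossFunction.MCXENT)
                        .nIn(32).nOut(10)
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();

        System.out.println(conf.toJson());
    }
}
```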
Supported Keras regularizers.
All Keras regularizers are supported by DL4J model import:
l1
l2
l1_l2
Mapping of regularizers can be found in KerasRegularizerUtils.
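On the DL4J side, l1, l2 and l1_l2 correspond to per-layer l1/l2 coefficients. The sketch below uses arbitrary coefficients purely for illustration; KerasRegularizerUtils defines the actual mapping.

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;

public class RegularizerMappingSketch {
    public static void main(String[] args) {
        // Keras l1_l2(l1=1e-4, l2=1e-4) is assumed to correspond to setting
        // both l1 and l2 on the DL4J layer; the coefficients are placeholders.
        DenseLayer regularizedLayer = new DenseLayer.Builder()
                .nIn(64).nOut(32)   // arbitrary example sizes
                .l1(1e-4)
                .l2(1e-4)
                .build();

        System.out.println(regularizedLayer);
    }
}
```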
Supported Keras activations.
We support all Keras activation functions, namely:
softmax
elu
selu
softplus
softsign
relu
tanh
sigmoid
hard_sigmoid
linear
The mapping of Keras to DL4J activation functions is defined in KerasActivationUtils.
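The activation names above have counterparts in DL4J's Activation enum. The sketch below shows two pairings we assume (relu with Activation.RELU, hard_sigmoid with Activation.HARDSIGMOID) on plain DL4J layers; KerasActivationUtils is the authoritative source.

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.nd4j.linalg.activations.Activation;

public class ActivationMappingSketch {
    public static void main(String[] args) {
        // Keras "relu" -> Activation.RELU (assumed equivalent)
        DenseLayer reluLayer = new DenseLayer.Builder()
                .nIn(64).nOut(32)   // arbitrary example sizes
                .activation(Activation.RELU)
                .build();

        // Keras "hard_sigmoid" -> Activation.HARDSIGMOID (assumed equivalent)
        DenseLayer hardSigmoidLayer = new DenseLayer.Builder()
                .nIn(64).nOut(32)
                .activation(Activation.HARDSIGMOID)
                .build();

        System.out.println(reluLayer);
        System.out.println(hardSigmoidLayer);
    }
}
```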
Supported Keras features.
While not every concept in DL4J has an equivalent in Keras and vice versa, many of the key concepts can be matched. Importing Keras models into DL4J is done in our deeplearning4j-modelimport module. Below is a comprehensive list of currently supported features.
Note that tf.keras models can be imported as well. The format changed only slightly from Keras to tf.keras, and this transition is handled from version beta7 onwards.
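As a minimal sketch of what model import looks like in practice: a Keras Sequential model comes in as a MultiLayerNetwork and a functional model as a ComputationGraph. The HDF5 file names below are hypothetical placeholders for files produced with Keras's model.save.

```java
import org.deeplearning4j.nn.graph.ComputationGraph;
import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class KerasImportSketch {
    public static void main(String[] args) throws Exception {
        // "sequential_model.h5" and "functional_model.h5" are hypothetical file names;
        // any HDF5 file written by Keras/tf.keras model.save(...) is imported the same way.
        MultiLayerNetwork sequentialModel =
                KerasModelImport.importKerasSequentialModelAndWeights("sequential_model.h5");
        ComputationGraph functionalModel =
                KerasModelImport.importKerasModelAndWeights("functional_model.h5");

        // Print the imported architectures to verify the layer mapping.
        System.out.println(sequentialModel.summary());
        System.out.println(functionalModel.summary());
    }
}
```

Both methods throw checked exceptions when the configuration is invalid or relies on unsupported features, which is why main declares throws Exception.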
Mapping Keras layers to DL4J layers is done in the layers sub-module of model import; the structure of that sub-module loosely reflects the structure of Keras. In the feature matrix below, ✅ marks supported and ❌ unsupported items.
Layers:
❌ GRU
✅ LSTM
❌ ConvLSTM2D
✅ Add / add
✅ Multiply / multiply
✅ Subtract / subtract
✅ Average / average
✅ Maximum / maximum
✅ Concatenate / concatenate
❌ Dot / dot
✅ PReLU
✅ ELU
❌ TimeDistributed
Losses:
✅ mean_squared_error
✅ mean_absolute_error
✅ mean_absolute_percentage_error
✅ mean_squared_logarithmic_error
✅ squared_hinge
✅ hinge
✅ categorical_hinge
❌ logcosh
✅ categorical_crossentropy
✅ sparse_categorical_crossentropy
✅ binary_crossentropy
✅ kullback_leibler_divergence
✅ poisson
✅ cosine_proximity
Activations:
✅ softmax
✅ elu
✅ selu
✅ softplus
✅ softsign
✅ relu
✅ tanh
✅ sigmoid
✅ hard_sigmoid
✅ linear
Initializers:
✅ Zeros
✅ Ones
✅ Constant
✅ RandomNormal
✅ RandomUniform
✅ TruncatedNormal
✅ VarianceScaling
✅ Orthogonal
✅ Identity
✅ lecun_uniform
✅ lecun_normal
✅ glorot_normal
✅ glorot_uniform
✅ he_normal
✅ he_uniform
Regularizers:
✅ l1
✅ l2
✅ l1_l2
Constraints:
✅ max_norm
✅ non_neg
✅ unit_norm
✅ min_max_norm
Optimizers:
✅ SGD
✅ RMSprop
✅ Adagrad
✅ Adadelta
✅ Adam
✅ Adamax
✅ Nadam
❌ TFOptimizer