Supported Features Overview

Supported Keras features.

Keras Model Import: Supported Features

While not every concept in DL4J has an equivalent in Keras and vice versa, many of the key concepts can be matched. Importing Keras models into DL4J is done in the deeplearning4j-modelimport module. Below is a comprehensive list of currently supported features:

  • Layers
  • Losses
  • Activations
  • Initializers
  • Regularizers
  • Constraints
  • Metrics
  • Optimizers

Note that tf.keras models are supported as well. The saved model format changed only slightly from Keras to tf.keras, and this transition is handled from 1.0.0-beta7 onwards.
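As a minimal sketch of the import workflow (the file name my_model.h5, the input shape, and the class name are hypothetical), a Sequential model saved to HDF5 from Keras or tf.keras can be loaded through the KerasModelImport entry point:

```java
import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class KerasImportExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical file: a Sequential model saved with model.save("my_model.h5") in Keras/tf.keras
        String modelPath = "my_model.h5";

        // Sequential models map to MultiLayerNetwork; functional (Model API) models would instead
        // use KerasModelImport.importKerasModelAndWeights(...) and return a ComputationGraph.
        MultiLayerNetwork network = KerasModelImport.importKerasSequentialModelAndWeights(modelPath);

        // Run inference on a dummy input to confirm the import worked.
        // The shape must match the original model's input (here assumed to be 784 features).
        INDArray input = Nd4j.rand(1, 784);
        INDArray output = network.output(input);
        System.out.println(output);
    }
}
```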

Layers

Mapping Keras layers to DL4J layers is done in the layers sub-module of model import. The structure of this project loosely reflects the structure of Keras.
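To see this mapping in action, one can inspect the concrete DL4J layer types an imported network ends up with. A small sketch, reusing the same hypothetical my_model.h5 file as above:

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class InspectImportedLayers {
    public static void main(String[] args) throws Exception {
        MultiLayerNetwork network =
                KerasModelImport.importKerasSequentialModelAndWeights("my_model.h5");

        // Each supported Keras layer is mapped to a DL4J layer implementation,
        // e.g. keras.layers.Dense -> org.deeplearning4j.nn.layers.feedforward.dense.DenseLayer
        for (Layer layer : network.getLayers()) {
            System.out.println(layer.getClass().getName());
        }

        // summary() prints the full imported architecture, including parameter counts
        System.out.println(network.summary());
    }
}
```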

Core Layers

  • ✅ Dense
  • ✅ Activation
  • ✅ Dropout
  • ✅ Flatten
  • ✅ Reshape
  • ✅ Merge
  • ✅ Permute
  • ✅ RepeatVector
  • ✅ Lambda
  • ❌ ActivityRegularization
  • ✅ Masking
  • ✅ SpatialDropout1D
  • ✅ SpatialDropout2D
  • ✅ SpatialDropout3D

Convolutional Layers

  • ✅ Conv1D
  • ✅ Conv2D
  • ✅ Conv3D
  • ✅ AtrousConvolution1D
  • ✅ AtrousConvolution2D
  • ❌ SeparableConv1D
  • ✅ SeparableConv2D
  • ✅ Conv2DTranspose
  • ❌ Conv3DTranspose
  • ✅ Cropping1D
  • ✅ Cropping2D
  • ✅ Cropping3D
  • ✅ UpSampling1D
  • ✅ UpSampling2D
  • ✅ UpSampling3D
  • ✅ ZeroPadding1D
  • ✅ ZeroPadding2D
  • ✅ ZeroPadding3D

Pooling Layers

  • ✅ MaxPooling1D
  • ✅ MaxPooling2D
  • ✅ MaxPooling3D
  • ✅ AveragePooling1D
  • ✅ AveragePooling2D
  • ✅ AveragePooling3D
  • ✅ GlobalMaxPooling1D
  • ✅ GlobalMaxPooling2D
  • ✅ GlobalMaxPooling3D
  • ✅ GlobalAveragePooling1D
  • ✅ GlobalAveragePooling2D
  • ✅ GlobalAveragePooling3D

Locally-connected Layers

  • ✅ LocallyConnected1D
  • ✅ LocallyConnected2D

Recurrent Layers

  • ✅ SimpleRNN
  • ❌ GRU
  • ✅ LSTM
  • ❌ ConvLSTM2D

Embedding Layers

  • ✅ Embedding

Merge Layers

  • ✅ Add / add
  • ✅ Multiply / multiply
  • ✅ Subtract / subtract
  • ✅ Average / average
  • ✅ Maximum / maximum
  • ✅ Concatenate / concatenate
  • ❌ Dot / dot

Advanced Activation Layers

  • ✅ LeakyReLU
  • ✅ PReLU
  • ✅ ELU
  • ✅ ThresholdedReLU

Normalization Layers

  • ✅ BatchNormalization

Noise Layers

  • ✅ GaussianNoise
  • ✅ GaussianDropout
  • ✅ AlphaDropout

Layer Wrappers

  • ❌ TimeDistributed
  • ✅ Bidirectional

Losses

  • ✅ mean_squared_error
  • ✅ mean_absolute_error
  • ✅ mean_absolute_percentage_error
  • ✅ mean_squared_logarithmic_error
  • ✅ squared_hinge
  • ✅ hinge
  • ✅ categorical_hinge
  • ❌ logcosh
  • ✅ categorical_crossentropy
  • ✅ sparse_categorical_crossentropy
  • ✅ binary_crossentropy
  • ✅ kullback_leibler_divergence
  • ✅ poisson
  • ✅ cosine_proximity

Activations

  • ✅ softmax
  • ✅ elu
  • ✅ selu
  • ✅ softplus
  • ✅ softsign
  • ✅ relu
  • ✅ tanh
  • ✅ sigmoid
  • ✅ hard_sigmoid
  • ✅ linear

Initializers

  • ✅ Zeros
  • ✅ Ones
  • ✅ Constant
  • ✅ RandomNormal
  • ✅ RandomUniform
  • ✅ TruncatedNormal
  • ✅ VarianceScaling
  • ✅ Orthogonal
  • ✅ Identity
  • ✅ lecun_uniform
  • ✅ lecun_normal
  • ✅ glorot_normal
  • ✅ glorot_uniform
  • ✅ he_normal
  • ✅ he_uniform

Regularizers

  • ✅ l1
  • ✅ l2
  • ✅ l1_l2

Constraints

  • ✅ max_norm
  • ✅ non_neg
  • ✅ unit_norm
  • ✅ min_max_norm

Optimizers

  • ✅ SGD
  • ✅ RMSprop
  • ✅ Adagrad
  • ✅ Adadelta
  • ✅ Adam
  • ✅ Adamax
  • ✅ Nadam
  • ❌ TFOptimizer
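Losses and optimizers only matter if an imported model is trained further in DL4J. When a model's training configuration relies on something unsupported (for example the TFOptimizer wrapper or the logcosh loss), the architecture and weights can still be imported for inference by relaxing the training-configuration check. A hedged sketch, again assuming a hypothetical my_model.h5 file:

```java
import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class ImportForInferenceOnly {
    public static void main(String[] args) throws Exception {
        // enforceTrainingConfig = false: import architecture and weights for inference,
        // without requiring the loss/optimizer configuration to be fully supported.
        MultiLayerNetwork network =
                KerasModelImport.importKerasSequentialModelAndWeights("my_model.h5", false);

        System.out.println(network.summary());
    }
}
```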