Keras Model Import: Supported Features

While not every concept in DL4J has an equivalent in Keras and vice versa, many of the key concepts can be matched. Importing Keras models into DL4J is done in our deeplearning4j-modelimport module. Below is a comprehensive list of currently supported features.

  • Layers
  • Losses
  • Activations
  • Initializers
  • Regularizers
  • Constraints
  • Metrics
  • Optimizers
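As a quick orientation, the import entry point in that module is the KerasModelImport class. The sketch below assumes a Keras Sequential model has already been saved to a hypothetical model.h5 file with Keras' model.save(...):

```java
import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class KerasImportExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical path to a Keras Sequential model saved as HDF5.
        String modelPath = "model.h5";

        // Import architecture and weights as a DL4J MultiLayerNetwork.
        // Models built with the Keras functional API are imported with
        // KerasModelImport.importKerasModelAndWeights(...) and come back
        // as a ComputationGraph instead.
        MultiLayerNetwork network =
                KerasModelImport.importKerasSequentialModelAndWeights(modelPath);

        // Print the imported layer structure.
        System.out.println(network.summary());
    }
}
```

If the saved model relies on one of the features marked ❌ below, the import typically fails with an UnsupportedKerasConfigurationException rather than silently dropping the unsupported piece.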

Layers

Mapping Keras to DL4J layers is done in the layers sub-module of model import. The structure of this project loosely reflects the structure of Keras.
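After import, the mapped layers can be inspected directly on the resulting network. This is a small sketch (again assuming a hypothetical model.h5) showing that each supported Keras layer arrives as its DL4J counterpart, e.g. a Keras Dense layer is mapped to a DL4J DenseLayer and Conv2D to a ConvolutionLayer:

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class InspectImportedLayers {
    public static void main(String[] args) throws Exception {
        // "model.h5" is a placeholder for a saved Keras Sequential model.
        MultiLayerNetwork network =
                KerasModelImport.importKerasSequentialModelAndWeights("model.h5");

        // Print the DL4J class each imported Keras layer was mapped to.
        for (Layer layer : network.getLayers()) {
            System.out.println(layer.getClass().getSimpleName());
        }
    }
}
```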

Core Layers

  • ✅ Dense
  • ✅ Activation
  • ✅ Dropout
  • ✅ Flatten
  • ✅ Reshape
  • ✅ Merge
  • ✅ Permute
  • ✅ RepeatVector
  • ✅ Lambda
  • ❌ ActivityRegularization
  • ✅ Masking
  • ✅ SpatialDropout1D
  • ✅ SpatialDropout2D
  • ✅ SpatialDropout3D

Convolutional Layers

  • ✅ Conv1D
  • ✅ Conv2D
  • ✅ Conv3D
  • ✅ AtrousConvolution1D
  • ✅ AtrousConvolution2D
  • ❌ SeparableConv1D
  • ✅ SeparableConv2D
  • ✅ Conv2DTranspose
  • ❌ Conv3DTranspose
  • ✅ Cropping1D
  • ✅ Cropping2D
  • ✅ Cropping3D
  • ✅ UpSampling1D
  • ✅ UpSampling2D
  • ✅ UpSampling3D
  • ✅ ZeroPadding1D
  • ✅ ZeroPadding2D
  • ✅ ZeroPadding3D

Pooling Layers

  • ✅ MaxPooling1D
  • ✅ MaxPooling2D
  • ✅ MaxPooling3D
  • ✅ AveragePooling1D
  • ✅ AveragePooling2D
  • ✅ AveragePooling3D
  • ✅ GlobalMaxPooling1D
  • ✅ GlobalMaxPooling2D
  • ✅ GlobalMaxPooling3D
  • ✅ GlobalAveragePooling1D
  • ✅ GlobalAveragePooling2D
  • ✅ GlobalAveragePooling3D

Locally-connected Layers

  • ✅ LocallyConnected1D
  • ✅ LocallyConnected2D

Recurrent Layers

  • ✅ SimpleRNN
  • ❌ GRU
  • ✅ LSTM
  • ❌ ConvLSTM2D

Embedding Layers

  • ✅ Embedding

Merge Layers

  • ✅ Add / add
  • ✅ Multiply / multiply
  • ✅ Subtract / subtract
  • ✅ Average / average
  • ✅ Maximum / maximum
  • ✅ Concatenate / concatenate
  • ❌ Dot / dot

Advanced Activation Layers

  • ✅ LeakyReLU
  • ✅ PReLU
  • ✅ ThresholdedReLU
  • ✅ ELU

Normalization Layers

  • ✅ BatchNormalization

Noise Layers

  • ✅ GaussianNoise
  • ✅ GaussianDropout
  • ✅ AlphaDropout

Layer Wrappers

  • ❌ TimeDistributed
  • ✅ Bidirectional

Losses

  • ✅ mean_squared_error
  • ✅ mean_absolute_error
  • ✅ mean_absolute_percentage_error
  • ✅ mean_squared_logarithmic_error
  • ✅ squared_hinge
  • ✅ hinge
  • ✅ categorical_hinge
  • ❌ logcosh
  • ✅ categorical_crossentropy
  • ✅ sparse_categorical_crossentropy
  • ✅ binary_crossentropy
  • ✅ kullback_leibler_divergence
  • ✅ poisson
  • ✅ cosine_proximity

Activations

  • ✅ softmax
  • ✅ elu
  • ✅ selu
  • ✅ softplus
  • ✅ softsign
  • ✅ relu
  • ✅ tanh
  • ✅ sigmoid
  • ✅ hard_sigmoid
  • ✅ linear

Initializers

  • ✅ Zeros
  • ✅ Ones
  • ✅ Constant
  • ✅ RandomNormal
  • ✅ RandomUniform
  • ✅ TruncatedNormal
  • ✅ VarianceScaling
  • ✅ Orthogonal
  • ✅ Identity
  • ✅ lecun_uniform
  • ✅ lecun_normal
  • ✅ glorot_normal
  • ✅ glorot_uniform
  • ✅ he_normal
  • ✅ he_uniform

Regularizers

  • ✅ l1
  • ✅ l2
  • ✅ l1_l2

Constraints

  • ✅ max_norm
  • ✅ non_neg
  • ✅ unit_norm
  • ✅ min_max_norm

Optimizers

  • ✅ SGD
  • ✅ RMSprop
  • ✅ Adagrad
  • ✅ Adadelta
  • ✅ Adam
  • ✅ Adamax
  • ✅ Nadam
  • ❌ TFOptimizer