0.7.2
Added variational autoencoder
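A minimal sketch of configuring the new variational autoencoder layer for unsupervised pretraining; the input size, latent dimension, hidden layer sizes, and activation below are illustrative, not taken from the release notes:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.variational.VariationalAutoencoder;
import org.nd4j.linalg.activations.Activation;

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
    .list()
    .layer(0, new VariationalAutoencoder.Builder()
        .nIn(784)                       // input size (e.g., flattened 28x28 images)
        .nOut(32)                       // size of the latent space
        .encoderLayerSizes(256, 256)    // encoder hidden layer sizes
        .decoderLayerSizes(256, 256)    // decoder hidden layer sizes
        .activation(Activation.LEAKYRELU)
        .build())
    .pretrain(true).backprop(false)     // VAE is trained in the pretrain (unsupervised) phase
    .build();
```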
Activation function refactor
Activation functions are now an interface
Configuration now via enumeration, not via String (see the example after this list)
Custom activation functions now supported
New activation functions added: hard sigmoid, randomized leaky rectified linear units (RReLU)
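A short example of the enum-based activation configuration (the nIn/nOut values are illustrative). Custom activation functions can be supplied by implementing the IActivation interface and passing the instance to the same builder method:

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.nd4j.linalg.activations.Activation;

// Enum-based configuration replaces the deprecated String form (e.g. .activation("relu")).
// The newly added functions are available as Activation.HARDSIGMOID and Activation.RRELU.
DenseLayer layer = new DenseLayer.Builder()
    .nIn(100).nOut(50)
    .activation(Activation.RELU)
    .build();

// Custom activations: implement org.nd4j.linalg.activations.IActivation
// and pass the instance via the .activation(IActivation) overload.
```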
Multiple fixes/improvements for Keras model import
Added P-norm pooling for CNNs (option as part of SubsamplingLayer configuration)
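A sketch of enabling p-norm pooling, assuming the PNORM pooling type and the pnorm(int) builder option; kernel size, stride, and p value are illustrative:

```java
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer.PoolingType;

// P-norm pooling configured as a SubsamplingLayer option
SubsamplingLayer pool = new SubsamplingLayer.Builder(PoolingType.PNORM)
    .pnorm(2)            // p value for the p-norm
    .kernelSize(2, 2)
    .stride(2, 2)
    .build();
```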
Iteration count persistence: stored/persisted properly in model configuration + fixes to learning rate schedules for Spark network training
LSTM: gate activation function can now be configured (previously: hard-coded to sigmoid)
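A sketch of configuring the LSTM gate activation, assuming the gateActivationFunction builder method on GravesLSTM; layer sizes and the chosen activations are illustrative:

```java
import org.deeplearning4j.nn.conf.layers.GravesLSTM;
import org.nd4j.linalg.activations.Activation;

// Gate activation is now configurable (previously fixed to sigmoid)
GravesLSTM lstm = new GravesLSTM.Builder()
    .nIn(100).nOut(200)
    .activation(Activation.TANH)                    // activation for cell input/output
    .gateActivationFunction(Activation.HARDSIGMOID) // activation for input/forget/output gates
    .build();
```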
UI:
Added Chinese translation
Fixes for UI + pretrain layers
Added Java 7 compatibility for stats collection
Improvements in front-end for handling NaNs
Added UIServer.stop() method (see the example after this list)
Fixed score vs. iteration moving average line (with subsampling)
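A minimal sketch of shutting down the training UI with the new stop() method, assuming the usual UIServer.getInstance() entry point:

```java
import org.deeplearning4j.ui.api.UIServer;

UIServer uiServer = UIServer.getInstance();
// ... attach a StatsStorage instance, train, inspect results in the browser ...
uiServer.stop();   // new in 0.7.2: shuts the UI server down cleanly
```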
Solved Jaxb/Jackson issue with Spring Boot based applications
RecordReaderDataSetIterator now supports NDArrayWritable for the labels (set regression == true; used for multi-label classification + images, etc)
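A sketch of constructing the iterator for NDArrayWritable labels; the helper method name and labelIndex value are illustrative, and the (labelIndexFrom, labelIndexTo, regression) constructor is assumed:

```java
import org.datavec.api.records.reader.RecordReader;
import org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;

// Builds an iterator over records whose label column is an NDArrayWritable
// (e.g., a multi-label target vector). Passing regression = true tells the
// iterator to use the NDArray values directly rather than one-hot class labels.
public static DataSetIterator multiLabelIterator(RecordReader recordReader, int batchSize, int labelIndex) {
    return new RecordReaderDataSetIterator(recordReader, batchSize, labelIndex, labelIndex, true);
}
```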
Activation functions (built-in): now specified using Activation enumeration, not String (String-based configuration has been deprecated)