Model Listeners

Adding hooks and listeners on DL4J models.

What are listeners?

Listeners allow users to "hook" into certain events in Eclipse Deeplearning4j. This allows you to collect or print information useful for tasks like training. For example, a ScoreIterationListener allows you to print training scores from the output layer of a neural network.

Usage

To add one or more listeners to a MultiLayerNetwork or ComputationGraph, use the setListeners method:

//conf is a previously defined MultiLayerConfiguration
MultiLayerNetwork model = new MultiLayerNetwork(conf);
model.init();
//print the score every iteration
model.setListeners(new ScoreIterationListener(1));
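
Several listeners can be attached at once, since setListeners accepts a varargs list of listeners. As a minimal sketch, combining score printing with score collection (both listeners are described in the reference below):

//print the score every 10 iterations and also collect it for later export
CollectScoresIterationListener collectScores = new CollectScoresIterationListener(10);
model.setListeners(new ScoreIterationListener(10), collectScores);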

Available listeners

EvaluativeListener

This TrainingListener implementation provides a simple way to evaluate a model during training. It can be invoked every Nth iteration or epoch, depending on the frequency and InvocationType constructor arguments.

EvaluativeListener

public EvaluativeListener(@NonNull DataSetIterator iterator, int frequency)

This callback will be invoked after evaluation has finished.

EvaluativeListener

public EvaluativeListener(@NonNull DataSetIterator iterator, int frequency, @NonNull InvocationType type)
  • param iterator Iterator to provide data for evaluation

  • param frequency Frequency (in number of iterations/epochs according to the invocation type) to perform evaluation

  • param type Type of value for ‘frequency’ - iteration end, epoch end, etc
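
For example, to evaluate on a held-out test set at the end of every epoch (a minimal sketch; testIterator is assumed to be a DataSetIterator over the test data):

//run evaluation on the test set at the end of each epoch
model.setListeners(new EvaluativeListener(testIterator, 1, InvocationType.EPOCH_END));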

ScoreIterationListener

Score iteration listener. Reports the score (value of the loss function) of the network during training every N iterations.

ScoreIterationListener

public ScoreIterationListener(int printIterations)
  • param printIterations frequency with which to print scores (i.e., every printIterations parameter updates)

ComposableIterationListener

A group of listeners

CollectScoresIterationListener

CollectScoresIterationListener simply stores the model scores internally (along with the iteration) every 1 or N iterations (this is configurable). These scores can then be obtained or exported.

CollectScoresIterationListener

public CollectScoresIterationListener()

Constructor for collecting scores with a default saving frequency of 1.

CollectScoresIterationListener

public CollectScoresIterationListener(int frequency)

Constructor for collecting scores with the specified frequency.

  • param frequency Frequency with which to collect/save scores

exportScores

public void exportScores(OutputStream outputStream) throws IOException

Export the scores in tab-delimited (one per line) UTF-8 format.

exportScores

public void exportScores(OutputStream outputStream, String delimiter) throws IOException

Export the scores in delimited (one per line) UTF-8 format with the specified delimiter

  • param outputStream Stream to write to

  • param delimiter Delimiter to use

exportScores

public void exportScores(File file) throws IOException

Export the scores to the specified file in tab-delimited (one per line) UTF-8 format.

  • param file File to write to

exportScores

public void exportScores(File file, String delimiter) throws IOException

Export the scores to the specified file in delimited (one per line) UTF-8 format, using the specified delimiter

  • param file File to write to

  • param delimiter Delimiter to use for writing scores
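
Putting these together, a minimal sketch that collects scores during training and exports them afterwards (trainIterator and the output file name are assumptions for illustration):

//collect the score every 10 iterations during training
CollectScoresIterationListener collectScores = new CollectScoresIterationListener(10);
model.setListeners(collectScores);
model.fit(trainIterator);
//write the collected (iteration, score) pairs as tab-delimited UTF-8
collectScores.exportScores(new File("scores.tsv"));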

CheckpointListener

CheckpointListener: The goal of this listener is to periodically save a copy of the model during training. Model saving may be done:

  1. Every N epochs

  2. Every N iterations

  3. Every T time units (every 15 minutes, for example)

Or some combination of the three.

Example 1: Saving a checkpoint every 2 epochs, keeping all model files:

CheckpointListener l = new CheckpointListener.Builder("/save/directory")
    .keepAll() //Don't delete any models
    .saveEveryNEpochs(2)
    .build();

Example 2: Saving a checkpoint every 1000 iterations, but keeping only the last 3 models (all older model files will be automatically deleted)

CheckpointListener l = new CheckpointListener.Builder("/save/directory")
    .keepLast(3)
    .saveEveryNIterations(1000)
    .build();

Example 3: Saving a checkpoint every 15 minutes, keeping the most recent 3 and otherwise every 4th checkpoint file:

CheckpointListener l = new CheckpointListener.Builder("/save/directory")
    .keepLastAndEvery(3, 4)
    .saveEvery(15, TimeUnit.MINUTES)
    .build();

Note that these schedules can be combined: for example, you can save every epoch and every 15 minutes (independent of the last save time), or every epoch and every 15 minutes since the last model save. In the latter case the sinceLast parameter is true, which means the 15-minute counter is reset any time a model is saved.
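
For instance, the second combination might be configured as in the following sketch (saveEveryEpoch() and the three-argument saveEvery overload with a sinceLast flag are assumptions based on the description above):

CheckpointListener l = new CheckpointListener.Builder("/save/directory")
    .keepAll()
    .saveEveryEpoch()                      //assumed: save at the end of every epoch
    .saveEvery(15, TimeUnit.MINUTES, true) //assumed: sinceLast = true, so the 15-minute counter resets whenever a model is saved
    .build();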

availableCheckpoints

public List<Checkpoint> availableCheckpoints()

List all available checkpoints. A checkpoint is ‘available’ if the file can be loaded. Any checkpoint files that have been automatically deleted (given the configuration) will not be returned here.

  • return List of checkpoint files that can be loaded

SharedGradient

SleepyTrainingListener

This TrainingListener implementation provides a way to "sleep" during specific neural network training phases. It is suitable for debugging/testing purposes only.

PLEASE NOTE: All timers treat time values as milliseconds. PLEASE NOTE: Do not use it in a production environment.

onEpochStart

public void onEpochStart(Model model)

In this mode, a parkNanos() call is used to make the process truly idle.

CollectScoresListener

A simple listener that collects scores to a list every N iterations. Can also optionally log the score.
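
A minimal sketch, assuming a constructor that takes the collection frequency and a flag controlling whether the score is also logged:

//collect the score every 5 iterations and log it as well (constructor arguments are an assumption)
model.setListeners(new CollectScoresListener(5, true));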

PerformanceListener

Simple IterationListener that tracks the time spent on training per iteration.

reportIteration

public Builder reportIteration(boolean reportIteration)

This method defines whether the iteration number should be reported together with other data.

  • param reportIteration

  • return
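
A minimal sketch of attaching the listener, assuming a constructor that takes the reporting frequency in iterations:

//report per-iteration timing every 100 iterations (constructor argument is an assumption)
model.setListeners(new PerformanceListener(100));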

ParamAndGradientIterationListener

An iteration listener that provides details on parameters and gradients at each iteration during training. It attempts to provide much of the same information as the UI histogram iteration listener, but in a text-based format (for example, when training on a system accessed via SSH), and is intended to aid network tuning and debugging. This iteration listener calculates the mean, min, max, and mean absolute value of each type of parameter and gradient in the network at each iteration.

TimeIterationListener

Time Iteration Listener. This listener logs (at INFO level) the remaining training time in minutes and the estimated completion date. The remaining time is estimated from the training time elapsed so far and the total number of iterations specified by the user.

TimeIterationListener

public TimeIterationListener(int iterationCount)

Constructor

  • param iterationCount The global number of iterations for training (across all epochs)
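
For example, if training is planned for numEpochs epochs of iterationsPerEpoch minibatches each (both assumed to be defined by the user):

//log the estimated remaining time, based on the total planned number of iterations
model.setListeners(new TimeIterationListener(numEpochs * iterationsPerEpoch));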
