Brief tour of available examples in DL4J.
To access examples as they were during the beta6 release, please use this version of the example repository:
https://github.com/eclipse/deeplearning4j-examples/tree/a4594ba7508a3ab1342b3390a3f8354f6c754ee1
Deeplearning4j's GitHub repository has many examples covering its functionality. The Quick Start Guide shows you how to set up IntelliJ and clone the repository. This page provides an overview of some of those examples.
Most of the examples make use of DataVec, a toolkit for preprocessing and cleaning data through normalization, standardization, search and replace, column shuffling, and vectorization. Reading raw data and transforming it into a DataSet object for your neural network is often the first step toward training that network. If you're unfamiliar with DataVec, here is a description and some links to useful examples.
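For instance, reading a CSV file and turning it into a DataSet might look like the following minimal sketch; the file name, batch size, and column layout are hypothetical placeholders, not taken from a specific example:

```java
import java.io.File;

import org.datavec.api.records.reader.RecordReader;
import org.datavec.api.records.reader.impl.csv.CSVRecordReader;
import org.datavec.api.split.FileSplit;
import org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;

public class CsvToDataSet {
    public static void main(String[] args) throws Exception {
        // Read a CSV file with DataVec (defaults: no header lines skipped, ',' delimiter)
        RecordReader rr = new CSVRecordReader();
        rr.initialize(new FileSplit(new File("iris.csv"))); // hypothetical file

        // Label in column 4, 3 possible classes, batch size of 150
        DataSetIterator iter = new RecordReaderDataSetIterator(rr, 150, 4, 3);
        DataSet data = iter.next(); // features and labels as NDArrays
    }
}
```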
This example takes the canonical Iris dataset of the flower species of the same name, whose relevant measurements are sepal length, sepal width, petal length and petal width. It builds a Spark RDD from the relatively small dataset and runs an analysis against it.
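A rough sketch of that flow, assuming an Iris-style CSV file; the file name and schema here are illustrative, not the example's exact code:

```java
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.datavec.api.records.reader.impl.csv.CSVRecordReader;
import org.datavec.api.transform.analysis.DataAnalysis;
import org.datavec.api.transform.schema.Schema;
import org.datavec.api.writable.Writable;
import org.datavec.spark.transform.AnalyzeSpark;
import org.datavec.spark.transform.misc.StringToWritablesFunction;

public class IrisAnalysisSketch {
    public static void main(String[] args) {
        // Declare the columns we expect in the raw data
        Schema schema = new Schema.Builder()
            .addColumnsDouble("sepal_length", "sepal_width", "petal_length", "petal_width")
            .addColumnInteger("species")
            .build();

        JavaSparkContext sc = new JavaSparkContext(
            new SparkConf().setMaster("local[*]").setAppName("Iris Analysis"));

        // Parse each CSV line into a list of Writables, building a Spark RDD
        JavaRDD<String> lines = sc.textFile("iris.txt"); // hypothetical file
        JavaRDD<List<Writable>> parsed =
            lines.map(new StringToWritablesFunction(new CSVRecordReader()));

        // Compute per-column summary statistics (min, max, mean, stddev, ...)
        DataAnalysis analysis = AnalyzeSpark.analyze(schema, parsed);
        System.out.println(analysis);
    }
}
```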
This example loads data into a Spark RDD. All DataVec transform operations use Spark RDDs. Here, we use DataVec to filter the data, apply time transformations, and remove columns.
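A minimal sketch of a DataVec TransformProcess along those lines; the schema and column names are made up for illustration:

```java
import org.datavec.api.transform.TransformProcess;
import org.datavec.api.transform.condition.ConditionOp;
import org.datavec.api.transform.condition.column.DoubleColumnCondition;
import org.datavec.api.transform.filter.ConditionFilter;
import org.datavec.api.transform.schema.Schema;

public class TransformSketch {
    public static void main(String[] args) {
        Schema inputSchema = new Schema.Builder()
            .addColumnString("CustomerID")
            .addColumnString("MerchantID")
            .addColumnDouble("TransactionAmountUSD")
            .build();

        // Remove a column, then filter out rows with a negative transaction amount
        TransformProcess tp = new TransformProcess.Builder(inputSchema)
            .removeColumns("MerchantID")
            .filter(new ConditionFilter(new DoubleColumnCondition(
                "TransactionAmountUSD", ConditionOp.LessThan, 0.0)))
            .build();

        // On Spark, the process is applied to an RDD of parsed records:
        // JavaRDD<List<Writable>> processed = SparkTransformExecutor.execute(parsed, tp);
    }
}
```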
This example shows the schema-printing tools, which are useful for visualizing each step of a transform and verifying that the transform code behaves as expected.
You may need to join datasets before passing them to a neural network. You can do that in DataVec, and this example shows you how.
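A sketch of how a join can be defined in DataVec; the two schemas and the customerID key column are hypothetical:

```java
import org.datavec.api.transform.join.Join;
import org.datavec.api.transform.schema.Schema;

public class JoinSketch {
    public static void main(String[] args) {
        Schema customerSchema = new Schema.Builder()
            .addColumnLong("customerID")
            .addColumnString("customerName")
            .build();

        Schema purchaseSchema = new Schema.Builder()
            .addColumnLong("customerID")
            .addColumnDouble("amount")
            .build();

        // Inner join on the shared customerID column
        Join join = new Join.Builder(Join.JoinType.Inner)
            .setJoinColumns("customerID")
            .setSchemas(customerSchema, purchaseSchema)
            .build();

        // On Spark: SparkTransformExecutor.executeJoin(join, customerRdd, purchaseRdd);
    }
}
```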
This is an example of parsing log data using DataVec. The obvious use cases are cybersecurity and customer relationship management.
This example is from the video below, which demonstrates the ParentPathLabelGenerator and the ImagePreProcessingScaler.
This example demonstrates preprocessing features available in DataVec.
Metadata tracking - i.e. seeing where the data for each example comes from - is useful when tracking down malformed data that causes errors and other issues. This example demonstrates the functionality in the RecordMetaData class.
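A minimal sketch of how metadata collection can be switched on with a RecordReaderDataSetIterator; the file and column layout are placeholders:

```java
import java.io.File;
import java.util.List;

import org.datavec.api.records.metadata.RecordMetaData;
import org.datavec.api.records.reader.RecordReader;
import org.datavec.api.records.reader.impl.csv.CSVRecordReader;
import org.datavec.api.split.FileSplit;
import org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator;
import org.nd4j.linalg.dataset.DataSet;

public class MetaDataSketch {
    public static void main(String[] args) throws Exception {
        RecordReader rr = new CSVRecordReader();
        rr.initialize(new FileSplit(new File("data.csv"))); // hypothetical file

        RecordReaderDataSetIterator iter = new RecordReaderDataSetIterator(rr, 10, 4, 3);
        iter.setCollectMetaData(true); // attach RecordMetaData to each DataSet

        DataSet ds = iter.next();
        List<RecordMetaData> meta = ds.getExampleMetaData(RecordMetaData.class);
        for (RecordMetaData m : meta) {
            System.out.println(m.getLocation()); // where each example came from
        }
    }
}
```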
To build a neural net, you will use either MultiLayerNetwork or ComputationGraph. Both options work using a Builder interface. A few highlights from the examples are described below.
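As an illustration of the Builder interface, here is a minimal MultiLayerNetwork configuration for MNIST-sized input; the layer sizes and updater settings are arbitrary choices, not taken from a specific example:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class BuilderSketch {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(123)
            .updater(new Adam(1e-3))
            .list()
            // Hidden layer: 784 inputs (28x28 pixels), 256 ReLU units
            .layer(new DenseLayer.Builder()
                .nIn(784).nOut(256).activation(Activation.RELU).build())
            // Output layer: softmax over 10 digit classes
            .layer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                .nIn(256).nOut(10).activation(Activation.SOFTMAX).build())
            .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        // net.fit(trainIterator); // train with any DataSetIterator
    }
}
```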
MNIST is the "Hello World" of deep learning. Simple, straightforward, and focused on image recognition, a task that Neural Networks do well.
This is a Single Layer Perceptron for recognizing digits. Note that this pulls the images from a binary package containing the dataset, a rather special case for data ingestion.
A two-layer perceptron for MNIST, showing there is more than one useful network for a given dataset.
Data flows through feed-forward neural networks in a single pass from input via hidden layers to output.
These networks can be used for a wide range of tasks depending on how they are configured. Along with image classification over MNIST data, this directory has examples demonstrating regression, classification, and anomaly detection.
Convolutional Neural Networks are mainly used for image recognition, although they apply to sound and text as well.
This example can be run using either LeNet or AlexNet.
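A sketch of a small LeNet-style stack of convolutional and max-pooling layers; the kernel sizes and layer widths are illustrative, not the example's exact settings:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LeNetStyleSketch {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(123)
            .list()
            // 5x5 convolution: 1 input channel, 20 feature maps
            .layer(new ConvolutionLayer.Builder(5, 5)
                .nIn(1).nOut(20).stride(1, 1).activation(Activation.RELU).build())
            // 2x2 max pooling halves the spatial dimensions
            .layer(new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
                .kernelSize(2, 2).stride(2, 2).build())
            .layer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                .nOut(10).activation(Activation.SOFTMAX).build())
            // Input is a flattened 28x28 single-channel image; DL4J adds the
            // reshaping preprocessors and infers the remaining nIn values
            .setInputType(InputType.convolutionalFlat(28, 28, 1))
            .build();
    }
}
```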
Training a network over a large volume of training data takes time. Fortunately, you can save a trained model and load the model for later training or inference.
This demonstrates saving and loading a network built using the ComputationGraph class.
Demonstrates saving and loading a Neural Network built with the MultiLayerNetwork class.
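A minimal sketch of saving and reloading with ModelSerializer; the file path is hypothetical:

```java
import java.io.File;

import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.util.ModelSerializer;

public class SaveLoadSketch {
    public static void saveAndReload(MultiLayerNetwork net) throws Exception {
        File file = new File("trained_model.zip"); // hypothetical path

        // true = also save the updater state, needed to resume training later
        ModelSerializer.writeModel(net, file, true);

        // Use the restore method matching the class the network was built with:
        MultiLayerNetwork restored = ModelSerializer.restoreMultiLayerNetwork(file);
        // or: ModelSerializer.restoreComputationGraph(file);
    }
}
```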
Our video series shows code that includes saving and loading models, as well as inference.
Do you need to add a Loss Function that is not available or prebuilt yet? Check out these examples.
Do you need to add a layer with features that aren't available in DeepLearning4J core? This example shows where to begin.
Neural Networks for NLP? We have those, too.
A vectorized representation of words. Described here
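A minimal Word2Vec training sketch; the corpus file name and hyperparameters are placeholders:

```java
import org.deeplearning4j.models.word2vec.Word2Vec;
import org.deeplearning4j.text.sentenceiterator.BasicLineIterator;
import org.deeplearning4j.text.sentenceiterator.SentenceIterator;
import org.deeplearning4j.text.tokenization.tokenizerfactory.DefaultTokenizerFactory;
import org.deeplearning4j.text.tokenization.tokenizerfactory.TokenizerFactory;

public class Word2VecSketch {
    public static void main(String[] args) throws Exception {
        // One sentence per line in a plain-text corpus (hypothetical file)
        SentenceIterator iter = new BasicLineIterator("raw_sentences.txt");
        TokenizerFactory t = new DefaultTokenizerFactory();

        Word2Vec vec = new Word2Vec.Builder()
            .minWordFrequency(5)   // ignore rare words
            .layerSize(100)        // dimensionality of the word vectors
            .seed(42)
            .windowSize(5)
            .iterate(iter)
            .tokenizerFactory(t)
            .build();
        vec.fit();

        System.out.println(vec.wordsNearest("day", 10)); // 10 most similar words
    }
}
```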
One way to represent sentences is as a sequence of words. Described here
t-Distributed Stochastic Neighbor Embedding (t-SNE) is useful for data visualization. We include an example in the NLP section since word similarity visualization is a common use.
Recurrent Neural Networks are useful for processing time series data or other sequentially fed data like video.
The examples folder for Recurrent Neural Networks has the following:
An RNN learns a string of characters.
Takes the complete works of Shakespeare as a sequence of characters and trains a Neural Net to generate "Shakespeare" one character at a time.
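A rough sketch of a character-level recurrent configuration of this kind; the vocabulary size and layer width are made-up values:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class CharRnnSketch {
    public static void main(String[] args) {
        int nChars = 77; // hypothetical size of the character vocabulary

        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(12345)
            .list()
            // One-hot character input, 200 LSTM units
            .layer(new LSTM.Builder()
                .nIn(nChars).nOut(200).activation(Activation.TANH).build())
            // Predicts a distribution over the next character at every time step
            .layer(new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                .nIn(200).nOut(nChars).activation(Activation.SOFTMAX).build())
            .build();
    }
}
```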
This example trains a neural network to do addition.
This example trains a neural network to perform various math operations.
A publicly available dataset of time series data with six classes (cyclic, up-trending, etc.). An example of an RNN learning to classify the time series.
How do autonomous vehicles distinguish between a pedestrian, a stop sign, and a green light? A complex neural net using convolutional and recurrent layers is trained on a set of training videos. The trained network is then fed live onboard video, and decisions based on its object detections determine the vehicle's actions.
This example is similar, but simplified. It combines convolutional, max pooling, dense (feed forward) and recurrent (LSTM) layers to classify frames in a video.
This sentiment analysis example classifies sentiment as positive or negative using word vectors and a Recurrent Neural Network.
DeepLearning4j supports using a Spark Cluster for network training. Here are the examples.
This is an example of a Multi-Layer Perceptron training on the MNIST dataset of handwritten digits.
An LSTM recurrent network trained in Spark.
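A minimal sketch of distributed training with parameter averaging; the batch size and averaging frequency are illustrative choices:

```java
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.spark.api.TrainingMaster;
import org.deeplearning4j.spark.impl.multilayer.SparkDl4jMultiLayer;
import org.deeplearning4j.spark.impl.paramavg.ParameterAveragingTrainingMaster;
import org.nd4j.linalg.dataset.DataSet;

public class SparkTrainingSketch {
    public static void train(JavaSparkContext sc, MultiLayerConfiguration conf,
                             JavaRDD<DataSet> trainingData) {
        // Each DataSet object in the RDD holds 32 examples; each worker trains
        // on its partition, and parameters are averaged every 5 minibatches
        TrainingMaster tm = new ParameterAveragingTrainingMaster.Builder(32)
            .averagingFrequency(5)
            .batchSizePerWorker(32)
            .build();

        SparkDl4jMultiLayer sparkNet = new SparkDl4jMultiLayer(sc, conf, tm);
        sparkNet.fit(trainingData);
    }
}
```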
ND4J is a tensor processing library. It can be thought of as NumPy for the JVM. Neural networks work by processing and updating multidimensional arrays of numeric values. In a typical neural net application, you use DataVec RecordReaders to ingest the data and convert it to numeric form. To pass that data into a neural network, you typically use a RecordReaderDataSetIterator, which returns a DataSet object consisting of an NDArray of the input features and an NDArray of the labels.
The learning algorithms and loss functions are executed as ND4J operations.
This is a directory with examples for creating and manipulating NDArrays.
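A few basic NDArray operations, as a minimal sketch:

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class Nd4jSketch {
    public static void main(String[] args) {
        // A 2x3 array built from a flat buffer
        INDArray x = Nd4j.create(new float[]{1, 2, 3, 4, 5, 6}, new int[]{2, 3});
        INDArray ones = Nd4j.ones(2, 3);

        INDArray sum = x.add(ones);  // element-wise add, returns a new array
        x.addi(1.0);                 // the 'i' suffix mutates x in place

        INDArray product = x.mmul(ones.transpose()); // (2x3)(3x2) matrix multiply -> 2x2
        System.out.println(product);
    }
}
```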
Deep learning algorithms have learned to play Space Invaders and Doom using reinforcement learning. DeepLearning4J/RL4J examples of Reinforcement Learning are available here: