Setup
Setting up and configuring Android Studio for DL4J.
While neural networks are typically trained on powerful computers using multiple GPUs, Deeplearning4J's compatibility with the Android platform makes it possible to use DL4J neural networks in Android applications. This tutorial covers the basics of setting up Android Studio for building DL4J applications. Several configurations for dependencies, memory management, and compilation exclusions needed to mitigate the limitations of low-powered mobile devices are outlined below. If you just want to get a DL4J app running on your device, you can jump ahead to a simple demo application which trains a neural network for Iris flower classification, available here.
Prerequisites
Android Studio 3.6.3 or newer, which can be downloaded here.
Android Studio version 3.6.3 and higher comes with the latest OpenJDK embedded; however, it is recommended to install the JDK yourself so that you can update it independently of Android Studio. Android Studio 3.0 and later supports all of Java 7 and a subset of Java 8 language features. Java JDKs can be downloaded from the Oracle or OpenJDK website.
Within Android Studio, the Android SDK Manager can be used to install Android Build Tools 24.0.1 or later, SDK Platform 24 or later, and the Android Support Repository.
An Android device or an emulator running API level 21 or higher. A minimum of 200 MB of internal storage space free is recommended.
It is also recommended that you download and install IntelliJ IDEA, Maven, and the complete dl4j-examples directory for building and training neural nets on your desktop instead of Android Studio. With the setup described here, you can use DL4J on your Android device and emulator. To use DL4J in the app/src/test/java/ tests, you need to add the regular dependencies.
Required Dependencies
In order to use Deeplearning4J in your Android projects, you will need to add the following dependencies to your app module’s build.gradle file. Depending on the type of neural network used in your application, you may need to add additional dependencies.
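A minimal sketch of the dependency block, assuming the 1.0.0-beta6 release of DL4J/ND4J (substitute the version you are targeting); the excluded modules are desktop-only native packages that are not needed on Android:

```groovy
dependencies {
    implementation (group: 'org.deeplearning4j', name: 'deeplearning4j-core', version: '1.0.0-beta6') {
        // desktop-only native binaries that bloat the APK and are unused on Android
        exclude group: 'org.bytedeco', module: 'opencv-platform'
        exclude group: 'org.bytedeco', module: 'leptonica-platform'
        exclude group: 'org.bytedeco', module: 'hdf5-platform'
    }
    implementation group: 'org.nd4j', name: 'nd4j-native', version: '1.0.0-beta6'
    // native backends for each Android processor architecture you want to support
    implementation group: 'org.nd4j', name: 'nd4j-native', version: '1.0.0-beta6', classifier: 'android-arm'
    implementation group: 'org.nd4j', name: 'nd4j-native', version: '1.0.0-beta6', classifier: 'android-arm64'
    implementation group: 'org.nd4j', name: 'nd4j-native', version: '1.0.0-beta6', classifier: 'android-x86'
    implementation group: 'org.nd4j', name: 'nd4j-native', version: '1.0.0-beta6', classifier: 'android-x86_64'
}
```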
You may also need to add the following compile options:
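For example, a typical configuration that targets Java 8 language features (adjust to your project's needs):

```groovy
android {
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}
```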
DL4J depends on ND4J, which is a library that offers fast n-dimensional arrays. ND4J in turn depends on a platform-specific native code library called JavaCPP; therefore, you must load a version of ND4J that matches the architecture of the Android device. Both the -x86 and -arm variants can be included to support multiple device processor types.
After adding the above to the build.gradle file, try syncing Gradle to see if any exclusions are needed. The error message will identify the file path that should be added to the list of exclusions. An example error message with its file path is:

> More than one file was found with OS independent path 'org/bytedeco/javacpp/windows-x86_64/msvp120.dll'
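A sketch of how such an exclusion might look in the android block of build.gradle; the path shown is taken from the kind of error above, not an exhaustive list:

```groovy
android {
    packagingOptions {
        // exclude the duplicate file reported by the Gradle sync error
        exclude 'org/bytedeco/javacpp/windows-x86_64/msvp120.dll'
    }
}
```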
A conflict in the junit module versions often causes the following error:

> Conflict with dependency 'junit:junit' in project ':app'. Resolved versions for app (4.8.2) and test app (4.12) differ.

This can be suppressed by forcing all of the junit modules to use the same version with the following:
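A minimal sketch, assuming junit 4.12 is the version you want to standardize on:

```groovy
configurations.all {
    resolutionStrategy.force 'junit:junit:4.12'
}
```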
Managing Dependencies with ProGuard
The DL4J dependencies pull in a large number of files. ProGuard can be used to minimize your APK file size. ProGuard detects and removes unused classes, fields, methods, and attributes from your packaged app, including those from code libraries. You can learn more about using ProGuard here. To enable code shrinking with ProGuard, add minifyEnabled true to the appropriate build type in your build.gradle file.
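For example, in the release build type (proguard-rules.pro is the default rules file generated by Android Studio):

```groovy
android {
    buildTypes {
        release {
            minifyEnabled true
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        }
    }
}
```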
It is recommended to upgrade ProGuard in your Android SDK to the latest release (5.1 or higher). Note that upgrading the build tools or other aspects of your SDK might cause ProGuard to reset to the version shipped with the SDK. In order to force ProGuard to use a version other than the Android Gradle default, you can include this in the buildscript block of your build.gradle file:
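A sketch of forcing a specific ProGuard Gradle plugin version; the 5.3.2 version shown is an assumption and should be replaced with the release you want:

```groovy
buildscript {
    configurations.all {
        resolutionStrategy {
            // pin ProGuard to a specific version instead of the Android Gradle default
            force 'net.sf.proguard:proguard-gradle:5.3.2'
        }
    }
}
```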
Testing your app is the best way to check if any errors are being caused by inappropriately removed code; however, you can also inspect what was removed by reviewing the usage.txt output file saved in /build/outputs/mapping/release/.
To fix errors and force ProGuard to retain certain code, add a -keep line in the ProGuard configuration file. For example:
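A sketch that keeps the DL4J and ND4J packages; adjust the patterns to the classes your app actually needs:

```
-keep class org.deeplearning4j.** { *; }
-keep class org.nd4j.** { *; }
```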
Memory Management
It may also be advantageous to increase the memory allocated to your app by adding android:largeHeap="true" to the manifest file. Allocating a larger heap decreases the risk of an OutOfMemoryError being thrown during memory-intensive operations.
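For example, on the application element in AndroidManifest.xml (the other attributes are placeholders):

```xml
<application
    android:largeHeap="true"
    android:label="@string/app_name">
    <!-- activities, services, etc. -->
</application>
```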
ND4J offers an additional memory-management model: workspaces. Workspaces allow you to reuse memory for cyclic workloads without relying on the JVM garbage collector to track off-heap memory. A DL4J workspace allows memory to be preallocated before a try / catch block and reused over and over within that block.
If your training process uses workspaces, it is recommended that you disable or reduce the frequency of periodic GC calls prior to your model.fit() call.
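A minimal sketch using the ND4J memory manager; the 10-second window is an arbitrary example value:

```java
import org.nd4j.linalg.factory.Nd4j;

public class GcConfig {
    public static void configureGc() {
        // Either disable ND4J's periodic GC calls entirely before model.fit()...
        Nd4j.getMemoryManager().togglePeriodicGc(false);

        // ...or keep them but reduce how often they run (value in milliseconds)
        // Nd4j.getMemoryManager().setAutoGcWindow(10000);
    }
}
```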
The example below illustrates the use of a Workspace for memory allocation. More information concerning ND4J Workspaces can be found here.
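A sketch of a cyclic workload run inside a workspace; the workspace id "EXAMPLE_WS" and the array shapes are placeholders:

```java
import org.nd4j.linalg.api.memory.MemoryWorkspace;
import org.nd4j.linalg.api.memory.conf.WorkspaceConfiguration;
import org.nd4j.linalg.api.memory.enums.AllocationPolicy;
import org.nd4j.linalg.api.memory.enums.LearningPolicy;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class WorkspaceExample {
    public static void run() {
        WorkspaceConfiguration config = WorkspaceConfiguration.builder()
                .policyAllocation(AllocationPolicy.STRICT)  // allocate exactly what the workspace needs
                .policyLearning(LearningPolicy.FIRST_LOOP)  // size the workspace from the first iteration
                .build();

        for (int i = 0; i < 10; i++) {
            // arrays created inside the workspace reuse the same off-heap memory on every pass
            try (MemoryWorkspace ws =
                         Nd4j.getWorkspaceManager().getAndActivateWorkspace(config, "EXAMPLE_WS")) {
                INDArray input = Nd4j.rand(1, 100);
                INDArray doubled = input.mul(2.0);
            }
        }
    }
}
```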
Saving and Loading Networks on Android
Practical considerations regarding performance limits are needed when building Android applications that run neural networks. Training a neural network on a device is possible, but should only be attempted with networks with limited numbers of layers, nodes, and iterations. The first Demo app DL4JIrisClassifierDemo is able to train on a standard device in about 15 seconds.
When training on a device is a reasonable option, application performance can be improved by saving the trained model to external storage once an initial training is complete. The trained model can then be used as an application resource. This approach is useful for training networks with data obtained from user input. The following code illustrates how to train a network and save it to external storage.
For API 23 and greater, you will need to include the permissions in your manifest and also programmatically request the read and write permissions in your activity. The required Manifest permissions are:
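In AndroidManifest.xml:

```xml
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
```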
You need to implement ActivityCompat.OnRequestPermissionsResultCallback in the activity and then check for permission status.
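A sketch of the runtime check inside such an Activity, assuming AndroidX; the request code constant and method name are placeholders:

```java
import android.Manifest;
import android.content.pm.PackageManager;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

// inside an Activity that implements ActivityCompat.OnRequestPermissionsResultCallback
private static final int REQUEST_EXTERNAL_STORAGE = 1;  // placeholder request code

private void verifyStoragePermissions() {
    int write = ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE);
    int read = ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE);
    if (write != PackageManager.PERMISSION_GRANTED || read != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
                             Manifest.permission.READ_EXTERNAL_STORAGE},
                REQUEST_EXTERNAL_STORAGE);
    }
}
```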
To save a network after training on the device, use an OutputStream within a try / catch block.
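A sketch, assuming a trained MultiLayerNetwork named myNetwork and a placeholder file name of trained_model.zip:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.OutputStream;
import android.os.Environment;
import android.util.Log;
import org.deeplearning4j.util.ModelSerializer;

// myNetwork is the MultiLayerNetwork trained earlier; the file name is a placeholder
try {
    File file = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS),
            "trained_model.zip");
    OutputStream outputStream = new FileOutputStream(file);
    boolean saveUpdater = true;  // keep the updater state so training can be resumed later
    ModelSerializer.writeModel(myNetwork, outputStream, saveUpdater);
    outputStream.close();
} catch (Exception e) {
    Log.e("saveModel", "Error saving model to external storage", e);
}
```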
To load the trained network from storage you can use the restoreMultiLayerNetwork method.
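A sketch, assuming the same placeholder file name used above:

```java
import java.io.File;
import android.os.Environment;
import android.util.Log;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.util.ModelSerializer;

try {
    File file = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS),
            "trained_model.zip");
    MultiLayerNetwork restoredNetwork = ModelSerializer.restoreMultiLayerNetwork(file);
    // restoredNetwork can now be used for inference, e.g. restoredNetwork.output(...)
} catch (Exception e) {
    Log.e("loadModel", "Error restoring model from external storage", e);
}
```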
For larger or more complex neural networks like Convolutional or Recurrent Neural Networks, training on the device is not a realistic option, as long processing times during network training run the risk of generating an OutOfMemoryError and make for a poor user experience. As an alternative, the neural network can be trained on the desktop, saved via ModelSerializer, and then loaded as a pre-trained model in the application. Using a pre-trained model in your Android application can be achieved with the following steps:
Train yourModel on the desktop and save it via ModelSerializer.
Create a raw resource folder in the res directory of the application.
Copy yourModel.zip file into the raw folder.
Access it from your resources using an InputStream within a try / catch block, as shown in the sketch below.
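A sketch of the final step, assuming the file copied into res/raw is named trained_model.zip (and so is available as R.raw.trained_model):

```java
import java.io.InputStream;
import android.util.Log;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.util.ModelSerializer;

// inside an Activity or other Context; R.raw.trained_model is a placeholder resource name
try {
    InputStream inputStream = getResources().openRawResource(R.raw.trained_model);
    MultiLayerNetwork preTrainedNetwork = ModelSerializer.restoreMultiLayerNetwork(inputStream);
    inputStream.close();
} catch (Exception e) {
    Log.e("loadModel", "Error restoring model from raw resources", e);
}
```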
Snapshots
If you choose to use a SNAPSHOT version of the dependencies with Gradle, you will need to create a pom.xml file in the root directory and run mvn -U compile on it from the terminal. You will also need to include mavenLocal() in the repositories {} block of the build.gradle file. An example pom.xml file is provided below.
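A minimal sketch of such a pom.xml; the artifact names, snapshot version, and repository URL are assumptions and should be replaced with the snapshot coordinates you actually want mvn -U compile to pull into your local Maven repository:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.example</groupId>
    <artifactId>dl4j-snapshot-fetch</artifactId>
    <version>1.0</version>
    <packaging>jar</packaging>

    <repositories>
        <!-- repository hosting the DL4J/ND4J SNAPSHOT artifacts (assumed URL) -->
        <repository>
            <id>sonatype-snapshots</id>
            <url>https://oss.sonatype.org/content/repositories/snapshots</url>
            <snapshots>
                <enabled>true</enabled>
            </snapshots>
        </repository>
    </repositories>

    <dependencies>
        <dependency>
            <groupId>org.deeplearning4j</groupId>
            <artifactId>deeplearning4j-core</artifactId>
            <version>1.0.0-SNAPSHOT</version>
        </dependency>
        <dependency>
            <groupId>org.nd4j</groupId>
            <artifactId>nd4j-native</artifactId>
            <version>1.0.0-SNAPSHOT</version>
        </dependency>
    </dependencies>
</project>
```

With the snapshots in your local Maven repository, Gradle can resolve them once mavenLocal() is listed in the repositories {} block:

```groovy
repositories {
    mavenLocal()
    // ... other repositories
}
```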