All the code for this part of the library can be found in the `autodiff` module of the ND4J API, located here on GitHub. It consists of the following submodules:

- `functions`: the basic building blocks used to build SameDiff variables and graphs.
- `execution`: everything related to SameDiff graph execution.
- `gradcheck`: utility functionality for checking SameDiff gradients, similar in structure to the respective tool in DL4J.
- `loss`: loss functions for SameDiff.
- `samediff`: the main SameDiff module used to define, set up and run SameDiff operations and graphs.
## The `functions` module

The most important abstraction in the `functions` module is `DifferentialFunction`, which underlies pretty much everything in SameDiff. Mathematically, what we're doing in SameDiff is building a directed acyclic graph whose nodes are differential functions, for which we can compute gradients. In that regard, `DifferentialFunction` makes up a SameDiff graph on a fundamental level.

Each `DifferentialFunction` comes with a `SameDiff` instance. We'll discuss `SameDiff` and this relationship later on. Also, while there are only a few key abstractions, they're essentially used everywhere, so it's almost impossible to discuss SameDiff concepts completely separately; eventually we'll get around to each part.
When it comes to an op's properties, `attributeAdaptersForFunction`, `mappingsForFunction`, `propertiesForFunction` and `resolvePropertiesFromSameDiffBeforeExecution` are what you want to look at to get started. As for an op's inputs and outputs, the most important accessors are:
- `args()`: returns all input variables.
- `arg()`: returns the first input variable (the only one for unary operations).
- `larg()` and `rarg()`: return the first and second (read "left" and "right") argument for binary operations.
- `outputVariables()`: returns a list of all output variables. Depending on the operation, this may be computed dynamically. As we'll see later on, to get the result for ops with a single output, we'll call `.outputVariables()[0]`.
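As a small illustration of these accessors (here `op` stands for any `DifferentialFunction` that is already part of a graph; the variable names are ours, not from the original):

```java
SDVariable[] inputs = op.args();              // all input variables
SDVariable first = op.arg();                  // first input (the only one for unary ops)
SDVariable output = op.outputVariables()[0];  // single-output ops: take the first entry
```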
Implementing `calculateOutputShape` for a differential function might be necessary, but if implemented incorrectly it can lead to hard-to-debug failures. (Note that SameDiff will eventually call op execution in libnd4j, and dynamic custom ops either infer output shapes or need to be provided with the correct output shape.)
Automatic differentiation hinges on a single method: `doDiff`. Each operation has to provide an implementation of `doDiff`. If you're implementing a SameDiff operation for a libnd4j op `x` and you're lucky enough to find `x_bp` (as in "back-propagation"), you can use that and your `doDiff` implementation comes essentially for free. You'll also see a `diff` implementation that's used internally and calls `doDiff`.
Each differential function also has access to a `DifferentialFunctionFactory`, obtained by calling `f()`. More precisely, this will return the factory of the SameDiff instance the differential function has.
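A minimal sketch of what this accessor looks like inside `DifferentialFunction` (the actual body in the codebase may differ slightly):

```java
// Sketch: the factory accessor simply hands back the factory that belongs
// to this function's SameDiff instance.
public DifferentialFunctionFactory f() {
    return sameDiff.f();
}
```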
As an example of how the factory is used, consider how `sum` gets added to the `DifferentialFunctionFactory`.
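A sketch of what such a factory method can look like, with the argument list simplified (the real signature may carry additional flags, and we assume the factory exposes its SameDiff instance via `sameDiff()`):

```java
// Sketch of a DifferentialFunctionFactory method: construct the Sum op on
// this factory's SameDiff instance and return its single output variable.
public SDVariable sum(SDVariable i_x, int... dimensions) {
    return new Sum(sameDiff(), i_x, dimensions).outputVariables()[0];
}
```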
This wraps the `Sum` operation defined elsewhere in ND4J and then returns the first output variable (of type `SDVariable`, discussed in a second). Disregarding the implementation details for now, what this allows you to do is call `f().sum(...)` from anywhere you have access to a differential function factory. For instance, when implementing a SameDiff op `x` for which you already have `x_bp` in your function factory, you can override `doDiff` for `x` using that factory method.
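A minimal sketch of such a `doDiff`, assuming `x` is unary, has a single output, and its backprop op is exposed on the factory as `xBp` (both names are placeholders, not actual ND4J methods):

```java
// Sketch only: 'xBp' is a hypothetical factory method wrapping the x_bp op.
// 'grad' carries the gradients flowing in from the ops that consume x.
@Override
public List<SDVariable> doDiff(List<SDVariable> grad) {
    SDVariable gradAtInput = f().xBp(arg(), grad.get(0));
    return Collections.singletonList(gradAtInput);
}
```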
## The `samediff` module

### `SDVariable`

An `SDVariable` (read: SameDiff variable) extends `DifferentialFunction` and is to SameDiff what `INDArray` is to good old ND4J. In particular, SameDiff graphs operate on these variables, and each individual operation takes in and spits out a list of `SDVariable`s.
An `SDVariable` comes with a name, is equipped with a `SameDiff` instance, has shape information and knows how to initialize itself with an ND4J `WeightInitScheme`. You'll also find a few helpers to set and get these properties.
One thing an `SDVariable` can do that a `DifferentialFunction` can't is evaluate its result and return an underlying `INDArray`, by calling `eval()`. This will run SameDiff internally and retrieve the result. A similar getter is `getArr()`, which you can call at any point to get the current value of this variable. This functionality is used extensively in testing, to assert proper results. An `SDVariable` also has access to its current gradient through `gradient()`. Upon initialization there won't be any gradient; it will usually be computed at a later point.
An `SDVariable` also carries methods for concrete ops (and is in that regard a little similar to `DifferentialFunctionFactory`). For instance, it defines an `add` method, which lets you write `c = a.add(b)` for two SameDiff variables; the result can then be accessed with `c.eval()`.
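A minimal usage sketch, assuming the API as described in this document (exact signatures have shifted between ND4J versions):

```java
// Build a tiny graph from two variables, add them, and evaluate the result.
SameDiff sd = SameDiff.create();
SDVariable a = sd.var("a", Nd4j.ones(2, 2));
SDVariable b = sd.var("b", Nd4j.ones(2, 2).mul(3));
SDVariable c = a.add(b);

INDArray result = c.eval();     // runs SameDiff internally and returns the value
INDArray current = c.getArr();  // current value of the variable, handy in tests
SDVariable grad = a.gradient(); // empty until gradients have actually been computed
```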
### `SameDiff`

The `SameDiff` class is the main workhorse of the module and brings together most of the concepts discussed so far. A little unfortunately, the inverse is also true: `SameDiff` instances are part of all other SameDiff module abstractions in some way or the other (which is why you've seen it many times already). Generally speaking, `SameDiff` is the main entry point for automatic differentiation, and you use it to define a symbolic graph that carries operations on `SDVariable`s. Once built, a SameDiff graph can be run in a few ways, for instance with `exec()` and `execAndEndResult()`.
Calling the `SameDiff()` constructor sets up a million things! Essentially, `SameDiff` will collect and give you access (in terms of both getters and setters) to the graph's functions, variables and their associated properties. Of these, `propertiesToResolve` and `propertiesForFunction` are of particular note.
`SameDiff` is also the place where you expose new operations to the SameDiff module. Essentially, you write a little wrapper for the respective operation in the `DifferentialFunctionFactory` instance `f()`. Here's an example for cross products.
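A sketch of what that wrapper looks like (treat the exact helper names and signatures as approximate):

```java
// Sketch of exposing an op on SameDiff: delegate to the factory via f(),
// then register the resulting variable's name on this SameDiff instance.
public SDVariable cross(SDVariable a, SDVariable b) {
    return cross(null, a, b);
}

public SDVariable cross(String name, SDVariable a, SDVariable b) {
    SDVariable ret = f().cross(a, b);
    return updateVariableNameAndReference(ret, name);
}
```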
## Implementing ops

We've seen how ops get added to `DifferentialFunctionFactory` and `SameDiff` to expose them to SameDiff at various levels. As for actually implementing these ops, you need to know a few things. In libnd4j you find two classes of operations, which are described here in detail, and we'll show how to implement both op types. Additionally, there's `layers`, which is reserved for deep learning layer implementations (like `Conv2D`). These higher-level ops are based on the concept of modules, similar to modules in PyTorch or layers in TensorFlow. These layer op implementations also provide a source of more involved op implementations.
### Legacy operations

Consider the `cos` legacy op from libnd4j (see the Cosine implementation). When it comes to SameDiff, the good thing about legacy ops is that they're already available in ND4J, but they need to be augmented by SameDiff-specific functionality to pass muster. Since the cosine function does not have any properties, this implementation is straightforward. The parts that make this op SameDiff compliant are:

- constructors that take a `SameDiff` instance and the input `SDVariable`s, so the op can be created as part of a graph,
- the op name mappings used for identification and import (`opName()`, plus the TensorFlow and ONNX names), and
- a `doDiff` implementation that defines the op's gradient.
Note that `Cos` extends `BaseTransformOp`, which implements other SameDiff functionality. (`BaseTransformOp` is a `BaseOp`, which extends `DifferentialFunction` from earlier.) For instance, `calculateOutputShape` is implemented there. If you want to implement a new transform, you can simply inherit from `BaseTransformOp`, too. For other op types, like reductions, there are op base classes available as well, meaning you only need to address the three bullet points above.
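To make the `doDiff` bullet point concrete, here is a sketch of the derivative of a cosine-style transform (factory method names are approximate):

```java
// d/dx cos(x) = -sin(x), so the incoming gradient is multiplied by -sin(x)
// of the op's single input.
@Override
public List<SDVariable> doDiff(List<SDVariable> i_v) {
    SDVariable dydx = f().neg(f().sin(arg()));
    return Collections.singletonList(dydx.mul(i_v.get(0)));
}
```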
The complete list of legacy ops can be found in `legacy_ops.h`.

### `DynamicCustomOp`

`DynamicCustomOp` is the new kind of operation from libnd4j, and all recent additions are implemented as such. This operation type in ND4J directly extends `DifferentialFunction`.
As a first example, take an op with two properties, `blocks` and `crops` (as in the batch-to-space operation). Note how `blocks` and `crops`, which are both of integer type, get added to the integer arguments of the operation by calling `addIArgument`. For float arguments and other types, use `addTArgument` instead.
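A sketch of such a constructor; the class name, argument list and superclass call are illustrative, not the actual ND4J implementation:

```java
// Hypothetical DynamicCustomOp with two integer properties, blocks and crops.
public class MyBatchToSpace extends DynamicCustomOp {

    public MyBatchToSpace(SameDiff sameDiff, SDVariable input, int[] blocks, int[] crops) {
        super(null, sameDiff, new SDVariable[]{input}, false);
        for (int b : blocks)
            addIArgument(b);   // integer-typed values become iArgs
        for (int c : crops)
            addIArgument(c);
        // a float/double property would be registered with addTArgument(value)
    }

    @Override
    public String opName() {
        return "my_batch_to_space";  // placeholder op name
    }
}
```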
A slightly more involved example is `DynamicPartition`. This op has precisely one property, called `numPartitions` in SameDiff. To map and use this property, you do the following: declare it as a field on the op, map it in `mappingsForFunction`, and add its value to the op's integer arguments. That last step is wrapped in a little helper method called `addArgs`, which is used in the constructor of the op and in an import helper one-liner discussed later. It's not necessary, but it is encouraged to do this and to call the helper `addArgs` consistently, for clarity.
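A sketch of this convention for a `numPartitions`-style property (class and op names are placeholders, and the superclass call is schematic):

```java
// Hypothetical op with a single integer property, numPartitions.
public class MyDynamicPartition extends DynamicCustomOp {

    private int numPartitions;

    public MyDynamicPartition(SameDiff sameDiff, SDVariable input,
                              SDVariable partitions, int numPartitions) {
        super(null, sameDiff, new SDVariable[]{input, partitions}, false);
        this.numPartitions = numPartitions;
        addArgs();   // the same helper is reused by the import helper
    }

    // By convention: collect everything that turns properties into op arguments.
    protected void addArgs() {
        addIArgument(numPartitions);
    }

    @Override
    public String opName() {
        return "my_dynamic_partition";  // placeholder op name
    }
}
```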
While `DynamicPartition` has proper property mapping, it currently does not have a working `doDiff` implementation. A still more involved example is `Dilation2D`. Not only does this op have far more properties to map, as you can see in `mappingsForFunction`, the properties also come with attribute adapters, as defined in `attributeAdaptersForFunction`. We've chosen to show this op because it is one that has property mapping, but is neither exposed to `DifferentialFunctionFactory` nor `SameDiff`.
The three `DynamicCustomOp` examples shown each come with their own defects and represent examples of the work that still has to be done for SameDiff. To summarize, to add a new SameDiff op you need to: