<!DOCTYPE html PUBLIC ""
"">
<html><head><meta charset="UTF-8" /><title>org.apache.clojure-mxnet.module documentation</title><link rel="stylesheet" type="text/css" href="css/default.css" /><link rel="stylesheet" type="text/css" href="css/highlight.css" /><script type="text/javascript" src="js/highlight.min.js"></script><script type="text/javascript" src="js/jquery.min.js"></script><script type="text/javascript" src="js/page_effects.js"></script><script>hljs.initHighlightingOnLoad();</script></head><body><div id="header"><h2>Generated by <a href="https://github.com/weavejester/codox">Codox</a></h2><h1><a href="index.html"><span class="project-title"><span class="project-name">Clojure-mxnet</span> <span class="project-version">1.8.0-SNAPSHOT</span></span></a></h1></div><div class="sidebar primary"><h3 class="no-link"><span class="inner">Project</span></h3><ul class="index-link"><li class="depth-1 "><a href="index.html"><div class="inner">Index</div></a></li></ul><h3 class="no-link"><span class="inner">Namespaces</span></h3><ul><li class="depth-1"><div class="no-link"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>org</span></div></div></li><li class="depth-2"><div class="no-link"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>apache</span></div></div></li><li class="depth-3"><div class="no-link"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>clojure-mxnet</span></div></div></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.base.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>base</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.callback.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>callback</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.context.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>context</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.dtype.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>dtype</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.eval-metric.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>eval-metric</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.executor.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>executor</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.image.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>image</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.infer.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>infer</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.initializer.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>initializer</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.io.html"><div class="inner"><span class="tree"><span class="top"></span><span 
class="bottom"></span></span><span>io</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.kvstore.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>kvstore</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.kvstore-server.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>kvstore-server</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.layout.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>layout</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.lr-scheduler.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>lr-scheduler</span></div></a></li><li class="depth-4 branch current"><a href="org.apache.clojure-mxnet.module.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>module</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.monitor.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>monitor</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.ndarray.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>ndarray</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.ndarray-api.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>ndarray-api</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.ndarray-random-api.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>ndarray-random-api</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.optimizer.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>optimizer</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.primitives.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>primitives</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.profiler.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>profiler</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.random.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>random</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.resource-scope.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>resource-scope</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.shape.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>shape</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.symbol.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>symbol</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.symbol-api.html"><div 
class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>symbol-api</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.symbol-random-api.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>symbol-random-api</span></div></a></li><li class="depth-4 branch"><a href="org.apache.clojure-mxnet.util.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>util</span></div></a></li><li class="depth-4"><a href="org.apache.clojure-mxnet.visualization.html"><div class="inner"><span class="tree"><span class="top"></span><span class="bottom"></span></span><span>visualization</span></div></a></li></ul></div><div class="sidebar secondary"><h3><a href="#top"><span class="inner">Public Vars</span></a></h3><ul><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-arg-params"><div class="inner"><span>arg-params</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-aux-params"><div class="inner"><span>aux-params</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-backward"><div class="inner"><span>backward</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-bind"><div class="inner"><span>bind</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-borrow-optimizer"><div class="inner"><span>borrow-optimizer</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-data-names"><div class="inner"><span>data-names</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-data-shapes"><div class="inner"><span>data-shapes</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-exec-group"><div class="inner"><span>exec-group</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-fit"><div class="inner"><span>fit</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-fit-params"><div class="inner"><span>fit-params</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-forward"><div class="inner"><span>forward</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-forward-backward"><div class="inner"><span>forward-backward</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-get-params"><div class="inner"><span>get-params</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-grad-arrays"><div class="inner"><span>grad-arrays</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-init-optimizer"><div class="inner"><span>init-optimizer</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-init-params"><div class="inner"><span>init-params</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-input-grads"><div class="inner"><span>input-grads</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-input-grads-merged"><div class="inner"><span>input-grads-merged</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-install-monitor"><div 
class="inner"><span>install-monitor</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-label-shapes"><div class="inner"><span>label-shapes</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-load-checkpoint"><div class="inner"><span>load-checkpoint</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-load-optimizer-states"><div class="inner"><span>load-optimizer-states</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-module"><div class="inner"><span>module</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-output-names"><div class="inner"><span>output-names</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-output-shapes"><div class="inner"><span>output-shapes</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-outputs"><div class="inner"><span>outputs</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-outputs-merged"><div class="inner"><span>outputs-merged</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-params"><div class="inner"><span>params</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-predict"><div class="inner"><span>predict</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-predict-batch"><div class="inner"><span>predict-batch</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-predict-every-batch"><div class="inner"><span>predict-every-batch</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-reshape"><div class="inner"><span>reshape</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-save-checkpoint"><div class="inner"><span>save-checkpoint</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-save-optimizer-states"><div class="inner"><span>save-optimizer-states</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-score"><div class="inner"><span>score</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-set-params"><div class="inner"><span>set-params</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-symbol"><div class="inner"><span>symbol</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-update"><div class="inner"><span>update</span></div></a></li><li class="depth-1"><a href="org.apache.clojure-mxnet.module.html#var-update-metric"><div class="inner"><span>update-metric</span></div></a></li></ul></div><div class="namespace-docs" id="content"><h1 class="anchor" id="top">org.apache.clojure-mxnet.module</h1><div class="doc"><pre class="plaintext">Module API for Clojure package.
</pre></div><div class="public anchor" id="var-arg-params"><h3>arg-params</h3><div class="usage"><code>(arg-params mod)</code></div><div class="doc"><pre class="plaintext"></pre></div></div><div class="public anchor" id="var-aux-params"><h3>aux-params</h3><div class="usage"><code>(aux-params mod)</code></div><div class="doc"><pre class="plaintext"></pre></div></div><div class="public anchor" id="var-backward"><h3>backward</h3><div class="usage"><code>(backward mod out-grads)</code><code>(backward mod)</code></div><div class="doc"><pre class="plaintext">Backward computation.
`out-grads`: collection of NDArrays
Gradient on the outputs to be propagated back. This parameter is only
needed when bind is called on outputs that are not a loss function.</pre></div></div><div class="public anchor" id="var-bind"><h3>bind</h3><div class="usage"><code>(bind mod {:keys [data-shapes label-shapes for-training inputs-need-grad force-rebind shared-module grad-req], :as opts, :or {for-training true, inputs-need-grad false, force-rebind false, grad-req "write"}})</code></div><div class="doc"><pre class="plaintext">Bind the symbols to construct executors. This is necessary before one
can perform computation with the module.
`mod`: module
`opts-map` {
`data-shapes`: map of `:name`, `:shape`, `:dtype`, and `:layout`
Typically is `(provide-data-desc data-iter)`. Data shape must be in the
form of `io/data-desc`.
`label-shapes`: map of `:name`, `:shape`, `:dtype`, and `:layout`
Typically is `(provide-label-desc data-iter)`.
`for-training`: boolean - Default is `true`
Whether the executors should be bound for training.
`inputs-need-grad`: boolean - Default is `false`.
Whether the gradients to the input data need to be computed.
Typically this is not needed. But this might be needed when
implementing composition of modules.
`force-rebind`: boolean - Default is `false`.
This function does nothing if the executors are already bound. But if
this is `true`, the executors will be forced to rebind.
`shared-module`: Default is nil.
This is used in bucketing. When not `nil`, the shared module
essentially corresponds to a different bucket -- a module with
different symbol but with the same sets of parameters (e.g. unrolled
RNNs with different lengths).
`grad-req`: string - Default is "write"
How gradients are accumulated: "write", "add", or "null".
}
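A hedged sketch of binding for inference only (hypothetical `test-iter`;
`:for-training false` skips allocating gradient buffers):
;; test-iter is a placeholder DataIter
(bind mod {:data-shapes (mx-io/provide-data test-iter)
           :for-training false})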
Ex:
(bind mod {:data-shapes (mx-io/provide-data train-iter)
           :label-shapes (mx-io/provide-label train-iter)})</pre></div></div><div class="public anchor" id="var-borrow-optimizer"><h3>borrow-optimizer</h3><div class="usage"><code>(borrow-optimizer mod shared-module)</code></div><div class="doc"><pre class="plaintext">Borrow optimizer from a shared module. Used in bucketing, where exactly the
same optimizer (esp. kvstore) is used.
`mod`: Module
`shared-module`: Module to borrow the optimizer from.</pre></div></div><div class="public anchor" id="var-data-names"><h3>data-names</h3><div class="usage"><code>(data-names mod)</code></div><div class="doc"><pre class="plaintext"></pre></div></div><div class="public anchor" id="var-data-shapes"><h3>data-shapes</h3><div class="usage"><code>(data-shapes mod)</code></div><div class="doc"><pre class="plaintext"></pre></div></div><div class="public anchor" id="var-exec-group"><h3>exec-group</h3><div class="usage"><code>(exec-group mod)</code></div><div class="doc"><pre class="plaintext"></pre></div></div><div class="public anchor" id="var-fit"><h3>fit</h3><div class="usage"><code>(fit mod {:keys [train-data eval-data num-epoch fit-params], :as opts, :or {num-epoch 1, fit-params (new FitParams)}})</code></div><div class="doc"><pre class="plaintext">Train the module parameters.
`mod`: Module
`opts-map` {
`train-data`: DataIter
`eval-data`: DataIter
If not nil, it will be used as a validation set and performance
will be evaluated after each epoch.
`num-epoch`: int
Number of epochs to run training.
`fit-params`: FitParams
Extra parameters for training (see fit-params).
}
Ex:
(fit mod {:train-data train-iter :eval-data test-iter :num-epoch 100})
(fit mod {:train-data train-iter
          :eval-data test-iter
          :num-epoch 5
          :fit-params
          (fit-params {:batch-end-callback (callback/speedometer 128 100)
                       :initializer (initializer/xavier)
                       :optimizer (optimizer/sgd {:learning-rate 0.01})
                       :eval-metric (eval-metric/mse)})})</pre></div></div><div class="public anchor" id="var-fit-params"><h3>fit-params</h3><div class="usage"><code>(fit-params {:keys [eval-metric kvstore optimizer initializer arg-params aux-params allow-missing force-rebind force-init begin-epoch validation-metric monitor batch-end-callback], :as opts, :or {eval-metric (eval-metric/accuracy), kvstore "local", optimizer (optimizer/sgd), initializer (initializer/uniform 0.01), allow-missing false, force-rebind false, force-init false, begin-epoch 0}})</code><code>(fit-params)</code></div><div class="doc"><pre class="plaintext">Initialize FitParams with provided parameters.
`eval-metric`: EvalMetric - Default is `accuracy`
`kvstore`: String - Default is "local"
`optimizer`: Optimizer - Default is `sgd`
`initializer`: Initializer - Default is `uniform`
Called to initialize parameters if needed.
`arg-params`: map
If not nil, should be a map of existing `arg-params`. Initialization
will be copied from that.
`aux-params`: map
If not nil, should be a map of existing `aux-params`. Initialization
will be copied from that.
`allow-missing`: boolean - Default is `false`
If `true`, params could contain missing values, and the initializer will
be called to fill those missing params.
`force-rebind`: boolean - Default is `false`
This function does nothing if the executors are already bound. But if
this is `true`, the executors will be forced to rebind.
`force-init`: boolean - Default is `false`
If `true`, will force re-initialize even if already initialized.
`begin-epoch`: int - Default is 0
`validation-metric`: EvalMetric
`monitor`: Monitor
`batch-end-callback`: Callback invoked at the end of each batch.
Ex:
(fit-params {:force-init true :force-rebind true :allow-missing true})
(fit-params
  {:batch-end-callback (callback/speedometer batch-size 100)
   :initializer (initializer/xavier)
   :optimizer (optimizer/sgd {:learning-rate 0.01})
   :eval-metric (eval-metric/mse)})</pre></div></div><div class="public anchor" id="var-forward"><h3>forward</h3><div class="usage"><code>(forward mod data-batch is-train)</code><code>(forward mod data-batch-map)</code></div><div class="doc"><pre class="plaintext">Forward computation.
`data-batch`: Either map or DataBatch
Input data of form `io/data-batch`.
`is-train`: Default is nil,
which means `is_train` takes the value of `for_training`.</pre></div></div><div class="public anchor" id="var-forward-backward"><h3>forward-backward</h3><div class="usage"><code>(forward-backward mod data-batch)</code></div><div class="doc"><pre class="plaintext">A convenient function that calls both `forward` and `backward`.
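Ex (a hedged single-step sketch; assumes `mod` is bound for training and
`batch` is a placeholder DataBatch taken from a DataIter):
(forward-backward mod batch)
;; apply the installed optimizer to the freshly computed gradients
(update mod)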
</pre></div></div><div class="public anchor" id="var-get-params"><h3>get-params</h3><div class="usage"><code>(get-params mod)</code></div><div class="doc"><pre class="plaintext"></pre></div></div><div class="public anchor" id="var-grad-arrays"><h3>grad-arrays</h3><div class="usage"><code>(grad-arrays mod)</code></div><div class="doc"><pre class="plaintext"></pre></div></div><div class="public anchor" id="var-init-optimizer"><h3>init-optimizer</h3><div class="usage"><code>(init-optimizer mod {:keys [kvstore optimizer reset-optimizer force-init], :as opts, :or {kvstore "local", optimizer (optimizer/sgd), reset-optimizer true, force-init false}})</code><code>(init-optimizer mod)</code></div><div class="doc"><pre class="plaintext">Install and initialize optimizers.
`mod`: Module
`opts-map` {
`kvstore`: string - Default is "local"
`optimizer`: Optimizer - Default is `sgd`
`reset-optimizer`: boolean - Default is `true`
Indicating whether we should set `rescaleGrad` &amp; `idx2name` for
optimizer according to executorGroup.
`force-init`: boolean - Default is `false`
Whether to force re-initializing the optimizer when an optimizer
is already installed.
}
Ex:
(init-optimizer mod {:optimizer (optimizer/sgd {:learning-rate 0.1})})</pre></div></div><div class="public anchor" id="var-init-params"><h3>init-params</h3><div class="usage"><code>(init-params mod {:keys [initializer arg-params aux-params allow-missing force-init allow-extra], :as opts, :or {initializer (initializer/uniform 0.01), allow-missing false, force-init false, allow-extra false}})</code><code>(init-params mod)</code></div><div class="doc"><pre class="plaintext">Initialize the parameters and auxiliary states.
`opts-map` {
`initializer`: Initializer - Default is `uniform`
Called to initialize parameters if needed.
`arg-params`: map
If not nil, should be a map of existing arg-params. Initialization
will be copied from that.
`aux-params`: map
If not nil, should be a map of existing aux-params. Initialization
will be copied from that.
`allow-missing`: boolean - Default is `false`
If true, params could contain missing values, and the initializer will
be called to fill those missing params.
`force-init`: boolean - Default is `false`
If true, will force re-initialize even if already initialized.
`allow-extra`: boolean - Default is `false`
Whether to allow extra parameters that are not needed by the symbol.
If this is `true`, no error will be thrown when `arg-params` or
`aux-params` contain extra parameters that are not needed by the
executor.
}
Ex:
(init-params mod {:initializer (initializer/xavier)})
(init-params mod {:force-init true :allow-extra true})</pre></div></div><div class="public anchor" id="var-input-grads"><h3>input-grads</h3><div class="usage"><code>(input-grads mod)</code></div><div class="doc"><pre class="plaintext">Get the gradients to the inputs, computed in the previous backward computation.
In the case when data-parallelism is used, the outputs will be collected from
multiple devices. The results will look like
`[[grad1_dev1, grad1_dev2], [grad2_dev1, grad2_dev2]]`.
Those `NDArray`s might live on different devices.</pre></div></div><div class="public anchor" id="var-input-grads-merged"><h3>input-grads-merged</h3><div class="usage"><code>(input-grads-merged mod)</code></div><div class="doc"><pre class="plaintext">Get the gradients to the inputs, computed in the previous backward computation.
In the case when data-parallelism is used, the outputs will be merged from
multiple devices so that they appear to come from a single executor.
The results will look like `[grad1, grad2]`.</pre></div></div><div class="public anchor" id="var-install-monitor"><h3>install-monitor</h3><div class="usage"><code>(install-monitor mod monitor)</code></div><div class="doc"><pre class="plaintext">Install monitor on all executors.
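Ex (hedged; `my-monitor` is a placeholder for a Monitor built with the
org.apache.clojure-mxnet.monitor namespace):
(install-monitor mod my-monitor)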
</pre></div></div><div class="public anchor" id="var-label-shapes"><h3>label-shapes</h3><div class="usage"><code>(label-shapes mod)</code></div><div class="doc"><pre class="plaintext"></pre></div></div><div class="public anchor" id="var-load-checkpoint"><h3>load-checkpoint</h3><div class="usage"><code>(load-checkpoint {:keys [prefix epoch load-optimizer-states data-names label-names contexts workload-list fixed-param-names], :as opts, :or {load-optimizer-states false, data-names ["data"], label-names ["softmax_label"], contexts [(context/cpu)], workload-list nil, fixed-param-names nil}})</code><code>(load-checkpoint prefix epoch)</code></div><div class="doc"><pre class="plaintext">Create a model from previously saved checkpoint.
`opts-map` {
`prefix`: string
Path prefix of saved model files. You should have prefix-symbol.json,
prefix-xxxx.params, and optionally prefix-xxxx.states, where xxxx is
the epoch number.
`epoch`: int
Epoch to load.
`load-optimizer-states`: boolean - Default is false
Whether to load optimizer states. Checkpoint needs to have been made
with `save-optimizer-states` = `true`.
`data-names`: vector of strings - Default is ["data"]
Input data names.
`label-names`: vector of strings - Default is ["softmax_label"]
Input label names.
`contexts`: vector of Contexts - Default is `[(context/cpu)]`
`workload-list`: Default nil
Indicating uniform workload.
`fixed-param-names`: Default nil
Indicating no network parameters are fixed.
}
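A hedged resume-training sketch (hypothetical prefix and epoch;
`:begin-epoch` in `fit-params` keeps the epoch numbering consistent):
;; "my-model", epoch 2, and train-iter are placeholders
(let [mod (load-checkpoint {:prefix "my-model" :epoch 2
                            :load-optimizer-states true})]
  (fit mod {:train-data train-iter
            :num-epoch 5
            :fit-params (fit-params {:begin-epoch 2})}))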
Ex:
(load-checkpoint {:prefix "my-model" :epoch 1 :load-optimizer-states true})</pre></div></div><div class="public anchor" id="var-load-optimizer-states"><h3>load-optimizer-states</h3><div class="usage"><code>(load-optimizer-states mod fname)</code></div><div class="doc"><pre class="plaintext">Load optimizer (updater) state from file.
`mod`: Module
`fname`: string - Path to input states file.</pre></div></div><div class="public anchor" id="var-module"><h3>module</h3><div class="usage"><code>(module sym {:keys [data-names label-names contexts workload-list fixed-param-names], :as opts, :or {data-names ["data"], label-names ["softmax_label"], contexts [(context/default-context)]}})</code><code>(module sym data-names label-names contexts)</code><code>(module sym)</code></div><div class="doc"><pre class="plaintext">Module is a basic module that wraps a `symbol`.
`sym`: Symbol definition.
`opts-map` {
`data-names`: vector of strings - Default is ["data"]
Input data names
`label-names`: vector of strings - Default is ["softmax_label"]
Input label names
`contexts`: vector of Contexts - Default is `[(context/default-context)]`.
`workload-list`: Default nil
Indicating uniform workload.
`fixed-param-names`: Default nil
Indicating no network parameters are fixed.
}
Ex:
(module sym)
(module sym {:data-names ["data"]
             :label-names ["linear_regression_label"]})</pre></div></div><div class="public anchor" id="var-output-names"><h3>output-names</h3><div class="usage"><code>(output-names mod)</code></div><div class="doc"><pre class="plaintext"></pre></div></div><div class="public anchor" id="var-output-shapes"><h3>output-shapes</h3><div class="usage"><code>(output-shapes mod)</code></div><div class="doc"><pre class="plaintext"></pre></div></div><div class="public anchor" id="var-outputs"><h3>outputs</h3><div class="usage"><code>(outputs mod)</code></div><div class="doc"><pre class="plaintext">Get outputs of the previous forward computation.
In the case when data-parallelism is used, the outputs will be collected from
multiple devices. The results will look like
`[[out1_dev1, out1_dev2], [out2_dev1, out2_dev2]]`.
Those `NDArray`s might live on different devices.</pre></div></div><div class="public anchor" id="var-outputs-merged"><h3>outputs-merged</h3><div class="usage"><code>(outputs-merged mod)</code></div><div class="doc"><pre class="plaintext">Get outputs of the previous forward computation.
In the case when data-parallelism is used, the outputs will be merged from
multiple devices so that they appear to come from a single executor.
The results will look like `[out1, out2]`.</pre></div></div><div class="public anchor" id="var-params"><h3>params</h3><div class="usage"><code>(params mod)</code></div><div class="doc"><pre class="plaintext"></pre></div></div><div class="public anchor" id="var-predict"><h3>predict</h3><div class="usage"><code>(predict mod {:keys [eval-data num-batch reset], :as opts, :or {num-batch -1, reset true}})</code></div><div class="doc"><pre class="plaintext">Run prediction and collect the outputs.
`mod`: Module
`opts-map` {
`eval-data`: DataIter
`num-batch`: int - Default is `-1`,
indicating that all batches in the data iterator should be run.
`reset`: boolean - Default is `true`
Whether we should reset the data iter before starting
prediction.
}
returns: vector of NDArrays `[out1, out2, out3]` where each element is the
concatenation of the outputs for all the mini-batches.
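A hedged sketch of consuming the result (assumes a single-output network;
`ndarray/->vec` is from org.apache.clojure-mxnet.ndarray):
;; returns the first output as a flat Clojure vector
(-> (predict mod {:eval-data test-iter}) first ndarray/->vec)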
Ex:
(predict mod {:eval-data test-iter})
(predict mod {:eval-data test-iter :num-batch 10 :reset false})</pre></div></div><div class="public anchor" id="var-predict-batch"><h3>predict-batch</h3><div class="usage"><code>(predict-batch mod data-batch)</code></div><div class="doc"><pre class="plaintext">Run the prediction on a data batch.
`mod`: Module
`data-batch`: a DataBatch or a map in the form of `io/data-batch`</pre></div></div><div class="public anchor" id="var-predict-every-batch"><h3>predict-every-batch</h3><div class="usage"><code>(predict-every-batch mod {:keys [eval-data num-batch reset], :as opts, :or {num-batch -1, reset true}})</code></div><div class="doc"><pre class="plaintext">Run prediction and collect the outputs.
`mod`: Module
`opts-map` {
`eval-data`: DataIter
`num-batch`: int - Default is `-1`,
indicating that all batches in the data iterator should be run.
`reset`: boolean - Default is `true`
Whether we should reset the data iter before starting
prediction.
}
returns: nested list like this
`[[out1_batch1, out2_batch1, ...], [out1_batch2, out2_batch2, ...]]`
Note: This mode is useful because in some cases (e.g. bucketing), the module
does not necessarily produce the same number of outputs.
Ex:
(predict-every-batch mod {:eval-data test-iter})</pre></div></div><div class="public anchor" id="var-reshape"><h3>reshape</h3><div class="usage"><code>(reshape mod data-shapes label-shapes)</code><code>(reshape mod data-shapes)</code></div><div class="doc"><pre class="plaintext">Reshapes the module for new input shapes.
`mod`: Module
`data-shapes`: Typically is `(provide-data data-iter)`
`label-shapes`: Typically is `(provide-label data-iter)`</pre></div></div><div class="public anchor" id="var-save-checkpoint"><h3>save-checkpoint</h3><div class="usage"><code>(save-checkpoint mod {:keys [prefix epoch save-opt-states], :as opts, :or {save-opt-states false}})</code><code>(save-checkpoint mod prefix epoch)</code></div><div class="doc"><pre class="plaintext">Save current progress to checkpoint.
Use mx.callback.module_checkpoint as epoch_end_callback to save during
training.
`mod`: Module
`opts-map` {
`prefix`: string
The file prefix to checkpoint to
`epoch`: int
The current epoch number
`save-opt-states`: boolean - Default is `false`
Whether to save optimizer states for continued training
}
Ex:
(save-checkpoint mod {:prefix "saved_model" :epoch 0 :save-opt-states true})</pre></div></div><div class="public anchor" id="var-save-optimizer-states"><h3>save-optimizer-states</h3><div class="usage"><code>(save-optimizer-states mod fname)</code></div><div class="doc"><pre class="plaintext">Save optimizer (updater) state to file.
`mod`: Module
`fname`: string - Path to output states file.</pre></div></div><div class="public anchor" id="var-score"><h3>score</h3><div class="usage"><code>(score mod {:keys [eval-data eval-metric num-batch reset epoch], :as opts, :or {num-batch Integer/MAX_VALUE, reset true, epoch 0}})</code></div><div class="doc"><pre class="plaintext">Run prediction on `eval-data` and evaluate the performance according to
`eval-metric`.
`mod`: module
`opts-map` {
`eval-data`: DataIter
`eval-metric`: EvalMetric
`num-batch`: int - Default is `Integer/MAX_VALUE`
Number of batches to run. The default means run until the
`DataIter` finishes.
`batch-end-callback`: not supported yet.
`reset`: boolean - Default is `true`
Whether we should reset `eval-data` before starting
evaluation.
`epoch`: int - Default is 0
For compatibility, this will be passed to callbacks (if any). During
training, this will correspond to the training epoch number.
}
Ex:
(score mod {:eval-data data-iter :eval-metric (eval-metric/accuracy)})
(score mod {:eval-data data-iter
            :eval-metric (eval-metric/mse)
            :num-batch 10})</pre></div></div><div class="public anchor" id="var-set-params"><h3>set-params</h3><div class="usage"><code>(set-params mod {:keys [arg-params aux-params allow-missing force-init allow-extra], :as opts, :or {allow-missing false, force-init true, allow-extra false}})</code></div><div class="doc"><pre class="plaintext">Assign parameters and aux state values.
`mod`: Module
`opts-map` {
`arg-params`: map - map of name to `NDArray` values.
`aux-params`: map - map of name to `NDArray` values.
`allow-missing`: boolean - Default is `false`
If true, params could contain missing values, and the initializer will
be called to fill those missing params.
`force-init`: boolean - Default is `true`
If true, will force re-initialize even if already initialized.
`allow-extra`: boolean - Default is `false`
Whether allow extra parameters that are not needed by symbol. If this
is `true`, no error will be thrown when arg-params or aux-params
contain extra parameters that is not needed by the executor.
}
Ex:
(set-params mod
{:arg-params {"fc_0_weight" (ndarray/array [0.15 0.2 0.25 0.3] [2 2])
:allow-missing true})</pre></div></div><div class="public anchor" id="var-symbol"><h3>symbol</h3><div class="usage"><code>(symbol mod)</code></div><div class="doc"><pre class="plaintext"></pre></div></div><div class="public anchor" id="var-update"><h3>update</h3><div class="usage"><code>(update mod)</code></div><div class="doc"><pre class="plaintext">Update parameters according to the installed optimizer and the gradients
computed in the previous forward-backward batch.</pre></div></div><div class="public anchor" id="var-update-metric"><h3>update-metric</h3><div class="usage"><code>(update-metric mod eval-metric labels)</code></div><div class="doc"><pre class="plaintext">Evaluate and accumulate evaluation metric on outputs of the last forward
computation.
`mod`: module
`eval-metric`: EvalMetric
`labels`: collection of NDArrays
Ex:
(update-metric mod (eval-metric/mse) labels)</pre></div></div></div></body></html>