  1. custom_softmax.py
  2. data.py
  3. ndarray_softmax.py
  4. numpy_softmax.py
  5. README.md
  6. weighted_logistic_regression.py

Training MNIST With NumpyOp

Uses the same setup as example/mnist/mlp.py, except that the loss symbol is custom-defined with NumpyOp. mxnet.operator.NumpyOp helps move the computation in a symbol's forward/backward operations to the Python frontend. This is intended for fast implementation and experimentation with non-performance-critical symbols. If such an operator becomes a bottleneck, consider writing a C++/CUDA version instead.
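
A NumpyOp subclass supplies the forward and backward passes as plain NumPy code. As an illustrative sketch (not the exact contents of numpy_softmax.py), the softmax forward pass and its cross-entropy gradient, which such a custom loss operator would compute, look like this in pure NumPy:

```python
import numpy as np

def softmax_forward(x):
    # Row-wise softmax; subtracting the row max keeps exp() numerically stable.
    y = np.exp(x - x.max(axis=1, keepdims=True))
    y /= y.sum(axis=1, keepdims=True)
    return y

def softmax_backward(y, label):
    # Gradient of cross-entropy loss w.r.t. the softmax input:
    # dx = softmax(x) - one_hot(label)
    dx = y.copy()
    dx[np.arange(label.shape[0]), label.astype(int)] -= 1.0
    return dx

x = np.array([[1.0, 2.0, 3.0],
              [0.0, 0.0, 0.0]])
y = softmax_forward(x)              # each row sums to 1
dx = softmax_backward(y, np.array([2, 0]))
```

Inside a NumpyOp these two functions become the operator's forward and backward methods, with the outputs written into the preallocated arrays MXNet passes in.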