{
"cells": [
{
"cell_type": "markdown",
"id": "30a2eeef",
"metadata": {},
"source": [
"<!--- Licensed to the Apache Software Foundation (ASF) under one -->\n",
"<!--- or more contributor license agreements. See the NOTICE file -->\n",
"<!--- distributed with this work for additional information -->\n",
"<!--- regarding copyright ownership. The ASF licenses this file -->\n",
"<!--- to you under the Apache License, Version 2.0 (the -->\n",
"<!--- \"License\"); you may not use this file except in compliance -->\n",
"<!--- with the License. You may obtain a copy of the License at -->\n",
"\n",
"<!--- http://www.apache.org/licenses/LICENSE-2.0 -->\n",
"\n",
"<!--- Unless required by applicable law or agreed to in writing, -->\n",
"<!--- software distributed under the License is distributed on an -->\n",
"<!--- \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -->\n",
"<!--- KIND, either express or implied. See the License for the -->\n",
"<!--- specific language governing permissions and limitations -->\n",
"<!--- under the License. -->\n",
"\n",
"\n",
"# Custom Numpy Operators\n",
"\n",
"In this tutorial, we will learn how to build custom operators with numpy in python. We will go through two examples:\n",
"- Custom operator without any `Parameter`s\n",
"- Custom operator with `Parameter`s\n",
"\n",
"Custom operator in python is easy to develop and good for prototyping, but may hurt performance. If you find it to be a bottleneck, please consider moving to a C++ based implementation in the backend."
]
},
{
"cell_type": "markdown",
"id": "7ed79493",
"metadata": {},
"source": [
"```python\n",
"import numpy as np\n",
"import mxnet as mx\n",
"from mxnet import gluon, autograd\n",
"import os\n",
"```\n"
]
},
{
"cell_type": "markdown",
"id": "6e7757ee",
"metadata": {},
"source": [
"## Parameter-less operators\n",
"\n",
"This operator implements the standard sigmoid activation function. This is only for illustration purposes, in real life you would use the built-in operator `mx.nd.relu`.\n",
"\n",
"### Forward & backward implementation\n",
"\n",
"First we implement the forward and backward computation by sub-classing `mx.operator.CustomOp`:"
]
},
{
"cell_type": "markdown",
"id": "871f3df6",
"metadata": {},
"source": [
"```python\n",
"class Sigmoid(mx.operator.CustomOp):\n",
" def forward(self, is_train, req, in_data, out_data, aux):\n",
" \"\"\"Implements forward computation.\n",
"\n",
" is_train : bool, whether forwarding for training or testing.\n",
" req : list of {'null', 'write', 'inplace', 'add'}, how to assign to out_data. 'null' means skip assignment, etc.\n",
" in_data : list of NDArray, input data.\n",
" out_data : list of NDArray, pre-allocated output buffers.\n",
" aux : list of NDArray, mutable auxiliary states. Usually not used.\n",
" \"\"\"\n",
" x = in_data[0].asnumpy()\n",
" y = 1.0 / (1.0 + np.exp(-x))\n",
" self.assign(out_data[0], req[0], mx.nd.array(y))\n",
"\n",
" def backward(self, req, out_grad, in_data, out_data, in_grad, aux):\n",
" \"\"\"Implements backward computation\n",
"\n",
" req : list of {'null', 'write', 'inplace', 'add'}, how to assign to in_grad\n",
" out_grad : list of NDArray, gradient w.r.t. output data.\n",
" in_grad : list of NDArray, gradient w.r.t. input data. This is the output buffer.\n",
" \"\"\"\n",
" y = out_data[0].asnumpy()\n",
" dy = out_grad[0].asnumpy()\n",
" dx = dy*(1.0 - y)*y\n",
" self.assign(in_grad[0], req[0], mx.nd.array(dx))\n",
"```\n"
]
},
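{
"cell_type": "markdown",
"id": "a3f91c20",
"metadata": {},
"source": [
"Before registering the operator, we can sanity-check the backward formula with plain NumPy. This is a small sketch, not part of the operator itself: since the derivative of the sigmoid is `y * (1 - y)`, a finite-difference approximation should reproduce it.\n",
"\n",
"```python\n",
"# standalone check: dy * (1 - y) * y should match a finite-difference gradient\n",
"x = np.array([0.0, 1.0, 2.0, 3.0])\n",
"y = 1.0 / (1.0 + np.exp(-x))\n",
"analytic = (1.0 - y) * y  # backward formula with dy = 1\n",
"eps = 1e-6\n",
"numeric = (1.0 / (1.0 + np.exp(-(x + eps))) - y) / eps\n",
"print(np.allclose(analytic, numeric, atol=1e-4))\n",
"```\n"
]
},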
{
"cell_type": "markdown",
"id": "2922f759",
"metadata": {},
"source": [
"### Register custom operator\n",
"\n",
"Then we need to register the custom op and describe it's properties like input and output shapes so that mxnet can recognize it. This is done by sub-classing `mx.operator.CustomOpProp`:"
]
},
{
"cell_type": "markdown",
"id": "9f0e27a2",
"metadata": {},
"source": [
"```python\n",
"@mx.operator.register(\"sigmoid\") # register with name \"sigmoid\"\n",
"class SigmoidProp(mx.operator.CustomOpProp):\n",
" def __init__(self):\n",
" super(SigmoidProp, self).__init__(True)\n",
"\n",
" def list_arguments(self):\n",
" # this can be omitted if you only have 1 input.\n",
" return ['data']\n",
"\n",
" def list_outputs(self):\n",
" # this can be omitted if you only have 1 output.\n",
" return ['output']\n",
"\n",
" def infer_shape(self, in_shapes):\n",
" \"\"\"Calculate output shapes from input shapes. This can be\n",
" omited if all your inputs and outputs have the same shape.\n",
"\n",
" in_shapes : list of shape. Shape is described by a tuple of int.\n",
" \"\"\"\n",
" data_shape = in_shapes[0]\n",
" output_shape = data_shape\n",
" # return 3 lists representing inputs shapes, outputs shapes, and aux data shapes.\n",
" return (data_shape,), (output_shape,), ()\n",
"\n",
" def create_operator(self, ctx, in_shapes, in_dtypes):\n",
" # create and return the CustomOp class.\n",
" return Sigmoid()\n",
"```\n"
]
},
{
"cell_type": "markdown",
"id": "497c38b8",
"metadata": {},
"source": [
"### Example Usage\n",
"\n",
"We can now use this operator by calling `mx.nd.Custom`:"
]
},
{
"cell_type": "markdown",
"id": "3ec13b4b",
"metadata": {},
"source": [
"```python\n",
"x = mx.nd.array([0, 1, 2, 3])\n",
"# attach gradient buffer to x for autograd\n",
"x.attach_grad()\n",
"# forward in a record() section to save computation graph for backward\n",
"# see autograd tutorial to learn more.\n",
"with autograd.record():\n",
" y = mx.nd.Custom(x, op_type='sigmoid')\n",
"print(y)\n",
"```\n"
]
},
{
"cell_type": "markdown",
"id": "d7ef09e9",
"metadata": {},
"source": [
"```python\n",
"# call backward computation\n",
"y.backward()\n",
"# gradient is now saved to the grad buffer we attached previously\n",
"print(x.grad)\n",
"```\n"
]
},
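{
"cell_type": "markdown",
"id": "c4d2e8b1",
"metadata": {},
"source": [
"As a quick check (a sketch that assumes `x` and `y` from the cells above are still in scope), the gradient stored in `x.grad` should equal the analytic sigmoid derivative `y * (1 - y)`:\n",
"\n",
"```python\n",
"expected = y * (1 - y)  # analytic derivative computed from the forward output\n",
"print(np.allclose(x.grad.asnumpy(), expected.asnumpy()))\n",
"```\n"
]
},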
{
"cell_type": "markdown",
"id": "35f27eeb",
"metadata": {},
"source": [
"## Parametrized Operator\n",
"\n",
"In the second use case we implement an operator with learnable weights. We implement the dense (or fully connected) layer that has one input, one output, and two learnable parameters: weight and bias.\n",
"\n",
"The dense operator performs a dot product between data and weight, then add bias to it.\n",
"\n",
"### Forward & backward implementation"
]
},
{
"cell_type": "markdown",
"id": "e997cfd6",
"metadata": {},
"source": [
"```python\n",
"class Dense(mx.operator.CustomOp):\n",
" def __init__(self, bias):\n",
" self._bias = bias\n",
"\n",
" def forward(self, is_train, req, in_data, out_data, aux):\n",
" x = in_data[0].asnumpy()\n",
" weight = in_data[1].asnumpy()\n",
" y = x.dot(weight.T) + self._bias\n",
" self.assign(out_data[0], req[0], mx.nd.array(y))\n",
"\n",
" def backward(self, req, out_grad, in_data, out_data, in_grad, aux):\n",
" x = in_data[0].asnumpy()\n",
" dy = out_grad[0].asnumpy()\n",
" dx = dy.T.dot(x)\n",
" self.assign(in_grad[0], req[0], mx.nd.array(dx))\n",
"```\n"
]
},
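{
"cell_type": "markdown",
"id": "f2a7d310",
"metadata": {},
"source": [
"To see why the backward formulas have these shapes, here is a standalone NumPy sketch (independent of MXNet): for `y = x.dot(W.T) + b`, the gradient w.r.t. the data is `dy.dot(W)` and the gradient w.r.t. the weight is `dy.T.dot(x)`.\n",
"\n",
"```python\n",
"x = np.random.rand(4, 3)   # (batch, in_channels)\n",
"W = np.random.rand(5, 3)   # (channels, in_channels)\n",
"dy = np.ones((4, 5))       # pretend upstream gradient, same shape as y\n",
"dx = dy.dot(W)             # (4, 3), same shape as x\n",
"dW = dy.T.dot(x)           # (5, 3), same shape as W\n",
"print(dx.shape, dW.shape)\n",
"```\n"
]
},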
{
"cell_type": "markdown",
"id": "506b23b4",
"metadata": {},
"source": [
"### Registration"
]
},
{
"cell_type": "markdown",
"id": "735b55c9",
"metadata": {},
"source": [
"```python\n",
"@mx.operator.register(\"dense\") # register with name \"sigmoid\"\n",
"class DenseProp(mx.operator.CustomOpProp):\n",
" def __init__(self, bias):\n",
" super(DenseProp, self).__init__(True)\n",
" # we use constant bias here to illustrate how to pass arguments\n",
" # to operators. All arguments are in string format so you need\n",
" # to convert them back to the type you want.\n",
" self._bias = float(bias)\n",
"\n",
" def list_arguments(self):\n",
" return ['data', 'weight']\n",
"\n",
" def list_outputs(self):\n",
" # this can be omitted if you only have 1 output.\n",
" return ['output']\n",
"\n",
" def infer_shape(self, in_shapes):\n",
" data_shape = in_shapes[0]\n",
" weight_shape = in_shapes[1]\n",
" output_shape = (data_shape[0], weight_shape[0])\n",
" # return 3 lists representing inputs shapes, outputs shapes, and aux data shapes.\n",
" return (data_shape, weight_shape), (output_shape,), ()\n",
"\n",
" def create_operator(self, ctx, in_shapes, in_dtypes):\n",
" # create and return the CustomOp class.\n",
" return Dense(self._bias)\n",
"```\n"
]
},
{
"cell_type": "markdown",
"id": "5496235c",
"metadata": {},
"source": [
"### Use CustomOp together with Block\n",
"\n",
"Parameterized CustomOp are usually used together with Blocks, which holds the parameter."
]
},
{
"cell_type": "markdown",
"id": "437772aa",
"metadata": {},
"source": [
"```python\n",
"class DenseBlock(mx.gluon.Block):\n",
" def __init__(self, in_channels, channels, bias, **kwargs):\n",
" super(DenseBlock, self).__init__(**kwargs)\n",
" self._bias = bias\n",
" self.weight = self.params.get('weight', shape=(channels, in_channels))\n",
"\n",
" def forward(self, x):\n",
" ctx = x.context\n",
" return mx.nd.Custom(x, self.weight.data(ctx), bias=self._bias, op_type='dense')\n",
"```\n"
]
},
{
"cell_type": "markdown",
"id": "40931418",
"metadata": {},
"source": [
"### Example usage"
]
},
{
"cell_type": "markdown",
"id": "e65a6c82",
"metadata": {},
"source": [
"```python\n",
"dense = DenseBlock(3, 5, 0.1)\n",
"dense.initialize()\n",
"x = mx.nd.uniform(shape=(4, 3))\n",
"y = dense(x)\n",
"print(y)\n",
"```\n"
]
},
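{
"cell_type": "markdown",
"id": "9b3c5e77",
"metadata": {},
"source": [
"Because the weight lives in the Block's `params`, gradients flow into it like any other Gluon parameter. A minimal sketch, assuming the `dense` block and input `x` defined above:\n",
"\n",
"```python\n",
"with autograd.record():\n",
"    y = dense(x)\n",
"    loss = y.mean()\n",
"loss.backward()\n",
"# the custom op's weight parameter now holds a gradient\n",
"print(dense.weight.grad())\n",
"```\n"
]
},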
{
"cell_type": "markdown",
"id": "ee639976",
"metadata": {},
"source": [
"## Using custom operators with fork\n",
"In Linux systems, the default method in multiprocessing to create process is by using fork. If there are unfinished async custom operations when forking, the program will be blocked because of python GIL. Always use sync calls like `wait_to_read` or `waitall` before calling fork."
]
},
{
"cell_type": "markdown",
"id": "b1094963",
"metadata": {},
"source": [
"```python\n",
"x = mx.nd.array([0, 1, 2, 3])\n",
"y = mx.nd.Custom(x, op_type='sigmoid')\n",
"# unfinished async sigmoid operation will cause blocking\n",
"os.fork()\n",
"```\n"
]
},
{
"cell_type": "markdown",
"id": "54287d06",
"metadata": {},
"source": [
"Correctly handling this will make mxnet depend upon libpython, so the workaround now is to ensure that all custom operations are executed before forking process.\n",
"\n",
"```python\n",
"x = mx.nd.array([0, 1, 2, 3])\n",
"y = mx.nd.Custom(x, op_type='sigmoid')\n",
"# force execution by reading y\n",
"print(y.asnumpy())\n",
"os.fork()\n",
"```"
]
}
],
"metadata": {
"language_info": {
"name": "python"
}
},
"nbformat": 4,
"nbformat_minor": 5
}