{"nbformat": 4, "cells": [{"source": "# Naming of Gluon Parameter and Blocks\n\nIn gluon, each Parameter or Block has a name (and prefix). Parameter names are specified by users and Block names can be either specified by users or automatically created.\n\nIn this tutorial we talk about the best practices on naming. First, let's import MXNet and Gluon:", "cell_type": "markdown", "metadata": {}}, {"source": "from __future__ import print_function\nimport mxnet as mx\nfrom mxnet import gluon", "cell_type": "code", "execution_count": null, "outputs": [], "metadata": {}}, {"source": "## Naming Blocks\n\nWhen creating a block, you can assign a prefix to it:", "cell_type": "markdown", "metadata": {}}, {"source": "mydense = gluon.nn.Dense(100, prefix='mydense_')\nprint(mydense.prefix)", "cell_type": "code", "execution_count": null, "outputs": [], "metadata": {}}, {"source": " mydense_\n\n\nWhen no prefix is given, Gluon will automatically generate one:", "cell_type": "markdown", "metadata": {}}, {"source": "dense0 = gluon.nn.Dense(100)\nprint(dense0.prefix)", "cell_type": "code", "execution_count": null, "outputs": [], "metadata": {}}, {"source": " dense0_\n\n\nWhen you create more Blocks of the same kind, they will be named with incrementing suffixes to avoid collision:", "cell_type": "markdown", "metadata": {}}, {"source": "dense1 = gluon.nn.Dense(100)\nprint(dense1.prefix)", "cell_type": "code", "execution_count": null, "outputs": [], "metadata": {}}, {"source": " dense1_\n\n\n## Naming Parameters\n\nParameters within a Block will be named by prepending the prefix of the Block to the name of the Parameter:", "cell_type": "markdown", "metadata": {}}, {"source": "print(dense0.collect_params())", "cell_type": "code", "execution_count": null, "outputs": [], "metadata": {}}, {"source": " dense0_ (\n Parameter dense0_weight (shape=(100, 0), dtype=<type 'numpy.float32'>)\n Parameter dense0_bias (shape=(100,), dtype=<type 'numpy.float32'>)\n )\n\n\n## Name scopes\n\nTo manage the names of nested Blocks, each Block has a `name_scope` attached to it. 
## Name scopes

To manage the names of nested Blocks, each Block has a `name_scope` attached to it. All Blocks created within a name scope have their parent Block's prefix prepended to their names.

Let's demonstrate this by first defining a simple neural net:

```python
class Model(gluon.Block):
    def __init__(self, **kwargs):
        super(Model, self).__init__(**kwargs)
        with self.name_scope():
            self.dense0 = gluon.nn.Dense(20)
            self.dense1 = gluon.nn.Dense(20)
            self.mydense = gluon.nn.Dense(20, prefix='mydense_')

    def forward(self, x):
        x = mx.nd.relu(self.dense0(x))
        x = mx.nd.relu(self.dense1(x))
        return mx.nd.relu(self.mydense(x))
```

Now let's instantiate our neural net.

- Note that `model0.dense0` is named `model0_dense0_` instead of `dense0_`.
- Also note that although we specified `mydense_` as the prefix for `model0.mydense`, its parent's prefix is automatically prepended to produce the prefix `model0_mydense_`.

```python
model0 = Model()
model0.initialize()
model0(mx.nd.zeros((1, 20)))
print(model0.prefix)
print(model0.dense0.prefix)
print(model0.dense1.prefix)
print(model0.mydense.prefix)
```

    model0_
    model0_dense0_
    model0_dense1_
    model0_mydense_

If we instantiate `Model` again, it is given a different name, as shown before for `Dense`.

- Note that `model1.dense0` is still named `model1_dense0_` rather than `model1_dense2_`; the numbering does not continue from the dense layers created inside `model0`. This is because each `Model` instance has its own name scope, independent of the others.

```python
model1 = Model()
print(model1.prefix)
print(model1.dense0.prefix)
print(model1.dense1.prefix)
print(model1.mydense.prefix)
```

    model1_
    model1_dense0_
    model1_dense1_
    model1_mydense_

**It is recommended that you manually specify a prefix for the top-level Block, e.g. `model = Model(prefix='mymodel_')`, to avoid potential confusion in naming.**
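A fixed top-level prefix keeps parameter names stable no matter how many models have already been instantiated in the current session. Here is a minimal sketch; the prefix `mymodel_` is just an illustrative choice:

```python
# A minimal sketch: give the top-level Block an explicit prefix so that
# parameter names do not depend on how many Models were created before it.
model = Model(prefix='mymodel_')
print(model.prefix)          # mymodel_
print(model.dense0.prefix)   # mymodel_dense0_
print(model.mydense.prefix)  # mymodel_mydense_
```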
The same principle also applies to container Blocks like `Sequential`. `name_scope` can be used inside `__init__` as well as outside of `__init__`:

```python
net = gluon.nn.Sequential()
with net.name_scope():
    net.add(gluon.nn.Dense(20))
    net.add(gluon.nn.Dense(20))
print(net.prefix)
print(net[0].prefix)
print(net[1].prefix)
```

    sequential0_
    sequential0_dense0_
    sequential0_dense1_

`gluon.model_zoo` also behaves similarly:

```python
net = gluon.nn.Sequential()
with net.name_scope():
    net.add(gluon.model_zoo.vision.alexnet(pretrained=True))
    net.add(gluon.model_zoo.vision.alexnet(pretrained=True))
print(net.prefix, net[0].prefix, net[1].prefix)
```

    sequential1_ sequential1_alexnet0_ sequential1_alexnet1_

## Saving and loading

Because `model0` and `model1` have different prefixes, their parameters also have different names:

```python
print(model0.collect_params(), '\n')
print(model1.collect_params())
```

    model0_ (
      Parameter model0_dense0_weight (shape=(20L, 20L), dtype=<type 'numpy.float32'>)
      Parameter model0_dense0_bias (shape=(20L,), dtype=<type 'numpy.float32'>)
      Parameter model0_dense1_weight (shape=(20L, 20L), dtype=<type 'numpy.float32'>)
      Parameter model0_dense1_bias (shape=(20L,), dtype=<type 'numpy.float32'>)
      Parameter model0_mydense_weight (shape=(20L, 20L), dtype=<type 'numpy.float32'>)
      Parameter model0_mydense_bias (shape=(20L,), dtype=<type 'numpy.float32'>)
    )

    model1_ (
      Parameter model1_dense0_weight (shape=(20, 0), dtype=<type 'numpy.float32'>)
      Parameter model1_dense0_bias (shape=(20,), dtype=<type 'numpy.float32'>)
      Parameter model1_dense1_weight (shape=(20, 0), dtype=<type 'numpy.float32'>)
      Parameter model1_dense1_bias (shape=(20,), dtype=<type 'numpy.float32'>)
      Parameter model1_mydense_weight (shape=(20, 0), dtype=<type 'numpy.float32'>)
      Parameter model1_mydense_bias (shape=(20,), dtype=<type 'numpy.float32'>)
    )

As a result, if you try to save parameters from `model0` and load them into `model1`, you'll get an error due to the mismatched names:

```python
model0.collect_params().save('model.params')
try:
    model1.collect_params().load('model.params', mx.cpu())
except Exception as e:
    print(e)
```

    Parameter 'model1_dense0_weight' is missing in file 'model.params', which contains parameters: 'model0_mydense_weight', 'model0_dense1_bias', 'model0_dense1_weight', 'model0_dense0_weight', 'model0_dense0_bias', 'model0_mydense_bias'. Please make sure source and target networks have the same prefix.
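As the error message suggests, one quick fix is to construct both networks with the same explicit prefix so that the names in the saved file line up with the names in the target network. A minimal sketch; the variable names and the file name `same_prefix.params` are just for illustration:

```python
# A minimal sketch: give both networks the same explicit prefix so that the
# parameter names in the saved file match the names in the target network.
net_a = Model(prefix='model_')
net_a.initialize()
net_a(mx.nd.zeros((1, 20)))
net_a.collect_params().save('same_prefix.params')

net_b = Model(prefix='model_')
net_b.collect_params().load('same_prefix.params', mx.cpu())  # names now match
```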
A more general solution is to use `save_parameters`/`load_parameters` instead of `collect_params` with `save`/`load`. `save_parameters` uses the model structure, instead of the parameter names, to match parameters.

```python
model0.save_parameters('model.params')
model1.load_parameters('model.params')
print(mx.nd.load('model.params').keys())
```

    ['dense0.bias', 'mydense.bias', 'dense1.bias', 'dense1.weight', 'dense0.weight', 'mydense.weight']

## Replacing Blocks in networks and fine-tuning

Sometimes you may want to load a pretrained model and replace certain Blocks in it for fine-tuning.

For example, the AlexNet in the model zoo has 1000 output dimensions, but maybe you only have 100 classes in your application.

To see how to do this, we first load a pretrained AlexNet.

- In the Gluon model zoo, all image classification models follow the convention that the feature extraction layers are named `features` while the output layer is named `output`.
- Note that the output layer is a Dense Block with 1000 output dimensions.

```python
alexnet = gluon.model_zoo.vision.alexnet(pretrained=True)
print(alexnet.output)
print(alexnet.output.prefix)
```

    Dense(4096 -> 1000, linear)
    alexnet0_dense2_

To change the output to 100 dimensions, we replace it with a new Block:

```python
with alexnet.name_scope():
    alexnet.output = gluon.nn.Dense(100)
alexnet.output.initialize()
print(alexnet.output)
print(alexnet.output.prefix)
```

    Dense(None -> 100, linear)
    alexnet0_dense3_
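With the new output layer in place, a common fine-tuning setup is to train only the freshly initialized `output` Block while keeping the pretrained `features` fixed. Below is a minimal sketch, assuming the `alexnet` object from above; the optimizer settings, batch size, and dummy data are placeholders, not recommendations.

```python
# A minimal fine-tuning sketch (hyperparameters and data are placeholders).
# Freeze the pretrained feature extractor so its gradients are not computed.
alexnet.features.collect_params().setattr('grad_req', 'null')

# Optimize only the newly added 100-way output layer.
trainer = gluon.Trainer(alexnet.output.collect_params(), 'sgd',
                        {'learning_rate': 0.01})

# One hypothetical training step on a dummy batch.
loss_fn = gluon.loss.SoftmaxCrossEntropyLoss()
data = mx.nd.random.uniform(shape=(8, 3, 224, 224))
label = mx.nd.zeros((8,))
with mx.autograd.record():
    loss = loss_fn(alexnet(data), label)
loss.backward()
trainer.step(batch_size=8)
```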