
# Running TVM on bare metal Arm(R) Cortex(R)-M55 CPU and CMSIS-NN

This folder contains an example of how to use TVM to run a model on a bare metal Cortex(R)-M55 CPU with CMSIS-NN.

## Prerequisites

If the demo is run in the ci_cpu Docker container provided with TVM, then the following software will already be installed.

If the demo is not run in the ci_cpu Docker container, then you will need:

- A Fixed Virtual Platform (FVP) based on Arm(R) Corstone(TM)-300 software
- cmake 3.19.5
- The Python packages listed in requirements.txt
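
The Python packages can typically be installed with pip, run from this directory:

```bash
# Install the Python packages this demo depends on.
pip install -r requirements.txt
```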

You will also need TVM, which can either be:

- Built from source (see Install from Source)
  - When building from source, the following need to be set in config.cmake (see the sketch after this list):
    - `set(USE_CMSISNN ON)`
    - `set(USE_MICRO ON)`
    - `set(USE_LLVM ON)`
- Installed from TLCPack (see TLCPack)
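
For the from-source route, a minimal build sketch might look like the following. It assumes you are in the root of a TVM checkout; paths, parallelism, and any additional config.cmake options are illustrative and should be adapted to your setup.

```bash
# Illustrative TVM from-source build enabling the options listed above.
mkdir -p build && cp cmake/config.cmake build/
cd build
echo "set(USE_CMSISNN ON)" >> config.cmake
echo "set(USE_MICRO ON)" >> config.cmake
echo "set(USE_LLVM ON)" >> config.cmake
cmake ..
make -j"$(nproc)"
```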

You will need to update your PATH environment variable to include the path to cmake 3.19.5 and the FVP. For example, if you've installed these in /opt/arm, then you would do the following:

```bash
export PATH=/opt/arm/FVP_Corstone_SSE-300/models/Linux64_GCC-6.4:/opt/arm/cmake/bin:$PATH
```

## Running the demo application

Type the following command to run the bare metal demo application (src/demo_bare_metal.c):

```bash
./run_demo.sh
```

If the Ethos(TM)-U platform and/or CMSIS have not been installed in /opt/arm/ethosu, then their locations can be specified as arguments to run_demo.sh, for example:

```bash
./run_demo.sh --cmsis_path /home/tvm-user/cmsis \
    --ethosu_platform_path /home/tvm-user/ethosu/core_platform
```

This will:

- Download a quantized (int8) person detection model
- Use tvmc to compile the model for the Cortex(R)-M55 CPU and CMSIS-NN (an illustrative tvmc invocation follows this list)
- Download an image to run the model on
- Create a C header file inputs.c containing the image data as a C array
- Create a C header file outputs.c containing a C array where the output of inference will be stored
- Build the demo application
- Run the demo application on a Fixed Virtual Platform (FVP) based on Arm(R) Corstone(TM)-300 software
- Report whether a person was detected, e.g. "Person detected." or "No person detected."
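
For reference, the compilation step is driven by tvmc inside run_demo.sh. A simplified sketch of that kind of invocation is shown below; the exact flags and file names used by run_demo.sh may differ, and "person_detect.tflite" and "module.tar" are illustrative names.

```bash
# Simplified sketch: compile the model with tvmc, offloading supported
# operators to CMSIS-NN and generating plain C for the rest.
tvmc compile person_detect.tflite \
    --target=cmsis-nn,c \
    --target-c-mcpu=cortex-m55 \
    --runtime=crt \
    --executor=aot \
    --output-format=mlf \
    --output module.tar
```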

## Using your own image

The convert_image.py script takes a single command line argument: the path of the image to be converted into an array of bytes for consumption by the model.
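
For example, assuming an image file named my_image.bmp (an illustrative name) in the current directory:

```bash
# Convert a custom image into the array consumed by the demo.
python3 convert_image.py my_image.bmp
```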

The demo can be modified to use an image of your choice by changing the following line in run_demo.sh:

```bash
curl -sS https://raw.githubusercontent.com/tensorflow/tflite-micro/main/tensorflow/lite/micro/examples/person_detection/testdata/person.bmp -o input_image.bmp
```
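
For example, to use a local image instead of the downloaded sample, that line could be replaced with something along these lines (the source path is illustrative):

```bash
# Use a local bitmap instead of downloading the sample image.
cp /path/to/my_image.bmp input_image.bmp
```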