author    Isabella Gottardi <isabella.gottardi@arm.com>  2021-05-07 11:57:30 +0100
committer Kshitij Sisodia <kshitij-sisodia-arm@review.mlplatform.org>  2021-05-19 08:49:10 +0000
commit    958133d49b121e997b63ad81f96db90fdd2a3e45 (patch)
tree      b658654353c15a75f67df8c85a1cf6404fd3c8a7 /docs
parent    afb0963b9d4413a398ebaa0185db88d88295e954 (diff)
download  ml-embedded-evaluation-kit-958133d49b121e997b63ad81f96db90fdd2a3e45.tar.gz
MLECO-1913: Documentation update: helper scripts and AD use case model update
Change-Id: I610b720146b520fe8633d25255b97df647b99ef5
Signed-off-by: Isabella Gottardi <isabella.gottardi@arm.com>
Diffstat (limited to 'docs')
-rw-r--r--  docs/documentation.md                  |   2
-rw-r--r--  docs/quick_start.md                    | 192
-rw-r--r--  docs/sections/building.md              |   8
-rw-r--r--  docs/sections/testing_benchmarking.md  |  10
4 files changed, 141 insertions, 71 deletions
diff --git a/docs/documentation.md b/docs/documentation.md
index 8ab9fa3..9ec73a3 100644
--- a/docs/documentation.md
+++ b/docs/documentation.md
@@ -185,7 +185,7 @@ from [Arm ML-Zoo](https://github.com/ARM-software/ML-zoo/).
- [Mobilenet V2](https://github.com/ARM-software/ML-zoo/blob/master/models/image_classification/mobilenet_v2_1.0_224/tflite_uint8).
- [DS-CNN](https://github.com/ARM-software/ML-zoo/blob/master/models/keyword_spotting/ds_cnn_large/tflite_clustered_int8).
- [Wav2Letter](https://github.com/ARM-software/ML-zoo/blob/master/models/speech_recognition/wav2letter/tflite_int8).
-- Anomaly Detection (coming soon).
+- [Anomaly Detection](https://github.com/ARM-software/ML-zoo/raw/7c32b097f7d94aae2cd0b98a8ed5a3ba81e66b18/models/anomaly_detection/micronet_medium/tflite_int8/ad_medium_int8.tflite).
When using the Ethos-U55 NPU backend, the NN model is assumed to be optimized by the Vela compiler.
However, even if it is not, it will fall back on the CPU and execute, provided the operators are supported by TensorFlow Lite Micro.
diff --git a/docs/quick_start.md b/docs/quick_start.md
index f3565e8..abf8f50 100644
--- a/docs/quick_start.md
+++ b/docs/quick_start.md
@@ -1,8 +1,9 @@
# Quick start example ML application
-This is a quick start guide that will show you how to run the keyword spotting example application. The aim of this guide
-is to illustrate the flow of running an application on the evaluation kit rather than showing the keyword spotting
-functionality or performance. All use cases in the evaluation kit follow the steps.
+This is a quick start guide that shows you how to run the keyword spotting example application.
+The aim is to enable you to run an application quickly on the Fixed Virtual Platform.
+We assume that your Arm® Ethos™-U55 NPU is configured to use 128 Multiply-Accumulate units
+and shares SRAM with the Arm® Cortex®-M55.
1. Verify you have installed [the required prerequisites](sections/building.md#Build-prerequisites).
@@ -19,75 +20,142 @@ functionality or performance. All use cases in the evaluation kit follow the ste
git submodule update --init
```
-4. Next, you would need to get a neural network model. For the purpose of this quick start guide, we'll use the
- `ds_cnn_clustered_int8` keyword spotting model from the [Arm public model zoo](https://github.com/ARM-software/ML-zoo)
- and the principle remains the same for all of the other use cases. Download the `ds_cnn_large_int8.tflite` model
- file with the curl command below:
-
- ```commandline
- curl -L https://github.com/ARM-software/ML-zoo/blob/master/models/keyword_spotting/ds_cnn_large/tflite_clustered_int8/ds_cnn_clustered_int8.tflite?raw=true --output ds_cnn_clustered_int8.tflite
- ```
-
-5. [Vela](https://review.mlplatform.org/plugins/gitiles/ml/ethos-u/ethos-u-vela) is an open-source python tool converting
+4. Next, you can use the `build_default.py` Python script to get the default neural network models, compile them with
+   Vela, and build the project.
+   [Vela](https://review.mlplatform.org/plugins/gitiles/ml/ethos-u/ethos-u-vela) is an open-source Python tool that converts a
TensorFlow Lite for Microcontrollers neural network model into an optimized model that can run on an embedded system
containing an Ethos-U55 NPU. It is worth noting that in order to take full advantage of the capabilities of the NPU, the
neural network operators should be [supported by Vela](https://review.mlplatform.org/plugins/gitiles/ml/ethos-u/ethos-u-vela/+/HEAD/SUPPORTED_OPS.md).
- In this step, you will compile the model with Vela.
-
- For this step, you need to ensure you have [correctly installed the Vela package](https://pypi.org/project/ethos-u-vela/):
-
- ```commandline
- python3 -m venv env
- source ./env/bin/activate
- pip install --upgrade pip
- pip install --upgrade setuptools
- pip install ethos-u-vela
- ```
-
- In the command below, we specify that we are using the Arm® Ethos™-U55 NPU with a 128 Multiply-Accumulate units
- (MAC units) configured for a High End Embedded use case. The [building section](sections/building.md#Optimize-custom-model-with-Vela-compiler)
- has more detailed explanation about Vela usage.
-
- ```commandline
- vela ds_cnn_clustered_int8.tflite \
- --accelerator-config=ethos-u55-128 \
- --block-config-limit=0 \
- --config scripts/vela/vela.ini \
- --memory-mode Shared_Sram \
- --system-config Ethos_U55_High_End_Embedded
- ```
-
- An optimized model file for Ethos-U55 is generated in a folder named `output`.
-
-6. Create a `build` folder in the root level of the evaluation kit.
-
- ```commandline
- mkdir build && cd build
- ```
-
-7. Build the makefiles with `CMake` as shown in the command below. The [build process section](sections/building.md#Build-process)
- gives an in-depth explanation about the meaning of every parameter. For the time being, note that we point the Vela
- optimized model from stage 5 in the `-Dkws_MODEL_TFLITE_PATH` parameter.
-
- ```commandline
- cmake \
- -DUSE_CASE_BUILD=kws \
- -Dkws_MODEL_TFLITE_PATH=output/ds_cnn_clustered_int8_vela.tflite \
- ..
- ```
-
-8. Compile the project with a `make`. Details about this stage can be found in the [building part of the documentation](sections/building.md#Building-the-configured-project).
```commandline
- make -j4
+ python3 build_default.py
```
-9. Launch the project as explained [here](sections/deployment.md#Deployment). In this quick-start guide, we'll use the Fixed
- Virtual Platform. Point the generated `bin/ethos-u-kws.axf` file in stage 8 to the FVP that you have downloaded when
+5. Launch the project as explained [here](sections/deployment.md#Deployment). For the purpose of this quick start guide,
+ we'll use the keyword spotting application and the Fixed Virtual Platform.
+   Pass the `bin/ethos-u-kws.axf` file generated in stage 4 to the FVP that you downloaded when
installing the prerequisites.
```commandline
<path_to_FVP>/FVP_Corstone_SSE-300_Ethos-U55 -a ./bin/ethos-u-kws.axf
```
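+
+   A sketch of a non-interactive invocation follows. The `-C` parameter names here are assumptions
+   about the Corstone SSE-300 FVP model and should be verified against your FVP version (for
+   example via its `--list-params` option):
+
+   ```commandline
+   <path_to_FVP>/FVP_Corstone_SSE-300_Ethos-U55 \
+       -C mps3_board.visualisation.disable-visualisation=1 \
+       -C mps3_board.telnetterminal0.start_telnet=0 \
+       -C mps3_board.uart0.out_file=- \
+       -a ./bin/ethos-u-kws.axf
+   ```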
-10. A telnet window is launched through which you can interact with the application and obtain performance figures.
+6. A telnet window is launched through which you can interact with the application and obtain performance figures.
+
+> **Note:** Executing the `build_default.py` script is equivalent to running the following commands:
+
+```commandline
+mkdir resources_downloaded && cd resources_downloaded
+python3 -m venv env
+env/bin/python3 -m pip install --upgrade pip
+env/bin/python3 -m pip install --upgrade setuptools
+env/bin/python3 -m pip install ethos-u-vela==2.1.1
+cd ..
+
+curl -L https://github.com/ARM-software/ML-zoo/raw/7c32b097f7d94aae2cd0b98a8ed5a3ba81e66b18/models/anomaly_detection/micronet_medium/tflite_int8/ad_medium_int8.tflite \
+ --output resources_downloaded/ad/ad_medium_int8.tflite
+curl -L https://github.com/ARM-software/ML-zoo/raw/7c32b097f7d94aae2cd0b98a8ed5a3ba81e66b18/models/anomaly_detection/micronet_medium/tflite_int8/testing_input/input/0.npy \
+ --output ./resources_downloaded/ad/ifm0.npy
+curl -L https://github.com/ARM-software/ML-zoo/raw/7c32b097f7d94aae2cd0b98a8ed5a3ba81e66b18/models/anomaly_detection/micronet_medium/tflite_int8/testing_output/Identity/0.npy \
+ --output ./resources_downloaded/ad/ofm0.npy
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/speech_recognition/wav2letter/tflite_int8/wav2letter_int8.tflite \
+ --output ./resources_downloaded/asr/wav2letter_int8.tflite
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/speech_recognition/wav2letter/tflite_int8/testing_input/input_2_int8/0.npy \
+ --output ./resources_downloaded/asr/ifm0.npy
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/speech_recognition/wav2letter/tflite_int8/testing_output/Identity_int8/0.npy \
+ --output ./resources_downloaded/asr/ofm0.npy
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/image_classification/mobilenet_v2_1.0_224/tflite_uint8/mobilenet_v2_1.0_224_quantized_1_default_1.tflite \
+ --output ./resources_downloaded/img_class/mobilenet_v2_1.0_224_quantized_1_default_1.tflite
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/image_classification/mobilenet_v2_1.0_224/tflite_uint8/testing_input/input/0.npy \
+ --output ./resources_downloaded/img_class/ifm0.npy
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/image_classification/mobilenet_v2_1.0_224/tflite_uint8/testing_output/output/0.npy \
+ --output ./resources_downloaded/img_class/ofm0.npy
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/keyword_spotting/ds_cnn_large/tflite_clustered_int8/ds_cnn_clustered_int8.tflite \
+ --output ./resources_downloaded/kws/ds_cnn_clustered_int8.tflite
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/keyword_spotting/ds_cnn_large/tflite_clustered_int8/testing_input/input_2/0.npy \
+ --output ./resources_downloaded/kws/ifm0.npy
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/keyword_spotting/ds_cnn_large/tflite_clustered_int8/testing_output/Identity/0.npy \
+ --output ./resources_downloaded/kws/ofm0.npy
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/speech_recognition/wav2letter/tflite_int8/wav2letter_int8.tflite \
+ --output ./resources_downloaded/kws_asr/wav2letter_int8.tflite
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/speech_recognition/wav2letter/tflite_int8/testing_input/input_2_int8/0.npy \
+ --output ./resources_downloaded/kws_asr/asr/ifm0.npy
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/speech_recognition/wav2letter/tflite_int8/testing_output/Identity_int8/0.npy \
+ --output ./resources_downloaded/kws_asr/asr/ofm0.npy
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/keyword_spotting/ds_cnn_large/tflite_clustered_int8/ds_cnn_clustered_int8.tflite \
+ --output ./resources_downloaded/kws_asr/ds_cnn_clustered_int8.tflite
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/keyword_spotting/ds_cnn_large/tflite_clustered_int8/testing_input/input_2/0.npy \
+ --output ./resources_downloaded/kws_asr/kws/ifm0.npy
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/keyword_spotting/ds_cnn_large/tflite_clustered_int8/testing_output/Identity/0.npy \
+ --output ./resources_downloaded/kws_asr/kws/ofm0.npy
+curl -L https://github.com/ARM-software/ML-zoo/raw/68b5fbc77ed28e67b2efc915997ea4477c1d9d5b/models/keyword_spotting/dnn_small/tflite_int8/dnn_s_quantized.tflite \
+ --output ./resources_downloaded/inference_runner/dnn_s_quantized.tflite
+
+. resources_downloaded/env/bin/activate && vela resources_downloaded/kws/ds_cnn_clustered_int8.tflite \
+ --accelerator-config=ethos-u55-128 \
+ --block-config-limit=0 --config scripts/vela/default_vela.ini \
+ --memory-mode=Shared_Sram \
+ --system-config=Ethos_U55_High_End_Embedded \
+ --output-dir=resources_downloaded/kws
+mv resources_downloaded/kws/ds_cnn_clustered_int8_vela.tflite resources_downloaded/kws/ds_cnn_clustered_int8_vela_H128.tflite
+
+. resources_downloaded/env/bin/activate && vela resources_downloaded/kws_asr/wav2letter_int8.tflite \
+ --accelerator-config=ethos-u55-128 \
+ --block-config-limit=0 --config scripts/vela/default_vela.ini \
+ --memory-mode=Shared_Sram \
+ --system-config=Ethos_U55_High_End_Embedded \
+ --output-dir=resources_downloaded/kws_asr
+mv resources_downloaded/kws_asr/wav2letter_int8_vela.tflite resources_downloaded/kws_asr/wav2letter_int8_vela_H128.tflite
+
+. resources_downloaded/env/bin/activate && vela resources_downloaded/kws_asr/ds_cnn_clustered_int8.tflite \
+ --accelerator-config=ethos-u55-128 \
+ --block-config-limit=0 --config scripts/vela/default_vela.ini \
+ --memory-mode=Shared_Sram \
+ --system-config=Ethos_U55_High_End_Embedded \
+ --output-dir=resources_downloaded/kws_asr
+mv resources_downloaded/kws_asr/ds_cnn_clustered_int8_vela.tflite resources_downloaded/kws_asr/ds_cnn_clustered_int8_vela_H128.tflite
+
+. resources_downloaded/env/bin/activate && vela resources_downloaded/inference_runner/dnn_s_quantized.tflite \
+ --accelerator-config=ethos-u55-128 \
+ --block-config-limit=0 --config scripts/vela/default_vela.ini \
+ --memory-mode=Shared_Sram \
+ --system-config=Ethos_U55_High_End_Embedded \
+ --output-dir=resources_downloaded/inference_runner
+mv resources_downloaded/inference_runner/dnn_s_quantized_vela.tflite resources_downloaded/inference_runner/dnn_s_quantized_vela_H128.tflite
+
+. resources_downloaded/env/bin/activate && vela resources_downloaded/img_class/mobilenet_v2_1.0_224_quantized_1_default_1.tflite \
+ --accelerator-config=ethos-u55-128 \
+ --block-config-limit=0 --config scripts/vela/default_vela.ini \
+ --memory-mode=Shared_Sram \
+ --system-config=Ethos_U55_High_End_Embedded \
+ --output-dir=resources_downloaded/img_class
+mv resources_downloaded/img_class/mobilenet_v2_1.0_224_quantized_1_default_1_vela.tflite resources_downloaded/img_class/mobilenet_v2_1.0_224_quantized_1_default_1_vela_H128.tflite
+
+. resources_downloaded/env/bin/activate && vela resources_downloaded/asr/wav2letter_int8.tflite \
+ --accelerator-config=ethos-u55-128 \
+ --block-config-limit=0 --config scripts/vela/default_vela.ini \
+ --memory-mode=Shared_Sram \
+ --system-config=Ethos_U55_High_End_Embedded \
+ --output-dir=resources_downloaded/asr
+mv resources_downloaded/asr/wav2letter_int8_vela.tflite resources_downloaded/asr/wav2letter_int8_vela_H128.tflite
+
+. resources_downloaded/env/bin/activate && vela resources_downloaded/ad/ad_medium_int8.tflite \
+ --accelerator-config=ethos-u55-128 \
+ --block-config-limit=0 --config scripts/vela/default_vela.ini \
+ --memory-mode=Shared_Sram \
+ --system-config=Ethos_U55_High_End_Embedded \
+ --output-dir=resources_downloaded/ad
+mv resources_downloaded/ad/ad_medium_int8_vela.tflite resources_downloaded/ad/ad_medium_int8_vela_H128.tflite
+
+mkdir cmake-build-mps3-sse-300-gnu-release && cd cmake-build-mps3-sse-300-gnu-release
+
+cmake .. \
+ -DTARGET_PLATFORM=mps3 \
+ -DTARGET_SUBSYSTEM=sse-300 \
+ -DCMAKE_TOOLCHAIN_FILE=scripts/cmake/toolchains/bare-metal-gcc.cmake
+```
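+
+The `cmake` invocation above only configures the build. A sketch of the remaining compile step,
+assuming the default make-based generator used elsewhere in this guide:
+
+```commandline
+cd cmake-build-mps3-sse-300-gnu-release
+make -j4
+```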
+
+> **Note:** If you want to make changes to the application (for example, modifying the number of MAC units of the Ethos-U NPU or running a custom neural network),
+> you should follow the approach defined in [documentation.md](../docs/documentation.md) instead of using the `build_default.py` script.
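+
+For instance, a sketch of such a manual configuration for the keyword spotting use case, pointing the
+build at a custom Vela-optimized model (the model path here is illustrative):
+
+```commandline
+mkdir -p build && cd build
+cmake \
+    -DUSE_CASE_BUILD=kws \
+    -Dkws_MODEL_TFLITE_PATH=resources_downloaded/kws/ds_cnn_clustered_int8_vela_H128.tflite \
+    ..
+make -j4
+```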
diff --git a/docs/sections/building.md b/docs/sections/building.md
index 7bd01d1..98cb5e8 100644
--- a/docs/sections/building.md
+++ b/docs/sections/building.md
@@ -393,9 +393,11 @@ cmake ../ -DTARGET_PLATFORM=native
Results of the build will be placed under `build/bin/` folder:
```tree
- bin
- |- dev_ethosu_eval-tests
- |_ ethos-u
+bin
+├── arm_ml_embedded_evaluation_kit-<usecase1>-tests
+├── arm_ml_embedded_evaluation_kit-<usecase2>-tests
+├── ethos-u-<usecase1>
+└── ethos-u-<usecase2>
```
### Configuring the build for simple_platform
diff --git a/docs/sections/testing_benchmarking.md b/docs/sections/testing_benchmarking.md
index 7932dde..904f2c9 100644
--- a/docs/sections/testing_benchmarking.md
+++ b/docs/sections/testing_benchmarking.md
@@ -27,13 +27,13 @@ Where:
- `utils`: contains utilities sources used only within the tests.
When [configuring](./building.md#configuring-the-build-native-unit-test) and
-[building](./building.md#Building-the-configured-project) for `native` target platform results of the build will
-be placed under `build/bin/` folder, for example:
+[building](./building.md#building-the-configured-project) for the `native` target platform, the results of the build will
+be placed under the `<build folder>/bin/` folder, for example:
```tree
.
-├── dev_ethosu_eval-<usecase1>-tests
-├── dev_ethosu_eval-<usecase2>-tests
+├── arm_ml_embedded_evaluation_kit-<usecase1>-tests
+├── arm_ml_embedded_evaluation_kit-<usecase2>-tests
├── ethos-u-<usecase1>
└── ethos-u-<usecase2>
```
@@ -41,7 +41,7 @@ be placed under `build/bin/` folder, for example:
To execute unit-tests for a specific use-case in addition to the common tests:
```commandline
-dev_ethosu_eval-<use_case>-tests
+arm_ml_embedded_evaluation_kit-<use_case>-tests
```
```log