author    Nikhil Raj <nikhil.raj@arm.com>    2022-02-17 12:12:06 +0000
committer Colm Donelan <colm.donelan@arm.com>    2022-02-17 21:22:46 +0000
commit    79eed6d0d39317c950c6f631ac1bebead5592ab4 (patch)
tree      084bf32b3c028c38ca437a8b3da889a5b5ef8a39
parent    c466e41bcc8a35f9fa6d94664bf2a75c207f8048 (diff)
Update TfLite build guides
* Update path changes in building TF, CL
* Add option to switch off XNN package while building TFLite
* Remove obsolete links

Signed-off-by: Nikhil Raj <nikhil.raj@arm.com>
Change-Id: Ib2c932ceb8ef2be4e01ac77ac8f880f814d452cb
-rw-r--r--    delegate/BuildGuideNative.md    20
-rw-r--r--    delegate/DelegateQuickStartGuide.md    16
2 files changed, 19 insertions, 17 deletions
diff --git a/delegate/BuildGuideNative.md b/delegate/BuildGuideNative.md
index 932c74423a..4aa1af3ee9 100644
--- a/delegate/BuildGuideNative.md
+++ b/delegate/BuildGuideNative.md
@@ -65,7 +65,7 @@ found [here](https://docs.bazel.build/versions/master/install-compile-source.htm
compile with CMake. Depending on your operating system and architecture there might be an easier way.
```bash
wget -O cmake-3.16.0.tar.gz https://cmake.org/files/v3.16/cmake-3.16.0.tar.gz
-tar -xzf cmake-3.16.0.tar.gz -C $BASEDIR/cmake-3.16.0
+tar -xzf cmake-3.16.0.tar.gz -C $BASEDIR/
# If you have an older CMake, remove the installed version in order to upgrade
yes | sudo apt-get purge cmake
@@ -89,8 +89,10 @@ git checkout $(../armnn/scripts/get_tensorflow.sh -p) # Minimum version required
Now the build process can be started. When calling "cmake", as below, you can specify a number of build
flags. If you have no need to configure your TensorFlow build, you can follow the exact commands below:
```bash
-cmake $BASEDIR/tensorflow
-cmake --build $BASEDIR/tflite-output # This will be your DTFLITE_LIB_ROOT directory
+mkdir build # You are already inside $BASEDIR/tensorflow at this point
+cd build
+cmake $BASEDIR/tensorflow/tensorflow/lite -DTFLITE_ENABLE_XNNPACK=OFF
+cmake --build . # This will be your DTFLITE_LIB_ROOT directory
```
## Build Flatbuffers
@@ -123,7 +125,7 @@ To build the Arm Compute Library on your platform, download the Arm Compute Libr
the version you want to use. Build it using `scons`.
```bash
-cd $HOME/armnn-devenv
+cd $BASEDIR
git clone https://review.mlplatform.org/ml/ComputeLibrary
cd ComputeLibrary/
git checkout $(../armnn/scripts/get_compute_library.sh -p) # e.g. v21.11
@@ -152,7 +154,7 @@ with the additional cmake arguments shown below
cd $BASEDIR/armnn/delegate && mkdir build && cd build
cmake .. -DCMAKE_BUILD_TYPE=release # A release build rather than a debug build.
-DTENSORFLOW_ROOT=$BASEDIR/tensorflow \ # The root directory where tensorflow can be found.
- -DTFLITE_LIB_ROOT=$BASEDIR/tflite-output \ # Directory where tensorflow libraries can be found.
+ -DTFLITE_LIB_ROOT=$BASEDIR/tensorflow/build \ # Directory where tensorflow libraries can be found.
-DFLATBUFFERS_ROOT=$BASEDIR/flatbuffers-1.12.0/install \ # Flatbuffers install directory.
-DArmnn_DIR=$BASEDIR/armnn/build \ # Directory where the Arm NN library can be found
-DARMNN_SOURCE_DIR=$BASEDIR/armnn # The top directory of the Arm NN repository.
@@ -199,7 +201,7 @@ cmake .. -DARMCOMPUTE_ROOT=$BASEDIR/ComputeLibrary \
-DBUILD_UNIT_TESTS=0 \
-DBUILD_ARMNN_TFLITE_DELEGATE=1 \
-DTENSORFLOW_ROOT=$BASEDIR/tensorflow \
- -DTFLITE_LIB_ROOT=$BASEDIR/tflite-output \
+ -DTFLITE_LIB_ROOT=$BASEDIR/tensorflow/build \
-DFLATBUFFERS_ROOT=$BASEDIR/flatbuffers-1.12.0/install
make
```
@@ -227,11 +229,11 @@ wget https://github.com/ARM-software/ML-zoo/blob/master/models/image_classificat
```
## Execute the benchmarking tool with the Arm NN delegate
+You are already in $BASEDIR/benchmarking from the previous stage.
```bash
-cd $BASEDIR/benchmarking
LD_LIBRARY_PATH=../armnn/build ./benchmark_model --graph=mobilenet_v2_1.0_224_quantized_1_default_1.tflite --external_delegate_path="../armnn/build/delegate/libarmnnDelegate.so" --external_delegate_options="backends:CpuAcc;logging-severity:info"
```
-The "external_delegate_options" here are specific to the Arm NN delegate. They are used to specify a target Arm NN backend or to enable/disable various options in Arm NN. A full description can be found in the parameters of function [tflite_plugin_create_delegate](namespacetflite.xhtml).
+The "external_delegate_options" here are specific to the Arm NN delegate. They are used to specify a target Arm NN backend or to enable/disable various options in Arm NN. A full description can be found in the parameters of function tflite_plugin_create_delegate.
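As the updated text notes, the option string is a set of `key:value` pairs separated by semicolons (e.g. `backends:CpuAcc;logging-severity:info`). As an illustration of that format only, here is a minimal Python sketch of parsing such a string; `parse_delegate_options` is a hypothetical helper, not part of Arm NN or TfLite:

```python
def parse_delegate_options(option_string):
    """Parse a semicolon-separated 'key:value' option string into a dict."""
    options = {}
    for pair in option_string.split(";"):
        if not pair:
            continue  # tolerate trailing/duplicate semicolons
        key, _, value = pair.partition(":")
        options[key.strip()] = value.strip()
    return options

opts = parse_delegate_options("backends:CpuAcc;logging-severity:info")
print(opts)  # {'backends': 'CpuAcc', 'logging-severity': 'info'}
```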
# Integrate the Arm NN TfLite Delegate into your project
@@ -256,4 +258,4 @@ armnnDelegateInterpreter->ModifyGraphWithDelegate(theArmnnDelegate.get());
For further information on using TfLite Delegates please visit the [tensorflow website](https://www.tensorflow.org/lite/guide)
-For more details of the kind of options you can pass to the Arm NN delegate please check the parameters of function [tflite_plugin_create_delegate](namespacetflite.xhtml).
+For more details of the kind of options you can pass to the Arm NN delegate please check the parameters of function tflite_plugin_create_delegate.
diff --git a/delegate/DelegateQuickStartGuide.md b/delegate/DelegateQuickStartGuide.md
index b581bce62c..ce58624677 100644
--- a/delegate/DelegateQuickStartGuide.md
+++ b/delegate/DelegateQuickStartGuide.md
@@ -1,5 +1,5 @@
# TfLite Delegate Quick Start Guide
-If you have downloaded the ArmNN Github binaries or built the TfLite delegate yourself, then this tutorial will show you how you can
+If you have downloaded the Arm NN Github binaries or built the TfLite delegate yourself, then this tutorial will show you how you can
integrate it into TfLite to run models using python.
Here is an example python script showing how to do this. In this script we are making use of the
@@ -13,7 +13,7 @@ import tflite_runtime.interpreter as tflite
# (if you are using the complete tensorflow package you can find load_delegate in tf.experimental.load_delegate)
armnn_delegate = tflite.load_delegate( library="<path-to-armnn-binaries>/libarmnnDelegate.so",
options={"backends": "CpuAcc,GpuAcc,CpuRef", "logging-severity":"info"})
-# Delegates/Executes all operations supported by ArmNN to/with ArmNN
+# Delegates/Executes all operations supported by Arm NN to/with Arm NN
interpreter = tflite.Interpreter(model_path="<your-armnn-repo-dir>/delegate/python/test/test_data/mock_model.tflite",
experimental_delegates=[armnn_delegate])
interpreter.allocate_tensors()
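The `backends` value in the snippet above is a comma-separated list whose order expresses preference (here CpuAcc, then GpuAcc, then CpuRef). Purely to illustrate that format, a small Python sketch of splitting and sanity-checking such a list; `backend_priority` and the hard-coded backend set are assumptions for this example, not Arm NN API:

```python
# Backend IDs taken from the snippet above; a real setup may expose others.
KNOWN_BACKENDS = {"CpuAcc", "GpuAcc", "CpuRef"}

def backend_priority(option_value):
    """Split a comma-separated backend string, preserving priority order."""
    backends = [b.strip() for b in option_value.split(",") if b.strip()]
    unknown = [b for b in backends if b not in KNOWN_BACKENDS]
    if unknown:
        raise ValueError(f"Unknown backend(s): {unknown}")
    return backends

print(backend_priority("CpuAcc,GpuAcc,CpuRef"))  # ['CpuAcc', 'GpuAcc', 'CpuRef']
```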
@@ -36,14 +36,14 @@ print(output_data)
# Prepare the environment
Pre-requisites:
- * Dynamically build Arm NN Delegate library or download the ArmNN binaries
+ * Dynamically build Arm NN Delegate library or download the Arm NN binaries
* python3 (Depends on TfLite version)
* virtualenv
* numpy (Depends on TfLite version)
* tflite_runtime (>=2.5, depends on Arm NN Delegate)
If you haven't built the delegate yet then take a look at the [build guide](./BuildGuideNative.md). Otherwise,
-you can download the binaries [here](https://github.com/ARM-software/armnn/releases/tag/v21.11)
+you can download the binaries [here](https://github.com/ARM-software/armnn/releases/)
We recommend creating a virtual environment for this tutorial. For the following code to work python3 is needed. Please
also check the documentation of the TfLite version you want to use. There might be additional prerequisites for the python
@@ -88,16 +88,16 @@ python ExternalDelegatePythonTutorial.py
```
The output should look similar to this:
```bash
-Info: ArmNN v27.0.0
+Info: Arm NN v28.0.0
Info: Initialization time: 0.56 ms
-INFO: TfLiteArmnnDelegate: Created TfLite ArmNN delegate.
+INFO: TfLiteArmnnDelegate: Created TfLite Arm NN delegate.
[[ 12 123 16 12 11 14 20 16 20 12]]
Info: Shutdown time: 0.28 ms
```
-For more details of the kind of options you can pass to the Arm NN delegate please check the parameters of function [tflite_plugin_create_delegate](namespacetflite.xhtml).
+For more details of the kind of options you can pass to the Arm NN delegate please check the parameters of function tflite_plugin_create_delegate.
You can also test the functionality of the external delegate adaptor by running some unit tests:
```bash
@@ -105,4 +105,4 @@ pip install pytest
cd armnn/delegate/python/test
# You can deselect tests that require backends that your hardware doesn't support using markers e.g. -m "not GpuAccTest"
pytest --delegate-dir="<path-to-armnn-binaries>/libarmnnDelegate.so" -m "not GpuAccTest"
-``` \ No newline at end of file
+```