# TfLite Delegate Quick Start Guide
If you have downloaded the ArmNN GitHub binaries or built the TfLite delegate yourself, then this tutorial
will show you how to integrate it into TfLite to run models using python.

Here is an example python script showing how to do this. In this script we are making use of the
[external adaptor](https://www.tensorflow.org/lite/performance/implementing_delegate#option_2_leverage_external_delegate)
tool of TfLite that allows you to load delegates at runtime.
```python
import numpy as np
import tflite_runtime.interpreter as tflite

# Load the TfLite model and allocate tensors.
# (If you are using the complete tensorflow package you can find load_delegate in tf.lite.experimental.load_delegate)
armnn_delegate = tflite.load_delegate(library="<path-to-armnn-binaries>/libarmnnDelegate.so",
                                      options={"backends": "CpuAcc,GpuAcc,CpuRef", "logging-severity": "info"})
# Delegates/Executes all operations supported by ArmNN to/with ArmNN
interpreter = tflite.Interpreter(model_path="<your-armnn-repo-dir>/delegate/python/test/test_data/mock_model.tflite",
                                 experimental_delegates=[armnn_delegate])
interpreter.allocate_tensors()

# Get input and output tensors.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Test the model on random input data.
input_shape = input_details[0]['shape']
input_data = np.array(np.random.random_sample(input_shape), dtype=np.uint8)
interpreter.set_tensor(input_details[0]['index'], input_data)

interpreter.invoke()

# Print out the result
output_data = interpreter.get_tensor(output_details[0]['index'])
print(output_data)
```
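
The script above feeds `np.uint8` data because the bundled mock model expects a quantized uint8 input. If you want to point the interpreter at a different model, it is safer to derive the input data from the dtype and shape reported in `input_details`; here is a minimal sketch that reuses the `interpreter`, `input_details` and `output_details` created above:
```python
# Generate random input matching whatever the loaded model expects,
# instead of assuming a uint8 input tensor.
input_dtype = input_details[0]['dtype']   # e.g. np.uint8 or np.float32
input_shape = input_details[0]['shape']

if np.issubdtype(input_dtype, np.floating):
    input_data = np.random.random_sample(input_shape).astype(input_dtype)
else:
    # Integer (quantized) inputs: fill with random values from the dtype's range
    info = np.iinfo(input_dtype)
    input_data = np.random.randint(info.min, info.max, size=input_shape, dtype=input_dtype)

interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]['index']))
```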

# Prepare the environment
Pre-requisites:
 * Dynamically build Arm NN Delegate library or download the ArmNN binaries
 * python3 (Depends on TfLite version)
 * virtualenv
 * numpy (Depends on TfLite version)
 * tflite_runtime (>=2.5, depends on Arm NN Delegate)

If you haven't built the delegate yet then take a look at the [build guide](./BuildGuideNative.md). Otherwise,
you can download the binaries [here](https://github.com/ARM-software/armnn/releases/tag/v21.11).

We recommend creating a virtual environment for this tutorial. For the following code to work, python3 is needed. Please
also check the documentation of the TfLite version you want to use; there might be additional prerequisites for the python
version. We will use Tensorflow Lite 2.5.1 for this guide.
```bash
# Install python3 (We ended up with python3.5.3) and virtualenv
sudo apt-get install python3-pip
sudo pip3 install virtualenv

# create a virtual environment
cd your/tutorial/dir
# creates a directory myenv at the current location
virtualenv -p python3 myenv
# activate the environment
source myenv/bin/activate
```
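
Once the environment is activated, you can double-check that `python` now resolves to the interpreter inside `myenv`; a quick sanity check from the python prompt:
```python
import sys

# With the virtualenv active, both of these should point into the myenv directory
print(sys.executable)
print(sys.prefix)
```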

Now that the environment is active, we can install the additional packages we need for our example script. As you can see
in the python script at the start of this page, this tutorial uses the `tflite_runtime` rather than the whole tensorflow
package. The `tflite_runtime` is a package that wraps the TfLite Interpreter, so it can only be used to run inferences of
TfLite models. But since Arm NN is itself only an inference engine, this is a perfect match. The
`tflite_runtime` is also much smaller than the whole tensorflow package and better suited to running models on
mobile and embedded devices.
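
If you already have the full tensorflow package installed and prefer to use it, the same script works with the `tf.lite` equivalents; a minimal sketch of the relevant lines:
```python
import tensorflow as tf

# Equivalent of tflite_runtime's load_delegate / Interpreter when using the full tensorflow package
armnn_delegate = tf.lite.experimental.load_delegate(
    library="<path-to-armnn-binaries>/libarmnnDelegate.so",
    options={"backends": "CpuAcc,GpuAcc,CpuRef", "logging-severity": "info"})
interpreter = tf.lite.Interpreter(
    model_path="<your-armnn-repo-dir>/delegate/python/test/test_data/mock_model.tflite",
    experimental_delegates=[armnn_delegate])
```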

The TfLite [website](https://www.tensorflow.org/lite/guide/python) shows you two methods to download the `tflite_runtime` package.
In our experience, the pip command works for most systems, including Debian. However, if you're using an older version of Tensorflow,
you may need to build the pip package from source. You can find more information [here](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/tools/pip_package/README.md).
In our case, with Tensorflow Lite 2.5.1, we can install it with:

```
pip3 install --extra-index-url https://google-coral.github.io/py-repo/ tflite_runtime
```
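
You can quickly verify that the package installed correctly and exposes the `load_delegate` entry point used in the script above (the `__version__` attribute is assumed to be present in recent `tflite_runtime` releases):
```python
import tflite_runtime
import tflite_runtime.interpreter as tflite

print(tflite_runtime.__version__)        # expect something like 2.5.x
print(hasattr(tflite, "load_delegate"))  # should print True
```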

Your virtual environment is now all set up. Copy the final python script into a python file, e.g.
`ExternalDelegatePythonTutorial.py`. Modify the python script above and replace `<path-to-armnn-binaries>` and
`<your-armnn-repo-dir>` with the directories you have set up. If you've been using the [native build guide](./BuildGuideNative.md),
these will be `$BASEDIR/armnn/build` and `$BASEDIR/armnn`.
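
If you would rather not edit the script every time the paths change, one option is to read them from environment variables at the top of the script; a small sketch, where `ARMNN_DELEGATE_DIR` and `ARMNN_REPO_DIR` are hypothetical variable names you would export yourself:
```python
import os

# e.g. export ARMNN_DELEGATE_DIR=$BASEDIR/armnn/build and export ARMNN_REPO_DIR=$BASEDIR/armnn
delegate_path = os.path.join(os.environ["ARMNN_DELEGATE_DIR"], "libarmnnDelegate.so")
model_path = os.path.join(os.environ["ARMNN_REPO_DIR"],
                          "delegate/python/test/test_data/mock_model.tflite")
```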

Finally, execute the script:
```bash
python ExternalDelegatePythonTutorial.py
```
The output should look similar to this:
```bash
Info: ArmNN v27.0.0

Info: Initialization time: 0.56 ms

INFO: TfLiteArmnnDelegate: Created TfLite ArmNN delegate.
[[ 12 123 16 12 11 14 20 16 20 12]]
Info: Shutdown time: 0.28 ms
```

For more details on the options you can pass to the Arm NN delegate, please check the parameters of the function [tflite_plugin_create_delegate](namespacetflite.xhtml).
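
This guide only uses the `backends` and `logging-severity` options. Since `load_delegate` raises an exception if the delegate library cannot be loaded, it can also be useful to fall back to the plain TfLite interpreter when Arm NN is unavailable; a minimal sketch:
```python
import tflite_runtime.interpreter as tflite

try:
    armnn_delegate = tflite.load_delegate(
        library="<path-to-armnn-binaries>/libarmnnDelegate.so",
        options={"backends": "CpuAcc,CpuRef", "logging-severity": "info"})
    delegates = [armnn_delegate]
except Exception as error:
    # e.g. the .so could not be found or the delegate failed to initialise
    print("Could not load the Arm NN delegate, falling back to the default TfLite kernels:", error)
    delegates = []

interpreter = tflite.Interpreter(
    model_path="<your-armnn-repo-dir>/delegate/python/test/test_data/mock_model.tflite",
    experimental_delegates=delegates)
```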

You can also test the functionality of the external delegate adaptor by running some unit tests:
```bash
pip install pytest
cd armnn/delegate/python/test
# You can deselect tests that require backends that your hardware doesn't support using markers e.g. -m "not GpuAccTest"
pytest --delegate-dir="<path-to-armnn-binaries>/libarmnnDelegate.so" -m "not GpuAccTest"
```