# Integrate the Arm NN TfLite delegate into TfLite using Python
If you have built the TfLite delegate as a separate dynamic library, this tutorial shows how you can
integrate it into TfLite to run models using Python.

Here is an example Python script showing how to do this. The script makes use of the
[external delegate](https://www.tensorflow.org/lite/performance/implementing_delegate#option_2_leverage_external_delegate)
mechanism of TfLite, which allows you to load delegates at runtime.
```python
import numpy as np
import tflite_runtime.interpreter as tflite

# Load the TfLite model and allocate tensors.
# (if you are using the complete tensorflow package you can find load_delegate in tf.lite.experimental.load_delegate)
armnn_delegate = tflite.load_delegate(library="<your-armnn-build-dir>/delegate/libarmnnDelegate.so",
                                      options={"backends": "CpuAcc,GpuAcc,CpuRef", "logging-severity": "info"})
# Delegates/Executes all operations supported by ArmNN to/with ArmNN
interpreter = tflite.Interpreter(model_path="<your-armnn-repo-dir>/delegate/python/test/test_data/mock_model.tflite",
                                 experimental_delegates=[armnn_delegate])
interpreter.allocate_tensors()

# Get input and output tensors.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Test the model on random input data (the mock model takes uint8 input).
input_shape = input_details[0]['shape']
input_data = np.array(np.random.random_sample(input_shape), dtype=np.uint8)
interpreter.set_tensor(input_details[0]['index'], input_data)

interpreter.invoke()

# Print out the result
output_data = interpreter.get_tensor(output_details[0]['index'])
print(output_data)
```

# Prepare the environment
Prerequisites:
 * A dynamically built Arm NN delegate library
 * python3 (depends on the TfLite version)
 * virtualenv
 * numpy (depends on the TfLite version)
 * tflite_runtime (>=2.0, depends on the Arm NN delegate)

If you haven't built the delegate yet, take a look at the [build guide](./BuildGuideNative.md).

We recommend creating a virtual environment for this tutorial. Python 3 is needed for the following code to work. Please
also check the documentation of the TfLite version you want to use; there might be additional prerequisites for the
Python version.
```bash
# Install python3 (We ended up with python3.5.3) and virtualenv
sudo apt-get install python3-pip
sudo pip3 install virtualenv

# create a virtual environment
cd your/tutorial/dir
# creates a directory myenv at the current location
virtualenv -p python3 myenv
# activate the environment
source myenv/bin/activate
```

Now that the environment is active we can install the additional packages we need for our example script. As you can see
in the Python script at the start of this page, this tutorial uses the `tflite_runtime` package rather than the whole
tensorflow package. The `tflite_runtime` is a package that wraps the TfLite interpreter, so it can only be used to run
inference on TfLite models. Since Arm NN itself is an inference engine, this is a perfect match. The `tflite_runtime`
package is also much smaller than the whole tensorflow package and better suited to running models on mobile and
embedded devices.
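
If you are working with the full `tensorflow` package instead (for example on a development machine), the same calls are
available under `tf.lite`. A minimal sketch, assuming a TensorFlow 2.x installation:
```python
import tensorflow as tf

# Load the Arm NN delegate through the full tensorflow package instead of tflite_runtime.
armnn_delegate = tf.lite.experimental.load_delegate(
    library="<your-armnn-build-dir>/delegate/libarmnnDelegate.so",
    options={"backends": "CpuAcc,GpuAcc,CpuRef", "logging-severity": "info"})
interpreter = tf.lite.Interpreter(
    model_path="<your-armnn-repo-dir>/delegate/python/test/test_data/mock_model.tflite",
    experimental_delegates=[armnn_delegate])
interpreter.allocate_tensors()
```
The rest of the script stays the same; only the imports and the two constructor calls change.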

At the time of writing, there are no packages of either `tensorflow` or `tflite_runtime` available on PyPI that
are built for an Arm architecture. That means installing them using `pip` on your development board is currently not
possible. The TfLite [website](https://www.tensorflow.org/lite/guide/python) points you at prebuilt `tflite_runtime`
packages. However, that limits you to specific TfLite and Python versions. For this reason we will build the
`tflite_runtime` from source.

You will have downloaded the tensorflow repository in order to build the Arm NN delegate. In there you can find further
instructions on how to build the `tflite_runtime` under `tensorflow/lite/tools/pip_package/README.md`. This tutorial
uses Bazel to build it natively, but there are scripts for cross-compilation available as well.
```bash
# Add the directory where bazel is built to your PATH so that the script can find it
PATH=$PATH:your/build/dir/bazel/output
# Run the following script to build tflite_runtime natively.
tensorflow/lite/tools/pip_package/build_pip_package_with_bazel.sh
```
The execution of the script creates a `.whl` file which can be used by `pip` to install the TfLite runtime package.
The build script produces some output in which you can find the location where the `.whl` file was created. Then all
that is left to do is to install the necessary Python packages with `pip`.
```bash
pip install tensorflow/lite/tools/pip_package/gen/tflite_pip/python3/dist/tflite_runtime-2.3.1-py3-none-any.whl numpy
```
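
To confirm the wheel installed correctly into the virtual environment, you can try importing the interpreter module
(a quick sanity check):
```bash
# Should print the Interpreter class without raising an ImportError
python -c "import tflite_runtime.interpreter as tflite; print(tflite.Interpreter)"
```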

Your virtual environment is now all set up. Copy the final Python script into a Python file, e.g.
`ExternalDelegatePythonTutorial.py`. Modify the Python script above and replace `<your-armnn-build-dir>` and
`<your-armnn-repo-dir>` with the directories you have set up. If you've been using the [native build guide](./BuildGuideNative.md)
these will be `$BASEDIR/armnn/build` and `$BASEDIR/armnn`.
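
If you followed the native build guide and still have `BASEDIR` exported, one way to fill in the placeholders is a pair
of `sed` substitutions (a sketch; the paths are assumptions based on that guide, so adjust them to your setup):
```bash
# Replace the placeholders in the copied script with the build-guide paths
sed -i "s|<your-armnn-build-dir>|$BASEDIR/armnn/build|g" ExternalDelegatePythonTutorial.py
sed -i "s|<your-armnn-repo-dir>|$BASEDIR/armnn|g" ExternalDelegatePythonTutorial.py
```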

Finally, execute the script:
```bash
python ExternalDelegatePythonTutorial.py
```
The output should look similar to this:
```bash
Info: ArmNN v23.0.0

Info: Initialization time: 0.56 ms

INFO: TfLiteArmnnDelegate: Created TfLite ArmNN delegate.
[[ 12 123  16  12  11  14  20  16  20  12]]
Info: Shutdown time: 0.28 ms
```

For more details on what kind of options you can pass to the Arm NN delegate please check
[armnn_external_delegate.cpp](src/armnn_external_delegate.cpp).
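
For example, to run everything on the reference backend with less verbose logging you could load the delegate like this
(only the two option keys used earlier are shown; the values are illustrative, so check the adaptor source for the full
set of supported options):
```python
import tflite_runtime.interpreter as tflite

armnn_delegate = tflite.load_delegate(
    library="<your-armnn-build-dir>/delegate/libarmnnDelegate.so",
    options={
        "backends": "CpuRef",            # use the reference backend only
        "logging-severity": "warning"    # assumed severity value; see armnn_external_delegate.cpp
    })
```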

You can also test the functionality of the external delegate adaptor by running some unit tests:
```bash
pip install pytest
cd armnn/delegate/python/test
# You can deselect tests that require backends that your hardware doesn't support using markers e.g. -m "not GpuAccTest"
pytest --delegate-dir="<your-armnn-build-dir>/armnn/delegate/libarmnnDelegate.so" -m "not GpuAccTest"
```
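
If you want to see which tests were collected before running the full suite, pytest can list them without executing
anything:
```bash
# List the collected tests without running them
pytest --collect-only --delegate-dir="<your-armnn-build-dir>/armnn/delegate/libarmnnDelegate.so"
```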