<!-- Copyright (c) 2020 ARM Limited. -->
<!-- -->
<!-- SPDX-License-Identifier: MIT -->
<!-- -->
<!-- HTML header for doxygen 1.8.13-->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/>
<meta http-equiv="X-UA-Compatible" content="IE=9"/>
<meta name="generator" content="Doxygen 1.8.13"/>
<meta name="robots" content="NOINDEX, NOFOLLOW" />
<meta name="viewport" content="width=device-width, initial-scale=1"/>
<title>ArmNN: delegate/BuildGuideNative.md Source File</title>
<link href="tabs.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="jquery.js"></script>
<script type="text/javascript" src="dynsections.js"></script>
<link href="navtree.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="resize.js"></script>
<script type="text/javascript" src="navtreedata.js"></script>
<script type="text/javascript" src="navtree.js"></script>
<script type="text/javascript">
$(document).ready(initResizable);
</script>
<link href="search/search.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="search/searchdata.js"></script>
<script type="text/javascript" src="search/search.js"></script>
<script type="text/x-mathjax-config">
MathJax.Hub.Config({
extensions: ["tex2jax.js"],
jax: ["input/TeX","output/HTML-CSS"],
});
</script><script type="text/javascript" src="http://cdn.mathjax.org/mathjax/latest/MathJax.js"></script>
<link href="doxygen.css" rel="stylesheet" type="text/css" />
<link href="stylesheet.css" rel="stylesheet" type="text/css"/>
</head>
<body>
<div id="top"><!-- do not remove this div, it is closed by doxygen! -->
<div id="titlearea">
<table cellspacing="0" cellpadding="0">
<tbody>
<tr style="height: 56px;">
<td><img alt="ArmNN" src="Arm_NN_horizontal_blue.png" style="max-width: 10rem; margin-top: .5rem; margin-left: 10px"/></td>
<td style="padding-left: 0.5em;">
<div id="projectname">
 <span id="projectnumber">22.02</span>
</div>
</td>
</tr>
</tbody>
</table>
</div>
<!-- end header part -->
<!-- Generated by Doxygen 1.8.13 -->
<script type="text/javascript">
var searchBox = new SearchBox("searchBox", "search",false,'Search');
</script>
<script type="text/javascript" src="menudata.js"></script>
<script type="text/javascript" src="menu.js"></script>
<script type="text/javascript">
$(function() {
initMenu('',true,false,'search.php','Search');
$(document).ready(function() { init_search(); });
});
</script>
<div id="main-nav"></div>
</div><!-- top -->
<div id="side-nav" class="ui-resizable side-nav-resizable">
<div id="nav-tree">
<div id="nav-tree-contents">
<div id="nav-sync" class="sync"></div>
</div>
</div>
<div id="splitbar" style="-moz-user-select:none;"
class="ui-resizable-handle">
</div>
</div>
<script type="text/javascript">
$(document).ready(function(){initNavTree('_build_guide_native_8md.xhtml','');});
</script>
<div id="doc-content">
<!-- window showing the filter options -->
<div id="MSearchSelectWindow"
onmouseover="return searchBox.OnSearchSelectShow()"
onmouseout="return searchBox.OnSearchSelectHide()"
onkeydown="return searchBox.OnSearchSelectKey(event)">
</div>
<!-- iframe showing the search results (closed by default) -->
<div id="MSearchResultsWindow">
<iframe src="javascript:void(0)" frameborder="0"
name="MSearchResults" id="MSearchResults">
</iframe>
</div>
<div class="header">
<div class="headertitle">
<div class="title">delegate/BuildGuideNative.md</div> </div>
</div><!--header-->
<div class="contents">
<a href="_build_guide_native_8md.xhtml">Go to the documentation of this file.</a><div class="fragment"><div class="line"><a name="l00001"></a><span class="lineno"> 1</span> # Delegate build guide introduction</div><div class="line"><a name="l00002"></a><span class="lineno"> 2</span> </div><div class="line"><a name="l00003"></a><span class="lineno"> 3</span> The Arm NN Delegate can be found within the Arm NN repository but it is a standalone piece of software. However,</div><div class="line"><a name="l00004"></a><span class="lineno"> 4</span> it makes use of the Arm NN library. For this reason we have added two options to build the delegate. The first option</div><div class="line"><a name="l00005"></a><span class="lineno"> 5</span> allows you to build the delegate together with the Arm NN library; the second is a standalone build </div><div class="line"><a name="l00006"></a><span class="lineno"> 6</span> of the delegate.</div><div class="line"><a name="l00007"></a><span class="lineno"> 7</span> </div><div class="line"><a name="l00008"></a><span class="lineno"> 8</span> This tutorial uses an AArch64 machine with Ubuntu 18.04 installed that can build all components</div><div class="line"><a name="l00009"></a><span class="lineno"> 9</span> natively (no cross-compilation required). 
This is to keep this guide simple.</div><div class="line"><a name="l00010"></a><span class="lineno"> 10</span> </div><div class="line"><a name="l00011"></a><span class="lineno"> 11</span> **Table of contents:**</div><div class="line"><a name="l00012"></a><span class="lineno"> 12</span> - [Delegate build guide introduction](#delegate-build-guide-introduction)</div><div class="line"><a name="l00013"></a><span class="lineno"> 13</span> - [Dependencies](#dependencies)</div><div class="line"><a name="l00014"></a><span class="lineno"> 14</span>  * [Download Arm NN](#download-arm-nn)</div><div class="line"><a name="l00015"></a><span class="lineno"> 15</span>  * [Build Tensorflow Lite for C++](#build-tensorflow-lite-for-c--)</div><div class="line"><a name="l00016"></a><span class="lineno"> 16</span>  * [Build Flatbuffers](#build-flatbuffers)</div><div class="line"><a name="l00017"></a><span class="lineno"> 17</span>  * [Build the Arm Compute Library](#build-the-arm-compute-library)</div><div class="line"><a name="l00018"></a><span class="lineno"> 18</span>  * [Build the Arm NN Library](#build-the-arm-nn-library)</div><div class="line"><a name="l00019"></a><span class="lineno"> 19</span> - [Build the TfLite Delegate (Stand-Alone)](#build-the-tflite-delegate--stand-alone-)</div><div class="line"><a name="l00020"></a><span class="lineno"> 20</span> - [Build the Delegate together with Arm NN](#build-the-delegate-together-with-arm-nn)</div><div class="line"><a name="l00021"></a><span class="lineno"> 21</span> - [Integrate the Arm NN TfLite Delegate into your project](#integrate-the-arm-nn-tflite-delegate-into-your-project)</div><div class="line"><a name="l00022"></a><span class="lineno"> 22</span> </div><div class="line"><a name="l00023"></a><span class="lineno"> 23</span> </div><div class="line"><a name="l00024"></a><span class="lineno"> 24</span> # Dependencies</div><div class="line"><a name="l00025"></a><span class="lineno"> 25</span> </div><div class="line"><a 
name="l00026"></a><span class="lineno"> 26</span> Build Dependencies:</div><div class="line"><a name="l00027"></a><span class="lineno"> 27</span>  * Tensorflow Lite: this guide uses version 2.5.0. Other versions may work.</div><div class="line"><a name="l00028"></a><span class="lineno"> 28</span>  * Flatbuffers 1.12.0</div><div class="line"><a name="l00029"></a><span class="lineno"> 29</span>  * Arm NN 21.11 or higher</div><div class="line"><a name="l00030"></a><span class="lineno"> 30</span> </div><div class="line"><a name="l00031"></a><span class="lineno"> 31</span> Required Tools:</div><div class="line"><a name="l00032"></a><span class="lineno"> 32</span>  * Git. This guide uses version 2.17.1. Other versions might work.</div><div class="line"><a name="l00033"></a><span class="lineno"> 33</span>  * pip. This guide uses version 20.3.3. Other versions might work.</div><div class="line"><a name="l00034"></a><span class="lineno"> 34</span>  * wget. This guide uses version 1.17.1. Other versions might work.</div><div class="line"><a name="l00035"></a><span class="lineno"> 35</span>  * zip. This guide uses version 3.0. Other versions might work.</div><div class="line"><a name="l00036"></a><span class="lineno"> 36</span>  * unzip. This guide uses version 6.00. Other versions might work.</div><div class="line"><a name="l00037"></a><span class="lineno"> 37</span>  * cmake 3.16.0 or higher. This guide uses version 3.16.0.</div><div class="line"><a name="l00038"></a><span class="lineno"> 38</span>  * scons. This guide uses version 2.4.1. Other versions might work.</div><div class="line"><a name="l00039"></a><span class="lineno"> 39</span> </div><div class="line"><a name="l00040"></a><span class="lineno"> 40</span> Our first step is to build all the build dependencies mentioned above. We will have to create quite a few</div><div class="line"><a name="l00041"></a><span class="lineno"> 41</span> directories. 
To make navigation a bit easier, define a base directory for the project. At this stage we can also</div><div class="line"><a name="l00042"></a><span class="lineno"> 42</span> install all the tools that are required during the build. This guide assumes you are using a Bash shell.</div><div class="line"><a name="l00043"></a><span class="lineno"> 43</span> ```bash</div><div class="line"><a name="l00044"></a><span class="lineno"> 44</span> export BASEDIR=~/ArmNNDelegate</div><div class="line"><a name="l00045"></a><span class="lineno"> 45</span> mkdir $BASEDIR</div><div class="line"><a name="l00046"></a><span class="lineno"> 46</span> cd $BASEDIR</div><div class="line"><a name="l00047"></a><span class="lineno"> 47</span> apt-get update &amp;&amp; apt-get install git wget unzip zip python cmake scons</div><div class="line"><a name="l00048"></a><span class="lineno"> 48</span> ```</div><div class="line"><a name="l00049"></a><span class="lineno"> 49</span> </div><div class="line"><a name="l00050"></a><span class="lineno"> 50</span> ## Download Arm NN</div><div class="line"><a name="l00051"></a><span class="lineno"> 51</span> </div><div class="line"><a name="l00052"></a><span class="lineno"> 52</span> First clone Arm NN using Git.</div><div class="line"><a name="l00053"></a><span class="lineno"> 53</span> </div><div class="line"><a name="l00054"></a><span class="lineno"> 54</span> ```bash</div><div class="line"><a name="l00055"></a><span class="lineno"> 55</span> cd $BASEDIR</div><div class="line"><a name="l00056"></a><span class="lineno"> 56</span> git clone "https://review.mlplatform.org/ml/armnn" </div><div class="line"><a name="l00057"></a><span class="lineno"> 57</span> cd armnn</div><div class="line"><a name="l00058"></a><span class="lineno"> 58</span> git checkout &lt;branch_name&gt; # e.g. 
branches/armnn_21_11</div><div class="line"><a name="l00059"></a><span class="lineno"> 59</span> ```</div><div class="line"><a name="l00060"></a><span class="lineno"> 60</span> </div><div class="line"><a name="l00061"></a><span class="lineno"> 61</span> ## Build Tensorflow Lite for C++</div><div class="line"><a name="l00062"></a><span class="lineno"> 62</span> TensorFlow has a few dependencies of its own. It requires the Python packages pip3 and numpy,</div><div class="line"><a name="l00063"></a><span class="lineno"> 63</span> and also Bazel or CMake, which are used to compile TensorFlow. A description of how to build Bazel can be</div><div class="line"><a name="l00064"></a><span class="lineno"> 64</span> found [here](https://docs.bazel.build/versions/master/install-compile-source.html), but for this guide we will</div><div class="line"><a name="l00065"></a><span class="lineno"> 65</span> compile with CMake. Depending on your operating system and architecture there might be an easier way.</div><div class="line"><a name="l00066"></a><span class="lineno"> 66</span> ```bash</div><div class="line"><a name="l00067"></a><span class="lineno"> 67</span> wget -O cmake-3.16.0.tar.gz https://cmake.org/files/v3.16/cmake-3.16.0.tar.gz</div><div class="line"><a name="l00068"></a><span class="lineno"> 68</span> tar -xzf cmake-3.16.0.tar.gz -C $BASEDIR/</div><div class="line"><a name="l00069"></a><span class="lineno"> 69</span> </div><div class="line"><a name="l00070"></a><span class="lineno"> 70</span> # If you have an older CMake installed, remove it in order to upgrade</div><div class="line"><a name="l00071"></a><span class="lineno"> 71</span> yes | sudo apt-get purge cmake</div><div class="line"><a name="l00072"></a><span class="lineno"> 72</span> hash -r</div><div class="line"><a name="l00073"></a><span class="lineno"> 73</span> </div><div class="line"><a name="l00074"></a><span class="lineno"> 74</span> cd $BASEDIR/cmake-3.16.0 </div><div class="line"><a name="l00075"></a><span class="lineno"> 75</span> ./bootstrap </div><div class="line"><a name="l00076"></a><span class="lineno"> 76</span> make </div><div class="line"><a name="l00077"></a><span class="lineno"> 77</span> sudo make install </div><div class="line"><a name="l00078"></a><span class="lineno"> 78</span> ```</div><div class="line"><a name="l00079"></a><span class="lineno"> 79</span> </div><div class="line"><a name="l00080"></a><span class="lineno"> 80</span> ### Download and build Tensorflow Lite</div><div class="line"><a name="l00081"></a><span class="lineno"> 81</span> Arm NN provides a script, armnn/scripts/get_tensorflow.sh, that can be used to download the version of TensorFlow that Arm NN was tested with:</div><div class="line"><a name="l00082"></a><span class="lineno"> 82</span> ```bash</div><div class="line"><a name="l00083"></a><span class="lineno"> 83</span> cd $BASEDIR</div><div class="line"><a name="l00084"></a><span class="lineno"> 84</span> git clone https://github.com/tensorflow/tensorflow.git</div><div class="line"><a name="l00085"></a><span class="lineno"> 85</span> cd tensorflow/</div><div class="line"><a name="l00086"></a><span class="lineno"> 86</span> git checkout $(../armnn/scripts/get_tensorflow.sh -p) # Minimum version required for the delegate is v2.3.1</div><div class="line"><a name="l00087"></a><span class="lineno"> 87</span> ```</div><div class="line"><a name="l00088"></a><span class="lineno"> 88</span> </div><div class="line"><a name="l00089"></a><span class="lineno"> 89</span> Now the build process can be started. When calling "cmake", as below, you can specify a number of build</div><div class="line"><a name="l00090"></a><span class="lineno"> 90</span> flags. 
If you have no need to configure your TensorFlow build, you can follow the exact commands below:</div><div class="line"><a name="l00091"></a><span class="lineno"> 91</span> ```bash</div><div class="line"><a name="l00092"></a><span class="lineno"> 92</span> mkdir build # You are already inside $BASEDIR/tensorflow at this point</div><div class="line"><a name="l00093"></a><span class="lineno"> 93</span> cd build</div><div class="line"><a name="l00094"></a><span class="lineno"> 94</span> cmake $BASEDIR/tensorflow/tensorflow/lite -DTFLITE_ENABLE_XNNPACK=OFF</div><div class="line"><a name="l00095"></a><span class="lineno"> 95</span> cmake --build . # This build directory will be your TFLITE_LIB_ROOT</div><div class="line"><a name="l00096"></a><span class="lineno"> 96</span> ```</div><div class="line"><a name="l00097"></a><span class="lineno"> 97</span> </div><div class="line"><a name="l00098"></a><span class="lineno"> 98</span> ## Build Flatbuffers</div><div class="line"><a name="l00099"></a><span class="lineno"> 99</span> Flatbuffers is a memory-efficient cross-platform serialization library as </div><div class="line"><a name="l00100"></a><span class="lineno"> 100</span> described [here](https://google.github.io/flatbuffers/). It is used in tflite to store models and is also a dependency </div><div class="line"><a name="l00101"></a><span class="lineno"> 101</span> of the delegate. After downloading the right version it can be built and installed using cmake.</div><div class="line"><a name="l00102"></a><span class="lineno"> 102</span> ```bash</div><div class="line"><a name="l00103"></a><span class="lineno"> 103</span> cd $BASEDIR</div><div class="line"><a name="l00104"></a><span class="lineno"> 104</span> wget -O flatbuffers-1.12.0.zip https://github.com/google/flatbuffers/archive/v1.12.0.zip</div><div class="line"><a name="l00105"></a><span class="lineno"> 105</span> unzip -d . 
flatbuffers-1.12.0.zip</div><div class="line"><a name="l00106"></a><span class="lineno"> 106</span> cd flatbuffers-1.12.0 </div><div class="line"><a name="l00107"></a><span class="lineno"> 107</span> mkdir install &amp;&amp; mkdir build &amp;&amp; cd build</div><div class="line"><a name="l00108"></a><span class="lineno"> 108</span> # A separate install directory is used here, but that is not required</div><div class="line"><a name="l00109"></a><span class="lineno"> 109</span> cmake .. -DCMAKE_INSTALL_PREFIX:PATH=$BASEDIR/flatbuffers-1.12.0/install </div><div class="line"><a name="l00110"></a><span class="lineno"> 110</span> make install</div><div class="line"><a name="l00111"></a><span class="lineno"> 111</span> ```</div><div class="line"><a name="l00112"></a><span class="lineno"> 112</span> </div><div class="line"><a name="l00113"></a><span class="lineno"> 113</span> ## Build the Arm Compute Library</div><div class="line"><a name="l00114"></a><span class="lineno"> 114</span> </div><div class="line"><a name="l00115"></a><span class="lineno"> 115</span> The Arm NN library depends on the Arm Compute Library (ACL). It provides a set of functions that are optimized for </div><div class="line"><a name="l00116"></a><span class="lineno"> 116</span> both Arm CPUs and GPUs. The Arm Compute Library is used directly by Arm NN to run machine learning workloads on </div><div class="line"><a name="l00117"></a><span class="lineno"> 117</span> Arm CPUs and GPUs.</div><div class="line"><a name="l00118"></a><span class="lineno"> 118</span> </div><div class="line"><a name="l00119"></a><span class="lineno"> 119</span> It is important to use matching versions of ACL and Arm NN. Arm NN and ACL are developed very closely </div><div class="line"><a name="l00120"></a><span class="lineno"> 120</span> and released together. 
If you would like to use Arm NN version "21.11", you should use the same "21.11" version for </div><div class="line"><a name="l00121"></a><span class="lineno"> 121</span> ACL too. Arm NN provides a script, armnn/scripts/get_compute_library.sh, that can be used to download the exact version </div><div class="line"><a name="l00122"></a><span class="lineno"> 122</span> of Arm Compute Library that Arm NN was tested with.</div><div class="line"><a name="l00123"></a><span class="lineno"> 123</span> </div><div class="line"><a name="l00124"></a><span class="lineno"> 124</span> To build the Arm Compute Library on your platform, download the Arm Compute Library and check out the tag that contains </div><div class="line"><a name="l00125"></a><span class="lineno"> 125</span> the version you want to use. Build it using `scons`.</div><div class="line"><a name="l00126"></a><span class="lineno"> 126</span> </div><div class="line"><a name="l00127"></a><span class="lineno"> 127</span> ```bash</div><div class="line"><a name="l00128"></a><span class="lineno"> 128</span> cd $BASEDIR</div><div class="line"><a name="l00129"></a><span class="lineno"> 129</span> git clone https://review.mlplatform.org/ml/ComputeLibrary </div><div class="line"><a name="l00130"></a><span class="lineno"> 130</span> cd ComputeLibrary/</div><div class="line"><a name="l00131"></a><span class="lineno"> 131</span> git checkout $(../armnn/scripts/get_compute_library.sh -p) # e.g. 
v21.11</div><div class="line"><a name="l00132"></a><span class="lineno"> 132</span> # The machine used for this guide only has a Neon CPU, which is why only "neon=1" is set. If </div><div class="line"><a name="l00133"></a><span class="lineno"> 133</span> # your machine has an Arm GPU you can enable it by adding `opencl=1 embed_kernels=1` to the command below</div><div class="line"><a name="l00134"></a><span class="lineno"> 134</span> scons arch=arm64-v8a neon=1 extra_cxx_flags="-fPIC" benchmark_tests=0 validation_tests=0 </div><div class="line"><a name="l00135"></a><span class="lineno"> 135</span> ```</div><div class="line"><a name="l00136"></a><span class="lineno"> 136</span> </div><div class="line"><a name="l00137"></a><span class="lineno"> 137</span> ## Build the Arm NN Library</div><div class="line"><a name="l00138"></a><span class="lineno"> 138</span> </div><div class="line"><a name="l00139"></a><span class="lineno"> 139</span> With ACL built, we can now continue to build Arm NN. Create a build directory and use `cmake` to build it.</div><div class="line"><a name="l00140"></a><span class="lineno"> 140</span> ```bash</div><div class="line"><a name="l00141"></a><span class="lineno"> 141</span> cd $BASEDIR</div><div class="line"><a name="l00142"></a><span class="lineno"> 142</span> cd armnn</div><div class="line"><a name="l00143"></a><span class="lineno"> 143</span> mkdir build &amp;&amp; cd build</div><div class="line"><a name="l00144"></a><span class="lineno"> 144</span> # If you have an Arm GPU, add `-DARMCOMPUTECL=1` to the command below</div><div class="line"><a name="l00145"></a><span class="lineno"> 145</span> cmake .. 
-DARMCOMPUTE_ROOT=$BASEDIR/ComputeLibrary -DARMCOMPUTENEON=1 -DBUILD_UNIT_TESTS=0 </div><div class="line"><a name="l00146"></a><span class="lineno"> 146</span> make</div><div class="line"><a name="l00147"></a><span class="lineno"> 147</span> ```</div><div class="line"><a name="l00148"></a><span class="lineno"> 148</span> </div><div class="line"><a name="l00149"></a><span class="lineno"> 149</span> # Build the TfLite Delegate (Stand-Alone)</div><div class="line"><a name="l00150"></a><span class="lineno"> 150</span> </div><div class="line"><a name="l00151"></a><span class="lineno"> 151</span> The delegate, like Arm NN, is built using `cmake`. Create a build directory as usual and build the delegate</div><div class="line"><a name="l00152"></a><span class="lineno"> 152</span> with the additional cmake arguments shown below:</div><div class="line"><a name="l00153"></a><span class="lineno"> 153</span> ```bash</div><div class="line"><a name="l00154"></a><span class="lineno"> 154</span> cd $BASEDIR/armnn/delegate &amp;&amp; mkdir build &amp;&amp; cd build</div><div class="line"><a name="l00155"></a><span class="lineno"> 155</span> cmake .. 
-DCMAKE_BUILD_TYPE=release \ # A release build rather than a debug build.</div><div class="line"><a name="l00156"></a><span class="lineno"> 156</span>  -DTENSORFLOW_ROOT=$BASEDIR/tensorflow \ # The root directory where TensorFlow can be found.</div><div class="line"><a name="l00157"></a><span class="lineno"> 157</span>  -DTFLITE_LIB_ROOT=$BASEDIR/tensorflow/build \ # Directory where TensorFlow libraries can be found.</div><div class="line"><a name="l00158"></a><span class="lineno"> 158</span>  -DFLATBUFFERS_ROOT=$BASEDIR/flatbuffers-1.12.0/install \ # Flatbuffers install directory.</div><div class="line"><a name="l00159"></a><span class="lineno"> 159</span>  -DArmnn_DIR=$BASEDIR/armnn/build \ # Directory where the Arm NN library can be found.</div><div class="line"><a name="l00160"></a><span class="lineno"> 160</span>  -DARMNN_SOURCE_DIR=$BASEDIR/armnn # The top directory of the Arm NN repository. </div><div class="line"><a name="l00161"></a><span class="lineno"> 161</span>  # The Arm NN includes are required.</div><div class="line"><a name="l00162"></a><span class="lineno"> 162</span> make</div><div class="line"><a name="l00163"></a><span class="lineno"> 163</span> ```</div><div class="line"><a name="l00164"></a><span class="lineno"> 164</span> </div><div class="line"><a name="l00165"></a><span class="lineno"> 165</span> To ensure that the build was successful, you can run the unit tests for the delegate, which can be found in </div><div class="line"><a name="l00166"></a><span class="lineno"> 166</span> the delegate build directory. [Doctest](https://github.com/onqtam/doctest) was used to create those tests. Using test filters you can</div><div class="line"><a name="l00167"></a><span class="lineno"> 167</span> filter out tests that your build is not configured for. 
In this case, because Arm NN was only built for CPU </div><div class="line"><a name="l00168"></a><span class="lineno"> 168</span> acceleration (CpuAcc), we filter for all test suites that have `CpuAcc` in their name.</div><div class="line"><a name="l00169"></a><span class="lineno"> 169</span> ```bash</div><div class="line"><a name="l00170"></a><span class="lineno"> 170</span> cd $BASEDIR/armnn/delegate/build</div><div class="line"><a name="l00171"></a><span class="lineno"> 171</span> ./DelegateUnitTests --test-suite=*CpuAcc* </div><div class="line"><a name="l00172"></a><span class="lineno"> 172</span> ```</div><div class="line"><a name="l00173"></a><span class="lineno"> 173</span> If you have built for GPU acceleration as well, you might want to change your test-suite filter:</div><div class="line"><a name="l00174"></a><span class="lineno"> 174</span> ```bash</div><div class="line"><a name="l00175"></a><span class="lineno"> 175</span> ./DelegateUnitTests --test-suite=*CpuAcc*,*GpuAcc*</div><div class="line"><a name="l00176"></a><span class="lineno"> 176</span> ```</div><div class="line"><a name="l00177"></a><span class="lineno"> 177</span> </div><div class="line"><a name="l00178"></a><span class="lineno"> 178</span> # Build the Delegate together with Arm NN</div><div class="line"><a name="l00179"></a><span class="lineno"> 179</span> </div><div class="line"><a name="l00180"></a><span class="lineno"> 180</span> In the introduction it was mentioned that there is a way to integrate the delegate build into Arm NN. This is</div><div class="line"><a name="l00181"></a><span class="lineno"> 181</span> straightforward. The cmake arguments that were previously used for the delegate have to be added</div><div class="line"><a name="l00182"></a><span class="lineno"> 182</span> to the Arm NN cmake arguments. 
In addition, the argument `BUILD_ARMNN_TFLITE_DELEGATE` needs to be added to </div><div class="line"><a name="l00183"></a><span class="lineno"> 183</span> instruct Arm NN to build the delegate as well. The new commands to build Arm NN are as follows:</div><div class="line"><a name="l00184"></a><span class="lineno"> 184</span> </div><div class="line"><a name="l00185"></a><span class="lineno"> 185</span> Download Arm NN if you have not already done so:</div><div class="line"><a name="l00186"></a><span class="lineno"> 186</span> ```bash</div><div class="line"><a name="l00187"></a><span class="lineno"> 187</span> cd $BASEDIR</div><div class="line"><a name="l00188"></a><span class="lineno"> 188</span> git clone "https://review.mlplatform.org/ml/armnn" </div><div class="line"><a name="l00189"></a><span class="lineno"> 189</span> cd armnn</div><div class="line"><a name="l00190"></a><span class="lineno"> 190</span> git checkout &lt;branch_name&gt; # e.g. branches/armnn_21_11</div><div class="line"><a name="l00191"></a><span class="lineno"> 191</span> ```</div><div class="line"><a name="l00192"></a><span class="lineno"> 192</span> Build Arm NN with the delegate included:</div><div class="line"><a name="l00193"></a><span class="lineno"> 193</span> ```bash</div><div class="line"><a name="l00194"></a><span class="lineno"> 194</span> cd $BASEDIR</div><div class="line"><a name="l00195"></a><span class="lineno"> 195</span> cd armnn</div><div class="line"><a name="l00196"></a><span class="lineno"> 196</span> rm -rf build # Remove any previous cmake build.</div><div class="line"><a name="l00197"></a><span class="lineno"> 197</span> mkdir build &amp;&amp; cd build</div><div class="line"><a name="l00198"></a><span class="lineno"> 198</span> # If you have an Arm GPU, add `-DARMCOMPUTECL=1` to the command below</div><div class="line"><a name="l00199"></a><span class="lineno"> 199</span> cmake .. 
-DARMCOMPUTE_ROOT=$BASEDIR/ComputeLibrary \</div><div class="line"><a name="l00200"></a><span class="lineno"> 200</span>  -DARMCOMPUTENEON=1 \</div><div class="line"><a name="l00201"></a><span class="lineno"> 201</span>  -DBUILD_UNIT_TESTS=0 \</div><div class="line"><a name="l00202"></a><span class="lineno"> 202</span>  -DBUILD_ARMNN_TFLITE_DELEGATE=1 \</div><div class="line"><a name="l00203"></a><span class="lineno"> 203</span>  -DTENSORFLOW_ROOT=$BASEDIR/tensorflow \</div><div class="line"><a name="l00204"></a><span class="lineno"> 204</span>  -DTFLITE_LIB_ROOT=$BASEDIR/tensorflow/build \</div><div class="line"><a name="l00205"></a><span class="lineno"> 205</span>  -DFLATBUFFERS_ROOT=$BASEDIR/flatbuffers-1.12.0/install</div><div class="line"><a name="l00206"></a><span class="lineno"> 206</span> make</div><div class="line"><a name="l00207"></a><span class="lineno"> 207</span> ```</div><div class="line"><a name="l00208"></a><span class="lineno"> 208</span> The delegate library can then be found in `build/armnn/delegate`.</div><div class="line"><a name="l00209"></a><span class="lineno"> 209</span> </div><div class="line"><a name="l00210"></a><span class="lineno"> 210</span> # Test the Arm NN delegate using the [TFLite Model Benchmark Tool](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/tools/benchmark)</div><div class="line"><a name="l00211"></a><span class="lineno"> 211</span> </div><div class="line"><a name="l00212"></a><span class="lineno"> 212</span> The TFLite Model Benchmark Tool has a useful command line interface to test delegates. 
We can use this to demonstrate the use of the Arm NN delegate and its options.</div><div class="line"><a name="l00213"></a><span class="lineno"> 213</span> </div><div class="line"><a name="l00214"></a><span class="lineno"> 214</span> Some examples of this can be viewed in this [YouTube demonstration](https://www.youtube.com/watch?v=NResQ1kbm-M&amp;t=920s).</div><div class="line"><a name="l00215"></a><span class="lineno"> 215</span> </div><div class="line"><a name="l00216"></a><span class="lineno"> 216</span> ## Download the TFLite Model Benchmark Tool</div><div class="line"><a name="l00217"></a><span class="lineno"> 217</span> </div><div class="line"><a name="l00218"></a><span class="lineno"> 218</span> Binary builds of the benchmarking tool for various platforms are available [here](https://www.tensorflow.org/lite/performance/measurement#native_benchmark_binary). In this example we will target an AArch64 Linux environment and also download a sample uint8 tflite model from the [Arm ML Model Zoo](https://github.com/ARM-software/ML-zoo).</div><div class="line"><a name="l00219"></a><span class="lineno"> 219</span> </div><div class="line"><a name="l00220"></a><span class="lineno"> 220</span> ```bash</div><div class="line"><a name="l00221"></a><span class="lineno"> 221</span> mkdir $BASEDIR/benchmarking</div><div class="line"><a name="l00222"></a><span class="lineno"> 222</span> cd $BASEDIR/benchmarking</div><div class="line"><a name="l00223"></a><span class="lineno"> 223</span> # Get the benchmarking binary.</div><div class="line"><a name="l00224"></a><span class="lineno"> 224</span> wget https://storage.googleapis.com/tensorflow-nightly-public/prod/tensorflow/release/lite/tools/nightly/latest/linux_aarch64_benchmark_model -O benchmark_model</div><div class="line"><a name="l00225"></a><span class="lineno"> 225</span> # Make it executable.</div><div class="line"><a name="l00226"></a><span class="lineno"> 226</span> chmod +x benchmark_model</div><div class="line"><a name="l00227"></a><span class="lineno"> 227</span> # Download a sample model from the Arm ML Model Zoo.</div><div class="line"><a name="l00228"></a><span class="lineno"> 228</span> wget https://github.com/ARM-software/ML-zoo/blob/master/models/image_classification/mobilenet_v2_1.0_224/tflite_uint8/mobilenet_v2_1.0_224_quantized_1_default_1.tflite?raw=true -O mobilenet_v2_1.0_224_quantized_1_default_1.tflite</div><div class="line"><a name="l00229"></a><span class="lineno"> 229</span> ```</div><div class="line"><a name="l00230"></a><span class="lineno"> 230</span> </div><div class="line"><a name="l00231"></a><span class="lineno"> 231</span> ## Execute the benchmarking tool with the Arm NN delegate</div><div class="line"><a name="l00232"></a><span class="lineno"> 232</span> You are already at $BASEDIR/benchmarking from the previous stage.</div><div class="line"><a name="l00233"></a><span class="lineno"> 233</span> ```bash</div><div class="line"><a name="l00234"></a><span class="lineno"> 234</span> LD_LIBRARY_PATH=../armnn/build ./benchmark_model --graph=mobilenet_v2_1.0_224_quantized_1_default_1.tflite --external_delegate_path="../armnn/build/delegate/libarmnnDelegate.so" --external_delegate_options="backends:CpuAcc;logging-severity:info"</div><div class="line"><a name="l00235"></a><span class="lineno"> 235</span> ```</div><div class="line"><a name="l00236"></a><span class="lineno"> 236</span> The "external_delegate_options" here are specific to the Arm NN delegate. They are used to specify a target Arm NN backend or to enable/disable various options in Arm NN. 
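</div><div class="line">The options string itself is a semicolon-separated list of key:value pairs. As a small sketch (assuming a GPU-enabled build; the "backends" value is a comma-separated priority list, and option names beyond "backends" and "logging-severity" should be checked against tflite_plugin_create_delegate for your Arm NN version), the string can be assembled in a shell variable first to keep the invocation readable:</div><div class="line">```bash
# Semicolon-separated key:value pairs understood by the Arm NN delegate.
# "GpuAcc,CpuAcc" requests the GPU backend first, falling back to the CPU backend.
ARMNN_OPTIONS="backends:GpuAcc,CpuAcc;logging-severity:info"
echo "$ARMNN_OPTIONS"
```</div><div class="line">The variable can then be passed as --external_delegate_options="$ARMNN_OPTIONS" in the invocation above.</div><div class="line">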
A full description can be found in the parameters of the function tflite_plugin_create_delegate.</div><div class="line"><a name="l00237"></a><span class="lineno"> 237</span> </div><div class="line"><a name="l00238"></a><span class="lineno"> 238</span> # Integrate the Arm NN TfLite Delegate into your project</div><div class="line"><a name="l00239"></a><span class="lineno"> 239</span> </div><div class="line"><a name="l00240"></a><span class="lineno"> 240</span> The delegate can be integrated into your C++ project by creating a TfLite Interpreter and </div><div class="line"><a name="l00241"></a><span class="lineno"> 241</span> instructing it to use the Arm NN delegate for the graph execution. This should look similar</div><div class="line"><a name="l00242"></a><span class="lineno"> 242</span> to the following code snippet.</div><div class="line"><a name="l00243"></a><span class="lineno"> 243</span> ```cpp</div><div class="line"><a name="l00244"></a><span class="lineno"> 244</span> // Create the TfLite Interpreter</div><div class="line"><a name="l00245"></a><span class="lineno"> 245</span> std::unique_ptr<Interpreter> armnnDelegateInterpreter;</div><div class="line"><a name="l00246"></a><span class="lineno"> 246</span> InterpreterBuilder(tfLiteModel, ::tflite::ops::builtin::BuiltinOpResolver())</div><div class="line"><a name="l00247"></a><span class="lineno"> 247</span>  (&armnnDelegateInterpreter);</div><div class="line"><a name="l00248"></a><span class="lineno"> 248</span> </div><div class="line"><a name="l00249"></a><span class="lineno"> 249</span> // Create the Arm NN Delegate</div><div class="line"><a name="l00250"></a><span class="lineno"> 250</span> armnnDelegate::DelegateOptions delegateOptions(backends);</div><div class="line"><a name="l00251"></a><span class="lineno"> 251</span> std::unique_ptr<TfLiteDelegate, decltype(&armnnDelegate::TfLiteArmnnDelegateDelete)></div><div class="line"><a name="l00252"></a><span class="lineno"> 252</span>  
theArmnnDelegate(armnnDelegate::TfLiteArmnnDelegateCreate(delegateOptions),</div><div class="line"><a name="l00253"></a><span class="lineno"> 253</span>  armnnDelegate::TfLiteArmnnDelegateDelete);</div><div class="line"><a name="l00254"></a><span class="lineno"> 254</span> </div><div class="line"><a name="l00255"></a><span class="lineno"> 255</span> // Instruct the Interpreter to use the armnnDelegate</div><div class="line"><a name="l00256"></a><span class="lineno"> 256</span> armnnDelegateInterpreter->ModifyGraphWithDelegate(theArmnnDelegate.get());</div><div class="line"><a name="l00257"></a><span class="lineno"> 257</span> ```</div><div class="line"><a name="l00258"></a><span class="lineno"> 258</span> </div><div class="line"><a name="l00259"></a><span class="lineno"> 259</span> For further information on using TfLite Delegates please visit the [TensorFlow website](https://www.tensorflow.org/lite/guide).</div><div class="line"><a name="l00260"></a><span class="lineno"> 260</span> </div><div class="line"><a name="l00261"></a><span class="lineno"> 261</span> For more details on the options you can pass to the Arm NN delegate, please check the parameters of the function tflite_plugin_create_delegate.</div></div><!-- fragment --></div><!-- contents -->
</div><!-- doc-content -->
<!-- start footer part -->
<div id="nav-path" class="navpath"><!-- id is needed for treeview function! -->
<ul>
<li class="navelem"><a class="el" href="_build_guide_native_8md.xhtml">BuildGuideNative.md</a></li>
<li class="footer">Generated on Wed Mar 9 2022 12:00:05 for ArmNN by
<a href="http://www.doxygen.org/index.html">
<img class="footer" src="doxygen.png" alt="doxygen"/></a> 1.8.13 </li>
</ul>
</div>
</body>
</html>