author     surmeh01 <surabhi.mehta@arm.com>  2018-03-29 16:33:54 +0100
committer  surmeh01 <surabhi.mehta@arm.com>  2018-03-29 16:33:54 +0100
commit     7666005c72227a3ea5c410ca2861c9b6620887d8 (patch)
tree       084296e0ba923f7885b8efb242335a4547b2cdb0 /README.md
parent     5307bc10ac488261e84ac76b2dede6039ea3fe96 (diff)
download   android-nn-driver-7666005c72227a3ea5c410ca2861c9b6620887d8.tar.gz
Release 18.03
Diffstat (limited to 'README.md')
-rw-r--r--  README.md  28
1 file changed, 3 insertions(+), 25 deletions(-)
diff --git a/README.md b/README.md
index f549d2c2..0dcccd78 100644
--- a/README.md
+++ b/README.md
@@ -23,36 +23,14 @@ PRODUCT_PACKAGES += android.hardware.neuralnetworks@1.0-service-armnn
</pre>
4. Build Android as normal, i.e. run `make` in `<ANDROID_ROOT>`
5. To confirm that the ArmNN driver has been built, check for the driver service executable at
-<pre>
-<ANDROID_ROOT>/out/target/product/<product>/system/vendor/bin/hw/android.hardware.neuralnetworks@1.0-service-armnn
-</pre>
+`<ANDROID_ROOT>/out/target/product/<product>/system/vendor/bin/hw/android.hardware.neuralnetworks@1.0-service-armnn`
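For example (illustrative only; the exact `<product>` directory name depends on your build target), the presence of the binary can be checked from the build host with:
<pre>ls -l &lt;ANDROID_ROOT&gt;/out/target/product/&lt;product&gt;/system/vendor/bin/hw/ | grep armnn</pre>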
### Testing
1. Run the ArmNN driver service executable in the background
-<pre>
-adb shell /system/vendor/bin/hw/android.hardware.neuralnetworks@1.0-service-armnn &
-</pre>
+<pre>adb shell /system/vendor/bin/hw/android.hardware.neuralnetworks@1.0-service-armnn &</pre>
2. Run some code that exercises the Android Neural Networks API, for example Android's
`NeuralNetworksTest` unit tests (note this is an optional component that must be built).
-<pre>
-adb shell /data/nativetest/NeuralNetworksTest/NeuralNetworksTest > NeuralNetworkTest.log
-</pre>
+<pre>adb shell /data/nativetest/NeuralNetworksTest/NeuralNetworksTest > NeuralNetworkTest.log</pre>
3. To confirm that the ArmNN driver is being used to service the Android Neural Networks API requests,
check for messages in logcat with the `ArmnnDriver` tag.
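One way to do this is to restrict logcat to that tag, for example:
<pre>adb logcat -s ArmnnDriver</pre>
(Equivalently, pipe `adb logcat` through `grep ArmnnDriver`.)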
-
-### Using ClTuner
-
-ClTuner is a feature of the Compute Library that finds optimum values for OpenCL tuning parameters. The recommended way of using it with ArmNN is to generate the tuning data during development of the Android image for a device, and use it in read-only mode during normal operation:
-
-1. Run the ArmNN driver service executable in tuning mode. The path to the tuning data must be writable by the service:
-<pre>
-adb shell /system/vendor/bin/hw/android.hardware.neuralnetworks@1.0-service-armnn --cl-tuned-parameters-file &lt;PATH_TO_TUNING_DATA&gt; --cl-tuned-parameters-mode UpdateTunedParameters &
-</pre>
-2. Run a representative set of Android NNAPI testing loads. In this mode of operation, each NNAPI workload will be slow the first time it is executed, as the tuning parameters are being selected. Subsequent executions will use the tuning data which has been generated.
-3. Stop the service.
-4. Deploy the tuned parameters file to a location readable by the ArmNN driver service (for example, to a location within /vendor/etc).
-5. During normal operation, pass the location of the tuning data to the driver service (this would normally be done by passing arguments via Android init in the service .rc definition):
-<pre>
-adb shell /system/vendor/bin/hw/android.hardware.neuralnetworks@1.0-service-armnn --cl-tuned-parameters-file &lt;PATH_TO_TUNING_DATA&gt; &
-</pre>
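Step 5 above refers to the service .rc definition. A minimal sketch of such an entry is shown below; the service name, binary path, tuned-parameters location and user/group settings are assumptions for illustration, not values taken from this repository:
<pre>
# Illustrative only: the names and paths below are assumptions, not shipped values.
service neuralnetworks_hal_service_armnn /vendor/bin/hw/android.hardware.neuralnetworks@1.0-service-armnn --cl-tuned-parameters-file /vendor/etc/armnn-cl-tuned-params.bin
    class hal
    user system
    group system
</pre>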