path: root/docs/use_cases/inference_runner.md
author      Isabella Gottardi <isabella.gottardi@arm.com>    2022-01-27 16:39:37 +0000
committer   Kshitij Sisodia <kshitij.sisodia@arm.com>        2022-02-08 16:32:28 +0000
commit      3107aa2152de9be8317e62da1d0327bcad6552e2 (patch)
tree        2ba12a5dd39f28ae1b646e132fbe575c6a442ee9 /docs/use_cases/inference_runner.md
parent      5cdfa9b834dc5a94c70f9f2b1f5c849dc5439e85 (diff)
download    ml-embedded-evaluation-kit-3107aa2152de9be8317e62da1d0327bcad6552e2.tar.gz
MLECO-2873: Object detection usecase follow-up
Change-Id: Ic14e93a50fb7b3f3cfd9497bac1280794cc0fc15
Signed-off-by: Isabella Gottardi <isabella.gottardi@arm.com>
Diffstat (limited to 'docs/use_cases/inference_runner.md')
-rw-r--r--    docs/use_cases/inference_runner.md    7
1 file changed, 5 insertions(+), 2 deletions(-)
diff --git a/docs/use_cases/inference_runner.md b/docs/use_cases/inference_runner.md
index 01bf4d0..2824def 100644
--- a/docs/use_cases/inference_runner.md
+++ b/docs/use_cases/inference_runner.md
@@ -59,7 +59,7 @@ following:
- `inference_runner_DYNAMIC_MEM_LOAD_ENABLED`: This can be set to ON or OFF, to allow dynamic model load capability for use with MPS3 FVPs. See section [Building with dynamic model load capability](./inference_runner.md#building-with-dynamic-model-load-capability) below for more details.
-To build **ONLY** the Inference Runner example application, add `-DUSE_CASE_BUILD=inferece_runner` to the `cmake`
+To build **ONLY** the Inference Runner example application, add `-DUSE_CASE_BUILD=inference_runner` to the `cmake`
command line, as specified in: [Building](../documentation.md#Building).
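As an illustration, a minimal configure sketch could look like the following (this assumes an out-of-tree `build` directory and the default toolchain described in the Building guide; the dynamic-load flag is optional and only relevant when targeting the MPS3 FVPs):
```commandline
cmake .. \
    -DUSE_CASE_BUILD=inference_runner \
    -Dinference_runner_DYNAMIC_MEM_LOAD_ENABLED=ON
```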
### Build process
@@ -199,7 +199,7 @@ To install the FVP:
### Starting Fast Model simulation
-Once completed the building step, the application binary `ethos-u-infernce_runner.axf` can be found in the `build/bin`
+Once completed the building step, the application binary `ethos-u-inference_runner.axf` can be found in the `build/bin`
folder.
Assuming that the install location of the FVP was set to `~/FVP_install_location`, then the simulation can be started by
@@ -287,9 +287,11 @@ cmake .. \
```
Once the configuration completes, running:
+
```commandline
make -j
```
+
will build the application that will expect the neural network model and the IFM to be loaded into
specific addresses. These addresses are defined in
[corstone-sse-300.cmake](../../scripts/cmake/subsystem-profiles/corstone-sse-300.cmake) for the MPS3
@@ -314,6 +316,7 @@ binary blob.
--data /path/to/custom-ifm.bin@0x92000000 \
--dump cpu0=/path/to/output.bin@Memory:0x93000000,1024
```
+
The above command will dump a 1KiB (1024 bytes) file with output tensors as a binary blob after it
has consumed the model and IFM data provided by the file paths specified and the inference is
executed successfully.
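As a quick sanity check, the dumped blob can be inspected directly on the host. This is only a sketch, assuming the output path used above and that `xxd` is available:
```commandline
xxd /path/to/output.bin | head
```
The hex dump should show the 1024 bytes written by the simulator; interpreting them as tensor values requires knowing the output tensor type and shape of the loaded model.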