author     Cisco Cervellera <cisco.cervellera@arm.com>   2021-11-16 09:54:20 +0000
committer  Cisco Cervellera <cisco.cervellera@arm.com>   2021-11-16 09:54:20 +0000
commit     e7a0393973a1a1c1ed05b1bf1838fe931416890a (patch)
tree       8127d74bf1024ee6f45a07100216162d346ae2a2 /docs/sections/customizing.md
parent     b52b585f1c9ee3f2800fa51f3031117ed1396abd (diff)
MLECO-2520: Change md files to have correct file links
Change-Id: I3ec18583c321eb2815a670d56f4958e610331d6d
Diffstat (limited to 'docs/sections/customizing.md')
-rw-r--r--  docs/sections/customizing.md  |  42
1 file changed, 21 insertions(+), 21 deletions(-)
diff --git a/docs/sections/customizing.md b/docs/sections/customizing.md
index 854a3ed..3bf9b26 100644
--- a/docs/sections/customizing.md
+++ b/docs/sections/customizing.md
@@ -1,21 +1,21 @@
# Implementing custom ML application
-- [Implementing custom ML application](#implementing-custom-ml-application)
- - [Software project description](#software-project-description)
- - [Hardware Abstraction Layer API](#hardware-abstraction-layer-api)
- - [Main loop function](#main-loop-function)
- - [Application context](#application-context)
- - [Profiler](#profiler)
- - [NN Model API](#nn-model-api)
- - [Adding custom ML use-case](#adding-custom-ml-use_case)
- - [Implementing main loop](#implementing-main-loop)
- - [Implementing custom NN model](#implementing-custom-nn-model)
- - [Define ModelPointer and ModelSize methods](#define-modelpointer-and-modelsize-methods)
- - [Executing inference](#executing-inference)
- - [Printing to console](#printing-to-console)
- - [Reading user input from console](#reading-user-input-from-console)
- - [Output to MPS3 LCD](#output-to-mps3-lcd)
- - [Building custom use-case](#building-custom-use_case)
+- [Implementing custom ML application](./customizing.md#implementing-custom-ml-application)
+ - [Software project description](./customizing.md#software-project-description)
+ - [Hardware Abstraction Layer API](./customizing.md#hardware-abstraction-layer-api)
+ - [Main loop function](./customizing.md#main-loop-function)
+ - [Application context](./customizing.md#application-context)
+ - [Profiler](./customizing.md#profiler)
+ - [NN Model API](./customizing.md#nn-model-api)
+ - [Adding custom ML use-case](./customizing.md#adding-custom-ml-use_case)
+ - [Implementing main loop](./customizing.md#implementing-main-loop)
+ - [Implementing custom NN model](./customizing.md#implementing-custom-nn-model)
+ - [Define ModelPointer and ModelSize methods](./customizing.md#define-modelpointer-and-modelsize-methods)
+ - [Executing inference](./customizing.md#executing-inference)
+ - [Printing to console](./customizing.md#printing-to-console)
+ - [Reading user input from console](./customizing.md#reading-user-input-from-console)
+ - [Output to MPS3 LCD](./customizing.md#output-to-mps3-lcd)
+ - [Building custom use-case](./customizing.md#building-custom-use_case)
This section describes how to implement a custom Machine Learning application running on Arm® *Corstone™-300* based FVP
or on the Arm® MPS3 FPGA prototyping board.
@@ -323,7 +323,7 @@ use_case
```
Start with creation of a sub-directory under the `use_case` directory and two additional directories `src` and `include`
-as described in the [Software project description](#software-project-description) section.
+as described in the [Software project description](./customizing.md#software-project-description) section.
## Implementing main loop
@@ -336,9 +336,9 @@ Main loop has knowledge about the platform and has access to the platform compon
Layer (HAL).
Start by creating a `MainLoop.cc` file in the `src` directory (the one created under
-[Adding custom ML use case](#adding-custom-ml-use-case)). The name used is not important.
+[Adding custom ML use case](./customizing.md#adding-custom-ml-use-case)). The name used is not important.
-Now define the `main_loop` function with the signature described in [Main loop function](#main-loop-function):
+Now define the `main_loop` function with the signature described in [Main loop function](./customizing.md#main-loop-function):
```C++
#include "hal.h"
@@ -348,7 +348,7 @@ void main_loop(hal_platform& platform) {
}
```
-The preceding code is already a working use-case. If you compile and run it (see [Building custom use-case](#building-custom-use-case)),
+The preceding code is already a working use-case. If you compile and run it (see [Building custom use-case](./customizing.md#building-custom-use-case)),
then the application starts and prints a message to console and exits straight away.
You can now start filling this function with logic.
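For example, a minimal sketch of a filled-in `main_loop` could look like the following. It relies only on the `hal_platform` reference from the signature above and on standard C `printf` (assumed to be retargeted to the console, as covered in the Printing to console section); the loop count and messages are purely illustrative:

```C++
#include "hal.h"

#include <stdio.h>

void main_loop(hal_platform& platform)
{
    (void)platform; /* Platform services are accessed through the HAL when needed. */

    /* Illustrative only: print a few messages to the console and return. */
    for (int i = 0; i < 3; ++i) {
        printf("Hello world, iteration %d\n", i);
    }
}
```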
@@ -358,7 +358,7 @@ You can now start filling this function with logic.
Before inference can be run with a custom NN model, the TensorFlow Lite Micro framework must learn about the operators, or
layers, included in the model. You must register operators using the `MicroMutableOpResolver` API.
-The *Ethos-U* code samples project has an abstraction around the TensorFlow Lite Micro API (see [NN model API](#nn-model-api)).
+The *Ethos-U* code samples project has an abstraction around the TensorFlow Lite Micro API (see [NN model API](./customizing.md#nn-model-api)).
Create `HelloWorldModel.hpp` in the use-case include sub-directory, extend the `Model` abstract class,
and then declare the required methods.
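To make the operator-registration and model-class steps above concrete, here is a minimal sketch of what such a header could look like. Only the `ModelPointer` and `ModelSize` method names come from this document; the `arm::app` namespace, the `Model.hpp` include path, the `GetOpResolver`/`EnlistOperations` pair, and the operator count are assumptions about the NN model API and may differ in your checkout:

```C++
#ifndef HELLO_WORLD_MODEL_HPP
#define HELLO_WORLD_MODEL_HPP

#include "Model.hpp"    /* Abstract Model class from the NN model API (assumed path). */

#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"

#include <cstddef>
#include <cstdint>

namespace arm {
namespace app {

    class HelloWorldModel : public Model {
    protected:
        /* Reference to the op resolver used by this model (assumed part of the Model API). */
        const tflite::MicroOpResolver& GetOpResolver() override;

        /* Register the operators used by the custom NN model (assumed part of the Model API). */
        bool EnlistOperations() override;

        /* Accessors for the model data and its size, as named in the contents above;
         * the exact signatures are assumptions. */
        const uint8_t* ModelPointer() override;
        size_t ModelSize() override;

    private:
        /* Illustrative upper bound on the number of registered operators. */
        static constexpr int ms_maxOpCnt = 5;

        /* TensorFlow Lite Micro mutable op resolver instance. */
        tflite::MicroMutableOpResolver<ms_maxOpCnt> m_opResolver;
    };

} /* namespace app */
} /* namespace arm */

#endif /* HELLO_WORLD_MODEL_HPP */
```

The matching source file would then implement `EnlistOperations` by calling the resolver's `Add<Operator>()` methods (for example `AddConv2D()`) once for every operator type present in the model, while `ModelPointer` and `ModelSize` return the address and size of the model data built into the binary.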