author | Sangwon Ha <sangwon.ha@arm.com> | 2024-01-02 22:46:24 +0000
committer | Sang Won Ha <sangwon.ha@arm.com> | 2024-01-04 15:54:06 +0000
commit | 7fe7791468978429ab02343a8485b51b39832027 (patch)
tree | 6f5712c04f3aee623c548d57b490b99d8c41d0af /src/cpu/kernels/activation/generic/neon/lut.cpp
parent | 11ab45148a69f76e7821fa5e0670e4bacb05c776 (diff)
download | ComputeLibrary-7fe7791468978429ab02343a8485b51b39832027.tar.gz
Prevent RELU from being processed through LUT in INT8
- For quantized ReLU activation, de-quantization and re-quantization are
not required: the operation reduces to a single comparison against the
quantization offset (zero point), as sketched below.
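
To make the rationale concrete, here is an editor's illustration (not part of this change; the helper name is hypothetical). With asymmetric quantization r = scale * (q - offset), real zero is encoded as the offset, so ReLU in the quantized domain is a single max() against the offset:

```cpp
#include <algorithm>
#include <cstdint>

// Editor's sketch, not ComputeLibrary API.
// r = scale * (q - offset) with scale > 0, so
// max(r, 0) = scale * (max(q, offset) - offset),
// i.e. ReLU acts directly on the quantized value as max(q, offset):
// no de-quantize/re-quantize round trip and no lookup table needed.
inline uint8_t relu_qasymm8(uint8_t q, uint8_t offset)
{
    return std::max(q, offset);
}
```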
Resolves: COMPMID-6340
Change-Id: I574bd220f3d0d893b7f7c4819a883e2a131f61f4
Signed-off-by: Sangwon Ha <sangwon.ha@arm.com>
Reviewed-on: https://review.mlplatform.org/c/ml/ComputeLibrary/+/10916
Tested-by: Arm Jenkins <bsgcomp@arm.com>
Comments-Addressed: Arm Jenkins <bsgcomp@arm.com>
Reviewed-by: Jakub Sujak <jakub.sujak@arm.com>
Reviewed-by: <felixjohnny.thomasmathibalan@arm.com>
Benchmark: Arm Jenkins <bsgcomp@arm.com>
Diffstat (limited to 'src/cpu/kernels/activation/generic/neon/lut.cpp')
-rw-r--r-- | src/cpu/kernels/activation/generic/neon/lut.cpp | 7
1 file changed, 4 insertions(+), 3 deletions(-)
diff --git a/src/cpu/kernels/activation/generic/neon/lut.cpp b/src/cpu/kernels/activation/generic/neon/lut.cpp
index f289c80d4b..ddd186f9cb 100644
--- a/src/cpu/kernels/activation/generic/neon/lut.cpp
+++ b/src/cpu/kernels/activation/generic/neon/lut.cpp
@@ -1,5 +1,5 @@
 /*
- * Copyright (c) 2022-2023 Arm Limited.
+ * Copyright (c) 2022-2024 Arm Limited.
  *
  * SPDX-License-Identifier: MIT
  *
@@ -34,8 +34,9 @@ namespace cpu
 #ifdef __aarch64__
 void neon_q8_activation_lut(const ITensor *src, ITensor *dst, const ActivationLayerInfo &act_info, const Window &window)
 {
-    ARM_COMPUTE_ERROR_ON(src->info()->data_type() != DataType::QASYMM8 &&
-                         src->info()->data_type() != DataType::QASYMM8_SIGNED);
+    ARM_COMPUTE_ERROR_ON( // LUT does not provide any performance benefit for ReLU as it's a single max() operation
+        (src->info()->data_type() != DataType::QASYMM8 && src->info()->data_type() != DataType::QASYMM8_SIGNED) ||
+        act_info.activation() == ActivationLayerInfo::ActivationFunction::RELU);
     const auto window_end_x  = window.x().end();
     Window     win_collapsed = window.collapse_if_possible(window, Window::DimZ);
     win_collapsed.set(Window::DimX, Window::Dimension(0, 1, 1));
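
The comment in the new assert ("a single max() operation") can be illustrated with a minimal AArch64 NEON sketch (editor's illustration with a hypothetical function name, not the library's actual kernel): ReLU over QASYMM8 data is one vmaxq_u8 per 16 elements, so routing it through a 256-entry lookup table adds cost for no benefit.

```cpp
#include <arm_neon.h>
#include <cstddef>
#include <cstdint>

// Editor's sketch, not ComputeLibrary code: vectorized quantized ReLU.
// Each iteration clamps 16 QASYMM8 values against the zero point with a
// single vmaxq_u8 -- cheaper than a per-element table lookup.
void relu_qasymm8_neon(const uint8_t *src, uint8_t *dst, size_t n, uint8_t zero_point)
{
    const uint8x16_t vzp = vdupq_n_u8(zero_point);
    size_t           i   = 0;
    for (; i + 16 <= n; i += 16)
    {
        vst1q_u8(dst + i, vmaxq_u8(vld1q_u8(src + i), vzp));
    }
    for (; i < n; ++i) // scalar tail for the remaining elements
    {
        dst[i] = src[i] > zero_point ? src[i] : zero_point;
    }
}
```

A table lookup, by contrast, pays off only for activations that are genuinely non-trivial per element (e.g. logistic or tanh), which is presumably why the LUT path remains in place for those cases.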