author    SiCong Li <sicong.li@arm.com>  2020-06-08 17:30:51 +0100
committer SiCong Li <sicong.li@arm.com>  2020-06-10 17:33:25 +0000
commit    a32e2aef81cfcba9f5ae1770ceeb4a8d26fdc1f4 (patch)
tree      b791e466df9dff4032be409d6765cb8a9af9319d /tests/validation/NEON/ActivationLayer.cpp
parent    8aa8764982d23ed8b8c8810bbfda30542f21e034 (diff)
COMPMID-3523: Fix validation failures on armv8.2-a
* Fix neon sqrt activation delta(epsilon)
* Fix NEON Hard Swish validation tolerance
* Fix NEON FP16 LogSoftmaxLayer validation test typo
* Raise NEON reduction (sum) f16 tolerance
Change-Id: Ia33d69ce5f0b78be1893fb8e13d2761a8e7fceff
Signed-off-by: SiCong Li <sicong.li@arm.com>
Reviewed-on: https://review.mlplatform.org/c/ml/ComputeLibrary/+/3318
Comments-Addressed: Arm Jenkins <bsgcomp@arm.com>
Reviewed-by: Michele Di Giorgio <michele.digiorgio@arm.com>
Tested-by: Arm Jenkins <bsgcomp@arm.com>
Diffstat (limited to 'tests/validation/NEON/ActivationLayer.cpp')
 tests/validation/NEON/ActivationLayer.cpp | 2 ++
 1 file changed, 2 insertions(+)
diff --git a/tests/validation/NEON/ActivationLayer.cpp b/tests/validation/NEON/ActivationLayer.cpp
index 063bfaa2cd..e3a8db167c 100644
--- a/tests/validation/NEON/ActivationLayer.cpp
+++ b/tests/validation/NEON/ActivationLayer.cpp
@@ -60,6 +60,7 @@ RelativeTolerance<float> relative_tolerance(DataType data_type, ActivationLayerI
         case ActivationLayerInfo::ActivationFunction::ELU:
         case ActivationLayerInfo::ActivationFunction::SQRT:
         case ActivationLayerInfo::ActivationFunction::TANH:
+        case ActivationLayerInfo::ActivationFunction::HARD_SWISH:
             switch(data_type)
             {
                 case DataType::F16:
@@ -87,6 +88,7 @@ AbsoluteTolerance<float> absolute_tolerance(DataType data_type, ActivationLayerI
         case ActivationLayerInfo::ActivationFunction::SOFT_RELU:
         case ActivationLayerInfo::ActivationFunction::SQRT:
         case ActivationLayerInfo::ActivationFunction::TANH:
+        case ActivationLayerInfo::ActivationFunction::HARD_SWISH:
             switch(data_type)
             {
                 case DataType::F16:
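The diff above adds HARD_SWISH to the set of activation functions whose FP16 results are compared against the reference with a relaxed tolerance. As a rough standalone sketch of what such a tolerance check involves (the hard-swish formula is standard, but the helper names and the tolerance handling below are illustrative assumptions, not the library's actual validation fixtures):

```cpp
#include <cassert>
#include <cmath>

// Reference hard swish: x * relu6(x + 3) / 6
float hard_swish(float x)
{
    const float relu6 = std::fmin(std::fmax(x + 3.0f, 0.0f), 6.0f);
    return x * relu6 / 6.0f;
}

// Hypothetical relative-tolerance comparison between a (possibly reduced
// precision) target value and a float reference. Falls back to an absolute
// check when the reference is zero, to avoid dividing by zero.
bool within_relative_tolerance(float target, float reference, float rel_tol)
{
    if(reference == 0.0f)
    {
        return std::fabs(target) <= rel_tol;
    }
    return std::fabs(target - reference) / std::fabs(reference) <= rel_tol;
}
```

The point of the patch is that FP16 arithmetic accumulates more rounding error than FP32, so functions like hard swish (which chains an add, a clamp, a multiply, and a divide) need a looser tolerance than a plain element-wise copy would.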