commit    a3e1b50588b89a2c0c67da2679728a422fc16402
author    Gunes Bayir <gunes.bayir@arm.com>  2024-02-08 12:20:06 +0000
committer Gunes Bayir <gunes.bayir@arm.com>  2024-02-08 15:05:55 +0000
tree      ef0f87cf140ee68af17135a9f5eafb0dc856d11c /tests
parent    a5a81aec922f0d4c30a676098ae83f3bb73a75a6
Fix the bug in the GpuTanh operator in dynamic fusion
Tanh in dynamic fusion is a simple operator with no A and B coefficients, as its public interface implies. The Tanh operator follows the TOSA specification. Customization of the tanh calculation with coefficients a and b can be achieved via fusion, as below:

    out = a * tanh(b * in)  -->  x   = b * in
                                 y   = tanh(x)
                                 out = a * y

Resolves: COMPMID-6873
Signed-off-by: Gunes Bayir <gunes.bayir@arm.com>
Change-Id: I818765192f631ae82c2094b0fc376fb87bae4fa4
Reviewed-on: https://review.mlplatform.org/c/ml/ComputeLibrary/+/11109
Benchmark: Arm Jenkins <bsgcomp@arm.com>
Tested-by: Arm Jenkins <bsgcomp@arm.com>
Reviewed-by: Gian Marco Iodice <gianmarco.iodice@arm.com>
Comments-Addressed: Arm Jenkins <bsgcomp@arm.com>
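To make the decomposition concrete, below is a minimal standalone C++ sketch of how the scaled tanh splits into three simple elementwise steps that a fusion framework could chain. This is illustrative only; the variable names are hypothetical and none of this is ComputeLibrary API.

    // Illustrative sketch of the fusion decomposition from the commit
    // message: out = a * tanh(b * in) expressed as three simple steps.
    #include <cmath>
    #include <cstdio>

    int main()
    {
        const float a  = 2.0f; // hypothetical output-scale coefficient
        const float b  = 0.5f; // hypothetical input-scale coefficient
        const float in = 1.0f; // sample input element

        const float x   = b * in;       // step 1: scale the input (simple multiply)
        const float y   = std::tanh(x); // step 2: plain, coefficient-free tanh (TOSA-style)
        const float out = a * y;        // step 3: scale the output (simple multiply)

        // Same value as computing a * tanh(b * in) directly.
        std::printf("out = %f\n", out);
        return 0;
    }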
Diffstat (limited to 'tests')
 tests/validation/fixtures/dynamic_fusion/operators/ActivationFixture.h | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/tests/validation/fixtures/dynamic_fusion/operators/ActivationFixture.h b/tests/validation/fixtures/dynamic_fusion/operators/ActivationFixture.h
index 2f0b13329d..c9ffbccbc7 100644
--- a/tests/validation/fixtures/dynamic_fusion/operators/ActivationFixture.h
+++ b/tests/validation/fixtures/dynamic_fusion/operators/ActivationFixture.h
@@ -194,7 +194,7 @@ class DynamicFusionTanhValidationFixture
public:
void setup(TensorShape shape, bool fuse, DataType data_type)
{
- ActivationLayerInfo act_info{ActivationLayerInfo::ActivationFunction::TANH};
+ ActivationLayerInfo act_info{ActivationLayerInfo::ActivationFunction::TANH, 1.0f, 1.0f};
DynamicFusionActivationValidationFixture<TensorType, AccessorType, FunctionType, T>::setup(shape, fuse,
data_type, act_info);
}
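For context on why the explicit 1.0f coefficients matter: assuming the validation reference evaluates TANH as a * tanh(b * x), constructing ActivationLayerInfo without explicit coefficients would let the reference diverge from the coefficient-free GpuTanh operator under test. A minimal sketch of that reference formula follows; reference_tanh is a hypothetical helper, not the actual fixture code.

    #include <cassert>
    #include <cmath>

    // Hypothetical stand-in for the reference activation used in validation:
    // for TANH, the reference output is assumed to be a * tanh(b * x).
    static float reference_tanh(float x, float a, float b)
    {
        return a * std::tanh(b * x);
    }

    int main()
    {
        const float x = 0.75f;
        // With explicit a = b = 1.0f the reference reduces to plain tanh(x),
        // matching the coefficient-free GpuTanh operator under test.
        assert(reference_tanh(x, 1.0f, 1.0f) == std::tanh(x));
        return 0;
    }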