From 926f502ca731fa49bcdf949408ce25728616e5f2 Mon Sep 17 00:00:00 2001
From: Murray Kornelsen
Date: Wed, 13 Jul 2022 21:22:39 -0400
Subject: Adding GELU activation

The OpenCL implementation uses the built-in erf; the NEON implementation
requires a new vectorized erf. It uses the following approximation
(valid for x >= 0; erf is odd, so erf(-x) = -erf(x)):

erf(x) = 1 - 1 / (1 + a1*x + a2*x^2 + a3*x^3 + a4*x^4)^4
a1 = 0.278393, a2 = 0.230389, a3 = 0.000972, a4 = 0.078108

From https://en.wikipedia.org/wiki/Error_function#Numerical_approximations

Signed-off-by: Murray Kornelsen
Change-Id: I2d3964b2c26a4334166b17135f9104bc6324fad2
Reviewed-on: https://review.mlplatform.org/c/ml/ComputeLibrary/+/7921
Reviewed-by: Viet-Hoa Do
Reviewed-by: Pablo Marquez Tello
Comments-Addressed: Arm Jenkins
Comments-Addressed: Pablo Marquez Tello
Tested-by: Arm Jenkins
Benchmark: Arm Jenkins
---
 tests/validation/NEON/ActivationLayer.cpp | 1 +
 1 file changed, 1 insertion(+)

diff --git a/tests/validation/NEON/ActivationLayer.cpp b/tests/validation/NEON/ActivationLayer.cpp
index e45b7fa5ad..a2971f28ba 100644
--- a/tests/validation/NEON/ActivationLayer.cpp
+++ b/tests/validation/NEON/ActivationLayer.cpp
@@ -68,6 +68,7 @@ RelativeTolerance<float> relative_tolerance(DataType data_type, ActivationLayerI
         case ActivationLayerInfo::ActivationFunction::SQRT:
         case ActivationLayerInfo::ActivationFunction::TANH:
         case ActivationLayerInfo::ActivationFunction::HARD_SWISH:
+        case ActivationLayerInfo::ActivationFunction::GELU:
            switch(data_type)
            {
                case DataType::F16:
--
cgit v1.2.1
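
For reference, below is a minimal scalar C++ sketch of the approximation the
commit message cites, together with the erf-based GELU it supports,
GELU(x) = 0.5 * x * (1 + erf(x / sqrt(2))). This is illustrative only: the
actual NEON kernel vectorizes the same math with intrinsics, and the names
erf_approx and gelu_approx are hypothetical, not Compute Library APIs.

    #include <cmath>
    #include <cstdio>

    // Abramowitz & Stegun 7.1.25:
    //   erf(x) ~= 1 - 1 / (1 + a1*x + a2*x^2 + a3*x^3 + a4*x^4)^4, x >= 0,
    // max abs error ~5e-4, extended to x < 0 via erf(-x) = -erf(x).
    // erf_approx is a hypothetical name used only for this sketch.
    float erf_approx(float x)
    {
        const float a1 = 0.278393f, a2 = 0.230389f, a3 = 0.000972f, a4 = 0.078108f;
        const float ax = std::fabs(x);
        // Evaluate the quartic in Horner form, then raise to the 4th power
        // with two squarings instead of calling pow().
        float d = 1.0f + ax * (a1 + ax * (a2 + ax * (a3 + ax * a4)));
        d *= d; // d^2
        d *= d; // d^4
        return std::copysign(1.0f - 1.0f / d, x);
    }

    // GELU expressed through erf: GELU(x) = 0.5 * x * (1 + erf(x / sqrt(2))).
    float gelu_approx(float x)
    {
        return 0.5f * x * (1.0f + erf_approx(x * 0.70710678f)); // 1/sqrt(2)
    }

    int main()
    {
        // Compare against the libm erf to illustrate the approximation error.
        const float xs[] = { -2.0f, -0.5f, 0.0f, 0.5f, 2.0f };
        for(float x : xs)
        {
            const float ref = 0.5f * x * (1.0f + std::erf(x * 0.70710678f));
            std::printf("x=% .2f  approx=% .6f  ref=% .6f\n", x, gelu_approx(x), ref);
        }
        return 0;
    }

Computing the 4th power with two squarings avoids a pow() call, and extending
by odd symmetry keeps the approximation's ~5e-4 maximum absolute error over
the whole real line, which is why the test tolerances above gain a GELU case.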