path: root/ethosu/vela/tensor.py
author    Johan Alfven <johan.alfven@arm.com>  2023-04-13 10:13:56 +0200
committer Johan Alfven <johan.alfven@arm.com>  2023-04-24 12:56:44 +0200
commit    c4268bf407048c7899c8501dd1223a777f8c4963 (patch)
tree      eaadb0341c0989778542ccabcaf56fbddd592e8a /ethosu/vela/tensor.py
parent    301ca6046884ccacd6cb4d64bd4c4869ff66b4bf (diff)
download  ethos-u-vela-c4268bf407048c7899c8501dd1223a777f8c4963.tar.gz
MLBEDSW-7501: Vela unnecessarily adds reshaped weights tensors
- Weights are internally cloned and reshaped/transposed when running on the NPU. This already happens in the reader. If the op is passed through to the CPU, there is code that writes back these clones, but with another round of reshape/transpose. This adds extra tensors in the optimized file compared to the original file if the original tensors are subgraph inputs.
- If the op is passed through to the CPU, the clones should not be written to the file. Solved this by setting the src_tensor when making the clone.

Change-Id: I9f55d542c099882882920bffe8e15b43b2ca2c8d
Signed-off-by: Johan Alfven <johan.alfven@arm.com>
Diffstat (limited to 'ethosu/vela/tensor.py')
-rw-r--r--  ethosu/vela/tensor.py  | 1 +
1 file changed, 1 insertion, 0 deletions
diff --git a/ethosu/vela/tensor.py b/ethosu/vela/tensor.py
index 9ba6ab77..6ba331c4 100644
--- a/ethosu/vela/tensor.py
+++ b/ethosu/vela/tensor.py
@@ -506,6 +506,7 @@ class Tensor:
res.name = res.name + suffix
res.ops = []
res.consumer_list = []
+ res.src_tensor = self
return res
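
To make the intent of the one-line change concrete, here is a minimal, self-contained Python sketch. It is not the real Vela Tensor class, and the tensors_to_serialize helper is hypothetical; it only illustrates how a writer-side step can skip clones once src_tensor points back at the original tensor.

# Minimal illustrative sketch, assuming a simplified Tensor with only the
# fields relevant to this commit. The helper tensors_to_serialize is a
# hypothetical example, not a Vela API.
import copy


class Tensor:
    def __init__(self, name, values):
        self.name = name
        self.values = values
        self.ops = []
        self.consumer_list = []
        # Tensor this one was cloned from; None for original tensors.
        self.src_tensor = None

    def clone(self, suffix="_clone"):
        res = copy.copy(self)
        res.name = res.name + suffix
        res.ops = []
        res.consumer_list = []
        # The one-line fix from this commit: remember the source tensor so a
        # CPU-fallback write-back can refer to the original instead of the clone.
        res.src_tensor = self
        return res


def tensors_to_serialize(tensors):
    # Hypothetical writer-side check: clones are internal views of tensors
    # already present in the file, so they are not written out again.
    return [t for t in tensors if t.src_tensor is None]


if __name__ == "__main__":
    weights = Tensor("conv_weights", values=[1, 2, 3, 4])
    npu_view = weights.clone(suffix="_reshaped")   # internal NPU copy
    kept = tensors_to_serialize([weights, npu_view])
    print([t.name for t in kept])                  # ['conv_weights']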