author     Michael McGeagh <michael.mcgeagh@arm.com>  2020-08-06 17:31:02 +0100
committer  Fredrik Knutsson <fredrik.knutsson.hunnebo@gmail.com>  2020-08-12 06:29:18 +0000
commit     5778ffdab61a46369c73c91f2c6289ba9833e3a3 (patch)
tree       3225088facdb0fe46190170c47da304df45e4aee /ethosu/vela/operation.py
parent     22f74e1c39572f084ad05cc2f208446fd2f50138 (diff)
download   ethos-u-vela-5778ffdab61a46369c73c91f2c6289ba9833e3a3.tar.gz
MLBEDSW-2637 Refactor util funcs out of softmax.py
There were a number of "TensorUtil" functions defined in softmax.py.
These have been moved to the Tensor and Operation classes
respectively.
Two of the functions were not simple tensor/op functions. These helper
functions have been moved to tensor.py for the simple fact that they
return Tensors.
Signed-off-by: Michael McGeagh <michael.mcgeagh@arm.com>
Change-Id: I17d39c4e11f0837b7867b4a54da2e4a56383e095
Diffstat (limited to 'ethosu/vela/operation.py')
-rw-r--r-- | ethosu/vela/operation.py | 9 |
1 file changed, 9 insertions, 0 deletions
diff --git a/ethosu/vela/operation.py b/ethosu/vela/operation.py
index 7134fd82..adbbff51 100644
--- a/ethosu/vela/operation.py
+++ b/ethosu/vela/operation.py
@@ -311,3 +311,12 @@ input and output tensors, as well as an attribute dictionary."""
         self.attrs["fused_activation_function"] = "LUT"
         self.activation_lut = lut_tensor
         self.inputs.append(lut_tensor)
+
+    def add_input_tensor(self, tens):
+        self.inputs.append(tens)
+        if self not in tens.consumer_list:
+            tens.consumer_list.append(self)
+
+    def set_output_tensor(self, tens):
+        tens.ops = [self]
+        self.outputs = [tens]
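The two new methods keep the bidirectional links between operations and tensors consistent: `add_input_tensor` registers the op as a consumer of the tensor, and `set_output_tensor` makes the op the sole producer of the tensor. The sketch below illustrates this with minimal stub `Tensor` and `Operation` classes; the stubs are an assumption for illustration only, as the real vela classes carry far more state than shown here.

```python
class Tensor:
    """Minimal stand-in for vela's Tensor (illustrative stub only)."""

    def __init__(self, name):
        self.name = name
        self.ops = []            # operations that produce this tensor
        self.consumer_list = []  # operations that consume this tensor


class Operation:
    """Minimal stand-in for vela's Operation, showing the two new helpers."""

    def __init__(self, op_type):
        self.type = op_type
        self.inputs = []
        self.outputs = []

    def add_input_tensor(self, tens):
        # Append the tensor as an input, and register this op as a
        # consumer of the tensor without duplicating the entry.
        self.inputs.append(tens)
        if self not in tens.consumer_list:
            tens.consumer_list.append(self)

    def set_output_tensor(self, tens):
        # Make this op the sole producer of the tensor, and the tensor
        # the sole output of this op.
        tens.ops = [self]
        self.outputs = [tens]


# Wire a tensor through an op: ifm -> op -> ofm
ifm = Tensor("ifm")
ofm = Tensor("ofm")
op = Operation("Softmax")
op.add_input_tensor(ifm)
op.set_output_tensor(ofm)
```

Note the duplicate check in `add_input_tensor`: an op that consumes the same tensor on several inputs still appears only once in that tensor's `consumer_list`.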