ArmNN 22.02
ParserPrototxtFixture< TParser > Struct Template Reference

#include <ParserPrototxtFixture.hpp>

Public Member Functions

 ParserPrototxtFixture ()
 
template<std::size_t NumOutputDimensions>
void RunTest (const std::vector< float > &inputData, const std::vector< float > &expectedOutputData)
 Executes the network with the given input tensor and checks the result against the given output tensor. More...
 
template<std::size_t NumOutputDimensions>
void RunComparisonTest (const std::map< std::string, std::vector< float >> &inputData, const std::map< std::string, std::vector< uint8_t >> &expectedOutputData)
 Executes the network with the given input tensor and checks the result against the given output tensor. More...
 
template<std::size_t NumOutputDimensions, typename T = float>
void RunTest (const std::map< std::string, std::vector< float >> &inputData, const std::map< std::string, std::vector< T >> &expectedOutputData)
 Executes the network with the given input tensors and checks the results against the given output tensors. More...
 
void SetupSingleInputSingleOutput (const std::string &inputName, const std::string &outputName)
 Parses and loads the network defined by the m_Prototext string. More...
 
void SetupSingleInputSingleOutput (const armnn::TensorShape &inputTensorShape, const std::string &inputName, const std::string &outputName)
 
void SetupSingleInputSingleOutput (const armnn::TensorShape &inputTensorShape, const armnn::TensorShape &outputTensorShape, const std::string &inputName, const std::string &outputName)
 
void Setup (const std::map< std::string, armnn::TensorShape > &inputShapes, const std::vector< std::string > &requestedOutputs)
 
void Setup (const std::map< std::string, armnn::TensorShape > &inputShapes)
 
void Setup ()
 
armnn::IOptimizedNetworkPtr SetupOptimizedNetwork (const std::map< std::string, armnn::TensorShape > &inputShapes, const std::vector< std::string > &requestedOutputs)
 

Public Attributes

std::string m_Prototext
 
std::unique_ptr< TParser, void(*)(TParser *parser)> m_Parser
 
armnn::IRuntimePtr m_Runtime
 
armnn::NetworkId m_NetworkIdentifier
 
armnn::TensorShape m_SingleOutputShape
 This will store the output shape so it doesn't need to be passed to the single-input-single-output overload of RunTest(). More...
 
std::string m_SingleInputName
 If the single-input-single-output overload of Setup() is called, these will store the input and output name so they don't need to be passed to the single-input-single-output overload of RunTest(). More...
 
std::string m_SingleOutputName
 

Detailed Description

template<typename TParser>
struct armnnUtils::ParserPrototxtFixture< TParser >

A test fixture that builds an armnn::INetwork from the protobuf text string held in m_Prototext, optimizes and loads it onto the reference backend, and provides RunTest() overloads that execute the network and compare its outputs against expected values.

Definition at line 24 of file ParserPrototxtFixture.hpp.

Constructor & Destructor Documentation

◆ ParserPrototxtFixture()

Member Function Documentation

◆ RunComparisonTest()

void RunComparisonTest ( const std::map< std::string, std::vector< float >> &  inputData,
const std::map< std::string, std::vector< uint8_t >> &  expectedOutputData 
)

Executes the network with the given input tensor and checks the result against the given output tensor.

Calls RunTest with output type of uint8_t for checking comparison operators.

Definition at line 194 of file ParserPrototxtFixture.hpp.

Referenced by ParserPrototxtFixture< TParser >::ParserPrototxtFixture().

{
    RunTest<NumOutputDimensions, uint8_t>(inputData, expectedOutputData);
}

◆ RunTest() [1/2]

void RunTest ( const std::vector< float > &  inputData,
const std::vector< float > &  expectedOutputData 
)

Executes the network with the given input tensor and checks the result against the given output tensor.

This overload assumes that the network has a single input and a single output.

Definition at line 186 of file ParserPrototxtFixture.hpp.

References ParserPrototxtFixture< TParser >::m_SingleInputName, and ParserPrototxtFixture< TParser >::m_SingleOutputName.

Referenced by ParserPrototxtFixture< TParser >::ParserPrototxtFixture().

{
    RunTest<NumOutputDimensions>({ { m_SingleInputName, inputData } }, { { m_SingleOutputName, expectedOutputData } });
}

◆ RunTest() [2/2]

void RunTest ( const std::map< std::string, std::vector< float >> &  inputData,
const std::map< std::string, std::vector< T >> &  expectedOutputData 
)

Executes the network with the given input tensors and checks the results against the given output tensors.

This overload supports multiple inputs and multiple outputs, identified by name.

Definition at line 203 of file ParserPrototxtFixture.hpp.

References CHECK_LOCATION, CompareTensors(), TensorShape::GetNumDimensions(), ParserPrototxtFixture< TParser >::m_NetworkIdentifier, ParserPrototxtFixture< TParser >::m_Parser, ParserPrototxtFixture< TParser >::m_Runtime, and ParserPrototxtFixture< TParser >::m_SingleOutputShape.

{
    // Sets up the armnn input tensors from the given vectors.
    armnn::InputTensors inputTensors;
    for (auto&& it : inputData)
    {
        armnn::BindingPointInfo bindingInfo = m_Parser->GetNetworkInputBindingInfo(it.first);
        bindingInfo.second.SetConstant(true);
        inputTensors.push_back({ bindingInfo.first, armnn::ConstTensor(bindingInfo.second, it.second.data()) });
        if (bindingInfo.second.GetNumElements() != it.second.size())
        {
            throw armnn::Exception(fmt::format("Input tensor {0} is expected to have {1} elements. "
                                               "{2} elements supplied. {3}",
                                               it.first,
                                               bindingInfo.second.GetNumElements(),
                                               it.second.size(),
                                               CHECK_LOCATION().AsString()));
        }
    }

    // Allocates storage for the output tensors to be written to and sets up the armnn output tensors.
    std::map<std::string, std::vector<T>> outputStorage;
    armnn::OutputTensors outputTensors;
    for (auto&& it : expectedOutputData)
    {
        armnn::BindingPointInfo bindingInfo = m_Parser->GetNetworkOutputBindingInfo(it.first);
        outputStorage.emplace(it.first, std::vector<T>(bindingInfo.second.GetNumElements()));
        outputTensors.push_back(
            { bindingInfo.first, armnn::Tensor(bindingInfo.second, outputStorage.at(it.first).data()) });
    }

    m_Runtime->EnqueueWorkload(m_NetworkIdentifier, inputTensors, outputTensors);

    // Compares each output tensor to the expected values.
    for (auto&& it : expectedOutputData)
    {
        armnn::BindingPointInfo bindingInfo = m_Parser->GetNetworkOutputBindingInfo(it.first);
        if (bindingInfo.second.GetNumElements() != it.second.size())
        {
            throw armnn::Exception(fmt::format("Output tensor {0} is expected to have {1} elements. "
                                               "{2} elements supplied. {3}",
                                               it.first,
                                               bindingInfo.second.GetNumElements(),
                                               it.second.size(),
                                               CHECK_LOCATION().AsString()));
        }

        // If the expected output shape is set, the output tensor checks will be carried out.
        if (m_SingleOutputShape.GetNumDimensions() != 0)
        {
            if (bindingInfo.second.GetShape().GetNumDimensions() == NumOutputDimensions &&
                bindingInfo.second.GetShape().GetNumDimensions() == m_SingleOutputShape.GetNumDimensions())
            {
                for (unsigned int i = 0; i < m_SingleOutputShape.GetNumDimensions(); ++i)
                {
                    if (m_SingleOutputShape[i] != bindingInfo.second.GetShape()[i])
                    {
                        // This exception message could not be created by fmt::format because of an oddity in
                        // the operator << of TensorShape.
                        std::stringstream message;
                        message << "Output tensor " << it.first << " is expected to have "
                                << bindingInfo.second.GetShape() << " shape. "
                                << m_SingleOutputShape << " shape supplied. "
                                << CHECK_LOCATION().AsString();
                        throw armnn::Exception(message.str());
                    }
                }
            }
            else
            {
                throw armnn::Exception(fmt::format("Output tensor {0} is expected to have {1} dimensions. "
                                                   "{2} dimensions supplied. {3}",
                                                   it.first,
                                                   bindingInfo.second.GetShape().GetNumDimensions(),
                                                   NumOutputDimensions,
                                                   CHECK_LOCATION().AsString()));
            }
        }

        auto outputExpected = it.second;
        auto shape = bindingInfo.second.GetShape();
        if (std::is_same<T, uint8_t>::value)
        {
            auto result = CompareTensors(outputExpected, outputStorage[it.first], shape, shape, true);
            CHECK_MESSAGE(result.m_Result, result.m_Message.str());
        }
        else
        {
            auto result = CompareTensors(outputExpected, outputStorage[it.first], shape, shape);
            CHECK_MESSAGE(result.m_Result, result.m_Message.str());
        }
    }
}

◆ Setup() [1/3]

void Setup ( const std::map< std::string, armnn::TensorShape > &  inputShapes,
const std::vector< std::string > &  requestedOutputs 
)

Definition at line 122 of file ParserPrototxtFixture.hpp.

References CHECK_LOCATION, armnn::CpuRef, ParserPrototxtFixture< TParser >::m_NetworkIdentifier, ParserPrototxtFixture< TParser >::m_Parser, ParserPrototxtFixture< TParser >::m_Prototext, ParserPrototxtFixture< TParser >::m_Runtime, armnn::Optimize(), and armnn::Success.

{
    std::string errorMessage;

    armnn::INetworkPtr network =
        m_Parser->CreateNetworkFromString(m_Prototext.c_str(), inputShapes, requestedOutputs);
    auto optimized = Optimize(*network, { armnn::Compute::CpuRef }, m_Runtime->GetDeviceSpec());
    armnn::Status ret = m_Runtime->LoadNetwork(m_NetworkIdentifier, move(optimized), errorMessage);
    if (ret != armnn::Status::Success)
    {
        throw armnn::Exception(fmt::format("LoadNetwork failed with error: '{0}' {1}",
                                           errorMessage,
                                           CHECK_LOCATION().AsString()));
    }
}

◆ Setup() [2/3]

void Setup ( const std::map< std::string, armnn::TensorShape > &  inputShapes)

Definition at line 140 of file ParserPrototxtFixture.hpp.

References CHECK_LOCATION, armnn::CpuRef, ParserPrototxtFixture< TParser >::m_NetworkIdentifier, ParserPrototxtFixture< TParser >::m_Parser, ParserPrototxtFixture< TParser >::m_Prototext, ParserPrototxtFixture< TParser >::m_Runtime, armnn::Optimize(), and armnn::Success.

{
    std::string errorMessage;

    armnn::INetworkPtr network =
        m_Parser->CreateNetworkFromString(m_Prototext.c_str(), inputShapes);
    auto optimized = Optimize(*network, { armnn::Compute::CpuRef }, m_Runtime->GetDeviceSpec());
    armnn::Status ret = m_Runtime->LoadNetwork(m_NetworkIdentifier, move(optimized), errorMessage);
    if (ret != armnn::Status::Success)
    {
        throw armnn::Exception(fmt::format("LoadNetwork failed with error: '{0}' {1}",
                                           errorMessage,
                                           CHECK_LOCATION().AsString()));
    }
}

◆ Setup() [3/3]

void Setup ( )

Definition at line 157 of file ParserPrototxtFixture.hpp.

References CHECK_LOCATION, armnn::CpuRef, ParserPrototxtFixture< TParser >::m_NetworkIdentifier, ParserPrototxtFixture< TParser >::m_Parser, ParserPrototxtFixture< TParser >::m_Prototext, ParserPrototxtFixture< TParser >::m_Runtime, armnn::Optimize(), and armnn::Success.

Referenced by ParserPrototxtFixture< TParser >::ParserPrototxtFixture(), ParserPrototxtFixture< TParser >::SetupSingleInputSingleOutput(), and TEST_SUITE().

{
    std::string errorMessage;

    armnn::INetworkPtr network =
        m_Parser->CreateNetworkFromString(m_Prototext.c_str());
    auto optimized = Optimize(*network, { armnn::Compute::CpuRef }, m_Runtime->GetDeviceSpec());
    armnn::Status ret = m_Runtime->LoadNetwork(m_NetworkIdentifier, move(optimized), errorMessage);
    if (ret != armnn::Status::Success)
    {
        throw armnn::Exception(fmt::format("LoadNetwork failed with error: '{0}' {1}",
                                           errorMessage,
                                           CHECK_LOCATION().AsString()));
    }
}

◆ SetupOptimizedNetwork()

armnn::IOptimizedNetworkPtr SetupOptimizedNetwork ( const std::map< std::string, armnn::TensorShape > &  inputShapes,
const std::vector< std::string > &  requestedOutputs 
)

Definition at line 174 of file ParserPrototxtFixture.hpp.

References armnn::CpuRef, ParserPrototxtFixture< TParser >::m_Parser, ParserPrototxtFixture< TParser >::m_Prototext, ParserPrototxtFixture< TParser >::m_Runtime, and armnn::Optimize().

Referenced by ParserPrototxtFixture< TParser >::ParserPrototxtFixture().

{
    armnn::INetworkPtr network =
        m_Parser->CreateNetworkFromString(m_Prototext.c_str(), inputShapes, requestedOutputs);
    auto optimized = Optimize(*network, { armnn::Compute::CpuRef }, m_Runtime->GetDeviceSpec());
    return optimized;
}

◆ SetupSingleInputSingleOutput() [1/3]

void SetupSingleInputSingleOutput ( const std::string &  inputName,
const std::string &  outputName 
)

Parses and loads the network defined by the m_Prototext string.

Definition at line 87 of file ParserPrototxtFixture.hpp.

References ParserPrototxtFixture< TParser >::m_SingleInputName, ParserPrototxtFixture< TParser >::m_SingleOutputName, and ParserPrototxtFixture< TParser >::Setup().

Referenced by ParserPrototxtFixture< TParser >::ParserPrototxtFixture().

{
    // Stores the input and output name so they don't need to be passed to the single-input-single-output RunTest().
    m_SingleInputName = inputName;
    m_SingleOutputName = outputName;
    Setup({ }, { outputName });
}

◆ SetupSingleInputSingleOutput() [2/3]

void SetupSingleInputSingleOutput ( const armnn::TensorShape &  inputTensorShape,
const std::string &  inputName,
const std::string &  outputName 
)

Definition at line 97 of file ParserPrototxtFixture.hpp.

References ParserPrototxtFixture< TParser >::m_SingleInputName, ParserPrototxtFixture< TParser >::m_SingleOutputName, and ParserPrototxtFixture< TParser >::Setup().

{
    // Stores the input and output name so they don't need to be passed to the single-input-single-output RunTest().
    m_SingleInputName = inputName;
    m_SingleOutputName = outputName;
    Setup({ { inputName, inputTensorShape } }, { outputName });
}

◆ SetupSingleInputSingleOutput() [3/3]

void SetupSingleInputSingleOutput ( const armnn::TensorShape &  inputTensorShape,
const armnn::TensorShape &  outputTensorShape,
const std::string &  inputName,
const std::string &  outputName 
)

Definition at line 108 of file ParserPrototxtFixture.hpp.

References ParserPrototxtFixture< TParser >::m_SingleInputName, ParserPrototxtFixture< TParser >::m_SingleOutputName, ParserPrototxtFixture< TParser >::m_SingleOutputShape, and ParserPrototxtFixture< TParser >::Setup().

{
    // Stores the input name, the output name and the output tensor shape
    // so they don't need to be passed to the single-input-single-output RunTest().
    m_SingleInputName = inputName;
    m_SingleOutputName = outputName;
    m_SingleOutputShape = outputTensorShape;
    Setup({ { inputName, inputTensorShape } }, { outputName });
}

Member Data Documentation

◆ m_NetworkIdentifier

◆ m_Parser

std::unique_ptr<TParser, void(*)(TParser* parser)> m_Parser

◆ m_Prototext

◆ m_Runtime

◆ m_SingleInputName

std::string m_SingleInputName

If the single-input-single-output overload of Setup() is called, these will store the input and output name so they don't need to be passed to the single-input-single-output overload of RunTest().

Definition at line 77 of file ParserPrototxtFixture.hpp.

Referenced by ParserPrototxtFixture< TParser >::RunTest(), and ParserPrototxtFixture< TParser >::SetupSingleInputSingleOutput().

◆ m_SingleOutputName

◆ m_SingleOutputShape

armnn::TensorShape m_SingleOutputShape

This will store the output shape so it doesn't need to be passed to the single-input-single-output overload of RunTest().

Definition at line 83 of file ParserPrototxtFixture.hpp.

Referenced by ParserPrototxtFixture< TParser >::RunTest(), and ParserPrototxtFixture< TParser >::SetupSingleInputSingleOutput().


The documentation for this struct was generated from the following file:
ParserPrototxtFixture.hpp