ArmNN 23.08
ParserPrototxtFixture< TParser > Struct Template Reference

#include <ParserPrototxtFixture.hpp>


Public Member Functions

 ParserPrototxtFixture ()
 
template<std::size_t NumOutputDimensions>
void RunTest (const std::vector< float > &inputData, const std::vector< float > &expectedOutputData)
 Executes the network with the given input tensor and checks the result against the given output tensor. More...
 
template<std::size_t NumOutputDimensions>
void RunComparisonTest (const std::map< std::string, std::vector< float >> &inputData, const std::map< std::string, std::vector< uint8_t >> &expectedOutputData)
 Executes the network with the given input tensors and checks the results against the given uint8_t output tensors; used for testing comparison operators. More...
 
template<std::size_t NumOutputDimensions, typename T = float>
void RunTest (const std::map< std::string, std::vector< float >> &inputData, const std::map< std::string, std::vector< T >> &expectedOutputData)
 Executes the network with the given input tensors and checks the results against the given output tensors. More...
 
void SetupSingleInputSingleOutput (const std::string &inputName, const std::string &outputName)
 Parses and loads the network defined by the m_Prototext string. More...
 
void SetupSingleInputSingleOutput (const armnn::TensorShape &inputTensorShape, const std::string &inputName, const std::string &outputName)
 
void SetupSingleInputSingleOutput (const armnn::TensorShape &inputTensorShape, const armnn::TensorShape &outputTensorShape, const std::string &inputName, const std::string &outputName)
 
void Setup (const std::map< std::string, armnn::TensorShape > &inputShapes, const std::vector< std::string > &requestedOutputs)
 
void Setup (const std::map< std::string, armnn::TensorShape > &inputShapes)
 
void Setup ()
 
armnn::IOptimizedNetworkPtr SetupOptimizedNetwork (const std::map< std::string, armnn::TensorShape > &inputShapes, const std::vector< std::string > &requestedOutputs)
 

Public Attributes

std::string m_Prototext
 
std::unique_ptr< TParser, void(*)(TParser *parser)> m_Parser
 
armnn::IRuntimePtr m_Runtime
 
armnn::NetworkId m_NetworkIdentifier
 
armnn::TensorShape m_SingleOutputShape
 Stores the output shape so it does not need to be passed to the single-input-single-output overload of RunTest(). More...
 
std::string m_SingleInputName
 If the single-input-single-output overload of Setup() is called, these will store the input and output names so they don't need to be passed to the single-input-single-output overload of RunTest(). More...
 
std::string m_SingleOutputName
 

Detailed Description

template<typename TParser>
struct armnnUtils::ParserPrototxtFixture< TParser >

Definition at line 24 of file ParserPrototxtFixture.hpp.

Constructor & Destructor Documentation

◆ ParserPrototxtFixture()

Definition at line 26 of file ParserPrototxtFixture.hpp.

    : m_Parser(TParser::Create())
    , m_Runtime(armnn::IRuntime::Create(armnn::IRuntime::CreationOptions()))
    , m_NetworkIdentifier(0)
{
}

Member Function Documentation

◆ RunComparisonTest()

void RunComparisonTest ( const std::map< std::string, std::vector< float >> &  inputData,
const std::map< std::string, std::vector< uint8_t >> &  expectedOutputData 
)

Executes the network with the given input tensors and checks the results against the given uint8_t output tensors.

Calls RunTest with output type of uint8_t for checking comparison operators.

Definition at line 194 of file ParserPrototxtFixture.hpp.

{
    RunTest<NumOutputDimensions, uint8_t>(inputData, expectedOutputData);
}

◆ RunTest() [1/2]

void RunTest ( const std::map< std::string, std::vector< float >> &  inputData,
const std::map< std::string, std::vector< T >> &  expectedOutputData 
)

Executes the network with the given input tensors and checks the results against the given output tensors.

This overload supports multiple inputs and multiple outputs, identified by name.

Definition at line 203 of file ParserPrototxtFixture.hpp.

{
    // Sets up the armnn input tensors from the given vectors.
    armnn::InputTensors inputTensors;
    for (auto&& it : inputData)
    {
        armnn::BindingPointInfo bindingInfo = m_Parser->GetNetworkInputBindingInfo(it.first);
        bindingInfo.second.SetConstant(true);
        inputTensors.push_back({ bindingInfo.first, armnn::ConstTensor(bindingInfo.second, it.second.data()) });
        if (bindingInfo.second.GetNumElements() != it.second.size())
        {
            throw armnn::Exception(fmt::format("Input tensor {0} is expected to have {1} elements. "
                                               "{2} elements supplied. {3}",
                                               it.first,
                                               bindingInfo.second.GetNumElements(),
                                               it.second.size(),
                                               CHECK_LOCATION().AsString()));
        }
    }

    // Allocates storage for the output tensors to be written to and sets up the armnn output tensors.
    std::map<std::string, std::vector<T>> outputStorage;
    armnn::OutputTensors outputTensors;
    for (auto&& it : expectedOutputData)
    {
        armnn::BindingPointInfo bindingInfo = m_Parser->GetNetworkOutputBindingInfo(it.first);
        outputStorage.emplace(it.first, std::vector<T>(bindingInfo.second.GetNumElements()));
        outputTensors.push_back(
            { bindingInfo.first, armnn::Tensor(bindingInfo.second, outputStorage.at(it.first).data()) });
    }

    m_Runtime->EnqueueWorkload(m_NetworkIdentifier, inputTensors, outputTensors);

    // Compares each output tensor to the expected values.
    for (auto&& it : expectedOutputData)
    {
        armnn::BindingPointInfo bindingInfo = m_Parser->GetNetworkOutputBindingInfo(it.first);
        if (bindingInfo.second.GetNumElements() != it.second.size())
        {
            throw armnn::Exception(fmt::format("Output tensor {0} is expected to have {1} elements. "
                                               "{2} elements supplied. {3}",
                                               it.first,
                                               bindingInfo.second.GetNumElements(),
                                               it.second.size(),
                                               CHECK_LOCATION().AsString()));
        }

        // If the expected output shape is set, the output tensor checks will be carried out.
        if (m_SingleOutputShape.GetNumDimensions() != 0)
        {
            if (bindingInfo.second.GetShape().GetNumDimensions() == NumOutputDimensions &&
                bindingInfo.second.GetShape().GetNumDimensions() == m_SingleOutputShape.GetNumDimensions())
            {
                for (unsigned int i = 0; i < m_SingleOutputShape.GetNumDimensions(); ++i)
                {
                    if (m_SingleOutputShape[i] != bindingInfo.second.GetShape()[i])
                    {
                        // This exception message could not be created by fmt::format because of an oddity in
                        // the operator<< of TensorShape.
                        std::stringstream message;
                        message << "Output tensor " << it.first << " is expected to have "
                                << bindingInfo.second.GetShape() << " shape. "
                                << m_SingleOutputShape << " shape supplied. "
                                << CHECK_LOCATION().AsString();
                        throw armnn::Exception(message.str());
                    }
                }
            }
            else
            {
                throw armnn::Exception(fmt::format("Output tensor {0} is expected to have {1} dimensions. "
                                                   "{2} dimensions supplied. {3}",
                                                   it.first,
                                                   bindingInfo.second.GetShape().GetNumDimensions(),
                                                   NumOutputDimensions,
                                                   CHECK_LOCATION().AsString()));
            }
        }

        auto outputExpected = it.second;
        auto shape = bindingInfo.second.GetShape();
        if (std::is_same<T, uint8_t>::value)
        {
            auto result = CompareTensors(outputExpected, outputStorage[it.first], shape, shape, true);
            CHECK_MESSAGE(result.m_Result, result.m_Message.str());
        }
        else
        {
            auto result = CompareTensors(outputExpected, outputStorage[it.first], shape, shape);
            CHECK_MESSAGE(result.m_Result, result.m_Message.str());
        }
    }
}

References CHECK_LOCATION.

◆ RunTest() [2/2]

void RunTest ( const std::vector< float > &  inputData,
const std::vector< float > &  expectedOutputData 
)

Executes the network with the given input tensor and checks the result against the given output tensor.

This overload assumes that the network has a single input and a single output.

Definition at line 186 of file ParserPrototxtFixture.hpp.

{
    RunTest<NumOutputDimensions>({ { m_SingleInputName, inputData } }, { { m_SingleOutputName, expectedOutputData } });
}

◆ Setup() [1/3]

void Setup ()

Definition at line 157 of file ParserPrototxtFixture.hpp.

{
    std::string errorMessage;

    armnn::INetworkPtr network =
        m_Parser->CreateNetworkFromString(m_Prototext.c_str());
    auto optimized = Optimize(*network, { armnn::Compute::CpuRef }, m_Runtime->GetDeviceSpec());
    armnn::Status ret = m_Runtime->LoadNetwork(m_NetworkIdentifier, std::move(optimized), errorMessage);
    if (ret != armnn::Status::Success)
    {
        throw armnn::Exception(fmt::format("LoadNetwork failed with error: '{0}' {1}",
                                           errorMessage,
                                           CHECK_LOCATION().AsString()));
    }
}

References CHECK_LOCATION, armnn::CpuRef, armnn::Optimize(), and armnn::Success.

◆ Setup() [2/3]

void Setup ( const std::map< std::string, armnn::TensorShape > &  inputShapes)

Definition at line 140 of file ParserPrototxtFixture.hpp.

{
    std::string errorMessage;

    armnn::INetworkPtr network =
        m_Parser->CreateNetworkFromString(m_Prototext.c_str(), inputShapes);
    auto optimized = Optimize(*network, { armnn::Compute::CpuRef }, m_Runtime->GetDeviceSpec());
    armnn::Status ret = m_Runtime->LoadNetwork(m_NetworkIdentifier, std::move(optimized), errorMessage);
    if (ret != armnn::Status::Success)
    {
        throw armnn::Exception(fmt::format("LoadNetwork failed with error: '{0}' {1}",
                                           errorMessage,
                                           CHECK_LOCATION().AsString()));
    }
}

References CHECK_LOCATION, armnn::CpuRef, armnn::Optimize(), and armnn::Success.

◆ Setup() [3/3]

void Setup ( const std::map< std::string, armnn::TensorShape > &  inputShapes,
const std::vector< std::string > &  requestedOutputs 
)

Definition at line 122 of file ParserPrototxtFixture.hpp.

{
    std::string errorMessage;

    armnn::INetworkPtr network =
        m_Parser->CreateNetworkFromString(m_Prototext.c_str(), inputShapes, requestedOutputs);
    auto optimized = Optimize(*network, { armnn::Compute::CpuRef }, m_Runtime->GetDeviceSpec());
    armnn::Status ret = m_Runtime->LoadNetwork(m_NetworkIdentifier, std::move(optimized), errorMessage);
    if (ret != armnn::Status::Success)
    {
        throw armnn::Exception(fmt::format("LoadNetwork failed with error: '{0}' {1}",
                                           errorMessage,
                                           CHECK_LOCATION().AsString()));
    }
}

References CHECK_LOCATION, armnn::CpuRef, armnn::Optimize(), and armnn::Success.

◆ SetupOptimizedNetwork()

armnn::IOptimizedNetworkPtr SetupOptimizedNetwork ( const std::map< std::string, armnn::TensorShape > &  inputShapes,
const std::vector< std::string > &  requestedOutputs 
)

Definition at line 174 of file ParserPrototxtFixture.hpp.

{
    armnn::INetworkPtr network =
        m_Parser->CreateNetworkFromString(m_Prototext.c_str(), inputShapes, requestedOutputs);
    auto optimized = Optimize(*network, { armnn::Compute::CpuRef }, m_Runtime->GetDeviceSpec());
    return optimized;
}

References armnn::CpuRef, and armnn::Optimize().

◆ SetupSingleInputSingleOutput() [1/3]

void SetupSingleInputSingleOutput ( const armnn::TensorShape &  inputTensorShape,
const armnn::TensorShape &  outputTensorShape,
const std::string &  inputName,
const std::string &  outputName 
)

Definition at line 108 of file ParserPrototxtFixture.hpp.

{
    // Stores the input name, the output name and the output tensor shape
    // so they don't need to be passed to the single-input-single-output RunTest().
    m_SingleInputName = inputName;
    m_SingleOutputName = outputName;
    m_SingleOutputShape = outputTensorShape;
    Setup({ { inputName, inputTensorShape } }, { outputName });
}

◆ SetupSingleInputSingleOutput() [2/3]

void SetupSingleInputSingleOutput ( const armnn::TensorShape &  inputTensorShape,
const std::string &  inputName,
const std::string &  outputName 
)

Definition at line 97 of file ParserPrototxtFixture.hpp.

{
    // Stores the input and output name so they don't need to be passed to the single-input-single-output RunTest().
    m_SingleInputName = inputName;
    m_SingleOutputName = outputName;
    Setup({ { inputName, inputTensorShape } }, { outputName });
}

◆ SetupSingleInputSingleOutput() [3/3]

void SetupSingleInputSingleOutput ( const std::string &  inputName,
const std::string &  outputName 
)

Parses and loads the network defined by the m_Prototext string.

Definition at line 87 of file ParserPrototxtFixture.hpp.

{
    // Stores the input and output name so they don't need to be passed to the single-input-single-output RunTest().
    m_SingleInputName = inputName;
    m_SingleOutputName = outputName;
    Setup({ }, { outputName });
}

Member Data Documentation

◆ m_NetworkIdentifier

armnn::NetworkId m_NetworkIdentifier

Definition at line 72 of file ParserPrototxtFixture.hpp.

◆ m_Parser

std::unique_ptr<TParser, void(*)(TParser* parser)> m_Parser

Definition at line 70 of file ParserPrototxtFixture.hpp.

◆ m_Prototext

std::string m_Prototext

Definition at line 69 of file ParserPrototxtFixture.hpp.

◆ m_Runtime

armnn::IRuntimePtr m_Runtime

Definition at line 71 of file ParserPrototxtFixture.hpp.

◆ m_SingleInputName

std::string m_SingleInputName

If the single-input-single-output overload of Setup() is called, these will store the input and output names so they don't need to be passed to the single-input-single-output overload of RunTest().

Definition at line 77 of file ParserPrototxtFixture.hpp.

◆ m_SingleOutputName

std::string m_SingleOutputName

Definition at line 78 of file ParserPrototxtFixture.hpp.

◆ m_SingleOutputShape

armnn::TensorShape m_SingleOutputShape

Stores the output shape so it does not need to be passed to the single-input-single-output overload of RunTest().

Definition at line 83 of file ParserPrototxtFixture.hpp.


The documentation for this struct was generated from the following file:
ParserPrototxtFixture.hpp