ArmNN
 20.05
ParserPrototxtFixture< TParser > Struct Template Reference

#include <ParserPrototxtFixture.hpp>

Public Member Functions

 ParserPrototxtFixture ()
 
template<std::size_t NumOutputDimensions>
void RunTest (const std::vector< float > &inputData, const std::vector< float > &expectedOutputData)
 Executes the network with the given input tensor and checks the result against the given output tensor. More...
 
template<std::size_t NumOutputDimensions>
void RunComparisonTest (const std::map< std::string, std::vector< float >> &inputData, const std::map< std::string, std::vector< uint8_t >> &expectedOutputData)
 Executes the network with the given input tensor and checks the result against the given output tensor. More...
 
template<std::size_t NumOutputDimensions, typename T = float>
void RunTest (const std::map< std::string, std::vector< float >> &inputData, const std::map< std::string, std::vector< T >> &expectedOutputData)
 Executes the network with the given input tensors and checks the results against the given output tensors. More...
 
void SetupSingleInputSingleOutput (const std::string &inputName, const std::string &outputName)
 Parses and loads the network defined by the m_Prototext string. More...
 
void SetupSingleInputSingleOutput (const armnn::TensorShape &inputTensorShape, const std::string &inputName, const std::string &outputName)
 
void SetupSingleInputSingleOutput (const armnn::TensorShape &inputTensorShape, const armnn::TensorShape &outputTensorShape, const std::string &inputName, const std::string &outputName)
 
void Setup (const std::map< std::string, armnn::TensorShape > &inputShapes, const std::vector< std::string > &requestedOutputs)
 
void Setup ()
 
armnn::IOptimizedNetworkPtr SetupOptimizedNetwork (const std::map< std::string, armnn::TensorShape > &inputShapes, const std::vector< std::string > &requestedOutputs)
 

Public Attributes

std::string m_Prototext
 
std::unique_ptr< TParser, void(*)(TParser *parser)> m_Parser
 
armnn::IRuntimePtr m_Runtime
 
armnn::NetworkId m_NetworkIdentifier
 
armnn::TensorShape m_SingleOutputShape
 This will store the output shape so it doesn't need to be passed to the single-input-single-output overload of RunTest(). More...
 
std::string m_SingleInputName
 If the single-input-single-output overload of Setup() is called, these will store the input and output name so they don't need to be passed to the single-input-single-output overload of RunTest(). More...
 
std::string m_SingleOutputName
 

Detailed Description

template<typename TParser>
struct armnnUtils::ParserPrototxtFixture< TParser >

Definition at line 24 of file ParserPrototxtFixture.hpp.

Constructor & Destructor Documentation

◆ ParserPrototxtFixture()

Definition at line 26 of file ParserPrototxtFixture.hpp.

ParserPrototxtFixture()
    : m_Parser(TParser::Create())
    , m_Runtime(armnn::IRuntime::Create(armnn::IRuntime::CreationOptions()))
    , m_NetworkIdentifier(-1)
{
}

Member Function Documentation

◆ RunComparisonTest()

void RunComparisonTest ( const std::map< std::string, std::vector< float >> &  inputData,
const std::map< std::string, std::vector< uint8_t >> &  expectedOutputData 
)

Executes the network with the given input tensor and checks the result against the given output tensor.

Calls RunTest with output type of uint8_t for checking comparison operators.

Definition at line 178 of file ParserPrototxtFixture.hpp.

Referenced by ParserPrototxtFixture< armnnOnnxParser::IOnnxParser >::ParserPrototxtFixture().

{
    RunTest<NumOutputDimensions, uint8_t>(inputData, expectedOutputData);
}

◆ RunTest() [1/2]

void RunTest ( const std::vector< float > &  inputData,
const std::vector< float > &  expectedOutputData 
)

Executes the network with the given input tensor and checks the result against the given output tensor.

This overload assumes that the network has a single input and a single output.

Definition at line 170 of file ParserPrototxtFixture.hpp.

Referenced by ParserPrototxtFixture< armnnOnnxParser::IOnnxParser >::ParserPrototxtFixture().

{
    RunTest<NumOutputDimensions>({ { m_SingleInputName, inputData } }, { { m_SingleOutputName, expectedOutputData } });
}
◆ RunTest() [2/2]

void RunTest ( const std::map< std::string, std::vector< float >> &  inputData,
const std::map< std::string, std::vector< T >> &  expectedOutputData 
)

Executes the network with the given input tensors and checks the results against the given output tensors.

This overload supports multiple inputs and multiple outputs, identified by name.

Definition at line 187 of file ParserPrototxtFixture.hpp.

{
    // Sets up the armnn input tensors from the given vectors.
    armnn::InputTensors inputTensors;
    for (auto&& it : inputData)
    {
        armnn::BindingPointInfo bindingInfo = m_Parser->GetNetworkInputBindingInfo(it.first);
        inputTensors.push_back({ bindingInfo.first, armnn::ConstTensor(bindingInfo.second, it.second.data()) });
    }

    // Allocates storage for the output tensors to be written to and sets up the armnn output tensors.
    std::map<std::string, boost::multi_array<T, NumOutputDimensions>> outputStorage;
    armnn::OutputTensors outputTensors;
    for (auto&& it : expectedOutputData)
    {
        armnn::BindingPointInfo bindingInfo = m_Parser->GetNetworkOutputBindingInfo(it.first);
        outputStorage.emplace(it.first, MakeTensor<T, NumOutputDimensions>(bindingInfo.second));
        outputTensors.push_back(
            { bindingInfo.first, armnn::Tensor(bindingInfo.second, outputStorage.at(it.first).data()) });
    }

    m_Runtime->EnqueueWorkload(m_NetworkIdentifier, inputTensors, outputTensors);

    // Compares each output tensor to the expected values.
    for (auto&& it : expectedOutputData)
    {
        armnn::BindingPointInfo bindingInfo = m_Parser->GetNetworkOutputBindingInfo(it.first);
        if (bindingInfo.second.GetNumElements() != it.second.size())
        {
            throw armnn::Exception(
                boost::str(boost::format("Output tensor %1% is expected to have %2% elements. "
                                         "%3% elements supplied. %4%") %
                                         it.first %
                                         bindingInfo.second.GetNumElements() %
                                         it.second.size() %
                                         CHECK_LOCATION().AsString()));
        }

        // If the expected output shape is set, the output tensor checks will be carried out.
        if (m_SingleOutputShape.GetNumDimensions() != 0)
        {
            if (bindingInfo.second.GetShape().GetNumDimensions() == NumOutputDimensions &&
                bindingInfo.second.GetShape().GetNumDimensions() == m_SingleOutputShape.GetNumDimensions())
            {
                for (unsigned int i = 0; i < m_SingleOutputShape.GetNumDimensions(); ++i)
                {
                    if (m_SingleOutputShape[i] != bindingInfo.second.GetShape()[i])
                    {
                        throw armnn::Exception(
                            boost::str(boost::format("Output tensor %1% is expected to have %2% shape. "
                                                     "%3% shape supplied. %4%") %
                                                     it.first %
                                                     bindingInfo.second.GetShape() %
                                                     m_SingleOutputShape %
                                                     CHECK_LOCATION().AsString()));
                    }
                }
            }
            else
            {
                throw armnn::Exception(
                    boost::str(boost::format("Output tensor %1% is expected to have %2% dimensions. "
                                             "%3% dimensions supplied. %4%") %
                                             it.first %
                                             bindingInfo.second.GetShape().GetNumDimensions() %
                                             NumOutputDimensions %
                                             CHECK_LOCATION().AsString()));
            }
        }

        auto outputExpected = MakeTensor<T, NumOutputDimensions>(bindingInfo.second, it.second);
        if (std::is_same<T, uint8_t>::value)
        {
            BOOST_TEST(CompareTensors(outputExpected, outputStorage[it.first], true));
        }
        else
        {
            BOOST_TEST(CompareTensors(outputExpected, outputStorage[it.first]));
        }
    }
}

◆ Setup() [1/2]

void Setup ( const std::map< std::string, armnn::TensorShape > &  inputShapes,
const std::vector< std::string > &  requestedOutputs 
)

Definition at line 121 of file ParserPrototxtFixture.hpp.

{
    std::string errorMessage;

    armnn::INetworkPtr network =
        m_Parser->CreateNetworkFromString(m_Prototext.c_str(), inputShapes, requestedOutputs);
    auto optimized = Optimize(*network, { armnn::Compute::CpuRef }, m_Runtime->GetDeviceSpec());
    armnn::Status ret = m_Runtime->LoadNetwork(m_NetworkIdentifier, move(optimized), errorMessage);
    if (ret != armnn::Status::Success)
    {
        throw armnn::Exception(boost::str(
            boost::format("LoadNetwork failed with error: '%1%' %2%")
            % errorMessage
            % CHECK_LOCATION().AsString()));
    }
}

◆ Setup() [2/2]

void Setup ( )

Definition at line 140 of file ParserPrototxtFixture.hpp.

Referenced by BOOST_FIXTURE_TEST_CASE(), ParserPrototxtFixture< armnnOnnxParser::IOnnxParser >::ParserPrototxtFixture(), and ParserPrototxtFixture< armnnOnnxParser::IOnnxParser >::SetupSingleInputSingleOutput().

{
    std::string errorMessage;

    armnn::INetworkPtr network =
        m_Parser->CreateNetworkFromString(m_Prototext.c_str());
    auto optimized = Optimize(*network, { armnn::Compute::CpuRef }, m_Runtime->GetDeviceSpec());
    armnn::Status ret = m_Runtime->LoadNetwork(m_NetworkIdentifier, move(optimized), errorMessage);
    if (ret != armnn::Status::Success)
    {
        throw armnn::Exception(boost::str(
            boost::format("LoadNetwork failed with error: '%1%' %2%")
            % errorMessage
            % CHECK_LOCATION().AsString()));
    }
}

◆ SetupOptimizedNetwork()

armnn::IOptimizedNetworkPtr SetupOptimizedNetwork ( const std::map< std::string, armnn::TensorShape > &  inputShapes,
const std::vector< std::string > &  requestedOutputs 
)

Definition at line 158 of file ParserPrototxtFixture.hpp.

Referenced by BOOST_FIXTURE_TEST_CASE(), and ParserPrototxtFixture< armnnOnnxParser::IOnnxParser >::ParserPrototxtFixture().

{
    armnn::INetworkPtr network =
        m_Parser->CreateNetworkFromString(m_Prototext.c_str(), inputShapes, requestedOutputs);
    auto optimized = Optimize(*network, { armnn::Compute::CpuRef }, m_Runtime->GetDeviceSpec());
    return optimized;
}

◆ SetupSingleInputSingleOutput() [1/3]

void SetupSingleInputSingleOutput ( const std::string &  inputName,
const std::string &  outputName 
)

Parses and loads the network defined by the m_Prototext string.

Definition at line 86 of file ParserPrototxtFixture.hpp.

Referenced by BOOST_FIXTURE_TEST_CASE(), and ParserPrototxtFixture< armnnOnnxParser::IOnnxParser >::ParserPrototxtFixture().

{
    // Stores the input and output name so they don't need to be passed to the single-input-single-output RunTest().
    m_SingleInputName = inputName;
    m_SingleOutputName = outputName;
    Setup({ }, { outputName });
}

◆ SetupSingleInputSingleOutput() [2/3]

void SetupSingleInputSingleOutput ( const armnn::TensorShape inputTensorShape,
const std::string &  inputName,
const std::string &  outputName 
)

Definition at line 96 of file ParserPrototxtFixture.hpp.

{
    // Stores the input and output name so they don't need to be passed to the single-input-single-output RunTest().
    m_SingleInputName = inputName;
    m_SingleOutputName = outputName;
    Setup({ { inputName, inputTensorShape } }, { outputName });
}

◆ SetupSingleInputSingleOutput() [3/3]

void SetupSingleInputSingleOutput ( const armnn::TensorShape inputTensorShape,
const armnn::TensorShape outputTensorShape,
const std::string &  inputName,
const std::string &  outputName 
)

Definition at line 107 of file ParserPrototxtFixture.hpp.

{
    // Stores the input name, the output name and the output tensor shape
    // so they don't need to be passed to the single-input-single-output RunTest().
    m_SingleInputName = inputName;
    m_SingleOutputName = outputName;
    m_SingleOutputShape = outputTensorShape;
    Setup({ { inputName, inputTensorShape } }, { outputName });
}

Member Data Documentation

◆ m_NetworkIdentifier

armnn::NetworkId m_NetworkIdentifier

◆ m_Parser

std::unique_ptr< TParser, void(*)(TParser *parser)> m_Parser

◆ m_Prototext

std::string m_Prototext

◆ m_Runtime

armnn::IRuntimePtr m_Runtime
◆ m_SingleInputName

std::string m_SingleInputName

If the single-input-single-output overload of Setup() is called, these will store the input and output name so they don't need to be passed to the single-input-single-output overload of RunTest().

Definition at line 76 of file ParserPrototxtFixture.hpp.

Referenced by ParserPrototxtFixture< armnnOnnxParser::IOnnxParser >::RunTest(), and ParserPrototxtFixture< armnnOnnxParser::IOnnxParser >::SetupSingleInputSingleOutput().

◆ m_SingleOutputName

std::string m_SingleOutputName

◆ m_SingleOutputShape
◆ m_SingleOutputShape

armnn::TensorShape m_SingleOutputShape

This will store the output shape so it doesn't need to be passed to the single-input-single-output overload of RunTest().

Definition at line 82 of file ParserPrototxtFixture.hpp.

Referenced by ParserPrototxtFixture< armnnOnnxParser::IOnnxParser >::RunTest(), and ParserPrototxtFixture< armnnOnnxParser::IOnnxParser >::SetupSingleInputSingleOutput().


The documentation for this struct was generated from the following file:

ParserPrototxtFixture.hpp