ArmNN 20.02
ParserFlatbuffersFixture Struct Reference

#include <ParserFlatbuffersFixture.hpp>

Inherited by PositiveActivationFixture.

Public Member Functions

 ParserFlatbuffersFixture ()
 
void Setup ()
 
void SetupSingleInputSingleOutput (const std::string &inputName, const std::string &outputName)
 
bool ReadStringToBinary ()
 
template<std::size_t NumOutputDimensions, armnn::DataType ArmnnType>
void RunTest (size_t subgraphId, const std::vector< armnn::ResolveType< ArmnnType >> &inputData, const std::vector< armnn::ResolveType< ArmnnType >> &expectedOutputData)
 Executes the network with the given input tensor and checks the result against the given output tensor.
 
template<std::size_t NumOutputDimensions, armnn::DataType ArmnnType>
void RunTest (size_t subgraphId, const std::map< std::string, std::vector< armnn::ResolveType< ArmnnType >>> &inputData, const std::map< std::string, std::vector< armnn::ResolveType< ArmnnType >>> &expectedOutputData)
 Executes the network with the given input tensors and checks the results against the given output tensors.
 
template<std::size_t NumOutputDimensions, armnn::DataType ArmnnType1, armnn::DataType ArmnnType2>
void RunTest (size_t subgraphId, const std::map< std::string, std::vector< armnn::ResolveType< ArmnnType1 >>> &inputData, const std::map< std::string, std::vector< armnn::ResolveType< ArmnnType2 >>> &expectedOutputData)
 Multiple Inputs, Multiple Outputs w/ Variable Datatypes and different dimension sizes.
 
template<armnn::DataType ArmnnType1, armnn::DataType ArmnnType2>
void RunTest (std::size_t subgraphId, const std::map< std::string, std::vector< armnn::ResolveType< ArmnnType1 >>> &inputData, const std::map< std::string, std::vector< armnn::ResolveType< ArmnnType2 >>> &expectedOutputData)
 Multiple Inputs, Multiple Outputs w/ Variable Datatypes and different dimension sizes.
 
void CheckTensors (const TensorRawPtr &tensors, size_t shapeSize, const std::vector< int32_t > &shape, tflite::TensorType tensorType, uint32_t buffer, const std::string &name, const std::vector< float > &min, const std::vector< float > &max, const std::vector< float > &scale, const std::vector< int64_t > &zeroPoint)
 

Static Public Member Functions

static std::string GenerateDetectionPostProcessJsonString (const armnn::DetectionPostProcessDescriptor &descriptor)
 

Public Attributes

std::vector< uint8_t > m_GraphBinary
 
std::string m_JsonString
 
ITfLiteParserPtr m_Parser
 
armnn::IRuntimePtr m_Runtime
 
armnn::NetworkId m_NetworkIdentifier
 
std::string m_SingleInputName
 If the single-input-single-output overload of Setup() is called, these will store the input and output name so they don't need to be passed to the single-input-single-output overload of RunTest().
 
std::string m_SingleOutputName
 

Detailed Description

Definition at line 37 of file ParserFlatbuffersFixture.hpp.

Constructor & Destructor Documentation

◆ ParserFlatbuffersFixture()

Definition at line 39 of file ParserFlatbuffersFixture.hpp.

References m_Parser, and options.

ParserFlatbuffersFixture() :
    m_Parser(nullptr, &ITfLiteParser::Destroy),
    m_Runtime(armnn::IRuntime::Create(armnn::IRuntime::CreationOptions())),
    m_NetworkIdentifier(-1)
{
    ITfLiteParser::TfLiteParserOptions options;
    options.m_StandInLayerForUnsupported = true;

    m_Parser.reset(ITfLiteParser::CreateRaw(armnn::Optional<ITfLiteParser::TfLiteParserOptions>(options)));
}

Member Function Documentation

◆ CheckTensors()

void CheckTensors(const TensorRawPtr& tensors,
                  size_t shapeSize,
                  const std::vector<int32_t>& shape,
                  tflite::TensorType tensorType,
                  uint32_t buffer,
                  const std::string& name,
                  const std::vector<float>& min,
                  const std::vector<float>& max,
                  const std::vector<float>& scale,
                  const std::vector<int64_t>& zeroPoint)
inline

Definition at line 193 of file ParserFlatbuffersFixture.hpp.

References BOOST_CHECK().

{
    BOOST_CHECK(tensors);
    BOOST_CHECK_EQUAL(shapeSize, tensors->shape.size());
    BOOST_CHECK_EQUAL_COLLECTIONS(shape.begin(), shape.end(), tensors->shape.begin(), tensors->shape.end());
    BOOST_CHECK_EQUAL(tensorType, tensors->type);
    BOOST_CHECK_EQUAL(buffer, tensors->buffer);
    BOOST_CHECK_EQUAL(name, tensors->name);
    BOOST_CHECK(tensors->quantization);
    BOOST_CHECK_EQUAL_COLLECTIONS(min.begin(), min.end(), tensors->quantization.get()->min.begin(),
                                  tensors->quantization.get()->min.end());
    BOOST_CHECK_EQUAL_COLLECTIONS(max.begin(), max.end(), tensors->quantization.get()->max.begin(),
                                  tensors->quantization.get()->max.end());
    BOOST_CHECK_EQUAL_COLLECTIONS(scale.begin(), scale.end(), tensors->quantization.get()->scale.begin(),
                                  tensors->quantization.get()->scale.end());
    BOOST_CHECK_EQUAL_COLLECTIONS(zeroPoint.begin(), zeroPoint.end(),
                                  tensors->quantization.get()->zero_point.begin(),
                                  tensors->quantization.get()->zero_point.end());
}

◆ GenerateDetectionPostProcessJsonString()

static std::string GenerateDetectionPostProcessJsonString(const armnn::DetectionPostProcessDescriptor& descriptor)
inline static

Definition at line 166 of file ParserFlatbuffersFixture.hpp.

References DetectionPostProcessDescriptor::m_DetectionsPerClass, DetectionPostProcessDescriptor::m_MaxClassesPerDetection, DetectionPostProcessDescriptor::m_MaxDetections, DetectionPostProcessDescriptor::m_NmsIouThreshold, DetectionPostProcessDescriptor::m_NmsScoreThreshold, DetectionPostProcessDescriptor::m_NumClasses, DetectionPostProcessDescriptor::m_ScaleH, DetectionPostProcessDescriptor::m_ScaleW, DetectionPostProcessDescriptor::m_ScaleX, DetectionPostProcessDescriptor::m_ScaleY, and DetectionPostProcessDescriptor::m_UseRegularNms.

{
    flexbuffers::Builder detectPostProcess;
    detectPostProcess.Map([&]() {
        detectPostProcess.Bool("use_regular_nms", descriptor.m_UseRegularNms);
        detectPostProcess.Int("max_detections", descriptor.m_MaxDetections);
        detectPostProcess.Int("max_classes_per_detection", descriptor.m_MaxClassesPerDetection);
        detectPostProcess.Int("detections_per_class", descriptor.m_DetectionsPerClass);
        detectPostProcess.Int("num_classes", descriptor.m_NumClasses);
        detectPostProcess.Float("nms_score_threshold", descriptor.m_NmsScoreThreshold);
        detectPostProcess.Float("nms_iou_threshold", descriptor.m_NmsIouThreshold);
        detectPostProcess.Float("h_scale", descriptor.m_ScaleH);
        detectPostProcess.Float("w_scale", descriptor.m_ScaleW);
        detectPostProcess.Float("x_scale", descriptor.m_ScaleX);
        detectPostProcess.Float("y_scale", descriptor.m_ScaleY);
    });
    detectPostProcess.Finish();

    // Create JSON string
    std::stringstream strStream;
    std::vector<uint8_t> buffer = detectPostProcess.GetBuffer();
    std::copy(buffer.begin(), buffer.end(), std::ostream_iterator<int>(strStream, ","));

    return strStream.str();
}
float m_ScaleW
Center size encoding scale weight.
float m_ScaleX
Center size encoding scale x.
uint32_t m_DetectionsPerClass
Detections per classes, used in Regular NMS.
uint32_t m_MaxClassesPerDetection
Maximum numbers of classes per detection, used in Fast NMS.
uint32_t m_MaxDetections
Maximum numbers of detections.
float m_NmsIouThreshold
Intersection over union threshold.
uint32_t m_NumClasses
Number of classes.
bool m_UseRegularNms
Use Regular NMS.
float m_ScaleH
Center size encoding scale height.
float m_ScaleY
Center size encoding scale y.
float m_NmsScoreThreshold
NMS score threshold.

◆ ReadStringToBinary()

bool ReadStringToBinary()
inline

Definition at line 102 of file ParserFlatbuffersFixture.hpp.

References g_TfLiteSchemaText, g_TfLiteSchemaText_len, and RunTest().

Referenced by Setup().

{
    std::string schemafile(&g_TfLiteSchemaText[0], &g_TfLiteSchemaText[g_TfLiteSchemaText_len]);

    // parse schema first, so we can use it to parse the data after
    flatbuffers::Parser parser;

    bool ok = parser.Parse(schemafile.c_str());
    BOOST_ASSERT_MSG(ok, "Failed to parse schema file");

    ok &= parser.Parse(m_JsonString.c_str());
    BOOST_ASSERT_MSG(ok, "Failed to parse json input");

    if (!ok)
    {
        return false;
    }

    {
        const uint8_t* bufferPtr = parser.builder_.GetBufferPointer();
        size_t size = static_cast<size_t>(parser.builder_.GetSize());
        m_GraphBinary.assign(bufferPtr, bufferPtr + size);
    }
    return ok;
}

◆ RunTest() [1/4]

template<std::size_t NumOutputDimensions, armnn::DataType armnnType>
void RunTest(size_t subgraphId,
             const std::vector<armnn::ResolveType<armnnType>>& inputData,
             const std::vector<armnn::ResolveType<armnnType>>& expectedOutputData)

Executes the network with the given input tensor and checks the result against the given output tensor.

Single Input, Single Output: this overload assumes the network has a single input and a single output.

Definition at line 222 of file ParserFlatbuffersFixture.hpp.

References m_SingleInputName, and m_SingleOutputName.

Referenced by ReadStringToBinary().

{
    RunTest<NumOutputDimensions, armnnType>(subgraphId,
                                            { { m_SingleInputName, inputData } },
                                            { { m_SingleOutputName, expectedOutputData } });
}

◆ RunTest() [2/4]

template<std::size_t NumOutputDimensions, armnn::DataType armnnType>
void RunTest(size_t subgraphId,
             const std::map<std::string, std::vector<armnn::ResolveType<armnnType>>>& inputData,
             const std::map<std::string, std::vector<armnn::ResolveType<armnnType>>>& expectedOutputData)

Executes the network with the given input tensors and checks the results against the given output tensors.

Multiple Inputs, Multiple Outputs: this overload supports multiple inputs and multiple outputs, identified by name.

Definition at line 236 of file ParserFlatbuffersFixture.hpp.

{
    RunTest<NumOutputDimensions, armnnType, armnnType>(subgraphId, inputData, expectedOutputData);
}

◆ RunTest() [3/4]

template<std::size_t NumOutputDimensions, armnn::DataType armnnType1, armnn::DataType armnnType2>
void RunTest(size_t subgraphId,
             const std::map<std::string, std::vector<armnn::ResolveType<armnnType1>>>& inputData,
             const std::map<std::string, std::vector<armnn::ResolveType<armnnType2>>>& expectedOutputData)

Multiple Inputs, Multiple Outputs w/ Variable Datatypes and different dimension sizes.

Executes the network with the given input tensors and checks the results against the given output tensors. This overload supports multiple inputs and multiple outputs, identified by name, and allows the input data type to differ from the output data type.

Definition at line 250 of file ParserFlatbuffersFixture.hpp.

References CompareTensors(), TensorInfo::GetNumDimensions(), m_NetworkIdentifier, m_Parser, m_Runtime, and armnn::VerifyTensorInfoDataType().

{
    using DataType2 = armnn::ResolveType<armnnType2>;

    // Setup the armnn input tensors from the given vectors.
    armnn::InputTensors inputTensors;
    for (auto&& it : inputData)
    {
        armnn::BindingPointInfo bindingInfo = m_Parser->GetNetworkInputBindingInfo(subgraphId, it.first);
        armnn::VerifyTensorInfoDataType(bindingInfo.second, armnnType1);
        inputTensors.push_back({ bindingInfo.first, armnn::ConstTensor(bindingInfo.second, it.second.data()) });
    }

    // Allocate storage for the output tensors to be written to and setup the armnn output tensors.
    std::map<std::string, boost::multi_array<DataType2, NumOutputDimensions>> outputStorage;
    armnn::OutputTensors outputTensors;
    for (auto&& it : expectedOutputData)
    {
        armnn::LayerBindingId outputBindingId = m_Parser->GetNetworkOutputBindingInfo(subgraphId, it.first).first;
        armnn::TensorInfo outputTensorInfo = m_Runtime->GetOutputTensorInfo(m_NetworkIdentifier, outputBindingId);

        // Check that output tensors have correct number of dimensions (NumOutputDimensions specified in test)
        auto outputNumDimensions = outputTensorInfo.GetNumDimensions();
        BOOST_CHECK_MESSAGE((outputNumDimensions == NumOutputDimensions),
            boost::str(boost::format("Number of dimensions expected %1%, but got %2% for output layer %3%")
                       % NumOutputDimensions
                       % outputNumDimensions
                       % it.first));

        armnn::VerifyTensorInfoDataType(outputTensorInfo, armnnType2);
        outputStorage.emplace(it.first, MakeTensor<DataType2, NumOutputDimensions>(outputTensorInfo));
        outputTensors.push_back(
            { outputBindingId, armnn::Tensor(outputTensorInfo, outputStorage.at(it.first).data()) });
    }

    m_Runtime->EnqueueWorkload(m_NetworkIdentifier, inputTensors, outputTensors);

    // Compare each output tensor to the expected values
    for (auto&& it : expectedOutputData)
    {
        armnn::BindingPointInfo bindingInfo = m_Parser->GetNetworkOutputBindingInfo(subgraphId, it.first);
        auto outputExpected = MakeTensor<DataType2, NumOutputDimensions>(bindingInfo.second, it.second);
        BOOST_TEST(CompareTensors(outputExpected, outputStorage[it.first]));
    }
}

◆ RunTest() [4/4]

template<armnn::DataType armnnType1, armnn::DataType armnnType2>
void RunTest(std::size_t subgraphId,
             const std::map<std::string, std::vector<armnn::ResolveType<armnnType1>>>& inputData,
             const std::map<std::string, std::vector<armnn::ResolveType<armnnType2>>>& expectedOutputData)

Multiple Inputs, Multiple Outputs w/ Variable Datatypes and different dimension sizes.

Executes the network with the given input tensors and checks the results against the given output tensors. This overload supports multiple inputs and multiple outputs, identified by name, and allows the input data type to differ from the output data type.

Definition at line 304 of file ParserFlatbuffersFixture.hpp.

References m_NetworkIdentifier, m_Parser, m_Runtime, and armnn::VerifyTensorInfoDataType().

{
    using DataType2 = armnn::ResolveType<armnnType2>;

    // Setup the armnn input tensors from the given vectors.
    armnn::InputTensors inputTensors;
    for (auto&& it : inputData)
    {
        armnn::BindingPointInfo bindingInfo = m_Parser->GetNetworkInputBindingInfo(subgraphId, it.first);
        armnn::VerifyTensorInfoDataType(bindingInfo.second, armnnType1);

        inputTensors.push_back({ bindingInfo.first, armnn::ConstTensor(bindingInfo.second, it.second.data()) });
    }

    armnn::OutputTensors outputTensors;
    outputTensors.reserve(expectedOutputData.size());
    std::map<std::string, std::vector<DataType2>> outputStorage;
    for (auto&& it : expectedOutputData)
    {
        armnn::BindingPointInfo bindingInfo = m_Parser->GetNetworkOutputBindingInfo(subgraphId, it.first);
        armnn::VerifyTensorInfoDataType(bindingInfo.second, armnnType2);

        std::vector<DataType2> out(it.second.size());
        outputStorage.emplace(it.first, out);
        outputTensors.push_back({ bindingInfo.first,
                                  armnn::Tensor(bindingInfo.second,
                                                outputStorage.at(it.first).data()) });
    }

    m_Runtime->EnqueueWorkload(m_NetworkIdentifier, inputTensors, outputTensors);

    // Checks the results.
    for (auto&& it : expectedOutputData)
    {
        std::vector<armnn::ResolveType<armnnType2>> out = outputStorage.at(it.first);
        for (unsigned int i = 0; i < out.size(); ++i)
        {
            BOOST_TEST(it.second[i] == out[i], boost::test_tools::tolerance(0.000001f));
        }
    }
}

◆ Setup()

void Setup()
inline

Definition at line 61 of file ParserFlatbuffersFixture.hpp.

References armnn::CpuRef, armnn::Optimize(), ReadStringToBinary(), and armnn::Success.

Referenced by BOOST_FIXTURE_TEST_CASE(), and SetupSingleInputSingleOutput().

{
    bool ok = ReadStringToBinary();
    if (!ok) {
        throw armnn::Exception("LoadNetwork failed while reading binary input");
    }

    armnn::INetworkPtr network = m_Parser->CreateNetworkFromBinary(m_GraphBinary);

    if (!network) {
        throw armnn::Exception("The parser failed to create an ArmNN network");
    }

    auto optimized = Optimize(*network, { armnn::Compute::CpuRef },
                              m_Runtime->GetDeviceSpec());
    std::string errorMessage;

    armnn::Status ret = m_Runtime->LoadNetwork(m_NetworkIdentifier, move(optimized), errorMessage);

    if (ret != armnn::Status::Success)
    {
        throw armnn::Exception(
            boost::str(
                boost::format("The runtime failed to load the network. "
                              "Error was: %1%. in %2% [%3%:%4%]") %
                errorMessage %
                __func__ %
                __FILE__ %
                __LINE__));
    }
}

◆ SetupSingleInputSingleOutput()

void SetupSingleInputSingleOutput(const std::string& inputName,
                                  const std::string& outputName)
inline

Definition at line 94 of file ParserFlatbuffersFixture.hpp.

References Setup().

{
    // Store the input and output name so they don't need to be passed to the single-input-single-output RunTest().
    m_SingleInputName = inputName;
    m_SingleOutputName = outputName;
    Setup();
}

Member Data Documentation

◆ m_GraphBinary

std::vector<uint8_t> m_GraphBinary

Definition at line 50 of file ParserFlatbuffersFixture.hpp.

◆ m_JsonString

std::string m_JsonString

Definition at line 51 of file ParserFlatbuffersFixture.hpp.

◆ m_NetworkIdentifier

armnn::NetworkId m_NetworkIdentifier

Definition at line 54 of file ParserFlatbuffersFixture.hpp.

Referenced by RunTest().

◆ m_Parser

ITfLiteParserPtr m_Parser

Definition at line 52 of file ParserFlatbuffersFixture.hpp.

Referenced by ParserFlatbuffersFixture(), and RunTest().

◆ m_Runtime

armnn::IRuntimePtr m_Runtime

Definition at line 53 of file ParserFlatbuffersFixture.hpp.

Referenced by RunTest().

◆ m_SingleInputName

std::string m_SingleInputName

If the single-input-single-output overload of Setup() is called, these will store the input and output name so they don't need to be passed to the single-input-single-output overload of RunTest().

Definition at line 58 of file ParserFlatbuffersFixture.hpp.

Referenced by RunTest().

◆ m_SingleOutputName

std::string m_SingleOutputName

Definition at line 59 of file ParserFlatbuffersFixture.hpp.

Referenced by RunTest().


The documentation for this struct was generated from the following file:

ParserFlatbuffersFixture.hpp