ParserFlatbuffersSerializeFixture Struct Reference

#include <ParserFlatbuffersSerializeFixture.hpp>

Inherited by PositiveActivationFixture.

Public Member Functions

 ParserFlatbuffersSerializeFixture ()
 
void Setup ()
 
void SetupSingleInputSingleOutput (const std::string &inputName, const std::string &outputName)
 
bool ReadStringToBinary ()
 
template<std::size_t NumOutputDimensions, armnn::DataType ArmnnType, typename DataType = armnn::ResolveType<ArmnnType>>
void RunTest (unsigned int layersId, const std::vector< DataType > &inputData, const std::vector< DataType > &expectedOutputData)
 
template<std::size_t NumOutputDimensions, armnn::DataType ArmnnInputType, armnn::DataType ArmnnOutputType, typename InputDataType = armnn::ResolveType<ArmnnInputType>, typename OutputDataType = armnn::ResolveType<ArmnnOutputType>>
void RunTest (unsigned int layersId, const std::vector< InputDataType > &inputData, const std::vector< OutputDataType > &expectedOutputData)
 
template<std::size_t NumOutputDimensions, armnn::DataType ArmnnType, typename DataType = armnn::ResolveType<ArmnnType>>
void RunTest (unsigned int layersId, const std::map< std::string, std::vector< DataType >> &inputData, const std::map< std::string, std::vector< DataType >> &expectedOutputData)
 
template<std::size_t NumOutputDimensions, armnn::DataType ArmnnInputType, armnn::DataType ArmnnOutputType, typename InputDataType = armnn::ResolveType<ArmnnInputType>, typename OutputDataType = armnn::ResolveType<ArmnnOutputType>>
void RunTest (unsigned int layersId, const std::map< std::string, std::vector< InputDataType >> &inputData, const std::map< std::string, std::vector< OutputDataType >> &expectedOutputData)
 
void CheckTensors (const TensorRawPtr &tensors, size_t shapeSize, const std::vector< int32_t > &shape, armnnSerializer::TensorInfo tensorType, const std::string &name, const float scale, const int64_t zeroPoint)
 

Public Attributes

std::vector< uint8_t > m_GraphBinary
 
std::string m_JsonString
 
std::unique_ptr< IDeserializer, void(*)(IDeserializer *parser)> m_Parser
 
armnn::IRuntimePtr m_Runtime
 
armnn::NetworkId m_NetworkIdentifier
 
std::string m_SingleInputName
 
std::string m_SingleOutputName
 

Detailed Description

Definition at line 28 of file ParserFlatbuffersSerializeFixture.hpp.
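Typical usage (a minimal sketch; the test-suite name, fixture name, layer names, JSON content and tensor values are hypothetical): a test derives from this struct, assigns the serialized-graph JSON to m_JsonString, calls Setup() or SetupSingleInputSingleOutput(), then invokes one of the RunTest() overloads.

#include "ParserFlatbuffersSerializeFixture.hpp"
#include <boost/test/unit_test.hpp>

BOOST_AUTO_TEST_SUITE(DeserializeParser)

struct SimpleActivationFixture : ParserFlatbuffersSerializeFixture
{
    SimpleActivationFixture()
    {
        // Serialized-graph JSON matching the deserialize schema; elided in this sketch.
        m_JsonString = R"( ... )";
        SetupSingleInputSingleOutput("InputLayer", "OutputLayer");
    }
};

BOOST_FIXTURE_TEST_CASE(SimpleActivationTest, SimpleActivationFixture)
{
    // Single-input/single-output overload: 4-D Float32 output, graph id 0.
    // The values assume a ReLU-like activation and are purely illustrative.
    RunTest<4, armnn::DataType::Float32>(0,
                                         { -1.0f, 0.0f, 2.0f, 3.0f },
                                         {  0.0f, 0.0f, 2.0f, 3.0f });
}

BOOST_AUTO_TEST_SUITE_END()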

Constructor & Destructor Documentation

◆ ParserFlatbuffersSerializeFixture()

Definition at line 30 of file ParserFlatbuffersSerializeFixture.hpp.

ParserFlatbuffersSerializeFixture() :
    m_Parser(IDeserializer::Create()),
    ...  // remaining initialisers (m_Runtime via armnn::IRuntime::Create(), m_NetworkIdentifier) elided in this excerpt
{
}

Member Function Documentation

◆ CheckTensors()

void CheckTensors ( const TensorRawPtr &  tensors,
size_t  shapeSize,
const std::vector< int32_t > &  shape,
armnnSerializer::TensorInfo  tensorType,
const std::string &  name,
const float  scale,
const int64_t  zeroPoint 
)
inline

Definition at line 154 of file ParserFlatbuffersSerializeFixture.hpp.

{
    boost::ignore_unused(name);
    BOOST_CHECK_EQUAL(shapeSize, tensors->dimensions()->size());
    BOOST_CHECK_EQUAL_COLLECTIONS(shape.begin(), shape.end(),
                                  tensors->dimensions()->begin(), tensors->dimensions()->end());
    BOOST_CHECK_EQUAL(tensorType.dataType(), tensors->dataType());
    BOOST_CHECK_EQUAL(scale, tensors->quantizationScale());
    BOOST_CHECK_EQUAL(zeroPoint, tensors->quantizationOffset());
}

◆ ReadStringToBinary()

bool ReadStringToBinary ( )
inline

Definition at line 92 of file ParserFlatbuffersSerializeFixture.hpp.

References deserialize_schema_end and deserialize_schema_start.

Referenced by Setup().

{
    std::string schemafile(&deserialize_schema_start, &deserialize_schema_end);

    // Parse the schema first, so we can use it to parse the data after.
    flatbuffers::Parser parser;

    bool ok = parser.Parse(schemafile.c_str());
    BOOST_ASSERT_MSG(ok, "Failed to parse schema file");

    ok &= parser.Parse(m_JsonString.c_str());
    BOOST_ASSERT_MSG(ok, "Failed to parse json input");

    if (!ok)
    {
        return false;
    }

    {
        const uint8_t* bufferPtr = parser.builder_.GetBufferPointer();
        size_t size = static_cast<size_t>(parser.builder_.GetSize());
        m_GraphBinary.assign(bufferPtr, bufferPtr + size);
    }
    return ok;
}
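For reference, the FlatBuffers JSON-to-binary conversion performed above can be reproduced standalone. A minimal sketch, assuming schemaText and jsonText are already loaded; only the flatbuffers::Parser calls used by ReadStringToBinary() are shown, and the helper name is hypothetical.

#include <flatbuffers/idl.h>
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical helper: convert a FlatBuffers JSON object into its binary form.
std::vector<uint8_t> JsonToBinary(const std::string& schemaText, const std::string& jsonText)
{
    flatbuffers::Parser parser;

    // The schema must be parsed first so the JSON can be interpreted against it.
    if (!parser.Parse(schemaText.c_str()) || !parser.Parse(jsonText.c_str()))
    {
        return {};
    }

    const uint8_t* bufferPtr = parser.builder_.GetBufferPointer();
    size_t size = static_cast<size_t>(parser.builder_.GetSize());
    return std::vector<uint8_t>(bufferPtr, bufferPtr + size);
}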

◆ RunTest() [1/4]

void RunTest ( unsigned int  layersId,
const std::vector< DataType > &  inputData,
const std::vector< DataType > &  expectedOutputData 
)

Executes the network with the given input tensor and checks the result against the given output tensor. This overload assumes the network has a single input and a single output.

Definition at line 169 of file ParserFlatbuffersSerializeFixture.hpp.


{
    RunTest<NumOutputDimensions, ArmnnType, ArmnnType, DataType, DataType>(layersId, inputData, expectedOutputData);
}
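A hedged call example for this overload (the graph id, values and the 4-D output shape are illustrative), used after SetupSingleInputSingleOutput() has registered the tensor names:

// 4-D Float32 output tensor; graph id 0 is the usual value in these tests.
std::vector<float> inputValues    = { 1.0f, 2.0f, 3.0f, 4.0f };
std::vector<float> expectedValues = { 2.0f, 4.0f, 6.0f, 8.0f };
RunTest<4, armnn::DataType::Float32>(0, inputValues, expectedValues);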

◆ RunTest() [2/4]

void RunTest ( unsigned int  layersId,
const std::vector< InputDataType > &  inputData,
const std::vector< OutputDataType > &  expectedOutputData 
)

Definition at line 181 of file ParserFlatbuffersSerializeFixture.hpp.

References m_SingleInputName, and m_SingleOutputName.

{
    RunTest<NumOutputDimensions, ArmnnInputType, ArmnnOutputType>(layersId,
                                                                  { { m_SingleInputName, inputData } },
                                                                  { { m_SingleOutputName, expectedOutputData } });
}
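This overload lets the input and output element types differ. A minimal sketch, assuming a hypothetical graph whose output is a 1-D Signed32 tensor computed from Float32 input:

// Float32 input, Signed32 output, e.g. a layer emitting integer indices.
std::vector<float>   inputValues    = { 0.5f, 2.5f, 1.5f };
std::vector<int32_t> expectedValues = { 1 };
RunTest<1, armnn::DataType::Float32, armnn::DataType::Signed32>(0, inputValues, expectedValues);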

◆ RunTest() [3/4]

void RunTest ( unsigned int  layersId,
const std::map< std::string, std::vector< DataType >> &  inputData,
const std::map< std::string, std::vector< DataType >> &  expectedOutputData 
)

Executes the network with the given input tensors and checks the results against the given output tensors. This overload supports multiple inputs and multiple outputs, identified by name.

Definition at line 191 of file ParserFlatbuffersSerializeFixture.hpp.

{
    RunTest<NumOutputDimensions, ArmnnType, ArmnnType, DataType, DataType>(layersId, inputData, expectedOutputData);
}
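A hedged example of the map-based overload, assuming a hypothetical graph with two named inputs and one named output of shape { 1, 2 }:

std::map<std::string, std::vector<float>> inputs =
{
    { "InputLayer1", { 1.0f, 2.0f } },
    { "InputLayer2", { 3.0f, 4.0f } }
};
std::map<std::string, std::vector<float>> expectedOutputs =
{
    { "OutputLayer", { 4.0f, 6.0f } }
};
// NumOutputDimensions = 2 because the (hypothetical) output tensor is 2-D.
RunTest<2, armnn::DataType::Float32>(0, inputs, expectedOutputs);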

◆ RunTest() [4/4]

void RunTest ( unsigned int  layersId,
const std::map< std::string, std::vector< InputDataType >> &  inputData,
const std::map< std::string, std::vector< OutputDataType >> &  expectedOutputData 
)

Definition at line 203 of file ParserFlatbuffersSerializeFixture.hpp.

References CompareTensors(), m_NetworkIdentifier, m_Parser, m_Runtime, and armnn::VerifyTensorInfoDataType().

{
    auto ConvertBindingInfo = [](const armnnDeserializer::BindingPointInfo& bindingInfo)
    {
        return std::make_pair(bindingInfo.m_BindingId, bindingInfo.m_TensorInfo);
    };

    // Setup the armnn input tensors from the given vectors.
    armnn::InputTensors inputTensors;
    for (auto&& it : inputData)
    {
        armnn::BindingPointInfo bindingInfo = ConvertBindingInfo(
            m_Parser->GetNetworkInputBindingInfo(layersId, it.first));
        armnn::VerifyTensorInfoDataType(bindingInfo.second, ArmnnInputType);
        inputTensors.push_back({ bindingInfo.first, armnn::ConstTensor(bindingInfo.second, it.second.data()) });
    }

    // Allocate storage for the output tensors to be written to and setup the armnn output tensors.
    std::map<std::string, boost::multi_array<OutputDataType, NumOutputDimensions>> outputStorage;
    armnn::OutputTensors outputTensors;
    for (auto&& it : expectedOutputData)
    {
        armnn::BindingPointInfo bindingInfo = ConvertBindingInfo(
            m_Parser->GetNetworkOutputBindingInfo(layersId, it.first));
        armnn::VerifyTensorInfoDataType(bindingInfo.second, ArmnnOutputType);
        outputStorage.emplace(it.first, MakeTensor<OutputDataType, NumOutputDimensions>(bindingInfo.second));
        outputTensors.push_back(
            { bindingInfo.first, armnn::Tensor(bindingInfo.second, outputStorage.at(it.first).data()) });
    }

    m_Runtime->EnqueueWorkload(m_NetworkIdentifier, inputTensors, outputTensors);

    // Compare each output tensor to the expected values.
    for (auto&& it : expectedOutputData)
    {
        armnn::BindingPointInfo bindingInfo = ConvertBindingInfo(
            m_Parser->GetNetworkOutputBindingInfo(layersId, it.first));
        auto outputExpected = MakeTensor<OutputDataType, NumOutputDimensions>(bindingInfo.second, it.second);
        BOOST_TEST(CompareTensors(outputExpected, outputStorage[it.first]));
    }
}

◆ Setup()

void Setup ( )
inline

Definition at line 48 of file ParserFlatbuffersSerializeFixture.hpp.

References armnn::CpuRef, armnn::Optimize(), ReadStringToBinary(), and armnn::Success.

Referenced by SetupSingleInputSingleOutput().

{
    bool ok = ReadStringToBinary();
    if (!ok)
    {
        throw armnn::Exception("LoadNetwork failed while reading binary input");
    }

    armnn::INetworkPtr network =
        m_Parser->CreateNetworkFromBinary(m_GraphBinary);

    if (!network)
    {
        throw armnn::Exception("The parser failed to create an ArmNN network");
    }

    auto optimized = Optimize(*network, { armnn::Compute::CpuRef },
                              m_Runtime->GetDeviceSpec());

    std::string errorMessage;
    armnn::Status ret = m_Runtime->LoadNetwork(m_NetworkIdentifier, move(optimized), errorMessage);

    if (ret != armnn::Status::Success)
    {
        throw armnn::Exception(
            boost::str(
                boost::format("The runtime failed to load the network. "
                              "Error was: %1%. in %2% [%3%:%4%]") %
                errorMessage %
                __func__ %
                __FILE__ %
                __LINE__));
    }
}

◆ SetupSingleInputSingleOutput()

void SetupSingleInputSingleOutput ( const std::string &  inputName,
const std::string &  outputName 
)
inline

Definition at line 84 of file ParserFlatbuffersSerializeFixture.hpp.

References Setup().

{
    // Store the input and output name so they don't need to be passed to the single-input-single-output RunTest().
    m_SingleInputName = inputName;
    m_SingleOutputName = outputName;
    Setup();
}
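A minimal sketch of the intended call order (layer names and values are hypothetical):

// Inside a derived fixture's constructor: store the names, then build and load the network.
SetupSingleInputSingleOutput("InputLayer", "OutputLayer");

// Later, in the test body, the name-free RunTest() overloads can be used directly.
RunTest<4, armnn::DataType::Float32>(0,
                                     { 1.0f, 2.0f, 3.0f, 4.0f },
                                     { 1.0f, 2.0f, 3.0f, 4.0f });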

Member Data Documentation

◆ m_GraphBinary

std::vector<uint8_t> m_GraphBinary

Definition at line 37 of file ParserFlatbuffersSerializeFixture.hpp.

◆ m_JsonString

std::string m_JsonString

Definition at line 38 of file ParserFlatbuffersSerializeFixture.hpp.

◆ m_NetworkIdentifier

armnn::NetworkId m_NetworkIdentifier

Definition at line 41 of file ParserFlatbuffersSerializeFixture.hpp.

Referenced by RunTest().

◆ m_Parser

std::unique_ptr<IDeserializer, void (*)(IDeserializer* parser)> m_Parser

Definition at line 39 of file ParserFlatbuffersSerializeFixture.hpp.

Referenced by RunTest().

◆ m_Runtime

armnn::IRuntimePtr m_Runtime

Definition at line 40 of file ParserFlatbuffersSerializeFixture.hpp.

Referenced by RunTest().

◆ m_SingleInputName

std::string m_SingleInputName

If the single-input-single-output overload of Setup() is called, these will store the input and output name so they don't need to be passed to the single-input-single-output overload of RunTest().

Definition at line 45 of file ParserFlatbuffersSerializeFixture.hpp.

Referenced by RunTest().

◆ m_SingleOutputName

std::string m_SingleOutputName

Definition at line 46 of file ParserFlatbuffersSerializeFixture.hpp.

Referenced by RunTest().


The documentation for this struct was generated from the following file:
ParserFlatbuffersSerializeFixture.hpp