ArmNN 20.08
Params Struct Reference

#include <InferenceModel.hpp>

Public Member Functions

 Params ()
 

Public Attributes

std::string m_ModelPath
 
std::vector< std::string > m_InputBindings
 
std::vector< armnn::TensorShape > m_InputShapes
 
std::vector< std::string > m_OutputBindings
 
std::vector< armnn::BackendId > m_ComputeDevices
 
std::string m_DynamicBackendsPath
 
size_t m_SubgraphId
 
bool m_IsModelBinary
 
bool m_VisualizePostOptimizationModel
 
bool m_EnableFp16TurboMode
 
bool m_EnableBf16TurboMode
 
bool m_PrintIntermediateLayers
 
bool m_ParseUnsupported
 
bool m_InferOutputShape
 

Detailed Description

Definition at line 83 of file InferenceModel.hpp.

Constructor & Destructor Documentation

◆ Params()

Params ( )
inline

Definition at line 100 of file InferenceModel.hpp.

References Params::m_EnableBf16TurboMode, Params::m_EnableFp16TurboMode, Params::m_InferOutputShape, Params::m_IsModelBinary, Params::m_ParseUnsupported, Params::m_PrintIntermediateLayers, Params::m_SubgraphId, and Params::m_VisualizePostOptimizationModel.
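
The following is a minimal sketch of how this struct is typically filled in before being handed to code such as CreateNetworkImpl< IParser >::Create(). It assumes InferenceModel.hpp is on the include path and that Params is visible at this scope; the model path, binding names, and backend name are placeholder values, not part of this reference.

#include "InferenceModel.hpp"
#include <armnn/BackendId.hpp>

Params MakeExampleParams()
{
    Params params;                                    // booleans and m_SubgraphId keep their Params() defaults
    params.m_ModelPath      = "model.tflite";         // placeholder model file
    params.m_IsModelBinary  = true;                   // treat the file as a binary model
    params.m_InputBindings  = { "input" };            // placeholder input tensor name
    params.m_OutputBindings = { "output" };           // placeholder output tensor name
    params.m_ComputeDevices = { armnn::BackendId("CpuRef") }; // placeholder backend
    params.m_SubgraphId     = 0;                      // first subgraph in the model
    return params;
}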

Member Data Documentation

◆ m_ComputeDevices

std::vector<armnn::BackendId> m_ComputeDevices

◆ m_DynamicBackendsPath

std::string m_DynamicBackendsPath

◆ m_EnableBf16TurboMode

bool m_EnableBf16TurboMode

◆ m_EnableFp16TurboMode

bool m_EnableFp16TurboMode

◆ m_InferOutputShape

bool m_InferOutputShape

◆ m_InputBindings

std::vector<std::string> m_InputBindings

◆ m_InputShapes

std::vector<armnn::TensorShape> m_InputShapes

Definition at line 87 of file InferenceModel.hpp.

Referenced by CreateNetworkImpl< IParser >::Create(), and MainImpl().
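
As a hedged illustration of populating this member, the snippet below appends one armnn::TensorShape per input binding to a Params object such as the one sketched above; the 1x224x224x3 dimensions are assumed placeholder values rather than anything required by this reference.

#include <armnn/Tensor.hpp>

// Placeholder dimensions for a hypothetical NHWC image input.
const unsigned int inputDims[] = { 1, 224, 224, 3 };
params.m_InputShapes.push_back(armnn::TensorShape(4, inputDims));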

◆ m_IsModelBinary

bool m_IsModelBinary

◆ m_ModelPath

std::string m_ModelPath

◆ m_OutputBindings

std::vector<std::string> m_OutputBindings

◆ m_ParseUnsupported

bool m_ParseUnsupported

◆ m_PrintIntermediateLayers

bool m_PrintIntermediateLayers

◆ m_SubgraphId

size_t m_SubgraphId

◆ m_VisualizePostOptimizationModel

bool m_VisualizePostOptimizationModel

The documentation for this struct was generated from the following file:

InferenceModel.hpp