ArmNN (NotReleased)
#include <InferenceModel.hpp>
Public Member Functions

std::vector< armnn::BackendId > GetComputeDevicesAsBackendIds ()

Public Attributes

std::string m_ModelDir
std::vector< std::string > m_ComputeDevices
std::string m_DynamicBackendsPath
bool m_VisualizePostOptimizationModel
bool m_EnableFp16TurboMode
std::string m_Labels
Definition at line 324 of file InferenceModel.hpp.
Member Function Documentation

GetComputeDevicesAsBackendIds()

std::vector< armnn::BackendId > GetComputeDevicesAsBackendIds () (inline)

Definition at line 333 of file InferenceModel.hpp.
Member Data Documentation

std::vector< std::string > m_ComputeDevices

Definition at line 327 of file InferenceModel.hpp.
Referenced by InferenceModel< IParser, TDataType >::AddCommandLineOptions().

std::string m_DynamicBackendsPath

Definition at line 328 of file InferenceModel.hpp.
Referenced by InferenceModel< IParser, TDataType >::AddCommandLineOptions().

bool m_EnableFp16TurboMode

Definition at line 330 of file InferenceModel.hpp.
Referenced by InferenceModel< IParser, TDataType >::AddCommandLineOptions().

std::string m_Labels

Definition at line 331 of file InferenceModel.hpp.
Referenced by InferenceModel< IParser, TDataType >::AddCommandLineOptions().

std::string m_ModelDir

Definition at line 326 of file InferenceModel.hpp.
Referenced by InferenceModel< IParser, TDataType >::AddCommandLineOptions().

bool m_VisualizePostOptimizationModel

Definition at line 329 of file InferenceModel.hpp.
Referenced by InferenceModel< IParser, TDataType >::AddCommandLineOptions().