ArmNN 23.05
IRuntime Class Reference

#include <IRuntime.hpp>

Classes

struct  CreationOptions
 

Public Member Functions

Status LoadNetwork (NetworkId &networkIdOut, IOptimizedNetworkPtr network)
 Loads a complete network into the IRuntime. More...
 
Status LoadNetwork (NetworkId &networkIdOut, IOptimizedNetworkPtr network, std::string &errorMessage)
 Loads a complete network into the IRuntime. More...
 
Status LoadNetwork (NetworkId &networkIdOut, IOptimizedNetworkPtr network, std::string &errorMessage, const INetworkProperties &networkProperties)
 
TensorInfo GetInputTensorInfo (NetworkId networkId, LayerBindingId layerId) const
 
TensorInfo GetOutputTensorInfo (NetworkId networkId, LayerBindingId layerId) const
 
std::vector< ImportedInputId > ImportInputs (NetworkId networkId, const InputTensors &inputTensors, MemorySource forceImportMemorySource=MemorySource::Undefined)
 ImportInputs separates the importing and mapping of InputTensors from network execution. More...
 
std::vector< ImportedOutputId > ImportOutputs (NetworkId networkId, const OutputTensors &outputTensors, MemorySource forceImportMemorySource=MemorySource::Undefined)
 ImportOutputs separates the importing and mapping of OutputTensors from network execution. More...
 
void ClearImportedInputs (NetworkId networkId, const std::vector< ImportedInputId > inputIds)
 Un-imports and deletes the imported InputTensor(s). This function is not thread-safe and must not be used while other threads are calling Execute(). More...
 
void ClearImportedOutputs (NetworkId networkId, const std::vector< ImportedOutputId > outputIds)
 Un-imports and deletes the imported OutputTensor(s). This function is not thread-safe and must not be used while other threads are calling Execute(). More...
 
Status EnqueueWorkload (NetworkId networkId, const InputTensors &inputTensors, const OutputTensors &outputTensors, std::vector< ImportedInputId > preImportedInputIds={}, std::vector< ImportedOutputId > preImportedOutputIds={})
 Evaluates a network using input in inputTensors and outputs filled into outputTensors. More...
 
Status Execute (IWorkingMemHandle &workingMemHandle, const InputTensors &inputTensors, const OutputTensors &outputTensors, std::vector< ImportedInputId > preImportedInputs={}, std::vector< ImportedOutputId > preImportedOutputs={})
 This is an experimental function. More...
 
Status UnloadNetwork (NetworkId networkId)
 Unloads a network from the IRuntime. More...
 
const IDeviceSpec & GetDeviceSpec () const
 
std::unique_ptr< IWorkingMemHandle > CreateWorkingMemHandle (NetworkId networkId)
 Create a new unique WorkingMemHandle object. More...
 
const std::shared_ptr< IProfiler > GetProfiler (NetworkId networkId) const
 Gets the profiler corresponding to the given network id. More...
 
void RegisterDebugCallback (NetworkId networkId, const DebugCallbackFunction &func)
 Registers a callback function to debug layers performing custom computations on intermediate tensors. More...
 

Static Public Member Functions

static IRuntime * CreateRaw (const CreationOptions &options)
 
static IRuntimePtr Create (const CreationOptions &options)
 
static void Destroy (IRuntime *runtime)
 

Protected Member Functions

 IRuntime ()
 
 IRuntime (const IRuntime::CreationOptions &options)
 
 ~IRuntime ()
 

Protected Attributes

std::unique_ptr< RuntimeImpl > pRuntimeImpl
 

Detailed Description

Definition at line 82 of file IRuntime.hpp.
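
A typical usage pattern is: create the runtime, optimize an INetwork for the available backends, load the optimized network to obtain a NetworkId, run inference with EnqueueWorkload(), and unload the network when finished. The following is a minimal sketch of that flow; network construction is elided and CpuRef is only an illustrative backend choice.

#include <armnn/ArmNN.hpp>
#include <vector>

int main()
{
    // Create the runtime. IRuntimePtr destroys it automatically via IRuntime::Destroy().
    armnn::IRuntime::CreationOptions options;
    armnn::IRuntimePtr runtime = armnn::IRuntime::Create(options);

    // Build or deserialize an INetwork (layers omitted here), then optimize it
    // for the preferred backends using the runtime's device spec.
    armnn::INetworkPtr network = armnn::INetwork::Create();
    // ... add layers to 'network' ...
    std::vector<armnn::BackendId> backends = { armnn::Compute::CpuRef };
    armnn::IOptimizedNetworkPtr optNet = armnn::Optimize(*network, backends, runtime->GetDeviceSpec());

    // Load the optimized network; the runtime takes ownership and returns a NetworkId.
    armnn::NetworkId networkId;
    if (runtime->LoadNetwork(networkId, std::move(optNet)) != armnn::Status::Success)
    {
        return 1;
    }

    // ... run inference with EnqueueWorkload() (see below) ...

    runtime->UnloadNetwork(networkId);
    return 0;
}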

Constructor & Destructor Documentation

◆ IRuntime() [1/2]

IRuntime ( )
protected

Definition at line 41 of file Runtime.cpp.

Referenced by IRuntime::CreateRaw().

◆ IRuntime() [2/2]

IRuntime ( const IRuntime::CreationOptions &  options)
protected

Definition at line 43 of file Runtime.cpp.

43 : pRuntimeImpl(new RuntimeImpl(options)) {}

◆ ~IRuntime()

~IRuntime ( )
protected default

Member Function Documentation

◆ ClearImportedInputs()

void ClearImportedInputs ( NetworkId  networkId,
const std::vector< ImportedInputId >  inputIds 
)

Un-imports and deletes the imported InputTensor(s). This function is not thread-safe and must not be used while other threads are calling Execute().

Only compatible with AsyncEnabled networks.

Definition at line 104 of file Runtime.cpp.

105 {
106  return pRuntimeImpl->ClearImportedInputs(networkId, inputIds);
107 }

References IRuntime::pRuntimeImpl.

◆ ClearImportedOutputs()

void ClearImportedOutputs ( NetworkId  networkId,
const std::vector< ImportedOutputId >  outputIds 
)

Un-imports and deletes the imported OutputTensor(s). This function is not thread-safe and must not be used while other threads are calling Execute().

Only compatible with AsyncEnabled networks.

Definition at line 108 of file Runtime.cpp.

109 {
110  return pRuntimeImpl->ClearImportedOutputs(networkId, outputIds);
111 }

References IRuntime::pRuntimeImpl.

◆ Create()

IRuntimePtr Create ( const CreationOptions &  options)
static

◆ CreateRaw()

IRuntime * CreateRaw ( const CreationOptions &  options)
static

Definition at line 47 of file Runtime.cpp.

48 {
49  return new IRuntime(options);
50 }

References IRuntime::IRuntime().

Referenced by IRuntime::Create().

◆ CreateWorkingMemHandle()

std::unique_ptr< IWorkingMemHandle > CreateWorkingMemHandle ( NetworkId  networkId)

Create a new unique WorkingMemHandle object.

Create multiple handles if you wish to overlap execution by calling this function from different threads.

Definition at line 146 of file Runtime.cpp.

147 {
148  return pRuntimeImpl->CreateWorkingMemHandle(networkId);
149 }

References IRuntime::pRuntimeImpl.

◆ Destroy()

void Destroy ( IRuntime *  runtime)
static

Definition at line 57 of file Runtime.cpp.

58 {
59  delete runtime;
60 }

Referenced by IRuntime::Create().
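
Create() wraps the pointer returned by CreateRaw() in an IRuntimePtr whose deleter calls Destroy(), so explicit cleanup is only needed when CreateRaw() is used directly. A short sketch of both ownership styles:

#include <armnn/ArmNN.hpp>

void RuntimeOwnershipExamples()
{
    armnn::IRuntime::CreationOptions options;

    // Preferred: Create() returns an IRuntimePtr that calls IRuntime::Destroy() on scope exit.
    armnn::IRuntimePtr managed = armnn::IRuntime::Create(options);

    // Manual: CreateRaw() returns a raw pointer that must be released with Destroy().
    armnn::IRuntime* raw = armnn::IRuntime::CreateRaw(options);
    // ... use 'raw' ...
    armnn::IRuntime::Destroy(raw);
}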

◆ EnqueueWorkload()

Status EnqueueWorkload ( NetworkId  networkId,
const InputTensors &  inputTensors,
const OutputTensors &  outputTensors,
std::vector< ImportedInputId >  preImportedInputIds = {},
std::vector< ImportedOutputId >  preImportedOutputIds = {} 
)

Evaluates a network using input in inputTensors and outputs filled into outputTensors.

Definition at line 113 of file Runtime.cpp.

118 {
119  return pRuntimeImpl->EnqueueWorkload(networkId, inputTensors, outputTensors,
120  preImportedInputIds, preImportedOutputIds);
121 }

References IRuntime::pRuntimeImpl.

Referenced by ArmnnPreparedModel::ExecuteGraph().
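
A sketch of a synchronous inference call, assuming the loaded network has a single float input and output both bound at LayerBindingId 0 (the binding ids are an illustrative assumption):

#include <armnn/ArmNN.hpp>
#include <vector>

armnn::Status RunInference(armnn::IRuntime& runtime, armnn::NetworkId networkId,
                           const std::vector<float>& inputData, std::vector<float>& outputData)
{
    // Query the tensor shapes the loaded network expects at binding id 0.
    armnn::TensorInfo inputInfo  = runtime.GetInputTensorInfo(networkId, 0);
    armnn::TensorInfo outputInfo = runtime.GetOutputTensorInfo(networkId, 0);

    // ConstTensor requires the TensorInfo to be flagged as constant in recent ArmNN releases.
    inputInfo.SetConstant(true);

    outputData.resize(outputInfo.GetNumElements());

    // InputTensors/OutputTensors are vectors of (LayerBindingId, tensor) pairs.
    armnn::InputTensors inputTensors
    {
        { 0, armnn::ConstTensor(inputInfo, inputData.data()) }
    };
    armnn::OutputTensors outputTensors
    {
        { 0, armnn::Tensor(outputInfo, outputData.data()) }
    };

    // Blocks until the workload has been evaluated.
    return runtime.EnqueueWorkload(networkId, inputTensors, outputTensors);
}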

◆ Execute()

Status Execute ( IWorkingMemHandle &  workingMemHandle,
const InputTensors &  inputTensors,
const OutputTensors &  outputTensors,
std::vector< ImportedInputId >  preImportedInputs = {},
std::vector< ImportedOutputId >  preImportedOutputs = {} 
)

This is an experimental function.

Evaluates a network using input in inputTensors and fills outputTensors with the results. This function performs a thread-safe execution of the network and returns once execution is complete. It will block until this and any other thread using the same working memory handle completes.

Definition at line 123 of file Runtime.cpp.

128 {
129  return pRuntimeImpl->Execute(workingMemHandle,
130  inputTensors,
131  outputTensors,
132  preImportedInputs,
133  preImportedOutputs);
134 }

References IRuntime::pRuntimeImpl.
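
A sketch of the experimental asynchronous path. It assumes the network was loaded with an async-enabled INetworkProperties (the LoadNetwork() overload taking network properties) and gives each thread its own IWorkingMemHandle so executions can overlap:

#include <armnn/ArmNN.hpp>
#include <armnn/IWorkingMemHandle.hpp>
#include <memory>
#include <thread>

void ExecuteConcurrently(armnn::IRuntime& runtime, armnn::NetworkId networkId,
                         const armnn::InputTensors& in0, const armnn::OutputTensors& out0,
                         const armnn::InputTensors& in1, const armnn::OutputTensors& out1)
{
    // One working memory handle per thread allows overlapped execution of the same network.
    std::unique_ptr<armnn::IWorkingMemHandle> handle0 = runtime.CreateWorkingMemHandle(networkId);
    std::unique_ptr<armnn::IWorkingMemHandle> handle1 = runtime.CreateWorkingMemHandle(networkId);

    std::thread t0([&] { runtime.Execute(*handle0, in0, out0); });
    std::thread t1([&] { runtime.Execute(*handle1, in1, out1); });
    t0.join();
    t1.join();
}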

◆ GetDeviceSpec()

const IDeviceSpec & GetDeviceSpec ( ) const

Definition at line 141 of file Runtime.cpp.

142 {
143  return pRuntimeImpl->GetDeviceSpec();
144 }

References IRuntime::pRuntimeImpl.

◆ GetInputTensorInfo()

armnn::TensorInfo GetInputTensorInfo ( NetworkId  networkId,
LayerBindingId  layerId 
) const

Definition at line 82 of file Runtime.cpp.

83 {
84  return pRuntimeImpl->GetInputTensorInfo(networkId, layerId);
85 }

References IRuntime::pRuntimeImpl.

Referenced by ArmnnPreparedModel::ExecuteWithDummyInputs().

◆ GetOutputTensorInfo()

armnn::TensorInfo GetOutputTensorInfo ( NetworkId  networkId,
LayerBindingId  layerId 
) const

Definition at line 87 of file Runtime.cpp.

88 {
89  return pRuntimeImpl->GetOutputTensorInfo(networkId, layerId);
90 }

References IRuntime::pRuntimeImpl.

Referenced by ArmnnPreparedModel::ExecuteWithDummyInputs().

◆ GetProfiler()

const std::shared_ptr< IProfiler > GetProfiler ( NetworkId  networkId) const

Gets the profiler corresponding to the given network id.

Parameters
networkId - The id of the network for which to get the profile.
Returns
A pointer to the requested profiler, or nullptr if not found.

Definition at line 151 of file Runtime.cpp.

152 {
153  return pRuntimeImpl->GetProfiler(networkId);
154 }

References IRuntime::pRuntimeImpl.

Referenced by ArmnnPreparedModel::~ArmnnPreparedModel().
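
A short sketch of retrieving and printing the profiler for a loaded network. It assumes profiling is wanted on this network and uses the IProfiler interface's EnableProfiling() and Print() methods:

#include <armnn/ArmNN.hpp>
#include <armnn/IProfiler.hpp>
#include <iostream>

void PrintProfilingData(armnn::IRuntime& runtime, armnn::NetworkId networkId)
{
    // GetProfiler() returns nullptr if no profiler exists for the given network id.
    std::shared_ptr<armnn::IProfiler> profiler = runtime.GetProfiler(networkId);
    if (profiler)
    {
        profiler->EnableProfiling(true);
        // ... run one or more inferences so there is something to report ...
        profiler->Print(std::cout);
    }
}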

◆ ImportInputs()

std::vector< ImportedInputId > ImportInputs ( NetworkId  networkId,
const InputTensors &  inputTensors,
MemorySource  forceImportMemorySource = MemorySource::Undefined 
)

ImportInputs separates the importing and mapping of InputTensors from network execution.

This allows a set of InputTensors to be imported and mapped once but used in execution many times. This function is not thread-safe and must not be used while other threads are calling Execute(). No exceptions are thrown for failed imports. It is the caller's responsibility to check whether tensors have been successfully imported by comparing returned ids with those passed in the InputTensors. Whether a tensor can be imported or not is backend specific.

Definition at line 92 of file Runtime.cpp.

94 {
95  return pRuntimeImpl->ImportInputs(networkId, inputTensors, forceImportMemorySource);
96 }

References IRuntime::pRuntimeImpl.

Referenced by ArmnnPreparedModel::ExecuteGraph().
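
A sketch of pre-importing input buffers and checking which imports succeeded. MemorySource::Malloc is an illustrative choice; whether a tensor can be imported is backend specific. The returned ids can later be passed to EnqueueWorkload()/Execute() as preImportedInputIds and released with ClearImportedInputs():

#include <armnn/ArmNN.hpp>
#include <vector>

std::vector<armnn::ImportedInputId> PreImportInputs(armnn::IRuntime& runtime,
                                                    armnn::NetworkId networkId,
                                                    const armnn::InputTensors& inputTensors)
{
    // No exception is thrown for failed imports; compare the returned ids against
    // the tensors passed in to see which were actually imported.
    std::vector<armnn::ImportedInputId> importedIds =
        runtime.ImportInputs(networkId, inputTensors, armnn::MemorySource::Malloc);

    if (importedIds.size() != inputTensors.size())
    {
        // Some tensors could not be imported; fall back to copying those at execution time.
    }
    return importedIds;
}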

◆ ImportOutputs()

std::vector< ImportedOutputId > ImportOutputs ( NetworkId  networkId,
const OutputTensors &  outputTensors,
MemorySource  forceImportMemorySource = MemorySource::Undefined 
)

ImportOutputs separates the importing and mapping of OutputTensors from network execution.

This allows a set of OutputTensors to be imported and mapped once but used in execution many times. This function is not thread-safe and must not be used while other threads are calling Execute(). No exceptions are thrown for failed imports. It is the caller's responsibility to check whether tensors have been successfully imported by comparing returned ids with those passed in the OutputTensors. Whether a tensor can be imported or not is backend specific.

Definition at line 98 of file Runtime.cpp.

100 {
101  return pRuntimeImpl->ImportOutputs(networkId, outputTensors, forceImportMemorySource);
102 }

References IRuntime::pRuntimeImpl.

Referenced by ArmnnPreparedModel::ExecuteGraph().

◆ LoadNetwork() [1/3]

Status LoadNetwork ( NetworkId &  networkIdOut,
IOptimizedNetworkPtr  network 
)

Loads a complete network into the IRuntime.

Parameters
[out] networkIdOut - Unique identifier for the network is returned in this reference.
[in] network - Complete network to load into the IRuntime. The runtime takes ownership of the network once passed in.
Returns
armnn::Status

Definition at line 62 of file Runtime.cpp.

63 {
64  return pRuntimeImpl->LoadNetwork(networkIdOut, std::move(network));
65 }

References IRuntime::pRuntimeImpl.

◆ LoadNetwork() [2/3]

Status LoadNetwork ( NetworkId &  networkIdOut,
IOptimizedNetworkPtr  network,
std::string &  errorMessage 
)

Loads a complete network into the IRuntime.

Parameters
[out] networkIdOut - Unique identifier for the network is returned in this reference.
[in] network - Complete network to load into the IRuntime.
[out] errorMessage - Error message if there were any errors. The runtime takes ownership of the network once passed in.
Returns
armnn::Status

Definition at line 67 of file Runtime.cpp.

70 {
71  return pRuntimeImpl->LoadNetwork(networkIdOut, std::move(network), errorMessage);
72 }

References IRuntime::pRuntimeImpl.
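
A sketch of using this overload to surface load failures through the errorMessage argument:

#include <armnn/ArmNN.hpp>
#include <iostream>
#include <string>

bool LoadWithDiagnostics(armnn::IRuntime& runtime,
                         armnn::IOptimizedNetworkPtr optNet,
                         armnn::NetworkId& networkIdOut)
{
    std::string errorMessage;
    armnn::Status status = runtime.LoadNetwork(networkIdOut, std::move(optNet), errorMessage);
    if (status != armnn::Status::Success)
    {
        std::cerr << "LoadNetwork failed: " << errorMessage << std::endl;
        return false;
    }
    return true;
}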

◆ LoadNetwork() [3/3]

Status LoadNetwork ( NetworkId &  networkIdOut,
IOptimizedNetworkPtr  network,
std::string &  errorMessage,
const INetworkProperties networkProperties 
)

Definition at line 74 of file Runtime.cpp.

78 {
79  return pRuntimeImpl->LoadNetwork(networkIdOut, std::move(network), errorMessage, networkProperties);
80 }

References IRuntime::pRuntimeImpl.

◆ RegisterDebugCallback()

void RegisterDebugCallback ( NetworkId  networkId,
const DebugCallbackFunction &  func 
)

Registers a callback function to debug layers performing custom computations on intermediate tensors.

Parameters
networkId - The id of the network to register the callback.
func - Callback function to pass to the debug layer.

Definition at line 156 of file Runtime.cpp.

157 {
158  return pRuntimeImpl->RegisterDebugCallback(networkId, func);
159 }

References IRuntime::pRuntimeImpl.
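
A sketch of registering a debug callback. It assumes the network was optimized with debugging enabled (so Debug layers are inserted) and uses the DebugCallbackFunction signature from armnn/Types.hpp:

#include <armnn/ArmNN.hpp>
#include <iostream>

void AttachDebugCallback(armnn::IRuntime& runtime, armnn::NetworkId networkId)
{
    // Invoked by Debug layers with the originating layer's guid, the output slot index
    // and a handle to the intermediate tensor.
    armnn::DebugCallbackFunction callback =
        [](armnn::LayerGuid /*guid*/, unsigned int slotIndex, armnn::ITensorHandle* /*tensorHandle*/)
        {
            std::cout << "Debug layer fired for output slot " << slotIndex << std::endl;
        };

    runtime.RegisterDebugCallback(networkId, callback);
}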

◆ UnloadNetwork()

Status UnloadNetwork ( NetworkId  networkId)

Unloads a network from the IRuntime.

At the moment this only removes the network from the m_Impl->m_Network. This might need more work in the future to be AndroidNN compliant.

Parameters
[in] networkId - Unique identifier for the network to be unloaded. Generated in LoadNetwork().
Returns
armnn::Status

Definition at line 136 of file Runtime.cpp.

137 {
138  return pRuntimeImpl->UnloadNetwork(networkId);
139 }

References IRuntime::pRuntimeImpl.

Referenced by ArmnnPreparedModel::~ArmnnPreparedModel().

Member Data Documentation

◆ pRuntimeImpl

std::unique_ptr< RuntimeImpl > pRuntimeImpl
protected

Definition at line 302 of file IRuntime.hpp.

The documentation for this class was generated from the following files:
IRuntime.hpp
Runtime.cpp