ArmNN 21.05
WorkingMemHandle Class Reference (final)

#include <WorkingMemHandle.hpp>

Inheritance diagram for WorkingMemHandle:
IWorkingMemHandle

Public Member Functions

 WorkingMemHandle (NetworkId networkId, std::vector< WorkingMemDescriptor > workingMemDescriptors, std::unordered_map< LayerGuid, WorkingMemDescriptor > workingMemDescriptorMap, std::vector< std::shared_ptr< IMemoryManager >> memoryManagers, std::unordered_map< LayerGuid, std::vector< std::unique_ptr< ITensorHandle > > > ownedTensorHandles)
 
 ~WorkingMemHandle ()
 
NetworkId GetNetworkId () override
 Returns the NetworkId of the Network that this IWorkingMemHandle works with. More...
 
profiling::ProfilingGuid GetInferenceId () override
 Returns the InferenceId of the Inference that this IWorkingMemHandle works with. More...
 
void Allocate () override
 Allocate the backing memory required for execution. More...
 
void Free () override
 Free the backing memory required for execution. The mutex must be locked. More...
 
bool IsAllocated () override
 IsAllocated returns true if the backing memory is currently allocated. The mutex must be locked. More...
 
std::mutex & GetMutex () override
 Get a mutex which can be used for synchronizing access to the WorkingMemHandle object. More...
 
WorkingMemDescriptor & GetWorkingMemDescriptor (LayerGuid id) override
 Get the WorkingMemDescriptor for a Layer. The mutex must be locked. More...
 
WorkingMemDescriptor & GetWorkingMemDescriptorAt (unsigned int id) override
 Get the WorkingMemDescriptor at an index. More...
 
- Public Member Functions inherited from IWorkingMemHandle
virtual ~IWorkingMemHandle ()
 

Detailed Description

Definition at line 23 of file WorkingMemHandle.hpp.

Constructor & Destructor Documentation

◆ WorkingMemHandle()

WorkingMemHandle ( NetworkId  networkId,
std::vector< WorkingMemDescriptor >  workingMemDescriptors,
std::unordered_map< LayerGuid, WorkingMemDescriptor >  workingMemDescriptorMap,
std::vector< std::shared_ptr< IMemoryManager >>  memoryManagers,
std::unordered_map< LayerGuid, std::vector< std::unique_ptr< ITensorHandle > > >  ownedTensorHandles 
)

Definition at line 17 of file WorkingMemHandle.cpp.

WorkingMemHandle::WorkingMemHandle(NetworkId networkId,
                                   std::vector<WorkingMemDescriptor> workingMemDescriptors,
                                   std::unordered_map<LayerGuid, WorkingMemDescriptor> workingMemDescriptorMap,
                                   std::vector<std::shared_ptr<IMemoryManager>> memoryManagers,
                                   std::unordered_map<LayerGuid, std::vector<std::unique_ptr<ITensorHandle>>> ownedTensorHandles) :
    m_NetworkId(networkId),
    m_WorkingMemDescriptors(workingMemDescriptors),
    m_WorkingMemDescriptorMap(workingMemDescriptorMap),
    m_MemoryManagers(memoryManagers),
    m_OwnedTensorHandles(std::move(ownedTensorHandles)),
    m_IsAllocated(false),
    m_Mutex(),
    m_InferenceId(profiling::ProfilingService::GetNextGuid())
{
}

◆ ~WorkingMemHandle()

~WorkingMemHandle ( )
inline

Definition at line 33 of file WorkingMemHandle.hpp.

References WorkingMemHandle::Free().

{ Free(); }

Member Function Documentation

◆ Allocate()

void Allocate ( )
override virtual

Allocate the backing memory required for execution.

If this is not called, then allocation will be deferred to execution time. The mutex must be locked.

Implements IWorkingMemHandle.

Definition at line 34 of file WorkingMemHandle.cpp.

Referenced by LoadedNetwork::Execute(), and WorkingMemHandle::GetInferenceId().

{
    if (m_IsAllocated)
    {
        return;
    }
    m_IsAllocated = true;

    for (auto& mgr : m_MemoryManagers)
    {
        mgr->Acquire();
    }
}
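
Allocation is otherwise deferred to execution time, so a caller that wants the backing memory acquired up front can request it explicitly before any work is scheduled. A minimal sketch, assuming the ArmNN headers and namespaces are in scope and a handle already created by the runtime; the helper name is hypothetical:

#include <mutex>

// Hypothetical helper: acquire the backing memory ahead of time so that
// allocation is not deferred to execution time.
void PreAllocate(IWorkingMemHandle& handle)
{
    // Allocate() requires the handle's mutex to be held.
    std::lock_guard<std::mutex> lock(handle.GetMutex());
    if (!handle.IsAllocated())
    {
        handle.Allocate();
    }
}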

◆ Free()

void Free ( )
override virtual

Free the backing memory required for execution. The mutex must be locked.

Implements IWorkingMemHandle.

Definition at line 48 of file WorkingMemHandle.cpp.

Referenced by WorkingMemHandle::GetInferenceId(), and WorkingMemHandle::~WorkingMemHandle().

{
    if (!m_IsAllocated)
    {
        return;
    }
    m_IsAllocated = false;

    for (auto& mgr : m_MemoryManagers)
    {
        mgr->Release();
    }
}
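
Free() is also called from ~WorkingMemHandle(), so an explicit call is only needed when the handle is kept alive for reuse and its backing memory should be released early. A minimal sketch under the same assumptions as above; the helper name is hypothetical:

#include <mutex>

// Hypothetical helper: release the backing memory once the inference that
// used this handle has completed (the destructor would otherwise do this).
void ReleaseBackingMemory(IWorkingMemHandle& handle)
{
    // Free() requires the handle's mutex to be held.
    std::lock_guard<std::mutex> lock(handle.GetMutex());
    if (handle.IsAllocated())
    {
        handle.Free();
    }
}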

◆ GetInferenceId()

profiling::ProfilingGuid GetInferenceId ( )
inline override virtual

Returns the InferenceId of the Inference that this IWorkingMemHandle works with.

Implements IWorkingMemHandle.

Definition at line 41 of file WorkingMemHandle.hpp.

References WorkingMemHandle::Allocate(), and WorkingMemHandle::Free().

{
    return m_InferenceId;
}

◆ GetMutex()

std::mutex& GetMutex ( )
inline override virtual

Get a mutex which can be used for synchronizing access to the WorkingMemHandle object.

Implements IWorkingMemHandle.

Definition at line 60 of file WorkingMemHandle.hpp.

Referenced by LoadedNetwork::Execute().

{
    return m_Mutex;
}

◆ GetNetworkId()

NetworkId GetNetworkId ( )
inline override virtual

Returns the NetworkId of the Network that this IWorkingMemHandle works with.

Implements IWorkingMemHandle.

Definition at line 36 of file WorkingMemHandle.hpp.

{
    return m_NetworkId;
}

◆ GetWorkingMemDescriptor()

WorkingMemDescriptor& GetWorkingMemDescriptor ( LayerGuid  id)
inline override virtual

Get the WorkingMemDescriptor for a Layer. The mutex must be locked.

Implements IWorkingMemHandle.

Definition at line 66 of file WorkingMemHandle.hpp.

References ARMNN_ASSERT.

Referenced by LoadedNetwork::Schedule().

{
    auto result = m_WorkingMemDescriptorMap.find(id);
    ARMNN_ASSERT(result != m_WorkingMemDescriptorMap.end());
    return result->second;
}
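
Lookups through this accessor must also be performed with the handle's mutex held. A minimal sketch, assuming a LayerGuid obtained elsewhere and assuming WorkingMemDescriptor exposes m_Inputs/m_Outputs tensor-handle vectors (an assumption of this example, not something documented on this page); the helper name is hypothetical:

#include <mutex>

// Hypothetical helper: inspect the working memory assigned to one layer.
void InspectLayerWorkingMem(IWorkingMemHandle& handle, LayerGuid layerGuid)
{
    // GetWorkingMemDescriptor() requires the handle's mutex to be held.
    std::lock_guard<std::mutex> lock(handle.GetMutex());
    WorkingMemDescriptor& descriptor = handle.GetWorkingMemDescriptor(layerGuid);

    // m_Inputs / m_Outputs are assumed members holding ITensorHandle pointers.
    auto numInputs  = descriptor.m_Inputs.size();
    auto numOutputs = descriptor.m_Outputs.size();
    (void)numInputs;
    (void)numOutputs;
}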

◆ GetWorkingMemDescriptorAt()

WorkingMemDescriptor& GetWorkingMemDescriptorAt ( unsigned int  id)
inline override virtual

Get the WorkingMemDescriptor at an index.

The WorkingMemDescriptors are stored in the same order as the Workloads in a topologically sorted graph. The mutex must be locked.

Implements IWorkingMemHandle.

Definition at line 75 of file WorkingMemHandle.hpp.

Referenced by LoadedNetwork::Execute().

{
    return m_WorkingMemDescriptors[id];
}
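
Because the descriptors follow the workload order of the topologically sorted graph, index-based access walks working memory in execution order. A minimal sketch, assuming the caller already knows the number of workloads (no size accessor is documented on this page); the helper name is hypothetical:

#include <mutex>

// Hypothetical helper: visit descriptors in the order the workloads execute.
void WalkWorkingMem(IWorkingMemHandle& handle, unsigned int numWorkloads)
{
    // GetWorkingMemDescriptorAt() requires the handle's mutex to be held.
    std::lock_guard<std::mutex> lock(handle.GetMutex());
    for (unsigned int i = 0; i < numWorkloads; ++i)
    {
        WorkingMemDescriptor& descriptor = handle.GetWorkingMemDescriptorAt(i);
        (void)descriptor; // e.g. bind or inspect tensor handles here
    }
}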

◆ IsAllocated()

bool IsAllocated ( )
inline override virtual

IsAllocated returns true if the backing memory is currently allocated. The mutex must be locked.

Implements IWorkingMemHandle.

Definition at line 54 of file WorkingMemHandle.hpp.

Referenced by LoadedNetwork::Execute().

{
    return m_IsAllocated;
}

The documentation for this class was generated from the following files:

WorkingMemHandle.hpp
WorkingMemHandle.cpp