commit    b1dc0c560d60d268b95935373604afa0425fc48d
tree      0c82a4ac606839fa2e4dc9ec5a32850853708b63
parent    18c9af7c7062a17a7eca8c58d1260896d165bc0a
author    Matteo Martincigh <matteo.martincigh@arm.com>  2018-08-24 18:57:17 +0100
committer Matthew Bentham <matthew.bentham@arm.com>      2018-09-18 12:40:20 +0100
IVGCVSW-1513 Changed when the profiling info is saved to file in the
Android NN Driver

* Moved the profiling enable to the ArmnnPreparedModel constructor
* Moved the dump of the profiling info to the ArmnnPreparedModel
  destructor

Change-Id: Ide208688bcf9d0cd25fa23890a059e523b808902
 ArmnnPreparedModel.cpp | 10 ----------
 1 file changed, 0 insertions(+), 10 deletions(-)
diff --git a/ArmnnPreparedModel.cpp b/ArmnnPreparedModel.cpp
index 1395377e..d338fdc8 100644
--- a/ArmnnPreparedModel.cpp
+++ b/ArmnnPreparedModel.cpp
@@ -227,11 +227,6 @@ void ArmnnPreparedModel::ExecuteGraph(std::shared_ptr<std::vector<::android::nn:
 {
     ALOGV("ArmnnPreparedModel::ExecuteGraph(...)");
 
-    if (m_GpuProfilingEnabled)
-    {
-        m_Runtime->GetProfiler(m_NetworkId)->EnableProfiling(true);
-    }
-
     DumpTensorsIfRequired("Input", *pInputTensors);
 
     // run it
@@ -248,11 +243,6 @@ void ArmnnPreparedModel::ExecuteGraph(std::shared_ptr<std::vector<::android::nn:
     DumpTensorsIfRequired("Output", *pOutputTensors);
 
-    if (m_GpuProfilingEnabled && !m_RequestInputsAndOutputsDumpDir.empty())
-    {
-        DumpJsonProfiling(m_RequestInputsAndOutputsDumpDir, m_Runtime, m_NetworkId);
-    }
-
     // Commit output buffers.
     // Note that we update *all* pools, even if they aren't actually used as outputs -
     // this is simpler and is what the CpuExecutor does.