ONE - On-device Neural Engine
onert::backend::acl_cl::Optimizer Class Reference

#include <Optimizer.h>

Public Member Functions

 Optimizer (BackendContext *context)
 
void optimize ()
 

Detailed Description

Backend-specific graph optimizer for the acl_cl backend. Its single pass, optimize(), analyzes the graph with acl_common::AclSubTensorAnalyzer and hands the resulting parent map to the TensorBuilder, so that the inputs of Concat operations can be laid out as subtensors of their output instead of being copied (concat elimination).

Definition at line 30 of file Optimizer.h.
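
A minimal usage sketch (not taken from the onert sources) of how a backend might drive this class while preparing a graph. Only Optimizer and optimize() come from this page; prepareGraph and the surrounding flow are assumptions.

    // Hypothetical driver: construct the optimizer with the backend's context and
    // run its single pass before tensors are planned and allocated.
    #include "Optimizer.h"

    namespace onert::backend::acl_cl
    {

    void prepareGraph(BackendContext *context) // assumed helper, not part of the API
    {
      Optimizer optimizer{context};
      optimizer.optimize(); // builds the subtensor parent map used by the TensorBuilder
    }

    } // namespace onert::backend::acl_cl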

Constructor & Destructor Documentation

◆ Optimizer()

onert::backend::acl_cl::Optimizer::Optimizer (BackendContext *context)

Definition at line 33 of file Optimizer.cc.

Optimizer::Optimizer(BackendContext *context)
  : _context{context},
    _tensor_builder{std::dynamic_pointer_cast<TensorBuilder>(context->tensor_builder)}
{
  assert(context);
}
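
The initializer list above downcasts the context's shared tensor-builder pointer with std::dynamic_pointer_cast. A self-contained sketch of that pattern follows; ITensorBuilder, TensorBuilder, and buildAclTensors here are illustrative stand-ins, not the onert declarations.

    // std::dynamic_pointer_cast returns a non-null shared_ptr only if the dynamic
    // type of the pointee really is the requested derived type.
    #include <cassert>
    #include <memory>

    struct ITensorBuilder
    {
      virtual ~ITensorBuilder() = default;
    };

    struct TensorBuilder : ITensorBuilder
    {
      void buildAclTensors() {} // backend-specific API, only reachable after the cast
    };

    int main()
    {
      std::shared_ptr<ITensorBuilder> base = std::make_shared<TensorBuilder>();
      auto derived = std::dynamic_pointer_cast<TensorBuilder>(base);
      assert(derived != nullptr);
      derived->buildAclTensors();
    }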

Member Function Documentation

◆ optimize()

void onert::backend::acl_cl::Optimizer::optimize ( )

Definition at line 40 of file Optimizer.cc.

void Optimizer::optimize()
{
  // Concat elimination (build subtensor info)
  {
    acl_common::AclSubTensorAnalyzer sa{*_context->graph()};
    sa.setUsePadding();
    _context->graph()->operations().iterate(
      [&](const ir::OperationIndex &, const ir::IOperation &op) { op.accept(sa); });

    _tensor_builder->parent_map(sa.releaseParentMap());
  }
}

References onert::ir::IOperation::accept(), onert::backend::BackendContext::graph(), onert::util::ObjectManager< Index, Object >::iterate(), and onert::ir::Graph::operations().
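
optimize() walks every operation through iterate() and dispatches each one to the analyzer via accept(), i.e. the classic visitor pattern. Below is a compilable, stripped-down sketch of that traversal; all types in it (Operation, Visitor, Concat, SubTensorAnalyzer) are illustrative stand-ins, not the onert classes.

    // Each operation accepts a visitor; the analyzer records per-operation info,
    // here simply counting the concatenations it sees.
    #include <cstdio>
    #include <vector>

    struct Visitor;

    struct Operation
    {
      virtual ~Operation() = default;
      virtual void accept(Visitor &v) const = 0;
    };

    struct Concat;

    struct Visitor
    {
      virtual ~Visitor() = default;
      virtual void visit(const Concat &) = 0;
    };

    struct Concat : Operation
    {
      void accept(Visitor &v) const override { v.visit(*this); }
    };

    // Plays the role of AclSubTensorAnalyzer: collects which operations are
    // concatenations so their inputs could later be mapped as subtensors.
    struct SubTensorAnalyzer : Visitor
    {
      int concat_count = 0;
      void visit(const Concat &) override { ++concat_count; }
    };

    int main()
    {
      std::vector<Concat> ops(3);
      SubTensorAnalyzer sa;
      // Mirrors _context->graph()->operations().iterate(...): visit every operation.
      for (const auto &op : ops)
        op.accept(sa);
      std::printf("concat operations seen: %d\n", sa.concat_count);
    }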


The documentation for this class was generated from the following files:

Optimizer.h
Optimizer.cc