ONE - On-device Neural Engine
TensorShapeExpander Class Reference

Create a higher-rank TensorShape following NumPy broadcasting semantics.
#include <CircleShapeInferenceHelper.h>
Public Member Functions

  TensorShapeExpander (const loco::TensorShape &shape)
  loco::TensorShape to (uint32_t output_rank)
Detailed Description

Create a higher-rank TensorShape following NumPy broadcasting semantics.

HOW TO USE:

  auto expanded_tensor_shape = expand(tensor_shape).to(N);

Definition at line 62 of file CircleShapeInferenceHelper.h.
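To make the usage line above concrete, here is a minimal sketch (not code from the header) that builds a rank-2 loco::TensorShape and expands it to rank 4. Under NumPy broadcasting semantics the input dimensions are right-aligned, so the expected result is [1, 1, 3, 4]. The concrete dimension values and the assert are illustrative only; expand() and to() are the entities documented on this page, and expand() may need namespace qualification depending on the caller's context.

  #include <cassert>

  #include <loco.h>

  #include "CircleShapeInferenceHelper.h"

  void example(void)
  {
    // Build a rank-2 shape [3, 4]
    loco::TensorShape tensor_shape;
    tensor_shape.rank(2);
    tensor_shape.dim(0) = 3;
    tensor_shape.dim(1) = 4;

    // Expand to rank 4 following NumPy broadcasting;
    // the expected result is the right-aligned shape [1, 1, 3, 4]
    auto expanded_tensor_shape = expand(tensor_shape).to(4);

    assert(expanded_tensor_shape.rank() == 4);
  }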
Constructor &amp; Destructor Documentation

TensorShapeExpander()

  TensorShapeExpander (const loco::TensorShape &shape)    [inline]

Definition at line 65 of file CircleShapeInferenceHelper.h.
Member Function Documentation

to()

  loco::TensorShape to (uint32_t output_rank)    [inline]

Definition at line 71 of file CircleShapeInferenceHelper.h.

References output_shape, and loco::TensorShape::rank().
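The references to output_shape and loco::TensorShape::rank() suggest the result shape is rebuilt dimension by dimension. The sketch below is an assumption about the documented NumPy-style expansion (input dimensions right-aligned, new leading dimensions set to 1), written as a stand-alone helper; it is not the actual body of to(), and the name expand_to is hypothetical.

  #include <cassert>

  #include <loco.h>

  // Illustrative stand-alone version of the expansion;
  // only loco::TensorShape is taken from the page above
  loco::TensorShape expand_to(const loco::TensorShape &input_shape, uint32_t output_rank)
  {
    assert(input_shape.rank() <= output_rank && "cannot shrink rank");

    loco::TensorShape output_shape;
    output_shape.rank(output_rank);

    auto const rank_diff = output_rank - input_shape.rank();

    for (uint32_t axis = 0; axis < output_rank; ++axis)
    {
      // New leading axes get dimension 1; remaining axes copy the input dimensions
      if (axis < rank_diff)
        output_shape.dim(axis) = 1;
      else
        output_shape.dim(axis) = input_shape.dim(axis - rank_diff);
    }

    return output_shape;
  }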