Comments (3)
@onnx/sig-operators
from onnx.
Good question. Looking at the ONNX shape-inference implementation, it computes the complete products (it does not skip the forwarded dimensions). Hence it cannot infer the output dimension and flags this as an error.
However, the more general interpretation could be useful in some situations, I guess ... it may be worth investigating whether backend implementations support it.
Do you see any examples/models where this will be useful? If so, it may be worth updating the spec to allow it.
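The "complete products" behavior described above can be sketched as follows. This is a minimal sketch, not the actual onnx C++ implementation; `infer_reshape` is a hypothetical helper that models Reshape with the default `allowzero=0` semantics (0 forwards the input dimension, a single -1 is solved so element counts match):

```python
# Hedged sketch of ONNX Reshape shape inference (allowzero=0), not the
# real implementation. 0 copies the corresponding input dimension; -1
# is solved from the complete product of the input shape.
def infer_reshape(input_shape, target_shape):
    # Step 1: forward 0s from the input shape.
    out = [input_shape[i] if d == 0 else d
           for i, d in enumerate(target_shape)]
    # Step 2: complete product of the input shape.
    total = 1
    for d in input_shape:
        total *= d
    # Step 3: product of the known output dims.
    known = 1
    for d in out:
        if d != -1:
            known *= d
    # Step 4: solve the -1 wildcard, if present.
    if -1 in out:
        if known == 0 or total % known != 0:
            raise ValueError("cannot resolve -1 wildcard")
        out[out.index(-1)] = total // known
    return out

print(infer_reshape([4, 2, 3], [0, -1]))  # [4, 6]
```

Note that when the forwarded dimension is 0 (an empty batch), `known` becomes 0 and the wildcard cannot be resolved, which is exactly the case discussed below.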
It seems to be useful for dealing with batch dimensions that might be zero. For example, reshaping from [n,a,b] to [0,-1] where n is the batch dimension.
We (the Nvidia TensorRT group) ran into the issue with fasterrcnn_resnet50_fpn.onnx (I think it's derived from here) and accidentally fed it random data. I'm guessing there's some kind of internal batch dimension there with a data-dependent length.
On the other hand, "forwarding 0" is dangerous with networks that contain empty tensors, so there's much to be said for simply discouraging it, even if it helps the use of the -1 wildcard. In retrospect, "forwarding -2" would have been a much better design, but Caffe chose 0.
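The danger with empty tensors can be reproduced with NumPy's reshape, which shares the -1 wildcard. This is a sketch in which NumPy stands in for an ONNX backend; NumPy has no 0-forwarding, so the forwarded dimension is written out explicitly:

```python
import numpy as np

# After ONNX forwards the 0, a target shape [0, -1] on an input of
# shape [0, 2, 3] amounts to reshape(0, -1). The wildcard must satisfy
# 0 * x == 0 elements, which any x does, so it cannot be resolved and
# NumPy raises ValueError.
x = np.zeros((0, 2, 3))
try:
    x.reshape(0, -1)
    resolved = True
except ValueError:
    resolved = False
print(resolved)  # False: the -1 wildcard is ambiguous

# With a nonzero batch dimension the same pattern works fine:
print(np.zeros((4, 2, 3)).reshape(4, -1).shape)  # (4, 6)
```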