zhangge6 / onnx-modifier
A tool to modify ONNX models visually, based on Netron and Flask.
License: MIT License
Hello @ZhangGe6 ,
I appreciate your amazing and helpful work.
I am having an issue when opening a model in the Windows app. I get the following error when trying to open a basic mobilenetv2-7.onnx:
The script 'http://... /static/onnx.js' failed to load.
Thank you in advance for your help !
I can add nodes, but the generated model is invalid; the values I fill in for the node don't seem to take effect. When I view the exported model in Netron, the newly added node has no values (normal nodes show their values), and running inference on the exported model with onnxruntime raises an error.
no issue anymore. Thanks
Running
python3.8 app.py
produces the following output:
* Serving Flask app 'app' (lazy loading)
* Environment: production
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
* Debug mode: off
* Running on http://127.0.0.1:5000 (Press CTRL+C to quit)
127.0.0.1 - - [09/May/2022 16:22:44] "GET / HTTP/1.1" 200 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/view-grapher.css HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/view-sidebar.css HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/sweetalert.css HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/sweetalert.min.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/dagre.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/base.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/text.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/json.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/xml.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/python.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/protobuf.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/flatbuffers.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/zip.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/gzip.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/tar.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/view-grapher.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/view-sidebar.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/view.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/index.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:44] "GET /static/logo.svg HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:45] "GET /favicon.ico HTTP/1.1" 404 -
127.0.0.1 - - [09/May/2022 16:22:50] "GET /static/onnx.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:50] "GET /static/onnx-proto.js HTTP/1.1" 304 -
127.0.0.1 - - [09/May/2022 16:22:51] "GET /static/onnx-metadata.json HTTP/1.1" 304 -
[2022-05-09 16:22:51,596] ERROR in app: Exception on /open_model [POST]
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 2077, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1525, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1523, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1509, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
File "app.py", line 15, in open_model
onnx_modifier = onnxModifier.from_name_stream(onnx_file.filename, onnx_file.stream)
File "/Users/dengxuezheng/BeKe/tf/onnx/onnx-modifier/onnx_modifier.py", line 26, in from_name_stream
model_proto = onnx.load_model(stream, onnx.ModelProto)
File "/Users/dengxuezheng/Library/Python/3.8/lib/python/site-packages/onnx/__init__.py", line 124, in load_model
model_filepath = _get_file_path(f)
File "/Users/dengxuezheng/Library/Python/3.8/lib/python/site-packages/onnx/__init__.py", line 54, in _get_file_path
return os.path.abspath(f.name)
File "/usr/local/Cellar/[email protected]/3.8.13_1/Frameworks/Python.framework/Versions/3.8/lib/python3.8/posixpath.py", line 374, in abspath
path = os.fspath(path)
TypeError: expected str, bytes or os.PathLike object, not int
127.0.0.1 - - [09/May/2022 16:22:51] "POST /open_model HTTP/1.1" 500 -
[2022-05-09 16:24:02,033] ERROR in app: Exception on /download [POST]
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 2077, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1525, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1523, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1509, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
File "app.py", line 26, in modify_and_download_model
onnx_modifier.reload() # allow for downloading for multiple times
NameError: name 'onnx_modifier' is not defined
127.0.0.1 - - [09/May/2022 16:24:02] "POST /download HTTP/1.1" 500 -
Running python3.8 app.py fails with the errors above; could someone please take a look?
I added a Reshape node, but cannot set an initializer for its shape input. Also, if I edit an existing Reshape's shape, the output shape does not change after the edit.
I created a Cast node. Its to attribute is undefined initially.
I tried changing it to float32, float, Float, Float32, FLOAT, etc.
No matter what I enter, it shows NaN afterwards.
When I change it to 1, it shows float32 afterwards.
I thought I had it, but when I load the new model in Netron, it says unknown type "1".
I really don't know how to add a Cast node properly.
It is a model containing an If (conditional) branch.
Hi author, my situation is this: my model was trained on grayscale images, so the input has only 1 channel. Now I need to change the input to 3 channels so I can benefit from TNN's acceleration, but I don't want to rebuild the dataset and retrain. So my plan is: step 1, change the input channel count from 1 to 3, i.e. [1,1,192,192] to [1,3,192,192]; step 2, insert a Slice op after the input that selects only the first of the three channels. That way I don't have to retrain the model.
For now I have skipped step 1 and tried step 2, inserting the Slice op, but I ran into the following problem:
I have already modified the Slice op's input and output,
and also modified the first convolution's input,
but the Slice op still cannot be connected to the first convolution.
The ONNX model I am using: model.zip
When using Reshape, I found I cannot define the shape. How should this be handled? Thanks.
The original ONNX model has redundant output nodes; the original outputs are shown in figure 1 below. I want to delete all output nodes except those named Yaw, 220, and 221, so I deleted the other output nodes at the Gemm part. The deletion succeeded and the outputs then looked like figure 2, but clicking download raised the error shown in figure 3.
Detailed error log:
127.0.0.1 - - [29/Jan/2023 16:31:52] "POST /open_model HTTP/1.1" 200 -
[2023-01-29 16:38:03,331] ERROR in app: Exception on /download [POST]
Traceback (most recent call last):
File "/Users/cxt/miniconda3/envs/pytorch1.9/lib/python3.6/site-packages/flask/app.py", line 2292, in wsgi_app
response = self.full_dispatch_request()
File "/Users/cxt/miniconda3/envs/pytorch1.9/lib/python3.6/site-packages/flask/app.py", line 1815, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/Users/cxt/miniconda3/envs/pytorch1.9/lib/python3.6/site-packages/flask/app.py", line 1718, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/Users/cxt/miniconda3/envs/pytorch1.9/lib/python3.6/site-packages/flask/_compat.py", line 35, in reraise
raise value
File "/Users/cxt/miniconda3/envs/pytorch1.9/lib/python3.6/site-packages/flask/app.py", line 1813, in full_dispatch_request
rv = self.dispatch_request()
File "/Users/cxt/miniconda3/envs/pytorch1.9/lib/python3.6/site-packages/flask/app.py", line 1799, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "app.py", line 27, in modify_and_download_model
onnx_modifier.modify(modify_info)
File "/Users/cxt/Reposities/onnx-modifier/onnx_modifier.py", line 302, in modify
self.post_process()
File "/Users/cxt/Reposities/onnx-modifier/onnx_modifier.py", line 278, in post_process
shape_inference()
File "/Users/cxt/Reposities/onnx-modifier/onnx_modifier.py", line 266, in shape_inference
inferred_shape_info = onnx.shape_inference.infer_shapes(self.model_proto)
AttributeError: module 'onnx' has no attribute 'shape_inference'
Is it possible to add a new output layer, for instance an output right after a convolution layer?
Sometimes we want to inspect the feature maps from a convolution layer, e.g. for image recognition.
I'd like to get such an output. I found ways to do this for PyTorch and Keras/TensorFlow,
and I'd like to do the same thing with an ONNX model in ONNX Runtime by cutting and modifying the model.
I can't find a tutorial for this. If such examples exist, please point me to them.
For example, I have an Add node that currently has two inputs, and I want it to have three. How can I add another input?
I think it maybe a good choice to run app.py inside a docker.
The values entered on an added node don't match the exported values. Is this precision truncation caused by a type conversion?
Are there any other examples of adding nodes, such as adding a multi-dimensional constant?
For example, the input is [1,3,480,640] and the constant is [1,3,1,1] with values (100,100,100),
and we want to broadcast-add them.
Thank you!
2. After saving and reopening the pruned model, you can see the inputs have increased, as shown below:
3. I deleted the model's redundant inputs with the following code:
import onnx
inputpath = "./descriptor.onnx"
# 1. Modify the model
model = onnx.load(inputpath)
inputs = model.graph.input
inputs_num = 1  # the original network's inputs come first in the list, so pop() removes the redundant trailing ones
for i in range(len(inputs) - inputs_num):
    inputs.pop()
print(inputs)
# 2. Check the structure and save the model (overwriting the original)
onnx.checker.check_model(model)
onnx.save(model, inputpath)
Thanks
for example:
a -> b
b -> c
I modified the output of node a to be the input of node c, but a and c are not reconnected.
Hi @ZhangGe6
First of all thanks for this wonderful tool.
Actually, I modified my graph using this tool, and when I view it in Netron the graph looks correct, but the node I added shows no output information: I can't see its output shape or type, only the output name as text. Can you please help me fix this?
Thanks in Advance:)
Hello,
I'd like to ask: is it possible to select a node's input or output tensor and mark it as a model output?
Device: MacBook Air M1
Error when running:
As the title says, I want to add an output at a certain node, but clicking has no effect and no output node is shown.
@ZhangGe6 This tool is exactly what I was looking for but unfortunately, after I make my changes on the development server I get the error that on line 27 of app.py, that onnx_modifier is not defined.
Traceback (most recent call last):
File "/home/travis/p38_venv/lib/python3.8/site-packages/flask/app.py", line 2091, in __call__
return self.wsgi_app(environ, start_response)
File "/home/travis/p38_venv/lib/python3.8/site-packages/flask/app.py", line 2076, in wsgi_app
response = self.handle_exception(e)
File "/home/travis/p38_venv/lib/python3.8/site-packages/flask/app.py", line 2073, in wsgi_app
response = self.full_dispatch_request()
File "/home/travis/p38_venv/lib/python3.8/site-packages/flask/app.py", line 1519, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/travis/p38_venv/lib/python3.8/site-packages/flask/app.py", line 1517, in full_dispatch_request
rv = self.dispatch_request()
File "/home/travis/p38_venv/lib/python3.8/site-packages/flask/app.py", line 1503, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
File "/home/travis/memryx_dev/onnx-modifier/app.py", line 27, in modify_and_download_model
onnx_modifier.reload() # allow downloading for multiple times
NameError: name 'onnx_modifier' is not defined
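This NameError (also reported earlier in the thread) means the module-level `onnx_modifier` global was never assigned, typically because the preceding /open_model request failed or the server restarted in between. A defensive sketch of the idea; the function and variable names mirror app.py's, but this is an illustration, not the project's actual code:

```python
# Module-level handle, normally assigned by the /open_model route.
onnx_modifier = None

def modify_and_download_model(modify_info):
    # Guard: /download can only succeed after /open_model has populated
    # the global; otherwise we return an explicit error instead of
    # crashing with a NameError.
    if onnx_modifier is None:
        return "No model loaded - open a model first", 400
    onnx_modifier.reload()  # allow downloading multiple times
    onnx_modifier.modify(modify_info)
    return "OK", 200
```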
Hi,
Thank you for creating this tool. It's absolutely great!
Adding an output as shown here: https://github.com/ZhangGe6/onnx-modifier#add-new-model-outputs doesn't work.
Load in squeezenet1.0-12.onnx
Click on any node
Click on "Add output"
Nothing happens
Kind regards,
Arne
Hi, I met a bug-like problem here.
If I open a quantized ONNX model in onnx-modifier and download it directly, it adds an explicit tensor type to some ops, such as QuantizeLinear, even with nothing modified.
The figure below shows the explicit type added by onnx-modifier automatically and there is no type information in the original model at the same tensor.
We hope that onnx-modifier will keep everything that isn't modified the same as before. Thanks!
Hi author, I want to insert an ArgMax node before the output, between Add and heatmap, and finally rename the model output from heatmap to indexmap, as shown below:
But during this I hit a problem: I cannot rename the Add node's output tensor from indexmap to heatmap (other names don't work either). The Add node's output tensor name seems hard-bound to the model output node's name: whatever the output node is named, the Add output tensor name follows it. See the latter part of the video below.
https://user-images.githubusercontent.com/43233772/220515357-088b3043-770a-4009-86f3-ff40b977f8e7.mp4
Model:
model.zip
I am using the Windows 64-bit version.
Steps: open the mnist-1.onnx file, click download, then open modified-mnist-1.onnx. This is what I see:
screenshot
It would be nice to be able to change the output shape and data type from the UI.
The same goes for the input and intermediate tensors :)
I have an ONNX model with QuantizeLinear and DequantizeLinear nodes in it, and the scales of these nodes are scalars.
When I try to modify the scale initializer to a scalar value as below, the scale initializer becomes 0 in the downloaded modified ONNX model.
So I wonder whether onnx-modifier supports modifying scalar initializers. Thanks!
When adding a node like Conv, parameters such as weight and bias cannot be assigned values. Could you consider supporting the addition of initializers so that parameters like weight and bias can be set?