
Comments (12)

Kirayue avatar Kirayue commented on August 28, 2024

There is another problem. Because the CPU backend lacks the Pad operator, I downloaded the squeezenet model from the model zoo. loadModel then succeeded, but the output of the model is undefined.
Here is the code snippet.

require('onnxjs-node')

async function main() {
    const session = new onnx.InferenceSession({backendHint: 'cpu'})
    await session.loadModel('./squeezenet1.1.onnx')
    const tensorX = new onnx.Tensor(new Float32Array(1 * 3 * 224 * 224), 'float32', [1, 3, 224, 224])
    const outputMap = await session.run(tensorX)
    console.log(outputMap.values().next())
}

main()

and the console result is {value: undefined, done: true}
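For reference, { value: undefined, done: true } is exactly what an iterator over an empty Map produces, so presumably session.run() resolved to an empty output map here. A minimal illustration:

```javascript
// Minimal illustration: calling .values().next() on an empty Map
// yields { value: undefined, done: true }, matching the result above.
const emptyOutputs = new Map();
const first = emptyOutputs.values().next();
console.log(first.done);   // true
console.log(first.value);  // undefined
```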

from onnxjs.

fs-eire avatar fs-eire commented on August 28, 2024

Thanks for the feedback. It looks like operator Pad is currently supported only in the webgl and onnxruntime (Node.js) backends. We will support Pad in the CPU backend in the future.


Kirayue avatar Kirayue commented on August 28, 2024

Hi, @fs-eire
Sorry for my unclear description. By onnxruntime here I mean the Python package. I tested the two models with the Python onnxruntime first and then from Node.js (onnxjs-node). I expected them to work, but I ran into several strange results.
According to README.md, running in Node.js needs an extra package (onnxjs-node), with no code changes compared to onnxjs. But after requiring onnxjs-node, if I specify backendHint as 'webgl' or 'wasm', the errors mentioned in my first comment remain. Only with 'cpu' is there no backend error, but then I get the result from my second comment (and, as you said, it cannot load my own model because of Pad, so I used squeezenet there).
If I do not specify backendHint, I expected it to use onnxruntime. But the object returned by new onnx.InferenceSession() is OnnxRuntimeInferenceSession {binding: InferenceSession {} }, which looks like an empty object.

Sorry to bother you, I really appreciate your reply. Thank you


fs-eire avatar fs-eire commented on August 28, 2024

@Kirayue
There are 4 available values for the option backendHint: webgl, wasm, cpu and onnxruntime.

ONNX.js resolves the backend according to backendHint, with the following behavior:

  • if backendHint is specified, use that backend unconditionally.
  • else (backendHint is not specified):
    • if the environment is Node.js:
      • if module onnxjs-node is loaded:
        • use backend onnxruntime.
      • else (module onnxjs-node is not loaded):
        • try to resolve backends wasm, cpu in order.
    • else (the environment is a browser or electron/chromium):
      • try to resolve backends webgl, wasm, cpu in order.

Resolution succeeds with the first available backend that initializes successfully. Note:

  • webgl cannot work in Node.js.
  • onnxruntime cannot work in browsers.
  • wasm and cpu work in both browsers and Node.js.
  • wasm cannot use multi-threading (web-worker based) in Node.js.
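The resolution order above can be sketched as a small helper (a paraphrase of the documented behavior, not the actual ONNX.js source; the function name is made up for illustration):

```javascript
// Sketch of the documented backend-resolution order. Returns the list of
// backend candidates that ONNX.js would try, in order.
function resolveBackendCandidates(backendHint, isNode, onnxjsNodeLoaded) {
  if (backendHint) return [backendHint];                 // explicit hint wins
  if (isNode) {
    // Node.js: onnxruntime if onnxjs-node is loaded, else wasm then cpu
    return onnxjsNodeLoaded ? ['onnxruntime'] : ['wasm', 'cpu'];
  }
  // browser / electron (chromium): webgl first, then wasm, then cpu
  return ['webgl', 'wasm', 'cpu'];
}

console.log(resolveBackendCandidates(undefined, true, true));  // [ 'onnxruntime' ]
console.log(resolveBackendCandidates('cpu', true, true));      // [ 'cpu' ]
```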

If I do not specify the backendHint, I expected it uses onnxruntime. But the obj after new onnx.InferenceSession() is OnnxRuntimeInferenceSession {binding: InferenceSession {} }, it is an empty obj.

It looks like the session was created successfully. It is not an empty object -- you just cannot see its members. It's a Node.js native binding, so it works when you call .loadModel or .run.
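The "empty object" appearance is typical of objects whose methods live on the prototype or in a native binding: console.log only shows own enumerable properties. A toy illustration (FakeBinding is a made-up stand-in, not part of onnxjs-node):

```javascript
// Methods defined on the prototype (or in a native binding) are not
// own enumerable properties, so the object prints as "empty".
class FakeBinding {
  loadModel(path) { return 'loaded ' + path; }
}
const session = new FakeBinding();
console.log(Object.keys(session));             // [] -- looks empty
console.log(session.loadModel('model.onnx'));  // loaded model.onnx -- but methods work
```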


Kirayue avatar Kirayue commented on August 28, 2024

@fs-eire, thank you for your explanations.

If I want to use wasm, should I install or configure something? The error remains even when I use it in Node.js. I made a copy of onnx-worker.js in the directory of index.js.

If I do not specify the backendHint, I expected it uses onnxruntime. But the obj after new onnx.InferenceSession() is OnnxRuntimeInferenceSession {binding: InferenceSession {} }, it is an empty obj.
This looks the session is created successfully. It is not an empty object -- although you cannot see its members. It's a Node.js native binding so it works if you call .loadModel or .run.

You are right, but I called session.run(tensorX), where tensorX = new onnx.Tensor(new Float32Array(3 * 224 * 224).fill(1), 'float32', [1, 3, 224, 224]) (with backendHint set to 'onnxruntime').
The error message is tensor.dims must be an array, but when I console.log(tensorX.dims) it prints [1, 3, 224, 224]. I do not know what is going on. If I use cpu as the backendHint, the problem becomes the one in my second comment.


fs-eire avatar fs-eire commented on August 28, 2024

You should use session.run([tensorX]), since session.run() accepts a tensor array as its first argument:

https://github.com/Microsoft/onnxjs/blob/c1c8f90492dea911000bffda39691786b6b6ac7d/lib/api/inference-session.ts#L35-L44


Kirayue avatar Kirayue commented on August 28, 2024

OK, it works. Thank you for your help.
So what remains is that if I want to test my own model, I have two choices because of the Pad operator: onnxruntime and webgl. With the former I got Error: unsupported data type (7), even though the same model runs fine with the Python onnxruntime.

My purpose is to deploy my model using webgl; I just want to understand why onnxruntime did not work.
I will try webgl later. Thank you very much 😄


fs-eire avatar fs-eire commented on August 28, 2024

The error message Error: unsupported data type (7) indicates that the model outputs one or more INT64 tensor(s).

JavaScript's number type is not compatible with int64, and there is no such thing as Int64Array. Although there are libraries like Long, it is still not easy to support int64.

Inside onnxjs, all int64/uint64 intermediate tensors are treated as int32/uint32, assuming no overflow happens. However, onnxjs-node does not allow int64/uint64 as model outputs, hence the error.

Possible solution (model): avoid int64/uint64; use int32/uint32 instead.
Possible solution (onnxjs-node): update onnxjs-node to force-read int64 as int32 for model output(s).
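The int64-to-int32 narrowing suggested above can also be done manually on the JS side once the raw data is available. A sketch using BigInt64Array (available in modern Node.js, though not at the time of this thread; the helper name is made up):

```javascript
// Narrow an int64 (BigInt64Array) buffer to int32, failing loudly on
// overflow instead of silently truncating.
function int64ToInt32(big64) {
  const out = new Int32Array(big64.length);
  for (let i = 0; i < big64.length; i++) {
    const v = big64[i];
    if (v > 2147483647n || v < -2147483648n) {
      throw new RangeError('int64 value ' + v + ' overflows int32');
    }
    out[i] = Number(v);
  }
  return out;
}

console.log(int64ToInt32(BigInt64Array.from([1n, -2n, 3n])));  // values 1, -2, 3
```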


Kirayue avatar Kirayue commented on August 28, 2024

Got it, thank you @fs-eire

I will modify the model and rerun again.


Kirayue avatar Kirayue commented on August 28, 2024

Hi, @fs-eire
Sorry to bother you again. I want to try the models in the browser. Squeezenet from the model zoo worked fine, but my own model caused the error unrecognized operator 'Shape'. I am using the 'webgl' backend, and there is no Pad error. I also checked the operators listed in operators.md; a Shape operator error should not occur with the webgl backend.


gnsmrky avatar gnsmrky commented on August 28, 2024

Shape isn't listed in the ONNX.js op list. If the tensor shape you are computing is static, you can always set it as a constant in your network's torch.nn.Module classes.


Kirayue avatar Kirayue commented on August 28, 2024

@gnsmrky
Oh, you are right, I misread it as Reshape. I referred to #84 and solved the problem. Thank you. 😄

