Comments (8)

khmyznikov commented on June 10, 2024

@guotuofeng @jambayk

"conversion": {
            "type": "OnnxConversion",
            "config": {
                "target_opset": 17,
                "save_as_external_data": true
            }
},

This flag helped.
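
For context: protobuf caps a single serialized file at 2 GiB, and whisper-medium's fp32 weights alone are roughly 3 GB, so without external data the converted encoder_decoder_init file cannot be a valid protobuf. A minimal sketch of what the flag corresponds to in the onnx package (an assumption about the mechanism, not Olive's exact call site):

    import onnx

    model = onnx.load("model.onnx")  # any in-memory ModelProto works here
    onnx.save_model(
        model,
        "model_ext.onnx",
        save_as_external_data=True,      # move large initializers out of the protobuf
        all_tensors_to_one_file=True,
        location="model_ext.onnx.data",  # weights file stored next to the model
        size_threshold=1024,             # tensors above 1 KiB go external
    )

With the weights externalized, the .onnx file itself stays well under the 2 GiB limit, so downstream passes can parse it.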

jambayk commented on June 10, 2024

Could you try with --tempdir . added to the second command? Perhaps some of the data got corrupted in the default temporary directory somewhere along the process. Also use a clean cache so that it doesn't reuse the cached ONNX model.
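
A scriptable version of that suggestion (a sketch; the cache directory name "cache" and both CLI flags are confirmed by the log in the reply below):

    import shutil
    import subprocess
    import sys

    # Delete the Olive cache so the previously converted (possibly corrupt)
    # ONNX model is not reused on the next run.
    shutil.rmtree("cache", ignore_errors=True)

    # Re-run the workflow, keeping temporary files in the current directory.
    subprocess.run(
        [sys.executable, "-m", "olive.workflows.run",
         "--config", "whisper_cpu_int8.json", "--tempdir", "."],
        check=True,
    )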

khmyznikov commented on June 10, 2024

Could you try with --tempdir . added to the second command? Perhaps some of the data got corrupted in the default temporary directory somewhere along the process. Also use a clean cache so that it doesn't reuse the cached ONNX model.

The cache was empty. No, sadly --tempdir doesn't help...

python -m olive.workflows.run --config .\whisper_cpu_int8.json --tempdir .                                                                                             
[2024-03-19 17:25:01,399] [DEBUG] [accelerator.py:155:create_accelerators] Initial execution providers: ['CPUExecutionProvider']
[2024-03-19 17:25:01,400] [DEBUG] [accelerator.py:168:create_accelerators] Initial accelerators: ['cpu']
[2024-03-19 17:25:01,400] [DEBUG] [accelerator.py:193:create_accelerators] Supported execution providers for device cpu: ['CPUExecutionProvider']
[2024-03-19 17:25:01,400] [INFO] [accelerator.py:208:create_accelerators] Running workflow on accelerator specs: cpu-cpu
[2024-03-19 17:25:01,401] [INFO] [engine.py:116:initialize] Using cache directory: cache
[2024-03-19 17:25:01,403] [INFO] [engine.py:272:run] Running Olive on accelerator: cpu-cpu
[2024-03-19 17:25:01,404] [DEBUG] [engine.py:1071:create_system] create native OliveSystem SystemType.Local
[2024-03-19 17:25:01,404] [DEBUG] [engine.py:1071:create_system] create native OliveSystem SystemType.Local
[2024-03-19 17:25:01,519] [DEBUG] [engine.py:706:_cache_model] Cached model 144b9b2eb19bac2e3483da6ee6aba08b to cache\models\144b9b2eb19bac2e3483da6ee6aba08b.json
[2024-03-19 17:25:01,520] [DEBUG] [engine.py:344:run_accelerator] Running Olive in no-search mode ...
[2024-03-19 17:25:01,520] [DEBUG] [engine.py:428:run_no_search] Running ['conversion', 'transformers_optimization', 'onnx_dynamic_quantization', 'insert_beam_search', 'prepost'] with no search ...
[2024-03-19 17:25:01,520] [INFO] [engine.py:862:_run_pass] Running pass conversion:OnnxConversion
[2024-03-19 17:25:01,523] [DEBUG] [resource_path.py:156:create_resource_path] Resource path code/user_script.py is inferred to be of type file.
[2024-03-19 17:25:01,527] [DEBUG] [resource_path.py:156:create_resource_path] Resource path code is inferred to be of type folder.
[2024-03-19 17:25:01,530] [DEBUG] [resource_path.py:156:create_resource_path] Resource path code is inferred to be of type folder.
[2024-03-19 17:25:01,532] [DEBUG] [resource_path.py:156:create_resource_path] Resource path code/user_script.py is inferred to be of type file.
[2024-03-19 17:25:02,578] [DEBUG] [resource_path.py:156:create_resource_path] Resource path C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\code is inferred to be of type folder.
[2024-03-19 17:25:02,581] [DEBUG] [resource_path.py:156:create_resource_path] Resource path C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\code\user_script.py is inferred to be of type file.
[2024-03-19 17:25:02,746] [INFO] [hf_config.py:68:load_hf_model] Loading Huggingface model from openai/whisper-medium
[2024-03-19 17:25:14,797] [DEBUG] [resource_path.py:156:create_resource_path] Resource path C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\code is inferred to be of type folder.
[2024-03-19 17:25:14,800] [DEBUG] [resource_path.py:156:create_resource_path] Resource path C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\code\user_script.py is inferred to be of type file.
[2024-03-19 17:25:14,942] [DEBUG] [resource_path.py:156:create_resource_path] Resource path C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\code is inferred to be of type folder.
[2024-03-19 17:25:14,945] [DEBUG] [resource_path.py:156:create_resource_path] Resource path C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\code\user_script.py is inferred to be of type file.
[2024-03-19 17:25:14,954] [DEBUG] [pytorch.py:253:get_user_io_config] Calling get_encdec_io_config to get io_config
[2024-03-19 17:25:30,115] [DEBUG] [dummy_inputs.py:30:get_dummy_inputs] Using dummy_inputs_func to get dummy inputs
[2024-03-19 17:25:30,259] [DEBUG] [pytorch.py:253:get_user_io_config] Calling get_encdec_io_config to get io_config
[2024-03-19 17:25:46,136] [DEBUG] [conversion.py:175:_export_pytorch_model] Converting model on device cpu with dtype None.
C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\transformers\models\whisper\modeling_whisper.py:415: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attn_weights.size() != (bsz * self.num_heads, tgt_len, src_len):
C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\transformers\models\whisper\modeling_whisper.py:454: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attn_output.size() != (bsz * self.num_heads, tgt_len, self.head_dim):
C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\transformers\modeling_attn_mask_utils.py:66: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if input_shape[-1] > 1 or self.sliding_window is not None:
C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\transformers\modeling_attn_mask_utils.py:137: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if past_key_values_length > 0:
C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\transformers\models\whisper\modeling_whisper.py:422: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attention_mask.size() != (bsz, 1, tgt_len, src_len):
[2024-03-19 17:27:18,392] [DEBUG] [resource_path.py:156:create_resource_path] Resource path C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\code is inferred to be of type folder.
[2024-03-19 17:27:18,394] [DEBUG] [resource_path.py:156:create_resource_path] Resource path C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\code\user_script.py is inferred to be of type file.
[2024-03-19 17:27:18,636] [DEBUG] [resource_path.py:156:create_resource_path] Resource path C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\code is inferred to be of type folder.
[2024-03-19 17:27:18,638] [DEBUG] [resource_path.py:156:create_resource_path] Resource path C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\code\user_script.py is inferred to be of type file.
[2024-03-19 17:27:18,643] [DEBUG] [pytorch.py:253:get_user_io_config] Calling get_dec_io_config to get io_config
[2024-03-19 17:27:18,802] [DEBUG] [dummy_inputs.py:30:get_dummy_inputs] Using dummy_inputs_func to get dummy inputs
[2024-03-19 17:27:21,028] [DEBUG] [pytorch.py:253:get_user_io_config] Calling get_dec_io_config to get io_config
[2024-03-19 17:27:21,181] [DEBUG] [conversion.py:175:_export_pytorch_model] Converting model on device cpu with dtype None.
C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\transformers\models\whisper\modeling_whisper.py:377: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  and past_key_value[0].shape[2] == key_value_states.shape[1]
[2024-03-19 17:28:08,114] [INFO] [engine.py:952:_run_pass] Pass conversion:OnnxConversion finished in 186.585407 seconds
[2024-03-19 17:28:08,123] [DEBUG] [engine.py:706:_cache_model] Cached model 0_OnnxConversion-144b9b2eb19bac2e3483da6ee6aba08b-dc5fbbbe422d406cc8fcef71d99251a4 to cache\models\0_OnnxConversion-144b9b2eb19bac2e3483da6ee6aba08b-dc5fbbbe422d406cc8fcef71d99251a4.json
[2024-03-19 17:28:08,126] [DEBUG] [engine.py:789:_cache_run] Cached run for 144b9b2eb19bac2e3483da6ee6aba08b->0_OnnxConversion-144b9b2eb19bac2e3483da6ee6aba08b-dc5fbbbe422d406cc8fcef71d99251a4 into cache\runs\OnnxConversion-144b9b2eb19bac2e3483da6ee6aba08b-dc5fbbbe422d406cc8fcef71d99251a4.json
[2024-03-19 17:28:08,133] [INFO] [engine.py:862:_run_pass] Running pass transformers_optimization:OrtTransformersOptimization
[2024-03-19 17:28:08,138] [DEBUG] [resource_path.py:156:create_resource_path] Resource path C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\cache\models\0_OnnxConversion-144b9b2eb19bac2e3483da6ee6aba08b-dc5fbbbe422d406cc8fcef71d99251a4\output_model\encoder_decoder_init\model.onnx is inferred to be of type file.
[2024-03-19 17:28:08,143] [DEBUG] [resource_path.py:156:create_resource_path] Resource path C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\cache\models\0_OnnxConversion-144b9b2eb19bac2e3483da6ee6aba08b-dc5fbbbe422d406cc8fcef71d99251a4\output_model\decoder\model.onnx is inferred to be of type file.
[2024-03-19 17:28:17,624] [ERROR] [engine.py:947:_run_pass] Pass run failed.
Traceback (most recent call last):
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\olive\engine\engine.py", line 935, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, data_root, output_model_path, pass_search_point)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\olive\systems\local.py", line 34, in run_pass
    output_model = the_pass.run(model, data_root, output_model_path, point)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\olive\passes\olive_pass.py", line 195, in run
    output_model_component = self._run_for_config(
                             ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\olive\passes\onnx\transformer_optimization.py", line 278, in _run_for_config
    optimizer = transformers_optimizer.optimize_model(input=model.model_path, **run_config)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\onnxruntime\transformers\optimizer.py", line 352, in optimize_model
    temp_model_path = optimize_by_onnxruntime(
                      ^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\onnxruntime\transformers\optimizer.py", line 175, in optimize_by_onnxruntime
    onnxruntime.InferenceSession(onnx_model_path, sess_options, providers=providers, **kwargs)
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 472, in _create_inference_session      
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxruntime.capi.onnxruntime_pybind11_state.InvalidProtobuf: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\cache\models\0_OnnxConversion-144b9b2eb19bac2e3483da6ee6aba08b-dc5fbbbe422d406cc8fcef71d99251a4\output_model\encoder_decoder_init\model.onnx failed:Protobuf parsing failed.
[2024-03-19 17:28:17,749] [WARNING] [engine.py:366:run_accelerator] Failed to run Olive on cpu-cpu.
Traceback (most recent call last):
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\olive\engine\engine.py", line 345, in run_accelerator
    output_footprint = self.run_no_search(
                       ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\olive\engine\engine.py", line 429, in run_no_search
    should_prune, signal, model_ids = self._run_passes(
                                      ^^^^^^^^^^^^^^^^^
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\olive\engine\engine.py", line 824, in _run_passes
    model_config, model_id = self._run_pass(
                             ^^^^^^^^^^^^^^^
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\olive\engine\engine.py", line 935, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, data_root, output_model_path, pass_search_point)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\olive\systems\local.py", line 34, in run_pass
    output_model = the_pass.run(model, data_root, output_model_path, point)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\olive\passes\olive_pass.py", line 195, in run
    output_model_component = self._run_for_config(
                             ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\olive\passes\onnx\transformer_optimization.py", line 278, in _run_for_config
    optimizer = transformers_optimizer.optimize_model(input=model.model_path, **run_config)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\onnxruntime\transformers\optimizer.py", line 352, in optimize_model
    temp_model_path = optimize_by_onnxruntime(
                      ^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\onnxruntime\transformers\optimizer.py", line 175, in optimize_by_onnxruntime
    onnxruntime.InferenceSession(onnx_model_path, sess_options, providers=providers, **kwargs)
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "C:\Users\gkhmyznikov\Develop\Python\whisper\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 472, in _create_inference_session      
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxruntime.capi.onnxruntime_pybind11_state.InvalidProtobuf: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from C:\Users\gkhmyznikov\Develop\GitHub\Olive\examples\whisper\cache\models\0_OnnxConversion-144b9b2eb19bac2e3483da6ee6aba08b-dc5fbbbe422d406cc8fcef71d99251a4\output_model\encoder_decoder_init\model.onnx failed:Protobuf parsing failed.
[2024-03-19 17:28:18,012] [INFO] [engine.py:289:run] Run history for cpu-cpu:
[2024-03-19 17:28:18,144] [INFO] [engine.py:565:dump_run_history] run history:
+------------------------------------------------------------------------------------+----------------------------------+----------------+----------------+-----------+  
| model_id                                                                           | parent_model_id                  | from_pass      |   duration_sec | metrics   |  
+====================================================================================+==================================+================+================+===========+  
| 144b9b2eb19bac2e3483da6ee6aba08b                                                   |                                  |                |                |           |  
+------------------------------------------------------------------------------------+----------------------------------+----------------+----------------+-----------+  
| 0_OnnxConversion-144b9b2eb19bac2e3483da6ee6aba08b-dc5fbbbe422d406cc8fcef71d99251a4 | 144b9b2eb19bac2e3483da6ee6aba08b | OnnxConversion |        186.585 |           |  
+------------------------------------------------------------------------------------+----------------------------------+----------------+----------------+-----------+  
[2024-03-19 17:28:18,149] [INFO] [engine.py:295:run] Package top ranked 0 models as artifacts
[2024-03-19 17:28:18,150] [WARNING] [packaging_generator.py:41:generate_output_artifacts] No model is selected. Skip packaging output artifacts.

guotuofeng commented on June 10, 2024

It seems the error message is like the one in #926.

What's your model size? Does https://microsoft.github.io/Olive/api/passes.html#cmdoption-arg-save_as_external_data help when it is set to True?
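
One quick way to answer the size question is to check the file from the traceback directly (a sketch; a single .onnx file must stay under protobuf's 2 GiB limit to be parseable):

    import os

    # Path copied from the INVALID_PROTOBUF error above.
    path = (r"cache\models"
            r"\0_OnnxConversion-144b9b2eb19bac2e3483da6ee6aba08b-dc5fbbbe422d406cc8fcef71d99251a4"
            r"\output_model\encoder_decoder_init\model.onnx")
    print(f"{os.path.getsize(path) / (1 << 30):.2f} GiB")  # at or above 2 GiB it cannot parse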

guotuofeng commented on June 10, 2024

BTW, would you please manually load the model using https://netron.app/ to check the ONNX model file itself?
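
If Netron is not handy, the same check can be done programmatically (a sketch; onnx.load raises a protobuf decode error on a truncated or oversized file, the same failure class onnxruntime reported above):

    import onnx

    path = "model.onnx"  # substitute the cached model path from the traceback
    try:
        onnx.load(path)                 # parses the protobuf
        onnx.checker.check_model(path)  # a path argument also handles >2 GiB external-data models
        print("model parses and passes the checker")
    except Exception as e:
        print("model is not loadable:", e)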

khmyznikov commented on June 10, 2024

Also, it looks like the problem only occurs with Python 3.11+.

guotuofeng commented on June 10, 2024

Also, it looks like the problem only occurs with Python 3.11+.

Glad to know the background. Thanks.

jambayk commented on June 10, 2024

PR #1069 linked above should fix this issue even without the save_as_external_data change above.

Please reopen the issue if you still experience the problem.
