
steering-committee's Introduction

ONNX Steering Committee

The role and responsibilities of the Steering Committee are defined in the ONNX governance.

The Steering Committee meets regularly, and calls are open for any community member to join. You can find the latest schedule at https://onnx.ai/calendar

Agendas for upcoming meetings and notes from completed meetings are published here.

Elections happen annually.

Term Members

Current: June 1, 2023 - May 31, 2024
Prasanth Pulavarthi (Microsoft)
Alexandre Eichenberger (IBM)
Mayank Kaushik (Nvidia)
Andreas Fehlner (TRUMPF Laser GmbH)
Saurabh Tangri (Intel)

February 15, 2023 - May 31, 2023
Prasanth Pulavarthi (Microsoft)
Alexandre Eichenberger (IBM)
Mayank Kaushik (Nvidia)
Andreas Fehlner (TRUMPF Laser GmbH)
Saurabh Tangri (Intel)

June 1, 2022 - February 15, 2023
Prasanth Pulavarthi (Microsoft)
Alexandre Eichenberger (IBM)
Rajeev Nalawadi (Intel)
Mayank Kaushik (Nvidia)
Andreas Fehlner (TRUMPF Laser GmbH)

June 1, 2021 - May 31, 2022
Prasanth Pulavarthi (Microsoft)
Alexandre Eichenberger (IBM)
Rajeev Nalawadi (Intel)
Mayank Kaushik (Nvidia)
Wenming Ye (Amazon)

June 1, 2020 - May 31, 2021
Prasanth Pulavarthi (Microsoft)
Harry Kim (Intel)
Jim Spohrer (IBM)
Joohoon Lee (Nvidia)
Sheng Zha (Amazon)

June 1, 2019 - May 31, 2020
Prasanth Pulavarthi (Microsoft)
Joe Spisak (Facebook)
Vin Sharma (Amazon)
Harry Kim (Intel)
Dilip Sequeira (Nvidia)

steering-committee's People

Contributors

alexandreeichenberger, andife, awshlabs, harryskim, joohoon, jspisak, mk-nvidia, prasanthpul, rajeevnalawadi, rgesteve, saurabhtangri, szha, wenming


steering-committee's Issues

Inform and start working with LFAI (Jacqueline Serafin) on logistics

  • LFAI will have its own process and checklist which is quite extensive
  • Leverage LFAI for support available to ONNX community
    • Registration Web Site
    • Marketing/Advertising effort for the event
    • Zoom Meeting - confirm if you need certain Zoom features, be aware of any limits
    • Weekly status update email from LFAI on progress of registration

Volunteer to be a maintainer/approver

Hi Steering Committee,

I'm daquexian, an active contributor to the ONNX project, a member of the Infra and Operator SIGs (thanks to @linkerzhang for the invitation), a maintainer of the ONNX optimizer, and the author of the popular onnx-simplifier. I have also given two presentations at ONNX community meetups.

ONNX is so widely used that many community contributors submit PRs; however, some PRs cannot get reviewed and merged in a sensible time due to the lack of reviewers (for example, onnx/onnx#4285, which is only a one-line change). In the meeting notes, I noticed that the Steering Committee wants to encourage more people to become maintainers/approvers to manage PRs, and I am also looking forward to having a chance to help make ONNX better. So I volunteer to be a maintainer/approver and will help review PRs about infra and operators. I would greatly appreciate the chance.

Thanks!

How to join ONNX?

Hi,
I am a co-founder of DeGirum Corp., a startup building hardware accelerators for ML inference. We use ONNX in our development and would like to know how our company can join ONNX.

ONNX Serving Tool Proposal - add a gRPC server for onnx-mlir compiled models

ONNX Serving

Serving Tool Proposal

ONNX Serving is a project written in C++ to serve onnx-mlir compiled models over gRPC and other protocols. Benefiting from its C++ implementation, ONNX Serving has very low latency overhead and high throughput. ONNX Serving provides dynamic batch aggregation and a worker pool to fully utilize the AI accelerators on the machine.

Currently there is no high-performance open-source serving solution for onnx-mlir compiled models, so we want to contribute an open-source project to the ONNX community that helps users deploy their onnx-mlir models in production environments.
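To make the "dynamic batch aggregation" idea concrete, here is a minimal, hypothetical sketch of the technique: requests that arrive within a short time window are merged into one batch so a single model invocation serves several clients. All names, parameters, and the stand-in `run_model` are assumptions for illustration, not ONNX Serving's actual API.

```python
import queue
import threading
import time

MAX_BATCH = 4   # largest batch handed to the model (assumed value)
WINDOW_MS = 5   # how long to wait for more requests (assumed value)

requests = queue.Queue()  # (input, reply queue) pairs from request handlers

def run_model(batch):
    # Stand-in for invoking an onnx-mlir compiled model; here it just doubles inputs.
    return [x * 2 for x in batch]

def batcher():
    # Pull one request, then keep collecting until the batch is full
    # or the aggregation window expires; then run the model once.
    while True:
        first = requests.get()
        if first is None:  # shutdown signal
            return
        batch = [first]
        deadline = time.monotonic() + WINDOW_MS / 1000
        while len(batch) < MAX_BATCH:
            remaining = deadline - time.monotonic()
            if remaining <= 0:
                break
            try:
                batch.append(requests.get(timeout=remaining))
            except queue.Empty:
                break
        outputs = run_model([inp for inp, _ in batch])
        for (_, reply), out in zip(batch, outputs):
            reply.put(out)  # deliver each result to its waiting caller

worker = threading.Thread(target=batcher, daemon=True)
worker.start()

def infer(x):
    # What a gRPC handler would do: enqueue the request and wait for its result.
    reply = queue.Queue()
    requests.put((x, reply))
    return reply.get()

results = [infer(i) for i in range(3)]  # each result is input * 2
```

A real server would replace `run_model` with a call into the compiled model and feed `requests` from gRPC handler threads; the aggregation loop itself is the same shape.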

Proposal

Contribute ONNX Serving to https://github.com/onnx/onnx-serving

Welcome community contributions to enhance onnx-serving with broader hardware and platform support.

We would appreciate the chance to present our project to the ONNX SIGs.

Questions:

Rules for all repos and Requirements for new, contributed repos

Rules for all repos

  1. Must be owned and managed by one of the ONNX SIGs (Architecture & Infra)

  2. Must be actively maintained (Qin Yue Chen, Fei Fei Li)

  3. Must adopt the ONNX Code of Conduct (check)

  4. Must adopt the standard ONNX license(s) (already Apache-2.0 License)

  5. Must adopt the ONNX CLA bot (check)

  6. Must adopt all ONNX automation (like LGTM) (check)

  7. Must have CI or other automation in place for repos containing code to ensure quality (CI and unit tests are already implemented; more test cases and a coverage scan tool are needed)

  8. All OWNERS must be members in good standing, as defined by the ability to vote in Steering Committee elections. (check)
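As an illustration of rule 7, a contributed repo could satisfy the CI requirement with a workflow along these lines. This is a hypothetical GitHub Actions sketch, not the repo's actual configuration; file names and commands are assumptions.

```yaml
# .github/workflows/ci.yml (illustrative only)
name: ci
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      # Assumed project layout: deps in requirements.txt, tests run by pytest
      - run: pip install -r requirements.txt
      - run: pytest --cov
```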

Requirements for new, contributed repos

We are happy to accept contributions as repos under the ONNX organization of new projects that meet the following requirements:

  1. Project is closely related to ONNX (onnx-mlir)

  2. Adds value to the ONNX ecosystem (serving onnx-mlir compiled model)

  3. Determined to need a new repo rather than a folder in an existing repo (yes)

  4. All contributors must have signed the ONNX CLA (will)

  5. Licenses of dependencies must be acceptable (Apache-2.0 License, the same as onnx-mlir uses)

  6. Commitment to maintain the repo (Qin Yue Chen, Fei Fei Li)

  7. Approval of the SIG that will own the repo

  8. Approval of the Steering Committee

Create a repo for separating ONNX optimizer out of ONNX repo

The ONNX optimizer has not been actively maintained and updated by the ONNX community; however, it is still used by some partners. To keep the ONNX repo focused on the ONNX spec itself while not breaking any potential dependency on the ONNX optimizer, we will create a repo (optimizer) under the ONNX org to develop and maintain it. @daquexian @fumihwh will help make the move initially.

10/14 Community Meeting Agenda

| Time (PST) | Duration | Item | Status |
| --- | --- | --- | --- |
| 7:00am | 25m | Event Kickoff | |
| 7:00 | 5m | Opening and Welcome | |
| 7:05 | 20m | ONNX Community & LF AI Update. ONNX Steering Committee | |
| 7:25am | 100m | Partner and User Presentations | All confirmed |
| 7:25 | 10m | Extract the maximum benefits of ONNX to shorten your development cycle time and reduce guesswork. Patrick St-Amant, Zetane | Confirmed |
| 7:35 | 10m | ONNX at OneFlow. Jianhao Zhang, OneFlow | Confirmed |
| 7:45 | 10m | Efficient inference of transformers models: Collaboration highlights between Hugging Face & ONNX Runtime. Morgan Funtowicz, Hugging Face | Confirmed |
| 7:55 | 10m | Flows and Tools to map ONNX Neural Networks on Micro-controllers. Danilo Pau, ST Micro | Confirmed |
| 8:05 | 10m | Neural Automation: Fusion of Automation and Data Science. Fabian Bause, Beckhoff Automation | Confirmed |
| 8:15 | 10m | ONNX Runtime updates: mobile, quantization, training, and more. Faith Xu, Microsoft | Confirmed |
| 8:25 | 10m | Apache TVM and ONNX: what can ONNX do for DL compilers (and vice versa)? Tianqi Chen, OctoML | Confirmed |
| 8:35 | 10m | ONNX Support in the MLIR Compiler: Approach and Status. Alexandre Eichenberger, IBM Research | Confirmed |
| 8:45 | 10m | Compiling Traditional ML Pipelines into Tensor Computations for Unified Machine Learning Prediction Serving. Matteo Interlandi, Microsoft | Confirmed |
| 8:55 | 10m | Q/DQ is all you need. Neta Zmora, NVIDIA | Confirmed |
| 9:05am | 10m | Break | |
| 9:15am | 45m | SIG and WG Updates and Discussions | All confirmed |
| 9:15 | 10m | Architecture/Infrastructure SIG Update. Ashwini Khade, Microsoft | Confirmed |
| 9:25 | 10m | Operators SIG Update. Michał Karzyński, Intel, and Emad Barsoum, Microsoft | Confirmed |
| 9:35 | 10m | Converters SIG Update. Chin Huang, IBM, and Guenther Schmuelling, Microsoft | Confirmed |
| 9:45 | 10m | Model Zoo/Tutorials SIG Update. Wenbing Li, Microsoft, and Vinitra Swamy, Microsoft | Confirmed |
| 9:55 | 5m | Q&A / Open Discussions | |
| 10:00am | 180m total | End | |

Collect all presentations

  • Review and recommend revision/improvements as needed
  • Multiple reminders will be required to get presenters to send in their decks

VFX Virtual Production tool for Unreal Engine

Hi Steering Committee!
I'm making a new VFX Virtual Production/games tool for Unreal Engine, Dream

I'm writing this in a hurry because I just saw the following tweet:
https://twitter.com/onnxai/status/1600603167489929217?s=20&t=tEJh6GZP6TjWEhhNt6EjgQ

I was wondering what I can do with ONNX in Unreal after training a model with the videogame "self-iterating":


Dream is a plug-in made to improve users' workflows and output quality regardless of their expertise.
A better user experience makes Unreal Engine a stronger all-in-one tool,
from the prototype to the final art, with photorealistic generations never seen before.

https://www.youtube.com/watch?v=kEiDWhktLao

Experimenting with ONNX is currently on my roadmap;
feel free to contact me :)

Alberto Cerdá Estarelles

Discord: AlbertoTrunk#7305
LinkedIn: https://www.linkedin.com/in/albertocerdaestarelles/

Update ONNX Repositories for License Change

Now that we have adopted the Apache License 2.0 in place of the MIT license, we need to update the licenses for these repositories in the onnx org:

  • sklearn-onnx
  • onnx-mlir
  • steering-committee
  • tensorflow-onnx
  • backend-scoreboard
  • optimizer
  • onnx-tensorrt
  • onnx (onnx/onnx#3159)
  • models
  • onnx-tensorflow
  • keras-onnx
  • onnxmltools
  • sigs
  • onnx-docker
  • working-groups
  • onnx.github.io
  • wheel-builder
  • tutorials
  • onnx-r
  • onnx-coreml

Neural Compressor Proposal - port the repo from Intel to the ONNX organization

ONNX Model Compressor

Quantization Tool Proposal

Intel Neural Compressor (INC) is a tool for generating optimized ONNX models; it supports techniques such as post-training quantization (PTQ) and quantization-aware training (QAT). The tool can also be used for distillation and pruning to generate sparse quantized ONNX models. It has broad model coverage (300+ models) representing key domains such as vision, NLP, and recommendation systems. Since its release, INC has seen high popularity in the ONNX community. It has also been integrated into the Hugging Face Optimum pipeline, and it is the tool used to produce the int8 quantized models in the ONNX Model Zoo.

While the ONNX ecosystem is seeing high adoption in industry, there has not been significant community contribution toward ONNX model compression tooling. Intel therefore wants to contribute an open-source project to the ONNX community that can help accelerate deployment of sparse and quantized ONNX models.
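For readers unfamiliar with post-training quantization, the core arithmetic that tools like INC automate can be sketched in a few lines: map float weights to int8 with a per-tensor scale, then map back at inference time. This is a minimal illustration of symmetric per-tensor PTQ only; the function names are hypothetical and this is not Neural Compressor's API.

```python
def quantize_int8(weights):
    """Symmetric per-tensor quantization: int8 values plus one float scale."""
    # The scale maps the largest-magnitude weight onto the int8 range [-127, 127].
    scale = max(abs(w) for w in weights) / 127.0 or 1.0  # avoid 0 for all-zero tensors
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 representation."""
    return [v * scale for v in q]

weights = [0.1, -0.5, 0.25, 1.27]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# The rounding error of each element is bounded by scale / 2.
```

Real tools add calibration over sample data, per-channel scales, zero points for asymmetric ranges, and operator-aware placement of quantize/dequantize nodes, but the round-trip above is the underlying idea.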

Proposal

Migrate Intel Neural Compressor to https://github.com/onnx/neural-compressor

Maintain vendor-neutral branding (Neural Compressor) and welcome community contributions to enhance Neural Compressor with broader hardware support.

Questions:

  1. (Question proposed by Tangri, Saurabh from Intel) How would Neural Compressor scale to non-intel Hardware?

  2. (Question proposed by Tangri, Saurabh from Intel) Why not remove support for non-ONNX models in Neural Compressor?

Answer: (by Tangri, Saurabh from Intel) I feel interoperability has been a strength of the ONNX standard since its inception, and a quantization tool that supports other frameworks should be seen as an expression of that same openness. Yes, we can remove/move the non-ONNX perf data on the landing page so we don't appear to be promoting non-ONNX frameworks.

Follow-up question: How is model pruning and distillation related to ONNX?

  1. Regarding requirements in Rules for all repos and Requirements for new, contributed repos: Who will be actively maintaining the repo?

  2. “There are some questions raised about the tool, particularly around expansion to non-Intel hardware.” How can the tool be expanded to non-Intel hardware?

  3. (Gary from Microsoft) Some of the Intel code is redundant with what we have in microsoft/onnxruntime and microsoft/onnxconverter-common. I think it would be better to collaborate on one set of tools. How will the tool be used by the converters and onnxruntime?

Rules for all repos and Requirements for new, contributed repos

Rules for all repos

  1. Must be owned and managed by one of the ONNX SIGs (ArchInfra SIG)

  2. Must be actively maintained (Who will be actively maintaining the repo?)

  3. Must adopt the ONNX Code of Conduct (check)

  4. Must adopt the standard ONNX license(s) (already Apache-2.0 License)

  5. Must adopt the ONNX CLA bot (check)

  6. Must adopt all ONNX automation (like LGTM) (check)

  7. Must have CI or other automation in place for repos containing code to ensure quality (needs CI pipelines with good code coverage)

  8. All OWNERS must be members in good standing, as defined by the ability to vote in Steering Committee elections. (check)

Requirements for new, contributed repos

We are happy to accept contributions as repos under the ONNX organization of new projects that meet the following requirements:

  1. Project is closely related to ONNX ((Question proposed by Tangri, Saurabh from Intel) Why not remove support for non-ONNX models in Neural Compressor?)

  2. Adds value to the ONNX ecosystem (check)

  3. Determined to need a new repo rather than a folder in an existing repo (Is it possible to move it into ONNX Optimizer instead?)

  4. All contributors must have signed the ONNX CLA (check)

  5. Licenses of dependencies must be acceptable (check)

  6. Commitment to maintain the repo (Who will be actively maintaining the repo?)

  7. Approval of the SIG that will own the repo

  8. Approval of the Steering Committee
