Topic: distillation Goto Github
Something interesting about distillation.
distillation,PyTorch implementation of various Knowledge Distillation (KD) methods.
User: aberhu
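The classic KD objective that toolkits like this typically implement can be sketched as follows. This is a minimal illustration of the standard Hinton-style distillation loss, not code taken from any listed repository; the function name and defaults are our own.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic knowledge distillation loss (Hinton et al., 2015):
    a weighted sum of a soft-target KL term and a hard-label CE term."""
    # KL between temperature-softened teacher and student distributions
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients match the hard-loss magnitude
    # Ordinary cross-entropy against the ground-truth labels
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

Most repos in this list vary the choice of what is matched (logits, features, attention maps) rather than this basic weighted-sum structure.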
distillation,[ECCV2022] Factorizing Knowledge in Neural Networks
User: adamdad
distillation,A PyTorch-based knowledge distillation toolkit for natural language processing
User: airaria
Home Page: http://textbrewer.hfl-rc.com
distillation,PyTorch implementation of the CVPR 2021 paper "SuperMix: Supervising the Mixing Data Augmentation".
User: alldbi
distillation,irresponsible innovation. Try now at https://chat.dev/
Organization: anarchy-ai
Home Page: https://anarchy.ai/
distillation,Implementations of three adversarial example attack methods (FGSM, I-FGSM, MI-FGSM) and of defensive distillation as a defense against all three, on the MNIST dataset.
User: as791
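The simplest of the attacks named above, FGSM, can be sketched in a few lines. This is a generic illustration under our own naming, not the implementation from this repository.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, eps=0.1):
    """Fast Gradient Sign Method: perturb the input one step in the
    direction of the sign of the loss gradient w.r.t. the input."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    x_adv = x + eps * x.grad.sign()
    return x_adv.clamp(0.0, 1.0).detach()  # keep pixels in a valid range
```

I-FGSM applies this step repeatedly with a smaller `eps`, and MI-FGSM adds a momentum term to the accumulated gradient.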
distillation,The Biorefinery Simulation and Techno-Economic Analysis Modules; Life Cycle Assessment; Chemical Process Simulation Under Uncertainty
Organization: biosteamdevelopmentgroup
distillation,A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and models specialized for similarity.
Organization: cluebenchmark
Home Page: https://arxiv.org/abs/2003.01355
distillation,Awesome Knowledge Distillation
User: dkozlov
distillation,(CVPR 2022) A minimalist, mapless, end-to-end self-driving stack for joint perception, prediction, planning and control.
User: dotchen
Home Page: https://dotchen.github.io/LAV/
distillation,(ICCV 2021, Oral) RL and distillation in CARLA using a factorized world model
User: dotchen
Home Page: https://dotchen.github.io/world_on_rails/
distillation,Distillation of BERT model with catalyst framework
User: elephantmipt
distillation,Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014-2021).
User: flhonker
distillation,Filter Grafting for Deep Neural Networks(CVPR 2020)
User: fxmeng
Home Page: https://arxiv.org/abs/2001.05868
distillation,A TensorFlow-based deep learning NLP framework with a Scikit-Learn-style API. Supports 40+ model classes covering language modeling, text classification, NER, MRC, knowledge distillation, and more.
User: geyingli
distillation,A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).
User: gmvandeven
distillation,PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
User: gmvandeven
distillation,Official implementation of ⚡ Flash Diffusion ⚡: Accelerating Any Conditional Diffusion Model for Few Steps Image Generation
Organization: gojasper
Home Page: https://gojasper.github.io/flash-diffusion-project/
distillation,Papers and Book to look at when starting AGI 📚
User: gyunggyung
distillation,Knowledge distillation for Chinese text classification with PyTorch: BERT and XLNet teacher models, biLSTM student model.
User: hoytta0
distillation,A list of papers, docs, and code on efficient AIGC, covering both language and vision. The list is continuously updated; PRs adding missing works (papers, repositories) are welcome.
User: htqin
distillation,🤗 Optimum Intel: Accelerate inference with Intel optimization tools
Organization: huggingface
Home Page: https://huggingface.co/docs/optimum/main/en/intel/index
distillation,Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Organization: intellabs
distillation,🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
User: julesbelveze
Home Page: https://julesbelveze.github.io/bert-squeeze/
distillation,PyTorch implementation of the ACCV 2018 paper "Revisiting Distillation and Incremental Classifier Learning".
User: khurramjaved96
distillation,Insightface Keras implementation
User: leondgarse
distillation,Distillation of KoBERT from SKTBrain (Lightweight KoBERT)
User: monologg
distillation,A Compressed Stable Diffusion for Efficient Text-to-Image Generation [ECCV'24]
Organization: nota-netspresso
distillation,Zero-label image classification via OpenCLIP knowledge distillation
Organization: nvidia-ai-iot
distillation,PaddleSlim is an open-source library for deep model compression and architecture search.
Organization: paddlepaddle
Home Page: https://paddleslim.readthedocs.io/zh_CN/latest/
distillation,DistillBEV: Boosting Multi-Camera 3D Object Detection with Cross-Modal Knowledge Distillation (ICCV 2023)
User: qcraftai
distillation,BERT distillation (distillation experiments based on BERT).
User: qiangsiwei
distillation,Segmind Distilled diffusion
Organization: segmind
Home Page: https://discord.gg/p2MdJqZXnb
distillation,YOLOv5 knowledge distillation training; supports training on your own data.
User: sharpiless
distillation,[ICLR 2022] Code for Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation (GLNN)
Organization: snap-research
distillation,[ECCV 2022] R2L: Distilling Neural Radiance Field to Neural Light Field for Efficient Novel View Synthesis
Organization: snap-research
Home Page: https://snap-research.github.io/R2L/
distillation,MobileNetV2-YOLOv5s pruning and distillation; supports ncnn and TensorRT deployment. Ultra-light but with better performance!
User: syencil
distillation,Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition"
User: szq0214
distillation,MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop.
User: szq0214
distillation,A collection of industry-classic and cutting-edge papers in the fields of recommendation, advertising, and search.
User: tangxyw
Home Page: https://tangxyw.github.io/
distillation,A Python library for adversarial machine learning focusing on benchmarking adversarial robustness.
Organization: thu-ml
Home Page: https://thu-ml-ares.rtfd.io
distillation,code for our CVPR 2022 paper "DINE: Domain Adaptation from Single and Multiple Black-box Predictors"
User: tim-learn
distillation,The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [TPAMI'23] "ViTPose++: Vision Transformer for Generic Body Pose Estimation"
Organization: vitae-transformer
distillation,A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm-large.
User: xiongma
distillation,PyTorch implementation of the CVPR 2021 paper "Distilling Audio-Visual Knowledge by Compositional Contrastive Learning".
User: yanbeic
distillation,[ICLR 2024] VQGraph: Rethinking Graph Representation Space for Bridging GNNs and MLPs
User: yangling0818
Home Page: https://openreview.net/forum?id=h6Tz85BqRI
distillation,Adaptive, interpretable wavelets across domains (NeurIPS 2021)
Organization: yu-group
Home Page: https://arxiv.org/abs/2107.09145
distillation,Code for DisCo: Remedy Self-supervised Learning on Lightweight Models with Distilled Contrastive Learning
User: yuting-gao
distillation,Quantization library for PyTorch. Support low-precision and mixed-precision quantization, with hardware implementation through TVM.
User: zhen-dong
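The low-precision quantization this entry refers to is commonly simulated in float via "fake quantization". The sketch below is a generic uniform affine scheme under our own naming, not code from the listed library.

```python
import torch

def fake_quantize(x, num_bits=8):
    """Uniform affine fake quantization: map floats to integer levels,
    then dequantize, simulating low-precision inference in float."""
    qmin, qmax = 0, 2 ** num_bits - 1
    # Per-tensor scale and zero point from the observed value range
    scale = (x.max() - x.min()) / (qmax - qmin)
    zero_point = qmin - torch.round(x.min() / scale)
    # Quantize, clamp to the representable range, then dequantize
    q = torch.clamp(torch.round(x / scale + zero_point), qmin, qmax)
    return (q - zero_point) * scale
```

Mixed-precision schemes assign a different `num_bits` per layer, trading accuracy against the cost of each layer on the target hardware.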