Topic: pre-training (Goto Github)
Something interesting about pre-training
pre-training,Code for KDD'20 "Generative Pre-Training of Graph Neural Networks"
User: acbull
pre-training,OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0
User: akanyaani
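For readers new to the objective, the sketch below shows the next-token prediction loss that GPT-2 pre-training minimizes. It is a minimal PyTorch illustration (the repo above implements this in TensorFlow 2.0), and `TinyLM` is a hypothetical stand-in model, not code from the repo.

```python
# Minimal sketch of the GPT-2 pre-training objective: next-token prediction.
# `TinyLM` is a toy stand-in; a real model would be a Transformer decoder.
import torch
import torch.nn as nn

class TinyLM(nn.Module):
    def __init__(self, vocab_size=50257, d_model=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.proj = nn.Linear(d_model, vocab_size)

    def forward(self, ids):
        return self.proj(self.embed(ids))       # (batch, seq, vocab) logits

model = TinyLM()
ids = torch.randint(0, 50257, (2, 16))          # toy token ids
logits = model(ids[:, :-1])                     # predict token t+1 from tokens <= t
loss = nn.functional.cross_entropy(
    logits.reshape(-1, logits.size(-1)), ids[:, 1:].reshape(-1)
)
loss.backward()
```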
pre-training,[NeurIPS 2022] Zero-Shot Video Question Answering via Frozen Bidirectional Language Models
User: antoyang
Home Page: https://arxiv.org/abs/2206.08155
pre-training,[NeurIPS 2023 D&B] VidChapters-7M: Video Chapters at Scale
User: antoyang
Home Page: http://arxiv.org/abs/2309.13952
pre-training,A collection of pre-trained audio and speech models.
User: balavenkatesh3322
pre-training,Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
User: brightmart
pre-training,Papers about pretraining and self-supervised learning on Graph Neural Networks (GNN).
User: chandlerbang
pre-training,Research code for ECCV 2020 paper "UNITER: UNiversal Image-TExt Representation Learning"
User: chenrocks
Home Page: https://arxiv.org/abs/1909.11740
pre-training,Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Organization: dbiir
Home Page: https://github.com/dbiir/UER-py/wiki
pre-training,GearNet and Geometric Pretraining Methods for Protein Structure Representation Learning, ICLR'2023 (https://arxiv.org/abs/2203.06125)
Organization: deepgraphlearning
pre-training,Awesome resources for in-context learning and prompt engineering: mastery of LLMs such as ChatGPT, GPT-3, and FlanT5, with up-to-date, cutting-edge content.
User: egoalpha
Home Page: https://github.com/EgoAlpha/prompt-in-context-learning
pre-training,Generative AI for Math: MathPile
Organization: gair-nlp
Home Page: https://gair-nlp.github.io/MathPile/
pre-training,Code for our SIGKDD'22 paper "Pre-training-Enhanced Spatial-Temporal Graph Neural Network for Multivariate Time Series Forecasting".
Organization: gestaltcogteam
pre-training,Conceptual 12M is a dataset containing (image-URL, caption) pairs collected for vision-and-language pre-training.
Organization: google-research-datasets
pre-training,Autoregressive Predictive Coding: An unsupervised autoregressive model for speech representation learning
User: iamyuanchung
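The core APC objective is compact enough to sketch: encode past acoustic frames with an autoregressive model and predict the frame n steps ahead under an L1 loss. The sketch below uses toy shapes and a single GRU, not the repo's actual model.

```python
# Minimal sketch of Autoregressive Predictive Coding: from frames x_1..x_t,
# predict the frame n steps ahead and minimize an L1 loss on the features.
import torch
import torch.nn as nn

n = 3                                    # prediction time shift
rnn = nn.GRU(input_size=80, hidden_size=512, batch_first=True)
head = nn.Linear(512, 80)

x = torch.randn(4, 100, 80)              # (batch, frames, mel bins), toy data
h, _ = rnn(x[:, :-n])                    # encode the past frames
pred = head(h)                           # predict frame t+n from history up to t
loss = nn.functional.l1_loss(pred, x[:, n:])
loss.backward()
```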
pre-training,Code for ICLR 2020 paper "VL-BERT: Pre-training of Generic Visual-Linguistic Representations".
User: jackroos
pre-training,Reproducible scaling laws for contrastive language-image learning (https://arxiv.org/abs/2212.07143)
Organization: laion-ai
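The training signal underlying contrastive language-image learning (and this scaling-law study) is the symmetric InfoNCE loss below. This is a minimal sketch assuming embeddings from separate image and text encoders; it is not the repo's training code.

```python
# Minimal sketch of the CLIP-style symmetric contrastive loss: matching
# image/text pairs sit on the diagonal of the similarity matrix.
import torch
import torch.nn.functional as F

def clip_loss(img_emb, txt_emb, temperature=0.07):
    img = F.normalize(img_emb, dim=-1)
    txt = F.normalize(txt_emb, dim=-1)
    logits = img @ txt.t() / temperature          # (batch, batch) similarities
    labels = torch.arange(len(logits))            # positives on the diagonal
    return (F.cross_entropy(logits, labels) +
            F.cross_entropy(logits.t(), labels)) / 2

loss = clip_loss(torch.randn(8, 512), torch.randn(8, 512))
```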
pre-training,The repository of ET-BERT, a network traffic classification model for encrypted traffic. The work was accepted as a paper at The Web Conference (WWW) 2022.
User: linwhitehat
pre-training,Code for TKDE paper "Self-supervised learning on graphs: Contrastive, generative, or predictive"
User: lirongwu
pre-training,A simple and working implementation of Electra, the fastest way to pretrain language models from scratch, in Pytorch
User: lucidrains
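ELECTRA's pre-training task, replaced token detection, can be sketched in a few lines. In the real method the replacements are sampled from a small generator MLM; here they are random tokens purely for illustration, and the tiny discriminator is a hypothetical stand-in.

```python
# Minimal sketch of replaced-token detection: corrupt some positions with
# sampled tokens, then train a discriminator to flag the replaced ones.
import torch
import torch.nn as nn

vocab, d = 30522, 128
discriminator = nn.Sequential(nn.Embedding(vocab, d), nn.Linear(d, 1))

ids = torch.randint(0, vocab, (2, 32))
mask = torch.rand(ids.shape) < 0.15                 # positions to corrupt
corrupted = torch.where(mask, torch.randint_like(ids, vocab), ids)

logits = discriminator(corrupted).squeeze(-1)       # per-token real/replaced score
labels = (corrupted != ids).float()                 # 1 where a token was replaced
loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
loss.backward()
```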
pre-training,An implementation of masked language modeling for Pytorch, made as concise and simple as possible
User: lucidrains
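The masked language modeling objective itself fits in a short sketch: mask roughly 15% of the tokens and compute cross-entropy only at the masked positions. The tiny encoder below is a hypothetical stand-in, not the repo's model.

```python
# Minimal sketch of masked language modeling: replace a random subset of
# tokens with [MASK] and score the loss only at the masked positions.
import torch
import torch.nn as nn

vocab, mask_id = 30522, 103
encoder = nn.Sequential(nn.Embedding(vocab, 128), nn.Linear(128, vocab))

ids = torch.randint(0, vocab, (2, 32))
mask = torch.rand(ids.shape) < 0.15
inputs = ids.masked_fill(mask, mask_id)             # replace with [MASK]

logits = encoder(inputs)                            # (batch, seq, vocab)
loss = nn.functional.cross_entropy(logits[mask], ids[mask])
loss.backward()
```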
pre-training,[Survey] Masked Modeling for Self-supervised Representation Learning on Vision and Beyond (https://arxiv.org/abs/2401.00897)
User: lupin1998
Home Page: https://openmixup.readthedocs.io/en/latest/awesome_selfsup/MIM.html
pre-training,💐Kaleido-BERT: Vision-Language Pre-training on Fashion Domain
User: mczhuge
pre-training,[ICML 2020] DrRepair: Learning to Repair Programs from Error Messages
User: michiyasunaga
Home Page: https://arxiv.org/abs/2005.10636
pre-training,Oscar (Object-Semantics Aligned Pre-training) and VinVL (improved visual representations) for vision-language tasks
Organization: microsoft
pre-training,Multi-modality pre-training
Organization: microsoft
pre-training,A one-stop data processing system to make data higher-quality, juicier, and more digestible for (multimodal) LLMs! 🍎 🍋 🌽 ➡️ ➡️🍸 🍹 🍷
Organization: modelscope
pre-training,Use PEFT or full-parameter training to fine-tune 350+ LLMs or 90+ MLLMs. (Qwen2.5, GLM4v, Internlm2.5, Yi, Llama3.1, Llava-Video, Internvl2, MiniCPM-V-2.6, Deepseek, Baichuan2, Gemma2, Phi3-Vision, ...)
Organization: modelscope
Home Page: https://swift.readthedocs.io/zh-cn/latest/Instruction/index.html
pre-training,Large Language Model-enhanced Recommender System Papers
User: nancheng58
pre-training,[CVPR 2024 Highlight] Visual Point Cloud Forecasting
Organization: opendrivelab
Home Page: https://arxiv.org/abs/2312.17655
pre-training,[ICLR 2024] Sheared LLaMA: Accelerating Language Model Pre-training via Structured Pruning
Organization: princeton-nlp
Home Page: https://arxiv.org/abs/2310.06694
pre-training,A professional list on Large (Language) Models and Foundation Models (LLM, LM, FM) for Time Series, Spatiotemporal, and Event Data.
User: qingsongedu
pre-training,The official GitHub page for the survey paper "A Survey of Large Language Models".
Organization: rucaibox
Home Page: https://arxiv.org/abs/2303.18223
pre-training,[ICML 2024] Unified Training of Universal Time Series Forecasting Transformers
Organization: salesforceairesearch
pre-training,Probing the representations of Vision Transformers.
User: sayakpaul
Home Page: https://keras.io/examples/vision/probing_vits/
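Probing typically means freezing the pre-trained backbone and training only a linear classifier on its features. The repo above works in Keras; the following is a minimal PyTorch sketch with a placeholder backbone, not the repo's code.

```python
# Minimal sketch of linear probing: freeze a pre-trained backbone and train
# only a linear classifier on its features. The backbone is a placeholder.
import torch
import torch.nn as nn

backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256))  # stand-in
for p in backbone.parameters():
    p.requires_grad = False                      # probe, don't fine-tune

probe = nn.Linear(256, 10)
opt = torch.optim.Adam(probe.parameters(), lr=1e-3)

x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
with torch.no_grad():
    feats = backbone(x)                          # frozen features
loss = nn.functional.cross_entropy(probe(feats), y)
loss.backward()
opt.step()
```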
pre-training,[NeurIPS 2020] "Graph Contrastive Learning with Augmentations" by Yuning You, Tianlong Chen, Yongduo Sui, Ting Chen, Zhangyang Wang, Yang Shen
User: shen-lab
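GraphCL trains a GNN so that two augmented views of the same graph embed close together under an NT-Xent loss. The sketch below is a simplified variant that uses only cross-view negatives; `z1` and `z2` stand in for the GNN readouts of the two views.

```python
# Minimal sketch of an NT-Xent-style contrastive loss between two augmented
# views of each graph; positives are the matching rows of z1 and z2.
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, tau=0.5):
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    sim = z1 @ z2.t() / tau                      # cross-view similarities
    labels = torch.arange(len(sim))              # positives on the diagonal
    return F.cross_entropy(sim, labels)

loss = nt_xent(torch.randn(16, 64), torch.randn(16, 64))
```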
pre-training,[CVPR2023] All in One: Exploring Unified Video-Language Pre-training
Organization: showlab
Home Page: https://arxiv.org/abs/2203.07303
pre-training,Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Organization: tencent
Home Page: https://github.com/Tencent/TencentPretrain/wiki
pre-training,GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training @ KDD 2020
Organization: thudm
pre-training,The official repo for [JSTARS'24] "MTP: Advancing Remote Sensing Foundation Model via Multi-Task Pretraining"
Organization: vitae-transformer
pre-training,The official repo for [NeurIPS'23] "SAMRS: Scaling-up Remote Sensing Segmentation Dataset with Segment Anything Model"
Organization: vitae-transformer
pre-training,[MIR-2023-Survey] A continuously updated paper list for multi-modal pre-trained big models
User: wangxiao5791509
pre-training,Code and Data for EMNLP2020 Paper "KGPT: Knowledge-Grounded Pre-Training for Data-to-Text Generation"
User: wenhuchen
pre-training,Paper List of Pre-trained Foundation Recommender Models
Organization: westlake-repl
pre-training,A curated list of papers on pre-training for graph neural networks (Pre-train4GNN).
User: yuanchenbei
pre-training,Awesome list for research on CLIP (Contrastive Language-Image Pre-Training).
User: yzhuoning
pre-training,A comprehensive survey of forging vision foundation models for autonomous driving, including challenges, methodologies, and opportunities.
User: zhanghm1995
pre-training,Bamboo: 4 times larger than ImageNet; 2 times larger than Objects365; built via active learning.
User: zhangyuanhan-ai
pre-training,Collection of training data management explorations for large language models
User: zigew
Home Page: https://arxiv.org/abs/2312.01700
pre-training,An open-source, knowledgeable large language model framework.
Organization: zjunlp
Home Page: http://knowlm.zjukg.cn/