Topic: roberta
Something interesting about RoBERTa
roberta,One-stop solution to encode sentences into fixed-length vectors using various embedding techniques
User: amansrivastava17
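The entry above is about turning a sentence into a single fixed-length vector. A common way to do this is to mean-pool per-token embeddings; the sketch below illustrates the idea with hypothetical embedding values and is not tied to this repository's actual API.

```python
def mean_pool(token_vectors):
    """Average per-token embeddings into one fixed-length sentence vector.

    token_vectors: list of equal-length lists of floats, one per token.
    """
    if not token_vectors:
        raise ValueError("need at least one token vector")
    dim = len(token_vectors[0])
    return [sum(vec[i] for vec in token_vectors) / len(token_vectors)
            for i in range(dim)]

# Three hypothetical 4-dimensional token embeddings -> one 4-dim sentence vector.
tokens = [[1.0, 2.0, 0.0, 4.0],
          [3.0, 2.0, 0.0, 0.0],
          [2.0, 2.0, 0.0, 2.0]]
print(mean_pool(tokens))  # [2.0, 2.0, 0.0, 2.0]
```

The output length depends only on the embedding dimension, not the sentence length, which is what makes the vector "fixed-length".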
roberta,PyTorch implementation of "data2vec: A General Framework for Self-supervised Learning in Speech, Vision and Language" from Meta AI
User: arxyzan
roberta,Integrating the Best of TF into PyTorch, for Machine Learning, Natural Language Processing, and Text Generation. This is part of the CASL project: http://casl-project.ai/
Organization: asyml
Home Page: https://asyml.io
roberta,ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations, with a large collection of Chinese pre-trained ALBERT models
User: brightmart
Home Page: https://arxiv.org/pdf/1909.11942.pdf
roberta,Transforms multi-label classification into a sentence-pair task, with more training data and information
User: brightmart
roberta,Chinese pre-trained RoBERTa models: RoBERTa for Chinese
User: brightmart
roberta,Chinese pre-trained XLNet model: Pre-Trained Chinese XLNet_Large
User: brightmart
roberta,BOND: BERT-Assisted Open-Domain Named Entity Recognition with Distant Supervision
User: cliang1453
roberta,PromptCLUE, a zero-shot learning model supporting all Chinese NLP tasks
Organization: clue-ai
Home Page: https://www.clueai.cn
roberta,Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus and leaderboard
Organization: cluebenchmark
Home Page: http://www.CLUEbenchmarks.com
roberta,Large-scale Pre-training Corpus for Chinese: a 100 GB Chinese pre-training corpus
Organization: cluebenchmark
Home Page: https://arxiv.org/abs/2003.01355
roberta,CLUENER2020: fine-grained named entity recognition for Chinese
Organization: cluebenchmark
Home Page: https://arxiv.org/abs/2001.04351
roberta,A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and dedicated similarity models
Organization: cluebenchmark
Home Page: https://arxiv.org/abs/2003.01355
roberta,Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Organization: dbiir
Home Page: https://github.com/dbiir/UER-py/wiki
roberta,This repository contains the dataset and the PyTorch implementations of the models from the paper "Recognizing Emotion Cause in Conversations".
Organization: declare-lab
roberta,:house_with_garden: Fast & easy transfer learning for NLP. Harvesting language models for the industry. Focus on Question Answering.
Organization: deepset-ai
Home Page: https://farm.deepset.ai
roberta,MinT: Minimal Transformer Library and Tutorials
User: dpressel
roberta,Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
User: ericfillion
Home Page: http://happytransformer.com
roberta,🤖 A PyTorch library of curated Transformer models and their composable components
Organization: explosion
roberta,news-please - an integrated web crawler and information extractor for news that just works
User: fhamborg
roberta,Official implementation of the papers "GECToR – Grammatical Error Correction: Tag, Not Rewrite" (BEA-20) and "Text Simplification by Tagging" (BEA-21)
Organization: grammarly
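The GECToR entry above describes correcting grammar by tagging tokens with edits rather than rewriting the sentence. The sketch below applies a simplified subset of such per-token edit tags; the tag names follow the paper's style, but the function is an illustrative assumption, not this repository's implementation.

```python
def apply_tags(tokens, tags):
    """Apply GECToR-style per-token edit tags (simplified subset).

    Supported tags: $KEEP, $DELETE, $APPEND_<w> (insert <w> after the
    token), and $REPLACE_<w> (substitute the token with <w>).
    """
    out = []
    for token, tag in zip(tokens, tags):
        if tag == "$KEEP":
            out.append(token)
        elif tag == "$DELETE":
            continue  # drop the token entirely
        elif tag.startswith("$APPEND_"):
            out.append(token)
            out.append(tag[len("$APPEND_"):])
        elif tag.startswith("$REPLACE_"):
            out.append(tag[len("$REPLACE_"):])
        else:
            raise ValueError(f"unknown tag: {tag}")
    return out

# "She go to school" -> "She went to school"
print(apply_tags(["She", "go", "to", "school"],
                 ["$KEEP", "$REPLACE_went", "$KEEP", "$KEEP"]))
```

Because the model only predicts one tag per token, decoding is a single pass over the sentence, which is what makes the tag-not-rewrite approach fast.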
roberta,Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
User: guillaume-be
Home Page: https://docs.rs/crate/rust-bert
roberta,A PyTorch implementation of a BiLSTM/BERT/RoBERTa (+CRF) model for Named Entity Recognition.
User: hemingkx
roberta,A PyTorch implementation of a BiLSTM/BERT/RoBERTa (+ BiLSTM + CRF) model for Chinese Word Segmentation.
User: hemingkx
roberta,Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
User: hhousen
Home Page: https://transformersum.rtfd.io
roberta,MiniRBT (a series of small Chinese pre-trained models)
Organization: iflytek
roberta,A Dutch RoBERTa-based language model
User: ipieter
Home Page: https://pieter.ai/robbert/
roberta,KoCLIP: Korean port of OpenAI CLIP, in Flax
User: jaketae
Home Page: https://tinyurl.com/koclip-app
roberta,BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
User: jessevig
roberta,Sentence embeddings using Korean pre-trained language models
User: jhgan00
roberta,📖 Korean NLU Benchmark
Organization: klue-benchmark
Home Page: https://klue-benchmark.com
roberta,Simple State-of-the-Art BERT-Based Sentence Classification with Keras / TensorFlow 2. Built with HuggingFace's Transformers.
Organization: labteral
roberta,Pytorch-Named-Entity-Recognition-with-transformers
User: liuyukid
roberta,Awesome Pretrained Chinese NLP Models: a collection of high-quality Chinese pre-trained models, large models, multimodal models, and large language models
User: lonepatient
roberta,TensorFlow and Keras implementations of state-of-the-art research in dialog system NLU
User: mahmoudwahdan
roberta,The implementation of DeBERTa
Organization: microsoft
roberta,Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
Organization: microsoft
Home Page: https://arxiv.org/abs/2106.09685
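The loralib entry above references the LoRA technique: freeze the base weight matrix W and learn only a low-rank update scaled as (alpha/r) * B @ A, with B initialized to zero so training starts from the base model. A minimal NumPy sketch of that parameterization (illustrative only, not the loralib API):

```python
import numpy as np

class LoRALinear:
    """Frozen base weight W plus a trainable low-rank update (alpha/r) * B @ A."""

    def __init__(self, w, r=4, alpha=8, seed=0):
        self.w = w                      # frozen (d_out, d_in) base weight
        d_out, d_in = w.shape
        rng = np.random.default_rng(seed)
        self.a = rng.normal(scale=0.01, size=(r, d_in))  # trainable, random init
        self.b = np.zeros((d_out, r))   # trainable, zero init: update starts at 0
        self.scale = alpha / r

    def __call__(self, x):
        return x @ (self.w + self.scale * self.b @ self.a).T

w = np.eye(3)
layer = LoRALinear(w, r=2, alpha=4)
x = np.array([[1.0, 2.0, 3.0]])
# With B zero-initialized, the layer reproduces the frozen base exactly.
print(np.allclose(layer(x), x @ w.T))  # True
```

Only A and B (2 * r * d parameters per matrix pair) are trained, which is why LoRA fine-tuning is so much cheaper than updating the full weight.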
roberta,BERT classification model for processing texts longer than 512 tokens. Text is first divided into smaller chunks and after feeding them to BERT, intermediate results are pooled. The implementation allows fine-tuning.
Organization: mim-solutions
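The mim-solutions entry above explains the chunk-then-pool approach for texts longer than BERT's 512-token limit: split the token sequence into (possibly overlapping) chunks, classify each, and pool the per-chunk results. A schematic of those two steps, with hypothetical parameter names and without the actual BERT forward pass:

```python
def chunk_ids(ids, max_len=512, stride=256):
    """Split a long token-id sequence into overlapping chunks of at most max_len."""
    chunks = []
    start = 0
    while start < len(ids):
        chunks.append(ids[start:start + max_len])
        if start + max_len >= len(ids):
            break  # last chunk reaches the end of the sequence
        start += stride
    return chunks

def pool_logits(per_chunk_logits):
    """Mean-pool per-chunk classification logits into one prediction."""
    n = len(per_chunk_logits)
    dims = len(per_chunk_logits[0])
    return [sum(logits[i] for logits in per_chunk_logits) / n for i in range(dims)]

ids = list(range(1000))
chunks = chunk_ids(ids)       # chunks [0:512], [256:768], [512:1000]
print(len(chunks))            # 3
```

In the real model, each chunk would pass through BERT before pooling; mean pooling is one of several reasonable choices (max pooling over chunks is another).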
roberta,The programming environment "Open Roberta Lab" by Fraunhofer IAIS enables children and adolescents to program robots. A variety of programming blocks are provided to control the robots' motors and sensors. Open Roberta Lab uses a graphical programming approach so that beginners can start coding seamlessly. As a cloud-based application, the platform requires no software installation and runs in any popular browser, independent of operating system and device.
Organization: openroberta
roberta,Code for producing Japanese pretrained models provided by rinna Co., Ltd.
Organization: rinnakk
Home Page: https://huggingface.co/rinna
roberta,Implementation of the paper "Does Syntax Matter? A Strong Baseline for Aspect-based Sentiment Analysis with RoBERTa".
User: rogerdjq
roberta,Build and train state-of-the-art natural language processing models using BERT
User: sudharsan13296
Home Page: https://www.amazon.com/gp/product/B08LLDF377/ref=dbs_a_def_rwt_bibl_vppi_i5
roberta,Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Organization: tencent
Home Page: https://github.com/Tencent/TencentPretrain/wiki
roberta,A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT2, decoders, etc.) on CPU and GPU.
Organization: tencent
roberta,BERTweet: A pre-trained language model for English Tweets (EMNLP-2020)
Organization: vinairesearch
roberta,PhoBERT: Pre-trained language models for Vietnamese (EMNLP-2020 Findings)
Organization: vinairesearch
roberta,Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm series models)
User: ymcui
Home Page: https://ieeexplore.ieee.org/document/9599397
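The BERT-wwm entry above is built on whole word masking: when a word is split into several WordPiece subtokens, all of its subtokens are masked together rather than independently. A minimal sketch of that grouping rule, using the WordPiece `##` continuation convention (the function itself is an illustrative assumption, not this repository's code):

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Mask whole words: a '##' continuation subtoken joins the preceding
    token's group, and each group is masked together or not at all."""
    rng = random.Random(seed)
    groups = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and groups:
            groups[-1].append(i)  # continuation: same word as previous token
        else:
            groups.append([i])    # start of a new word
    out = list(tokens)
    for group in groups:
        if rng.random() < mask_prob:
            for i in group:
                out[i] = mask_token
    return out

# "play ##ing" is one word, so both pieces are masked together.
print(whole_word_mask(["play", "##ing", "well"], mask_prob=1.0))
```

The key property is the invariant that `play` and `##ing` are never masked separately, which forces the model to predict the whole word from context instead of completing an unmasked fragment.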
roberta,[NAACL 2021] This is the code for our paper "Fine-Tuning Pre-trained Language Model with Weak Supervision: A Contrastive-Regularized Self-Training Approach".
User: yueyu1030
Home Page: https://arxiv.org/pdf/2010.07835.pdf