prajjwal1 / generalize_lm_nli
Code for the EMNLP 2021 workshop paper "Generalization in NLI: Ways (Not) To Go Beyond Simple Heuristics"
License: GNU General Public License v3.0
Issue
run_glue.py fails at Line 233 "if trainer.is_world_master():"
Solution
Replace "is_world_master" with "is_world_process_zero"
Reference: dmis-lab/biobert-pytorch#7
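The rename happened because newer versions of transformers replaced Trainer.is_world_master with Trainer.is_world_process_zero. A version-agnostic workaround can be sketched as a small compatibility helper (this wrapper is hypothetical, not part of the repo's code):

```python
def is_main_process(trainer):
    """Return True on the main process, across transformers versions.

    Newer transformers versions expose Trainer.is_world_process_zero();
    older ones expose Trainer.is_world_master(). Prefer the new name.
    """
    if hasattr(trainer, "is_world_process_zero"):
        return trainer.is_world_process_zero()
    return trainer.is_world_master()
```

With this helper, the failing line in run_glue.py would become `if is_main_process(trainer):`, and the script would run on either library version.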
Self-explanatory!
Google's release of the 24 smaller BERT models at https://github.com/google-research/bert does not include the .meta file. For example, the bert-base model (https://storage.googleapis.com/bert_models/2020_02_20/uncased_L-12_H-768_A-12.zip) provided in the list does not include the .meta file, while the original release of bert-base by Google at https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip contains it. Is the .meta file needed if we load the TF checkpoints in PyTorch using Hugging Face?
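For context: the .meta file stores the TensorFlow graph definition, while Hugging Face's TF-to-PyTorch conversion only reads the checkpoint weight shards (.index and .data-*) together with the config and vocab. A hypothetical helper (not part of the repo) to check whether a downloaded checkpoint directory has everything the conversion needs could look like this:

```python
import os

# Files the Hugging Face conversion actually reads; the .meta graph file
# is deliberately absent from this list because it is not required.
REQUIRED_FILES = ["bert_config.json", "vocab.txt"]
CKPT_SUFFIXES = [".index", ".data-00000-of-00001"]

def checkpoint_complete(ckpt_dir, ckpt_prefix="bert_model.ckpt"):
    """Return True if ckpt_dir contains a convertible TF BERT checkpoint."""
    files = set(os.listdir(ckpt_dir))
    if any(name not in files for name in REQUIRED_FILES):
        return False
    return all(ckpt_prefix + suffix in files for suffix in CKPT_SUFFIXES)
```

Under that assumption, the 2020 zips without .meta should still load fine via transformers' from_tf=True path.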
Hi @prajjwal1,
Thank you very much for this great repo and work.
I would be interested in working with your bert-tiny model.
For this work, it would be important to know exactly what data this model was trained on. I could not find a concrete pointer.
Would you be able to guide me to a specification of this information?
Thank you a lot in advance and kind regards!
Hi,
quick question: do all the bert-(mini / tiny / medium / small) models have the same tokenizer?
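One way to check this locally, under the assumption that identical vocab.txt files imply identical WordPiece tokenizer behavior, is to compare the vocab files shipped with each checkpoint (the paths below are placeholders, not actual repo paths):

```python
import hashlib

def vocab_digest(path):
    """SHA-256 digest of a tokenizer vocab file."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def same_tokenizer_vocab(path_a, path_b):
    """True if two checkpoints ship byte-identical vocab files."""
    return vocab_digest(path_a) == vocab_digest(path_b)

# Example (hypothetical paths):
# same_tokenizer_vocab("bert-tiny/vocab.txt", "bert-mini/vocab.txt")
```

If the digests match for every pair, a single tokenizer can safely be reused across the model sizes.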
Hi
I have tried to use the bert-small-mnli model hosted on HF, and it seems I have a couple of problems:
1. There is no mxlen provided, so no padding happens AFAICT. Is that correct?
2. Should I pass input_ids, token_type_ids and attention_mask? If I just use input_ids, the results seem to match with the HF API (there's still a small difference, but I am tokenizing a bit differently), but otherwise, there's a huge difference.
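On the attention_mask point: once sequences in a batch are padded, the padded positions must be excluded from attention, otherwise they receive nonzero weight and shift the model's output. A toy sketch (plain Python, not HF code) of the masking step illustrates why:

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def masked_softmax(scores, mask):
    """Softmax over attention scores; mask is 1 for real tokens, 0 for padding.

    Padded positions are set to -inf before the softmax, so they end up
    with exactly zero attention weight.
    """
    masked = [s if m == 1 else float("-inf") for s, m in zip(scores, mask)]
    return softmax(masked)

scores = [2.0, 1.0, 0.5, 0.0]  # last two positions are padding
mask = [1, 1, 0, 0]
weights = masked_softmax(scores, mask)  # padded positions get weight 0.0
```

Without the mask, the padded positions would absorb part of the attention mass, which is consistent with the large output difference reported when padding without passing attention_mask.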