bigcode-project / transformers
License: Apache License 2.0
transformers version: 4.27.0.dev0

Reproduction:

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("bigcode/santacoder-fast-inference")
```

The model should load without issue, but instead we get a bunch of errors of the form:

```
RuntimeError: Error(s) in loading state_dict for GPTBigCodeLMHeadModel:
size mismatch for transformer.h.0.attn.c_attn.weight: copying a param with shape torch.Size([2048, 2304]) from checkpoint, the shape in current model is torch.Size([2304, 2048]).
size mismatch for transformer.h.0.mlp.c_fc.weight: copying a param with shape torch.Size([2048, 8192]) from checkpoint, the shape in current model is torch.Size([8192, 2048]).
size mismatch for transformer.h.0.mlp.c_proj.weight: copying a param with shape torch.Size([8192, 2048]) from checkpoint, the shape in current model is torch.Size([2048, 8192]).
```
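Every reported shape is the exact transpose of what the current GPTBigCode code expects, which suggests the checkpoint stores these weights in the opposite (in_features, out_features) layout. Below is a minimal sketch for confirming this by inspecting the checkpoint directly; the `pytorch_model.bin` filename is an assumption about how the repository is laid out.

```python
# Minimal sketch: inspect raw checkpoint shapes without instantiating the model.
# Assumes the repo ships a single "pytorch_model.bin" weights file (hypothetical layout).
import torch
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download("bigcode/santacoder-fast-inference", "pytorch_model.bin")
state_dict = torch.load(ckpt_path, map_location="cpu")

# Current GPTBigCode code expects [2304, 2048]; the error says the checkpoint holds [2048, 2304].
print(state_dict["transformer.h.0.attn.c_attn.weight"].shape)
```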
transformers version: 4.27.0.dev0

While testing the bigcode/santacoder-fast-inference model on the openai_human_eval dataset, I am getting the following warning. Is there something to be concerned about?

anaconda3/envs/NLPWorkSpace/lib/python3.9/site-packages/transformers-4.27.0.dev0-py3.9.egg/transformers/models/gpt_bigcode/modeling_gpt_bigcode.py:259: UserWarning: FALLBACK path has been taken inside: runCudaFusionGroup. This is an indication that codegen Failed for some reason.
To debug try disable codegen fallback path via setting the env variable `export PYTORCH_NVFUSER_DISABLE=fallback`
(Triggered internally at /opt/conda/conda-bld/pytorch_1670525551200/work/torch/csrc/jit/codegen/cuda/manager.cpp:331.)
attn_weights = upcast_masked_softmax(attn_weights, attention_mask, mask_value, unscale, softmax_dtype)

Reproduction:

Running inference on OpenAI's HumanEval dataset leads to this warning, specifically when I use temperature = 0.2 and top_p = 0.2 in the model.generate method; a sketch of the call is shown below.
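For context, the generation call looks roughly like this. The prompt source, max_new_tokens, and the openai_humaneval dataset loading are illustrative assumptions; the sampling parameters are the ones reported above, with do_sample=True added because temperature/top_p only take effect when sampling is enabled.

```python
# Illustrative reproduction sketch (prompt choice and generation length are assumptions).
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bigcode/santacoder-fast-inference")
model = AutoModelForCausalLM.from_pretrained("bigcode/santacoder-fast-inference").to("cuda")

task = load_dataset("openai_humaneval", split="test")[0]
inputs = tokenizer(task["prompt"], return_tensors="pt").to("cuda")

# temperature / top_p only matter with sampling enabled.
outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.2,
    top_p=0.2,
    max_new_tokens=128,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```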
Expected behavior: no warning.
I have a Megatron-LM checkpoint trained using ALiBi. Since ALiBi doesn't add positional embeddings, I don't have them in my checkpoint either.
During conversion of my checkpoint to a HF transformers checkpoint, using src/transformers/models/megatron_gpt_bigcode/checkpoint_reshaping_and_interoperability.py, I get the error below.

AttributeError: 'dict' object has no attribute 'to'

This is because, I believe, the function get_element_from_dict_by_path is not consistent in its return type:
It returns the positional embeddings (a tensor) when the checkpoint has them.
It returns an empty dictionary when it doesn't (as in my case).
The issue arises later, at line 412, when we try to convert the data type of that function's output; a sketch of the failing path is below.
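A minimal sketch of the pattern in question. The path string, variable names, and the isinstance guard are illustrative assumptions about the conversion script, not its exact code:

```python
# Illustrative sketch of the failure and one possible guard (names/paths are assumptions).
position_embeddings = get_element_from_dict_by_path(
    tp_state_dicts[0], "model.language_model.embedding.position_embeddings.weight"
)

if isinstance(position_embeddings, dict):
    # ALiBi checkpoint: the key is absent, the helper returns an empty dict,
    # and position_embeddings.to(dtype) raises AttributeError.
    pass  # skip writing transformer.wpe.weight for ALiBi-trained checkpoints
else:
    output_state_dict["transformer.wpe.weight"] = position_embeddings.to(dtype)
```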
Can we add support for checkpoints trained using ALiBi?