transformers' People

Contributors

aaugustin, alaradirik, amyeroberts, arthurzucker, dependabot[bot], erenup, gante, jplu, julien-c, lewtun, lukovnikov, lysandrejik, mfuntowicz, mrm8488, narsil, nielsrogge, patil-suraj, patrickvonplaten, rlouf, rocketknight1, sanchit-gandhi, sgugger, sshleifer, stas00, stefan-it, stevhliu, thomwolf, victorsanh, ydshieh, younesbelkada

transformers' Issues

Transformers can no longer load santacoder-fast-inference model

System Info

  • transformers version: 4.27.0.dev0
  • Platform: Linux-5.15.0-1026-aws-x86_64-with-glibc2.29
  • Python version: 3.8.10
  • Huggingface_hub version: 0.12.0
  • PyTorch version (GPU?): 1.13.1+cu117 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed

Who can help?

@jlamypoirier

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("bigcode/santacoder-fast-inference")

Expected behavior

The model should load without issue, but instead it fails with several size-mismatch errors of the form:

```
RuntimeError: Error(s) in loading state_dict for GPTBigCodeLMHeadModel:
        size mismatch for transformer.h.0.attn.c_attn.weight: copying a param with shape torch.Size([2048, 2304]) from checkpoint, the shape in current model is torch.Size([2304, 2048]).
        size mismatch for transformer.h.0.mlp.c_fc.weight: copying a param with shape torch.Size([2048, 8192]) from checkpoint, the shape in current model is torch.Size([8192, 2048]).
        size mismatch for transformer.h.0.mlp.c_proj.weight: copying a param with shape torch.Size([8192, 2048]) from checkpoint, the shape in current model is torch.Size([2048, 8192]).
```
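Every mismatched shape above is an exact transpose of what the model expects, which suggests the checkpoint stores its projection weights in a Conv1D-style (in_features, out_features) layout while the current GPTBigCode modeling code expects the Linear-style (out_features, in_features) layout. A minimal workaround sketch under that assumption; the checkpoint filename, the suffix list, and the strict=False load are all guesses from the traceback, not a confirmed fix:

```python
import torch
from huggingface_hub import hf_hub_download
from transformers import AutoConfig, AutoModelForCausalLM

repo = "bigcode/santacoder-fast-inference"

# Assumed checkpoint filename; adjust if the repo shards its weights.
path = hf_hub_download(repo, "pytorch_model.bin")
state_dict = torch.load(path, map_location="cpu")

# Transpose the weights named in the traceback back into the layout the
# current model expects. The suffix list comes from the error message and
# may be incomplete (square matrices would never trigger a size mismatch).
suffixes = ("attn.c_attn.weight", "mlp.c_fc.weight", "mlp.c_proj.weight")
for name in list(state_dict):
    if name.endswith(suffixes):
        state_dict[name] = state_dict[name].t().contiguous()

config = AutoConfig.from_pretrained(repo)
model = AutoModelForCausalLM.from_config(config)
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print("missing:", missing, "unexpected:", unexpected)
```

If the transposed load succeeds, that would point to a weight-layout change in the GPTBigCode port rather than a corrupted checkpoint.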

Running santacoder-fast-inference throws UserWarning: FALLBACK path has been taken inside

System Info

  • transformers version: 4.27.0.dev0
  • Platform: Linux-4.18.0 x86_64-with-glibc2.28
  • Python version: 3.9.0
  • Huggingface_hub version: 0.11.1
  • PyTorch version (GPU?): 1.13.1 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: Yes
  • Using distributed or parallel set-up in script?: No

Who can help?

While testing the bigcode/santacoder-fast-inference model on the openai_human_eval dataset, I am getting the following warning. Is there something to be concerned about?

anaconda3/envs/NLPWorkSpace/lib/python3.9/site-packages/transformers-4.27.0.dev0-py3.9.egg/transformers/models/gpt_bigcode/modeling_gpt_bigcode.py:259: UserWarning: FALLBACK path has been taken inside: runCudaFusionGroup. This is an indication that codegen Failed for some reason.
To debug try disable codegen fallback path via setting the env variable `export PYTORCH_NVFUSER_DISABLE=fallback`
(Triggered internally at /opt/conda/conda-bld/pytorch_1670525551200/work/torch/csrc/jit/codegen/cuda/manager.cpp:331.)
  attn_weights = upcast_masked_softmax(attn_weights, attention_mask, mask_value, unscale, softmax_dtype)
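The warning itself points at one debugging route. A minimal sketch, assuming PyTorch 1.13's nvFuser behavior: disable the fallback path before torch first compiles the fused kernels, so the underlying codegen error surfaces instead of being silently worked around.

```python
import os

# Must be set before the fused kernels first run; setting it before torch
# is imported is the safest place.
os.environ["PYTORCH_NVFUSER_DISABLE"] = "fallback"

import torch  # noqa: E402

# Alternative (an internal, non-public switch in PyTorch 1.13, so treat it
# as an assumption): disable nvFuser entirely, trading the fused softmax
# kernel for a warning-free run.
# torch._C._jit_set_nvfuser_enabled(False)
```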

@mayank31398 @joel

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

Running inference on OpenAI's HumanEval dataset leads to this warning, specifically when I use temperature=0.2 and top_p=0.2 in the model.generate method.
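A minimal sketch of the setup described above; only the model name and the temperature/top_p values come from this report, while the prompt, do_sample flag, and generation length are illustrative placeholders:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "bigcode/santacoder-fast-inference"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(name).to("cuda")

# HumanEval-style completion prompt (placeholder).
inputs = tokenizer("def fizzbuzz(n):", return_tensors="pt").to("cuda")
out = model.generate(**inputs, do_sample=True, temperature=0.2, top_p=0.2,
                     max_new_tokens=64)
print(tokenizer.decode(out[0]))
```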

Expected behavior

No warning.

Conversion of MegatronLM checkpoint to HF Transformers checkpoint fails (ALIBI used during training)

I have a Megatron-LM checkpoint trained using ALIBI. Since ALIBI doesn't add positional embeddings, my checkpoint doesn't contain them either.

While converting my checkpoint to an HF Transformers checkpoint using src/transformers/models/megatron_gpt_bigcode/checkpoint_reshaping_and_interoperability.py, I get the error below.

AttributeError: 'dict' object has no attribute 'to'

I believe this is because the function get_element_from_dict_by_path is not consistent in its return type: it returns the positional embeddings (a tensor) when they are present in the checkpoint, but an empty dictionary when they are not (as in my case).

The issue then surfaces at line 412, where the script converts the data type of that return value by calling .to() on it, which fails on a dict; see the sketch below.
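A self-contained sketch of the failure mode and a possible guard. The helper body mirrors get_element_from_dict_by_path as it appears in the sibling Megatron conversion scripts (an assumption about the exact source), and the toy checkpoint, path string, and dtype are illustrative:

```python
import torch


def get_element_from_dict_by_path(d, path):
    # Mirrors the helper in the Megatron conversion scripts: walks the key
    # chain and, crucially, creates and returns an empty dict when a key is
    # absent -- which is exactly what happens for ALIBI checkpoints.
    for key in path.split("."):
        if key not in d:
            d[key] = {}
        d = d[key]
    return d


# Toy checkpoint with no positional embeddings, as with ALIBI training.
checkpoint = {"model": {"word_embeddings": {"weight": torch.zeros(4, 8)}}}

pos_emb = get_element_from_dict_by_path(
    checkpoint, "model.position_embeddings.weight"  # illustrative path
)
if isinstance(pos_emb, torch.Tensor):
    pos_emb = pos_emb.to(torch.float16)  # the .to() call that crashes today
else:
    print("No positional embeddings found (ALIBI checkpoint); skipping.")
```

Skipping the assignment when the lookup comes back as a dict would let ALIBI-trained checkpoints pass through the conversion without positional embeddings.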


Can we add support for checkpoints trained using ALIBI?
