Comments (6)
Hum, I will see if I can let people import any kind of TF model in PyTorch; that's a bit risky, so it has to be done properly.
In the meantime you can add global_step
to the list on line 53 of convert_tf_checkpoint_to_pytorch.py.
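For reference, the check in question sits in the loop that walks the TF checkpoint variables before they are mapped onto the PyTorch model (the getattr traversal that fails in the tracebacks below). A rough sketch of that loop with global_step added to the skip list, assuming a pytorch-pretrained-BERT-era convert_tf_checkpoint_to_pytorch.py; the exact line number varies between versions, so look for the existing adam_v/adam_m check rather than relying on line 53:

    for name, array in zip(names, arrays):
        name = name.split('/')
        # adam_v/adam_m are optimizer slot variables and global_step is a training
        # counter; none of them correspond to attributes of BertForPreTraining,
        # so they are skipped instead of being fed to the getattr traversal below.
        if any(n in ["adam_v", "adam_m", "global_step"] for n in name):
            print("Skipping {}".format("/".join(name)))
            continue
        pointer = model
        # ... the rest of the loop assigns `array` to the matching PyTorch parameter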
Maybe some additional information could help me help you?
Initialize PyTorch weight ['cls', 'seq_relationship', 'output_weights']
Skipping cls/seq_relationship/output_weights/adam_m
Skipping cls/seq_relationship/output_weights/adam_v
Traceback (most recent call last):
File "/home/tiandan.cxj/python/model_serving_python/lib/python3.5/runpy.py", line 193, in _run_module_as_main
"main", mod_spec)
File "/home/tiandan.cxj/python/model_serving_python/lib/python3.5/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/home/tiandan.cxj/platform/pytorch_BERT/pytorch-pretrained-BERT/pytorch_pretrained_bert/main.py", line 19, in
convert_tf_checkpoint_to_pytorch(TF_CHECKPOINT, TF_CONFIG, PYTORCH_DUMP_OUTPUT)
File "/home/tiandan.cxj/platform/pytorch_BERT/pytorch-pretrained-BERT/pytorch_pretrained_bert/convert_tf_checkpoint_to_pytorch.py", line 69, in convert_tf_checkpoint_to_pytorch
pointer = getattr(pointer, l[0])
File "/home/tiandan.cxj/python/model_serving_python/lib/python3.5/site-packages/torch/nn/modules/module.py", line 518, in getattr
type(self).name, name))
AttributeError: 'BertForPreTraining' object has no attribute 'global_step'
@thomwolf sir, I am also having the same issue and it is not resolved. How do I convert my fine-tuned pretrained model to PyTorch?
export BERT_BASE_DIR=/home/dell/backup/NWP/bert-base-uncased/bert_tensorflow_e100
pytorch_pretrained_bert convert_tf_checkpoint_to_pytorch \
$BERT_BASE_DIR/model.ckpt-100 \
$BERT_BASE_DIR/bert_config.json \
$BERT_BASE_DIR/pytorch_model.bin
Traceback (most recent call last):
File "/home/dell/Downloads/Downloads/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/dell/Downloads/Downloads/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/home/dell/backup/bert_env/lib/python3.6/site-packages/pytorch_pretrained_bert/__main__.py", line 19, in <module>
convert_tf_checkpoint_to_pytorch(TF_CHECKPOINT, TF_CONFIG, PYTORCH_DUMP_OUTPUT)
File "/home/dell/backup/bert_env/lib/python3.6/site-packages/pytorch_pretrained_bert/convert_tf_checkpoint_to_pytorch.py", line 69, in convert_tf_checkpoint_to_pytorch
pointer = getattr(pointer, l[0])
File "/home/dell/backup/bert_env/lib/python3.6/site-packages/torch/nn/modules/module.py", line 535, in __getattr__
type(self).__name__, name))
AttributeError: 'BertForPreTraining' object has no attribute 'global_step'
Sir, how can I resolve this issue?
Thanks.
Thanks @thomwolf sir, it was resolved.
I added global_step to the skip list in modeling.py, but I am still facing the error. Am I missing something?
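One thing worth checking (a generic diagnostic, not from the original thread): the console-script invocation shown above loads the copy of the package installed in site-packages, so an edit made in a separate clone's modeling.py or convert_tf_checkpoint_to_pytorch.py never runs. A quick way to confirm which files are actually imported, assuming the pytorch_pretrained_bert package name:

    import pytorch_pretrained_bert
    from pytorch_pretrained_bert import modeling

    # These paths show which installed copy is being used; the skip-list edit
    # (adding "global_step") must be made in that copy for it to take effect.
    print(pytorch_pretrained_bert.__file__)
    print(modeling.__file__)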