Comments (17)
There's no standard way to integrate adapters into custom models. If our implementations for BERT-based models don't work for you, please have a look at our implementation to get an idea of how it might be adapted to your needs. Here are the important parts of the implementation:
- adapter_modeling.py implements the single adapter modules
- adapter_bert.py integrates the adapter modules into the BERT architecture. Stacking and fusion of adapters are implemented here. All the integration into the Huggingface model is done via Python mixins; note how every module of BERT in modeling_bert.py has a corresponding mixin here (see the sketch after this list). When supporting a custom model, you probably want to take this file and make changes to fit your model.
- adapter_model_mixin.py implements useful methods for saving and loading adapters. The classes here are explained in https://docs.adapterhub.ml/extending.html
- in the actual model class (e.g. modeling_bert.py), we only implement the mixins. No other changes are done here compared to Huggingface.

Those are all the main classes where our changes are implemented.
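To make the mixin pattern concrete, here is a minimal, hypothetical sketch of a bottleneck adapter module and a mixin that attaches such modules to a transformer sub-layer. All names here (Adapter, AdapterLayerMixin, init_adapters, adapters_forward) are illustrative, not the library's actual API:

```python
import torch
import torch.nn as nn
from typing import Optional

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, plus residual."""
    def __init__(self, hidden_size: int, reduction_factor: int = 16):
        super().__init__()
        bottleneck = hidden_size // reduction_factor
        self.down = nn.Linear(hidden_size, bottleneck)
        self.act = nn.ReLU()
        self.up = nn.Linear(bottleneck, hidden_size)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        return hidden_states + self.up(self.act(self.down(hidden_states)))

class AdapterLayerMixin:
    """Meant to be mixed into an nn.Module sub-layer (e.g. the BERT output
    layer) so it can hold named adapters and apply one after its own forward."""
    def init_adapters(self, hidden_size: int) -> None:
        self.adapters = nn.ModuleDict()  # registers adapter params on the host module
        self._adapter_hidden_size = hidden_size

    def add_adapter(self, name: str) -> None:
        self.adapters[name] = Adapter(self._adapter_hidden_size)

    def adapters_forward(self, hidden_states: torch.Tensor,
                         active: Optional[str] = None) -> torch.Tensor:
        # Apply the active adapter if one is set; otherwise pass through unchanged.
        if active is not None and active in self.adapters:
            hidden_states = self.adapters[active](hidden_states)
        return hidden_states
```

The real implementation additionally handles adapter configs, stacking, and fusion, but the core pattern is the same: a small residual bottleneck inserted by a mixin after each sub-layer.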
Unfortunately, we can't give detailed support for implementing custom models. If your model is part of the Huggingface repo, you can open a feature request so we can potentially support it officially. Otherwise, we can try to help out if you have any further specific questions about our implementation.
The pip release of adapter-transformers, 1.0.1, is built on transformers 2.11.0. However, the master branch of adapter-transformers is already updated to transformers 3.5.1. You can run pip install git+https://github.com/Adapter-Hub/adapter-transformers.git to get that version.
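If you are unsure which release you have installed, one way to check (a sketch; `adapter-transformers` is the pip distribution name, while the module it provides is imported as `transformers`):

```python
from importlib.metadata import version

# Prints the fork's own release number (e.g. 1.0.1), which is independent of
# the Huggingface transformers version the fork is built on.
print(version("adapter-transformers"))
```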
You can't have transformers and adapter-transformers in your env simultaneously. Your code is trying to call AutoModelWithHeads, which is available in adapter-transformers but not in transformers. Try creating a new env, and your bug should be fixed.
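A quick way to confirm which of the two packages is active in an environment (a sketch; the checkpoint name is just an example):

```python
# This import succeeds under adapter-transformers but raises ImportError under
# plain Huggingface transformers, which has no AutoModelWithHeads class.
from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("bert-base-uncased")
```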
Then my first reply is what you need to do. Again, you cannot have transformers and adapter-transformers in your env simultaneously. adapter-transformers is built on transformers 3.5.1.
You can find the models that are covered in our documentation. We are continuously working on supporting more models.
I just ran the notebook you linked and didn't get the issue you mentioned. As @JoPfeiff pointed out, if you run
pip install git+https://github.com/Adapter-Hub/adapter-transformers.git
you get the latest version based on transformers 3.5.1. It will show 1.0.1 in the list of packages because that's our version number, not Huggingface's. Under the hood it is transformers 3.5.1, even though our version number differs.
You can merge the latest version of the master branch from Huggingface if you need an even newer version.
For the adapter implementations, please refer to this section in the documentation: https://docs.adapterhub.ml/extending.html or the relevant files in code: https://github.com/Adapter-Hub/adapter-transformers/blob/master/src/transformers/adapter_modeling.py, https://github.com/Adapter-Hub/adapter-transformers/blob/master/src/transformers/adapter_bert.py.
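For the saving/loading part specifically, the round trip looks roughly like this (a sketch; the adapter name and path are illustrative, and the exact add_adapter signature varies between adapter-transformers versions):

```python
from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("bert-base-uncased")
model.add_adapter("my_adapter")                         # illustrative adapter name
model.save_adapter("./saved/my_adapter", "my_adapter")  # writes adapter weights + config
model.load_adapter("./saved/my_adapter")                # restores them into the model
```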
Hope this helps!
Hi @calpt and @JoPfeiff, thanks for all the info. I tried to read the pointers, but I still have difficulty understanding the code base and the essential parts of the adapter layers. I was wondering if either of you has time for a short 15-minute chat; I would be greatly thankful for your help. I need to understand the minimal amount of code needed to add adapters, so I can work out how to implement them for my model. Thanks!
As the discussion has shifted to #92, closing this thread in favor of #92 to keep the discussion in one place.