Comments (2)
Your feature request to enhance the Trainer class in the Transformers library so that it can handle multiple datasets representing different domains and compute domain-specific losses is valuable for projects involving sequence-level distillation across domains. Here are some potential contributions that could address these requirements:
- Multiple dataset handling support: Modify the Trainer class to accept multiple datasets representing different domains directly as input. This would streamline the integration of diverse data sources within a single training loop (a workaround using the existing datasets library is sketched after this list).
- Domain-specific loss calculation: Implement a mechanism within the Trainer class to compute a separate loss for each domain's dataset during the training loop, allowing domain-specific losses to be calculated and then aggregated into a global training objective (see the compute_loss sketch below).
- Flexible loss aggregation: Introduce a flexible mechanism for aggregating domain-specific losses into the global training objective, so users can define custom aggregation strategies for their projects (the sketch below uses per-domain weights as one such strategy).
- Unified training interface: Enhance the Trainer class to expose a unified training interface that abstracts away the subclassing and method overriding currently required for multiple datasets and domain-specific losses (a usage sketch follows the loss example below).
- Documentation and examples: Update the Trainer documentation with clear explanations and examples showing how to handle multiple datasets and calculate domain-specific losses; detailed guidance would help the community adopt these features.
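Until such support lands, the pieces can be approximated with today's APIs. Below is a minimal sketch of the multi-dataset side using the existing datasets library; the dataset choices, the domain_id column, and the sampling probabilities are illustrative assumptions, not library defaults.

```python
from datasets import load_dataset, interleave_datasets

# Two example per-domain corpora (substitute your own); their original
# label columns are dropped so the features match for interleaving.
news = load_dataset("ag_news", split="train").remove_columns("label")
reviews = load_dataset("imdb", split="train").remove_columns("label")

# Tag every example with an integer domain id so the loss can be split later.
news = news.map(lambda ex: {"domain_id": 0})
reviews = reviews.map(lambda ex: {"domain_id": 1})

# Interleave the domains into one stream; `probabilities` sets the
# per-domain sampling rate seen by the training loop.
train_dataset = interleave_datasets(
    [news, reviews],
    probabilities=[0.5, 0.5],
    seed=42,
    stopping_strategy="all_exhausted",
)
```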
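For the domain-specific losses and their aggregation, one can subclass Trainer and override compute_loss. The following is a sketch, not an existing transformers class: MultiDomainTrainer, the domain_id column, and the per-domain weight vector are all assumptions, and the loss shown is a plain causal-LM cross-entropy rather than a full distillation objective.

```python
import torch
import torch.nn.functional as F
from transformers import Trainer


class MultiDomainTrainer(Trainer):
    """Hypothetical Trainer that weights each example's loss by its domain."""

    def __init__(self, *args, domain_weights=None, **kwargs):
        super().__init__(*args, **kwargs)
        # domain_weights: float tensor of shape (num_domains,); uniform if None.
        self.domain_weights = domain_weights

    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        # Note: TrainingArguments(remove_unused_columns=False) is needed so
        # the bookkeeping "domain_id" column reaches this method.
        domain_ids = inputs.pop("domain_id")   # (batch,)
        labels = inputs.pop("labels")          # (batch, seq), -100 = ignore
        outputs = model(**inputs)

        # Shifted per-token cross-entropy with no reduction, then a
        # per-example mean over the non-ignored positions.
        logits = outputs.logits[:, :-1, :]
        targets = labels[:, 1:]
        per_token = F.cross_entropy(
            logits.transpose(1, 2), targets,
            reduction="none", ignore_index=-100,
        )                                      # (batch, seq - 1)
        mask = (targets != -100).float()
        per_example = (per_token * mask).sum(-1) / mask.sum(-1).clamp(min=1)

        # Aggregation strategy: weight each example by its domain's weight.
        # Other strategies (e.g. per-domain averages first) would slot in here.
        if self.domain_weights is not None:
            weights = self.domain_weights.to(per_example.device)[domain_ids]
            loss = (weights * per_example).mean()
        else:
            loss = per_example.mean()
        return (loss, outputs) if return_outputs else loss
```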
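Wiring the two sketches together then looks like a single, ordinary Trainer setup, which is roughly the unified interface the request asks for. Everything below uses real transformers/datasets APIs except MultiDomainTrainer and its domain_weights argument, which come from the sketch above; masking the pad token also masks GPT-2's EOS here, an acceptable simplification for illustration.

```python
import torch
from transformers import (
    AutoModelForCausalLM, AutoTokenizer,
    TrainingArguments, default_data_collator,
)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

def tokenize(batch):
    enc = tokenizer(batch["text"], truncation=True,
                    padding="max_length", max_length=128)
    # Standard LM labels: copy the inputs, ignore padding positions.
    enc["labels"] = [
        [tok if tok != tokenizer.pad_token_id else -100 for tok in ids]
        for ids in enc["input_ids"]
    ]
    return enc

# train_dataset is the interleaved multi-domain dataset from the first sketch.
train_dataset = train_dataset.map(tokenize, batched=True,
                                  remove_columns=["text"])

args = TrainingArguments(
    output_dir="multi_domain_student",
    remove_unused_columns=False,  # keep "domain_id" for compute_loss
)
trainer = MultiDomainTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=default_data_collator,
    domain_weights=torch.tensor([1.0, 2.0]),  # e.g. up-weight domain 1
)
trainer.train()
```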
By incorporating these features into the Trainer class, users working on sequence-level distillation across diverse domains would get a more streamlined and efficient training process, and the library would become more versatile for the broad range of applications that require multi-domain data integration and domain-specific training strategies.
Related Issues (20)
- RuntimeError: expected mat1 and mat2 to have the same dtype, but got: float != c10::BFloat16
- Add post_process_depth_estimation to image processors
- [BUG] Offline loading of non-safe tensors fails
- `center_crop` outputs wrong sized array if provided with odd-numbered dimensions smaller than requested crop size
- LLama3-70b LoRa results in OOM with torchrun but succeeds with python3 command
- Sink Cache Attention Scores are strange. CausalMask seems not working.
- Libraries import missing, unable to load image for inference and not able to load pipeline with the trained model
- CLIPTokenizerFast cause memory leak
- VisEncoderDecoderModel generate text incomplete when predict image with long text label
- Trained tokenizer has broken encoding for cyrillic
- Running out of memory while finetuning and inferencing VideoMAE due to which script is being killed.
- Trainer memory leak for evaluation with `compute_metrics`
- Llama Model throwing "RuntimeError: expected scalar type BFloat16 but found Float" when using torch.compile and AMP together
- [LLaMA3] 'add_bos_token=True, add_eos_token=True' seems not taking effect
- google/siglip-so400m-patch14-384 inference output mismatch with pipeline output
- Why using empty tensor to initialize?
- Allow `ConversationalPipeline` to receive string input
- Weird behaviour running AWQ code on RTX 4000 Ada that worked on Tesla T4
- AttributeError: 'BertModel' object has no attribute 'attn_implementation'
- Training GPT2 with run_clm.py exceeds the described memory amount.