Comments (6)
Example
My sentence: Мой кот ... (My cat ...)
The 3rd word is ест (eats).
List of possible words: ест (eats), поглощает (absorbs), глотает (swallows), кушают (eat, plural form), etc.
I need to determine the probability of each word from the given list in the context of the phrase and build the most correct sentence.
Output: Мой кот ест (My cat eats).
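In essence, the task is to score each candidate word against the context and normalize those scores into probabilities, then keep the highest-scoring word. A minimal sketch of that selection step (the raw scores below are made-up stand-ins for the logits a masked language model would assign; only the softmax-and-pick logic is real):

```python
import math

def softmax(scores):
    """Normalize raw scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Candidates from the example; the scores are hypothetical model logits
candidates = ["ест", "поглощает", "глотает", "кушают"]
raw_scores = [6.2, 1.1, 2.4, 3.0]

probs = softmax(raw_scores)
best = max(zip(candidates, probs), key=lambda pair: pair[1])
print(best[0])  # → ест (the candidate with the largest score wins)
```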
from happy-transformer.
Hi @svk-man, I am taking a look at this.
Hi @adamfillion, I hope these will help you:
https://github.com/vlomme/Russian-gpt-2
https://github.com/vlarine/transformers-ru
@svk-man A pull request is open that will handle multi-lingual support.
For now, if you use model = HappyBERT("bert-base-multilingual-cased"), then you should be able to do masked-word prediction in the language of your choice (question answering may not work as intended).
Added in #195. Now you can use any model available on Hugging Face's model distribution network for text classification, word prediction, next sentence prediction, question answering and token classification.
New release coming soon.
It's now available in version 2.1.0, which is published to PyPI.
Related Issues (20)
- I created a train.csv and I get an error about
- wanted to use GLUE score or perplexity in the model evaluation.
- can we use a list of sentences inside generate_text, it seems to accept only string?
- startoftext, pad, and endoftext tokens
- instruction fine tuning (for text generation)
- Fine-tuning TextToText (T5) model on multiple tasks
- How to get multiple labels as output and not just one using classify_text ?
- Can i use t5 model to fine tune on text-to-code generation?
- [Feature suggestion] Parameter-Efficient Fine-Tuning
- ValueError: Unrecognized configuration class <class 'transformers.models.t5.configuration_t5.T5Config'>
- How do I save the model the huggingface?
- Get the loss per epoch instead of only 500 and 1000 epochs
- Suggestion: it may better have trust_remote_code parameter in HappyTransformer class
- TypeError: TextEncodeInput must be Union[TextInputSequence, Tuple[InputSequence, InputSequence]]
- Unable to fine-tune Question answer model
- Are other languages compatible
- Tries to use GPU (MPS) on Mac M1 and fails b/c model seems not loaded to device
- Large Dataset Fills up RAM (Enable "streaming" parameter to load_dataset)
- Does this support FP 16? and what is the VRAM requirement for training llama
- Memory exhausting