zeithaum / finetune-gptneo
This project is forked from ognexus/finetune-gptneo.
Fine-tune GPT-Neo (2.7B and 1.3B parameters) on a single GPU with Hugging Face Transformers using DeepSpeed.
License: MIT
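Fitting a 2.7B-parameter model on a single GPU typically relies on DeepSpeed's ZeRO optimizations with CPU offload. The configuration below is a hypothetical sketch of such a setup, not a file taken from this repository; the exact stage, batch size, and offload settings the project uses may differ:

```json
{
  "fp16": { "enabled": true },
  "zero_optimization": {
    "stage": 3,
    "offload_optimizer": { "device": "cpu" },
    "offload_param": { "device": "cpu" }
  },
  "train_micro_batch_size_per_gpu": 1,
  "gradient_accumulation_steps": 8
}
```

A config like this can be passed to Hugging Face Transformers via `TrainingArguments(deepspeed="ds_config.json")` or to the `deepspeed` launcher directly; ZeRO stage 3 with optimizer and parameter offload trades GPU memory for CPU RAM and host-device transfer time.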