REINA

Implementation of the following paper:

Training Data is More Valuable than You Think: A Simple and Effective Method by Retrieving from Training Data (https://arxiv.org/abs/2203.08773)

Shuohang Wang (shuowa at microsoft.com), Yichong Xu, Yuwei Fang, Yang Liu, Siqi Sun, Ruochen Xu, Chenguang Zhu, Michael Zeng

Accepted to the ACL 2022 main conference.

Usage 1

After cloning the repo, run the following command with Docker to reproduce REINA on the XSum dataset. REINA is integrated into the model training code. Set the model name to google/pegasus-large, facebook/bart-large, facebook/bart-base, etc. By default, the job runs on 8 GPUs; please tune "--gradient_accumulation_steps" if you use fewer GPUs. A larger "--reina_workers" value is preferred to speed up the REINA retrieval process; 40 workers take around 15 minutes.

docker run --gpus all -it --rm --shm-size 10g -w /home/reina/src -v ${PWD}/REINA:/home/reina shuohang/pytorch:reina /bin/bash -c "export HF_DATASETS_CACHE=/home/reina/data; export TRANSFORMERS_CACHE=/home/reina/cache; python -m torch.distributed.launch --nproc_per_node=8 run_summarization.py --report_to none  --save_strategy epoch --model_name_or_path google/pegasus-large --dataset_name xsum  --do_train   --do_eval --do_predict  --per_device_train_batch_size=2 --gradient_accumulation_steps 2 --per_device_eval_batch_size=4 --predict_with_generate --output_dir /home/reina/output --overwrite_output_dir --text_column document --summary_column summary  --num_train_epochs 3 --logging_strategy epoch --evaluation_strategy epoch --load_best_model_at_end --max_target_length 64 --val_max_target_length 64 --learning_rate 0.00005 --reina --reina_workers 40"
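If fewer than 8 GPUs are available, keep the effective batch size the same by increasing "--gradient_accumulation_steps". The line below is a minimal sketch of the relevant changes for 4 GPUs, assuming the same Docker image, paths, and remaining flags as the command above (effective batch size = nproc_per_node x per_device_train_batch_size x gradient_accumulation_steps, i.e., 8 x 2 x 2 = 32 on 8 GPUs and 4 x 2 x 4 = 32 on 4 GPUs).

python -m torch.distributed.launch --nproc_per_node=4 run_summarization.py --per_device_train_batch_size=2 --gradient_accumulation_steps 4 ... # remaining flags unchanged from the Usage 1 command above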

Usage 2

In this section, REINA retrieval and model training are split into two steps. The first step saves the REINA-augmented data into files; the second step trains the seq2seq model for summarization on those files.

docker run --gpus all -it --rm --shm-size 10g -w /home/reina/src -v ${PWD}/REINA:/home/reina shuohang/pytorch:reina /bin/bash -c "export HF_DATASETS_CACHE=/home/reina/data; python reina.py --dataname xsum --reina_workers 10 --key_column document --value_column summary"
docker run --gpus all -it --rm --shm-size 10g -w /home/reina/src -v ${PWD}/REINA:/home/reina shuohang/pytorch:reina /bin/bash -c "export HF_DATASETS_CACHE=/home/reina/data; export TRANSFORMERS_CACHE=/home/reina/cache; python -m torch.distributed.launch --nproc_per_node=8 run_summarization.py --report_to none  --save_strategy epoch --model_name_or_path google/pegasus-large  --do_train   --do_eval --do_predict  --per_device_train_batch_size=2 --gradient_accumulation_steps 2 --per_device_eval_batch_size=4 --predict_with_generate --output_dir /home/reina/output --overwrite_output_dir --text_column document --summary_column summary  --num_train_epochs 3 --logging_strategy epoch --evaluation_strategy epoch --load_best_model_at_end --max_target_length 64 --val_max_target_length 64 --learning_rate 0.00005  --train_file /home/reina/data/reina/xsum/train.json --validation_file /home/reina/data/reina/xsum/validation.json --test_file /home/reina/data/reina/xsum/test.json"
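Once the first command finishes, the REINA-augmented splits should be written to the paths passed to "--train_file", "--validation_file", and "--test_file" above. A quick sanity check, assuming the same image and volume mount:

docker run -it --rm -v ${PWD}/REINA:/home/reina shuohang/pytorch:reina /bin/bash -c "ls /home/reina/data/reina/xsum && head -c 500 /home/reina/data/reina/xsum/train.json"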

Related project

REINA is integrated into the Human Parity on CommonsenseQA project:

https://github.com/microsoft/KEAR

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.


Issues

Docker cannot run project

Hello, thank you very much for your work, but when I download the Docker image according to the instructions, some files seem to be missing:

python: can't open file 'reina.py': [Errno 2] No such file or directory

Run Inference Without Docker?

Hi!

So I've already used the docker command (under Usage 1) to train and evaluate my model with REINA, and I have all the outputs from that. For further use, is it possible to do

export HF_DATASETS_CACHE=/home/reina/data; export TRANSFORMERS_CACHE=/home/reina/cache
python run_summarization.py ...

on my local run_summarization.py (i.e., without using the Docker command)?

Also, is Lucene needed for inference? My guess is that it isn't, because it seems that once the REINA dataset is in the cache, all reina() does is load that dataset and HuggingFace's Trainer takes care of the rest, but I want to confirm.

Any attempt at the dialogue generation task?

Hi, thanks for the great work. I am wondering whether you have conducted any preliminary experiments on dialogue generation with the proposed method. If so, and the results were not promising, what do you think is the main obstacle in applying REINA to dialogue generation?
Looking forward to your reply.

Out Of Context Information

Hi, I was studying the source code a bit more closely and had a question about the method: since we add the top-K closest summaries (from the training data) to an inference document and run it through a seq2seq model (already trained on REINA data), is there a chance the model produces a summary that includes information outside the context of the original document?
