
Depth Growing for Neural Machine Translation

This repository contains the code for the ACL 2019 short paper: Depth Growing for Neural Machine Translation.

The project is based on fairseq (version 0.5.0); please familiarize yourself with the fairseq project first.

@inproceedings{wu2019depth,
  title={Depth Growing for Neural Machine Translation},
  author={Wu, Lijun and Wang, Yiren and Xia, Yingce and Tian, Fei and Gao, Fei and Qin, Tao and Lai, Jianhuang and Liu, Tie-Yan},
  booktitle={ACL 2019},
  year={2019}
}

Requirements and Installation

pip install -r ./deepNMT/requirements.txt
python ./deepNMT/setup.py build develop

Data

Please refer to WMT14_En_De for data processing.

Training

The detailed training procedure is:

  • Train the shallow model with six layers:
train_fairseq_en2de.sh
  • Train the first several steps of the deep model with eight layers (for example, only 10 steps):
train_fairseq_en2de_deep.sh
  • Prepare the deep model: initialize it with the parameters of the shallow model trained in the first step:
build_initial_ckpt_for_deep.sh
  • Reload the initialized deep model and train the deep model with eight layers:
train_fairseq_en2de_deep.sh
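The initialization step above can be sketched as follows. This is a hypothetical illustration of what build_initial_ckpt_for_deep.sh accomplishes, not the actual script; the parameter names in the test are assumptions modeled on fairseq's checkpoint naming convention.

```python
def build_initial_deep_state(shallow_state, deep_state):
    """Overwrite deep-model parameters with the shallow model's values
    wherever the parameter names match (the shared bottom module).
    Parameters that exist only in the deep model (the two new top
    layers) keep their fresh initialization."""
    merged = dict(deep_state)
    for name, value in shallow_state.items():
        if name in merged:
            merged[name] = value
    return merged
```

The deep model trained for a few steps beforehand supplies a checkpoint with the full eight-layer parameter layout, which this merge then fills in from the shallow checkpoint.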

Inference

The detailed inference procedure is:

bash infer_deepNMT.sh 0 <shallow_model_ckpt_path>  <deep_model_ckpt_path>
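Inference takes both checkpoints because the shallow and deep models' predictions are combined. A minimal sketch of one such combination, assuming a weighted interpolation of the two models' per-token probabilities in log space (alpha is a hypothetical weight, not a value taken from the paper):

```python
import math

def combine_log_probs(shallow_lp, deep_lp, alpha=0.5):
    """Interpolate two models' per-token log-probabilities:
    log(alpha * p_shallow + (1 - alpha) * p_deep)."""
    return [
        math.log(alpha * math.exp(s) + (1 - alpha) * math.exp(d))
        for s, d in zip(shallow_lp, deep_lp)
    ]
```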


depth_growing_nmt's Issues

Wondering about the freeze operation

Hi,
I'm really interested in your work and fascinated by the strong improvement in your paper. But when reading your code, I have the following question:

If I understand the scripts correctly, you first train the shallow model, then copy its parameters into the deep model. Finally, you directly train the deep model. However, I didn't see any code for the freeze operation described in your paper.

"in Stage 1, the bottom module (i.e., enc1 and dec1) is trained and subsequently holds constant; in Stage 2, only the top module (i.e., enc2 and dec2) is optimized."

Am I missing some details in your code?

Looking forward to your reply. Thanks
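The freeze operation quoted above is typically implemented by disabling gradients on the bottom module's parameters so that only the top module is updated in Stage 2. A minimal sketch, assuming parameter names prefixed enc1./dec1. as in the paper's notation; the Param class is a stand-in for a framework parameter object such as torch.nn.Parameter:

```python
class Param:
    """Stand-in for a framework parameter (e.g., torch.nn.Parameter)."""
    def __init__(self, name):
        self.name = name
        self.requires_grad = True

def freeze_bottom_module(params, frozen_prefixes=("enc1.", "dec1.")):
    """Disable gradients for bottom-module parameters so that only
    the top module (enc2/dec2) is optimized in Stage 2."""
    for p in params:
        if p.name.startswith(frozen_prefixes):
            p.requires_grad = False
```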

train first several steps

I'm really interested in your work. But I don't understand why we need to 'train first several steps of the deep model with eight layers. For example, train only 10 steps.'
