prateekg / machine-trasnlation---attention-mechanism
The repository implements machine translation on paired Dutch–English sentences. The architecture uses a BahdanauDecoder to implement the attention mechanism, generating the translated sequence from the encoder hidden states. As an architectural variant, bi-directional encoder hidden states can be used, along with a different attention mechanism such as Luong attention.
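To make the attention step concrete, below is a minimal NumPy sketch of additive (Bahdanau) attention: a score is computed between the decoder state and each encoder hidden state, the scores are softmax-normalized into attention weights, and the context vector is their weighted sum. The function name, parameter names, and toy dimensions are illustrative assumptions, not code from this repository.

```python
import numpy as np

def bahdanau_attention(decoder_state, encoder_states, W1, W2, v):
    """Additive (Bahdanau) attention over encoder hidden states.

    decoder_state:  (d,)   current decoder hidden state
    encoder_states: (T, h) encoder hidden states, one per source token
    W1: (a, d), W2: (a, h), v: (a,)  learned projection parameters
    Returns (context, weights): context vector (h,) and weights (T,).
    """
    # score_i = v . tanh(W1 s + W2 h_i) for each encoder step i
    scores = np.tanh(W1 @ decoder_state + encoder_states @ W2.T) @ v  # (T,)
    # softmax over source positions (stabilized by subtracting the max)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # context vector = attention-weighted sum of encoder states
    context = weights @ encoder_states  # (h,)
    return context, weights

# Toy example with hypothetical dimensions and random parameters
rng = np.random.default_rng(0)
T, h, d, a = 5, 4, 4, 3
enc = rng.standard_normal((T, h))
s = rng.standard_normal(d)
W1 = rng.standard_normal((a, d))
W2 = rng.standard_normal((a, h))
v = rng.standard_normal(a)
ctx, w = bahdanau_attention(s, enc, W1, W2, v)
```

In a full decoder, the context vector `ctx` would be concatenated with the decoder input (or state) at each step before predicting the next target token; with a bi-directional encoder, `encoder_states` would hold the concatenated forward and backward hidden states.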