carldowns / biblosa-pytorch
This project is forked from galsang/biblosa-pytorch.
Re-implementation of Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling (T. Shen et al., ICLR 2018) in PyTorch.
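The core idea behind block self-attention is to split the sequence into fixed-size blocks and compute attention only within each block, so memory grows with n·b rather than n² for sequence length n and block size b. Below is a minimal NumPy sketch of just that intra-block step (the paper's inter-block attention over block summaries and the bi-directional masking are omitted); the function names are illustrative and are not taken from this repository.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def intra_block_attention(x, block_size):
    """Scaled dot-product self-attention restricted to blocks.

    x: (n, d) array of token features; n must be divisible by
    block_size (real implementations pad the sequence instead).
    Each output row attends only to rows in its own block.
    """
    n, d = x.shape
    assert n % block_size == 0, "pad the sequence in practice"
    blocks = x.reshape(n // block_size, block_size, d)
    # (num_blocks, block_size, block_size) score matrices: memory is
    # n * block_size entries instead of n * n for full attention
    scores = blocks @ blocks.transpose(0, 2, 1) / np.sqrt(d)
    weights = softmax(scores, axis=-1)
    out = weights @ blocks
    return out.reshape(n, d)
```

Because attention never crosses a block boundary, the output for a block depends only on that block's inputs, which is what makes the computation cheap; BiBloSA recovers long-range dependencies with a second attention pass over per-block summary vectors.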