Neural Machine Translation (NMT) Tutorial Series - (1) Neural Machine Translation: Open-Source Projects

(1) Neural Machine Translation in C/C++

1.1 EUREKA-MangoNMT

https://github.com/jiajunzhangnlp/EUREKA-MangoNMT

By Jiajun Zhang, Chinese Academy of Sciences

1.2 Marian

https://github.com/marian-nmt/marian

1.3 Zoph_RNN

https://github.com/isi-nlp/Zoph_RNN

Supports parallel training across multiple GPUs

1.4 Mantidae

https://github.com/duyvuleo/Mantidae

1.5 N3LP

https://github.com/hassyGo/N3LP

(2) Neural Machine Translation in Theano

2.1 DCNMT

https://github.com/swordyork/dcnmt

2.2 dl4mt-tutorial

https://github.com/nyu-dl/dl4mt-tutorial

2.3 dl4mt-c2c

https://github.com/nyu-dl/dl4mt-c2c

2.4 HNMT

https://github.com/robertostling/hnmt

2.5 Nematus

https://github.com/rsennrich/nematus

2.6 neuralmt

https://github.com/zomux/neuralmt

2.7 NMT

https://github.com/tuzhaopeng/NMT

2.8 nmtpy

https://github.com/lium-lst/nmtpy

2.9 RNNsearch

https://github.com/XMUNLP/RNNsearch

2.10 SGNMT

https://github.com/ucam-smt/sgnmt

2.11 THUMT

https://github.com/thumt/THUMT

Implementation by the Tsinghua University NLP group

2.12 GroundHog

https://github.com/lisa-groundhog/GroundHog

Implementation by Bengio's research group

2.13 NMT-Coverage

https://github.com/tuzhaopeng/NMT-Coverage

By Hang Li's team at Huawei Noah's Ark Lab; implements a coverage-based neural machine translation model.
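
As a rough sketch of the underlying idea (assuming the formulation of Tu et al., 2016, which this repository accompanies; this is one simple variant of coverage, not the only one): each source position accumulates the attention weight it has received so far, and that running total feeds back into the attention score, discouraging repeated attention to already-translated words:

```latex
% Coverage: total attention source word j has received before decoder step t
\beta_{t,j} = \sum_{t'=1}^{t-1} \alpha_{t',j}

% Attention energy with an added coverage term V_a \beta_{t,j}
e_{t,j} = v_a^{\top} \tanh\!\left( W_a s_{t-1} + U_a h_j + V_a \beta_{t,j} \right)
```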

2.14 blocks

https://github.com/mila-udem/blocks

An upgraded successor to GroundHog

(3) Neural Machine Translation in TensorFlow

3.1 byteNet-tensorflow

https://github.com/paarthneekhara/byteNet-tensorflow

3.2 bytenet_translation

https://github.com/Kyubyong/bytenet_translation

3.3 Neural Monkey

https://github.com/ufal/neuralmonkey

3.4 seq2seq

https://github.com/eske/seq2seq

3.5 Tensor2Tensor

https://github.com/tensorflow/tensor2tensor

3.6 tf-seq2seq

https://github.com/google/seq2seq

Google's implementation of the seq2seq model
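
All of the seq2seq toolkits listed here share the same encoder-decoder skeleton: encode the source sentence into a state, then emit target tokens one at a time conditioned on that state. Below is a minimal NumPy sketch of that skeleton with greedy decoding (toy dimensions and untrained random weights, purely for illustration; this is not the API of any toolkit above):

```python
import numpy as np

rng = np.random.default_rng(0)
V, H = 32, 16                            # toy vocabulary and hidden sizes (hypothetical)
E = rng.normal(0, 0.1, (V, H))           # token embeddings (shared here for brevity)
W_enc = rng.normal(0, 0.1, (2 * H, H))   # encoder recurrence weights
W_dec = rng.normal(0, 0.1, (2 * H, H))   # decoder recurrence weights
W_out = rng.normal(0, 0.1, (H, V))       # projection to vocabulary logits

def rnn_step(x, h, W):
    """One simple tanh-RNN step: combine input embedding and previous state."""
    return np.tanh(np.concatenate([x, h]) @ W)

def encode(src_ids):
    """Run the encoder over the source sentence; return the final state."""
    h = np.zeros(H)
    for i in src_ids:
        h = rnn_step(E[i], h, W_enc)
    return h

def greedy_decode(src_ids, bos=1, eos=2, max_len=10):
    """Condition the decoder on the encoder state and emit tokens greedily."""
    h = encode(src_ids)
    y, out = bos, []
    for _ in range(max_len):
        h = rnn_step(E[y], h, W_dec)
        y = int(np.argmax(h @ W_out))    # pick the most probable next token
        if y == eos:
            break
        out.append(y)
    return out

print(greedy_decode([5, 9, 3]))  # weights are untrained, so output is arbitrary
```

Real systems add attention over the encoder states, beam search, and trained parameters on top of this basic loop.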

(4) Neural Machine Translation in Keras

4.1 Keras seq2seq

https://github.com/farizrahman4u/seq2seq

4.2 NMT-Keras

https://github.com/lvapeab/nmt-keras

(5) Neural Machine Translation in Chainer

5.1 chainn

https://github.com/philip30/chainn

5.2 KyotoNMT

https://github.com/fabiencro/knmt

5.3 attention_is_all_you_need

https://github.com/soskek/attention_is_all_you_need

Implements Google's fully attention-based model (the Transformer)
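
The Transformer is built around scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V. A minimal NumPy sketch of that core operation (toy shapes for illustration, not this repository's API):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                        # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)         # row-wise softmax
    return weights @ V                                     # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))   # 3 query positions, d_k = 8
K = rng.normal(size=(5, 8))   # 5 key/value positions
V = rng.normal(size=(5, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 8)
```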

(6) Neural Machine Translation in Caffe2

6.1 seq2seq

https://github.com/caffe2/caffe2/blob/master/caffe2/python/examples/seq2seq.py

(7) Neural Machine Translation in Torch

7.1 nmt-android

https://github.com/harvardnlp/nmt-android

NMT running on Android phones

7.2 seq2seq-attn

https://github.com/harvardnlp/seq2seq-attn

7.3 OpenNMT

https://github.com/OpenNMT/OpenNMT

Implementation by the Harvard NLP group

7.4 fairseq

https://github.com/facebookresearch/fairseq

Facebook's convolutional (CNN + attention) implementation, shown in the sketch below
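
In that architecture (ConvS2S, Gehring et al., 2017), recurrence is replaced by stacked 1-D convolutions gated with a Gated Linear Unit (GLU). A minimal NumPy sketch of one such convolution block (hypothetical toy shapes and names, not fairseq's API):

```python
import numpy as np

def glu_conv_block(x, W, b):
    """One ConvS2S-style block: a 1-D convolution producing 2*d channels,
    split in half and gated as (A, B) -> A * sigmoid(B), plus a residual."""
    T, d = x.shape
    k = W.shape[0]                        # kernel width
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))  # pad the time dimension
    out = np.empty((T, 2 * d))
    for t in range(T):
        window = xp[t:t + k].reshape(-1)  # flatten the k x d input window
        out[t] = window @ W.reshape(k * d, 2 * d) + b
    A, B = out[:, :d], out[:, d:]
    return x + A * (1.0 / (1.0 + np.exp(-B)))  # GLU gate + residual connection

rng = np.random.default_rng(0)
x = rng.normal(size=(6, 4))            # 6 positions, 4 channels
W = rng.normal(size=(3, 4, 8)) * 0.1   # kernel 3, 4 in-channels, 2*4 out-channels
b = np.zeros(8)
print(glu_conv_block(x, W, b).shape)   # (6, 4)
```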

(8) Neural Machine Translation in PyTorch

8.1 OpenNMT-py

https://github.com/OpenNMT/OpenNMT-py

(9) Neural Machine Translation in DyNet

9.1 dynmt-py

https://github.com/roeeaharoni/dynmt-py

9.2 lamtram

https://github.com/neubig/lamtram

9.3 mantis

https://github.com/trevorcohn/mantis

9.4 NMTKit

https://github.com/odashi/nmtkit

9.5 xnmt

https://github.com/neulab/xnmt

(10) Neural Machine Translation in MXNet

10.1 MXNMT

https://github.com/magic282/MXNMT

10.2 sockeye

https://github.com/awslabs/sockeye

(11) Neural Machine Translation in MATLAB

11.1 nmt.matlab

https://github.com/lmthang/nmt.matlab

11.2 nmt_stanford_nlp

http://nlp.stanford.edu/projects/nmt/

Implementation by the Stanford NLP group
