NMT+RNNG: A hybrid decoder for NMT which combines the decoder of an attention-based neural machine translation model with Recurrent Neural Network Grammars
We have presented a novel syntax-aware NMT model on the target side, called NMT+RNNG ("Learning to Parse and Translate Improves Neural Machine Translation" [1]). The decoder of NMT+RNNG combines a usual conditional language model with Recurrent Neural Network Grammars (RNNGs) [2], which enables the proposed model to learn to parse and translate.
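As a concrete illustration of the training objective, here is a minimal, self-contained C++ sketch. It is not the repository's actual code: the per-step probabilities are made-up numbers, and the point is only that the translation loss and the parsing (RNNG action) loss are summed, so shared decoder parameters are trained to parse and translate at the same time.

```cpp
// Toy sketch of the joint objective (hypothetical values, not the repo's API).
#include <cmath>
#include <iostream>
#include <vector>

// Negative log-likelihood of one sequence, given the probability the model
// assigned to each reference symbol.
double nll(const std::vector<double>& probs) {
    double loss = 0.0;
    for (double p : probs) {
        loss -= std::log(p);
    }
    return loss;
}

int main() {
    // Made-up per-step probabilities of the reference target words (y_1..y_3)
    // and of the reference RNNG actions (a_1..a_4).
    std::vector<double> wordProbs   = {0.42, 0.31, 0.55};
    std::vector<double> actionProbs = {0.61, 0.48, 0.52, 0.39};

    // Joint loss: translation NLL + parsing NLL, minimized together so that
    // the decoder learns to parse and translate.
    double jointLoss = nll(wordProbs) + nll(actionProbs);
    std::cout << "joint loss = " << jointLoss << std::endl;
    return 0;
}
```

The actual model additionally conditions the action probabilities on the decoder state; see [1] for the exact formulation.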
C++ implementation.

- `NMTRNNG.xpp`: our proposed model (NMT+RNNG)
- `AttentionBiEncDec.xpp`: baseline ANMT model
- `/data/`: Tanaka corpus (JP-EN) [3]
- Eigen, a template library for linear algebra (http://eigen.tuxfamily.org/index.php?title=Main_Page); a minimal compile check is sketched after this list
- N3LP, a C++ library for neural network-based NLP (https://github.com/hassyGo/N3LP) (!) Note that some implementations needed to run these codes efficiently are not yet available
- Optional: SyntaxNet, a syntactic parser (https://github.com/tensorflow/models/tree/master/syntaxnet)
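If you are unsure whether Eigen is visible to your compiler, a quick check like the following can help. The file name and build command are hypothetical, not part of the repository; only the Eigen headers are needed.

```cpp
// check_eigen.cpp -- hypothetical sanity check that the Eigen headers resolve.
// Build (assuming EIGEN_LOCATION points at the Eigen source tree):
//   g++ -I$EIGEN_LOCATION check_eigen.cpp -o check_eigen && ./check_eigen
#include <Eigen/Dense>
#include <iostream>

int main() {
    Eigen::MatrixXd m = Eigen::MatrixXd::Identity(2, 2);
    std::cout << m << std::endl;  // prints a 2x2 identity matrix
    return 0;
}
```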
1. Modify the paths of `EIGEN_LOCATION` and `SHARE_LOCATION` in `Makefile` (a placeholder example is sketched after this list).
2. `$ bash setup.sh`
3. `$ ./nmtrnng` (Then, it starts training the `NMTRNNG` model.)
4. Modify `main.cpp` if you would like to try another model.
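For step 1, the edit is just pointing two Makefile variables at your local installations. The variable names are the ones the repository's `Makefile` uses; the paths below are placeholders, and what exactly `SHARE_LOCATION` should point to is an assumption here, so check the comments in the `Makefile` itself.

```make
# Placeholder paths -- replace with your own locations.
EIGEN_LOCATION = /path/to/eigen   # where the Eigen headers live
SHARE_LOCATION = /path/to/share   # assumption: see the repository's Makefile
```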
- [1] Akiko Eriguchi, Yoshimasa Tsuruoka, and Kyunghyun Cho. 2017. "Learning to Parse and Translate Improves Neural Machine Translation". In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (ACL 2017). To appear.
- [2] Chris Dyer, Adhiguna Kuncoro, Miguel Ballesteros, and Noah A. Smith. 2016. "Recurrent Neural Network Grammars". In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies.
- [3] Tanaka Corpus
Thank you for your interest in our work. If there are any issues, feel free to contact me.
- eriguchi [.at.] logos.t.u-tokyo.ac.jp