Negative Lexically Constrained Decoding for Paraphrase Generation

Abstract

Paraphrase generation can be regarded as monolingual translation. Unlike bilingual machine translation, paraphrase generation rewrites only a limited portion of an input sentence. Hence, previous methods based on machine translation often behave conservatively and fail to make necessary rewrites. To solve this problem, we propose a neural model for paraphrase generation that first identifies words in the source sentence that should be paraphrased. These words are then paraphrased by negative lexically constrained decoding, which avoids outputting them as they are. Experiments on text simplification and formality transfer show that our model improves the quality of paraphrasing by making necessary rewrites to an input sentence.
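For illustration only, the sketch below shows one way to realize negative lexical constraints at decoding time, using the bad_words_ids option of the Hugging Face transformers generate API. This is not the authors' implementation; the "t5-small" checkpoint, the "paraphrase:" prefix, and the example sentence are placeholders standing in for a paraphrase model trained on suitable data. The banned word list corresponds to the source words identified in the first step as needing to be rewritten; any decoder that can ban token sequences during beam search could play the same role.

```python
# Minimal sketch of negative lexically constrained decoding (not the
# authors' code): words identified as needing rewriting are banned from
# the output via the bad_words_ids option of Hugging Face transformers.
# "t5-small" and the "paraphrase: " prefix are placeholders; the paper's
# models were trained on simplification and formality-transfer data.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

source = "The professor will assist the students with their homework."
words_to_rewrite = ["assist"]  # step 1: words the model should paraphrase

# Encode each banned word as a token-id sequence (one list per word).
bad_words_ids = tokenizer(words_to_rewrite, add_special_tokens=False).input_ids

inputs = tokenizer("paraphrase: " + source, return_tensors="pt")
outputs = model.generate(
    **inputs,
    num_beams=4,
    max_new_tokens=40,
    bad_words_ids=bad_words_ids,  # step 2: never emit the banned words as-is
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```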

Paper type
Published in
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019)
Tomoyuki Kajiwara
Visiting Assistant Professor

Natural language processing; in particular, text simplification, paraphrasing, semantic textual similarity, and quality estimation.