English-to-Chinese Transliteration with Phonetic Auxiliary Task
Yuan He and Shay B. Cohen

Abstract: Approaching named entities transliteration as a Neural Machine Translation (NMT) problem is common practice. While many have applied various NMT techniques to enhance machine transliteration models, few focus on the linguistic features particular to the relevant languages. In this paper, we investigate the effect of incorporating phonetic features for English-to-Chinese transliteration under the multi-task learning (MTL) setting, where we define a phonetic auxiliary task aimed to improve the generalization performance of the main transliteration task. Our results show that the multi-task model achieves similar performance to the previous state of the art with a model of a much smaller size. In addition to our system, we also release a new English-to-Chinese dataset and propose a novel evaluation metric which considers multiple possible transliterations given a source name.

Anthology ID: 2020.aacl-main.40
Volume: Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing
Month: December
Year: 2020
Address: Suzhou, China
Venue: AACL
Publisher: Association for Computational Linguistics
Pages: 378–388
Bibkey: he-cohen-2020-english

Cite (ACL): Yuan He and Shay B. Cohen. 2020. English-to-Chinese Transliteration with Phonetic Auxiliary Task. In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, pages 378–388, Suzhou, China. Association for Computational Linguistics.
Cite (Informal): English-to-Chinese Transliteration with Phonetic Auxiliary Task (He & Cohen, AACL 2020)
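The abstract mentions an evaluation metric that accepts multiple possible transliterations for a source name. The paper's exact formulation is not given here; a minimal sketch of the general idea, a set-based exact-match accuracy where a prediction counts as correct if it matches any acceptable reference, might look like the following (the function name and inputs are illustrative assumptions, not the paper's actual metric):

```python
def multi_reference_accuracy(predictions, references):
    """Fraction of predictions that exactly match at least one of the
    acceptable reference transliterations for the corresponding name.

    predictions: list of predicted transliteration strings.
    references:  list of sets, each holding all acceptable
                 transliterations for that source name.
    """
    correct = sum(
        1 for pred, refs in zip(predictions, references) if pred in refs
    )
    return correct / len(predictions)


# Toy example: "艾伦" matches one of its two acceptable forms,
# "约翰" does not match the single acceptable form "乔恩".
preds = ["艾伦", "约翰"]
refs = [{"艾伦", "阿伦"}, {"乔恩"}]
print(multi_reference_accuracy(preds, refs))  # 0.5
```

A single-reference exact-match accuracy is the special case where every reference set has exactly one element.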