This book constitutes the refereed proceedings of the 18th China Conference on Machine Translation, CCMT 2022, held in Lhasa, China, during August 6-10, 2022. The 16 full papers included in this book were carefully reviewed and selected from 73 submissions.
Product details
Communications in Computer and Information Science 1671
Contents:
PEACook: Post-Editing Advancement Cookbook
Hot-start Transfer Learning Combined with Approximate Distillation for Mongolian-Chinese Neural Machine Translation
Review-based Curriculum Learning for Neural Machine Translation
Multi-Strategy Enhanced Neural Machine Translation for Chinese Minority Language
Target-side Language Model for Reference-free Machine Translation Evaluation
Life Is Short, Train It Less: Neural Machine Tibetan-Chinese Translation Based on mRASP and Dataset Enhancement
Improving the Robustness of Low-Resource Neural Machine Translation with Adversarial Examples
Dynamic Mask Curriculum Learning for Non-Autoregressive Neural Machine Translation
Dynamic Fusion Nearest Neighbor Machine Translation via Dempster-Shafer Theory
A Multi-tasking and Multi-stage Chinese Minority Pre-Trained Language Model
An Improved Multi-task Approach to Pre-trained Model Based MT Quality Estimation
Optimizing Deep Transformers for Chinese-Thai Low-Resource Translation
HW-TSC Submission for CCMT 2022 Translation Quality Estimation Task
Effective Data Augmentation Methods for CCMT 2022
NJUNLP's Submission for CCMT 2022 Quality Estimation Task
ISTIC's Thai-to-Chinese Neural Machine Translation System for CCMT 2022