- PEACook: Post-Editing Advancement Cookbook
- Hot-start Transfer Learning combined with Approximate Distillation for Mongolian-Chinese Neural Machine Translation
- Review-based Curriculum Learning for Neural Machine Translation
- Multi-Strategy Enhanced Neural Machine Translation for Chinese Minority Language
- Target-side Language Model for Reference-free Machine Translation Evaluation
- Life Is Short, Train It Less: Neural Machine Tibetan-Chinese Translation Based on mRASP and Dataset Enhancement
- Improving the Robustness of Low-Resource Neural Machine Translation with Adversarial Examples
- Dynamic Mask Curriculum Learning for Non-Autoregressive Neural Machine Translation
- Dynamic Fusion Nearest Neighbor Machine Translation via Dempster–Shafer Theory
- A Multi-tasking and Multi-stage Chinese Minority Pre-Trained Language Model
- An Improved Multi-task Approach to Pre-trained Model Based MT Quality Estimation
- Optimizing Deep Transformers for Chinese-Thai Low-Resource Translation
- HW-TSC Submission for CCMT 2022 Translation Quality Estimation Task
- Effective Data Augmentation Methods for CCMT 2022
- NJUNLP's Submission for CCMT 2022 Quality Estimation Task
- ISTIC's Thai-to-Chinese Neural Machine Translation System for CCMT 2022