EvaHan2023: Overview of the First International Ancient Chinese Translation Bakeoff

Dongbo Wang, Litao Lin, Zhixiao Zhao, Wenhao Ye, Kai Meng, Wenlong Sun, Lianzhen Zhao, Xue Zhao, Si Shen, Wei Zhang and Bin Li

The Ups and Downs of Training RoBERTa-based models on Smaller Datasets for Translation Tasks from Classical Chinese into Modern Standard Mandarin and Modern English

Stuart Michael McManus, Roslin Liu, Yuji Li, Leo Tam, Stephanie Qiu and Letian Yu

Pre-trained Model In Ancient-Chinese-to-Modern-Chinese Machine Translation

Jiahui Wang, Xuqin Zhang, Jiahuan Li and Shujian Huang

Some Trials on Ancient Modern Chinese Translation

Li Lin and Xinyu Hu

Istic Neural Machine Translation System for EvaHan 2023

Ningyuan Deng, Shuao Guo and Yanqing He

BIT-ACT: An Ancient Chinese Translation System Using Data Augmentation

Li Zeng, Yanzhi Tian, Yingyu Shan and Yuhang Guo

Technical Report on Ancient Chinese Machine Translation Based on mRASP Model

Wenjing Liu

AnchiLm: An Effective Classical-to-Modern Chinese Translation Model Leveraging bpe-drop and SikuRoBERTa

Jiahui Zhu and Sizhou Chen

Translating Ancient Chinese to Modern Chinese at Scale: A Large Language Model-based Approach

Jiahuan Cao, Dezhi Peng, Yongxin Shi, Zongyuan Jiang and Lianwen Jin