Tuesday, September 5, 2023
Live Stream: Join stream (Zoom Link)
WS02: Ancient Language Translation Workshop (ALT 2023)
Venue: Salon VII, Level 1, Studio City Macau
Time Zone: UTC+8 (Beijing Time)
14:00 - 14:10: Opening Remarks
14:10 - 15:00: Invited Talks
14:10-14:30 | Prof. Zhiwei Feng, Xinjiang University (China) |
14:30-15:00 | Prof. Jingsong Yu, Peking University (China) |
15:00 - 15:30: Oral Reports
15:00-15:15 | EvaCun: The first shared task on Cuneiform Machine Translation | Shai Gordin
15:15-15:30 | The EvaHan2023: Overview of the First International Ancient Chinese Translation Bakeoff | Dongbo Wang, Litao Lin, Zhixiao Zhao, Wenhao Ye, Kai Meng, Wenlong Sun, Lianzhen Zhao, Xue Zhao, Si Shen, Wei Zhang and Bin Li
15:30 - 16:00: Coffee Break
16:00 - 18:00: Oral Reports (continued)
16:00-16:15 | The Ups and Downs of Training RoBERTa-based models on Smaller Datasets for Translation Tasks from Classical Chinese into Modern Standard Mandarin and Modern English | Stuart Michael McManus, Roslin Liu, Yuji Li, Leo Tam, Stephanie Qiu and Letian Yu
16:15-16:30 | Pre-trained Model In Ancient-Chinese-to-Modern-Chinese Machine Translation | Jiahui Wang, Xuqin Zhang, Jiahuan Li and Shujian Huang
16:30-16:45 | Some Trials on Ancient Modern Chinese Translation | Li Lin and Xinyu Hu
16:45-17:00 | Istic Neural Machine Translation System for EvaHan 2023 | Ningyuan Deng, Shuao Guo and Yanqing He
17:00-17:15 | BIT-ACT: An Ancient Chinese Translation System Using Data Augmentation | Li Zeng, Yanzhi Tian, Yingyu Shan and Yuhang Guo
17:15-17:30 | Technical Report on Ancient Chinese Machine Translation Based on mRASP Model | Wenjing Liu
17:30-17:45 | AnchiLm: An Effective Classical-to-Modern Chinese Translation Model Leveraging bpe-drop and SikuRoBERTa | Jiahui Zhu and Sizhou Chen
17:45-18:00 | Translating Ancient Chinese to Modern Chinese at Scale: A Large Language Model-based Approach | Jiahuan Cao, Dezhi Peng, Yongxin Shi, Zongyuan Jiang and Lianwen Jin
18:00 - 18:10: Closing Remarks