Hongqiu Wu

PhD Candidate in Computer Science

Supervisor: Hai Zhao


Email: wuhongqiu@sjtu.edu.cn

Address:

Department of Computer Science,
Shanghai Jiao Tong University,
800 Dongchuan Road,
Shanghai 200240, China

  Research Interests


AI4Game, Error Correction

  News


  • Instruction-Driven Game Engine (IDGE) for Poker. [DEMO]

  • We released ReLM (Rephrasing Language Model) for Chinese Spelling Correction. Unlike previous tagging models, ReLM is a pure language model and achieves new state-of-the-art results on ECSpell, LEMON, and MCSC. [ReLM]

  • I released LEMON, a novel Chinese Spelling Correction benchmark, in collaboration with ByteDance. [LEMON]

  Publication


* denotes equal contribution

2024

  • Hongqiu Wu, Yan Wang, Xingyuan Liu, Hai Zhao, Min Zhang. Instruction-Driven Game Engines for Large Language Models. [PDF]

  • Yifei Yang*, Hongqiu Wu*, Hai Zhao. Attack Named Entity Recognition by Entity Boundary Interference. COLING 2024. [PDF]

  • Khai Jiet Liong, Hongqiu Wu, Hai Zhao. Unveiling Vulnerability of Self-Attention. COLING 2024. [PDF]

  • Linfeng Liu*, Hongqiu Wu*, Hai Zhao. Chinese Spelling Correction as Rephrasing Language Model. AAAI 2024. [PDF]

2023

  • Hongqiu Wu, Linfeng Liu, Hai Zhao, Min Zhang. Empower Nested Boolean Logic via Self-Supervised Curriculum Learning. EMNLP 2023. [PDF]

  • Hongqiu Wu, Shaohua Zhang, Yuchen Zhang, Hai Zhao. Rethinking Masked Language Modeling for Chinese Spelling Correction. ACL 2023. [PDF]

  • Hongqiu Wu, Yongxiang Liu, Hanwen Shi, Hai Zhao, Min Zhang. Toward Adversarial Training on Contextualized Language Representation. ICLR 2023. [PDF]

  • Hongqiu Wu, Ruixue Ding, Hai Zhao, Pengjun Xie, Fei Huang, Min Zhang. Adversarial Self-Attention for Language Understanding. AAAI 2023. [PDF]

2022

  • Hongqiu Wu, Ruixue Ding, Hai Zhao, Boli Chen, Pengjun Xie, Fei Huang, Min Zhang. Forging Multiple Training Objectives for Pre-trained Language Models via Meta-Learning. Findings of EMNLP 2022. [PDF]

  • Yiyang Li, Hongqiu Wu, Hai Zhao. Semantic-Preserving Adversarial Code Comprehension. COLING 2022. [PDF]

2021

  • Hongqiu Wu, Hai Zhao, Min Zhang. Not All Attention Is All You Need. Preprint. [PDF]

  • Hongqiu Wu, Hai Zhao, Min Zhang. Code Summarization with Structure-induced Transformer. Findings of ACL 2021. [PDF]


  Education


-- Doctor of Philosophy, Computer Science, Shanghai Jiao Tong University, Sep. 2020 - Present

-- Bachelor of Engineering, Information Engineering, Shanghai Jiao Tong University, Sep. 2016 - Jun. 2020



  Honors


-- Huawei Scholarship

-- Intel Scholarship

-- Third place in Obei Cup (top 4 in China)

-- Outstanding Student Scholarship

-- SJTU Cup Football Champion


  Internship


-- Research Intern, miHoYo LumiNLP

-- Research Intern, ByteDance AI Lab

-- Research Intern, Alibaba DAMO Academy

-- Research Intern, Xiaomi AI Lab

-- Algorithm Engineer Intern, JD Digits

-- Algorithm Engineer Intern, Ping An Insurance


  Links


· Shanghai Jiao Tong University
· Center for Brain-Like Computing and Machine Intelligence
· Department of Computer Science and Engineering