I am a fourth-year PhD student at Fudan University, advised by Prof. Xipeng Qiu.
My research focuses on Large Language Models and Retrieval, especially Retrieval-Augmented LLM Generation, LLM-Augmented Retrieval, and Verifiable Generation with LLMs.

I expect to receive my PhD in June 2025 and will be seeking industry opportunities (US / CN) in 2024.

Education

  • Fudan University
    Ph.D. in Computer Science, 2020 - 2025 (expected)
    Advisor: Prof. Xipeng Qiu
  • Xidian University
    B.E. in Computer Science, 2016 - 2020
    GPA: 3.90/4.00, Rank: 3/400

Experience

  • Microsoft Research Asia
    June 2021 - June 2022
    Advisor: Dr. Yeyun Gong

Publications

* denotes co-first authors

Xiaonan Li *, Changtai Zhu *, Linyang Li, Zhangyue Yin, Tianxiang Sun, Xipeng Qiu
LLatrieval: LLM-Verified Retrieval for Verifiable Generation
arXiv preprint, 2023

Xiaonan Li, Xipeng Qiu
MoT: Memory-of-Thought Enables ChatGPT to Self-Improve
EMNLP 2023

Xiaonan Li, Xipeng Qiu
Finding Support Examples for In-Context Learning
EMNLP 2023 Findings

Xiaonan Li *, Kai Lv *, Hang Yan, Tianyang Lin, Wei Zhu, Yuan Ni, Guotong Xie, Xiaoling Wang, Xipeng Qiu
Unified Demonstration Retriever for In-Context Learning
ACL 2023

Xiaonan Li, Yeyun Gong, Yelong Shen, Xipeng Qiu, Hang Zhang, Bolun Yao, Weizhen Qi, Daxin Jiang, Weizhu Chen, Nan Duan
CodeRetriever: A Large Scale Contrastive Pre-Training Method for Code Search
EMNLP 2022

Xiaonan Li *, Daya Guo *, Yeyun Gong, Yun Lin, Yelong Shen, Xipeng Qiu, Daxin Jiang, Weizhu Chen, Nan Duan
Soft-Labeled Contrastive Pre-training for Function-level Code Representation
EMNLP 2022 Findings

Xiaonan Li, Yunfan Shao, Tianxiang Sun, Hang Yan, Xipeng Qiu, Xuanjing Huang
Accelerating BERT Inference for Sequence Labeling via Early-Exit
ACL 2021

Xiaonan Li, Hang Yan, Xipeng Qiu, Xuanjing Huang
FLAT: Chinese NER Using Flat-Lattice Transformer
ACL 2020

Awards

  • National Scholarship, 2020-2021 (Graduate)
  • National Scholarship, 2016-2017, 2017-2018, and 2018-2019 (Undergraduate)