About me
I am a Ph.D. candidate (since September 2023) in the Chinese Information Processing Laboratory at the Institute of Software, Chinese Academy of Sciences (ISCAS), under the supervision of Professor Xianpei Han.
My research interests include:
- LLM Hallucination
- Retrieval-Augmented Generation
- Knowledge Graph
Publications
2025
DeepRAG: Thinking to Retrieval Step by Step for Large Language Models
Xinyan Guan, Jiali Zeng, Fandong Meng, Chunlei Xin, Yaojie Lu, Hongyu Lin, Xianpei Han, Le Sun, Jie Zhou
arXiv preprint
[Paper]
PPTAgent: Generating and Evaluating Presentations Beyond Text-to-Slides
Hao Zheng, Xinyan Guan*, Hao Kong, Jia Zheng, Hongyu Lin, Yaojie Lu, Ben He, Xianpei Han, Le Sun
arXiv preprint arXiv:2501.03936
[Paper] [Code]
2024
Search, Verify and Feedback: Towards Next Generation Post-training Paradigm of Foundation Models via Verifier Engineering
Xinyan Guan, Yanjiang Liu, Xinyu Lu, Boxi Cao, Ben He, Xianpei Han, Le Sun, Jie Lou, Bowen Yu, Yaojie Lu, Hongyu Lin
arXiv preprint arXiv:2411.11504
[Paper] [Github]
Mitigating Large Language Model Hallucinations via Autonomous Knowledge Graph-based Retrofitting
Xinyan Guan, Yanjiang Liu, Hongyu Lin, Yaojie Lu, Ben He, Xianpei Han, Le Sun
AAAI Conference on Artificial Intelligence (AAAI 2024)
[Paper]
REInstruct: Building Instruction Data from Unlabeled Corpus
Shu Chen, Xinyan Guan, Yaojie Lu, Hongyu Lin, Xianpei Han, Le Sun
Annual Meeting of the Association for Computational Linguistics (ACL 2024)
[Paper]
On-Policy Fine-grained Knowledge Feedback for Hallucination Mitigation
Xueru Wen, Xinyu Lu, Xinyan Guan, Yaojie Lu, Hongyu Lin, Ben He, Xianpei Han, Le Sun
arXiv preprint arXiv:2406.12221
[Paper]
2022
Improving Temporal Generalization of Pre-trained Language Models with Lexical Semantic Change
Zhaochen Su, Zecheng Tang, Xinyan Guan, Juntao Li, Lijun Wu, Min Zhang
Conference on Empirical Methods in Natural Language Processing (EMNLP 2022)
[Paper]
Internship Experience
- Research Intern at Tencent, WeChat AI (Aug. 2024 – Present)
  - Conducting research on Retrieval-Augmented Generation (RAG) to improve language model effectiveness and efficiency.
- Research Intern at Xiaohongshu Inc. (Nov. 2023 – Jul. 2024)
  - Researched supervised fine-tuning data quality and diversity to enhance model performance.
  - Developed and optimized Megatron-based alignment algorithms for large language models.
Frequent Collaborators
I work closely with the following researchers and sincerely appreciate their guidance and insightful discussions, which have greatly contributed to my research.
- Yanjiang Liu (Chinese Information Processing Laboratory, ISCAS)
- Chunlei Xin (Chinese Information Processing Laboratory, ISCAS)
- Hongyu Lin (Chinese Information Processing Laboratory, ISCAS)