About Me
My research focuses on efficient training and inference of large language models (LLMs). I received my Bachelor's (2022) and Master's (2025) degrees in Computer Science from Shandong University (Artificial Intelligence Experimental Class), both advised by Prof. Zhaochun Ren. I will soon join Harbin Institute of Technology, Shenzhen (HITSZ) as a Ph.D. student, advised by Prof. Jun Yu. During my undergraduate studies, I earned two Silver Medals at the ACM/ICPC Asia Regional Contest.
News
[May 2025] Our new paper on efficient knowledge distillation for LLMs is now available on arXiv! We propose Low-Rank Clone (💖LRC💖), an efficient pretraining method for small language models (SLMs). Using only about 10B-20B training tokens, LRC matches or even surpasses SOTA models trained on trillions of tokens.
Publications
A Token is Worth over 1,000 Tokens: Efficient Knowledge Distillation through Low-Rank Clone
OmniKV: Dynamic Context Selection for Efficient Long-Context LLMs
MEFT: Memory-Efficient Fine-Tuning through Sparse Adapter
Multi-Defendant Legal Judgment Prediction via Hierarchical Reasoning
Internship Experience
- Baidu (Mar. 2025 - Present), Research Intern
- Ant Group (May 2024 - Oct. 2024), Research Intern
Awards & Honors
- National Scholarship; First-Class Academic Scholarship; and others
- ACM/ICPC Asia Regional Contest, Silver Medal (x2)
- National First Prize, University Student Software Innovation Competition (OPPO Cup)