Hi! I am a first-year PhD student at the Language Technologies Institute (LTI), Carnegie Mellon University (CMU), advised by Prof. Chenyan Xiong. My primary research interest is exploring novel ways to efficiently train and apply large language models in sample-limited and computation-limited scenarios. I am currently working on data valuation, ideally derived from model preferences, to enable a more efficient pre-training procedure.

Previously, I graduated from Tsinghua University in 2023 with a major in Computer Science and Technology. I was honored to be a member of THUNLP, advised by Prof. Zhiyuan Liu, where I worked closely with Dr. Tianyu Gao and Dr. Zhengyan Zhang on prompting and few-shot learning. I was also a research intern at UWNLP, advised by Prof. Sheng Wang, and an intern at the Baidu NLP group.

When I am not doing research, I enjoy working out, playing guitar, and watching movies.

Updates: