Hi! I am a PhD student at UC San Diego, advised by Prof. Xiaolong Wang.
My research interests mainly lie in deep learning and language models. Recently, I have been working on Test-Time Training, which lets each test instance define its own learning problem, so that the instance itself becomes the target of generalization. This learning problem can be self-supervised, enabling better generalization at test time without access to ground-truth labels.
From my study of Test-Time Training, I have become particularly interested in developing
I am very grateful to have worked with and learned from many great mentors along the way. I spent an unforgettable summer as a visiting student in Prof. Tatsunori Hashimoto's group at Stanford, working with Dr. Yu Sun and other mentors and friends. During my undergraduate years, I worked with Prof. Jingjing Li at UESTC.
‡ indicates equal contribution.
Learning to (Learn at Test Time): RNNs with Expressive Hidden States
Yu Sun‡, Xinhao Li‡, Karan Dalal‡, Jiarui Xu, Arjun Vikram, Genghan Zhang, Yann Dubois, Xinlei Chen, Xiaolong Wang, Sanmi Koyejo, Tatsunori Hashimoto, Carlos Guestrin
arXiv 2024
Learning to (Learn at Test Time)
Yu Sun‡, Xinhao Li‡, Karan Dalal, Chloe Hsu, Sanmi Koyejo, Carlos Guestrin, Xiaolong Wang, Tatsunori Hashimoto, Xinlei Chen
arXiv 2023
Interpretable Open-Set Domain Adaptation via Angular Margin Separation
Xinhao Li, Jingjing Li, Zhekai Du, Lei Zhu, Wen Li
ECCV 2022
Imbalanced Source-Free Domain Adaptation
Xinhao Li, Jingjing Li, Lei Zhu, Guoqing Wang, Zi Huang
ACM MM 2021
Full resume (PDF).