*: indicates alphabetical ordering or equal contribution.
I received my Ph.D. in Computer Science from
Duke University in 2022, where I was fortunate to be advised by Prof. Rong Ge.
Broadly, I am interested in deep learning theory and non-convex optimization.
I completed my undergraduate studies at Shanghai Jiao Tong University (SJTU).
In summer 2022, I was a machine learning engineer intern at Meta.
In summer 2021, I did a research internship at Facebook AI, with Dr. Yuandong Tian as my mentor. In fall 2019, I visited the IAS for the Special Year on Optimization, Statistics, and Theoretical Machine Learning. In spring 2017, I was an intern in the Division of Mathematical Sciences at Nanyang Technological University, advised by Prof. Xiaohui Bei.
At SJTU, I conducted research on algorithmic game theory under the supervision of Prof. Fan Wu.
Here is my CV (last updated: Sep. 2021).
Email: xwang AT cs DOT duke DOT edu
Office: LSRC D125
- Xingyu Zhu, Zixuan Wang, Xiang Wang, Mo Zhou, Rong Ge.
Understanding Edge-of-Stability Training Dynamics with a Minimalist Example. ICLR 2023 [arXiv]
- Muthu Chidambaram, Xiang Wang, Chenwei Wu, Rong Ge.
Provably Learning Diverse Features in Multi-View Data with Midpoint Mixup. [arXiv]
- Xiang Wang, Annie N. Wang, Mo Zhou, Rong Ge.
Plateau in Monotonic Linear Interpolation - A "Biased" View of Loss Landscape for Deep Networks. ICLR 2023 [arXiv]
- Xiang Wang, Xinlei Chen, Simon S. Du, Yuandong Tian.
Towards Demystifying Representation Learning with Non-contrastive Self-supervision. [arXiv]
- Muthu Chidambaram, Xiang Wang, Yuzheng Hu, Chenwei Wu, Rong Ge.
Towards Understanding the Data Dependency of Mixup-style Training. ICLR 2022 (Spotlight) [arXiv]
- Rong Ge*, Yunwei Ren*, Xiang Wang*, Mo Zhou*.
Understanding Deflation Process in Over-parametrized Tensor Decomposition. NeurIPS 2021 [arXiv]
- Xiang Wang, Shuai Yuan, Chenwei Wu, Rong Ge.
Guarantees for Tuning the Step Size using a Learning-to-Learn Approach. ICML 2021 [arXiv]
- Xiang Wang*, Chenwei Wu*, Jason Lee, Tengyu Ma, Rong Ge.
Beyond Lazy Training for Over-parameterized Tensor Decomposition. NeurIPS 2020 [arXiv]
- Rohith Kuditipudi, Xiang Wang, Holden Lee, Yi Zhang, Zhiyuan Li, Wei Hu, Sanjeev Arora, Rong Ge.
Explaining Landscape Connectivity of Low-cost Solutions for Multilayer Nets. NeurIPS 2019 [arXiv][blog]
- Rong Ge*, Zhize Li*, Weiyao Wang*, Xiang Wang*.
Stabilized SVRG: Simple Variance Reduction for Nonconvex Optimization. COLT 2019 [arXiv][poster][slides]
- Rong Ge*, Rohith Kuditipudi*, Zhize Li*, Xiang Wang*.
Learning Two-layer Neural Networks with Symmetric Inputs. ICLR 2019 [arXiv][poster]
- Chaoli Zhang, Xiang Wang, Fan Wu, Xiaohui Bei.
Efficient Auctions with Identity-Dependent Negative Externalities. AAMAS 2018 (Short Paper)
- Xiang Wang, Zhenzhe Zheng, Fan Wu, Xiaoju Dong, Shaojie Tang, Guihai Chen.
Strategy-proof Data Auctions with Negative Externalities. AAMAS 2016 (Short Paper)
- Stabilized SVRG: Simple Variance Reduction for Nonconvex Optimization.
COLT 2019, Jun 2019.
- CS630: Randomized Algorithms, Spring 2018. TA
- CS330: Algorithm Design, Fall 2018. TA
- Outstanding Ph.D. Research Initiation Project Award, Duke University (Sep. 2019)
- Outstanding Graduate Award, Shanghai Jiao Tong University (Jun. 2017)
- Elite Collegiate Award, China Computer Federation (Oct. 2016)
- Journal refereeing: Journal of Machine Learning Research
- Conference refereeing: STOC, ICML, NeurIPS, COLT