Xiang WANG

I received my Ph.D. in Computer Science from Duke University in 2022, where I was fortunate to be advised by Prof. Rong Ge. Broadly, I am interested in deep learning theory and non-convex optimization. I did my undergraduate study at Shanghai Jiao Tong University (SJTU).

In summer 2022, I did a machine learning engineering internship at Meta. In summer 2021, I did a research internship at Facebook AI, with Dr. Yuandong Tian as my mentor. In fall 2019, I visited the IAS for the Special Year on Optimization, Statistics, and Theoretical Machine Learning. In spring 2017, I did an internship in the Division of Mathematical Sciences at Nanyang Technological University, advised by Prof. Xiaohui Bei. At SJTU, I conducted research on algorithmic game theory under the supervision of Prof. Fan Wu.

Here is my CV (last updated: Sep. 2021).

Email: xwang AT cs DOT duke DOT edu
Office: LSRC D125
*: alphabetical ordering or equal contribution.

Publications

  1. Xingyu Zhu, Zixuan Wang, Xiang Wang, Mo Zhou, Rong Ge.
    Understanding Edge-of-Stability Training Dynamics with a Minimalist Example. ICLR 2023 [arXiv]

  2. Muthu Chidambaram, Xiang Wang, Chenwei Wu, Rong Ge.
    Provably Learning Diverse Features in Multi-View Data with Midpoint Mixup. [arXiv]

  3. Xiang Wang, Annie N. Wang, Mo Zhou, Rong Ge.
    Plateau in Monotonic Linear Interpolation - A "Biased" View of Loss Landscape for Deep Networks. ICLR 2023 [arXiv]

  4. Xiang Wang, Xinlei Chen, Simon S. Du, Yuandong Tian.
    Towards Demystifying Representation Learning with Non-contrastive Self-supervision. [arXiv]

  5. Muthu Chidambaram, Xiang Wang, Yuzheng Hu, Chenwei Wu, Rong Ge.
    Towards Understanding the Data Dependency of Mixup-style Training. ICLR 2022 (Spotlight) [arXiv]

  6. Rong Ge*, Yunwei Ren*, Xiang Wang*, Mo Zhou*.
    Understanding Deflation Process in Over-parametrized Tensor Decomposition. NeurIPS 2021 [arXiv]

  7. Xiang Wang, Shuai Yuan, Chenwei Wu, Rong Ge.
    Guarantees for Tuning the Step Size using a Learning-to-Learn Approach. ICML 2021 [arXiv]

  8. Xiang Wang*, Chenwei Wu*, Jason Lee, Tengyu Ma, Rong Ge.
    Beyond Lazy Training for Over-parameterized Tensor Decomposition. NeurIPS 2020 [arXiv]

  9. Rohith Kuditipudi, Xiang Wang, Holden Lee, Yi Zhang, Zhiyuan Li, Wei Hu, Sanjeev Arora, Rong Ge.
    Explaining Landscape Connectivity of Low-cost Solutions for Multilayer Nets. NeurIPS 2019 [arXiv][blog]

  10. Rong Ge*, Zhize Li*, Weiyao Wang*, Xiang Wang*.
    Stabilized SVRG: Simple Variance Reduction for Nonconvex Optimization. COLT 2019 [arXiv][poster][slides]

  11. Rong Ge*, Rohith Kuditipudi*, Zhize Li*, Xiang Wang*.
    Learning Two-layer Neural Networks with Symmetric Inputs. ICLR 2019 [arXiv][poster]

  12. Chaoli Zhang, Xiang Wang, Fan Wu, Xiaohui Bei.
    Efficient Auctions with Identity-Dependent Negative Externalities. AAMAS 2018 (Short Paper)

  13. Xiang Wang, Zhenzhe Zheng, Fan Wu, Xiaoju Dong, Shaojie Tang, Guihai Chen.
    Strategy-proof Data Auctions with Negative Externalities. AAMAS 2016 (Short Paper)

Talks

Teaching

Awards

Service