Chudi Zhong

I am Chudi Zhong, a Ph.D. student in computer science at Duke University, advised by Prof. Cynthia Rudin. I also work closely with Prof. Margo Seltzer on my research projects. I received my B.S. in Statistics and Information Science from UNC-Chapel Hill and my M.S. in Statistical Science from Duke.

My primary research interest lies at the intersection of machine learning, optimization, and human-model interaction. My goal is to develop machine learning systems that are easier to troubleshoot, interact with, and gain knowledge from. Specifically, I work on optimizing interpretable models and exploring their Rashomon sets.

Selected Projects

Sparse Decision Tree Optimization

We developed an optimal sparse decision tree algorithm called Generalized and Scalable Optimal Sparse Decision Trees (GOSDT). It combines branch-and-bound and dynamic programming with custom bit-vector libraries and computation reuse. Unlike greedy splitting-and-pruning methods such as CART and C4.5, GOSDT produces provably optimal trees that, surprisingly often, are comparable to black-box models in prediction accuracy.
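The core ideas of bit-vector subsets and computation reuse can be illustrated with a toy example. This is not the GOSDT implementation: the tiny dataset, the depth-limited search, and all function names below are made up for illustration. Each subset of samples is a Python integer bitmask, so a split is a bitwise AND, and memoizing on the bitmask reuses work whenever different split sequences reach the same subset.

```python
from functools import lru_cache

# Toy dataset: rows of binary features plus a binary label (illustrative only).
X = [(0, 0), (0, 1), (1, 0), (1, 1), (1, 1), (0, 0)]
y = [0, 1, 0, 1, 1, 0]
n, d = len(X), len(X[0])

# One bit-vector per feature: bit i is set when sample i has that feature
# equal to 1, so "send samples left on feature f" is a single bitwise AND.
feature_bits = [sum(1 << i for i in range(n) if X[i][f]) for f in range(d)]
label_bits = sum(1 << i for i in range(n) if y[i])
FULL = (1 << n) - 1  # bitmask for the whole dataset

def leaf_errors(support):
    """Misclassifications if the samples in `support` become one leaf."""
    pos = bin(support & label_bits).count("1")
    neg = bin(support).count("1") - pos
    return min(pos, neg)

@lru_cache(maxsize=None)
def best_tree_errors(support, depth):
    """Fewest errors reachable on `support` with at most `depth` more splits.
    Memoizing on the support bitmask is the computation-reuse idea: two
    different split orders that isolate the same samples share one entry."""
    best = leaf_errors(support)
    if depth == 0 or best == 0:  # bound: a perfect subtree can't be beaten
        return best
    for f in range(d):
        left = support & feature_bits[f]
        right = support & ~feature_bits[f] & FULL
        if left == 0 or right == 0:
            continue  # this split does not separate any samples
        best = min(best,
                   best_tree_errors(left, depth - 1)
                   + best_tree_errors(right, depth - 1))
    return best

print(best_tree_errors(FULL, 2))  # errors of the best depth-2 tree
```

On this toy data the label equals the second feature, so a single split already reaches zero errors. The real algorithm adds sparsity penalties and much tighter bounds; this sketch only shows why bitmask keys make subproblem reuse cheap.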

To further improve the scalability of optimal classification tree algorithms, we proposed a set of techniques that reduce the time needed to produce optimal sparse trees by two to three orders of magnitude. This allows us to produce high-performing interpretable models on much larger and more challenging datasets than ever before.

Paper Code

Sparse Generalized Additive Models

We presented a fast and scalable algorithm that trains generalized additive models by directly solving an ℓ0-regularized logistic regression problem, using a set of techniques built around coordinate descent and local swaps. These techniques allow our algorithm to handle thousands of observations and features in minutes while achieving accuracy and AUC comparable to black-box models on real datasets.
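A minimal sketch of the coordinate-descent part of this idea (the local-swap moves and all of the speedup techniques are omitted; the synthetic data, penalty value, and function names are illustrative, not our actual implementation): sweep over coordinates, and for each one keep whichever of "set to zero" or "optimize it" has the lower ℓ0-penalized logistic loss.

```python
import math
import random

random.seed(0)

# Synthetic data: the label depends only on the first two of five features.
n, d = 200, 5
X = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n)]
y = [1 if (2 * x[0] - 3 * x[1] + random.gauss(0, 0.3)) > 0 else 0 for x in X]

def penalized_loss(beta, lam):
    """Mean logistic loss plus an l0 penalty: lam per nonzero coefficient."""
    loss = 0.0
    for xi, yi in zip(X, y):
        z = sum(b * v for b, v in zip(beta, xi))
        loss += math.log(1 + math.exp(-z if yi else z))
    return loss / n + lam * sum(b != 0 for b in beta)

def optimize_coord(beta, j, steps=20, lr=0.5):
    """A few gradient steps on coordinate j with the others held fixed."""
    for _ in range(steps):
        g = 0.0
        for xi, yi in zip(X, y):
            z = sum(b * v for b, v in zip(beta, xi))
            p = 1 / (1 + math.exp(-z))  # predicted probability
            g += (p - yi) * xi[j]
        beta[j] -= lr * g / n
    return beta

def fit(lam=0.05, sweeps=5):
    beta = [0.0] * d
    for _ in range(sweeps):
        for j in range(d):
            # l0 step: compare zeroing coordinate j against optimizing it,
            # and keep whichever gives the lower penalized objective.
            zeroed = beta[:]
            zeroed[j] = 0.0
            tuned = optimize_coord(beta[:], j)
            beta = min(zeroed, tuned, key=lambda b: penalized_loss(b, lam))
    return beta

beta = fit()
print([round(b, 2) for b in beta])  # noise coordinates stay exactly at 0
```

Because the ℓ0 penalty is compared directly rather than relaxed to ℓ1, coefficients are either exactly zero or unshrunk, which is what keeps the resulting additive model sparse.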

Paper Code

Rashomon Set of Interpretable Models

A set of well-performing models is known as the Rashomon set. The term comes from the Rashomon effect coined by Leo Breiman, which describes the fact that many equally good models can explain the same data well. In this project, we aim to build a framework for trustworthy machine learning by constructing a set of near-optimal interpretable models and giving users unprecedented flexibility to visualize, explore, select, and modify not just one but many well-performing models.

We have developed algorithms that efficiently and accurately construct the Rashomon set of sparse decision trees (TreeFARMS) and of sparse generalized additive models, as well as tools that facilitate human-model interaction using the Rashomon set.
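What a Rashomon set is can be shown with a deliberately tiny example (this is not the TreeFARMS algorithm, which avoids brute-force enumeration; the dataset, the stump model class, and the tolerance below are all made up for illustration): enumerate a small model class, find the best training error, and keep every model within a tolerance of it.

```python
# Toy data: four binary features; features 0 and 1 are both good predictors.
X = [(0, 0, 1, 0), (1, 1, 0, 1), (1, 1, 1, 0),
     (0, 0, 0, 1), (1, 1, 0, 0), (0, 1, 1, 1)]
y = [0, 1, 1, 0, 1, 0]

def stump_errors(f):
    """Errors of the better of the two stumps that split on feature f."""
    e_pos = sum(xi[f] != yi for xi, yi in zip(X, y))  # predict y = x_f
    e_neg = len(y) - e_pos                            # predict y = 1 - x_f
    return min(e_pos, e_neg)

errors = {f: stump_errors(f) for f in range(len(X[0]))}
best = min(errors.values())
eps = 1  # tolerance: admit models with at most one extra error

# The Rashomon set: every model whose loss is within eps of optimal.
rashomon = [f for f, e in errors.items() if e <= best + eps]
print(rashomon)  # -> [0, 1]: two different near-optimal explanations
```

Even in this toy case the set contains more than one model, which is exactly what makes it useful: a user can pick among near-optimal models by criteria (fairness, monotonicity, preferred features) that the loss alone does not capture.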

Paper Code Visualization Tool

Publications

( indicates co-first authors, equal contribution)

Exploring and Interacting with the Set of Good Sparse Generalized Additive Models.

Advances in Neural Information Processing Systems (NeurIPS), 2023.

Chudi Zhong, Zhi Chen, Margo Seltzer, Cynthia Rudin

OKRidge: Scalable Optimal k-Sparse Ridge Regression for Learning Dynamical Systems.

Advances in Neural Information Processing Systems (NeurIPS), 2023. (Spotlight)

Jiachang Liu, Sam Rosen, Chudi Zhong, Cynthia Rudin

Exploring the Whole Rashomon Set of Sparse Decision Trees.

Advances in Neural Information Processing Systems (NeurIPS), 2022. (Oral)

Rui Xin, Chudi Zhong, Zhi Chen, Takuya Takagi, Margo Seltzer, Cynthia Rudin

Finalist for the INFORMS 2022 Data Mining Best Student Paper Competition Award

FasterRisk: Fast and Accurate Interpretable Risk Scores.

Advances in Neural Information Processing Systems (NeurIPS), 2022.

Jiachang Liu, Chudi Zhong, Boxuan Li, Margo Seltzer, Cynthia Rudin

TimberTrek: Exploring and Curating Trustworthy Decision Trees with Interactive Visualization.

IEEE Visualization and Visual Analytics (VIS), 2022.

Zijie Wang, Chudi Zhong, Rui Xin, Takuya Takagi, Zhi Chen, Duen Horng Chau, Cynthia Rudin, Margo Seltzer

Fast Sparse Decision Tree Optimization via Reference Ensembles.

AAAI Conference on Artificial Intelligence, 2022.

Hayden McTavish, Chudi Zhong, Reto Achermann, Ilias Karimalis, Jacques Chen, Cynthia Rudin, Margo Seltzer

Fast Sparse Classification for Generalized Linear and Additive Models.

International Conference on Artificial Intelligence and Statistics (AISTATS), 2022.

Jiachang Liu, Chudi Zhong, Margo Seltzer, Cynthia Rudin

Interpretable Machine Learning: Fundamental Principles and 10 Grand Challenges.

Statistics Surveys, 2022.

Cynthia Rudin, Chaofan Chen, Zhi Chen, Haiyang Huang, Lesia Semenova, Chudi Zhong

Generalized and Scalable Optimal Sparse Decision Trees.

International Conference on Machine Learning (ICML), 2020.

Jimmy Lin, Chudi Zhong, Diane Hu, Cynthia Rudin, Margo Seltzer