New Challenges in Machine Learning - Robustness and Nonconvexity
Organizers: Ilias Diakonikolas (diakonik@usc.edu), Rong Ge (rongge@cs.duke.edu), Ankur Moitra (moitra@mit.edu)
Machine learning has gone through a major transformation in the last decade. Traditional methods based on convex optimization have been replaced by highly non-convex approaches, including deep learning. In the worst case, the underlying optimization problems are NP-hard. Therefore, to understand their success, we need new tools that characterize properties of natural inputs, and we need to design algorithms that work provably in beyond-worst-case settings. Robustness and nonconvexity are two of the major challenges.
Robustness: Provable learning algorithms are usually designed under strong model assumptions, and their guarantees tend to be brittle when those assumptions fail. Can we design provably robust algorithms? How can we find outliers in high dimensions? And do more modern issues in robustness, such as adversarial machine learning and generative adversarial networks, raise interesting theoretical questions?
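For a flavor of the outlier question, here is a minimal sketch (in Python) of the spectral filtering idea behind several recent robust estimation results: a small set of corruptions that shifts the empirical mean must also inflate the variance along some direction, and the top eigenvector of the empirical covariance exposes that direction. The function name, the `sigma_bound` parameter, and the 95th-percentile removal rule are illustrative choices for this sketch, not taken from any talk at the workshop.

```python
import numpy as np

def filter_outliers(X, sigma_bound, max_rounds=20):
    """Iteratively remove points that correlate with the direction of
    largest excess variance. X is an (n, d) array of samples; sigma_bound
    is the variance we would expect inliers to have along any direction.
    (A hypothetical sketch of the filtering idea, not a tuned algorithm.)"""
    X = X.copy()
    for _ in range(max_rounds):
        mu = X.mean(axis=0)
        centered = X - mu
        cov = centered.T @ centered / len(X)
        # Direction of largest variance in the empirical covariance.
        eigvals, eigvecs = np.linalg.eigh(cov)  # ascending eigenvalues
        top_val, v = eigvals[-1], eigvecs[:, -1]
        if top_val <= sigma_bound:
            return X, mu  # spectrum looks clean; accept the current mean
        # Score each point by its squared projection onto the
        # suspicious direction, and drop the most extreme 5%.
        scores = (centered @ v) ** 2
        X = X[scores < np.quantile(scores, 0.95)]
    return X, X.mean(axis=0)
```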
Nonconvexity: When and why can we solve nonconvex optimization problems in high dimensions? Recent works show that it is possible to escape saddle points and reach a local minimum, and that for several problems all local minima are as good as the global minimum. How can we extend these results to a richer set of problems? In particular, what is missing from the models studied so far when it comes to explaining what actually happens in deep networks?
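As a concrete illustration of the saddle-point phenomenon, here is a minimal sketch (in Python) of perturbed gradient descent: when the gradient is nearly zero, a small random jitter gets amplified along any direction of negative curvature, so the iterate escapes strict saddle points but settles down near local minima. All names and parameter values below are illustrative, not drawn from any specific paper or talk.

```python
import numpy as np

def perturbed_gd(grad, x0, lr=0.05, g_tol=1e-4, radius=1e-2,
                 escape_steps=200, max_steps=20_000, seed=0):
    """Gradient descent with occasional random perturbations.

    When the gradient is tiny, the iterate is near either a local minimum
    or a saddle point. We jitter it randomly: from a strict saddle the
    perturbation is amplified and the iterate escapes, while near a local
    minimum it simply rolls back. (A hypothetical sketch of the
    'escaping saddle points' idea, not a tuned implementation.)"""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    since_perturb = escape_steps  # allow a perturbation immediately
    for _ in range(max_steps):
        g = grad(x)
        if np.linalg.norm(g) < g_tol:
            if since_perturb >= escape_steps:
                # Near-stationary: jitter on a small random sphere.
                noise = rng.normal(size=x.shape)
                x = x + radius * noise / np.linalg.norm(noise)
                since_perturb = 0
            else:
                # Still stationary soon after a jitter: likely a minimum.
                return x
        else:
            x = x - lr * g
        since_perturb += 1
    return x

# Toy objective f(x, y) = x^4/4 - x^2/2 + y^2/2: strict saddle at the
# origin, local (and global) minima at (+1, 0) and (-1, 0).
grad = lambda v: np.array([v[0] ** 3 - v[0], v[1]])
print(perturbed_gd(grad, np.zeros(2)))  # lands near (+1, 0) or (-1, 0)
```

On this toy objective the iterate starts exactly at the saddle, gets kicked off by the first perturbation, and converges to one of the two minima; plain gradient descent started at the origin would stay there forever.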
Schedule
The workshop is on Friday, June 23rd. The morning session is 9:00 - 12:00; the afternoon session is 1:00 - 4:00.
Time | Speaker | Title | Slides |
--- | --- | --- | --- |
9:00 - 9:30 | Ilias Diakonikolas | Efficiency versus Robustness in High-Dimensional Statistics | Slides |
9:40 - 10:10 | Po-Ling Loh | Robust high-dimensional linear regression: A statistical perspective | Slides |
10:20 - 10:50 | Chris De Sa | Analyzing Stochastic Gradient Descent for Some Non-Convex Problems | Slides |
11:00 - 12:00 | Poster Session | | |
12:00 - 1:00 | Lunch Break | | |
1:00 - 1:30 | Sanjeev Arora | How to get into ML: A case study | |
1:40 - 2:10 | Adam Klivans | Identifying Barriers for Learning Neural Networks | |
2:10 - 2:50 | Break | | |
2:50 - 3:20 | John Wright | Nonconvex Recovery of Low-Complexity Models | |
3:30 - 4:00 | Aaron Sidford | Accelerated Methods for Non-Convex Optimization | |
Call for Posters
We invite posters to be presented during the workshop. We solicit posters in all areas of machine learning, including unpublished work and work recently published outside of FOCS/STOC. To submit a poster, please email stoc2017ml@gmail.com with a link to (or attachment of) your paper or a 2-page abstract.