Many problems in representation learning can be posed mathematically as a low-rank tensor decomposition (including the problem underlying my work on syntax-aware compositional word embeddings). One important and general class is the
Tucker decomposition, which factorizes a tensor into a smaller core tensor together with a low-dimensional factor matrix along each mode. While iterative methods exist for computing the Tucker decomposition, the natural least-squares formulation of the problem is non-convex, so the behavior of local search methods on it is not well understood. To rectify this, I characterize the optimization landscape of Tucker decomposition and give an efficient local search algorithm that provably finds a global minimum (paper currently in submission).
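For concreteness, the standard least-squares formulation in the third-order case reads as follows (the symbols $T$, $G$, and $U_k$ are chosen here for exposition and are not taken from the submitted paper): given a tensor $T \in \mathbb{R}^{n_1 \times n_2 \times n_3}$ and a target multilinear rank $(r_1, r_2, r_3)$, one solves
\[
\min_{G,\, U_1,\, U_2,\, U_3} \; \bigl\| T - G \times_1 U_1 \times_2 U_2 \times_3 U_3 \bigr\|_F^2,
\]
where $G \in \mathbb{R}^{r_1 \times r_2 \times r_3}$ is the core tensor, $U_k \in \mathbb{R}^{n_k \times r_k}$ are the factor matrices, and $\times_k$ denotes the mode-$k$ product. The objective is multilinear in the blocks of variables, which is the source of the non-convexity.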