Optimization@NIPS
We welcome you to participate in the 6th NIPS Workshop on Optimization for Machine Learning, to be held at:
Lake Tahoe, Nevada, USA, on December 9th, 2013
Invited Talks
- Peter Richtarik (Edinburgh)
Parallel Coordinate Descent Methods for Big Data Optimization
- John Wright (Columbia)
- Adrian Lewis (Cornell)
Identifiability, Nonconvexity, and Sparse Optimization Algorithms
The notion of "identifiability" underpins the active-set philosophy in optimization, and often manifests itself in variational formulations seeking low-dimensional structure from high-dimensional data. Beyond the realm of convexity, identifiability remains a fundamental property, occurring generically in semi-algebraic optimization. I illustrate its relevance for two simple and popular nonconvex algorithms: alternating projections and a proximal algorithm for composite optimization.
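
To make the alternating-projections scheme mentioned in the abstract concrete, here is a minimal Python sketch (not from the talk; the function names, projection operators, and the sphere/line toy problem are all illustrative assumptions). It alternates nearest-point projections onto two sets and, under suitable regularity conditions on the intersection, converges locally to a point lying in both sets:

import numpy as np

def alternating_projections(proj_A, proj_B, x0, max_iter=1000, tol=1e-10):
    # Alternate nearest-point projections onto sets A and B, starting from x0.
    # proj_A and proj_B are assumed to be projection operators (hypothetical names).
    x = x0
    for _ in range(max_iter):
        x_next = proj_B(proj_A(x))
        if np.linalg.norm(x_next - x) < tol:  # stop once the iterates stall
            return x_next
        x = x_next
    return x

# Toy nonconvex example: intersect the unit sphere with a line through the origin.
proj_sphere = lambda x: x / np.linalg.norm(x)   # nearest point on the (nonconvex) sphere
d = np.array([1.0, 2.0]) / np.sqrt(5.0)         # unit direction of the line
proj_line = lambda x: d * (d @ x)               # orthogonal projection onto the line

x = alternating_projections(proj_sphere, proj_line, np.array([3.0, -1.0]))
print(x)  # approaches a point of the intersection (here, +d or -d)

In this toy problem the intersection consists of the two points where the line meets the sphere, and the method converges to whichever of them lies on the same side as the starting point.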