ELEN 6886
Sparse Representation and High-Dimensional Geometry

Fall 2015
Mondays 1:10-3:40 PM, 627 Mudd

Instructor: John Wright
Email: johnwright@ee.columbia.edu
Office hours (6886 only): Wednesdays 11:45 AM - 1 PM, 716 CEPSR

Teaching Assistant: Qing Qu
Email: qq2105@columbia.edu
Office hours: Thursdays 3-5 PM, 707 CEPSR

For course-related email, please start the subject line with "6886: ". This will help me, and will get you a faster answer. Thanks!

Prerequisites: Linear algebra and probability at the undergraduate level.

There are no other formal prerequisites for this course. We will touch on ideas from signal processing, optimization, and statistics, in a self-contained manner. Background in any of these areas may help students to appreciate certain aspects of the course material. If you're curious about whether you would benefit from the course, contact the instructor.

Text: Instructor's notes will be distributed for each lecture via this website.

Course organization, grading, and a rough syllabus: syllabus.pdf

Lectures and Course Materials

Mon. September 14, 2015
Topic: What is it all about? Introduction to sparse modeling; motivating applications; sparse recovery; $$\ell^0$$-minimization
Readings: Lecture notes 1 (on CourseWorks)
Supplementary readings, code and examples:
- Introductory material: Donoho, Elad and Bruckstein - From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images, SIAM Review 2009; Davenport, Duarte, Eldar and Kutyniok - Introduction to Compressed Sensing, 2011; Wright, Ma, Mairal, Sapiro, Huang and Yan - Sparse Representation for Computer Vision and Pattern Recognition, Proceedings of the IEEE 2010
- Hardness results for sparse recovery: Natarajan - Sparse Approximate Solutions to Linear Systems, SIAM Journal on Computing 1995 (you can access this through the Columbia Library -- log in using your UNI and password); Amaldi and Kann - On the Approximability of Minimizing Nonzero Variables or Unsatisfied Relations in Linear Systems, Theoretical Computer Science 1998
- The material on Kruskal rank, spark and uniqueness of sparse solutions comes from Donoho and Elad - Optimally Sparse Representation in General (Nonorthogonal) Dictionaries via $$\ell^1$$ Minimization, PNAS 2003. See also Gorodnitsky and Rao - Sparse Signal Reconstruction from Limited Data Using FOCUSS: A Re-weighted Minimum Norm Algorithm, IEEE TSP 1997
- Lecture notes are available on CourseWorks: log in using your UNI, go to the ELEN 6886 tab, and then to "Files and Resources". You can also play with the demos of Face Recognition, MRI and $$\ell^0$$ minimization that we saw in class. (A toy brute-force $$\ell^0$$ sketch also appears after the schedule.)
- Review: there is a compendium of facts about linear algebra on CourseWorks.

Mon. September 21, 2015
Topic: Finding sparse solutions efficiently; convexity; $$\ell^1$$-minimization
Readings: Lecture notes 1 (Kruskal rank and uniqueness); Lecture notes 2 (on CourseWorks)
Supplementary readings, code and examples -- a few applications we briefly alluded to:
- Faces: Wright, Yang, Ganesh, Sastry and Ma - Robust Face Recognition via Sparse Representation, PAMI 2009
- Superresolution: Candes and Fernandez-Granda - Towards a Mathematical Theory of Super-resolution, CPAM 2013
- Motion segmentation: Elhamifar and Vidal - Sparse Subspace Clustering: Algorithm, Theory, and Applications, PAMI 2013
- MRI: Lustig, Donoho, Santos and Pauly - Compressed Sensing MRI, Signal Processing Magazine 2008
- Logan's phenomenon: the Columbia library has a copy of Logan's thesis; you can also read more in Donoho and Stark - Uncertainty Principles and Signal Recovery, SIAM Journal on Applied Mathematics 1989
- Music and lyrics: Huang, Chen, Smaragdis and Hasegawa-Johnson - Singing-Voice Separation from Monaural Recordings Using Robust Principal Component Analysis, ICASSP 2012
(A small basis-pursuit LP sketch appears after the schedule.)

Mon. September 28, 2015
Topic: Correct recovery under incoherence
Readings: Lecture notes 2-3
Supplementary readings, code and examples:
- The material on coherence and $$\ell^1$$ recovery comes from Donoho and Elad - Optimally Sparse Representation in General (Nonorthogonal) Dictionaries via $$\ell^1$$ Minimization, PNAS 2003, and Gribonval and Nielsen - Sparse Representations in Unions of Bases, IEEE IT 2003
- The proof we saw in class comes from Fuchs - On Sparse Representations in Arbitrary Redundant Bases, IEEE IT 2004
(A short coherence computation appears after the schedule.)

Mon. October 5, 2015
Topic: Applications: MRI and faces; analysis: the Restricted Isometry Property
Readings: Lecture notes 3
Supplementary readings, code and examples:
- The code for the two demos is on CourseWorks.
- The proof ideas we saw in class can be found in Lecture notes 3; see also Candes, Romberg and Tao - Robust Uncertainty Principles: Exact Signal Reconstruction from Highly Incomplete Frequency Information, 2004, and Candes and Tao - Decoding by Linear Programming, IEEE IT 2005
(An empirical RIP check appears after the schedule.)
Mon. October 12, 2015
Topic: The Johnson-Lindenstrauss lemma; RIP of random matrices; noisy sparse recovery
Readings: Lecture notes 4
Supplementary readings, code and examples:
- The Johnson-Lindenstrauss lemma is from Johnson and Lindenstrauss - Extensions of Lipschitz Mappings into a Hilbert Space, Contemporary Mathematics 1984. See also Dasgupta and Gupta - An Elementary Proof of a Theorem of Johnson and Lindenstrauss, Random Structures and Algorithms 2003
- The JL property is useful in many situations, for example in finding approximate nearest neighbors: Ailon and Chazelle - Approximate Nearest Neighbors and the Fast Johnson-Lindenstrauss Transform, STOC 2006
- The stability result we quoted for the Lasso comes from Bickel, Ritov and Tsybakov - Simultaneous Analysis of Lasso and Dantzig Selector, Annals of Statistics 2009. See the lecture notes for a streamlined proof.
(A small JL projection demo appears after the schedule.)
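The sketches below are small, self-contained Python illustrations of a few of the topics above. They are hypothetical toys written for this page (problem sizes, seeds, and variable names are made up), not the course demo code. First, brute-force $$\ell^0$$-minimization from the September 14 lecture: search over supports of increasing size and accept the first one that fits the equations exactly. The cost is combinatorial, which is exactly why the course turns to convex surrogates.

```python
# Brute-force l0-minimization (toy example; exponential cost, tiny sizes only).
import itertools
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 5, 10, 2
A = rng.standard_normal((m, n))
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x0

def l0_min(A, y, tol=1e-8):
    m, n = A.shape
    for s in range(1, n + 1):                       # supports of size 1, 2, ...
        for S in itertools.combinations(range(n), s):
            S = list(S)
            xS, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
            if np.linalg.norm(A[:, S] @ xS - y) < tol:
                x = np.zeros(n)
                x[S] = xS
                return x                            # sparsest consistent solution
    return None

x_hat = l0_min(A, y)
print("support recovered:", np.flatnonzero(np.abs(x_hat) > 1e-6))
```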
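For the September 21 lecture: $$\ell^1$$-minimization (basis pursuit) can be posed as a linear program. A minimal sketch, assuming scipy is available; the reformulation with slack variables $$t \geq |x|$$ is the standard one, but the sizes and seed here are arbitrary.

```python
# Basis pursuit, min ||x||_1 s.t. A x = y, as an LP over z = [x, t].
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
m, n, k = 20, 50, 3
A = rng.standard_normal((m, n))
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x0

# minimize sum(t) subject to x - t <= 0, -x - t <= 0, A x = y
c = np.concatenate([np.zeros(n), np.ones(n)])
A_ub = np.block([[np.eye(n), -np.eye(n)], [-np.eye(n), -np.eye(n)]])
b_ub = np.zeros(2 * n)
A_eq = np.hstack([A, np.zeros((m, n))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * n + [(0, None)] * n)
x_hat = res.x[:n]
print("recovery error:", np.linalg.norm(x_hat - x0))
```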
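For the September 28 lecture on incoherence: a few lines suffice to compute the mutual coherence $$\mu(A)$$ and evaluate the classical Donoho-Elad / Gribonval-Nielsen sparsity bound $$k < \tfrac{1}{2}(1 + 1/\mu)$$. This is an illustration, not code from the notes.

```python
# Mutual coherence mu(A) = max_{i != j} |<a_i, a_j>| / (||a_i|| ||a_j||).
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((20, 50))
An = A / np.linalg.norm(A, axis=0)        # normalize columns
G = np.abs(An.T @ An)                     # Gram matrix of normalized columns
np.fill_diagonal(G, 0.0)                  # ignore diagonal (self-inner-products)
mu = G.max()
print(f"mu(A) = {mu:.3f}, l1 recovery guaranteed for k < {(1 + 1/mu)/2:.2f}")
```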
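For the October 5 lecture: certifying RIP exactly is intractable, but one can get an empirical feel for it by sampling random column submatrices of a properly scaled Gaussian matrix and recording their extreme singular values. A sketch under that caveat -- sampled supports only give a lower bound on $$\delta_k$$:

```python
# Empirical look at RIP for A with i.i.d. N(0, 1/m) entries.
import numpy as np

rng = np.random.default_rng(3)
m, n, k, trials = 64, 256, 8, 2000
A = rng.standard_normal((m, n)) / np.sqrt(m)

smin, smax = 1.0, 1.0
for _ in range(trials):
    S = rng.choice(n, size=k, replace=False)       # random size-k support
    s = np.linalg.svd(A[:, S], compute_uv=False)   # singular values of A_S
    smin, smax = min(smin, s[-1]), max(smax, s[0])
print(f"over {trials} random supports: sigma(A_S) in [{smin:.3f}, {smax:.3f}]")
# delta_k >= max(1 - smin**2, smax**2 - 1) on the sampled supports.
```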
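For the October 12 lecture: a quick Johnson-Lindenstrauss demo, projecting a point cloud with a scaled Gaussian map and measuring the worst-case distortion of pairwise distances. The dimensions and point count are arbitrary choices for illustration.

```python
# JL demo: random projection from R^d to R^m approximately preserves distances.
import numpy as np

rng = np.random.default_rng(4)
d, m, p = 1000, 100, 50
X = rng.standard_normal((p, d))                    # p points in R^d
Phi = rng.standard_normal((m, d)) / np.sqrt(m)     # scaled Gaussian map
Y = X @ Phi.T                                      # projected points in R^m

ratios = []
for i in range(p):
    for j in range(i + 1, p):
        ratios.append(np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j]))
ratios = np.array(ratios)
print(f"pairwise distance ratios in [{ratios.min():.3f}, {ratios.max():.3f}]")
```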