ELEN 6886
Sparse Representation and High-Dimensional Geometry

Fall 2015

1:10-3:40 PM
627 Mudd

Instructor: John Wright
Email: johnwright@ee.columbia.edu  
Office hours (6886 only): Wednesday 11:45 AM - 1 PM, 716 CEPSR

Teaching Assistant: Qing Qu
Email: qq2105@columbia.edu
Office hours: Thursdays 3-5 PM, 707 CEPSR

For course-related email, please start the subject line with "6886: ". This will help me and get you a faster answer. Thanks!

Prerequisites: Linear algebra and probability at the undergraduate level.

There are no other formal prerequisites for this course. We will touch on ideas from signal processing, optimization, and statistics, in a self-contained manner. Background in any of these areas may help students to appreciate certain aspects of the course material. If you're curious about whether you would benefit from the course, contact the instructor.

Text: Instructor's notes will be distributed for each lecture, via this website.

Course organization, grading, and a rough syllabus: syllabus.pdf

Lectures and Course Materials


Supplementary readings, code and examples
Mon. September 14, 2015
What is it all about?
   Introduction to Sparse Modeling
   Motivating Applications
   Sparse Recovery

Lecture notes 1

Introductory material:
Hardness results for sparse recovery:
The material on Kruskal rank, spark and uniqueness of sparse solutions comes from 
Lecture notes are available on CourseWorks. Log in using your UNI, go to the ELEN 6886 tab, and go to "Files and Resources". You can also play with the demos of Face Recognition, MRI, and \(\ell^0\) minimization that we saw in class.

Review: There is a compendium of facts about linear algebra on CourseWorks. 
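If you'd like to experiment beyond the CourseWorks demos, here is a small self-contained sketch (toy data and parameters are my own, not taken from the lecture) of brute-force \(\ell^0\) recovery by enumerating candidate supports. It is only feasible at tiny scale, which is exactly the hardness point above:

```python
import itertools
import numpy as np

# Toy instance: recover a 1-sparse vector from 4 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 8))
x_true = np.zeros(8)
x_true[3] = 2.0
b = A @ x_true

def l0_recover(A, b, max_k=2, tol=1e-8):
    """Search supports of increasing size; return the sparsest x with Ax = b."""
    m, n = A.shape
    for k in range(1, max_k + 1):
        for S in itertools.combinations(range(n), k):
            cols = list(S)
            xs, *_ = np.linalg.lstsq(A[:, cols], b, rcond=None)
            if np.linalg.norm(A[:, cols] @ xs - b) < tol:
                x = np.zeros(n)
                x[cols] = xs
                return x
    return None

x_hat = l0_recover(A, b)
```

Since a generic 4 x 8 Gaussian matrix has Kruskal rank 4, the 1-sparse solution is unique and the search finds it; but the number of supports grows combinatorially, so this approach breaks down quickly as n and k grow.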

Mon. September 21, 2015
Finding sparse solutions efficiently

Lecture notes 1: Kruskal rank and uniqueness

Lecture notes 2 (on CourseWorks)

A few applications we briefly alluded to:

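As a supplement (an illustrative setup of my own, not the course's code), the standard recasting of basis pursuit, \(\min \|x\|_1\) subject to \(Ax = b\), as a linear program via the split \(x = u - v\), \(u, v \ge 0\), can be tried directly with scipy:

```python
import numpy as np
from scipy.optimize import linprog

# Toy sparse recovery problem: 2-sparse x_true, 10 random measurements.
rng = np.random.default_rng(1)
m, n = 10, 30
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[[2, 17]] = [1.5, -0.7]
b = A @ x_true

# Basis pursuit as an LP: minimize sum(u) + sum(v) = ||x||_1
# subject to A u - A v = b,  u, v >= 0.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]
```

The LP has 2n variables instead of the exponentially many supports in the \(\ell^0\) search, which is the whole point of "finding sparse solutions efficiently."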
Mon. September 28, 2015
Correct recovery under incoherence
Lecture notes 2-3
The material on coherence and L1 recovery comes from 
The proof we saw in class comes from
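A quick numerical companion (assumed random matrix, not one from the notes): the mutual coherence \(\mu(A) = \max_{i \neq j} |\langle a_i, a_j \rangle|\) for unit-norm columns is cheap to compute, and the classical guarantee says \(\ell^1\) recovery succeeds for any k-sparse x with \(k < (1 + 1/\mu)/2\):

```python
import numpy as np

# Random 20 x 40 matrix with normalized columns.
rng = np.random.default_rng(2)
A = rng.standard_normal((20, 40))
A /= np.linalg.norm(A, axis=0)

# Mutual coherence: largest off-diagonal entry of |A^T A|.
G = np.abs(A.T @ A)
np.fill_diagonal(G, 0.0)
mu = G.max()

# Largest sparsity level covered by the coherence guarantee.
k_max = int(np.ceil((1 + 1 / mu) / 2)) - 1
```

For random matrices this bound is quite pessimistic (k_max here is tiny), which motivates the sharper analyses later in the course.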
Mon. October 5, 2015
Applications: MRI and Faces
Analysis: The Restricted Isometry Property
Lecture notes 3
The code for the two demos is on CourseWorks.

The proof ideas we saw in class can be found in Lecture notes 3; see also
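A rough empirical probe of the RIP, with parameters of my own choosing (this is not a certificate — computing restricted isometry constants exactly is intractable): sample k-column submatrices of a random Gaussian A and track the worst deviation of the squared singular values from 1, which lower-bounds \(\delta_k\):

```python
import numpy as np

# Random Gaussian matrix, normalized so columns have unit norm in expectation.
rng = np.random.default_rng(5)
m, n, k = 30, 60, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)

# Lower-bound delta_k by sampling random k-column submatrices.
delta = 0.0
for _ in range(200):
    S = rng.choice(n, size=k, replace=False)
    s = np.linalg.svd(A[:, S], compute_uv=False)
    delta = max(delta, abs(s.max() ** 2 - 1.0), abs(s.min() ** 2 - 1.0))
```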

Mon. October 12, 2015
The Johnson-Lindenstrauss lemma
RIP of random matrices
Noisy sparse recovery
Lecture notes 4
The Johnson-Lindenstrauss lemma is from
The JL property is useful in many situations, for example in finding approximate nearest neighbors:
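A quick numerical sanity check of the JL phenomenon (toy dimensions of my own, not a proof): project 50 points from a high-dimensional space with a scaled Gaussian matrix and measure the worst-case distortion of pairwise distances:

```python
import numpy as np

rng = np.random.default_rng(3)
n_points, d, k = 50, 1000, 500
X = rng.standard_normal((n_points, d))

# Scaled Gaussian projection: E ||Phi u||^2 = ||u||^2.
Phi = rng.standard_normal((k, d)) / np.sqrt(k)
Y = X @ Phi.T

def max_distortion(X, Y):
    """Worst relative change of pairwise distances under the projection."""
    worst = 0.0
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            r = np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])
            worst = max(worst, abs(r - 1.0))
    return worst

eps = max_distortion(X, Y)
```

All \(\binom{50}{2}\) distances survive to within a small relative error, even though the target dimension k depends only logarithmically on the number of points, not on d.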
The stability result we quoted for Lasso comes from
See the lecture notes for a streamlined proof.
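To play with noisy sparse recovery numerically, here is a hedged sketch (my own illustrative parameters, and ISTA rather than any solver discussed in class) of minimizing the Lasso objective \(\frac{1}{2}\|Ax - b\|_2^2 + \lambda \|x\|_1\) by iterative soft thresholding:

```python
import numpy as np

# Noisy measurements of a 3-sparse vector.
rng = np.random.default_rng(4)
m, n = 40, 100
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[5, 30, 77]] = [1.0, -1.2, 0.8]
b = A @ x_true + 0.01 * rng.standard_normal(m)

lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L with L = ||A||_2^2

def soft(z, t):
    """Soft-thresholding operator, the proximal map of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# ISTA: gradient step on the quadratic term, then soft-threshold.
x = np.zeros(n)
for _ in range(2000):
    x = soft(x - step * A.T @ (A @ x - b), step * lam)
```

The recovered x lands close to x_true, in line with the flavor of the stability result quoted for Lasso: the estimation error is controlled by the noise level (and the choice of \(\lambda\)).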