**Fall 2016: EE 520: Special Topics: Topics in Statistical Machine Learning**

Instructor: Prof. Namrata Vaswani, email: namrata@iastate.edu

Web link: http://home.engineering.iastate.edu/~namrata/MachineLearning_class/

Time: Mon-Wed 11-12:20

Location: Coover 2222 (until further notice)

This will be a special-topics / seminar course in which we will discuss recent work on statistical machine learning algorithms, their performance guarantees, and applications. One part of the class will be taught by me; the second will involve term-paper presentations by the students. Topics that will be covered include:

- Review of needed background
  - Probability
  - Linear algebra
  - Convex optimization

- Advanced topics on Probability and Linear Algebra (used in papers that we will discuss)

- Recent work on non-convex methods (e.g., alternating minimization or gradient descent) for various problems: low-rank matrix completion, phase retrieval, etc.

- Low-rank matrix completion, robust PCA, and sparse recovery, with guarantees for the proposed methods

- Modern applications

This course will be of interest to graduate students from Electrical and Computer Engineering, Mathematics, Statistics, Computer Science, Industrial Engineering, and other departments. Topics can be added based on the students’ research interests.
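To give a flavor of one of the topics listed above, here is a minimal sketch of alternating minimization for low-rank matrix completion on synthetic data. All variable names, parameter choices, and the stopping rule are illustrative (mine), not taken from any specific paper we will read:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 2
# Ground-truth rank-r matrix M and a random observation mask (~50% observed)
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
mask = rng.random((n, n)) < 0.5

# Spectral initialization: top-r left singular vectors of the
# zero-filled observed matrix
U, _, _ = np.linalg.svd(np.where(mask, M, 0.0), full_matrices=False)
U = U[:, :r]

for _ in range(50):
    # Fix U, solve a small least-squares problem for each column of V,
    # using only the observed entries of that column; then swap roles.
    V = np.zeros((n, r))
    for j in range(n):
        rows = mask[:, j]
        V[j], *_ = np.linalg.lstsq(U[rows], M[rows, j], rcond=None)
    Unew = np.zeros((n, r))
    for i in range(n):
        cols = mask[i, :]
        Unew[i], *_ = np.linalg.lstsq(V[cols], M[i, cols], rcond=None)
    U = Unew

# Relative Frobenius-norm recovery error
err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
```

With enough observed entries per row and column relative to the rank, the alternating least-squares updates drive the recovery error down rapidly; the papers listed below make this precise with incoherence and sampling assumptions.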

**Class time and Location:** Mon-Wed 11-12:20

**Instructor:** Prof. Namrata Vaswani

**Office Hours:** Tues-Wed 2-3 or by appointment

**Email:** namrata AT iastate.edu, **Phone:** 515-294-4012

**Office:** 3121 Coover Hall

**Grading:**

- 10% class participation
- 40% scribe notes for one paper that I will present
- 50% term paper (read and present a paper/topic for one lecture; submit slides and a short report)
**Prerequisites/Corequisites:** EE 523 (or at least EE 322 level knowledge), Math 510 (linear algebra), EE 571 (convex optimization), or at least one of these. I strongly recommend taking Math 510 (also offered in Fall) and going over the EE 322 notes (given below).

**Disability accommodation:** If you have a documented disability and anticipate needing accommodations in this course, please make arrangements to meet with me soon. You will need to provide documentation of your disability to the Disability Resources (DR) office, located on the main floor of the Student Services Building, Room 1076, 515-294-7220.

**Papers that can be presented by students: TBD**

**List of Papers**

**Introduction slides:** Intro

**Background material: probability, linear algebra, and optimization**

**Probability:** quick recap of EE 322 (undergraduate probability for EE), law of large numbers, high-probability tail bounds for random matrix eigenvalues

- EE 322 notes
- EE 322 problem sets (many of these are harder than what I use for my EE 322 offerings)
- Notes
**Linear Algebra:** parts of Chapters 0, 1, 2, 4, 5 of Matrix Analysis, Horn and Johnson

**Optimization:** subset of slides of Vandenberghe and Boyd

**Papers**

**Advanced Topics in Probability and Linear Algebra:**

- Vershynin's review article: Introduction to the Non-Asymptotic Analysis of Random Matrices
- Tropp's paper: User-Friendly Tail Bounds for Sums of Random Matrices (some results)
- Phase Retrieval via Alternating Minimization (Jain, Netrapalli, Sanghavi)
- Solving Random Quadratic Systems of Equations is Nearly as Easy as Solving Linear Equations (Chen and Candes)
- Low-rank Matrix Completion using Alternating Minimization (Jain, Netrapalli, Sanghavi)
- Tentatively: R. Vershynin, Estimation in High Dimensions: A Geometric Perspective
- Extra resource: a useful book on High-Dimensional Probability by Vershynin
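A quick Monte Carlo check of the kind of result these readings prove (a sketch with my own parameter choices): for a tall matrix with i.i.d. standard Gaussian rows, all singular values of X/sqrt(N) concentrate around 1 at rate sqrt(n/N), as in Vershynin's non-asymptotic bounds.

```python
import numpy as np

rng = np.random.default_rng(2)
n, N = 50, 5000  # dimension n, number of samples N >> n

# Rows of X are i.i.d. N(0, I_n). Non-asymptotic random matrix theory
# says that with high probability every singular value of X/sqrt(N)
# lies in [1 - sqrt(n/N) - t, 1 + sqrt(n/N) + t] for small t.
X = rng.standard_normal((N, n))
s = np.linalg.svd(X / np.sqrt(N), compute_uv=False)

lo, hi = 1 - np.sqrt(n / N), 1 + np.sqrt(n / N)
```

Here sqrt(n/N) = 0.1, so the empirical singular values should all fall in roughly [0.9, 1.1], matching the predicted interval up to small fluctuations.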

**Term papers and Scribing – New 9/14/2016**

Term paper topics: PCA etc.

- Review of the PCA literature
- Incremental (or online or streaming) PCA
  - Memory Limited, Streaming PCA, NIPS 2013
  - Online Principal Components Analysis
- Non-convex Robust PCA, NIPS 2014
- Work on partial SVD (top-k singular vectors)
  - Musco and Musco, NIPS 2015: Randomized Block Krylov Methods for Stronger and Faster Approximate Singular Value Decomposition
  - Back-references of this work
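For context on the partial-SVD topic: the classical baseline that the Musco and Musco Krylov method improves on is randomized block power iteration (simultaneous iteration). A minimal sketch on synthetic data with a well-separated top-k spectrum (all parameter choices are mine):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k = 200, 100, 5

# Synthetic matrix with known singular values: five large, well-separated
# ones, then a tail of values below 0.1
U0 = np.linalg.qr(rng.standard_normal((m, n)))[0]
V0 = np.linalg.qr(rng.standard_normal((n, n)))[0]
s_true = np.concatenate([[10.0, 8.0, 6.0, 4.0, 2.0], 0.1 * rng.random(n - k)])
A = U0 @ np.diag(s_true) @ V0.T

# Randomized block power iteration for the top-k left singular vectors:
# repeatedly apply A A^T to a random k-dimensional block, re-orthonormalizing
# with QR at each step to avoid collapse onto the top direction.
Q = np.linalg.qr(rng.standard_normal((m, k)))[0]
for _ in range(30):
    Q = np.linalg.qr(A @ (A.T @ Q))[0]

# Rayleigh-Ritz step: singular values of the small k x n projected matrix
# approximate the top-k singular values of A
s_est = np.linalg.svd(Q.T @ A, compute_uv=False)
```

The convergence rate is governed by the spectral gap ratio sigma_{k+1}/sigma_k per iteration; the block Krylov approach in the paper above achieves stronger guarantees with fewer passes over A.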

Scribe topics:

1. Add to the probability and linear algebra notes
2. Find linear algebra tricks in all the papers we present
3. Find probability tricks in all the papers we present
4. One of the following papers:
   - Phase Retrieval via Alternating Minimization (Jain, Netrapalli, Sanghavi)
   - Solving Random Quadratic Systems of Equations is Nearly as Easy as Solving Linear Equations (Chen and Candes)
   - Low-rank Matrix Completion using Alternating Minimization (Jain, Netrapalli, Sanghavi)

Tentatively: R. Vershynin, Estimation in High Dimensions: A Geometric Perspective

**A Partial List of Papers for Students to Pick From**

**Compressed Sensing:**

- Decoding by Linear Programming (Candes and Tao)
- The Restricted Isometry Property and Its Implications for Compressed Sensing (Candes)
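These papers analyze ℓ1-minimization / linear-programming decoders. Purely as a compact illustration of the underlying phenomenon of sparse recovery from few random measurements, here is a sketch using greedy orthogonal matching pursuit, which is a different and simpler decoder than the one in the papers; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, k = 60, 30, 3  # ambient dimension, number of measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
idx = rng.choice(n, size=k, replace=False)
x_true[idx] = rng.choice([-1.0, 1.0], size=k) * (1 + rng.random(k))
b = A @ x_true  # m << n linear measurements of a k-sparse vector

# Orthogonal matching pursuit: greedily pick the column most correlated
# with the current residual, then re-fit by least squares on the
# support chosen so far.
residual = b.copy()
support = []
for _ in range(k):
    j = int(np.argmax(np.abs(A.T @ residual)))
    support.append(j)
    coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
    residual = b - A[:, support] @ coef

x_hat = np.zeros(n)
x_hat[support] = coef
```

With a random Gaussian measurement matrix and m on the order of k log n, both greedy and LP decoders recover the sparse vector exactly with high probability; the RIP analysis in the papers above gives the deterministic conditions behind this.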
**Matrix Completion and Robust PCA**

- Exact Matrix Completion via Convex Optimization (Recht and Candes)
- Robust Principal Component Analysis? (Candes, Li, Wright, Ma)
**Alternating Minimization (AltMin) solutions for Non-Convex Problems (with appropriate initialization)**

- Low-rank Matrix Completion using Alternating Minimization (Jain, Netrapalli, Sanghavi)
- Phase Retrieval via Alternating Minimization (Jain, Netrapalli, Sanghavi)
**Gradient Descent type solutions for Non-Convex Problems (with appropriate initialization)**

- Phase Retrieval via Wirtinger Flow: Theory and Algorithms (Soltanolkotabi and Candes)
- Solving Random Quadratic Systems of Equations is Nearly as Easy as Solving Linear Equations (Chen and Candes)
- Others
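As a generic illustration of this family of methods (spectral initialization followed by gradient descent, not the exact algorithm or step-size schedule of any one paper above), real-valued phase retrieval can be sketched as follows; parameter choices are mine:

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 20, 200
x = rng.standard_normal(n)
A = rng.standard_normal((m, n))
y = (A @ x) ** 2  # phaseless (magnitude-squared) measurements

# Spectral initialization: top eigenvector of (1/m) sum_i y_i a_i a_i^T,
# scaled using E[y_i] = ||x||^2
Y = (A * y[:, None]).T @ A / m
_, V = np.linalg.eigh(Y)
z = V[:, -1] * np.sqrt(y.mean())

# Gradient descent on f(z) = (1/4m) sum_i ((a_i^T z)^2 - y_i)^2
step = 0.1 / y.mean()
for _ in range(2000):
    Az = A @ z
    grad = A.T @ ((Az ** 2 - y) * Az) / m
    z = z - step * grad

# x is identifiable only up to a global sign flip
err = min(np.linalg.norm(z - x), np.linalg.norm(z + x)) / np.linalg.norm(x)
```

The two-stage structure is the common thread of this section: the spectral initializer lands inside a basin where the non-convex objective behaves well, and plain gradient descent then converges linearly to the signal.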
**Collaborative Ranking: rankings and individualized rankings' estimation**

- Individualized Rank Aggregation using Nuclear Norm Regularization
- Others – TBD

Older EE 520 on Matrix Completion and Robust PCA: here

Even older EE 520 on Compressive Sensing: here