**Spring 2023**

**EE 425X: Machine Learning: A Signal Processing Perspective**

- Updates/Reminders
- Prerequisites: MATH 207, STAT/EE 322 or equivalent
- **Location, Time:** Pearson 3125, 8:50-9:40am M-W-F
- Instructor: Prof Namrata Vaswani
- Office Hours: TBD
- Office: 3121 Coover Hall
- Email: namrata AT iastate DOT edu; Phone: 515-294-4012
**Grading policy**

**Syllabus:**

- Background material:
  - probability
  - calculus
  - linear algebra
- Supervised Learning
  - Linear Regression
  - Logistic Regression
  - Generative Algorithms (Gaussian & discrete-valued case; Naive Bayes assumption)
  - Support Vector Machines (to be added)
  - Decision trees
- Unsupervised Learning
  - PCA
  - Clustering
- Learning Theory / Bias-Variance Tradeoff
- Introduction to Deep Learning / Neural Networks

- **Access Statement for Students with Documented Disabilities**:

  - Iowa State University is committed to assuring that all educational activities are free from discrimination and harassment based on disability status. Students requesting accommodations for a documented disability are required to meet with staff in Student Accessibility Services (SAS) to establish eligibility and learn about related processes. Eligible students will be provided with a Notification Letter for each course, and reasonable accommodations will be arranged after timely delivery of the Notification Letter to the instructor. Students are encouraged to deliver Notification Letters as early in the semester as possible. SAS, a unit in the Dean of Students Office, is located in room 1076 Student Services Building or online at www.sas.dso.iastate.edu. Contact SAS by email at accessibility@iastate.edu or by phone at 515-294-7220 for additional information. Since this is a small class, I am happy to also provide other options. Please do not hesitate to discuss your needs with me.

- **Homeworks**

  - Most homeworks will be programming assignments. For most, you will work with both simulated data (which helps you make sure you have coded things correctly and that your idea works) and a real dataset. You are not allowed to copy code or solutions from the internet or from anyone in the class. I will post all policies before the first homework is due. You must copy and sign the Honor Code Statement given below.
  - Homeworks will be posted in Canvas.
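The simulated-data sanity check described above can be sketched as follows. This is a minimal illustration, assuming a linear regression assignment; all variable names and constants here are mine, not from a course assignment:

```python
# Sketch of a simulated-data sanity check: generate data from a
# known model, fit it, and confirm the fit recovers the ground
# truth. If the check fails, the fitting code likely has a bug.
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth weights we hope to recover (illustrative values)
w_true = np.array([2.0, -1.0, 0.5])

# Simulated data: y = X w_true + small Gaussian noise
X = rng.standard_normal((500, 3))
y = X @ w_true + 0.01 * rng.standard_normal(500)

# Ordinary least squares fit
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# With correct code, the estimate should be close to the truth
assert np.allclose(w_hat, w_true, atol=0.01)
```

The same pattern applies to any model on the syllabus: pick parameters, simulate data from the model, and verify your estimator recovers them before moving to the real dataset.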

- **Project Details**
  - See Canvas

**Course Material**

- Review
  - Basic probability review (EE 322 recap)
  - Practice problems: http://athenasc.com/prob-supp.html (look at problems for Chapters 1, 2, 3).
  - EE 322 course
  - Too much detail for this course: Probability Recap 3
  - Linear algebra review
- Detailed Notes: **One file with all my notes**
  - Some of these are based on the Stanford cs229 class
  - Course material from http://cs229.stanford.edu/syllabus.html
- Summary Notes
  - Supervised Learning - Linear & Logistic Regression; Generative Algorithms (Gaussian & discrete-valued case; Naive Bayes assumption)
  - Support Vector Machines (to be added)
  - Unsupervised Learning - PCA and Clustering
  - Learning Theory / Bias-Variance Tradeoff
  - Introduction to Deep Learning / Neural Networks
- My graduate course on “Special Topics in Statistical Machine Learning”
- Topics from Estimation/Detection Theory course (EE 527):
  - These use Signal Processing notation: x is the unknown quantity, y is what we observe; the goal is to find an estimate of x, denoted \hat{x}.
  - Least Squares estimation
  - Bayesian Estimation
  - MMSE, linear MMSE estimation and Kalman filtering
    - Joint Gaussian random variables, MMSE and linear MMSE estimation: based on Vincent Poor's book
    - Kalman filtering details and proofs: based on Poor's book
    - Kalman Filter algorithm summary (an old handout): based on Poor's book
  - See Spring 2008 midterm 2 solutions (in WebCT) for
    - the second formula for the Kalman gain
    - KF with control input
    - Extended KF
    - KF for the case when the system noise and observation noise at the same time instant are correlated
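In the signal-processing notation above, Least Squares estimation for a linear model y = A x + noise has the closed form \hat{x} = (A^T A)^{-1} A^T y when A has full column rank. A minimal numerical sketch (the matrix sizes and noise level are illustrative):

```python
# Least Squares in the notation of the EE 527 notes:
# x is the unknown, y = A x + noise is observed, and the LS
# estimate is x_hat = (A^T A)^{-1} A^T y (A full column rank).
import numpy as np

rng = np.random.default_rng(1)

A = rng.standard_normal((100, 4))    # known measurement matrix
x = rng.standard_normal(4)           # unknown quantity
y = A @ x + 0.01 * rng.standard_normal(100)

# Normal-equations form of the LS estimate
x_hat = np.linalg.solve(A.T @ A, A.T @ y)

# With many measurements and small noise, x_hat is close to x
assert np.allclose(x_hat, x, atol=0.05)
```

In practice `np.linalg.lstsq` is preferred over forming A^T A explicitly, since it is numerically better conditioned; the normal-equations form is shown here only because it matches the formula in the notes.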
- Hidden Markov Models (HMM)
  - HMM notes: mostly based on Rabiner's tutorial paper (new notes)
  - Rabiner's tutorial paper
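A core computation in Rabiner's tutorial is the forward algorithm, which evaluates the likelihood of an observation sequence under an HMM. The sketch below uses a toy two-state, two-symbol model of my own choosing (not from the course notes), but the recursion itself is the standard one from the tutorial:

```python
# Forward algorithm for an HMM:
# alpha_t(i) = P(o_1..o_t, state_t = i); summing the final alphas
# gives the likelihood of the whole observation sequence.
import numpy as np

A = np.array([[0.7, 0.3],    # state transition probabilities a_ij
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],    # emission probabilities b_i(o)
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])    # initial state distribution

obs = [0, 1, 0]              # observed symbol sequence (toy example)

# Initialization: alpha_1(i) = pi_i * b_i(o_1)
alpha = pi * B[:, obs[0]]

# Induction: alpha_{t+1}(j) = (sum_i alpha_t(i) a_ij) * b_j(o_{t+1})
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]

likelihood = alpha.sum()     # P(o_1..o_T | model)
assert 0.0 < likelihood < 1.0
```

For long sequences the alphas underflow, so implementations scale each alpha vector (or work in log space), as Rabiner's tutorial discusses.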