**EE 527: Detection and Estimation Theory (Spring 2018)**

**Updates/Reminders**

**Prerequisites:** EE 224, EE 322, basic calculus & linear algebra. Suggested co-requisite: EE 523

**Location, Time:** Howe 1226, Tues-Thurs 2:10-3:30

**Instructor:** Prof. Namrata Vaswani
**Office Hours:** Wed 11-12, Thurs 11-12
**Office:** 3121 Coover Hall
**Email:** namrata AT iastate DOT edu
**Phone:** 515-294-4012

**Grading policy**
- Homeworks: 10%
- Midterm Exam: 30%
- Final Exam: 40%
- Project / term paper: 20%

**Exam Dates and Project Details and Deadlines**

**Exam dates**
- Midterm: March 8

**Project / Term Paper details:**
- Pick a topic related to the course (it can be related to your research, but cannot be research you have already done).
- Either pick one or more papers, implement the algorithm(s) for an application, and discuss pros and cons / what else can be done;
- or pick a theoretical paper and present the problem, the solution approach, the guarantee, and the proof (or most of the proof; carefully select a paper whose proof is not too long).
- Submission requirements: a report write-up and a presentation (at most an hour, can be shorter).
**Syllabus:**
- Background material: recap of probability, calculus, linear algebra
- Estimation Theory
  - Minimum variance unbiased estimation, best linear unbiased estimation
  - Cramer-Rao lower bound (CRLB)
  - Maximum likelihood estimation (MLE): exact and approximate methods (EM, alternating max, etc.)
  - Bayesian inference & least squares estimation (from Kailath et al.'s Linear Estimation book)
    - Basic ideas, adaptive techniques, recursive LS, etc.
    - Kalman filtering (sequential Bayes)
  - Finite-state Hidden Markov Models: forward-backward algorithm, Viterbi (ML state estimation), parameter estimation (forward-backward + EM)
  - Graphical Models
  - **Applications**: image processing, speech, communications (to be discussed with each topic)
- Sparse Recovery and Compressive Sensing introduction
- Monte Carlo methods: importance sampling, MCMC, particle filtering; applications in numerical integration (MMSE estimation or error probability computation) and in numerical optimization (e.g. annealing)
- Detection Theory
  - Likelihood ratio testing, Bayes detectors
  - Minimax detectors
  - Multiple hypothesis tests
  - Neyman-Pearson detectors (matched filter, estimator-correlator, etc.)
  - Wald sequential test
  - Generalized likelihood ratio tests (GLRTs), Wald and Rao scoring tests
  - Applications
- The syllabus is similar to Prof. Dogandzic's EE 527, but I will cover least squares estimation, Kalman filtering, and Monte Carlo methods in more detail and will also discuss some image/video processing applications. Note that LSE and KF are also covered in EE 524, but different perspectives are always useful.
**Books:**

**Textbook:** S.M. Kay's Fundamentals of Statistical Signal Processing: Estimation Theory (Vol 1), Detection Theory (Vol 2)

**References**
- Kailath, Sayed and Hassibi, *Linear Estimation*
- H.V. Poor, *An Introduction to Signal Detection and Estimation*
- H. Van Trees, *Detection, Estimation, and Modulation Theory*
- J.S. Liu, *Monte Carlo Strategies in Scientific Computing*, Springer-Verlag, 2001.
- B.D. Ripley, *Stochastic Simulation*, Wiley, 1987.

**Disability accommodation:** If you have a documented disability and anticipate needing accommodations in this course, please contact the instructor.

**Homeworks**

**Homework 1: due Mon Jan 29**
- New: discuss and explain rank and spark.
- New: prove the interlacing theorem for the matrix (A + zz^T), where z is a vector.
- New: prove all the if-and-only-if statements for jointly Gaussian random variables from this document.
- Chapter 3 of the Supplementary Problems for Bertsekas's probability text: Problems 5, 6, 8, 9, 10, 14, 18, 19, 20, 21.
  - **Correction:** Suppose X1 is N(0,1), X2 is 1 w.p. 1/2 and -1 w.p. 1/2, and X3 = X1 · X2. Compute the pdf of X3 and compute the joint pdf of X1 and X3.
- Practice problems: use this link to do selected practice problems from Chapters 1 and 2: EE 322 Fall 2007 homework sets (do not need to be submitted).
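For the interlacing problem, it can help to see the statement numerically before proving it. The sketch below (not part of the assignment; the matrix and vector are just random examples) checks that the sorted eigenvalues of A and of A + zz^T interlace when zz^T is a rank-one positive-semidefinite update:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# Random symmetric A and a rank-one PSD update z z^T
B = rng.standard_normal((n, n))
A = (B + B.T) / 2
z = rng.standard_normal(n)

lam = np.linalg.eigvalsh(A)                   # ascending eigenvalues of A
mu = np.linalg.eigvalsh(A + np.outer(z, z))   # ascending eigenvalues of A + z z^T

# Interlacing: lam[i] <= mu[i] for all i, and mu[i] <= lam[i+1] for i < n-1
assert np.all(lam <= mu + 1e-10)
assert np.all(mu[:-1] <= lam[1:] + 1e-10)
```

The proof you are asked for establishes exactly these inequalities; the numerical check is only a sanity test on one random instance.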
**Homework 2: due Mon Feb 12**
- Problems 2.1, 2.4, 2.7, 2.9, 2.10 of Kay-I. Bonus: 2.8
**Homework 3: due Tues Feb 23**
- Problems 5.2, 5.3, 5.4, 5.5, 5.7, 5.13, 5.16
- Compute the MVUE for a N(\mu, \sigma^2) distribution using N i.i.d. observations. Also compute the covariance matrix of the MVUE estimator.
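As a quick empirical companion to the MVUE problem (a sketch only, not a substitute for the derivation): for i.i.d. N(\mu, \sigma^2) data, the sample mean and the unbiased sample variance are the standard MVUE pair, and simulation shows both landing on the true parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
mu_true, sigma_true, N = 2.0, 3.0, 100_000
x = rng.normal(mu_true, sigma_true, size=N)

mu_hat = x.mean()        # sample mean, MVUE for mu
s2_hat = x.var(ddof=1)   # unbiased sample variance (divide by N-1), MVUE for sigma^2
```

With N = 100,000 draws, `mu_hat` is within a few hundredths of 2 and `s2_hat` within a few tenths of 9; the homework asks you to show these estimators are MVUE and to derive their covariance, which the simulation cannot do.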
**Homework 4: due Thurs March 3**
- Problems 3.1, 3.3, 3.9, 3.11
- Do the following set of problems: practice set (will be graded for completion; ignore the deadlines written on it)
**Homework 5: due Tuesday March 9**
- Problems 4.2, 4.5, 4.6, 4.10, 4.13, 4.14
- Problems 6.1, 6.2, 6.5, 6.7, 6.9, 6.16
- Extra credit: 6.8, 6.14, 6.10

**Homework 6: due Thursday March 31**
- Problems 7.6, 7.7, 7.14, 7.18, 7.19
- Problems 8.24, 8.26, 8.27 (skip the Newton-Raphson part)
- Practice problems (I suggest doing at least two): 8.4, 8.12, 8.28, 8.29
**Homework 7: Due Thursday April 7**

**Course Handouts**

**Introduction slides**

**Linear Algebra and Probability Review and New Material**
- Basic probability review (EE 322 recap)
  - Practice problems: http://athenasc.com/prob-supp.html (look at the problems for Chapters 1, 2, 3)
  - EE 322 course
- **Probability Review - 2** (Prof. ALD's notes)
- New Probability Background Needed
- Linear Algebra Review and New Material
**Classical Estimation**

**Recap notes for all topics: fill in the gaps in the notes below**

**Minimum (classical) MSE estimation, MVUE, Sufficient Statistics, MLE**
- H1: Minimum Variance Unbiased Estimation (Prof. ALD's notes); Chapters 2, 3, 5 of Kay-I
- H2: Cramer-Rao Bound and Efficient Estimators (Prof. ALD's notes)
- H3: Linear Models, Best Linear Unbiased Estimator and ML Estimation (Prof. ALD's notes)
**EM algorithm**
- Dempster paper
- See scanned notes in WebCT
- Prof. ALD's handout
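To make the E-step/M-step alternation concrete, here is a minimal EM sketch for a toy problem: a two-component 1-D Gaussian mixture with unit variances and equal weights, where only the two means are unknown. The data, initialization, and simplifying assumptions are mine, not from the handouts:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic 1-D data: two-component Gaussian mixture, true means -2 and 3
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])

mu = np.array([-1.0, 1.0])  # initial guesses for the two means
for _ in range(50):
    # E-step: posterior responsibility of each component for each sample
    # (equal weights and unit variances assumed, so only the exponent matters)
    d = np.exp(-0.5 * (x[:, None] - mu[None, :]) ** 2)
    r = d / d.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted sample means
    mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)

mu = np.sort(mu)  # sort to fix label ordering
```

Each iteration provably does not decrease the observed-data likelihood; here the means converge close to the true values (-2, 3).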
**Least Squares estimation**

**LS estimation summary**
- Also see Chapter 8 of Kay-I
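A minimal numerical sketch of LS estimation in the linear model y = H\theta + noise (the matrix, parameter, and noise level below are arbitrary examples), solving the normal equations H^T H \theta = H^T y:

```python
import numpy as np

rng = np.random.default_rng(3)
# Linear model: y = H theta + noise
H = rng.standard_normal((200, 3))
theta_true = np.array([1.0, -2.0, 0.5])
y = H @ theta_true + 0.1 * rng.standard_normal(200)

# LS estimate via the normal equations: (H^T H) theta_hat = H^T y
theta_hat = np.linalg.solve(H.T @ H, H.T @ y)
```

With 200 equations, 3 unknowns, and small noise, `theta_hat` recovers `theta_true` to within about a hundredth per component. In practice `np.linalg.lstsq` (QR/SVD based) is preferred numerically over forming H^T H explicitly.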
**Sparse Recovery / Compressive Sensing**

**Introduction**

**Old slides: Introduction to Compressive Sensing**
- Compressive Sensing class (EE 520)
- CS papers' archive
**Bayesian estimation**

**MMSE and linear MMSE estimation and Kalman filtering**
- Jointly Gaussian random variables, MMSE and linear MMSE estimation: based on Poor's book
- Kalman filtering details and proofs: based on Poor's book
- **Kalman Filter algorithm summary (an old handout)**: based on Poor's book
- See Spring 2008 midterm 2 solutions (in WebCT) for:
  - a second formula for the Kalman gain
  - KF with a control input
  - Extended KF
  - KF for the case when the system noise and the observation noise at the same time are correlated
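As a toy illustration of the predict/update recursion in the handouts, here is a scalar Kalman filter sketch for x_t = a x_{t-1} + w_t, y_t = x_t + v_t (the parameter values are made-up examples, not from the course notes):

```python
import numpy as np

rng = np.random.default_rng(4)
a, q, r, T = 0.95, 0.1, 0.5, 200  # state coeff., process var., obs. var., horizon

# Simulate the state sequence and noisy observations
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + np.sqrt(q) * rng.standard_normal()
y = x + np.sqrt(r) * rng.standard_normal(T)

# Kalman filter: predict, then correct with the Kalman gain
xh, P = 0.0, 1.0          # initial estimate and its variance
xs = np.zeros(T)
for t in range(T):
    xp, Pp = a * xh, a * a * P + q   # prediction and predicted variance
    K = Pp / (Pp + r)                # Kalman gain
    xh = xp + K * (y[t] - xp)        # measurement update (innovation correction)
    P = (1 - K) * Pp                 # updated error variance
    xs[t] = xh
```

The filtered estimates `xs` have noticeably lower mean-squared error against the true state than the raw observations `y`, which is the point of the recursion.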
**Some extra things**

**Bayesian Inference using Prof. ALD's notes**: some parts of this are useful

**Graphical models**

**Graphical models** (Prof. ALD's notes): an approach for handling conditional dependencies in Bayesian estimation (Prof. ALD's handout)

**Hidden Markov Models (HMM)**
- HMM notes, mostly based on Rabiner's tutorial paper **(new notes)**
- Rabiner's tutorial paper
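The Viterbi recursion from the HMM notes can be sketched in a few lines. The two-state model below uses toy transition/emission probabilities of my own choosing (not from the notes); with observations that track the states, the decoded ML state path simply follows the observations:

```python
import numpy as np

# Toy 2-state, 2-symbol HMM (illustrative parameters)
A = np.array([[0.7, 0.3], [0.3, 0.7]])   # transition probabilities
B = np.array([[0.9, 0.1], [0.2, 0.8]])   # emission probabilities
pi = np.array([0.6, 0.4])                # initial distribution
obs = [0, 0, 1, 1, 0]

logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
T, S = len(obs), 2
delta = np.zeros((T, S))                 # best log-prob of any path ending in each state
psi = np.zeros((T, S), dtype=int)        # argmax backpointers
delta[0] = logpi + logB[:, obs[0]]
for t in range(1, T):
    scores = delta[t - 1][:, None] + logA     # score of each (prev, current) pair
    psi[t] = scores.argmax(axis=0)
    delta[t] = scores.max(axis=0) + logB[:, obs[t]]

# Backtrack the most likely state sequence
path = [int(delta[-1].argmax())]
for t in range(T - 1, 0, -1):
    path.append(int(psi[t][path[-1]]))
path.reverse()   # here: [0, 0, 1, 1, 0], mirroring the observations
```

Working in log-probabilities avoids underflow for long sequences; the forward-backward algorithm has the same structure with sums in place of maxima.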
**Detection Theory:**
- H5, H5b, **H5c** (Prof. ALD's notes)
- Generalized LRT: asymptotic distribution
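A small simulation of the most basic detection setup in the handouts: deciding between H0: y = n and H1: y = s + n with known signal s and white Gaussian noise, where the LRT reduces to thresholding the matched-filter output s^T y. The signal, dimensions, and threshold choice below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(8)
N, trials = 20, 5000
s = 3.0 * np.ones(N) / np.sqrt(N)   # known signal with energy ||s||^2 = 9

y0 = rng.standard_normal((trials, N))        # data under H0 (noise only)
y1 = s + rng.standard_normal((trials, N))    # data under H1 (signal + noise)

# LRT statistic reduces to the correlation s^T y (matched filter)
t0, t1 = y0 @ s, y1 @ s
tau = (s @ s) / 2            # ML / equal-priors threshold: halfway between means
pd = np.mean(t1 > tau)       # empirical detection probability
pfa = np.mean(t0 > tau)      # empirical false-alarm probability
```

With these numbers the test statistic is N(0, 9) under H0 and N(9, 9) under H1, so both error probabilities equal Q(1.5) ≈ 0.067; the empirical `pd` ≈ 0.93 and `pfa` ≈ 0.07 match. In the Neyman-Pearson formulation, `tau` would instead be set from the target false-alarm rate.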
**Monte Carlo**

**Simple MC and Importance Sampling (IS)**
- Simple MC, Importance Sampling (IS), biased IS & Bayesian IS **(Prof. ALD's handout)**; **Geweke's paper (1989)**
  - Concept; conditions for getting consistent estimates
  - Bias-variance tradeoff; finding good importance densities (to reduce estimator variance); examples
  - Resampling (sampling-importance resampling), Rao-Blackwellization
- Techniques for generating random samples from a PDF
  - Accept-Reject
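A minimal importance-sampling sketch (the target, proposal, and test function are my own toy choices): estimate E[X^2] = 1 under the target N(0,1) using samples drawn from the wider proposal N(0, 2^2), reweighting each sample by the density ratio p/q:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

# Target p = N(0,1); proposal q = N(0, 4). Goal: E_p[X^2] = 1.
xs = rng.normal(0, 2, n)
# Density ratio p(x)/q(x) = 2 * exp(-x^2/2 + x^2/8) (the sqrt(2*pi) factors cancel)
w = np.exp(-xs ** 2 / 2) / (0.5 * np.exp(-xs ** 2 / 8))
est = np.mean(w * xs ** 2)   # IS estimate of E_p[X^2], close to 1
```

Consistency requires q > 0 wherever p·h is nonzero; a heavier-tailed proposal like this one keeps the weights bounded, which is exactly the "good importance density" issue the handout discusses.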
**Markov Chain Monte Carlo (MCMC)**

**Markov chain concepts and Markov Chain Monte Carlo (Prof. ALD's handout)**
- Markov chain concepts (for proofs and full details, take EE 523)
- General conditions for MCMC to work (slide 24)
- Metropolis-Hastings and proof of why it works
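A bare-bones random-walk Metropolis sketch (target and step size are toy choices, not from the handout): with a symmetric Gaussian proposal, the Hastings ratio reduces to the ratio of target densities, so only an unnormalized log-density is needed:

```python
import numpy as np

rng = np.random.default_rng(6)

def logp(x):
    # Unnormalized log target: standard normal
    return -0.5 * x * x

x, chain = 0.0, []
for _ in range(100_000):
    prop = x + rng.normal(0, 1.0)                    # symmetric random-walk proposal
    if np.log(rng.uniform()) < logp(prop) - logp(x):  # accept with prob min(1, p(prop)/p(x))
        x = prop
    chain.append(x)                                   # rejected moves repeat the old state

samples = np.array(chain[10_000:])   # discard burn-in
```

The retained samples have mean ≈ 0 and variance ≈ 1, matching the target; the proof of why this works (detailed balance of the Metropolis-Hastings kernel) is in the handout.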
**Particle filtering**
- Particle filtering: **Doucet et al.'s paper (2000)**
  - HMM model and other algorithms
  - Importance sampling to approximate a PDF (sum of weighted Diracs)
  - Sequential Importance Sampling (SIS)
  - Resampling concept
  - Particle filtering algorithm: SIS + resample
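The SIS + resample recipe above can be sketched as a bootstrap particle filter on a scalar linear-Gaussian model (model and parameters are illustrative; for this model the Kalman filter is exact, which makes it a convenient sanity check):

```python
import numpy as np

rng = np.random.default_rng(7)
a, q, r, T, Np = 0.9, 0.5, 0.5, 100, 1000   # model params, horizon, particle count

# Simulate x_t = a x_{t-1} + w_t,  y_t = x_t + v_t
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + np.sqrt(q) * rng.standard_normal()
y = x + np.sqrt(r) * rng.standard_normal(T)

p = rng.standard_normal(Np)   # initial particle cloud
est = np.zeros(T)
for t in range(T):
    # SIS step with the prior as importance density: propagate through the dynamics
    p = a * p + np.sqrt(q) * rng.standard_normal(Np)
    # Weight by the observation likelihood, then normalize
    w = np.exp(-0.5 * (y[t] - p) ** 2 / r)
    w /= w.sum()
    est[t] = np.sum(w * p)                     # weighted-Dirac (MMSE) estimate
    p = p[rng.choice(Np, size=Np, p=w)]        # multinomial resampling
```

Resampling every step combats weight degeneracy; the particle estimate tracks the true state with lower MSE than the raw observations, close to the Kalman-optimal error here.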
**Probability and Linear Algebra Recap**

**Undergrad probability review (EE 322)**
- Chapter 1 slides
- **Single Random Variable: Discrete & Continuous**
- **Multiple Random Variables: Discrete**, Multiple Random Variables: Continuous

**Linear algebra review:**
- http://www.maths.mq.edu.au/~wchen/lnlafolder/lnla.html (Chapters 5-10; Chapters 11 and 12 may also be useful)