**EE 527: Detection and Estimation Theory
(Spring 2014)**

**Updates/Reminders**

- Weeks 1 and 2: recap of / new background needed in probability and linear algebra
- Week 3: finish the linear algebra discussion, start h1.pdf (MVUE estimation, sufficient statistics, Factorization Theorem), and complete sufficient statistics and the RB-LS (Rao-Blackwell / Lehmann-Scheffe) theorem
- Homeworks 5 and 6 posted; hw 5 due on March 13

**Prerequisites:** EE 224, EE 322, basic calculus & linear algebra. Suggested class to also take: EE 523

**Location, Time:** Marston 204, Tues-Thurs 2:10-3:30pm

**Instructor:** Prof. Namrata Vaswani

**Office Hours:** Wednesday and Thursday 10-11am

**Office:** 3121 Coover Hall

**Email:** namrata AT iastate DOT edu

**Phone:** 515-294-4012

**Grading policy**

- Homeworks: 10%
- Two midterms and one final exam: 20%, 20%, 30%
- One project / term paper: 20%

**Exam Dates and Project Details and Deadlines**

- Midterm 1: Thursday in the week of Feb 10
- Midterm 2: April 1 (tentative; Thursday after Spring break)
**Project details:** TBD

**Syllabus:**

- Background material: recap of probability, calculus, linear algebra
- Estimation Theory
  - Minimum variance unbiased estimation, best linear unbiased estimation
  - Cramer-Rao lower bound (CRLB)
  - Maximum likelihood estimation (MLE): exact and approximate methods (EM, alternating maximization, etc.)
  - Bayesian inference & least squares estimation (from Kailath et al.'s Linear Estimation book): basic ideas, adaptive techniques, recursive LS, etc.
  - Kalman filtering (sequential Bayes)
  - Finite-state hidden Markov models: forward-backward algorithm, Viterbi (ML state estimation), parameter estimation (forward-backward + EM)
  - Graphical models
  - **Applications**: image processing, speech, communications (to be discussed with each topic)
- Sparse Recovery and Compressive Sensing introduction
- Monte Carlo methods: importance sampling, MCMC, particle filtering; applications in numerical integration (MMSE estimation, error probability computation) and in numerical optimization (e.g. annealing)
- Detection Theory
  - Likelihood ratio testing, Bayes detectors
  - Minimax detectors
  - Multiple hypothesis tests
  - Neyman-Pearson detectors (matched filter, estimator-correlator, etc.)
  - Wald sequential test
  - Generalized likelihood ratio tests (GLRTs), Wald and Rao scoring tests
  - Applications
- The syllabus is similar to Prof. Dogandzic's EE 527, but I will cover least squares estimation, Kalman filtering, and Monte Carlo methods in more detail, and will also discuss some image/video processing applications. Note that LSE and the KF are also covered in EE 524, but different perspectives are always useful.
**Books:**

**Textbook:** S.M. Kay, *Fundamentals of Statistical Signal Processing*: Estimation Theory (Vol 1), Detection Theory (Vol 2)

**References**

- Kailath, Sayed and Hassibi, *Linear Estimation*
- V. Poor, *An Introduction to Signal Detection and Estimation*
- H. Van Trees, *Detection, Estimation, and Modulation Theory*
- J.S. Liu, *Monte Carlo Strategies in Scientific Computing*. Springer-Verlag, 2001.
- B.D. Ripley, *Stochastic Simulation*. Wiley, 1987.

**Disability accommodation:** If you have a documented disability and anticipate needing accommodations in

**Homeworks**

**Homework 7: Due April 29, Tuesday**

- Kalman filter problems
- Some of these are the same as the exam questions, but it is good to do them again (especially if you made a mistake on the exam).

**Homework 6b: Due Tuesday after Spring break**

- Recursive least squares (LS)
  - Derive the recursive LS estimator.
  - Implement it in Matlab (for large n, m) and compare with the batch estimator to convince yourself that it is faster. Submit a short write-up on what you noticed and why.
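For the recursive-vs-batch comparison, a minimal sketch of the standard RLS recursion in Python/NumPy (the assignment asks for Matlab; the model, dimensions, and noise level below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: estimate x in y = H x + noise, rows arriving one at a time.
m, n = 500, 5
x_true = rng.standard_normal(n)
H = rng.standard_normal((m, n))
y = H @ x_true + 0.1 * rng.standard_normal(m)

# Batch LS: solve the full least-squares problem once using all m rows.
x_batch, *_ = np.linalg.lstsq(H, y, rcond=None)

# Recursive LS: process one row h_k (and scalar y_k) at a time.
# P tracks the inverse of H_k^T H_k via the matrix inversion lemma,
# so no matrix is ever re-inverted.
P = 1e6 * np.eye(n)      # large initial P ~ diffuse prior
x_rls = np.zeros(n)
for k in range(m):
    h = H[k]
    K = P @ h / (1.0 + h @ P @ h)        # gain vector
    x_rls = x_rls + K * (y[k] - h @ x_rls)
    P = P - np.outer(K, h @ P)           # rank-one downdate of P

# The two estimates agree up to the effect of the finite initial P.
print(np.allclose(x_rls, x_batch, atol=1e-3))
```

Each recursive step costs O(n^2) instead of re-solving an O(m n^2) batch problem, which is the speedup the write-up should observe.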
**Homework 6: Due Tuesday after Spring break**

- Problems 7.6, 7.7, 7.14, 7.18, 7.19
- Problems 8.24, 8.26, 8.27 (skip the Newton-Raphson part)
- Practice problems (I suggest doing at least two): 8.4, 8.12, 8.28, 8.29
**Homework 5: Due March 13, Thursday**

- Problems 4.2, 4.5, 4.6, 4.10, 4.13, 4.14
- Problems 6.1, 6.2, 6.5, 6.7, 6.9, 6.16
- Extra credit: 6.8, 6.14, 6.10
- Problem 4.6: What is the missed-detection probability, assuming P_hat is Gaussian distributed with the computed mean and variance and you use a detection threshold of E[P_hat]/2?

**Homework 4: due Friday Feb 21 by 5pm**

- Problems 3.1, 3.3, 3.9, 3.11
- Do the following sets of problems: practice set (will be graded for completion; ignore the deadlines written on it)
**Homework 3: due Feb 18, Tues**

- Problems 5.2, 5.3, 5.4, 5.5, 5.7, 5.13, 5.16
- Compute the MVUE for a N(\mu, \sigma^2) distribution using N i.i.d. observations. Also compute the covariance matrix of the MVUE estimator.

**Homework 2: due Thurs, Feb 6**

- Problems 2.1, 2.4, 2.7, 2.9, 2.10 of Kay-I. Bonus: 2.8

**Homework 1: due Thurs, January 23**

- Chapter 3 of the Supplementary Problems for Bertsekas's probability text: Problems 5, 6, 8, 9, 10, 14, 18, 19, 20, 21
- **Correction:** Suppose X1 is N(0,1), X2 is 1 w.p. 1/2 and -1 w.p. 1/2, and X3 = X1 * X2. Compute the pdf of X3 and the joint pdf of X1 and X3.
- Practice problems: use this link to do selected practice problems from Chapters 1 and 2: EE 322 Fall 2007 homework sets (these do not need to be submitted)
**Handouts**

**Introduction slides**

**Review**

- Basic probability review (EE 322 recap)
- Practice problems: http://athenasc.com/prob-supp.html (look at the problems for Chapters 1, 2, 3)
- EE 322 course
- **Probability Review** (Prof. ALD's notes)
- Probability Recap 3
- Linear algebra review
**Classical Estimation**

**Recap notes for all topics: fill in the gaps in the notes below**

**Minimum (classical) MSE estimation, MVUE, sufficient statistics, MLE**

- H1: Minimum Variance Unbiased Estimation (Prof. ALD's notes); Chapters 2, 3, 5 of Kay-I
- H2: Cramer-Rao Bound and Efficient Estimators (Prof. ALD's notes)
- H3: Linear Models, Best Linear Unbiased Estimator and ML Estimation (Prof. ALD's notes)

**EM algorithm**

- Dempster paper
- See the scanned notes in WebCT
- Prof. ALD's handout
**Least Squares estimation**

**LS estimation summary**

- Also see Chapter 8 of Kay-I

**Sparse Recovery / Compressive Sensing**

**Introduction**

**Old slides: Introduction to Compressive Sensing**

- Compressive Sensing class (EE 520)
- CS papers' archive
**Bayesian estimation**

**MMSE and linear MMSE estimation and Kalman filtering**

- Jointly Gaussian random variables, MMSE and linear MMSE estimation: based on Poor's book
- Kalman filtering details and proofs: based on Poor's book
- **Kalman Filter algorithm summary (an old handout)**: based on Poor's book
- See the Spring 2008 midterm 2 solutions (in WebCT) for
  - a second formula for the Kalman gain
  - the KF with a control input
  - the extended KF
  - the KF for the case when the system noise and observation noise at the same time instant are correlated
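The predict/update cycle summarized in the handouts can be illustrated on a toy scalar random walk (the model and all numbers below are invented for illustration, not taken from the notes):

```python
import numpy as np

# Toy model:
#   x_k = x_{k-1} + w_k,  w_k ~ N(0, q)   (random-walk state)
#   y_k = x_k + v_k,      v_k ~ N(0, r)   (noisy observation)
rng = np.random.default_rng(1)
q, r, T = 0.01, 1.0, 200

x = np.cumsum(np.sqrt(q) * rng.standard_normal(T))   # true state path
y = x + np.sqrt(r) * rng.standard_normal(T)          # observations

xhat, P = 0.0, 10.0       # prior mean and variance
est = np.empty(T)
for k in range(T):
    # Predict: propagate mean and variance through the state equation.
    xhat_pred = xhat
    P_pred = P + q
    # Update: the Kalman gain K weighs the prediction against the measurement.
    K = P_pred / (P_pred + r)
    xhat = xhat_pred + K * (y[k] - xhat_pred)
    P = (1.0 - K) * P_pred
    est[k] = xhat

# The filtered estimates should track the state better than raw observations.
print(np.mean((est - x) ** 2), np.mean((y - x) ** 2))
```

With these numbers the steady-state error variance is roughly sqrt(q*r) = 0.1, an order of magnitude below the raw measurement variance r = 1.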
**Some extra things**

**Bayesian Inference using Prof. ALD's notes**: some parts of this are useful

**Graphical models**

- **Graphical models** (Prof. ALD's notes): an approach for handling conditional dependencies in Bayesian estimation (Prof. ALD's handout)

**Hidden Markov Models (HMM)**

- HMM notes: mostly based on Rabiner's tutorial paper **(new notes)**
- Rabiner's tutorial paper
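As a quick illustration of the Viterbi recursion covered in the notes, a minimal sketch on a made-up two-state HMM (all transition and emission probabilities below are invented toy numbers, not from Rabiner's paper):

```python
import numpy as np

# States: 0 = fair coin, 1 = biased coin; observations: 0 = tails, 1 = heads.
A = np.array([[0.9, 0.1],       # A[i, j] = P(next state j | current state i)
              [0.2, 0.8]])
B = np.array([[0.5, 0.5],       # B[i, o] = P(observation o | state i)
              [0.1, 0.9]])
pi = np.array([0.5, 0.5])       # initial state distribution
obs = [1, 1, 1, 1, 0, 0, 0]

# Work in the log domain to avoid underflow on long sequences.
logA, logB, logpi = np.log(A), np.log(B), np.log(pi)

T, S = len(obs), 2
delta = np.zeros((T, S))               # best log-prob of paths ending in each state
psi = np.zeros((T, S), dtype=int)      # backpointers
delta[0] = logpi + logB[:, obs[0]]
for t in range(1, T):
    for j in range(S):
        scores = delta[t - 1] + logA[:, j]
        psi[t, j] = np.argmax(scores)
        delta[t, j] = scores[psi[t, j]] + logB[j, obs[t]]

# Backtrack to recover the ML state sequence.
path = [int(np.argmax(delta[-1]))]
for t in range(T - 1, 0, -1):
    path.append(int(psi[t, path[-1]]))
path.reverse()
print(path)   # the heads run maps to the biased state, the tails run to fair
```

The forward-backward algorithm replaces the max in the recursion with a sum to get posterior state probabilities instead of the single best path.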
**Detection Theory:**

- H5, H5b, **H5c** (Prof. ALD's notes)
- Generalized LRT: asymptotic distribution
**Monte Carlo**

**Simple MC and Importance Sampling (IS)**

- Simple MC, Importance Sampling (IS), Biased IS & Bayesian IS **(Prof. ALD's handout)**
- **Geweke's paper (1989)**
- Concept and conditions for getting consistent estimates
- Bias-variance tradeoff; finding good importance densities (to reduce estimator variance); examples
- Resampling (sampling-importance resampling), Rao-Blackwellization
- Techniques for generating random samples from a pdf
  - Accept-reject
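To make the simple-MC-vs-IS comparison concrete, a small sketch estimating a Gaussian tail probability, a standard rare-event example of the error-probability computations mentioned in the syllabus (the proposal N(4,1) is my choice here, not from the handout):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)
N = 100_000

# Target: p = P(X > 4) for X ~ N(0, 1), a rare event (~3.2e-5), so plain
# Monte Carlo with N samples sees almost no hits.
p_true = 0.5 * (1 - erf(4 / sqrt(2)))

# Simple MC: average of the indicator under the nominal density.
x = rng.standard_normal(N)
p_mc = np.mean(x > 4)

# Importance sampling: draw from q = N(4, 1), which puts mass on the event,
# and reweight by the likelihood ratio w(x) = p(x) / q(x).
xq = 4.0 + rng.standard_normal(N)
w = np.exp(-0.5 * xq**2 + 0.5 * (xq - 4.0) ** 2)   # N(0,1) pdf / N(4,1) pdf
p_is = np.mean((xq > 4) * w)

print(p_true, p_mc, p_is)
```

With the same N, the IS estimate has a relative error of under a percent, while the simple MC estimate is built from only a handful of hits.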
**Markov Chain Monte Carlo (MCMC)**

**Markov chain concepts and Markov Chain Monte Carlo (Prof. ALD's handout)**

- Markov chain concepts (for proofs and full details, take EE 523)
- General conditions for MCMC to work (slide 24)
- Metropolis-Hastings and a proof of why it works
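A minimal random-walk Metropolis sketch, sampling a target known only up to a normalizing constant (the target density and step size are toy choices of mine, not from the handout):

```python
import numpy as np

rng = np.random.default_rng(3)

# Unnormalized log-target: here an (unnormalized) N(3, 2^2) density.
def log_target(x):
    return -0.5 * ((x - 3.0) / 2.0) ** 2

n_iter, step = 50_000, 2.5
x = 0.0
samples = np.empty(n_iter)
for t in range(n_iter):
    prop = x + step * rng.standard_normal()     # symmetric random-walk proposal
    # Accept with probability min(1, target(prop)/target(x)); the symmetric
    # proposal density cancels, so this is plain Metropolis.
    if np.log(rng.random()) < log_target(prop) - log_target(x):
        x = prop
    samples[t] = x

burned = samples[5000:]      # discard burn-in
print(burned.mean(), burned.std())   # should be near 3 and 2
```

Only ratios of the target appear in the acceptance test, which is why MCMC works when the normalizing constant is unknown, e.g. for Bayesian posteriors.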
**Particle filtering**

- Particle filtering

**Doucet et al.'s paper (2000)**

- HMM model and other algorithms
- Importance sampling to approximate a pdf (as a sum of weighted Diracs)
- Sequential Importance Sampling (SIS)
- Resampling concept
- Particle filtering algorithm: SIS + resampling
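The SIS + resampling algorithm can be sketched as a bootstrap-style filter (using the state-transition prior as the importance density) on a toy scalar model; the model and all parameters below are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy model:
#   x_k = 0.9 x_{k-1} + w_k,  w_k ~ N(0, 1)
#   y_k = x_k + v_k,          v_k ~ N(0, 1)
T, Np = 300, 1000
x = np.zeros(T)
for k in range(1, T):
    x[k] = 0.9 * x[k - 1] + rng.standard_normal()
y = x + rng.standard_normal(T)

particles = rng.standard_normal(Np)   # samples from a prior on x_0
est = np.empty(T)
for k in range(T):
    if k > 0:
        # SIS step: propagate each particle through the state equation.
        particles = 0.9 * particles + rng.standard_normal(Np)
    # Weight each particle by the likelihood of the new observation.
    w = np.exp(-0.5 * (y[k] - particles) ** 2)
    w /= w.sum()
    est[k] = np.sum(w * particles)    # MMSE estimate: weighted posterior mean
    # Resample to fight weight degeneracy (done every step for simplicity).
    particles = rng.choice(particles, size=Np, p=w)

# The filtered estimate should beat the raw observations in MSE.
print(np.mean((est - x) ** 2), np.mean((y - x) ** 2))
```

For this linear-Gaussian toy model the Kalman filter is exact and a particle filter is overkill; the point of particle filtering is that the same code structure handles nonlinear, non-Gaussian models where no closed-form filter exists.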
**Probability and Linear Algebra Recap**

**Undergrad probability review (EE 322)**

- Chapter 1 slides
- Single Random Variable: Discrete & Continuous
- Multiple Random Variables: Discrete
- Multiple Random Variables: Continuous

**Linear algebra review:**

- http://www.maths.mq.edu.au/~wchen/lnlafolder/lnla.html (Chapters 5-10; Chapters 11 and 12 may also be useful)