ECE 513 - Vector Space Signal Processing
Spring 2022
| Title | Rubric | Section | CRN | Type | Hours | Times | Days | Location | Instructor |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Vector Space Signal Processing | ECE513 | D | 44849 | OLC | 4 | 0930 - 1050 | M W | | Zhizhen Jane Zhao |
See the full schedule in Course Explorer.
Official Description
Mathematical tools in a vector space framework, including: finite and infinite dimensional vector spaces, Hilbert spaces, orthogonal projections, subspace techniques, least-squares methods, matrix decomposition, conditioning and regularizations, bases and frames, the Hilbert space of random variables, random processes, iterative methods; applications in signal processing, including inverse problems, filter design, sampling, interpolation, sensor array processing, and signal and spectral estimation. Course Information: Prerequisite: ECE 310, ECE 313, and MATH 415.
Subject Area
- Signal Processing
Description
Fundamentals of linear least squares estimation of discrete-time signals and their spectra: minimum-norm least squares and total least squares solutions; singular value decomposition; Wiener and Kalman filtering; autoregressive spectral analysis; and the maximum entropy method.
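To make the least-squares thread concrete, here is a minimal, illustrative Python/NumPy sketch (not course-supplied code) of the minimum-norm least squares solution of an underdetermined toy system, computed from the singular value decomposition and cross-checked against the Moore-Penrose pseudoinverse; the matrix and signal are arbitrary made-up data.

```python
import numpy as np

# Toy underdetermined system: 3 measurements of a length-6 signal (made-up data).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 6))   # wide measurement matrix, full row rank almost surely
x_true = rng.standard_normal(6)   # unknown signal
b = A @ x_true                    # noiseless measurements

# Minimum-norm least squares solution via the SVD: x = V diag(1/s) U^T b.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_mn = Vt.T @ ((U.T @ b) / s)

# The same solution from the Moore-Penrose pseudoinverse.
assert np.allclose(x_mn, np.linalg.pinv(A) @ b)

# x_mn reproduces the measurements exactly and is the shortest such solution:
# it is the orthogonal projection of any solution onto the row space of A.
print(np.allclose(A @ x_mn, b), np.linalg.norm(x_mn) <= np.linalg.norm(x_true))
```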
Topics
- Matrix inversion: orthogonal projections; left and right inverses; minimum-norm least squares solutions; Moore-Penrose pseudoinverse; regularization; singular value decomposition; Eckart-Young theorem; total least squares; principal components analysis
- Projections in Hilbert space: Hilbert space; projection theorem; normal equations; approximation and Fourier series; pseudoinverse operators; application to extrapolation of bandlimited sequences
- Hilbert space of random variables: spectral representation of discrete-time stochastic processes; spectral factorization; linear minimum-variance estimation; discrete-time Wiener filter; innovations representation; Wold decomposition; Gauss-Markov theorem; sequential least squares; discrete-time Kalman filter
- Power spectrum estimation: system identification; Prony's linear prediction method; Fourier and other nonparametric methods of spectrum estimation; resolution limits and model-based methods; autoregressive models and the maximum entropy method; Levinson's algorithm; lattice filters; harmonic retrieval by Pisarenko's method; direction finding with passive multi-sensor arrays (an illustrative sketch of Levinson's recursion follows this list)
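As a hedged illustration of the last topic group (not part of the course materials), the sketch below implements the Levinson-Durbin recursion for the Yule-Walker (Toeplitz) normal equations of an autoregressive model and cross-checks it against a direct solve; the autocorrelation values are made up for the example.

```python
import numpy as np

def levinson_durbin(r, order):
    """Levinson-Durbin recursion for the Toeplitz (Yule-Walker) normal equations.

    r: autocorrelation lags r[0], ..., r[order] (real-valued toy data here).
    Returns (a, err): AR polynomial a = [1, a1, ..., ap] in
    x[n] + a1*x[n-1] + ... + ap*x[n-p] = e[n], and the prediction-error power err.
    """
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for m in range(1, order + 1):
        acc = r[m] + np.dot(a[1:m], r[m - 1:0:-1])
        k = -acc / err                      # reflection (PARCOR) coefficient
        a_prev = a.copy()
        for i in range(1, m):
            a[i] = a_prev[i] + k * a_prev[m - i]
        a[m] = k
        err *= 1.0 - k * k                  # order update of the error power
    return a, err

# Cross-check on a toy positive-definite autocorrelation sequence.
r = np.array([3.0, 1.5, 0.6, 0.2])
a, err = levinson_durbin(r, order=3)
R = np.array([[r[abs(i - j)] for j in range(3)] for i in range(3)])  # 3x3 Toeplitz matrix
print(np.allclose(a[1:], np.linalg.solve(R, -r[1:])))                # same AR coefficients
```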
Texts
Class notes.
Recommended:
B. Porat, Digital Processing of Random Signals, Prentice-Hall, 1994.
Last updated: 2/13/2013