ECE 515 - Control System Theory & Design
Spring 2024
Title | Rubric | Section | CRN | Type | Hours | Times | Days | Location | Instructor
---|---|---|---|---|---|---|---|---|---
Control System Theory & Design | ECE515 | N | 33983 | DIS | 4 | 1100 - 1220 | T R | 2200 Sidney Lu Mech Engr Bldg | Ivan Thomas Abraham
Control System Theory & Design | ECE515 | ONL | 40663 | OD | 4 | 1100 - 1220 | M W | | Ivan Thomas Abraham
Control System Theory & Design | ME540 | N | 52780 | DIS | 4 | 1100 - 1220 | T R | 2200 Sidney Lu Mech Engr Bldg | Ivan Thomas Abraham
Control System Theory & Design | ME540 | ONL | 73455 | OD | 4 | 1100 - 1220 | M W | | Ivan Thomas Abraham
See the full schedule in Course Explorer
Official Description
Feedback control systems emphasizing state space techniques. Basic principles, modeling, analysis, stability, structural properties, optimization, and design to meet specifications. Course Information: Same as ME 540. Prerequisite: ECE 486.
Subject Area
- Control Systems
Description
A fundamental course on feedback control systems: basic principles, modeling, optimization, and design to meet specifications.
Topics
- System modeling and analysis: system design as a control problem - constraints, goals, and performance specifications; input-output and state-space models; linearization; review of linear algebra; fundamentals of state-space analysis of linear systems (an illustrative sketch follows this list)
- System structural properties: stability; introduction to Lyapunov methods; controllability and observability; canonical forms and minimal realizations; modeling uncertainties; system sensitivity and robustness measures
- Feedback system design: basic properties of feedback; stabilization and eigenvalue placement by state and output feedback; disturbance rejection; observers for estimating states, and observer feedback systems
- Optimum feedback control: dynamic programming and the Hamilton-Jacobi-Bellman equation; synthesis of optimum state regulator systems; numerical methods
- Introduction to the minimum principle: calculus of variations and necessary conditions for optimal trajectories; minimum principle for bounded controls; time-optimal control of linear systems; numerical methods
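As a small illustration of the first topic areas above (state-space models, controllability, eigenvalue placement by state feedback, and the optimum state regulator), here is a minimal Python/NumPy/SciPy sketch. The double-integrator plant, the desired pole locations, and the LQR weights Q and R are illustrative assumptions, not material taken from the course; the sketch only shows the kind of computation these topics cover.

```python
# Illustrative sketch (not course material): a state-space model,
# a controllability check, eigenvalue placement, and an LQR gain
# for an assumed double-integrator plant.
import numpy as np
from scipy.signal import place_poles
from scipy.linalg import solve_continuous_are

# State-space model: x_dot = A x + B u, y = C x
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])

# Controllability matrix [B, AB] and its rank
ctrb = np.hstack([B, A @ B])
print("controllable:", np.linalg.matrix_rank(ctrb) == A.shape[0])

# Eigenvalue placement by state feedback u = -K x
desired_poles = [-2.0, -3.0]            # illustrative choice
K_place = place_poles(A, B, desired_poles).gain_matrix
print("placed eigenvalues:", np.linalg.eigvals(A - B @ K_place))

# Optimum state regulator (LQR): solve the algebraic Riccati equation
Q = np.eye(2)                           # illustrative state weights
R = np.array([[1.0]])                   # illustrative control weight
P = solve_continuous_are(A, B, Q, R)
K_lqr = np.linalg.solve(R, B.T @ P)     # K = R^{-1} B^T P
print("LQR gain:", K_lqr)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K_lqr))
```

Running the sketch prints the controllability verdict, the placed closed-loop eigenvalues, and the LQR gain; in the course itself these computations are developed from first principles rather than delegated to library routines.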
Last updated: 2/13/2013