Course Outline

I. Modeling and Analysis of Control Systems

1. Introduction and classification

2. State space models in both discrete and continuous time

3. Linear and nonlinear systems

4. Linearization

5. Transfer function description of linear systems; relationship with state space models (summarized in the equations after this list)

6. Minimal realizations; controllable and observable forms

7. Vector spaces and linear transformations

8. Review of linear algebra; the Cayley-Hamilton theorem

9. State transition matrix and solutions of linear state equations
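
For reference, items 2, 5, and 9 come together in the standard continuous-time LTI state space model, its transfer function, and the solution of the state equation, where the matrix exponential e^{At} is the state transition matrix:

    \dot{x}(t) = A x(t) + B u(t), \qquad y(t) = C x(t) + D u(t)

    G(s) = C (sI - A)^{-1} B + D

    x(t) = e^{At} x(0) + \int_0^t e^{A(t-\tau)} B u(\tau) \, d\tau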

II. Structural Properties of Control Systems

1. Stability (Lyapunov and input-output)

2. Stability tests for linear systems; stability subspaces

3. Stability tests for nonlinear systems

4. Controllability; controllable subspaces

5. Observability; unobservable subspaces (a numerical rank test for this and the previous item is sketched after this list)
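
A minimal numerical sketch of the rank tests for items 4 and 5, assuming Python with NumPy: the pair (A, B) is controllable if and only if the controllability matrix [B, AB, ..., A^{n-1} B] has full rank n, and observability of (A, C) reduces by duality to controllability of (A^T, C^T). The double-integrator matrices are illustrative values, not course data.

    import numpy as np

    def ctrb(A, B):
        # Controllability matrix [B, AB, ..., A^{n-1} B]
        n = A.shape[0]
        blocks = [B]
        for _ in range(n - 1):
            blocks.append(A @ blocks[-1])
        return np.hstack(blocks)

    # Double integrator -- illustrative example system
    A = np.array([[0.0, 1.0],
                  [0.0, 0.0]])
    B = np.array([[0.0],
                  [1.0]])
    C = np.array([[1.0, 0.0]])

    n = A.shape[0]
    print(np.linalg.matrix_rank(ctrb(A, B)) == n)      # True: (A, B) controllable
    print(np.linalg.matrix_rank(ctrb(A.T, C.T)) == n)  # True: (A, C) observable, by duality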

III. Feedback Controller Design

1. Role of feedback in controller design

2. Stabilization and eigenvalue placement by state and output feedback (see the pole-placement sketch after this list)

3. Full-order and reduced-order observers

4. Tracking, disturbance rejection, decoupling

5. Sensitivity analysis; role of feedback in sensitivity reduction; robustness
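
A minimal sketch of eigenvalue placement by state feedback (item 2), using SciPy's place_poles; the double-integrator matrices and the closed-loop eigenvalues -2 ± j are illustrative choices, not course data.

    import numpy as np
    from scipy.signal import place_poles

    # Double integrator -- illustrative example system
    A = np.array([[0.0, 1.0],
                  [0.0, 0.0]])
    B = np.array([[0.0],
                  [1.0]])

    # Compute K so that the eigenvalues of A - B K sit at -2 +/- j
    # (an arbitrary illustrative choice of pole locations)
    K = place_poles(A, B, [-2 + 1j, -2 - 1j]).gain_matrix
    print(np.linalg.eigvals(A - B @ K))  # approximately -2 +/- j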

IV. Optimal Feedback Control

1. Linear-quadratic (LQ) optimal control problem and design of optimal regulators

2. The matrix Riccati differential equation and some of its properties

3. The infinite-horizon case: time-invariant optimal controllers and the algebraic Riccati equation (see the LQR sketch after this list)

4. Dynamic programming for both discrete-time and continuous-time systems; the Hamilton-Jacobi-Bellman (HJB) equation; relationship between open-loop and closed-loop controllers
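
A minimal sketch of the infinite-horizon LQ regulator (items 1 and 3): solve the algebraic Riccati equation for P, then form the optimal state-feedback gain K = R^{-1} B^T P. The double-integrator matrices and identity weights are illustrative choices, not course data.

    import numpy as np
    from scipy.linalg import solve_continuous_are

    # Double integrator with simple quadratic weights -- illustrative choices
    A = np.array([[0.0, 1.0],
                  [0.0, 0.0]])
    B = np.array([[0.0],
                  [1.0]])
    Q = np.eye(2)          # state weighting
    R = np.array([[1.0]])  # input weighting

    # Solve the algebraic Riccati equation A'P + PA - P B R^{-1} B' P + Q = 0
    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)      # optimal gain K = R^{-1} B' P
    print(np.linalg.eigvals(A - B @ K))  # closed-loop eigenvalues, in the open left half-plane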