
Optimal control theory is the science of maximizing the returns from and minimizing the costs of the operation of physical, social, and economic processes. Geared toward upper-level undergraduates, this text introduces three aspects of optimal control theory: dynamic programming, Pontryagin's minimum principle, and numerical techniques for trajectory optimization.
Chapters 1 and 2 focus on describing systems and evaluating their performance. Chapter 3 deals with dynamic programming. The calculus of variations and Pontryagin's minimum principle are the subjects of chapters 4 and 5, and chapter 6 examines iterative numerical techniques for finding optimal controls and trajectories. Numerous problems, intended to introduce additional topics as well as to illustrate basic concepts, appear throughout the text.
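To give a flavor of the dynamic programming approach the book covers, here is a minimal sketch (not taken from the text) of backward dynamic programming for a scalar discrete-time linear-quadratic problem; the system and cost parameters are illustrative assumptions:

```python
# Minimal sketch: finite-horizon dynamic programming (Riccati recursion)
# for the scalar linear-quadratic problem
#   x[k+1] = a*x[k] + b*u[k],
#   cost   = sum_k (q*x[k]**2 + r*u[k]**2) + q*x[N]**2.
# All names and numbers here are illustrative, not from the book.

def lqr_gains(a, b, q, r, N):
    """Backward pass: cost-to-go weights P[k] and feedback gains K[k]."""
    P = [0.0] * (N + 1)
    K = [0.0] * N
    P[N] = q  # terminal cost weight
    for k in range(N - 1, -1, -1):
        # Minimize q*x^2 + r*u^2 + P[k+1]*(a*x + b*u)^2 over u.
        K[k] = (b * P[k + 1] * a) / (r + b * P[k + 1] * b)
        P[k] = q + a * P[k + 1] * (a - b * K[k])
    return P, K

def simulate(a, b, K, x0):
    """Forward pass: apply the optimal feedback u[k] = -K[k]*x[k]."""
    x, traj = x0, [x0]
    for Kk in K:
        x = a * x + b * (-Kk * x)
        traj.append(x)
    return traj
```

For example, `P, K = lqr_gains(1.0, 1.0, 1.0, 1.0, 10)` followed by `simulate(1.0, 1.0, K, 5.0)` drives the state close to zero over the horizon, illustrating the principle of optimality: each gain `K[k]` depends only on the cost-to-go from stage `k+1` onward.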
Similar Products

Optimal Control and Estimation (Dover Books on Mathematics)
Control System Design: An Introduction to State-Space Methods (Dover Books on Electrical Engineering)
Nonlinear Systems
Calculus of Variations and Optimal Control Theory: A Concise Introduction
Probabilistic Robotics (Intelligent Robotics and Autonomous Agents series)
Introduction to Stochastic Control Theory (Dover Books on Electrical Engineering)
Schaum's Outline of Feedback and Control Systems, 2nd Edition (Schaum's Outlines)
Reinforcement Learning: An Introduction (Adaptive Computation and Machine Learning series)