Includes bibliographical references (p. -413) and index.
General Introduction.- Part I: Unconstrained Problems: Basic Methods-- Line-Searches-- Newtonian Methods-- Conjugate Gradient-- Special Methods.- Part II: Nonsmooth Optimization: Some Theory of Nonsmooth Optimization-- Some Methods in Nonsmooth Optimization-- Bundle Methods. The Quest for Descent-- Decomposition and Duality.- Part III: Newton's Methods in Constrained Optimization: Background-- Local Methods for Problems with Equality Constraints-- Local Methods for Problems with Equality and Inequality Constraints-- Exact Penalization-- Globalization by Line-Search-- Quasi-Newton Versions.- Part IV: Interior-Point Algorithms for Linear and Quadratic Optimization: Linearly Constrained Optimization and Simplex Algorithm-- Linear Monotone Complementarity and Associated Vector Fields-- Predictor-Corrector Algorithms-- Non-Feasible Algorithms-- Self-Duality-- One-Step Methods-- Complexity of Linear Optimization Problems with Integer Data-- Karmarkar's Algorithm.- References.- Index.
(source: Nielsen Book Data)
Just as in its first edition, this book starts with illustrations of the ubiquitous character of optimization, and describes numerical algorithms in a tutorial way. It covers fundamental algorithms as well as more specialized and advanced topics for unconstrained and constrained problems. Most of the algorithms are explained in detail, allowing straightforward implementation. Theoretical aspects of the approaches chosen are also addressed with care, often using minimal assumptions. This new edition contains computational exercises in the form of case studies, which help in understanding optimization methods beyond their theoretical description, down to actual implementation. In addition, the nonsmooth optimization part has been substantially reorganized and expanded.