xxix, 656 pages : illustrations (some color) ; 26 cm
  • Basic Concepts of Linear Optimization -- The Company Dovetail -- Definition of an LO-Model -- Alternatives of the Standard LO-Model -- Solving LO-Models Using a Computer Package -- Linearizing Nonlinear Functions -- Examples of Linear Optimization Models -- Building and Implementing Mathematical Models -- Exercises -- LINEAR OPTIMIZATION THEORY: BASIC TECHNIQUES -- Geometry and Algebra of Feasible Regions -- The Geometry of Feasible Regions -- Algebra of Feasible Regions; Feasible Basic Solutions -- Exercises -- Dantzig's Simplex Algorithm -- From Vertex to Vertex to an Optimal Solution -- LO-Model Reformulation -- The Simplex Algorithm -- Simplex Tableaus -- Discussion of the Simplex Algorithm -- Initialization -- Uniqueness and Multiple Optimal Solutions -- Models with Equality Constraints -- The Revised Simplex Algorithm -- Exercises -- Duality, Feasibility, and Optimality -- The Companies Dovetail and Salmonnose -- Duality and Optimality -- Complementary Slackness Relations -- Infeasibility and Unboundedness; Farkas' Lemma -- Primal and Dual Feasible Basic Solutions -- Duality and the Simplex Algorithm -- The Dual Simplex Algorithm -- Exercises -- Sensitivity Analysis -- Sensitivity of Model Parameters -- Perturbing Objective Coefficients -- Perturbing Right Hand Side Values (Nondegenerate Case) -- Piecewise Linearity of Perturbation Functions -- Perturbation of the Technology Matrix -- Sensitivity Analysis for the Degenerate Case -- Shadow Prices and Redundancy of Equality Constraints -- Exercises -- Large-Scale Linear Optimization -- The Interior Path -- Formulation of the Interior Path Algorithm -- Convergence to the Interior Path; Maintaining Feasibility -- Termination and Initialization -- Exercises -- Integer Linear Optimization -- Introduction -- The Branch-and-Bound Algorithm -- Linearizing Logical Forms with Binary Variables -- Gomory's Cutting-Plane Algorithm -- Exercises -- Linear Network Models -- LO-Models with Integer Solutions; Total Unimodularity -- ILO-Models with Totally Unimodular Matrices -- The Network Simplex Algorithm -- Exercises -- Computational Complexity -- Introduction to Computational Complexity
Computational Aspects of Dantzig's Simplex Algorithm -- The Interior Path Algorithm Has Polynomial Running Time -- Computational Aspects of the Branch-and-Bound Algorithm -- Exercises -- LINEAR OPTIMIZATION PRACTICE: ADVANCED TECHNIQUES -- Designing a Reservoir for Irrigation -- The Parameters and the Input Data -- Maximizing the Irrigation Area -- Changing the Input Parameters of the Model -- GMPL Model Code -- Exercises -- Classifying Documents by Language -- Machine Learning -- Classifying Documents Using Separating Hyperplanes -- LO-Model for Finding a Separating Hyperplane -- Validation of a Classifier -- Robustness of Separating Hyperplanes; Separation Width -- Models that Maximize the Separation Width -- GMPL Model Code -- Exercises -- Production Planning; A Single Product Case -- Model Description -- Regular Working Hours -- Overtime -- Allowing Overtime and Idle Time -- Sensitivity Analysis -- GMPL Model Code -- Exercises -- Production of Coffee Machines -- Problem Setting -- An LO-Model that Minimizes Backlogs -- Old and Recent Backlogs -- Full Week Productions -- Sensitivity Analysis -- GMPL Model Code -- Exercises -- Conflicting Objectives: Producing Versus Importing -- Problem Description and Input Data -- Modeling Two Conflicting Objectives; Pareto Optimal Points -- Goal Optimization for Conflicting Objectives -- Soft and Hard Constraints -- Sensitivity Analysis -- Alternative Solution Techniques -- A Comparison of the Solutions -- GMPL Model Code -- Exercises -- Coalition Formation and Profit Distribution -- The Farmers Cooperation Problem -- Game Theory; Linear Production Games -- How to Distribute the Total Profit Among the Farmers?
Profit Distribution for Arbitrary Numbers of Farmers -- Sensitivity Analysis -- Exercises -- Minimizing Trimloss When Cutting Cardboard -- Formulating the Problem -- Gilmore-Gomory's Solution Algorithm -- Calculating an Optimal Solution -- Exercises -- Off-Shore Helicopter Routing -- Problem Description -- Vehicle Routing Problems -- Problem Formulation -- ILO Formulation -- Column Generation -- Dual Values as Price Indicators for Crew Exchanges -- A Round-Off Procedure for Determining an Integer Solution -- Computational Experiments -- Sensitivity Analysis -- Exercises -- The Catering Service Problem -- Formulation of the Problem -- The Transshipment Problem Formulation -- Applying the Network Simplex Algorithm -- Sensitivity Analysis -- GMPL Model Code -- Exercises -- Appendix A Mathematical Proofs -- Appendix B Linear Algebra -- Appendix C Graph Theory -- Appendix D Convexity -- Appendix E Nonlinear Optimization -- Appendix F Writing LO-Models in GNU MathProg (GMPL).
  • (source: Nielsen Book Data)9781498710169 20160619
Presenting a strong and clear relationship between theory and practice, Linear and Integer Optimization: Theory and Practice is divided into two main parts. The first covers the theory of linear and integer optimization, including both basic and advanced topics. Dantzig's simplex algorithm, duality, sensitivity analysis, integer optimization models, and network models are introduced. More advanced topics are also presented, including interior point algorithms, the branch-and-bound algorithm, cutting planes, complexity, standard combinatorial optimization models, the assignment problem, minimum cost flow, and the maximum flow/minimum cut theorem. The second part applies theory through real-world case studies. The authors discuss advanced techniques such as column generation, multiobjective optimization, dynamic optimization, machine learning (support vector machines), combinatorial optimization, approximation algorithms, and game theory. Besides the fresh new layout and completely redesigned figures, this new edition incorporates modern examples and applications of linear optimization. The book now includes computer code in the form of models in the GNU Mathematical Programming Language (GMPL). The models and corresponding data files are available for download and can be readily solved using the provided online solver. This new edition also contains appendices covering mathematical proofs, linear algebra, graph theory, convexity, and nonlinear optimization. All chapters contain extensive examples and exercises. This textbook is ideal for courses for advanced undergraduate and graduate students in various fields including mathematics, computer science, industrial engineering, operations research, and management science.
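The duality mentioned in this description can be illustrated with a small numeric check. The numbers below are illustrative choices, not taken from the book: for a primal problem max c·x subject to Ax <= b, x >= 0, and its dual min b·y subject to Aᵀy >= c, y >= 0, weak duality guarantees c·x <= b·y for any feasible pair, and equal objective values certify that both solutions are optimal.

```python
# Illustrative weak-duality check for a tiny LP (data is made up).
c = [3, 2]
A = [[1, 1],
     [2, 1]]
b = [4, 6]

x = [2, 2]  # primal feasible: A x = [4, 6] <= b, x >= 0
y = [1, 1]  # dual feasible:   A^T y = [3, 2] >= c, y >= 0

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

primal_obj = dot(c, x)  # 10
dual_obj = dot(b, y)    # 10
assert primal_obj <= dual_obj  # weak duality; equality certifies optimality
```

Here both objectives equal 10, so x and y are optimal without running any algorithm, which is exactly the kind of certificate duality theory provides.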
Engineering Library (Terman)
ix, 494 p. : ill. ; 26 cm.
The interplay between optimization and machine learning is one of the most important developments in modern computational science. Optimization formulations and methods are proving to be vital in designing algorithms to extract essential knowledge from huge volumes of data. Machine learning, however, is not simply a consumer of optimization technology but a rapidly evolving field that is itself generating new optimization ideas. This book captures the state of the art of the interaction between optimization and machine learning in a way that is accessible to researchers in both fields. Optimization approaches have enjoyed prominence in machine learning because of their wide applicability and attractive theoretical properties. The increasing complexity, size, and variety of today's machine learning models call for the reassessment of existing assumptions. This book starts the process of reassessment. It describes the resurgence in novel contexts of established frameworks such as first-order methods, stochastic approximations, convex relaxations, interior-point methods, and proximal methods. It also devotes attention to newer themes such as regularized optimization, robust optimization, gradient and subgradient methods, splitting techniques, and second-order methods. Many of these techniques draw inspiration from other fields, including operations research, theoretical computer science, and subfields of optimization. The book will enrich the ongoing cross-fertilization between the machine learning community and these other fields, and within the broader optimization community.
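A minimal sketch of the first-order methods this description refers to, on a toy objective of my own choosing (not an example from the book): plain gradient descent on f(w) = (w - 3)², whose gradient is 2(w - 3).

```python
# Gradient descent, the prototypical first-order method: repeatedly step
# against the gradient. Objective, step size, and iteration count are
# illustrative choices.
def grad_descent(grad, w0, step=0.1, iters=100):
    w = w0
    for _ in range(iters):
        w -= step * grad(w)
    return w

# Minimize f(w) = (w - 3)^2; the iterates converge to the minimizer w = 3.
w = grad_descent(lambda w: 2 * (w - 3), w0=0.0)
```

In machine learning the same loop runs over a loss averaged (or sampled, for stochastic approximation) over data, but the mechanism is unchanged.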
(source: Nielsen Book Data)9780262016469 20160608
Engineering Library (Terman)
xii, 360 p. : ill. ; 24 cm.
This book bridges optimal control theory and economics, discussing ordinary differential equations, optimal control, game theory, and mechanism design in one volume. Technically rigorous and largely self-contained, it provides an introduction to the use of optimal control theory for deterministic continuous-time systems in economics. The theory of ordinary differential equations (ODEs) is the backbone of the theory developed in the book, and chapter 2 offers a detailed review of basic concepts in the theory of ODEs, including the solution of systems of linear ODEs, state-space analysis, potential functions, and stability analysis. Following this, the book covers the main results of optimal control theory, in particular necessary and sufficient optimality conditions; game theory, with an emphasis on differential games; and the application of control-theoretic concepts to the design of economic mechanisms. Appendixes provide a mathematical review and full solutions to all end-of-chapter problems. The material is presented at three levels: single-person decision making; games, in which a group of decision makers interact strategically; and mechanism design, which is concerned with a designer's creation of an environment in which players interact to maximize the designer's objective. The book focuses on applications; the problems are an integral part of the text. It is intended for use as a textbook or reference for graduate students, teachers, and researchers interested in applications of control theory beyond its classical use in economic growth. The book will also appeal to readers interested in a modeling approach to certain practical problems involving dynamic continuous-time models.
(source: Nielsen Book Data)9780262015738 20160606
Engineering Library (Terman)
xiv, 277 p. : ill. ; 25 cm.
  • PREFACE -- ACKNOWLEDGEMENTS -- 1. Introduction -- 2. Optimal shape design -- 3. Partial differential equations for fluids -- 4. Some numerical methods for fluids -- 5. Sensitivity evaluation and automatic differentiation -- 6. Parameterization and implementation issues -- 7. Local and global optimization -- 8. Incomplete sensitivities -- 9. Consistent approximations and approximate gradients -- 10. Numerical results on shape optimization -- 11. Control of unsteady flows -- 12. From airplane design to microfluidic -- 13. Topological optimization for fluids -- 14. Conclusion and perspectives -- INDEX.
  • (source: Nielsen Book Data)9780199546909 20160603
The fields of computational fluid dynamics (CFD) and optimal shape design (OSD) have received considerable attention in the recent past, and are of practical importance for many engineering applications. This new edition of Applied Shape Optimization for Fluids deals with shape optimization problems for fluids, with the equations needed for their understanding (Euler and Navier-Stokes, but also those for microfluids) and with the numerical simulation of these problems. It presents the state of the art in shape optimization for an extended range of applications involving fluid flows. Automatic differentiation, approximate gradients, unstructured mesh adaptation, multi-model configurations, and time-dependent problems are introduced, and their implementation into the industrial environments of aerospace and automobile equipment industry explained and illustrated. With the increases in the power of computers in industry since the first edition, methods which were previously unfeasible have begun giving results, namely evolutionary algorithms, topological optimization methods, and level set algorithms. In this edition, these methods have been treated in separate chapters, but the book remains primarily one on differential shape optimization. This book is essential reading for engineers interested in the implementation and solution of optimization problems using commercial packages or in-house solvers and graduates and researchers in applied mathematics, aerospace, or mechanical engineering, fluid dynamics, and CFD. More generally, anyone needing to understand and solve design problems or looking for new exciting areas for research and development in this area will find this book useful, especially in applying the methodology to practical problems.
Engineering Library (Terman)
xiv, 498 p. : ill. ; 26 cm.
  • 1. Automatic code generation for real-time convex optimization J. Mattingley and S. Boyd-- 2. Gradient-based algorithms with applications to signal recovery problems A. Beck and M. Teboulle-- 3. Graphical models of autoregressive processes J. Songsiri, J. Dahl and L. Vandenberghe-- 4. SDP relaxation of homogeneous quadratic optimization Z. Q. Luo and T. H. Chang-- 5. Probabilistic analysis of SDR detectors for MIMO systems A. Man-Cho So and Y. Ye-- 6. Semidefinite programming, matrix decomposition, and radar code design Y. Huang, A. De Maio and S. Zhang-- 7. Convex analysis for non-negative blind source separation with application in imaging W. K. Ma, T. H. Chan, C. Y. Chi and Y. Wang-- 8. Optimization techniques in modern sampling theory T. Michaeli and Y. C. Eldar-- 9. Robust broadband adaptive beamforming using convex optimization M. Rubsamen, A. El-Keyi, A. B. Gershman and T. Kirubarajan-- 10. Cooperative distributed multi-agent optimization A. Nedic and A. Ozdaglar-- 11. Competitive optimization of cognitive radio MIMO systems via game theory G. Scutari, D. P. Palomar and S. Barbarossa-- 12. Nash equilibria: the variational approach F. Facchinei and J. S. Pang.
  • (source: Nielsen Book Data)9780521762229 20160603
Over the past two decades there have been significant advances in the field of optimization. In particular, convex optimization has emerged as a powerful signal processing tool, and the variety of applications continues to grow rapidly. This book, written by a team of leading experts, sets out the theoretical underpinnings of the subject and provides tutorials on a wide range of convex optimization applications. Emphasis throughout is on cutting-edge research and on formulating problems in convex form, making this an ideal textbook for advanced graduate courses and a useful self-study guide. Topics covered range from automatic code generation, graphical models, and gradient-based algorithms for signal recovery, to semidefinite programming (SDP) relaxation and radar waveform design via SDP. It also includes blind source separation for image processing, robust broadband beamforming, distributed multi-agent optimization for networked systems, cognitive radio systems via game theory, and the variational inequality approach for Nash equilibrium solutions.
Engineering Library (Terman)
xix, 464 p. : ill. (some col.) ; 25 cm.
  • Preface.- Preface to 2nd edition.- Preface to 3rd edition.- Introduction.- The Simplex Method.- Degeneracy.- Efficiency of the Simplex Method.- Duality Theory.- The Simplex Method in matrix notation.- Sensitivity and parametric analyses.- Implementation issues.- Problems in general form.- Convex analysis.- Game theory.- Regression.- Financial applications.- Network flow problems.- Applications.- Structural optimization.- The central path.- A path-following method.- The KKT system.- Implementation issues.- The affine-scaling method.- The homogeneous self-dual method.- Integer programming.- Quadratic programming.- Convex programming.- Source listings.- Answers to selected exercises.- Bibliography.- Index.
  • (source: Nielsen Book Data)9780387743875 20160528
"Linear Programming: Foundations and Extensions" is an introduction to the field of optimization. The book emphasizes constrained optimization, beginning with a substantial treatment of linear programming, and proceeding to convex analysis, network flows, integer programming, quadratic programming, and convex optimization. The book is carefully written. Specific examples and concrete algorithms precede more abstract topics. Topics are clearly developed with a large number of numerical examples worked out in detail. Moreover, "Linear Programming: Foundations and Extensions" underscores the purpose of optimization: to solve practical problems on a computer. Accordingly, the book is coordinated with free, efficient C programs that implement the major algorithms studied: the two-phase simplex method; the primal-dual simplex method; the path-following interior-point method; and the homogeneous self-dual methods. In addition, there are online Java applets that illustrate various pivot rules and variants of the simplex method, both for linear programming and for network flows.
Engineering Library (Terman)
xxv, 272 p. : ill. ; 26 cm.
  • Preface-- 1. Brief review of control and stability theory-- 2. Algorithms as dynamical systems with feedback-- 3. Optimal control and variable structure design of iterative methods-- 4. Neural-gradient dynamical systems for linear and quadratic programming problems-- 5. Control tools in the numerical solution of ordinary differential equations and in matrix problems-- 6. Epilogue-- Bibliography-- Index.
  • (source: Nielsen Book Data)9780898716023 20160528
Control Perspectives on Numerical Algorithms and Matrix Problems organizes the analysis and design of iterative numerical methods from a control perspective. The authors discuss a variety of applications, including iterative methods for linear and nonlinear systems of equations, neural networks for linear and quadratic programming problems, support vector machines, integration and shooting methods for ordinary differential equations, matrix preconditioning, matrix stability, and polynomial zero finding. This book opens up a new field of interdisciplinary research that should lead to insights in the areas of both control and numerical analysis and shows that a wide range of applications can be approached from - and benefit from - a control perspective.
Engineering Library (Terman)
xxi, 546 p. : ill. ; 25 cm.
  • Preface and Acknowledgments (J. D. Pinter) 1. Global Optimization in Solvent Design (L. E. K. Achenie, G. M. Ostrovsky and M. Sinha) 2. Feeding Strategies for Maximising Gross Margin in Pig Production (D. L. J. Alexander, P. C. H. Morel and G. R. Wood) 3. Optimized Design of Dynamic Networks with Heuristic Algorithms (S. Allen, S. Hurley, V. Samko and R. Whitaker) 4. A New Smoothing-based Global Optimization Algorithm for Protein Conformation Problems (A. M. Azmi, R. H. Byrd, E. Eskow and R. B. Schnabel) 5. Physical Perspectives on the Global Optimization of Atomic Clusters (J. P. K. Doye) 6. Efficient Global Geometry Optimization of Atomic and Molecular Clusters (B. Hartke) 7. Computational Analysis of Human DNA Sequences: An Application of Artificial Neural Networks (A. G. Hatzigeorgiou and M. S. Megraw) 8. Determination of a Laser Cavity Field Solution Using Global Optimization (G. Isenor, J. D. Pinter and M. Cada) 9. Computational Experience with the Molecular Distance Geometry Problem (C. Lavor, L. Liberti and N. Maculan) 10. Non-linear Optimization Models in Water Resource Systems (S. Liberatore, G. M. Sechi and P. Zuddas) 11. Solving the Phase Unwrapping Problem by a Parameterized Network Optimization Approach (P. Maponi and F. Zirilli) 12. Evolutionary Algorithms for Global Optimization (A. Osyczka and S. Krenich) 13. Determining 3-D Structure of Spherical Viruses by Global Optimization (O. Ozturk, P. C. Doerschuk and S. B. Gelfand) 14. A Collaborative Solution Methodology for Inverse Position Problem (C. S. Pedamallu and L. Ozdamar) 15. Improved Learning of Neural Nets through Global Search (V. P. Plagianakos, G. D. Magoulas and M. N. Vrahatis) 16. Evolutionary Approach to Design Assembly Lines (B. Rekiek, P. De Lit and A. Delchambre) 17. Agroecosystem Management (R. Seppelt) 18. Finding the Minimal Root of an Equation -- Applications and Algorithms Based on Lipschitz Condition (Ya. D. Sergeyev) 19. 
Optimization of Radiation Therapy Dose Delivery with Multiple Static Collimation (J. Tervo, P. Kolmonen, J. D. Pinter and T. Lyyra-Lahtinen) 20. Parallel Triangulated Partitioning for Black Box Optimization (Y. Wu, L. Ozdamar and A. Kumar) 21. A Case Study: Composite Structure Design Optimization (Z. B. Zabinsky, M. E. Tuttle and C. Khompatraporn) 22. Neural Network Enhanced Optimal Self-tuning Controller Design for Induction Motors (Q. M. Zhu, L. Z. Guo and Z. Ma).
  • (source: Nielsen Book Data)9780387304083 20160528
Optimization models based on a nonlinear systems description often possess multiple local optima. The objective of global optimization (GO) is to find the best possible solution of multiextremal problems. This volume illustrates the applicability of GO modeling techniques and solution strategies to real-world problems. The contributed chapters cover a broad range of applications from agroecosystem management, assembly line design, bioinformatics, biophysics, black box systems optimization, cellular mobile network design, chemical process optimization, chemical product design, composite structure design, computational modeling of atomic and molecular structures, controller design for induction motors, electrical engineering design, feeding strategies in animal husbandry, the inverse position problem in kinematics, laser design, learning in neural nets, mechanical engineering design, numerical solution of equations, radiotherapy planning, robot design, and satellite data analysis. The solution strategies discussed encompass a range of practically viable methods, including both theoretically rigorous and heuristic approaches.
Engineering Library (Terman)
xxii, 429 p. : ill. ; 24 cm.
  • List Of Figures.- List of Tables.- Contributing Authors.- Preface.- Acknowledgments.- The Formulation and Solution of Discrete Optimization Models.- Continuous Approaches for Solving Discrete Optimization Problems.- Logic-Based Modeling.- Modelling for Feasibility - The Case Of Mutually Orthogonal Latin Squares Problem.- Network Modelling.- Modeling and Optimization of Vehicle Routing Problems.- Radio Resource Management.- Strategic and Tactical Planning Models for Supply Chain: An Application of Stochastic Mixed Integer Programming.- Logic Inference and A Decomposition Algorithm for The Resource-Constrained Scheduling of Testing Tasks in The Development of New Pharmaceutical and Agrochemical Products.- A Mixed-Integer Nonlinear Programming Approach to the Optimal Planning of Offshore Oilfield Infrastructures.- Radiation Treatment Planning: Mixed Integer Programming Formulations and Approaches.- Multiple Hypothesis Correlation in Track-to-Track Fusion Management.- Applications to Computational Molecular Biology.- Index.
  • (source: Nielsen Book Data)9780387329413 20160528
The primary objective underlying the "Handbook on Modelling for Discrete Optimization" is to demonstrate and detail the pervasive nature of Discrete Optimization. While its applications cut across an incredibly wide range of activities, many of the applications are only known to specialists. It is the aim of this handbook to correct this. It has long been recognized that 'modelling' is a critically important mathematical activity in designing algorithms for solving these discrete optimization problems. Nevertheless, solving the resultant models is often far from straightforward. In recent years it has become possible to solve many large-scale discrete optimization problems. However, some problems remain a challenge, even though advances in mathematical methods, hardware, and software technology have pushed the frontiers forward. This handbook couples the difficult, critical-thinking aspects of mathematical modelling with the hot area of discrete optimization. The treatment is that of an academic handbook, outlining the state of the art for researchers across the domains of computer science, mathematical programming, applied mathematics, engineering, and operations research. Included in the handbook's treatment are results from graph theory, logic, computer science, and combinatorics. The chapters of this book are divided into two parts: one dealing with general methods in the modelling of discrete optimization problems and the other with specific applications. The first chapter of this volume, written by H. Paul Williams, can be regarded as a basic introduction to how to model discrete optimization problems as mixed integer problems, and outlines the main methods of solving them. In the second part of the book various real-life applications are presented, most of them formulated as mixed integer linear or nonlinear programming problems. 
These applications include network problems, constraint logic problems, many engineering problems, computer design, finance problems, medical diagnosis and medical treatment problems, applications of the Genome project, an array of transportation scheduling problems, and other applications.
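The modelling step described above often amounts to encoding a logical condition as a linear constraint on binary variables. A tiny sketch of my own (not an example from the handbook): the requirement "if project 1 is selected then project 2 must be selected" becomes the linear constraint z2 >= z1 on binaries z1, z2, and exhaustive enumeration confirms the two are equivalent.

```python
from itertools import product

# Check that the linear constraint z2 >= z1 on binary variables admits
# exactly the truth assignments of the implication "z1 implies z2".
for z1, z2 in product([0, 1], repeat=2):
    linear_ok = z2 >= z1              # mixed-integer formulation
    logical_ok = (not z1) or bool(z2) # the original logical condition
    assert linear_ok == logical_ok
```

More complex logical forms (disjunctions, at-most-k selections) linearize the same way, which is what lets a MIP solver handle them.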
Engineering Library (Terman)
xxiv, 497 p. : ill. ; 25 cm.
  • List of figures.- List of tables.- Preface.- Acknowledgements.- Introduction.- I. Introduction: Theory and Complexity.- Duality Theory for Linear Optimization.- A Polynomial Algorithm for the Self-dual Model.- Solving the Canonical Problem.- II. The Logarithmic Barrier Approach.- Preliminaries.- The Dual Logarithmic Barrier Method.- The Primal-Dual Logarithmic Barrier Method.- Initialization.- III. The Target-Following Approach.- Preliminaries.- The Primal-Dual Newton Method.- Applications.- The Dual Newton Method.- The Primal Newton Method.- Application to the Method of Centers.- IV. Miscellaneous Topics.- Karmarkar's Projective Method.- More Properties of the Central Path.- Partial Updating.- Higher-Order Methods.- Parametric and Sensitivity Analysis.- Implementing Interior Point Methods.- Appendices.- Bibliography.- Author Index.- Subject Index.- Symbol Index.
  • (source: Nielsen Book Data)9780387263786 20160528
"Interior Point Methods for Linear Optimization" is a comprehensive, thorough textbook on interior point methods (IPMs). The era of IPMs was initiated by N. Karmarkar's 1984 paper, which triggered turbulent research and reshaped almost all areas of optimization theory and computational practice. This book gives a comprehensive review of the main results of more than a decade of IPM research. Numerous exercises are provided to aid in understanding the material.
Engineering Library (Terman)
xvii, 407 p. : ill. ; 25 cm.
  • From the contents Multiscale Optimization in VLSI Physical Design Automation.- A Distributed Method for Solving Semidefinite Programs Arising from Ad Hoc Wireless Sensor Network Localization.- Optimization Algorithms for Sparse Representations and Applications.- A Unified Framework for Modeling and Solving Combinatorial Optimization Problems: A Tutorial.- Global Convergence of a Non-monotone Trust-Region Filter Algorithm for Nonlinear Programming.- Factors Affecting the Performance of Optimization-based Multigrid Methods.- A Local Relaxation Method for Nonlinear Facility Location Problems.- Fluence Map Optimization in IMRT Cancer Treatment Planning and a Geometric Approach.- Panoramic Image Processing Using Non-Commutative Harmonic Analysis.- Generating Geometric Models through Self-Organizing Maps.- Self-Similar Solution of Unsteady Mixed Convection Flow on a Rotating Cone in a Rotating Fluid.- Homogenization of a Nonlinear Elliptic Boundary Value Problem Modelling Galvanic Interactions on a Heterogeneous Surface.
  • (source: Nielsen Book Data)9780387295497 20160528
As optimization researchers tackle larger and larger problems, scale interactions play an increasingly important role. One general strategy for dealing with a large or difficult problem is to partition it into smaller ones, which are hopefully much easier to solve, and then work backwards towards the solution of the original problem, using a solution from a previous level as a starting guess at the next level. This volume contains 22 chapters highlighting some recent research. The topics of the chapters selected for this volume are focused on the development of new solution methodologies, including general multilevel solution techniques, for tackling difficult, large-scale optimization problems that arise in science and industry. Applications presented in the book include but are not limited to the circuit placement problem in VLSI design, a wireless sensor location problem, optimal dosages in the treatment of cancer by radiation therapy, and facility location.
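The coarse-to-fine strategy described above can be sketched in miniature. This is not any algorithm from the volume, just a toy illustration on a one-dimensional multimodal function I made up: a cheap coarse grid search locates a promising region, and its answer seeds a finer search restricted to that region.

```python
import math

def f(x):
    # Multimodal test function (illustrative only).
    return (x - 1.3) ** 2 + 0.3 * math.sin(8 * x)

def grid_min(func, lo, hi, n):
    # Return the grid point in [lo, hi] with the smallest function value.
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    return min(xs, key=func)

coarse = grid_min(f, -5.0, 5.0, 21)                   # coarse level: spacing 0.5
fine = grid_min(f, coarse - 0.5, coarse + 0.5, 201)   # fine level: refine nearby
assert f(fine) <= f(coarse)  # the previous level's answer is only a warm start
```

The real multilevel methods in the volume apply the same idea to structured problems (meshes, circuit netlists), where each coarse level is a genuinely smaller instance of the problem rather than a sparser grid.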
Engineering Library (Terman)
xxii, 664 p. : ill. ; 25 cm.
  • Introduction * Fundamentals of Unconstrained Optimization * Line Search Methods * Trust-Region Methods * Conjugate Gradient Methods * Practical Newton Methods * Calculating Derivatives * Quasi-Newton Methods * Large-Scale Quasi-Newton and Partially Separable Optimization * Nonlinear Least-Squares Problems * Nonlinear Equations * Theory of Constrained Optimization * Linear Programming: The Simplex Method * Linear Programming: Interior-Point Methods * Fundamentals of Algorithms for Nonlinear Constrained Optimization * Quadratic Programming * Penalty, Barrier, and Augmented Lagrangian Methods * Sequential Quadratic Programming * Background Material * References * Index.
  • (source: Nielsen Book Data)9780387987934 20160528
  • Preface.-Preface to the Second Edition.-Introduction.-Fundamentals of Unconstrained Optimization.-Line Search Methods.-Trust-Region Methods.-Conjugate Gradient Methods.-Quasi-Newton Methods.-Large-Scale Unconstrained Optimization.-Calculating Derivatives.-Derivative-Free Optimization.-Least-Squares Problems.-Nonlinear Equations.-Theory of Constrained Optimization.-Linear Programming: The Simplex Method.-Linear Programming: Interior-Point Methods.-Fundamentals of Algorithms for Nonlinear Constrained Optimization.-Quadratic Programming.-Penalty and Augmented Lagrangian Methods.-Sequential Quadratic Programming.-Interior-Point Methods for Nonlinear Programming.-Background Material.- Regularization Procedure.
  • (source: Nielsen Book Data)9780387303031 20160528
"Numerical Optimization" presents a comprehensive and up-to-date description of the most effective methods in continuous optimization. It responds to the growing interest in optimization in engineering, science, and business by focusing on the methods that are best suited to practical problems. Drawing on their experiences in teaching, research, and consulting, the authors have produced a textbook that will be of interest to students and practitioners alike. Each chapter begins with the basic concepts and builds up gradually to the best techniques currently available. Because of the emphasis on practical methods, as well as the extensive illustrations and exercises, the book is accessible to a wide audience. It can be used as a graduate text in engineering, operations research, mathematics, computer science, and business. It also serves as a handbook for researchers and practitioners in the field. Above all, the authors have strived to produce a text that is pleasant to read, informative, and rigorous - one that reveals both the beautiful nature of the discipline and its practical side. MMOR, Mathematical Methods of Operations Research, 2001 observes: "The book looks very suitable to be used in a graduate-level course in optimization for students in mathematics, operations research, engineering, and others. Moreover, it seems to be very helpful to do some self-studies in optimization, to complete one's own knowledge, and can be a source of new ideas...I recommend this excellent book to everyone who is interested in optimization problems.".
(source: Nielsen Book Data)9780387987934 20160528
"Numerical Optimization" presents a comprehensive and up-to-date description of the most effective methods in continuous optimization. It responds to the growing interest in optimization in engineering, science, and business by focusing on the methods that are best suited to practical problems. For this new edition, the book has been thoroughly updated throughout. There are new chapters on nonlinear interior methods and derivative-free methods for optimization, both of which are used widely in practice and the focus of much current research. Because of the emphasis on practical methods, as well as the extensive illustrations and exercises, the book is accessible to a wide audience. It can be used as a graduate text in engineering, operations research, mathematics, computer science, and business. It also serves as a handbook for researchers and practitioners in the field. The authors have strived to produce a text that is pleasant to read, informative, and rigorous - one that reveals both the beautiful nature of the discipline and its practical side. There is a selected solutions manual for instructors for the new edition.
(source: Nielsen Book Data)9780387303031 20160528
Engineering Library (Terman), Science Library (Li and Ma)
xiii, 198 p. : ill. ; 24 cm.
Engineering Library (Terman)
xii, 687 p. : ill. ; 25 cm.
  • Preface 1 Introduction 1.1 Introduction 1.2 Mathematics Foundations 1.2.1 Norm 1.2.2 Inverse and Generalized Inverse of a Matrix 1.2.3 Properties of Eigenvalues 1.2.4 Rank-One Update 1.2.5 Function and Differential 1.3 Convex Sets and Convex Functions 1.3.1 Convex Sets 1.3.2 Convex Functions 1.3.3 Separation and Support of Convex Sets 1.4 Optimality Conditions for Unconstrained Case 1.5 Structure of Optimization Methods Exercises 2 Line Search 2.1 Introduction 2.2 Convergence Theory for Exact Line Search 2.3 Section Methods 2.3.1 The Golden Section Method 2.3.2 The Fibonacci Method 2.4 Interpolation Method 2.4.1 Quadratic Interpolation Methods 2.4.2 Cubic Interpolation Method 2.5 Inexact Line Search Techniques 2.5.1 Armijo and Goldstein Rule 2.5.2 Wolfe-Powell Rule 2.5.3 Goldstein Algorithm and Wolfe-Powell Algorithm 2.5.4 Backtracking Line Search 2.5.5 Convergence Theorems of Inexact Line Search Exercises 3 Newton's Methods 3.1 The Steepest Descent Method 3.1.1 The Steepest Descent Method 3.1.2 Convergence of the Steepest Descent Method 3.1.3 Barzilai and Borwein Gradient Method 3.1.4 Appendix: Kantorovich Inequality 3.2 Newton's Method 3.3 Modified Newton's Method 3.4 Finite-Difference Newton's Method 3.5 Negative Curvature Direction Method 3.5.1 Gill-Murray Stable Newton's Method 3.5.2 Fiacco-McCormick Method 3.5.3 Fletcher-Freeman Method 3.5.4 Second-Order Step Rules 3.6 Inexact Newton's Method Exercises 4 Conjugate Gradient Method 4.1 Conjugate Direction Methods 4.2 Conjugate Gradient Method 4.2.1 Conjugate Gradient Method 4.2.2 Beale's Three-Term Conjugate Gradient Method 4.2.3 Preconditioned Conjugate Gradient Method 4.3 Convergence of Conjugate Gradient Methods 4.3.1 Global Convergence of Conjugate Gradient Methods 4.3.2 Convergence Rate of Conjugate Gradient Methods Exercises 5 Quasi-Newton Methods 5.1 Quasi-Newton Methods 5.1.1 Quasi-Newton Equation 5.1.2 Symmetric Rank-One (SR1) Update 5.1.3 DFP Update 5.1.4 BFGS Update and PSB Update 5.1.5 The Least 
Change Secant Update 5.2 The Broyden Class 5.3 Global Convergence of Quasi-Newton Methods 5.3.1 Global Convergence under Exact Line Search 5.3.2 Global Convergence under Inexact Line Search 5.4 Local Convergence of Quasi-Newton Methods 5.4.1 Superlinear Convergence of General Quasi-Newton Methods 5.4.2 Linear Convergence of General Quasi-Newton Methods 5.4.3 Local Convergence of Broyden's Rank-One Update 5.4.4 Local and Linear Convergence of DFP Method 5.4.5 Superlinear Convergence of BFGS Method 5.4.6 Superlinear Convergence of DFP Method 5.4.7 Local Convergence of Broyden's Class Methods 5.5 Self-Scaling Variable Metric (SSVM) Methods 5.5.1 Motivation to SSVM Method 5.5.2 Self-Scaling Variable Metric (SSVM) Method 5.5.3 Choices of the Scaling Factor 5.6 Sparse Quasi-Newton Methods 5.7 Limited Memory BFGS Method Exercises 6 Trust-Region and Conic Model Methods 6.1 Trust-Region Methods 6.1.1 Trust-Region Methods 6.1.2 Convergence of Trust-Region Methods 6.1.3 Solving A Trust-Region Subproblem 6.2 Conic Model and Collinear Scaling Algorithm 6.2.1 Conic Model 6.2.2 Generalized Quasi-Newton Equation 6.2.3 Updates that Preserve Past Information 6.2.4 Collinear Scaling BFGS Algorithm 6.3 Tensor Methods 6.3.1 Tensor Method for Nonlinear Equations 6.3.2 Tensor Methods for Unconstrained Optimization Exercises 7 Nonlinear Least-Squares Problems 7.1 Introduction 7.2 Gauss-Newton Method 7.3 Levenberg-Marquardt Method 7.3.1 Motivation and Properties 7.3.2 Convergence of Levenberg-Marquardt Method 7.4 Implementation of L-M Method 7.5 Quasi-Newton Method Exercises 8 Theory of Constrained Optimization 8.1 Constrained Optimization Problems 8.2 First-Order Optimality Conditions 8.3 Second-Order Optimality Conditions 8.4 Duality Exercises 9 Quadratic Programming 9.1 Optimality for Quadratic Programming 9.2 Duality for Quadratic Programming 9.3 Equality-Constrained Quadratic Programming 9.4 Active Set Methods 9.5 Dual Method 9.6 Interior Ellipsoid Method 9.7 Primal-Dual 
Interior-Point Methods Exercises 10 Penalty Function Methods 10.1 Penalty Function 10.2 The Simple Penalty Function Method 10.3 Interior Point Penalty Functions 10.4 Augmented Lagrangian Method 10.5 Smooth Exact Penalty Functions 10.6 Nonsmooth Exact Penalty Functions Exercises 11 Feasible Direction Methods 11.1 Feasible Point Methods 11.2 Generalized Elimination 11.3 Generalized Reduced Gradient Method 11.4 Projected Gradient Method 11.5 Linearly Constrained Problems Exercises 12 Sequential Quadratic Programming 12.1 Lagrange-Newton Method 12.2 Wilson-Han-Powell Method 12.3 Superlinear Convergence of SQP Step 12.4 Maratos Effect 12.5 Watchdog Technique 12.6 Second-Order Correction Step 12.7 Smooth Exact Penalty Functions 12.8 Reduced Hessian Matrix Method Exercises 13 TR Methods for Constrained Problems 13.1 Introduction 13.2 Linear Constraints 13.3 Trust-Region Subproblems 13.4 Null Space Method 13.5 CDT Subproblem 13.6 Powell-Yuan Algorithm Exercises 14 Nonsmooth Optimization 14.1 Generalized Gradients 14.2 Nonsmooth Optimization Problem 14.3 The Subgradient Method 14.4 Cutting Plane Method 14.5 The Bundle Methods 14.6 Composite Nonsmooth Function 14.7 Trust Region Method for Composite Problems 14.8 Nonsmooth Newton's Method Exercises Appendix: Test Functions 1. Test Functions for Unconstrained Optimization Problems 2. Test Functions for Constrained Optimization Problems Bibliography Index.
  • (source: Nielsen Book Data)9780387249759 20160528
This book, a result of the authors' teaching and research experience in various universities and institutes over the past ten years, can be used as a textbook for an optimization course for graduates and senior undergraduates. It systematically describes optimization theory and several powerful methods, including recent results. For most methods, the authors discuss an idea's motivation, study the derivation, establish the global and local convergence, describe algorithmic steps, and discuss the numerical performance. The book deals with both theory and algorithms of optimization concurrently. It also contains an extensive bibliography. Finally, apart from its use for teaching, "Optimization Theory and Methods" will be very beneficial as a research reference.
(source: Nielsen Book Data)9780387249759 20160528
Engineering Library (Terman)
xxxi, 926 p. : ill. ; 25 cm.
  • Part I: Introduction.- Challenges in Computer Control and Automation.- Scientific Foundations for Biomimicry.- For Further Study.- Part II: Elements of Decision Making.- Neural Network Substrates for Control Instincts.- Rule-Based Control.- Planning Systems.- Attentional Systems.- For Further Study.- Part III: Learning.- Learning and Control.- Linear Least Squares Methods.- Gradient Methods.- Adaptive Control.- For Further Study.- Part IV: Evolution.- The Genetic Algorithm.- Stochastic and Non-Gradient Optimization for Design.- Evolution and Learning: Synergistic Effects.- For Further Study.- Part V: Foraging.- Cooperative Foraging and Search.- Competitive and Intelligent Foraging.- For Further Study.
  • (source: Nielsen Book Data)9781852338046 20160528
There are many highly effective optimization, feedback control, and automation systems embedded in living organisms and nature. Evolution persistently seeks optimal robust designs for biological feedback control systems and decision making processes. From this comprehensive text, you will gain knowledge of how mimicry of such biological processes can be used to solve optimization, control, and automation problems encountered in the construction of high technology systems. Mathematical stability analysis is treated for a number of cases, from attentional systems to social foraging swarm cohesion properties. Bio-inspired optimization and control methods are compared to conventional techniques with an objective to provide a balanced viewpoint. A companion web site, continually updated by the author, will provide you with further examples and design problems, solution hints, lecture slides, a running lab, and ongoing self-study problems and resources. MATLAB code is provided to solve a number of key problems. The focus lies on verifying correct operation of technologies via a process of mathematical modelling and analysis complemented by computer simulations. Written from an engineering perspective, methods are applied to extensive real-world applications, from ship steering to cooperative control of a group of autonomous robots. Aimed primarily at graduate courses and research, much of the material has been successfully used for undergraduate courses. This dynamic textbook sends an injection of new ideas into engineering technology and the academic community.
(source: Nielsen Book Data)9781852338046 20160528
Engineering Library (Terman)
xviii, 302 p. : ill. ; 24 cm.
  • Evolutionary Multiobjective Optimization.- Recent Trends in Evolutionary Multiobjective Optimization.- Self-adaptation and Convergence of Multiobjective Evolutionary Algorithms in Continuous Search Spaces.- A Simple Approach to Evolutionary Multiobjective Optimization.- Quad-trees: A Data Structure for Storing Pareto-sets in Multiobjective Evolutionary Algorithms with Elitism.- Scalable Test Problems for Evolutionary Multiobjective Optimization.- Particle Swarm Inspired Evolutionary Algorithm (PS-EA) for Multi-criteria Optimization Problems.- Evolving Continuous Pareto Regions.- MOGADES: Multiobjective Genetic Algorithm with Distributed Environment Scheme.- Use of Multiobjective Optimization Concepts to Handle Constraints in Genetic Algorithms.- Multi-criteria Optimization of Finite State Automata: Maximizing Performance while Minimizing Description Length.- Multiobjective Optimization of Space Structures under Static and Seismic Loading Conditions.
  • (source: Nielsen Book Data)9781852337872 20160528
"Evolutionary Multiobjective Optimization" is a rare collection of the latest state-of-the-art theoretical research, design challenges and applications in the field of multiobjective optimization paradigms using evolutionary algorithms. It includes two introductory chapters giving all the fundamental definitions, several complex test functions and a practical problem involving the multiobjective optimization of space structures under static and seismic loading conditions used to illustrate the various multiobjective optimization concepts. Important features include: detailed overview of all the multiobjective optimization paradigms using evolutionary algorithms; excellent coverage of timely, advanced multiobjective optimization topics; state-of-the-art theoretical research and application developments; and chapters authored by pioneers in the field. Academics and industrial scientists as well as engineers engaged in research, development and application of evolutionary algorithm based "Multiobjective Optimization" will find the comprehensive coverage of this book invaluable.
(source: Nielsen Book Data)9781852337872 20160528
Engineering Library (Terman)
xiii, 466 p. : ill. ; 25 cm.
  • Part I: Advances For New Model And Solution Approaches.- A Scatter Search Tutorial for Graph-Based Permutation Problems.- A Multistart Scatter Search Heuristic for Smooth NLP and MINLP Problems.- Scatter Search Methods for the Covering Tour Problem.- Solution for the Sonet Ring Assignment Problem with Capacity Constraints.- Part II: Advances for Solving Classical Problems.- A Very Fast Tabu Search Algorithm for Job Shop Problem.- Tabu Search Heuristics for the Vehicle Routing Problem.- Some New Ideas in TS for Job Shop Scheduling.- A Tabu Search Heuristic for the Uncapacitated Facility Location Problem.- Adaptive Memory Search Guidance for Satisfiability Problems.- Part III: Experimental Evaluation.- Lessons from Applying and Experimenting with Scatter Search.- Tabu Search for Mixed-Integer Programming.- Scatter Search vs. Genetic Algorithms: An experimental evaluation with permutation problems.- Part IV: Review of Recent Developments.- Parallel Computation, Co-operation, Tabu Search.- Using Group Theory to Construct and Characterize Metaheuristic Search Neighborhoods.- Logistics Management: An Opportunity for Metaheuristics.- Part V: New Procedural Designs.- On the Integration of Metaheuristic Strategies in Constraint Programming.- General Purpose Metrics for Solution Variety.- Controlled Pool Maintenance in Combinatorial Optimization.- Adaptive Memory Projection Methods for Integer Programming.
  • (source: Nielsen Book Data)9781402081347 20160528
Tabu Search (TS) and, more recently, Scatter Search (SS) have proved highly effective in solving a wide range of optimization problems, and have had a variety of applications in industry, science, and government. The goal of "Metaheuristic Optimization via Memory and Evolution: Tabu Search and Scatter Search" is to report original research on algorithms and applications of tabu search, scatter search or both, as well as variations and extensions having "adaptive memory programming" as a primary focus. Individual chapters identify useful new implementations or new ways to integrate and apply the principles of TS and SS, or that prove new theoretical results, or describe the successful application of these methods to real world problems.
(source: Nielsen Book Data)9781402081347 20160528
Engineering Library (Terman)
xvii, 261 p. : ill. ; 24 cm.
  • List of Figures List of Tables Preface 1: PORTFOLIO OPTIMIZATION 1. Nonlinear optimization 2. Portfolio return and risk 3. Optimizing two-asset portfolios 4. Minimum risk for three-asset portfolios 5. Two- and three-asset minimum-risk solutions 6. A derivation of the minimum risk problem 7. Maximum return problems 2: ONE-VARIABLE OPTIMIZATION 1. Optimality conditions 2. The bisection method 3. The secant method 4. The Newton method 5. Methods using quadratic or cubic interpolation 6. Solving maximum-return problems 3: OPTIMAL PORTFOLIOS WITH N ASSETS 1. Introduction 2. The basic minimum-risk problem 3. Minimum risk for specified return 4. The maximum return problem 4: UNCONSTRAINED OPTIMIZATION IN N VARIABLES 1. Optimality conditions 2. Visualising problems in several variables 3. Direct search methods 4. Optimization software and examples 5: THE STEEPEST DESCENT METHOD 1. Introduction 2. Line searches 3. Convergence of the steepest descent method 4. Numerical results with steepest descent 5. Wolfe's convergence theorem 6. Further results with steepest descent 6: THE NEWTON METHOD 1. Quadratic models and the Newton step 2. Positive definiteness and Cholesky factors 3. Advantages and drawbacks of Newton's method 4. Search directions from indefinite Hessians 5. Numerical results with the Newton method 7: QUASI-NEWTON METHODS 1. Approximate second derivative information 2. Rank-two updates for the inverse Hessian 3. Convergence of quasi-Newton methods 4. Numerical results with quasi-Newton methods 5. The rank-one update for the inverse Hessian 6. Updating estimates of the Hessian 8: CONJUGATE GRADIENT METHODS 1. Conjugate gradients and quadratic functions 2. Conjugate gradients and general functions 3. Convergence of conjugate gradient methods 4. Numerical results with conjugate gradients 5. The truncated Newton method 9: OPTIMAL PORTFOLIOS WITH RESTRICTIONS 1. Introduction 2. Transformations to exclude short-selling 3. Results from Minrisk2u and Maxret2u 4. 
Upper and lower limits on invested fractions 10: LARGER-SCALE PORTFOLIOS 1. Introduction 2. Portfolios with increasing numbers of assets 3. Time-variation of optimal portfolios 4. Performance of optimized portfolios 11: DATA-FITTING AND THE GAUSS-NEWTON METHOD 1. Data fitting problems 2. The Gauss-Newton method 3. Least-squares in time series analysis 4. Gauss-Newton applied to time series 5. Least-squares forms of minimum-risk problems 6. Gauss-Newton applied to Minrisk1 and Minrisk2 12: EQUALITY CONSTRAINED OPTIMIZATION 1. Portfolio problems with equality constraints 2. Optimality conditions 3. A worked example 4. Interpretation of Lagrange multipliers 5. Some example problems 13: LINEAR EQUALITY CONSTRAINTS 1. Equality constrained quadratic programming 2. Solving minimum-risk problems as EQPs 3. Reduced-gradient methods 4. Projected gradient methods 5. Results with methods for linear constraints 14: PENALTY FUNCTION METHODS 1. Introduction 2. Penalty functions 3. The Augmented Lagrangian 4. Results with P-SUMT and AL-SUMT 5. Exact penalty functions 15: SEQUENTIAL QUADRATIC PROGRAMMING 1. Introduction 2. Quadratic/linear models 3. SQP methods based on penalty functions 4. Results with AL-SQP 5. SQP line searches and the Maratos effect 16: FURTHER PORTFOLIO PROBLEMS 1. Including transaction costs 2. A re-balancing problem 3. A sensitivity problem 17: INEQUALITY CONSTRAINED OPTIMIZATION 1. Portfolio problems with inequality constraints 2. Optimality conditions 3. Transforming inequalities to equalities 4. Transforming inequalities to simple bounds 5. Example problems 18: EXTENDING EQUALITY-CONSTRAINT METHODS 1. Inequality constrained quadratic programming 2. Reduced gradients for inequality constraints 3. Penalty functions for inequality constraints 4. AL-SUMT for inequality constraints 5. SQP for inequality constraints 6. Results with P-SUMT, AL-SUMT and AL-SQP 19: BARRIER FUNCTION METHODS 1. Introduction 2. Barrier functions 3. Numerical results with B-SUMT 20: INTERIOR POINT METHODS 1. Introduction 2. 
Approximate solutions of problem B-NLP 3. An interior point algorithm 4. Numerical results with IPM 21: DATA FITTING USING INEQUALITY CONSTRAINTS 1. Minimax approximation 2. Trend channels for time series data 22: PORTFOLIO RE-BALANCING AND OTHER PROBLEMS 1. Re-balancing allowing for transaction costs 2. Downside risk 3. Worst-case analysis 23: GLOBAL UNCONSTRAINED OPTIMIZATION 1. Introduction 2. Multi-start methods 3. DIRECT 4. Numerical examples 5. Global optimization in portfolio selection Appendix References Index.
  • (source: Nielsen Book Data)9781402081101 20160528
The book introduces the key ideas behind practical nonlinear optimization. Computational finance - an increasingly popular area of mathematics degree programs - is combined here with the study of an important class of numerical techniques. The financial content of the book is designed to be relevant and interesting to specialists. However, this material, which occupies about one-third of the text, is also sufficiently accessible to allow the book to be used on optimization courses of a more general nature. The essentials of most currently popular algorithms are described, and their performance is demonstrated on a range of optimization problems arising in financial mathematics. Theoretical convergence properties of methods are stated, and formal proofs are provided in enough cases to be instructive rather than overwhelming. Practical behavior of methods is illustrated by computational examples and discussions of efficiency, accuracy and computational costs. Supporting software for the examples and exercises is available (but the text does not require the reader to use or understand these particular codes). The author has been active in optimization for over thirty years in algorithm development and application and in teaching and research supervision.
(source: Nielsen Book Data)9781402081101 20160528
Engineering Library (Terman)
xvii, 554 p. : ill. ; 25 cm.
  • Foreword. Preface Contributors. PART I: INTRODUCTION TO METAHEURISTICS AND PARALLELISM. 1. An Introduction to Metaheuristic Techniques (C. Blum, et al.). 2. Measuring the Performance of Parallel Metaheuristics (E. Alba & G. Luque). 3. New Technologies in Parallelism (E. Alba & A. Nebro). 4. Metaheuristics and Parallelism (E. Alba, et al.). PART II: PARALLEL METAHEURISTIC MODELS. 5. Parallel Genetic Algorithms (G. Luque, et al.). 6. Parallel Genetic Programming (F. Fernandez, et al.). 7. Parallel Evolution Strategies (G. Rudolph). 8. Parallel Ant Colony Algorithms (S. Janson, et al.). 9. Parallel Estimation of Distribution Algorithms (J. Madera, et al.). 10. Parallel Scatter Search (F. Garcia, et al.). 11. Parallel Variable Neighborhood Search (J. Moreno-Perez, et al.). 12. Parallel Simulated Annealing (M. Aydin, V. Yigit). 13. Parallel Tabu Search (T. Crainic, et al.). 14. Parallel Greedy Randomized Adaptive Search Procedures (M. Resende & C. Ribeiro). 15. Parallel Hybrid Metaheuristics (C. Cotta, et al.). 16. Parallel Multi-Objective Optimization (A. Nebro, et al.). 17. Parallel Heterogeneous Metaheuristics (F. Luna, et al.). PART III: THEORY AND APPLICATIONS. 18. Theory of Parallel Genetic Algorithms (E. Cantu-Paz). 19. Parallel Metaheuristics Applications (T. Crainic & N. Hail). 20. Parallel Metaheuristics in Telecommunications (S. Nesmachnow, et al.). 21. Bioinformatics and Parallel Metaheuristics (O. Trelles, A. Rodriguez). Index.
  • (source: Nielsen Book Data)9780471678069 20160528
This title looks at solving complex optimization problems with parallel metaheuristics. "Parallel Metaheuristics" brings together an international group of experts in parallelism and metaheuristics to provide a much-needed synthesis of these two fields. Readers discover how metaheuristic techniques can provide useful and practical solutions for a wide range of problems and application domains, with an emphasis on the fields of telecommunications and bioinformatics. This volume fills a long-existing gap, allowing researchers and practitioners to develop efficient metaheuristic algorithms to find solutions. The book is divided into three parts. Part One: Introduction to Metaheuristics and Parallelism includes: an Introduction to Metaheuristic Techniques, Measuring the Performance of Parallel Metaheuristics, New Technologies in Parallelism, and a head-to-head discussion on Metaheuristics and Parallelism. Part Two: Parallel Metaheuristic Models includes: Parallel Genetic Algorithms, Parallel Genetic Programming, Parallel Evolution Strategies, Parallel Ant Colony Algorithms, Parallel Estimation of Distribution Algorithms, Parallel Scatter Search, Parallel Variable Neighborhood Search, Parallel Simulated Annealing, Parallel Tabu Search, Parallel GRASP, Parallel Hybrid Metaheuristics, Parallel Multi-Objective Optimization, and Parallel Heterogeneous Metaheuristics. Part Three: Theory and Applications includes Theory of Parallel Genetic Algorithms, Parallel Metaheuristics Applications, Parallel Metaheuristics in Telecommunications, and a final chapter on Bioinformatics and Parallel Metaheuristics. Each self-contained chapter begins with clear overviews and introductions that bring the reader up to speed, describes basic techniques, and ends with a reference list for further study. 
Packed with numerous tables and figures to illustrate the complex theory and processes, this comprehensive volume also includes numerous practical real-world optimization problems and their solutions. This is essential reading for students and researchers in computer science, mathematics, and engineering who deal with parallelism, metaheuristics, and optimization in general.
(source: Nielsen Book Data)9780471678069 20160528
Wiley Online Library
Engineering Library (Terman)
xx, 257 p. : fig., tab. ; 25 cm.
  • Preface Table of Notation Chapter 1. Introduction Chapter 2. Line Search Descent Methods for Unconstrained Minimization Chapter 3. Standard Methods for Constrained Optimization Chapter 4. New Gradient-Based Trajectory and Approximation Methods Chapter 5. Example Problems Chapter 6. Some Theorems Chapter 7. The Simplex Method for Linear Programming Problems Bibliography Index.
  • (source: Nielsen Book Data)9780387298245 20160528
This book presents basic optimization principles and gradient-based algorithms to a general audience in a brief and easy-to-read form, without neglecting rigor. The work should enable professionals to apply optimization theory and algorithms to their own particular practical fields of interest, be it engineering, physics, chemistry, or business economics. Most importantly, for the first time in a relatively brief and introductory work, due attention is paid to the difficulties - such as noise, discontinuities, expense of function evaluations, and the existence of multiple minima - that often unnecessarily inhibit the use of gradient-based methods. In a separate chapter on new gradient-based methods developed by the author and his coworkers, it is shown how these difficulties may be overcome without losing the desirable features of classical gradient-based methods.
(source: Nielsen Book Data)9780387298245 20160528
Engineering Library (Terman)