LECTURE: CLASSICAL OPTIMIZATION OVERVIEW


Florida International University
Department of Civil and Environmental Engineering
Optimization in Water Resources Engineering, Spring 2020
LECTURE: CLASSICAL OPTIMIZATION OVERVIEW
Arturo S. Leon, Ph.D., P.E., D.WRE

Optimization problem
Design variables: the variables with which the design problem is parameterized, usually denoted x.
Objective: the quantity that is to be minimized (or maximized), usually denoted f (the "cost function").
Constraint: a condition that has to be satisfied.
  Inequality constraint: g(x) \le 0
  Equality constraint: h(x) = 0
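Putting these pieces together, the generic problem statement (standard notation, with f, g, h as defined above) reads:

  minimize    f(x),  x \in R^n
  subject to  g_j(x) \le 0,  j = 1, ..., m
              h_k(x) = 0,    k = 1, ..., p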

Solving optimization problems
Optimization problems are typically solved using an iterative algorithm. In each cycle the optimizer proposes a design, the analysis model returns the responses (objective and constraint values) and, for gradient-based algorithms, the derivatives of the responses (design sensitivities), which are used to generate the next design.

Defining an optimization problem
1. Choose design variables and their bounds
2. Formulate objective
3. Formulate constraints (restrictions)
4. Choose suitable optimization algorithm

Example – Design of a SODA Can
Design a SODA can to hold a specified amount of SODA and meet other requirements. The cans will be produced in the billions, so it is desirable to minimize the cost of manufacturing. Since the cost is related directly to the surface area of the sheet metal used, it is reasonable to minimize the sheet metal required to fabricate the can.

Example – Design of a SODA Can (Cont.)
Requirements:
1. The diameter of the can should be no more than 8 cm and no less than 3.5 cm.
2. The height of the can should be no more than 18 cm and no less than 8 cm.
3. The can is required to hold at least 400 ml of fluid.

Example – Design of a SODA Can (Cont.)
Design variables:
  D = diameter of the can (cm)
  H = height of the can (cm)
Objective function: the design objective is to minimize the surface area of the sheet metal (the lateral surface plus the two ends),

  f(D, H) = \pi D H + 2 \frac{\pi D^2}{4} = \pi D H + \frac{\pi D^2}{2}

Example – Design of a SODA Can (Cont.)
The constraints must be formulated in terms of the design variables. The first constraint is that the can must hold at least 400 ml of fluid:

  \frac{\pi D^2 H}{4} \ge 400

The other constraints on the size of the can are:

  3.5 \le D \le 8,    8 \le H \le 18

The problem has two independent design variables and five explicit constraints.
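This formulation can be explored numerically. A minimal sketch using MATLAB's fmincon (the starting point and variable names are illustrative, not part of the lecture):

  % Objective: sheet-metal area of the closed can (cm^2), v = [D; H]
  area = @(v) pi*v(1)*v(2) + pi*v(1)^2/2;
  % Volume constraint in fmincon form c(v) <= 0:  400 - pi*D^2*H/4 <= 0
  volcon = @(v) deal(400 - pi*v(1)^2*v(2)/4, []);
  lb = [3.5; 8];  ub = [8; 18];   % size bounds on D and H (cm)
  v0 = [6; 12];                   % illustrative starting guess
  vopt = fmincon(area, v0, [], [], [], [], lb, ub, volcon);

At the solution the volume constraint is active (the can holds exactly 400 ml), since shrinking the can always reduces the surface area.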

Optimization Problem Characteristics
Linearity: nonlinear objective functions can have multiple local optima.
[Figure: plots of f over x1 and x2 showing several local optima]
Challenge: finding the global optimum.

Unimodality
Bracketing and sectioning methods work best for unimodal functions:
"A unimodal function consists of exactly one monotonically increasing part and one monotonically decreasing part."

Convexity
Convex set: "A set S is convex if for every two points x1, x2 in S, the connecting line also lies completely inside S."
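In symbols, a standard restatement of the quoted definition:

  S is convex  \iff  \theta x_1 + (1 - \theta) x_2 \in S   for all x_1, x_2 \in S and all \theta \in [0, 1]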

Lagrange Multipliers
The method of Lagrange multipliers gives a set of necessary conditions to identify optimal points of equality-constrained optimization problems. This is done by converting a constrained problem to an equivalent unconstrained problem with the help of certain unspecified parameters known as Lagrange multipliers.

Finding an Optimum using Lagrange Multipliers
The classical problem formulation

  minimize    f(x_1, x_2, ..., x_n)
  subject to  h_1(x_1, x_2, ..., x_n) = 0

can be converted to

  minimize  L(x, \lambda) = f(x) - \lambda h_1(x)

where
  L(x, \lambda) is the Lagrangian function
  \lambda is an unspecified positive or negative constant called the Lagrange multiplier

Lagrange Multipliers Method
1. The original problem is rewritten as: minimize L(x, \lambda) = f(x) - \lambda h_1(x)
2. Take derivatives of L(x, \lambda) with respect to x_i and set them equal to zero. If there are n variables (i.e., x_1, ..., x_n), this gives n equations with n + 1 unknowns (the n variables x_i and one Lagrange multiplier \lambda).
3. Express all x_i in terms of the Lagrange multiplier \lambda.
4. Substitute x in terms of \lambda into the constraint h_1(x) = 0 and solve for \lambda.
5. Calculate x using the value just found for \lambda.
Note that the n derivatives and one constraint equation result in n + 1 equations for n + 1 unknowns!

Multiple constraints
The Lagrange multiplier method can be used for any number of equality constraints. Suppose we have a classical problem formulation with k equality constraints

  minimize    f(x_1, x_2, ..., x_n)
  subject to  h_1(x_1, x_2, ..., x_n) = 0
              ...
              h_k(x_1, x_2, ..., x_n) = 0

This can be converted into

  minimize  L(x, \lambda) = f(x) - \lambda^T h(x)

where \lambda^T is the transpose of the vector of Lagrange multipliers \lambda, which has length k.

EXAMPLE
A factory manufactures HONDA CITY and HONDA CIVIC cars. Determine the optimal number of HONDA CITY and HONDA CIVIC cars produced if the factory capacity is 90 cars per day and the cost of manufacturing is C(x, y) = 6x^2 + 12y^2, where x is the number of HONDA CITY cars and y is the number of HONDA CIVIC cars produced.

EXAMPLE (Cont.)
VARIABLES:
  x = No. of HONDA CITY cars produced
  y = No. of HONDA CIVIC cars produced
COST of manufacturing: C(x, y) = 6x^2 + 12y^2
OBJECTIVE: minimize cost
CONSTRAINT: 90 cars per day, i.e., h_1(x, y) = x + y - 90 = 0
The original problem is rewritten as:

  minimize  L(x, y, \lambda) = C(x, y) - \lambda h_1(x, y)

EXAMPLE (Cont.)
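Carrying out steps 2–5 of the method for this problem:

  L(x, y, \lambda) = 6x^2 + 12y^2 - \lambda (x + y - 90)
  \partial L / \partial x = 12x - \lambda = 0   =>  x = \lambda / 12
  \partial L / \partial y = 24y - \lambda = 0   =>  y = \lambda / 24
  x + y = 90  =>  \lambda/12 + \lambda/24 = 90  =>  \lambda = 720

so x = 60 HONDA CITY cars and y = 30 HONDA CIVIC cars, with minimum cost C(60, 30) = 6(60)^2 + 12(30)^2 = 32,400.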

Unconstrained optimization algorithms
Single-variable methods:
  0th order (involving only f)
  1st order (involving f and f')
  2nd order (involving f, f', and f'')
Multiple-variable methods:
  Gradient Descent Methods
  Simplex Method
  Sequential Linear Programming
  Sequential Quadratic Programming
  Etc.

Single-variable methods
Bisection method
Optimality condition: a minimum occurs at a stationary point, so the task becomes root finding of f'. Similar to sectioning methods, but uses the derivative f'; the interval is halved in each iteration. Note, this is better than any of the direct (0th-order) methods.
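A minimal MATLAB sketch of bisection on f', here applied to the demo function f(x) = x cos(2x) used later in the lecture (the bracket and tolerance are illustrative; the bracket must be chosen so that f' changes sign inside it):

  fp = @(x) cos(2*x) - 2*x.*sin(2*x);  % f'(x) for f(x) = x*cos(2x)
  a = 1; b = 3; tol = 1e-8;            % illustrative bracket with a sign change
  while b - a > tol
      m = (a + b)/2;                   % midpoint of the current interval
      if sign(fp(m)) == sign(fp(a))
          a = m;                       % root of f' lies in [m, b]
      else
          b = m;                       % root of f' lies in [a, m]
      end
  end
  xstar = (a + b)/2;                   % stationary point of f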

Newton's method
Again, root finding of f'. Basis: Taylor approximation of f':

  f'(x + h) \approx f'(x) + f''(x) h + o(h^2)

New guess from the linear approximation:

  x_{k+1} = x_k + h_k = x_k - f'(x_k) / f''(x_k)
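The same example in MATLAB (the starting point and iteration count are illustrative):

  fp  = @(x) cos(2*x) - 2*x.*sin(2*x);    % f'(x) for f(x) = x*cos(2x)
  fpp = @(x) -4*sin(2*x) - 4*x.*cos(2*x); % f''(x)
  x = 2;                                  % illustrative starting guess
  for k = 1:20
      x = x - fp(x)/fpp(x);               % Newton update on f'
  end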

Newton's method (cont.)
Best convergence of all methods.
[Figure: iterates x_k, x_{k+1}, x_{k+2} on a plot of f' closing in on the root]
Unless it diverges.
[Figure: iterates x_k, x_{k+1}, x_{k+2} stepping away from the root]

Summary of single-variable methods
0th order:
  Bracketing
  Dichotomous sectioning
  Fibonacci sectioning
  Golden ratio sectioning
  Quadratic interpolation
1st order:
  Cubic interpolation
  Bisection method
  Secant method
2nd order:
  Newton method
And many, many more!

MATLAB DEMO: Single-variable Minimization
This demo will show a number of ways to minimize f(x) starting at multiple initial points.
Demo Folder: Single variable Classical Optimization (download file from Canvas)
Demo File: Main File Single Variable.m
Function: f(x) = x cos(2x)

Single-variable Minimization (cont.)
(1) Change starting points
(2) Discuss and show sensitivity of solutions
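The demo files themselves are on Canvas; a minimal stand-in that exercises the same idea with fminsearch (the starting points are illustrative):

  f = @(x) x.*cos(2*x);                  % demo objective
  for x0 = [-2 0 1 2.5]                  % several illustrative starting points
      [xmin, fval] = fminsearch(f, x0);  % derivative-free local minimization
      fprintf('x0 = %5.2f -> x* = %8.4f, f(x*) = %8.4f\n', x0, xmin, fval);
  end

Different starting points land in different local minima, which is exactly the sensitivity point (2) above.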

Multiple-variable methods
GRADIENT DESCENT METHODS
Consider a function J(x), x = (x_1, x_2, ..., x_n). The gradient of J(x) at a point x^0 is a vector of length n:

  \nabla J(x^0) = [ \partial J/\partial x_1 (x^0), \partial J/\partial x_2 (x^0), ..., \partial J/\partial x_n (x^0) ]^T

Each element in the vector is evaluated at the point x^0.
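A minimal steepest-descent sketch in MATLAB (the objective, step size, and stopping rule are illustrative assumptions, not from the lecture):

  J     = @(x) (x(1) - 1)^2 + 10*(x(2) + 2)^2;   % illustrative quadratic objective
  gradJ = @(x) [2*(x(1) - 1); 20*(x(2) + 2)];    % its gradient, derived by hand
  x = [0; 0];                                    % starting point
  alpha = 0.04;                                  % fixed step size (small enough here)
  while norm(gradJ(x)) > 1e-6
      x = x - alpha*gradJ(x);                    % step opposite the gradient
  end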

GRADIENT DESCENT METHODS (cont.)
  Linear Programming
  Simplex Method
  Newton-Raphson Method
  Secant Method
  Bisection Method
  Line Search Methods
  Sequential Linear Programming
  Sequential Quadratic Programming
  Karush-Kuhn-Tucker Conditions (KKT)

SIMPLEX METHOD
Solutions at the "vertices" of the design space are called basic feasible solutions (BFS). The Simplex algorithm moves from BFS to BFS so that the objective always improves.

SEQUENTIAL LINEAR PROGRAMMING
Consider a general nonlinear problem linearized about x^0 via a first-order Taylor series:

  minimize    f(x^0) + \nabla f(x^0)^T \delta x
  subject to  g_j(x^0) + \nabla g_j(x^0)^T \delta x \le 0
              h_k(x^0) + \nabla h_k(x^0)^T \delta x = 0

This is an LP problem with the design variables contained in \delta x. The functions and gradients evaluated at x^0 are constant coefficients.

SEQUENTIAL LINEAR PROGRAMMING (Cont.)
1. Initial guess x^0
2. Linearize about x^0 using a first-order Taylor series
3. Solve the resulting LP to find \delta x
4. Update: x^1 = x^0 + \delta x
5. Linearize about x^1 and repeat:

  x^q = x^{q-1} + \delta x

where \delta x is the solution of the LP (the model linearized about x^{q-1}).
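One iteration of this loop in MATLAB using linprog (the problem, gradients, and move limits are illustrative assumptions; move limits keep the LP bounded):

  f     = @(x) x(1)^2 + x(2)^2;          % illustrative objective
  gradf = @(x) [2*x(1); 2*x(2)];
  g     = @(x) 1 - x(1) - x(2);          % illustrative constraint g(x) <= 0
  gradg = @(x) [-1; -1];
  x0 = [2; 2];                           % current iterate
  ml = 0.5;                              % move limit on each component of dx
  % Linearized subproblem: min gradf'*dx  s.t.  g(x0) + gradg'*dx <= 0
  dx = linprog(gradf(x0), gradg(x0)', -g(x0), [], [], -ml*[1;1], ml*[1;1]);
  x1 = x0 + dx;                          % SLP update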

SEQUENTIAL QUADRATIC PROGRAMMING
1. Create a quadratic approximation to the Lagrangian
2. Create linear approximations to the constraints
3. Solve the quadratic problem to find the search direction, S
4. Perform the 1-D search
5. Update the approximation to the Lagrangian

Newton method
Expand f(x) by its Taylor series about the point x_k:

  f(x_k + \delta x) \approx f(x_k) + g_k^T \delta x + (1/2) \delta x^T H_k \delta x

where the gradient is the vector

  g_k = \nabla f(x_k) = [ \partial f/\partial x_1 (x_k), ..., \partial f/\partial x_n (x_k) ]^T

and the Hessian is the symmetric matrix

  H_k = \nabla^2 f(x_k),  with entries [H_k]_{ij} = \partial^2 f / (\partial x_i \partial x_j) (x_k)

Newton method (Cont.)
For a minimum we require that \nabla f(x) = 0, i.e., g_k + H_k \delta x = 0, with solution \delta x = -H_k^{-1} g_k. This gives the iterative update

  x_{k+1} = x_k - H_k^{-1} g_k

If f(x) is quadratic, then the solution is found in one step.
The method has quadratic convergence (as in the 1D case).
The solution is guaranteed to be a downhill direction provided H_k is positive definite.
Rather than jump straight to the minimum, it is better to perform a line minimization, which ensures global convergence.
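The update in MATLAB (the objective and its hand-derived derivatives are illustrative; the backslash solves the linear system rather than forming the inverse):

  % Illustrative objective f(x) = x1^4 - x1*x2 + x2^2
  gradf = @(x) [4*x(1)^3 - x(2); -x(1) + 2*x(2)];
  hessf = @(x) [12*x(1)^2, -1; -1, 2];
  x = [1; 1];                          % illustrative starting point
  for k = 1:20
      x = x - hessf(x) \ gradf(x);     % Newton step
  end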

Summary of MATLAB Multiple-Variable Methods
fminsearch: finds the minimum of an unconstrained multivariable function using a derivative-free method.
fminunc: nonlinear programming solver; finds the minimum of an unconstrained multivariable function. The gradient and Hessian may be supplied.
lsqnonlin: solves nonlinear least-squares curve-fitting problems of the form

  min_x ||f(x)||_2^2 = min_x ( f_1(x)^2 + f_2(x)^2 + ... + f_n(x)^2 )
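Quick usage sketch of the first two on an illustrative objective:

  f  = @(x) (x(1) - 3)^2 + (x(2) + 1)^2;   % illustrative objective
  x0 = [0, 0];                             % starting point
  x1 = fminsearch(f, x0);                  % derivative-free (Nelder-Mead)
  x2 = fminunc(f, x0);                     % quasi-Newton by default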

MATLAB DEMO: Banana Function Minimization
Minimize Rosenbrock's "banana function"

  f(x) = 100 (x_2 - x_1^2)^2 + (1 - x_1)^2

f(x) is called the banana function because of its curvature around the origin. It is notorious in optimization examples because of the slow convergence most methods exhibit when trying to solve this problem.
f(x) has a unique minimum at the point x = [1, 1], where f(x) = 0.

Banana Function Minimization (cont.)
This demo will show a number of ways to minimize f(x) starting at multiple initial points.
Demo Folder: BananaFunction Classical Optimization (download file from Canvas)
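A minimal stand-in for the demo (the starting points are illustrative):

  banana = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;  % Rosenbrock function
  starts = [-1.9 2; 0 0; 2 -1];                        % illustrative initial points
  for i = 1:size(starts, 1)
      x = fminsearch(banana, starts(i, :));
      fprintf('start (%5.2f, %5.2f) -> x* = (%.4f, %.4f)\n', starts(i, :), x);
  end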
