MADS - Mesh Adaptive Direct Search for Constrained Optimization


Slide 1 - MADS: Mesh Adaptive Direct Search for constrained optimization

Mark Abramson, Charles Audet, Gilles Couture, John Dennis
www.gerad.ca/Charles.Audet/
Thanks to: ExxonMobil, AFOSR, Boeing, LANL, FQRNT, NSERC, SANDIA, NSF.
Rice 2004

Slide 2 - Outline

- Statement of the optimization problem
- The GPS and MADS algorithm classes
- GPS theory and limiting examples
- The MADS algorithm class: an implementable MADS instance, MADS theory, numerical results
- Discussion

Slide 4 - The target problem

(NLP)  minimize f(x) subject to x ∈ Ω,

where f: R^n → R ∪ {+∞} may be discontinuous and:
- the functions are expensive black boxes,
- the functions provide few correct digits and may fail even for x ∈ Ω,
- accurate approximation of derivatives is problematic.

Slide 5 - The target problem (continued)

minimize f(x) subject to x ∈ X,

where f: R^n → R ∪ {+∞} may be discontinuous and:
- the functions are expensive black boxes, often produced by simulations or output of MDO codes,
- the functions provide few correct digits and may fail even for x ∈ X,
- accurate approximation of derivatives is problematic,
- surrogate models s ≈ f and P ≈ X may be available.

Slide 6 - Goals, or validation of the method

(NLP) → NOMAD algorithm → x̂

- if f is continuously differentiable, then ∇f(x̂) = 0;
- if f is convex, then 0 ∈ ∂f(x̂);
- if f is Lipschitz near x̂, then 0 ∈ ∂f(x̂).

Slide 7 - Clarke calculus, for f Lipschitz near x

- The Clarke generalized derivative at x in the direction v:
  f°(x; v) := lim sup_{y → x, t ↓ 0} [f(y + tv) − f(y)] / t.
- The generalized gradient of f at x is the set
  ∂f(x) := {ζ ∈ R^n : f°(x; v) ≥ v^T ζ for all v ∈ R^n}
         = co{lim ∇f(y_i) : y_i → x and ∇f(y_i) exists}.
- f°(x; v) can be obtained from ∂f(x):
  f°(x; v) = max{v^T ζ : ζ ∈ ∂f(x)}.


Slide 9 - The two iterated phases of GPS and MADS

- The global search in the variable space is flexible enough to allow user heuristics that incorporate knowledge of the driving simulation model and facilitate the use of surrogate functions.
- The local poll around the incumbent solution is more rigidly defined, but it ensures convergence to a point satisfying necessary first-order optimality conditions.
- This talk focuses on the basic algorithm and the convergence analysis. In the next talks, Alison, Mark and Gilles will talk about surrogates in the search.

10f"x1x2############### ##!

10f" x1x2############### ## is the incumbent solution!

10f" x1x2############### ## is the incumbent solution!

10f" x1x2############### ## is the incumbent solution!

10f"Surrogate x1x2############### ## is the incumbent solution!

10f"Surrogate , x1x2############### ## !, is the incumbent solutionRice 2004

11f" x1x2############### ## is still the incumbent solution!

11f" x1x2############### ## is still the incumbent solutionWe poll near the incumbent solution!

11f" x1x2############### ## is still the incumbent solutionWe poll near the incumbent solution!

11f" x1x2############### ## ! is still the incumbent solutionWe poll near the incumbent solutionRice 2004

12f" x1x2############### ##! New iteration from the same incumbentsolution, but on a finer meshRice 2004

Slide 13 - Positive spanning sets and meshes

- A positive spanning set D is a set of vectors whose nonnegative linear combinations span R^n.
- The mesh is centered around x_k ∈ R^n and its fineness is parameterized by Δ_k^m > 0 as follows:
  M_k = {x_k + Δ_k^m D z : z ∈ N^{n_D}},
  where n_D is the number of columns of D.
- Example: D = [I, −I].
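The mesh definition above can be sketched in a few lines; the helper name is illustrative, and for brevity it enumerates only the points one unit step along each column of D (the full mesh takes all z ∈ N^{n_D}):

```python
import numpy as np

def unit_mesh_steps(x_k, delta_m, D):
    # Points x_k + delta_m * D z for the unit vectors z = e_j,
    # i.e. one mesh step along each column of D.
    return [x_k + delta_m * D[:, j] for j in range(D.shape[1])]

n = 2
D = np.hstack([np.eye(n), -np.eye(n)])        # the example D = [I, -I]
pts = unit_mesh_steps(np.zeros(n), 0.5, D)    # 4 points at distance 0.5 from x_k
```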

Slide 14 - Basic pattern search algorithm for unconstrained optimization

Given Δ_0^m, x_0 ∈ M_0 with f(x_0) < +∞, and D, for k = 0, 1, ..., do:
1. Employ some finite strategy to try to choose x_{k+1} ∈ M_k such that f(x_{k+1}) < f(x_k), and then set Δ_{k+1}^m = Δ_k^m or Δ_{k+1}^m = 2 Δ_k^m (x_{k+1} is called an improved mesh point);
2. Else, if x_k minimizes f(x) for x ∈ P_k, then set x_{k+1} = x_k and Δ_{k+1}^m = Δ_k^m / 2 (x_k is called a minimal frame center).
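The two steps above can be sketched as a minimal coordinate-search loop (D = [I, −I], empty search). This is an illustrative sketch, not NOMAD: it polls opportunistically and keeps Δ_k^m unchanged on success instead of optionally doubling it.

```python
import numpy as np

def coordinate_search(f, x0, delta0=1.0, tol=1e-6, max_iter=10000):
    n = len(x0)
    D = np.vstack([np.eye(n), -np.eye(n)])   # rows: the 2n poll directions
    x = np.asarray(x0, dtype=float)
    fx, delta = f(x), delta0
    for _ in range(max_iter):
        if delta < tol:                      # mesh is fine enough: stop
            break
        for d in D:                          # poll the frame P_k
            trial = x + delta * d
            f_trial = f(trial)
            if f_trial < fx:                 # improved mesh point
                x, fx = trial, f_trial
                break
        else:                                # minimal frame center:
            delta /= 2.0                     # refine the mesh
    return x, fx

x_best, f_best = coordinate_search(lambda v: (v[0] - 1.0)**2 + (v[1] + 2.0)**2,
                                   [0.0, 0.0])
```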

Slide 15 - The Coordinate Search (CS) frame P_k

P_k = {x_k + Δ_k^m d : d ∈ [I, −I]}; the 2n points adjacent to x_k in M_k.

[Figure: successive frames with poll points p1, ..., p4 around x_k (Δ_k^m = 1), x_{k+1} (Δ_{k+1}^m = 1/2), and x_{k+2} (Δ_{k+2}^m = 1/4)]

Always the same 2n = 4 directions, regardless of k.

Slide 16 - The GPS frame P_k

P_k = {x_k + Δ_k^m d : d ∈ D_k ⊆ D}; points adjacent to x_k in M_k (with respect to the positive spanning set D_k).

[Figure: successive frames with poll points around x_k (Δ_k^m = 1), x_{k+1} (Δ_{k+1}^m = 1/2), and x_{k+2} (Δ_{k+2}^m = 1/4)]

Here, only 14 different ways of selecting D_k, regardless of k.


Slide 18 - Convergence results for unconstrained GPS

If all iterates are in a compact set, then lim inf_{k→∞} Δ_k^m = 0.
- The mesh is refined only at a minimal frame center.
- There is a limit point x̂ of a subsequence {x_k}_{k∈K} of minimal frame centers with {Δ_k^m}_{k∈K} → 0. {x_k}_{k∈K} is called a refining subsequence.
- f(x_k) ≤ f(x_k + Δ_k^m d) for all d ∈ D_k ⊆ D with k ∈ K.
- Let D̂ ⊆ D be the set of poll directions used infinitely often in the refining subsequence. D̂ is the set of refining directions.

Slide 19 - Set of refining directions D̂

[Figure: minimal frame centers x_{k1}, x_{k2}, x_{k3}, x_{k4}, x_{k5}, ... converging to x̂ = lim_{k∈K} x_k; at x_{k1} the poll points are x_{k1} + Δ_{k1}^m d^1, x_{k1} + Δ_{k1}^m d^2 and x_{k1} + Δ_{k1}^m d^3; D̂ collects the directions polled infinitely often]

Slide 20 - Convergence results for unconstrained GPS

If all iterates are in a compact set, then for any refining subsequence (with limit x̂ and refining directions D̂):
- f is Lipschitz near x̂ ⟹ f°(x̂; d) ≥ 0 for every d ∈ D̂.
  This says that the Clarke derivatives are nonnegative on a finite set of directions that positively span R^n:
  f°(x̂; d) := lim sup_{y → x̂, t ↓ 0} [f(y + td) − f(y)] / t ≥ lim_{k∈K} [f(x_k + Δ_k^m d) − f(x_k)] / Δ_k^m ≥ 0.
- f is regular at x̂ ⟹ f′(x̂; d) ≥ 0 for every d ∈ D̂.
- f is strictly differentiable at x̂ ⟹ ∇f(x̂) = 0.

Slide 21 - Limitations of GPS

GPS methods are directional, so the restriction to a finite set of directions is a big limitation, particularly when dealing with nonlinear constraints.

GPS with empty search: the iterates stall at x_0 = (1, 1)^T. [Figure: level sets of the objective; the expression for f is garbled in the transcription.]

Even with a C^1 function, GPS may generate infinitely many limit points, some of them non-stationary.

Slide 22 - GPS convergence to a bad solution

[Figure: level sets near the origin, with the nonzero gradient ∇f(0, 0) drawn at (0, 0)]

GPS iterates, with a bad strategy, converge to the origin, where the gradient exists and is nonzero (f is differentiable at (0, 0) but not strictly differentiable).


Slide 24 - The MADS frames

In addition to the mesh size parameter Δ_k^m, we introduce the poll size parameter Δ_k^p ≥ Δ_k^m. Example: Δ_k^p = √(Δ_k^m).

[Figure: successive frames with poll points p1, p2, p3 around x_k (Δ_k^p = Δ_k^m = 1), x_{k+1} (Δ_{k+1}^p = 1/2, Δ_{k+1}^m = 1/4), and x_{k+2} (Δ_{k+2}^p = 1/4, Δ_{k+2}^m = 1/16)]

The number of ways of selecting D_k increases as Δ_k^p gets smaller.

Slide 25 - Barrier approach to constraints

To enforce the Ω constraints, replace f by a barrier objective:
f_Ω(x) := f(x) if x ∈ Ω, and f_Ω(x) := +∞ otherwise.

This is a standard construct in nonsmooth optimization. Then apply the unconstrained algorithm to f_Ω. This is NOT a standard construct in optimization algorithms.

The quality of the limit solution depends on the local smoothness of f, not of f_Ω.
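A minimal sketch of the barrier construct; `make_barrier` is a hypothetical helper (not NOMAD's API), exercised here on the disk constraint that appears later in the talk:

```python
import math

def make_barrier(f, in_omega):
    # Barrier objective f_Omega: f on Omega, +infinity outside.
    # The constraint enters only through the yes/no oracle in_omega,
    # and f is never evaluated at infeasible points.
    def f_omega(x):
        return f(x) if in_omega(x) else math.inf
    return f_omega

# Disk-constrained example: min x + y subject to x^2 + y^2 <= 6.
f_omega = make_barrier(lambda v: v[0] + v[1],
                       lambda v: v[0]**2 + v[1]**2 <= 6.0)
```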

Slide 26 - A MADS instance

Note: GPS = MADS with Δ_k^p = Δ_k^m.

An implementable way to generate D_k:
- Let B be a lower triangular nonsingular random integer matrix.
- Randomly permute the rows of B.
- Complete to a positive basis: D_k = [B, −B] (maximal positive basis, 2n directions), or D_k = [B, −∑_{b∈B} b] (minimal positive basis, n + 1 directions).
- Use Luis' talk to order the poll directions.
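The recipe above can be sketched as follows; the integer entry range and the seed are illustrative assumptions and do not reproduce the exact LTMADS distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

def mads_poll_directions(n, minimal=False):
    # Lower triangular random integer matrix, forced nonsingular by
    # placing nonzero entries on the diagonal (entry range is illustrative).
    B = np.tril(rng.integers(-3, 4, size=(n, n)))
    np.fill_diagonal(B, rng.choice([-3, -2, -1, 1, 2, 3], size=n))
    B = B[rng.permutation(n), :]          # randomly permute the rows
    if minimal:
        # Minimal positive basis: append the negative sum of the columns.
        return np.hstack([B, -B.sum(axis=1, keepdims=True)])
    return np.hstack([B, -B])             # maximal positive basis [B, -B]

D_max = mads_poll_directions(3)                 # 2n = 6 directions
D_min = mads_poll_directions(3, minimal=True)   # n + 1 = 4 directions
```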

Slide 27 - Dense polling directions

Theorem 1. As k → ∞, MADS's polling directions form a dense set in R^n (with probability 1).

The ultimate goal is a way to be sure that the subset of refining directions D̂ is dense. Then the barrier approach to constraints promises strong optimality under weak assumptions: the existence of a hypertangent vector, i.e., a vector that makes a negative inner product with all the active constraint gradients.

Slide 28 - MADS convergence results

Let f be Lipschitz near a limit x̂ of a refining subsequence.

Theorem 2. Suppose that D̂ is dense in R^n. If either Ω = R^n or x̂ ∈ int(Ω), then 0 ∈ ∂f(x̂).

Theorem 3. Suppose that D̂ is dense in the hypertangent cone T_Ω^H(x̂) ≠ ∅. Then x̂ is a Clarke stationary point of f over Ω:
f°(x̂; v) ≥ 0 for all v ∈ T_Ω^Cl(x̂).
In addition, if f is strictly differentiable at x̂ and if Ω is regular at x̂, then x̂ is a contingent KKT stationary point of f over Ω:
∇f(x̂)^T v ≥ 0 for all v ∈ T_Ω^Co(x̂).

Slide 29 - A problem for which GPS stagnates

[Figure: surface plot of the objective over (x, y); GPS stagnates away from the minimizer]

Slide 30 - Our results

[Figure: numerical results]

Slide 31 - Results for a chemE PID problem

[Figure: numerical results]

Slide 32 - Constrained optimization

A disk-constrained problem:
min_{x,y} x + y  subject to  x² + y² ≤ 6.

How hard can that be? Very hard for GPS and filter-GPS with the standard 2n directions and an empty search.

Slide 33 - Dynamic 2n directions

[Figure: progress on the disk problem; curves for GPS filter, GPS barrier, and MADS versus the number of function evaluations]

Slide 34 - Parameter fit in a rheology problem

Rheology is a branch of mechanics that studies the properties of materials which determine their response to mechanical force.

MODEL: the viscosity η of a material can be modelled as a function of the shear rate γ̇:
η(γ̇) = η_0 (1 + λ² γ̇²)^((β − 1)/2).

A parameter fit problem.

Slide 35 - [Table: 13 observations of strain rate γ̇_i (s⁻¹) and viscosity η_i (Pa·s); the values are garbled in the transcription]

The unconstrained optimization problem:
min_{η_0, λ, β} g(η_0, λ, β)  with  g = ∑_{i=1}^{13} |η(γ̇_i) − η_i|.
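The fit can be sketched as follows. Because the observation table is garbled in this transcription, the data below are hypothetical placeholders generated from the model itself, not the original 13 measurements:

```python
import numpy as np

def eta(gdot, eta0, lam, beta):
    # Viscosity model: eta(gdot) = eta0 * (1 + lam^2 * gdot^2)^((beta - 1) / 2)
    return eta0 * (1.0 + lam**2 * gdot**2) ** ((beta - 1.0) / 2.0)

def g(params, gdot_obs, eta_obs):
    # Misfit g = sum_i |eta(gdot_i) - eta_i|
    eta0, lam, beta = params
    return float(np.sum(np.abs(eta(gdot_obs, eta0, lam, beta) - eta_obs)))

# Hypothetical placeholder data (NOT the data of the talk).
gdot_obs = np.array([0.1, 1.0, 10.0, 100.0])
eta_obs = eta(gdot_obs, eta0=75.0, lam=6.0, beta=0.5)
```

A derivative-free method such as MADS would then minimize g over (η_0, λ, β).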

Slide 36 - Coordinate search

[Figure: objective value f versus number of function evaluations (up to 2500) for four strategies: fixed complete, fixed opportunist, dynamic opportunist, and dynamic opportunist with cache]

Slide 37 - GPS with n + 1 directions

[Figure: objective value f versus number of function evaluations (up to 2500) for three strategies: no search, LHS search, and random search]

Slide 38 - MADS with n + 1 directions

[Figure: objective value f versus number of function evaluations (up to 2500) for three strategies: no search, LHS search, and random search]

Slide 39 - Discussion

- The MADS variant looks good as a first try, and is more general than the instance shown here.
- Numerically, randomness is a blessing and a curse.
- MADS can handle oracular or yes/no constraints.
- The underlying mesh is finer in MADS than in GPS: good for general searches and surrogates.
- MADS is the result of nonsmooth analysis pointing up the weaknesses in GPS.

Slide 40 - Later today

- This is a meeting about surrogates, but I did not talk about surrogates. Alison will present in the next talk the use of a surrogate in a specific mechanical engineering problem using GPS/MADS.
- MADS replaces GPS in our NOMADm and NOMAD software. Gilles and Mark will present a demo of this software after lunch.

