Adaptive Scalarization Methods in Multiobjective Optimization

By Gabriele Eichfelder

This book presents adaptive solution methods for multiobjective optimization problems based on parameter-dependent scalarization approaches. With the help of sensitivity results, an adaptive parameter control is developed such that high-quality approximations of the efficient set are generated. These examinations are based on a special scalarization approach, but the application of these results to many other well-known scalarization methods is also presented. Thereby very general multiobjective optimization problems are considered, with an arbitrary partial ordering defined by a closed pointed convex cone in the objective space. The effectiveness of these new methods is demonstrated with several test problems as well as with a recent problem in intensity-modulated radiotherapy. The book concludes with a further application: a procedure for solving multiobjective bilevel optimization problems is given and applied to a bicriteria bilevel problem in medical engineering.



Similar linear programming books

Integer Programming: Theory and Practice

Integer Programming: Theory and Practice contains refereed articles that explore both theoretical aspects of integer programming and major applications. The volume begins with a description of new constructive and iterative search methods for solving the Boolean optimization problem (BOOP).

Extrema of Smooth Functions: With Examples from Economic Theory

It is not an exaggeration to state that most problems dealt with in economic theory can be formulated as problems in optimization theory. This holds true for the paradigm of "behavioral" optimization in the pursuit of individual self-interest and societally efficient resource allocation, as well as for equilibrium paradigms, where existence and stability problems in dynamics can often be stated as "potential" problems in optimization.

Variational and Non-variational Methods in Nonlinear Analysis and Boundary Value Problems

This book reflects a significant part of the authors' research activity during the last ten years. The present monograph is built on results obtained by the authors through direct cooperation, or by the authors individually or in cooperation with other mathematicians. All of these results fit into a unitary scheme that gives this work its structure.

Optimization on Low Rank Nonconvex Structures

Global optimization is one of the fastest-developing fields in mathematical optimization. In fact, many remarkably efficient deterministic algorithms have been proposed in the last ten years for solving several classes of large-scale, specially structured problems encountered in areas such as chemical engineering, financial engineering, location and network optimization, production and inventory control, engineering design, computational geometry, and multi-objective and multi-level optimization.

Additional info for Adaptive Scalarization Methods in Multiobjective Optimization (Vector Optimization)

Example text

The ε-constraint scalarization (Pk(ε)) reads:

min f_k(x) subject to the constraints f_i(x) ≤ ε_i, i ∈ {1, …, m} \ {k}, x ∈ Ω.   (2.24)

It is easy to see that this is just a special case of the Pascoletti–Serafini scalarization for the ordering cone K = ℝ^m_+. We even get a connection with respect to the Lagrange multipliers. Let the assumptions of the preceding theorem hold and let K = ℝ^m_+, C = ℝ_+, and Ŝ = S = ℝ^n. A point x̄ is a minimal solution of (Pk(ε)) with Lagrange multipliers μ̄_i ∈ ℝ_+ for i ∈ {1, …, m} \ {k}, ν̄ ∈ ℝ^p_+, and ξ̄ ∈ ℝ^q, if and only if (f_k(x̄), x̄) is a minimal solution of (SP(a, r)) with Lagrange multipliers (μ̄, ν̄, ξ̄) with μ̄_k = 1 and a_i = ε_i for all i ∈ {1, …, m} \ {k}.
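The ε-constraint problem above can be sketched numerically. The following is a minimal illustration on a made-up bicriteria toy problem (the objectives, bounds, and ε value are placeholders, not taken from the book), using a generic NLP solver:

```python
# Sketch of the eps-constraint scalarization (P_k(eps)): minimize f_k
# subject to f_i(x) <= eps_i for all i != k, over a toy feasible set
# Omega = [0, 2]. All problem data here are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

# Two convex objectives on Omega = [0, 2]
f = [lambda x: x[0] ** 2,             # f_0
     lambda x: (x[0] - 2.0) ** 2]     # f_1

def eps_constraint(k, eps):
    """Minimize f_k subject to f_i(x) <= eps[i] for i != k."""
    cons = [{"type": "ineq", "fun": (lambda x, i=i: eps[i] - f[i](x))}
            for i in range(len(f)) if i != k]
    return minimize(f[k], x0=np.array([1.5]),
                    bounds=[(0.0, 2.0)], constraints=cons)

# One choice of eps; sweeping eps traces an approximation of the
# efficient set of this toy problem.
res = eps_constraint(k=0, eps={1: 1.0})
```

Here the constraint f_1(x) ≤ 1 forces x ≥ 1, so minimizing f_0(x) = x² yields x̄ = 1; varying the bound moves the solution along the Pareto front.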

Let x̄ be K-minimal for (MOP) and define a hyperplane H = {y ∈ ℝ^m | bᵀy = β} with b ∈ ℝ^m \ {0_m} and β ∈ ℝ. Let r ∈ K with bᵀr ≠ 0 be arbitrarily given. Then there is a parameter a ∈ H and some t̄ ∈ ℝ so that (t̄, x̄) is a minimal solution of (SP(a, r)). This holds for instance for

t̄ = (bᵀf(x̄) − β) / (bᵀr)  and  a = f(x̄) − t̄ r.

Proof. For t̄ = (bᵀf(x̄) − β)/(bᵀr) and a = f(x̄) − t̄ r we have a ∈ H, and the point (t̄, x̄) is feasible for (SP(a, r)). We assume that (t̄, x̄) is not a minimal solution of (SP(a, r)).
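The two formulas for t̄ and a can be checked numerically. A minimal sketch with arbitrary illustrative choices of b, β, r, and f(x̄) (none of these values come from the book; only the formulas themselves do):

```python
# Numerical check of t = (b'f(x) - beta)/(b'r) and a = f(x) - t*r:
# the resulting parameter a must lie on the hyperplane H = {y : b'y = beta}.
import numpy as np

b, beta = np.array([1.0, 1.0]), 2.0   # hyperplane H: b'y = beta
r = np.array([1.0, 1.0])              # direction r with b'r != 0
fx = np.array([3.0, 1.5])             # an illustrative value f(x_bar)

t_bar = (b @ fx - beta) / (b @ r)
a = fx - t_bar * r

# a lies on H by construction: b'a = b'f(x) - t*(b'r) = beta
assert np.isclose(b @ a, beta)
```

The last assertion mirrors the first step of the proof: substituting the formula for t̄ into bᵀa cancels bᵀf(x̄) and leaves exactly β.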

Problem (2.22), with variables t ∈ ℝ, x ∈ Ω, s ∈ ℝ^{m−1}, is solved for each j ∈ {1, …, m−1}, with minimal solution (t^{max,j}, x^{max,j}, s^{max,j}) and minimal value −s_j^{max,j}. We get

H⁰ := { y ∈ ℝ^m | y = Σ_{i=1}^{m−1} s_i v^i, s_i ∈ [s_i^{min,i}, s_i^{max,i}], i = 1, …, m−1 }

with H̃ ⊂ H⁰. This is a suitable restriction of the parameter set H, as the following lemma shows.

Lemma 2.20. Let x̄ be a K-minimal solution of the multiobjective optimization problem (MOP). Let r ∈ K \ {0_m}. Then there is a parameter ā ∈ H⁰ and some t̄ ∈ ℝ so that (t̄, x̄) is a minimal solution of (SP(ā, r)).
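Once the box bounds on the coefficients s_i are known, the restricted parameter set H⁰ can be discretized to generate scalarization parameters. A minimal sketch for m = 2, with a placeholder basis vector and placeholder bounds (not computed from problem (2.22)):

```python
# Sketch: discretize H0 = { sum_i s_i * v^i : s_i in [lo_i, hi_i] } on a
# uniform grid of coefficient values; each grid point gives one parameter a
# for the scalar problem (SP(a, r)). Basis and bounds are illustrative.
import itertools
import numpy as np

v = [np.array([1.0, -1.0])]            # basis vectors v^1, ..., v^{m-1}
s_min, s_max = [-1.0], [1.0]           # bounds on each coefficient s_i
grid = [np.linspace(lo, hi, 5) for lo, hi in zip(s_min, s_max)]

# Every coefficient tuple yields one parameter a in H0
params = [sum(s_i * v_i for s_i, v_i in zip(s, v))
          for s in itertools.product(*grid)]
```

In the adaptive methods of the book this uniform grid would be replaced by a sensitivity-driven parameter control; the sketch only shows how H⁰ serves as the parameter pool.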

