
Received March 13, 2020, accepted March 30, 2020, date of publication April 27, 2020, date of current version May 22, 2020.


Digital Object Identifier 10.1109/ACCESS.2020.2990567

Pymoo: Multi-Objective Optimization in Python


JULIAN BLANK AND KALYANMOY DEB, (Fellow, IEEE)
Computer Science and Engineering, Michigan State University, East Lansing, MI 48824, USA
Corresponding author: Julian Blank ([email protected])
This work was supported by the Koenig Endowed Chair Grant at Michigan State University, East Lansing, USA.

ABSTRACT Python has become the programming language of choice for research and industry projects
related to data science, machine learning, and deep learning. Since optimization is an inherent part of these
research fields, more optimization-related frameworks have arisen in the past few years. Only a few of them
support optimization of multiple conflicting objectives at a time, and even fewer provide comprehensive tools
for a complete multi-objective optimization task. To address this issue, we have developed pymoo, a multi-
objective optimization framework in Python. We provide a guide to getting started with our framework
by demonstrating the implementation of an exemplary constrained multi-objective optimization scenario.
Moreover, we give a high-level overview of the architecture of pymoo to show its capabilities followed by an
explanation of each module and its corresponding sub-modules. The implementations in our framework are
customizable and algorithms can be modified/extended by supplying custom operators. Moreover, a variety
of single-, multi- and many-objective test problems are provided, and gradients can be retrieved by automatic
differentiation out of the box. Also, pymoo addresses practical needs, such as the parallelization of function
evaluations, methods to visualize low and high-dimensional spaces, and tools for multi-criteria decision
making. For more information about pymoo, readers are encouraged to visit: https://pymoo.org.

INDEX TERMS Customization, genetic algorithm, multi-objective optimization, python.

I. INTRODUCTION

Optimization plays an essential role in many scientific areas, such as engineering, data analytics, and deep learning. These fields are fast-growing and their concepts are employed for various purposes, for instance, gaining insights from large data sets or fitting accurate prediction models. Whenever an algorithm has to handle a significantly large amount of data, an efficient implementation in a suitable programming language is important. Python [1] has become the programming language of choice for the above-mentioned research areas over the last few years, not only because it is easy to use but also because good community support exists. Python is a high-level, cross-platform, and interpreted programming language that focuses on code readability. A large number of high-quality libraries are available and support for any kind of scientific computation is ensured. These characteristics make Python an appropriate tool for many research and industry projects where the investigations can be rather complex.

A fundamental principle of research is to ensure reproducibility of studies and to provide access to materials used in the research, whenever possible. In computer science, this translates to a sketch of an algorithm and the implementation itself. However, the implementation of optimization algorithms can be challenging, and benchmarking in particular is time-consuming. Having access to either a good collection of different source codes or a comprehensive library is time-saving and avoids an error-prone implementation from scratch.

To address this need for multi-objective optimization in Python, we introduce pymoo. The goal of our framework is not only to provide state-of-the-art optimization algorithms, but also to cover different aspects related to the optimization process itself. We have implemented single-, multi- and many-objective test problems which can be used as a test-bed for algorithms. In addition to the objective and constraint values of test problems, gradient information can be retrieved through automatic differentiation [2]. Moreover, a parallelized evaluation of solutions can be implemented through vectorized computations, multi-threaded execution, and distributed computing. Further, pymoo provides implementations of performance indicators to measure the quality of results obtained by a multi-objective optimization algorithm. Tools for an explorative analysis through visualization of lower and higher-dimensional data are available, and multi-criteria decision making methods guide the selection of a single solution from a solution set based on preferences.

The associate editor coordinating the review of this manuscript and approving it for publication was Halil Yetgin.

Our framework is designed to be extendable due to its modular implementation. For instance, a genetic algorithm is assembled in a plug-and-play manner by making use of specific sub-modules, such as initial sampling, mating selection, crossover, mutation and survival selection. Each sub-module takes care of an aspect independently and, therefore, variants of algorithms can be initiated by passing different combinations of sub-modules. This concept allows end-users to incorporate domain knowledge through custom implementations. For example, in an evolutionary algorithm a biased initial sampling module created with the knowledge of domain experts can guide the initial search.

Furthermore, we would like to mention that our framework is well-documented, with a large number of available code-snippets. We created a starter's guide for users to become familiar with our framework and to demonstrate its capabilities. As an example, it shows the optimization results of a bi-objective optimization problem with two constraints. An extract from the guide will be presented in this paper. Moreover, we provide an explanation of each algorithm and source code to run it on a suitable optimization problem in our software documentation. Additionally, we show a definition of test problems and provide a plot of their fitness landscapes. The framework documentation is built using Sphinx [3] and correctness of modules is ensured by automatic unit testing [4]. Most algorithms have been developed in collaboration with the second author and have been benchmarked extensively against the original implementations.

In the remainder of this paper, we first present related existing optimization frameworks in Python and in other programming languages. Then, we provide a guide to getting started with pymoo in Section III which covers the most important steps of our proposed framework. In Section IV we illustrate the framework architecture and the corresponding modules, such as problems, algorithms and related analytics. Each of the modules is then discussed separately in Sections V to VII. Finally, concluding remarks are presented in Section VIII.

II. RELATED WORKS

In the last decades, various optimization frameworks in diverse programming languages were developed. However, some of them only partially cover multi-objective optimization. In general, the choice of a suitable framework for an optimization task is a multi-objective problem itself. Moreover, some criteria are rather subjective, for instance, the usability and extendibility of a framework; therefore, the assessment regarding such criteria as well as the decision making process differ from user to user. For example, one might have decided on a programming language first, either because of personal preference or a project constraint, and then search for a suitable framework. One might give more importance to the overall features of a framework, for example parallelization or visualization, over the programming language itself. An overview of some existing multi-objective optimization frameworks in Python is listed in Table 1, each of which is described in the following.

TABLE 1. Multi-objective optimization frameworks in Python.

Recently, the well-known multi-objective optimization framework jMetal [5], developed in Java [6], has been ported to a Python version, namely jMetalPy [7]. The authors aim to further extend it and to make use of the full feature set of Python, for instance, data analysis and data visualization. In addition to traditional optimization algorithms, jMetalPy also offers methods for dynamic optimization. Moreover, the post analysis of performance metrics of an experiment with several independent runs is automated.

Parallel Global Multiobjective Optimizer, PyGMO [8], is an optimization library for the easy distribution of massive optimization tasks over multiple CPUs. It uses the generalized island-model paradigm for the coarse-grained parallelization of optimization algorithms and, therefore, allows users to develop asynchronous and distributed algorithms.

Platypus [9] is a multi-objective optimization framework that offers implementations of state-of-the-art algorithms. It enables users to create an experiment with various algorithms and provides post-analysis methods based on metrics and visualization.

Distributed Evolutionary Algorithms in Python (DEAP) [10] is a novel evolutionary computation framework for rapid prototyping and testing of ideas. Even though DEAP does not focus on multi-objective optimization, multi-objective algorithms can be developed due to the modularity and extendibility of the framework. Moreover, parallelization and load-balancing tasks are supported out of the box.

Inspyred [11] is a framework for creating bio-inspired computational intelligence algorithms in Python which is not focused on multi-objective algorithms directly, but on evolutionary computation in general. However, an example for NSGA-II [12] is provided and other multi-objective algorithms can be implemented through the modular implementation of the framework.

If the search for frameworks is not limited to Python, other popular frameworks should be considered: PlatEMO [13] in Matlab, MOEA [14] and jMetal [5] in Java, jMetalCpp [15] and PaGMO [16] in C++. Of course this is not an exhaustive list and readers may search for other available options.


III. GETTING STARTED¹

In the following, we provide a starter's guide for pymoo. It covers the most important steps in an optimization scenario, starting with the installation of the framework, defining an optimization problem, and the optimization procedure itself.

A. INSTALLATION
Our framework pymoo is available on PyPI [17], a central repository that makes Python software packages easily accessible. The framework can be installed by using the package manager:

$ pip install -U pymoo

Some components are available in Python and additionally in Cython [18]. Cython allows developers to annotate existing Python code which is translated to the C or C++ programming languages. The translated files are compiled to a binary executable and can be used to speed up computations. During the installation of pymoo, attempts are made for compilation; however, if unsuccessful due to the lack of a suitable compiler or other reasons, the pure Python version is installed. We would like to emphasize that the compilation is optional and all features are available without it. More detail about the compilation and troubleshooting can be found in our installation guide online.

B. PROBLEM DEFINITION
In general, multi-objective optimization involves several objective functions to be optimized subject to inequality and equality constraints [19]. The goal is to find a set of solutions (variable vectors) that satisfy all constraints and are as good as possible with respect to all objective values. The problem definition in its general form is given by:

    min  f_m(x),              m = 1, .., M,
    s.t. g_j(x) ≤ 0,          j = 1, .., J,
         h_k(x) = 0,          k = 1, .., K,        (1)
         x_i^L ≤ x_i ≤ x_i^U,  i = 1, .., N.

The formulation above defines a multi-objective optimization problem with N variables, M objectives, J inequality, and K equality constraints. Moreover, for each variable x_i, lower and upper variable boundaries (x_i^L and x_i^U) are also defined.

In the following, we illustrate a bi-objective optimization problem with two constraints:

    min  f1(x) = x1² + x2²,
    max  f2(x) = −(x1 − 1)² − x2²,
    s.t. g1(x) = 2 (x1 − 0.1) (x1 − 0.9) ≤ 0,      (2)
         g2(x) = 20 (x1 − 0.4) (x1 − 0.6) ≥ 0,
         −2 ≤ x1 ≤ 2,
         −2 ≤ x2 ≤ 2.

It consists of two objectives (M = 2) where f1(x) is minimized and f2(x) maximized. The optimization is subject to two inequality constraints (J = 2) where g1(x) is formulated as a less-than-equal-to and g2(x) as a greater-than-equal-to constraint. The problem is defined with respect to two variables (N = 2), x1 and x2, which both are in the range [−2, 2]. The problem does not contain any equality constraints (K = 0). Contour plots of the objective functions are shown in Figure 1. The contours of the objective function f1(x) are represented by solid lines and f2(x) by dashed lines. Constraints g1(x) and g2(x) are parabolas which intersect the x1-axis at (0.1, 0.9) and (0.4, 0.6). The Pareto-optimal set is marked by a thick orange line. Through the combination of both constraints the Pareto-set is split into two parts. Analytically, the Pareto-optimal set is given by PS = {(x1, x2) | (0.1 ≤ x1 ≤ 0.4) ∨ (0.6 ≤ x1 ≤ 0.9), x2 = 0} and the efficient front by f2 = (√f1 − 1)², where f1 is defined in [0.01, 0.16] and [0.36, 0.81].

FIGURE 1. Contour plot of the test problem (Equation 2).

In the following, we provide an example implementation of the problem formulation above using pymoo. We assume the reader is familiar with Python and has a fundamental knowledge of NumPy [20], which is utilized to deal with vector and matrix computations.

In pymoo, we consider pure minimization problems for optimization in all our modules. However, without loss of generality, an objective which is supposed to be maximized can be multiplied by −1 and then minimized [19]. Therefore, we minimize −f2(x) instead of maximizing f2(x) in our optimization problem. Furthermore, all constraint functions need to be formulated as less-than-equal-to constraints. For this reason, g2(x) needs to be multiplied by −1 to flip the ≥ to a ≤ relation. We recommend the normalization of constraints to give equal importance to each of them. For g1(x), the constant 'resource' value of the constraint is 2 · (−0.1) · (−0.9) = 0.18 and for g2(x) it is 20 · (−0.4) · (−0.6) = 4.8. We achieve normalization of constraints by dividing g1(x) and g2(x) by the corresponding constant [21].

¹ All source codes in this paper are related to pymoo version 0.4.0. A getting started guide for upcoming versions can be found at https://pymoo.org.


Finally, the optimization problem to be optimized using pymoo is defined by:

    min  f1(x) = x1² + x2²,
    min  f2(x) = (x1 − 1)² + x2²,
    s.t. g1(x) = 2 (x1 − 0.1) (x1 − 0.9) / 0.18 ≤ 0,       (3)
         g2(x) = −20 (x1 − 0.4) (x1 − 0.6) / 4.8 ≤ 0,
         −2 ≤ x1 ≤ 2,
         −2 ≤ x2 ≤ 2.

Next, the derived problem formulation is implemented in Python. Each optimization problem in pymoo has to inherit from the Problem class. First, by calling the super() function, the problem properties such as the number of variables (n_var), objectives (n_obj) and constraints (n_constr) are initialized. Furthermore, lower (xl) and upper variable boundaries (xu) are supplied as a NumPy array. Additionally, the evaluation function _evaluate needs to be overridden in the subclass. The method takes a two-dimensional NumPy array x with n rows and m columns as an input. Each row represents an individual and each column an optimization variable. After doing the necessary calculations, the objective values are added to the dictionary out with the key F and the constraints with the key G, as sketched below.
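A minimal sketch of such a problem definition, assuming the version 0.4.0 API (the pymoo.model.problem import path, in particular, differs in later releases):

import autograd.numpy as anp
from pymoo.model.problem import Problem

class MyProblem(Problem):

    def __init__(self):
        # 2 variables, 2 objectives, 2 constraints, bounds [-2, 2]
        super().__init__(n_var=2,
                         n_obj=2,
                         n_constr=2,
                         xl=anp.array([-2.0, -2.0]),
                         xu=anp.array([2.0, 2.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        # x has one row per individual and one column per variable
        f1 = x[:, 0] ** 2 + x[:, 1] ** 2
        f2 = (x[:, 0] - 1) ** 2 + x[:, 1] ** 2
        # normalized less-than-equal-to constraints (see Equation 3)
        g1 = 2 * (x[:, 0] - 0.1) * (x[:, 0] - 0.9) / 0.18
        g2 = -20 * (x[:, 0] - 0.4) * (x[:, 0] - 0.6) / 4.8
        out["F"] = anp.column_stack([f1, f2])
        out["G"] = anp.column_stack([g1, g2])

problem = MyProblem()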

As mentioned above, pymoo utilizes NumPy [20] for most of its computations. To be able to retrieve gradients through automatic differentiation, we are using a wrapper around NumPy called Autograd [22]. Note that this is not obligatory for a problem definition.

C. ALGORITHM INITIALIZATION
Next, we need to initialize a method to optimize the problem. In pymoo, an algorithm object needs to be created for optimization. For each of the algorithms an API documentation is available and, by supplying different parameters, algorithms can be customized in a plug-and-play manner. In general, the choice of a suitable algorithm for an optimization problem is a challenge itself. Whenever problem characteristics are known beforehand, we recommend using them through customized operators. In our case, however, the optimization problem is rather simple, but the aspect of having two objectives and two constraints should be considered. For this reason, we decided to use NSGA-II [12] with its default configuration and minor modifications. We chose a population size of 40, but instead of generating the same number of offspring, we create only 10 in each generation. This is a steady-state variant of NSGA-II and it is likely to improve the convergence property for rather simple optimization problems without many difficulties, such as the existence of local Pareto-fronts. Moreover, we enable a duplicate check which makes sure that the mating produces offspring which are different with respect to themselves and also to the existing population regarding their variable vectors. To illustrate the customization aspect, we list the other unmodified default operators in the code-snippet below. The constructor of NSGA2 is called with the supplied parameters and returns an initialized algorithm object.
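A sketch of the initialization, again assuming the version 0.4.0 API (the factory helpers and their string keys are taken from that release's documentation):

from pymoo.algorithms.nsga2 import NSGA2
from pymoo.factory import get_sampling, get_crossover, get_mutation

algorithm = NSGA2(
    pop_size=40,                # population size
    n_offsprings=10,            # steady-state-like: only 10 offspring per generation
    sampling=get_sampling("real_random"),
    crossover=get_crossover("real_sbx", prob=0.9, eta=15),
    mutation=get_mutation("real_pm", eta=20),
    eliminate_duplicates=True   # duplicate check during mating
)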
D. OPTIMIZATION
Next, we use the initialized algorithm object to optimize the defined problem. Therefore, the minimize function is called with both instances, problem and algorithm, as parameters. Moreover, we supply the termination criterion of running the algorithm for 40 generations, which will result in 40 + 40 × 10 = 440 function evaluations. In addition, we define a random seed to ensure reproducibility and enable the verbose flag to see printouts for each generation.
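A minimal sketch of this call under the same version 0.4.0 assumptions (the seed value here is illustrative):

from pymoo.optimize import minimize

res = minimize(problem,
               algorithm,
               ('n_gen', 40),   # terminate after 40 generations
               seed=1,          # assumed seed value for reproducibility
               verbose=True)

print(res.F)  # objective values of the non-dominated solutions found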


The method returns a Result object which contains the non-dominated set of solutions found by the algorithm.

The optimization results are illustrated in Figure 2, where the design space is shown in Figure 2a and the objective space in Figure 2b. The solid line represents the analytically derived Pareto set and front in the corresponding space, and the circles the solutions found by the algorithm. It can be observed that the algorithm was able to converge and a set of nearly-optimal solutions was obtained. Some additional post-processing steps and more details about other aspects of the optimization procedure can be found in the remainder of this paper and in our software documentation.

FIGURE 2. Result of the getting started optimization.

The starter's guide showed the steps starting from the installation up to solving an optimization problem. The investigation of a constrained bi-objective problem demonstrated the basic procedure in an optimization scenario.

IV. ARCHITECTURE
Software architecture is fundamentally important to keep source code organized. On the one hand, it helps developers and users to get an overview of existing classes; on the other hand, it allows flexibility and extendibility by adding new modules. Figure 3 visualizes the architecture of pymoo. The first level of abstraction consists of the optimization problems, algorithms and analytics. Each of the modules can be categorized in more detail and consists of multiple sub-modules.

(i) Problems: Optimization problems in our framework are categorized into single, multi, and many-objective test problems. Gradients are available through automatic differentiation and parallelization can be implemented by using a variety of techniques.

(ii) Optimization: Since most of the algorithms are based on evolutionary computation, operators such as sampling, mating selection, crossover and mutation have to be chosen or implemented. Furthermore, because many problems in practice have one or more constraints, a methodology for handling those must be incorporated. Some algorithms are based on decomposition, which splits the multi-objective problem into many single-objective problems. Moreover, when the algorithm is used to solve the problem, a termination criterion must be defined either explicitly or implicitly by the implementation of the algorithm.

(iii) Analytics: During and after an optimization run, analytics support the understanding of data. First, intuitively the design space, objective space, or other metrics can be explored through visualization. Moreover, to measure the convergence and/or diversity of a Pareto-optimal set, performance indicators can be used. For real-parameter problems, the recently proposed theoretical KKT proximity metric [23] computation procedure is included in pymoo to compute the proximity of a solution to the true Pareto-optimal front, despite not knowing its exact location. To support the decision making process, methods are available either for finding points close to an area of interest in the objective space or for finding high trade-off solutions. These can be applied either during an optimization run, to mimic interactive optimization, or as a post analysis.

FIGURE 3. Architecture of pymoo.


In the remainder of the paper, we will discuss each of the modules mentioned above in more detail.

V. PROBLEMS
It is common practice for researchers to evaluate the performance of algorithms on a variety of test problems. Since we know that no single best algorithm for all arbitrary optimization problems exists [24], this helps to identify the problem classes for which an algorithm is suitable. Therefore, a collection of test problems with different numbers of variables, objectives or constraints and of varying complexity becomes handy for algorithm development. Moreover, in a multi-objective context, test problems with different Pareto-front shapes or varying variable density close to the optimal region are of interest.

A. IMPLEMENTATIONS
In our framework, we categorize test problems regarding the number of objectives: single-objective (1 objective), multi-objective (2 or 3 objectives) and many-objective (more than 3 objectives). Test problems implemented in pymoo are listed in Table 2. For each problem the number of variables, objectives, and constraints are indicated. If the test problem is scalable in any of these parameters, we label the problem with (s). If the problem is scalable but a default number was originally proposed, we indicate that with surrounding brackets. In case the category does not apply, for example because we refer to a test problem family with several functions, we use (·).

TABLE 2. Multi-objective optimization test problems.
The implementations in pymoo let end-users define which values of the corresponding problem should be returned. On an implementation level, the evaluate function of a Problem instance takes a list return_values_of which contains the types of values being returned. By default, the objective values "F" and, if the problem has constraints, the constraint violation "CV" are included. The constraint function values can be returned independently by adding "G". This gives developers the flexibility to receive exactly the values that are needed for their methods.

B. GRADIENTS
All our test problems are implemented using Autograd [22]. Therefore, automatic differentiation is supported out of the box. We have shown in Section III how a new optimization problem is defined.

If gradients are desired to be calculated, the prefix "d" needs to be added to the corresponding value of the return_values_of list. For instance, to ask for the objective values and their gradients, return_values_of = ["F", "dF"].

Let us consider the problem we have implemented in Equation 3. The derivative of the objective functions F with respect to each variable is given by:

    ∇F(x) = [ 2 x1          2 x2 ]
            [ 2 (x1 − 1)    2 x2 ]                 (4)

The gradients at the point [0.1, 0.2] can be calculated as sketched below; the returned values can easily be verified to match the analytic gradient derivation. The gradients for the constraint functions can be calculated accordingly by adding "dG" to the return_values_of list.

C. PARALLELIZATION
If evaluation functions are computationally expensive, a serialized evaluation of a set of solutions can become the bottleneck of the overall optimization procedure. For this reason, parallelization is desired to use existing computational resources more efficiently and to distribute long-running calculations. In pymoo, the evaluation function receives a set of solutions if the algorithm is utilizing a population. This empowers the user to implement any kind of parallelization, as long as the objective values for all solutions are written as an output when the evaluation function terminates. In our framework, a couple of possibilities to implement parallelization exist:

(i) Vectorized Evaluation: A common technique to parallelize evaluations is to use matrices where each row represents a solution. Therefore, a vectorized evaluation refers to a column which includes the variables of all solutions. By using vectors, the objective values of all solutions are calculated at once. The code-snippet of the example problem in Section III shows such an implementation using NumPy [20]. To run calculations on a GPU, implementing support for PyTorch [25] tensors can be done with little overhead, given suitable hardware and correctly installed drivers.

(ii) Threaded Loop-wise Evaluation: If the function evaluations should occur independently, a for loop can be used to set the values. By default, the evaluation is serialized and no calculations occur in parallel. By providing a keyword to the evaluation function, pymoo spawns a thread for each evaluation and manages those by using the default thread pool implementation in Python. This behaviour can be used out of the box and the number of parallel threads can be modified (see the sketch after this list).

(iii) Distributed Evaluation: If the evaluation should not be limited to a single machine, the evaluation itself can be distributed to several workers or a whole cluster. We recommend using Dask [26], which enables distributed computations on different levels. For instance, the matrix operation itself can be distributed or a whole function can be outsourced. Similar to the loop-wise evaluation, each individual can be evaluated element-wise by sending it to a worker.
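The pymoo keyword for thread-based evaluation is version-dependent, so the following sketch implements the loop-wise variant directly with Python's standard thread pool inside a problem definition; everything except the assumed version 0.4.0 Problem base class is standard library:

from concurrent.futures import ThreadPoolExecutor
import numpy as np
from pymoo.model.problem import Problem

def expensive_simulation(x):
    # placeholder for a costly, independent evaluation of one solution
    return [x[0] ** 2 + x[1] ** 2, (x[0] - 1) ** 2 + x[1] ** 2]

class ThreadedProblem(Problem):

    def __init__(self):
        super().__init__(n_var=2, n_obj=2, n_constr=0,
                         xl=np.array([-2.0, -2.0]),
                         xu=np.array([2.0, 2.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        # evaluate each row (solution) in its own thread
        with ThreadPoolExecutor(max_workers=8) as pool:
            out["F"] = np.array(list(pool.map(expensive_simulation, x)))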
VI. OPTIMIZATION MODULE
The optimization module provides different kinds of sub-modules to be used in algorithms. Some of them are of a more generic nature, such as decomposition and termination criteria, and others are more related to evolutionary computing. By assembling those modules together, algorithms are built.

A. ALGORITHMS
Available algorithm implementations in pymoo are listed in Table 3. Compared to other optimization frameworks the list of algorithms may look rather short; however, each algorithm is customizable and variants can be initialized with different parameters. For instance, a steady-state NSGA-II [27] can be initialized by setting the number of offspring to 1. This can be achieved by supplying this as a parameter in the initialization method, as shown in Section III. Moreover, it is worth mentioning that many-objective algorithms, such as NSGA-III or MOEA/D, require reference directions to be provided. The reference directions are commonly desired to be uniform or to have a bias towards a region of interest. Our framework offers an implementation of the Das and Dennis method [28] for a fixed number of points (fixed with respect to a parameter often referred to as the partition number) and a recently proposed Riesz-Energy based method which creates a well-spaced point set for an arbitrary number of points and is capable of introducing a bias towards preferred regions in the objective space [29].

TABLE 3. Multi-objective optimization algorithms.
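A brief sketch of how reference directions might be created and passed to NSGA-III; the factory function name and its string keys ("das-dennis", "energy") are assumptions based on the version 0.4.0 documentation:

from pymoo.factory import get_reference_directions
from pymoo.algorithms.nsga3 import NSGA3

# Das and Dennis directions for 3 objectives with 12 partitions (91 points)
ref_dirs = get_reference_directions("das-dennis", 3, n_partitions=12)

# Riesz-Energy based directions for an arbitrary number of points
# ref_dirs = get_reference_directions("energy", 3, n_points=90)

algorithm = NSGA3(ref_dirs=ref_dirs, pop_size=92)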


B. OPERATORS
The following evolutionary operators are available:

(i) Sampling: The initial population is mostly based on sampling. In some cases it is created through domain knowledge and/or some solutions are already evaluated; these can directly be used as an initial population. Otherwise, it can be sampled randomly for real, integer, or binary variables. Additionally, Latin hypercube sampling [41] can be used for real variables.

(ii) Crossover: A variety of crossover operators for different types of variables are implemented. In Figure 4 some of them are presented. Figures 4a to 4d help to visualize the information exchange in a crossover with two parents being involved. Each row represents an offspring and each column a variable. The corresponding boxes indicate whether the values of the offspring are inherited from the first or from the second parent. For one- and two-point crossovers it can be observed that either one or two cuts in the variable sequence exist. Contrarily, the Uniform Crossover (UX) does not have any clear pattern, because each variable is chosen randomly either from the first or from the second parent. For the Half Uniform Crossover (HUX), half of the variables which are different are exchanged. For the purpose of illustration, we have created two parents that have different values in 10 different positions. For real variables, Simulated Binary Crossover [42] is known to be an efficient crossover. It mimics the crossover of binary encoded variables. Figure 4e shows the probability distribution when the parents x1 = 0.2 and x2 = 0.8, where xi ∈ [0, 1], are recombined with η = 0.8. Analogously, in the case of integer variables we subtract 0.5 from the lower and add (0.5 − ε) to the upper bound before applying the crossover and round to the nearest integer afterwards (see Figure 4f).

(iii) Mutation: For real and integer variables Polynomial Mutation [19], [43] and for binary variables Bitflip Mutation [44] are provided.

FIGURE 4. Illustration of some crossover operators for different variable types.

Different problems require different types of operators. In practice, if a problem is supposed to be solved repeatedly and routinely, it makes sense to customize the evolutionary operators to improve the convergence of the algorithm. Moreover, for custom variable types, for instance trees or mixed variables [45], custom operators [46] can be implemented easily and called by the algorithm class. Our software documentation contains examples for custom modules, operators and variable types.

C. TERMINATION CRITERION
For every algorithm it must be determined when a run should terminate. This can simply be based on a predefined number of function evaluations or iterations, or on a more advanced criterion, such as the change of a performance metric over time. For example, we have implemented a termination criterion based on the variable and objective space difference between generations. To make the termination criterion more robust, the last k generations are considered. The largest movement from a solution to its closest neighbour is tracked across generations and, whenever it is below a certain threshold, the algorithm is considered to have converged. Analogously, the movement in the objective space can also be used. In the objective space, however, normalization is more challenging and has to be addressed carefully. The default termination criterion for multi-objective problems in pymoo keeps track of the boundary points in the objective space and uses them, once they have settled down, for normalization. More details about the proposed termination criterion can be found in [47].

D. DECOMPOSITION
Decomposition transforms multi-objective problems into many single-objective optimization problems [48]. Such a technique can either be embedded in a multi-objective algorithm and solved simultaneously, or used independently with a single-objective optimizer. Some decomposition methods are based on the lp-metrics with different p values. For instance, a naive but frequently used decomposition approach is the Weighted-Sum Method (p = 1), which is known to be unable to converge to the non-convex part of a Pareto-front [19]. Moreover, instead of summing values, the Tchebysheff Method (p = ∞) considers only the maximum value of the difference between the ideal point and a solution. Similarly, the Achievement Scalarization Function (ASF) [49] and a modified version, the Augmented Achievement Scalarization Function (AASF) [50], use the maximum of all differences. Furthermore, Penalty Boundary Intersection (PBI) [40] is calculated by a weighted sum of the norm of the projection of a point onto the reference direction and the perpendicular distance. It is also worth noting that normalization is essential for any kind of decomposition. All decomposition techniques mentioned above are implemented in pymoo.
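As a brief illustration, decomposing a set of objective vectors might look as follows; the get_decomposition factory function and its string keys are assumptions based on the version 0.4.0 documentation:

import numpy as np
from pymoo.factory import get_decomposition

F = np.array([[0.1, 0.9], [0.5, 0.5], [0.9, 0.1]])  # objective vectors
weights = np.array([0.5, 0.5])

# scalarize each objective vector; "tchebi", "asf", "aasf" or "pbi" work analogously
d = get_decomposition("weighted-sum").do(F, weights)
print(d.argmin())  # index of the best solution w.r.t. this decomposition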


VII. ANALYTICS

A. PERFORMANCE INDICATORS
For single-objective optimization algorithms the comparison regarding performance is rather simple, because each optimization run results in a single best solution. In multi-objective optimization, however, each run returns a non-dominated set of solutions. To compare sets of solutions, various performance indicators have been proposed in the past [51]. In pymoo the most commonly used performance indicators are provided (a sketch of their usage follows this list):

(i) GD/IGD: Given the Pareto-front PF, the deviation between the non-dominated set S found by the algorithm and the optimum can be measured. Following this principle, the Generational Distance (GD) indicator [52] calculates the average Euclidean distance in the objective space from each solution in S to the closest solution in PF. This measures the convergence of S, but does not indicate whether a good diversity on the Pareto-front has been reached. Similarly, the Inverted Generational Distance (IGD) indicator [52] measures the average Euclidean distance in the objective space from each solution in PF to the closest solution in S. The Pareto-front as a whole needs to be covered by solutions from S to minimize the performance metric. Thus, the lower the GD and IGD values, the better is the set. However, IGD is known to be not Pareto compliant [53].

(ii) GD+/IGD+: A variation of GD and IGD has been proposed in [53]. The Euclidean distance is replaced by a distance measure that takes the dominance relation into account. The authors show that IGD+ is weakly Pareto compliant.

(iii) Hypervolume: Moreover, the dominated portion of the objective space can be used to measure the quality of non-dominated solutions [54]. The higher the hypervolume, the better is the set. Instead of the Pareto-front, a reference point needs to be provided. It has been shown that Hypervolume is Pareto compliant [55]. Because the performance metric becomes computationally expensive in higher dimensional spaces, the exact measure becomes intractable. However, we plan to include some proposed approximation methods in the near future.

Performance indicators are used to compare existing algorithms. Moreover, the development of new algorithms can be driven by the goodness of the different metrics themselves.
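A sketch of computing these indicators, assuming the get_performance_indicator factory available in version 0.4.0:

import numpy as np
from pymoo.factory import get_performance_indicator

pf = np.array([[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]])  # known Pareto-front
F = np.array([[0.1, 1.0], [0.6, 0.55], [1.0, 0.1]])  # obtained set

igd = get_performance_indicator("igd", pf)
hv = get_performance_indicator("hv", ref_point=np.array([1.2, 1.2]))

print("IGD:", igd.calc(F))
print("HV:", hv.calc(F))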
and nadir point [19] is represented by the inner and outer
B. VISUALIZATION
The visualization of intermediate steps or of the final result is inevitable. In multi- and many-objective optimization, visualization of the objective space is of interest so that trade-off information among solutions can easily be experienced from the plots. Depending on the dimension of the objective space, different types of plots are suitable to represent a single solution or a set of solutions. In pymoo the implemented visualizations wrap around the well-known Python plotting library Matplotlib [56]. Keyword arguments provided by Matplotlib itself are still available, which allows modifying, for instance, the color, thickness and opacity of lines, points or other shapes. Therefore, all visualization techniques are customizable and extendable.

For 2 or 3 objectives, scatter plots (see Figures 5a and 5b) can give a good intuition about the solution set. Trade-offs can be observed by considering the distance between two points. It might be desired to normalize each objective to make sure a comparison between values is based on relative and not absolute values. Pairwise Scatter Plots (see Figure 5c) visualize more than 3 objectives by showing each pair of axes independently. The diagonal is used to label the corresponding objectives.

Also, high-dimensional data can be illustrated by Parallel Coordinate Plots (PCP), as shown in Figure 5d. All axes are plotted vertically and represent an objective. Each solution is illustrated by a line from left to right. The intersection of a line and an axis indicates the value of the solution for the corresponding objective. For the purpose of comparison, solution(s) can be highlighted by varying color and opacity.

Moreover, a common practice is to project the higher dimensional objective values onto a 2D plane using a transformation function. Radviz (Figure 5e) visualizes all points in a circle, with the objective axes uniformly positioned around the perimeter. Considering a minimization problem and a set of non-dominated solutions, an extreme point very close to an axis represents the worst solution for that corresponding objective, but is comparably ''good'' in one or many other objectives. Similarly, Star Coordinate Plots (Figure 5f) illustrate the objective space, except that the transformation function allows solutions outside of the circle.

Heatmaps (Figure 5g) are used to represent the goodness of solutions through colors. Each row represents a solution and each column a variable. We leave the choice to the end-user of which color map to use and whether light or dark colors illustrate better or worse solutions. Also, solutions can be sorted lexicographically by their corresponding objective values.

Instead of visualizing a set of solutions, one solution can be illustrated at a time. The Petal Diagram (Figure 5h) is a pie diagram where the objective value is represented by each piece's diameter. Colors are used to further distinguish the pieces. Finally, the Spider-Web or Radar Diagram (Figure 5i) shows the objective values as points on axes. The ideal and nadir points [19] are represented by the inner and outer polygon. By definition, the solution lies in between those two extremes. If the objective space ranges are scaled differently, normalization for the purpose of plotting can be enabled and the diagram becomes symmetric. New and emerging methods for visualizing more than three-dimensional efficient solutions, such as 2.5-dimensional PaletteViz plots [57], will be implemented in the future.
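A sketch of plotting an obtained set, assuming the version 0.4.0 module layout (pymoo.visualization) and the result object res from Section III:

from pymoo.visualization.scatter import Scatter
from pymoo.visualization.pcp import PCP

# scatter plot of the objective space
Scatter(title="Objective Space").add(res.F).show()

# parallel coordinate plot for higher-dimensional objective spaces
PCP().add(res.F).show()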


FIGURE 5. Different visualization methods coded in pymoo.

C. DECISION MAKING
In practice, after obtaining a set of non-dominated solutions, a single solution has to be chosen for implementation. pymoo provides a few ''a posteriori'' approaches for decision making [19].

(i) Compromise Programming: One way of making a decision is to compute the value of a scalarized and aggregated function and to select one solution based on the minimum or maximum value of the function. In pymoo, a number of the scalarization functions described in Section VI-D can be used to come to a decision regarding desired weights of objectives.

(ii) Pseudo-Weights: A more intuitive way to choose a solution out of a Pareto-front is the pseudo-weight vector approach proposed in [19]. The pseudo weight w_i for the i-th objective function is calculated by:

    w_i = [(f_i^max − f_i(x)) / (f_i^max − f_i^min)] / Σ_{m=1..M} [(f_m^max − f_m(x)) / (f_m^max − f_m^min)]     (5)

Here, the normalized distance to the worst solution regarding each objective i is calculated. It is interesting to note that for non-convex Pareto-fronts, the pseudo weight does not correspond to the result of an optimization using the weighted-sum method. The solution having the closest pseudo-weights to a target preference vector of objectives (f1 being preferred twice as much as f2 results in a target preference vector of (0.667, 0.333)) can be chosen as the preferred solution from the efficient set.
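A small NumPy sketch implementing Equation 5 directly (pymoo also ships a decision-making helper for this, whose import path varies between versions, so the formula is spelled out here):

import numpy as np

def pseudo_weights(F):
    # normalized distance to the worst value of each objective (Equation 5)
    d = (F.max(axis=0) - F) / (F.max(axis=0) - F.min(axis=0))
    return d / d.sum(axis=1, keepdims=True)

F = np.array([[0.1, 0.9], [0.4, 0.6], [0.9, 0.1]])
w = pseudo_weights(F)

# pick the solution whose pseudo-weights are closest to a target preference
target = np.array([0.667, 0.333])
best = np.argmin(np.linalg.norm(w - target, axis=1))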


(iii) High Trade-Off Solutions: Furthermore, high trade-off solutions are usually of interest, but not straightforward to detect in higher-dimensional objective spaces. We have implemented the procedure proposed in [65]. It was described to be embedded in an algorithm to guide the search; we, however, use it for post-processing. The metric for each solution pair x_i and x_j in a non-dominated set is given by:

    T(x_i, x_j) = Σ_{m=1..M} max[0, f_m(x_j) − f_m(x_i)]  /  Σ_{m=1..M} max[0, f_m(x_i) − f_m(x_j)]     (6)

where the numerator represents the aggregated sacrifice and the denominator the aggregated gain. The trade-off measure µ(x_i, S) for each solution x_i with respect to a set of neighboring solutions S is obtained by:

    µ(x_i, S) = min_{x_j ∈ S} T(x_i, x_j)     (7)

It finds the minimum T(x_i, x_j) from x_i to all other solutions x_j ∈ S. Instead of calculating the metric with respect to all others, we provide the option to only consider the k closest neighbors in the objective space to reduce the computational complexity. Depending on the circumstances, the 'min' operator can be replaced with 'average', 'max', or any other suitable operator. Thereafter, the solution having the maximum µ can be chosen as the preferred solution, meaning that this solution causes a maximum sacrifice in one of the objective values for a unit gain in another objective value, making it the most valuable solution for implementation.
interaction to choose a single preferred solution. However, Moreover, we would like to mention that even though we
in real practice, a more problem specific decision-making try to keep our framework as bug-free as possible, in case
method must be used, such as an interaction EMO method of exceptions during the execution or doubt of correctness,
suggested elsewhere [66]. We emphasize here the fact please contact us directly or use our issue tracker.
that multi-objective frameworks should include methods for
multi-criteria decision making and support end-user further REFERENCES
in choosing a solution out of a trade-off solution set. [1] G. Rossum, ‘‘Python reference manual,’’ CWI, Amsterdam, The
Netherlands, Tech. Rep. 10.5555/869369, 1995. [Online]. Available:
https://dl.acm.org/doi/book/10.5555/869369
VIII. CONCLUDING REMARKS [2] M. Bücker, G. Corliss, P. Hovland, U. Naumann, and B. Norris, Auto-
This paper has introduced pymoo, a multi-objective opti- matic Differentiation: Applications, Theory, and Implementations (Lec-
mization framework in Python. We have walked through ture Notes in Computational Science and Engineering). Berlin, Germany:
Springer-Verlag, 2006.
our framework beginning with the installation up to the [3] G. Brandl. (2019). Sphinx Documentation. [Online]. Available:
optimization of a constrained bi-objective optimization prob- https://www.sphinx-doc.org/
lem. Moreover, we have presented the overall architecture of [4] A. Pajankar, Python Unit Test Automation: Practical Techniques for Python
Developers and Testers, 1st ed. Berkely, CA, USA: Apress, 2017.
the framework consisting of three core modules: Problems, [5] J. J. Durillo and A. J. Nebro, ‘‘JMetal: A java framework for multi-
Optimization, and Analytics. Each module has been objective optimization,’’ Adv. Eng. Softw., vol. 42, no. 10, pp. 760–771,
described in depth and illustrative examples have been pro- Oct. 2011. [Online]. Available: http://www.sciencedirect.com/science/
article/pii/S0965997811001219
vided. We have shown that our framework covers various [6] J. Gosling, B. Joy, G. L. Steele, G. Bracha, and A. Buckley, The Java
aspects of multi-objective optimization including the visu- Language Specification, Java SE 8 Edition, 1st ed. Reading, MA, USA:
alization of high-dimensional spaces and multi-criteria deci- Addison-Wesley, 2014.
[7] A. Benítez-Hidalgo, A. J. Nebro, J. García-Nieto, I. Oregi, and
sion making to finally select a solution out of the obtained J. Del Ser, ‘‘JMetalPy: A Python framework for multi-objective
solution set. One distinguishing feature of our framework optimization with metaheuristics,’’ Swarm Evol. Comput., vol. 51,
with other existing ones is that we have provided a few Dec. 2019, Art. no. 100598. [Online]. Available: http://www.sciencedirect.
com/science/article/pii/S2210650219301397
options for various key aspects of a multi-objective opti- [8] D. Izzo, ‘‘PyGMO and PyKEP: Open source tools for massively parallel
mization task, providing standard evolutionary operators optimization in astrodynamics (the case of interplanetary trajectory
for optimization, standard performance metrics for evalu- optimization),’’ in 5th Int. Conf. Astrodyn. Tools Techn. (ICATT), 2012.
[Online]. Available: http://www.esa.int/gsp/ACT/doc/MAD/pub/ACT-
ating a run, standard visualization techniques for showcas- RPR-MAD-2012-(ICATT)PyKEP-PaGMO-SOCIS.pdf
ing obtained trade-off solutions, and a few approaches for [9] D. Hadka. Platypus: Multiobjective Optimization in Python. Accessed:
decision-making. Most such implementations were originally May 16, 2019. [Online]. Available: https://platypus.readthedocs.io
[10] F.-A. Fortin, F.-M. De Rainville, M.-A. Gardner, M. Parizeau, and
suggested and developed by the second author and his collab- C. Gagné, ‘‘DEAP: Evolutionary algorithms made easy,’’ J. Mach. Learn.
orators for more than 25 years. Hence, we consider that the Res., vol. 13, pp. 2171–2175, Jul. 2012.
implementations of all such ideas are authentic and error-free. [11] A. Garrett. Inspyred: Python Library for Bio-Inspired Computational Intel-
ligence. Accessed: May 16, 2019. [Online]. Available: https://github.com/
Thus, the results from the proposed framework should stay as
aarongarrett/inspyred
benchmark results of implemented procedures. [12] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan, ‘‘A fast and elitist
However, the framework can be definitely extended to multiobjective genetic algorithm: NSGA-II,’’ IEEE Trans. Evol. Comput.,
vol. 6, no. 2, pp. 182–197, Apr. 2002, doi: 10.1109/4235.996017.
make it more comprehensive and we are constantly adding
[13] Y. Tian, R. Cheng, X. Zhang, and Y. Jin, ‘‘PlatEMO: A MATLAB platform
new capabilities based on practicalities learned from our col- for evolutionary multi-objective optimization [Educational Forum],’’ IEEE
laboration with industries. In the future, we plan to implement Comput. Intell. Mag., vol. 12, no. 4, pp. 73–87, Nov. 2017.
more optimization algorithms and test problems to provide [14] D. Hadka. MOEA Framework: A Free and Open Source Java Framework
for Multiobjective Optimization. Accessed: May 16, 2019. [Online]. Avail-
more choices to end-users. Also, we aim to implement some able: http://moeaframework.org
methods from the classical literature on single-objective opti- [15] E. López-Camacho, M. J. García Godoy, A. J. Nebro, and
mization which can also be used for multi-objective opti- J. F. Aldana-Montes, ‘‘JMetalCpp: Optimizing molecular docking
problems with a C++ Metaheuristic framework,’’ Bioinformatics, vol. 30,
mization through decomposition or embedded as a local no. 3, pp. 437–438, Feb. 2014, doi: 10.1093/bioinformatics/btt679.
search. So far, we have provided a few basic performance [16] F. Biscani, D. Izzo, and C. H. Yam, ‘‘A global optimisation toolbox for mas-
metrics. We plan to extend this by creating a module that sively parallel engineering optimisation,’’ Apr. 2010, arXiv:1004.3824.
[Online]. Available: http://arxiv.org/abs/1004.3824
runs a list of algorithms on test problems automatically and [17] PS Foundation. PyPI: The Python Package Index. Accessed: May 20, 2019.
provides a statistics of different performance indicators. [Online]. Available: https://pypi.org


[18] S. Behnel, R. Bradshaw, C. Citro, L. Dalcin, D. S. Seljebotn, and K. Smith, ''Cython: The best of both worlds,'' Comput. Sci. Eng., vol. 13, no. 2, pp. 31–39, Mar. 2011.
[19] K. Deb, Multi-Objective Optimization Using Evolutionary Algorithms. New York, NY, USA: Wiley, 2001.
[20] T. Oliphant, NumPy: A Guide to NumPy. USA: Trelgol Publishing, 2006. Accessed: May 16, 2019. [Online]. Available: https://web.mit.edu/dvp/Public/numpybook.pdf
[21] K. Deb and R. Datta, ''A bi-objective constrained optimization algorithm using a hybrid evolutionary and penalty function approach,'' Eng. Optim., vol. 45, no. 5, pp. 503–527, May 2013.
[22] D. Maclaurin, D. Duvenaud, and R. P. Adams, ''Autograd: Effortless gradients in NumPy,'' in Proc. ICML AutoML Workshop, 2015. [Online]. Available: https://github.com/HIPS/autograd
[23] K. Deb and M. Abouhawwash, ''An optimality theory based proximity measure for set based multi-objective optimization,'' IEEE Trans. Evol. Comput., vol. 20, no. 4, pp. 515–528, Sep. 2016.
[24] D. H. Wolpert and W. G. Macready, ''No free lunch theorems for optimization,'' IEEE Trans. Evol. Comput., vol. 1, no. 1, pp. 67–82, Apr. 1997, doi: 10.1109/4235.585893.
[25] A. Paszke, S. Gross, S. Chintala, G. Chanan, E. Yang, Z. DeVito, Z. Lin, A. Desmaison, L. Antiga, and A. Lerer, ''Automatic differentiation in PyTorch,'' Tech. Rep., 2017. [Online]. Available: https://github.com/pytorch/pytorch/blob/master/CITATION
[26] Dask Development Team. (2016). Dask: Library for Dynamic Task Scheduling. [Online]. Available: https://dask.org
[27] S. Mishra, S. Mondal, and S. Saha, ''Fast implementation of steady-state NSGA-II,'' in Proc. IEEE Congr. Evol. Comput. (CEC), Jul. 2016, pp. 3777–3784.
[28] I. Das and J. E. Dennis, ''Normal-boundary intersection: A new method for generating the Pareto surface in nonlinear multicriteria optimization problems,'' SIAM J. Optim., vol. 8, no. 3, pp. 631–657, Aug. 1998.
[29] J. Blank, K. Deb, Y. Dhebar, S. Bandaru, and H. Seada, ''Generating well-spaced points on a unit simplex for evolutionary many-objective optimization,'' IEEE Trans. Evol. Comput., early access, May 6, 2020, doi: 10.1109/TEVC.2020.2992387.
[30] J. C. Bean, ''Genetic algorithms and random keys for sequencing and optimization,'' ORSA J. Comput., vol. 6, no. 2, pp. 154–160, May 1994, doi: 10.1287/ijoc.6.2.154.
[31] K. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization (Natural Computing Series). Berlin, Germany: Springer-Verlag, 2005.
[32] J. A. Nelder and R. Mead, ''A simplex method for function minimization,'' Comput. J., vol. 7, no. 4, pp. 308–313, Jan. 1965, doi: 10.1093/comjnl/7.4.308.
[33] N. Hansen, The CMA Evolution Strategy: A Comparing Review. Berlin, Germany: Springer, 2006, pp. 75–102, doi: 10.1007/3-540-32494-1_4.
[34] K. Deb and J. Sundar, ''Reference point based multi-objective optimization using evolutionary algorithms,'' in Proc. 8th Annu. Conf. Genetic Evol. Comput. (GECCO), New York, NY, USA, 2006, pp. 635–642, doi: 10.1145/1143997.1144112.
[35] K. Deb and H. Jain, ''An evolutionary many-objective optimization algorithm using reference-point-based nondominated sorting approach, part I: Solving problems with box constraints,'' IEEE Trans. Evol. Comput., vol. 18, no. 4, pp. 577–601, Aug. 2014.
[36] H. Jain and K. Deb, ''An evolutionary many-objective optimization algorithm using reference-point based nondominated sorting approach, part II: Handling constraints and extending to an adaptive approach,'' IEEE Trans. Evol. Comput., vol. 18, no. 4, pp. 602–622, Aug. 2014.
[37] J. Blank, K. Deb, and P. C. Roy, ''Investigating the normalization procedure of NSGA-III,'' in Evolutionary Multi-Criterion Optimization, K. Deb, E. Goodman, C. A. Coello Coello, K. Klamroth, K. Miettinen, S. Mostaghim, and P. Reed, Eds. Cham, Switzerland: Springer, 2019, pp. 229–240.
[38] H. Seada and K. Deb, ''A unified evolutionary optimization procedure for single, multiple, and many objectives,'' IEEE Trans. Evol. Comput., vol. 20, no. 3, pp. 358–369, Jun. 2016.
[39] Y. Vesikar, K. Deb, and J. Blank, ''Reference point based NSGA-III for preferred solutions,'' in Proc. IEEE Symp. Ser. Comput. Intell. (SSCI), Nov. 2018, pp. 1587–1594.
[40] Q. Zhang and H. Li, ''MOEA/D: A multiobjective evolutionary algorithm based on decomposition,'' IEEE Trans. Evol. Comput., vol. 11, no. 6, pp. 712–731, Dec. 2007.
[41] M. D. McKay, R. J. Beckman, and W. J. Conover, ''A comparison of three methods for selecting values of input variables in the analysis of output from a computer code,'' Technometrics, vol. 42, no. 1, pp. 55–61, Feb. 2000, doi: 10.2307/1271432.
[42] K. Deb, K. Sindhya, and T. Okabe, ''Self-adaptive simulated binary crossover for real-parameter optimization,'' in Proc. 9th Annu. Conf. Genetic Evol. Comput. (GECCO), New York, NY, USA, 2007, pp. 1187–1194, doi: 10.1145/1276958.1277190.
[43] K. Deb and D. Deb, ''Analysing mutation schemes for real-parameter genetic algorithms,'' Int. J. Artif. Intell. Soft Comput., vol. 4, no. 1, pp. 1–28, 2014.
[44] D. E. Goldberg, Genetic Algorithms for Search, Optimization, and Machine Learning. Reading, MA, USA: Addison-Wesley, 1989.
[45] K. Deb and M. Goyal, ''A flexible optimization procedure for mechanical component design based on genetic adaptive search,'' J. Mech. Design, vol. 120, no. 2, pp. 162–164, Jun. 1998.
[46] K. Deb and C. Myburgh, ''A population-based fast algorithm for a billion-dimensional resource allocation problem with integer variables,'' Eur. J. Oper. Res., vol. 261, no. 2, pp. 460–474, Sep. 2017.
[47] J. Blank and K. Deb, ''A running performance metric and termination criterion for evaluating evolutionary multi- and many-objective optimization algorithms,'' in Proc. IEEE World Congr. Comput. Intell. (WCCI), 2020.
[48] A. Santiago, H. J. F. Huacuja, B. Dorronsoro, J. E. Pecero, C. G. Santillan, J. J. G. Barbosa, and J. C. S. Monterrubio, A Survey of Decomposition Methods for Multi-Objective Optimization. Cham, Switzerland: Springer, 2014, pp. 453–465, doi: 10.1007/978-3-319-05170-3_31.
[49] A. P. Wierzbicki, ''The use of reference objectives in multiobjective optimization,'' in Multiple Criteria Decision Making Theory and Application. Cham, Switzerland: Springer, 1980, pp. 468–486.
[50] A. P. Wierzbicki, ''A mathematical basis for satisficing decision making,'' Math. Model., vol. 3, no. 5, pp. 391–405, 1982. [Online]. Available: http://www.sciencedirect.com/science/article/pii/0270025582900380
[51] J. Knowles and D. Corne, ''On metrics for comparing non-dominated sets,'' in Proc. Congr. Evol. Comput. Conf. (CEC), 2002, pp. 711–716.
[52] D. A. Van Veldhuizen, ''Multiobjective evolutionary algorithms: Classifications, analyses, and new innovations,'' Air Force Inst. Technol., Dayton, OH, USA, Tech. Rep. AFIT/DS/ENG/99-01, 1999.
[53] H. Ishibuchi, H. Masuda, Y. Tanigaki, and Y. Nojima, ''Modified distance calculation in generational distance and inverted generational distance,'' in Evolutionary Multi-Criterion Optimization, A. Gaspar-Cunha, C. H. Antunes, and C. C. Coello, Eds. Cham, Switzerland: Springer, 2015, pp. 110–125.
[54] E. Zitzler and L. Thiele, ''Multiobjective optimization using evolutionary algorithms - A comparative case study,'' in Proc. 5th Int. Conf. Parallel Problem Solving From Nature (PPSN V), London, U.K.: Springer-Verlag, 1998, pp. 292–304. [Online]. Available: http://dl.acm.org/citation.cfm?id=645824.668610
[55] E. Zitzler, D. Brockhoff, and L. Thiele, ''The hypervolume indicator revisited: On the design of Pareto-compliant indicators via weighted integration,'' in Proc. 4th Int. Conf. Evol. Multi-Criterion Optim. (EMO), Berlin, Germany: Springer-Verlag, 2007, pp. 862–876. [Online]. Available: http://dl.acm.org/citation.cfm?id=1762545.1762618
[56] J. D. Hunter, ''Matplotlib: A 2D graphics environment,'' Comput. Sci. Eng., vol. 9, no. 3, pp. 90–95, 2007.
[57] A. K. A. Talukder and K. Deb, ''PaletteViz: A visualization method for functional understanding of high-dimensional Pareto-optimal data-sets to aid multi-criteria decision making,'' IEEE Comput. Intell. Mag., vol. 15, no. 2, pp. 36–48, May 2020.
[58] J. M. Chambers and B. Kleiner, ''10 graphical techniques for multivariate data and for clustering,'' Tech. Rep., 1982.
[59] E. J. Wegman, ''Hyperdimensional data analysis using parallel coordinates,'' J. Amer. Stat. Assoc., vol. 85, no. 411, pp. 664–675, Sep. 1990.
[60] P. Hoffman, G. Grinstein, and D. Pinkney, ''Dimensional anchors: A graphic primitive for multidimensional multivariate information visualizations,'' in Proc. Workshop New Paradigms Inf. Vis. Manipulation (NPIVM), New York, NY, USA, 1999, pp. 9–16.
[61] E. Kandogan, ''Star coordinates: A multi-dimensional visualization technique with uniform treatment of dimensions,'' in Proc. IEEE Inf. Vis. Symp., Late Breaking Hot Topics, Oct. 2000, pp. 9–12.


[62] A. Pryke, S. Mostaghim, and A. Nazemi, ''Heatmap visualization of population based multi objective algorithms,'' in Evolutionary Multi-Criterion Optimization, S. Obayashi, K. Deb, C. Poloni, T. Hiroyasu, and T. Murata, Eds. Berlin, Germany: Springer, 2007, pp. 361–375.
[63] Y. S. Tan and N. M. Fraser, ''The modified star graph and the petal diagram: Two new visual aids for discrete alternative multicriteria decision making,'' J. Multi-Criteria Decis. Anal., vol. 7, no. 1, pp. 20–33, Jan. 1998.
[64] E. Kasanen, R. Östermark, and M. Zeleny, ''Gestalt system of holistic graphics: New management support view of MCDM,'' Comput. Oper. Res., vol. 18, no. 2, pp. 233–239, 1991, doi: 10.1016/0305-0548(91)90093-7.
[65] L. Rachmawati and D. Srinivasan, ''Multiobjective evolutionary algorithm with controllable focus on the knees of the Pareto front,'' IEEE Trans. Evol. Comput., vol. 13, no. 4, pp. 810–824, Aug. 2009.
[66] K. Deb, A. Sinha, P. J. Korhonen, and J. Wallenius, ''An interactive evolutionary multiobjective optimization method based on progressively approximated value functions,'' IEEE Trans. Evol. Comput., vol. 14, no. 5, pp. 723–739, Oct. 2010.

JULIAN BLANK received the B.Sc. degree in business information systems and the M.Sc. degree in computer science from Otto von Guericke University, Germany, in 2010 and 2016, respectively. He is currently pursuing the Ph.D. degree in computer science with Michigan State University, East Lansing, MI, USA. He was a Visiting Scholar for six months with Michigan State University, in 2015. His current research interests include multi-objective optimization, evolutionary computation, surrogate-assisted optimization, and machine learning.

KALYANMOY DEB (Fellow, IEEE) received the bachelor's degree in mechanical engineering from IIT Kharagpur, India, and the master's and Ph.D. degrees from the University of Alabama, Tuscaloosa, AL, USA, in 1989 and 1991, respectively. He is currently the Koenig Endowed Chair Professor with the Department of Electrical and Computer Engineering, Michigan State University, East Lansing, MI, USA. He is largely known for his seminal research in evolutionary multi-criterion optimization. He has authored two text books on optimization and published more than 535 international journal and conference research papers to date. His current research interests include evolutionary optimization and its application in design, AI, and machine learning. He is one of the top cited EC researchers, with more than 140,000 Google Scholar citations.
Dr. Deb is a Fellow of ASME. He was a recipient of the Shanti Swarup Bhatnagar Prize in Engineering Sciences, in 2005, the Edgeworth-Pareto Award, in 2008, the CajAstur Mamdani Prize, in 2011, the Distinguished Alumni Award from IIT Kharagpur, in 2011, the Infosys Prize, in 2012, the TWAS Prize in Engineering Sciences, in 2012, the Lifetime Achievement Award from Clarivate Analytics, in 2017, the IEEE CIS EC Pioneer Award, in 2018, and the Thomson Citation Laureate Award from Thomson Reuters. http://www.coin-lab.org
