Greedy approximated hypervolume subset selection for many-objective optimization

K Shang, H Ishibuchi, W Chen - Proceedings of the Genetic and Evolutionary Computation Conference, 2021 - dl.acm.org
Hypervolume subset selection (HSS) aims to select a subset from a candidate solution set so that the hypervolume of the selected subset is maximized. Because HSS is NP-hard, greedy algorithms are the most efficient practical choice for solving it in many-objective optimization. However, when the number of objectives is large, computing exact hypervolume contributions in greedy HSS is time-consuming, which makes greedy HSS inefficient. To address this issue, we propose a greedy approximated HSS algorithm. The main idea is to use an R2-based hypervolume contribution approximation method within the greedy HSS framework. In the implementation, a utility tensor structure is introduced to facilitate the calculation of the approximated contributions, and the tensor information from the previous greedy step is reused in the current step to accelerate the calculation. We also apply a lazy evaluation strategy in the proposed algorithm to further improve its efficiency. We test the greedy approximated HSS algorithm on candidate solution sets with 3 to 10 objectives. The experimental results show that the proposed algorithm is much faster than the state-of-the-art greedy HSS algorithm on many-objective problems while achieving almost the same hypervolume.
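
The abstract names two of the key ingredients: greedy selection driven by an R2-based hypervolume contribution approximation, and a lazy evaluation strategy. The following is a minimal Python sketch of those two ideas only, assuming minimization, random unit direction vectors as a stand-in for the paper's fixed direction-vector set, and omitting the utility-tensor reuse that provides the paper's main speedup; the names direction_vectors, hvc_r2, and greedy_hss_lazy are illustrative, not from the paper.

```python
import heapq
import numpy as np


def direction_vectors(n_vec, n_obj, rng):
    """Random unit direction vectors in the positive orthant (a simple
    stand-in; the paper builds on a fixed, well-distributed vector set)."""
    v = np.abs(rng.standard_normal((n_vec, n_obj)))
    v = np.maximum(v, 1e-12)  # avoid division by zero below
    return v / np.linalg.norm(v, axis=1, keepdims=True)


def hvc_r2(a, others, ref, dirs):
    """R2-style approximation of the hypervolume contribution of point `a`
    relative to the set `others` (minimization), via the modified
    Tchebycheff construction of Shang et al.'s earlier approximation work."""
    m = a.shape[0]
    # Length along each direction until the reference-point boundary.
    length = np.min((ref - a) / dirs, axis=1)                   # (n_vec,)
    if len(others) > 0:
        # Length until the boundary induced by the closest other point.
        ratio = (others[None, :, :] - a) / dirs[:, None, :]     # (n_vec, k, m)
        length = np.minimum(length, ratio.max(axis=2).min(axis=1))
    # Average of m-th powers; the shared constant factor is dropped
    # because greedy selection only needs the ranking of contributions.
    return np.mean(np.clip(length, 0.0, None) ** m)


def greedy_hss_lazy(points, k, ref, n_vec=100, seed=0):
    """Greedy subset selection with approximated contributions and a lazy
    re-evaluation queue (a sketch, not the paper's tensor-reuse version)."""
    rng = np.random.default_rng(seed)
    dirs = direction_vectors(n_vec, points.shape[1], rng)
    selected = []
    # Max-heap entries: (negated gain bound, round it was computed in, index).
    heap = [(-np.inf, -1, i) for i in range(len(points))]
    heapq.heapify(heap)
    for step in range(k):
        while True:
            neg_gain, last, i = heapq.heappop(heap)
            if last == step:
                selected.append(i)  # bound is fresh: i is the best candidate
                break
            # Stale bound: recompute this candidate's gain and push it back.
            gain = hvc_r2(points[i], points[selected], ref, dirs)
            heapq.heappush(heap, (-gain, step, i))
    return selected


# Toy usage: select 5 points from 100 random 5-objective points.
rng = np.random.default_rng(1)
pts = rng.random((100, 5))
print(greedy_hss_lazy(pts, k=5, ref=np.full(5, 1.1)))
```

The lazy queue is sound because hypervolume gains shrink as the selected set grows (submodularity), so a gain computed in an earlier round upper-bounds the current gain, and a candidate whose freshly recomputed gain still tops the heap must be the true maximizer of this round.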