Boosting Multiverse Optimizer by Simulated Annealing for Dimensionality Reduction
The Multi-Verse Optimizer (MVO) is widely used in feature selection because of its dynamic structure and strong global/local search abilities. However, the exponential growth of the search space makes finding the optimal feature subset in high-dimensional datasets challenging, and although MVO is a promising algorithm, slow convergence degrades its performance. This work hybridizes MVO with a powerful local search method, the Simulated Annealing algorithm (SA), to overcome these limitations and enhance feature selection efficiency on high-dimensional datasets. Specifically, a high-level relay hybrid (HRH) paradigm is proposed in which two self-contained optimizers (MVO and SA) run sequentially: in the proposed MVOSA-FS model, MVO first locates promising regions of the search space, which are then passed to SA for refinement. Ten high-dimensional datasets from the Arizona State University (ASU) repository were used to verify the effectiveness of the proposed method, and the results were compared with six state-of-the-art feature selection algorithms: Atom Search Optimization (ASO), Equilibrium Optimizer (EO), Emperor Penguin Optimizer (EPO), Monarch Butterfly Optimization (MBO), Satin Bowerbird Optimizer (SBO), and the Sine Cosine Algorithm (SCA). The results validate that MVOSA-FS outperformed the competing algorithms and showed an exceptional ability to select the most significant features, achieving the lowest average error rates, classification standard deviation (STD) values, and feature selection (FS) rates across all datasets.
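To make the high-level relay hybrid (HRH) idea concrete — a global search phase whose best solution is relayed to simulated annealing for local refinement — the following Python sketch applies the pattern to binary feature selection. It is an illustrative toy, not the paper's implementation: the "global phase" is a simple stand-in for MVO, and the fitness function uses a synthetic target mask in place of a real classifier's error rate; all names and parameters are assumptions.

```python
import math
import random

N_FEATURES = 20
# Toy ground truth: pretend only the first 5 features are informative, so the
# sketch stays self-contained (a real wrapper would train a classifier here).
TARGET = [1 if i < 5 else 0 for i in range(N_FEATURES)]

def fitness(mask):
    """Wrapper-style objective: weighted error term plus selected-feature ratio."""
    error = sum(m != t for m, t in zip(mask, TARGET)) / N_FEATURES
    ratio = sum(mask) / N_FEATURES
    return 0.99 * error + 0.01 * ratio  # lower is better

def global_search(iters=50, rng=None):
    """Stand-in for MVO's exploration phase: a greedy random walk over masks."""
    rng = rng or random.Random(0)
    best = [rng.randint(0, 1) for _ in range(N_FEATURES)]
    for _ in range(iters):
        cand = [b if rng.random() < 0.8 else 1 - b for b in best]
        if fitness(cand) < fitness(best):
            best = cand
    return best

def simulated_annealing(start, t0=1.0, cooling=0.95, iters=200, rng=None):
    """Local refinement: flip one bit at a time, accept worse moves with
    Boltzmann probability exp(-delta / T), cooling T geometrically."""
    rng = rng or random.Random(1)
    cur, cur_fit = start[:], fitness(start)
    best, best_fit = cur[:], cur_fit
    t = t0
    for _ in range(iters):
        cand = cur[:]
        i = rng.randrange(N_FEATURES)
        cand[i] = 1 - cand[i]  # flip one feature in/out of the subset
        cand_fit = fitness(cand)
        if cand_fit < cur_fit or rng.random() < math.exp((cur_fit - cand_fit) / t):
            cur, cur_fit = cand, cand_fit
            if cur_fit < best_fit:
                best, best_fit = cur[:], cur_fit
        t *= cooling
    return best  # best-so-far, never worse than the relayed start point

def mvosa_fs():
    """High-level relay hybrid: run the global phase, then relay its output to SA."""
    coarse = global_search()
    return simulated_annealing(coarse)

mask = mvosa_fs()
```

Because SA tracks its best-so-far solution, the relayed result can never be worse than the global phase's output under this objective, which is the rationale for the sequential HRH arrangement.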
Copyright (c) 2025 The Authors. Published by Universitas Airlangga.

This work is licensed under a Creative Commons Attribution 4.0 International License.
Authors who publish with this journal agree to the following terms:
All accepted papers are published under a Creative Commons Attribution 4.0 International (CC BY 4.0) License. Authors retain copyright and grant the journal the right of first publication. The CC BY license lets others share (copy and redistribute the material in any medium or format) and adapt (remix, transform, and build upon the material) for any purpose, even commercially.