| Field | Value |
|---|---|
| Graduate student | 曾則儒 (Tseng, Tser-Ru) |
| Thesis title | 求解高維多目標最佳化問題的演化演算法效能評析 (Large-Scale Evolutionary Multiobjective Optimization: An Experimental Study) |
| Advisor | 蔣宗哲 (Chiang, Tsung-Che) |
| Oral defense committee | 蔣宗哲 (Chiang, Tsung-Che); 鄒慶士 (Tsou, Ching-Shih); 廖容佐 (Liaw, Rung-Tzuo) |
| Oral defense date | 2024/07/17 |
| Degree | Master (碩士) |
| Department | 資訊工程學系 (Department of Computer Science and Information Engineering) |
| Year of publication | 2024 |
| Graduation academic year | 112 (2023–2024) |
| Language | Chinese |
| Number of pages | 67 |
| Keywords (Chinese) | 演化演算法、多目標最佳化、高維多目標最佳化 |
| Keywords (English) | Evolutionary Algorithms, Multiobjective Optimization, Large Scale |
| Research method | Experimental design |
| DOI URL | http://doi.org/10.6345/NTNU202401528 |
| Thesis type | Academic thesis |
Abstract:

Evolutionary algorithms have shown promising performance in solving multiobjective optimization problems (MOPs) and are widely applied across various domains. However, as the number of decision variables grows into the hundreds or even thousands, the exponentially growing search space poses a severe challenge to the search capability of multiobjective evolutionary algorithms (MOEAs). In response, recent years have seen growing interest within the evolutionary computation community in designing MOEAs tailored to large-scale MOPs. One issue we have identified in this research area is the use of inconsistent experimental settings when evaluating algorithm performance, which can lead to biased and misleading results. In this thesis, we select eight representative MOEAs from the literature and employ two publicly available test suites, LSMOP and LMF, to assess their performance under sixteen experimental scenarios formed by combining four problem dimensions with four levels of computational budget. Through fair and comprehensive experiments, we provide an accurate comparison of the algorithms' performance and characteristics, and we discuss the similarities and differences among the algorithm designs and their impact on solving performance.
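To make the experimental design concrete, below is a minimal sketch, in Python, of how the sixteen scenarios (four problem dimensions crossed with four computational budgets) could be enumerated for each algorithm and benchmark problem. The specific dimension values, budget values, algorithm placeholders, suite sizes, and the `run_experiment` helper are illustrative assumptions rather than the thesis's actual settings or harness; a study of this kind is typically run on a framework such as PlatEMO [40].

```python
# Sketch of enumerating the 4 dimensions x 4 budgets = 16 scenarios per
# algorithm/problem pair described in the abstract. All concrete values
# below are hypothetical placeholders, not the thesis's settings.
from itertools import product

DIMENSIONS = [100, 500, 1000, 5000]            # assumed decision-variable counts
BUDGETS = [10_000, 50_000, 100_000, 500_000]   # assumed function-evaluation limits
SUITES = {"LSMOP": range(1, 10), "LMF": range(1, 13)}  # assumed problem indices
ALGORITHMS = ["AlgorithmA", "AlgorithmB"]      # placeholders for the eight MOEAs

def run_experiment(algorithm, suite, problem_id, dim, budget):
    """Placeholder for a single run; a real study would invoke an MOEA
    implementation here and return a quality indicator (e.g., IGD)."""
    raise NotImplementedError

if __name__ == "__main__":
    for algo, (suite, problems), dim, budget in product(
            ALGORITHMS, SUITES.items(), DIMENSIONS, BUDGETS):
        for pid in problems:
            # Each (dim, budget) pair is one of the sixteen scenarios.
            print(f"{algo} | {suite}{pid} | D={dim} | budget={budget} FEs")
            # result = run_experiment(algo, suite, pid, dim, budget)
```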
[1] J. H. Holland, Adaptation in Natural and Artificial Systems: An Introductory Analysis With Applications to Biology, Control, and Artificial Intelligence. Ann Arbor, MI, USA: Univ. of Michigan, 1975.
[2] R. Storn and K. Price, “Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, pp. 341–359, 1997.
[3] R. Eberhart and J. Kennedy, “A new optimizer using particle swarm theory,” in Proc. 6th International Symposium on Micro Machine and Human Science, pp. 39–43, 1995.
[4] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan, “A fast and elitist multiobjective genetic algorithm: NSGA-II,” IEEE Transactions on Evolutionary Computation, vol. 6, no. 2, pp. 182–197, 2002.
[5] Q. Zhang and H. Li, “MOEA/D: A multiobjective evolutionary algorithm based on decomposition,” IEEE Transactions on Evolutionary Computation, vol. 11, no. 6, pp. 712–731, 2007.
[6] K. Deb and H. Jain, “An evolutionary many-objective optimization algorithm using reference-point-based nondominated sorting approach, part I: Solving problems with box constraints,” IEEE Transactions on Evolutionary Computation, vol. 18, no. 4, pp. 577–601, 2014.
[7] R. Cheng, Y. Jin, M. Olhofer, and B. Sendhoff, “A reference vector guided evolutionary algorithm for many-objective optimization,” IEEE Transactions on Evolutionary Computation, vol. 20, no. 5, pp. 773–791, 2016.
[8] N. Beume, B. Naujoks, and M. Emmerich, “SMS-EMOA: Multiobjective selection based on dominated hypervolume,” European Journal of Operational Research, vol. 181, no. 3, pp. 1653–1669, 2007.
[9] L. While, P. Hingston, L. Barone, and S. Huband, “A faster algorithm for calculating hypervolume,” IEEE Transactions on Evolutionary Computation, vol. 10, no. 1, pp. 29–38, 2006.
[10] Y. Tian, L. Si, X. Zhang, R. Cheng, C. He, K. C. Tan, and Y. Jin, “Evolutionary large-scale multi-objective optimization: A survey,” ACM Computing Surveys, vol. 54, no. 8, pp. 1–34, 2021.
[11] M. A. Potter and K. A. De Jong, “A cooperative coevolutionary approach to function optimization,” in Proc. International Conference on Parallel Problem Solving from Nature, pp. 249–257, 1994.
[12] X. Ma et al., “A multiobjective evolutionary algorithm based on decision variable analyses for multiobjective optimization problems with large-scale variables,” IEEE Transactions on Evolutionary Computation, vol. 20, no. 2, pp. 275–298, 2016.
[13] L. M. Antonio and C. A. Coello Coello, “Use of cooperative coevolution for solving large scale multiobjective optimization problems,” in Proc. IEEE Congress on Evolutionary Computation, pp. 2758–2765, 2013.
[14] A. Song, Q. Yang, W.-N. Chen, and J. Zhang, “A random-based dynamic grouping strategy for large scale multi-objective optimization,” in Proc. IEEE Congress on Evolutionary Computation, pp. 468–475, 2016.
[15] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, “Cooperative co-evolution with differential grouping for large scale optimization,” IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 378–393, 2014.
[16] W. Chen, T. Weise, Z. Yang, and K. Tang, “Large-scale global optimization using cooperative coevolution with variable interaction learning,” in Proc. International Conference on Parallel Problem Solving from Nature, pp. 300–309, 2010.
[17] X. Zhang, Y. Tian, R. Cheng, and Y. Jin, “A decision variable clustering-based evolutionary algorithm for large-scale many-objective optimization,” IEEE Transactions on Evolutionary Computation, vol. 22, no. 1, pp. 97–112, 2018.
[18] R. Liu, R. Ren, J. Liu, and J. Liu, “A clustering and dimensionality reduction based evolutionary algorithm for large-scale multi-objective problems,” Applied Soft Computing, vol. 89, Art. no. 106120, 2020.
[19] H. Chen, R. Cheng, J. Wen, H. Li, and J. Weng, “Solving large-scale many-objective optimization problems by covariance matrix adaptation evolution strategy with scalable small subpopulations,” Information Sciences, vol. 509, pp. 457–469, 2020.
[20] N. Hansen, S. D. Müller, and P. Koumoutsakos, “Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES),” Evolutionary Computation, vol. 11, no. 1, pp. 1–18, 2003.
[21] C. He, R. Cheng, L. Li, K. C. Tan, and Y. Jin, “Large-scale multiobjective optimization via reformulated decision variable analysis,” IEEE Transactions on Evolutionary Computation, vol. 28, no. 1, pp. 47–61, 2022.
[22] H. Zille, H. Ishibuchi, S. Mostaghim, and Y. Nojima, “A framework for large-scale multiobjective optimization based on problem transformation,” IEEE Transactions on Evolutionary Computation, vol. 22, no. 2, pp. 260–275, 2018.
[23] R. Liu, J. Liu, Y. Li, and J. Liu, “A random dynamic grouping based weight optimization framework for large-scale multi-objective optimization problems,” Swarm and Evolutionary Computation, vol. 55, 2020.
[24] Q. Lin, J. Li, Z. Du, J. Chen, and Z. Ming, “A novel multi-objective particle swarm optimization with multiple search strategies,” European Journal of Operational Research, vol. 247, no. 3, pp. 732–744, 2015.
[25] C. He, L. Li, Y. Tian, X. Zhang, R. Cheng, Y. Jin, and X. Yao, “Accelerating large-scale multiobjective optimization via problem reformulation,” IEEE Transactions on Evolutionary Computation, vol. 23, no. 6, pp. 949–961, 2019.
[26] C. He, R. Cheng, and D. Yazdani, “Adaptive offspring generation for evolutionary large-scale multiobjective optimization,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 52, no. 2, pp. 786–798, 2020.
[27] S. Qin, C. Sun, Y. Jin, Y. Tan, and J. Fieldsend, “Large-scale evolutionary multiobjective optimization assisted by directed sampling,” IEEE Transactions on Evolutionary Computation, vol. 25, no. 4, pp. 724–738, 2021.
[28] L. Li, C. He, R. Cheng, H. Li, L. Pan, and Y. Jin, “A fast sampling based evolutionary algorithm for million-dimensional multiobjective optimization,” Swarm and Evolutionary Computation, vol. 75, Art. no. 101181, 2022.
[29] Y. Tian, C. Lu, X. Zhang, F. Cheng, and Y. Jin, “A pattern mining-based evolutionary algorithm for large-scale sparse multiobjective optimization problems,” IEEE Transactions on Cybernetics, vol. 52, no. 7, pp. 1–14, 2020.
[30] J.-H. Yi et al., “Behavior of crossover operators in NSGA-III for large-scale optimization problems,” Information Sciences, vol. 509, pp. 470–487, 2020.
[31] Y. Zhang, G.-G. Wang, K. Li, W.-C. Yeh, M. Jian, and J. Dong, “Enhancing MOEA/D with information feedback models for large-scale many-objective optimization,” Information Sciences, vol. 522, pp. 1–16, 2020.
[32] Y. Tian, X. Zhang, C. Wang, and Y. Jin, “An evolutionary algorithm for large-scale sparse multiobjective optimization problems,” IEEE Transactions on Evolutionary Computation, vol. 24, no. 2, pp. 380–393, 2020.
[33] Y. Tian, R. Liu, X. Zhang, H. Ma, K. C. Tan, and Y. Jin, “A multipopulation evolutionary algorithm for solving large-scale multimodal multiobjective optimization problems,” IEEE Transactions on Evolutionary Computation, vol. 25, no. 3, pp. 405–418, 2021.
[34] Y. Tian, X. Zheng, X. Zhang, and Y. Jin, “Efficient large-scale multiobjective optimization based on a competitive swarm optimizer,” IEEE Transactions on Cybernetics, vol. 50, no. 8, pp. 3696–3708, 2020.
[35] R. Cheng and Y. Jin, “A competitive swarm optimizer for large scale optimization,” IEEE Transactions on Cybernetics, vol. 45, no. 2, pp. 191–204, 2015.
[36] S. Liu, J. Li, Q. Lin, Y. Tian, and K. C. Tan, “Learning to accelerate evolutionary search for large-scale multiobjective optimization,” IEEE Transactions on Evolutionary Computation, vol. 27, no. 1, pp. 67–81, 2023.
[37] W. Hong, K. Tang, A. Zhou, H. Ishibuchi, and X. Yao, “Scalable indicator-based evolutionary algorithm for large-scale multiobjective optimization,” IEEE Transactions on Evolutionary Computation, vol. 23, no. 3, pp. 525–537, 2019.
[38] R. Cheng, Y. Jin, M. Olhofer, and B. Sendhoff, “Test problems for large-scale multiobjective and many-objective optimization,” IEEE Transactions on Cybernetics, vol. 47, no. 12, pp. 4108–4121, 2017.
[39] S. Liu and Q. Lin, “Evolutionary large-scale multiobjective optimization: Benchmarks and algorithms,” IEEE Transactions on Evolutionary Computation, vol. 27, no. 3, pp. 401–415, 2021.
[40] Y. Tian, R. Cheng, X. Zhang, and Y. Jin, “PlatEMO: A MATLAB platform for evolutionary multi-objective optimization [educational forum],” IEEE Computational Intelligence Magazine, vol. 12, no. 4, pp. 73–87, 2017.
[41] A. Zhou, Y. Jin, Q. Zhang, B. Sendhoff, and E. Tsang, “Combining model-based and genetics-based offspring generation for multi-objective optimization using a convergence criterion,” in Proc. IEEE Congress on Evolutionary Computation, pp. 892–899, 2006.
[42] Y. Yuan, H. Xu, B. Wang, and X. Yao, “A new dominance relation-based evolutionary algorithm for many-objective optimization,” IEEE Transactions on Evolutionary Computation, vol. 20, no. 1, pp. 16–37, 2016.
[43] L. M. Pang, H. Ishibuchi, and K. Shang, “Counterintuitive experimental results in evolutionary large-scale multiobjective optimization,” IEEE Transactions on Evolutionary Computation, vol. 26, no. 6, pp. 1609–1616, 2022.
[44] S. Z. Martínez, C. A. Coello Coello, H. E. Aguirre, and K. Tanaka, “A review of features and limitations of existing scalable multiobjective test suites,” IEEE Transactions on Evolutionary Computation, vol. 23, no. 1, pp. 130–142, 2019.