PlatEMO: Evolutionary Multi-Objective Optimization Platform
BIMK (Institute of Bioinspired Intelligence and Mining Knowledge)

  • 50+ open source evolutionary algorithms
  • 100+ open source multi-objective test problems
  • Powerful GUI for performing experiments
  • One-click generation of results as Excel or LaTeX tables
  • New state-of-the-art algorithms added continuously
 

Developed Entirely in MATLAB

PlatEMO consists purely of MATLAB functions and does not depend on any third-party libraries. Any machine that can run MATLAB can run PlatEMO, regardless of the operating system.

 

Includes Many Popular Algorithms

PlatEMO includes more than fifty popular existing MOEAs, covering genetic algorithms, differential evolution, particle swarm optimization, memetic algorithms, estimation of distribution algorithms, and surrogate model based algorithms. Most of them are representative algorithms published in top journals after 2010.

List of the included MOEAs>
 

Various Figure Demonstrations

Users can select from several figure types to display, including the Pareto front of the obtained result, the Pareto set of the obtained result, the true Pareto front, and the evolutionary trajectory of any performance indicator value.

 

Powerful and Friendly GUI

PlatEMO provides a powerful and user-friendly GUI, through which users can configure all settings and perform experiments without writing any code.

How to use>
 

Exports Results as Excel or LaTeX Tables

Users can save the statistical experimental results generated by PlatEMO as an Excel table or a LaTeX table, which can be used directly in academic writing.

PlatEMO v1.2  (2017/08/08)
  • More popular MOEAs and MOPs: Currently there are 69 MOEAs and 129 MOPs in PlatEMO.
  • The function NDSort.m now supports sorting populations with constraints, and many MOEAs, including NSGA-II, now support solving constrained problems (a sketch of the constrained NDSort call follows this list).
  • Added icons for all menu items in the GUI.
  • Results can now be plotted in the experimental module by right-clicking the statistical table.
  • Fixed some minor bugs in MOEAs, MOPs, and the GUI.
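For users who call PlatEMO's library functions directly, the sketch below illustrates the constrained non-dominated sorting mentioned above. It is only an illustration: the two- and three-argument call forms of NDSort.m are assumptions based on the user manual, and the objective and constraint values here are randomly generated.

    % Illustrative sketch of NDSort.m (assumed signatures; see the user manual):
    %   [FrontNo,MaxFNo] = NDSort(PopObj,nSort)          % unconstrained sorting
    %   [FrontNo,MaxFNo] = NDSort(PopObj,PopCon,nSort)   % constraint-aware sorting
    PopObj = rand(100,2);        % objective values of 100 solutions, 2 objectives
    PopCon = rand(100,1) - 0.8;  % constraint values, where <= 0 means feasible

    % Non-dominated sorting of the whole population, ignoring constraints
    [FrontNo,MaxFNo] = NDSort(PopObj,size(PopObj,1));

    % The same sorting taking constraint violations into account, so that
    % infeasible solutions are placed in worse fronts
    [FrontNoC,MaxFNoC] = NDSort(PopObj,PopCon,size(PopObj,1));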
PlatEMO v1.1  (2017/04/10)
  • More popular MOEAs and MOPs: Currently there are 61 MOEAs and 126 MOPs in PlatEMO.
  • Simpler way to execute experiments in the experimental module: the configuration of each experiment is saved automatically when execution starts, and the configuration files can be reloaded so that users do not need to configure the same experiment again.
  • Performance metrics ensemble: the performance metrics ensemble method is adopted in the experimental module, which provides a ranking of all compared MOEAs on one MOP with respect to all performance indicators.
  • Fixed some minor bugs in the experimental module and enhanced the stability of the GUI.
  • Corrected the sampling method for the Pareto front of MaF9.
PlatEMO v1.0  (2017/02/17)
  • The first release of PlatEMO.
The MOEAs Included in PlatEMO

Algorithm Year of Publication Description
Multi-Objective Genetic Algorithms
SPEA2 [2] 2001 Strength Pareto evolutionary algorithm 2
PESA-II [3] 2001 Pareto envelope-based selection algorithm II
NSGA-II [1] 2002 Non-dominated sorting genetic algorithm II
ϵ-MOEA [8] 2003 Multi-objective evolutionary algorithm based on ϵ-dominance
IBEA [9] 2004 Indicator-based evolutionary algorithm
MOEA/D [4] 2007 Multi-objective evolutionary algorithm based on decomposition
SMS-EMOA [10] 2007 S metric selection evolutionary multi-objective optimization algorithm
MSOPS-II [11] 2007 Multiple single objective Pareto sampling algorithm II
MTS [12] 2009 Multiple trajectory search
AGE-II [13] 2013 Approximation-guided evolutionary algorithm II
NSLS [14] 2015 Non-dominated sorting and local search
BCE-IBEA [15] 2015 Bi-criterion evolution for IBEA
MOEA/IGD-NS [16] 2016 Multi-objective evolutionary algorithm based on an enhanced inverted generational distance metric
Many-Objective Genetic Algorithms
HypE [17] 2011 Hypervolume-based estimation algorithm
PICEA-g [18] 2013 Preference-inspired coevolutionary algorithm with goals
GrEA [19] 2013 Grid-based evolutionary algorithm
NSGA-III [20] 2014 Non-dominated sorting genetic algorithm III
A-NSGA-III [21] 2014 Adaptive NSGA-III
SPEA2+SDE [22] 2014 SPEA2 with shift-based density estimation
BiGE [23] 2015 Bi-goal evolution
EFR-RR [7] 2015 Ensemble fitness ranking with ranking restriction
I-DBEA [24] 2015 Improved decomposition based evolutionary algorithm
KnEA [25] 2015 Knee point driven evolutionary algorithm
MaOEA-DDFC [26] 2015 Many-objective evolutionary algorithm based on directional diversity and favorable convergence
MOEA/DD [27] 2015 Multi-objective evolutionary algorithm based on dominance and decomposition
MOMBI-II [28] 2015 Many-objective metaheuristic based on the R2 indicator II
Two Arch2 [29] 2015 Two-archive algorithm 2
MaOEA-R&D [30] 2016 Many-objective evolutionary algorithm based on objective space reduction and diversity improvement
RPEA [31] 2016 Reference points-based evolutionary algorithm
RVEA [32] 2016 Reference vector guided evolutionary algorithm
RVEA* [32] 2016 RVEA embedded with the reference vector regeneration strategy
SPEA/R [33] 2016 Strength Pareto evolutionary algorithm based on reference direction
θ-DEA [34] 2016 θ-dominance based evolutionary algorithm
Multi-Objective Genetic Algorithms for Large-Scale Optimization
MOEA/DVA [35] 2016 Multi-objective evolutionary algorithm based on decision variable analyses
LMEA [36] 2016 Large-scale many-objective evolutionary algorithm
Multi-Objective Genetic Algorithms with Preference
g-NSGA-II [37] 2009 g-dominance based NSGA-II
r-NSGA-II [38] 2010 r-dominance based NSGA-II
WV-MOEA-P [39] 2016 Weight vector based multi-objective optimization algorithm with preference
Multi-objective Differential Evolution Algorithms
GDE3 [40] 2005 Generalized differential evolution 3
MOEA/D-DE [5] 2009 MOEA/D based on differential evolution
Multi-objective Particle Swarm Optimization Algorithms
MOPSO [41] 2002 Multi-objective particle swarm optimization
SMPSO [42] 2009 Speed-constrained multi-objective particle swarm optimization
dMOPSO [43] 2011 Decomposition-based particle swarm optimization
Multi-objective Memetic Algorithms
M-PAES [44] 2000 Memetic algorithm based on Pareto archived evolution strategy
Multi-objective Estimation of Distribution Algorithms
MO-CMA [45] 2007 Multi-objective covariance matrix adaptation
RM-MEDA [46] 2008 Regularity model-based multi-objective estimation of distribution algorithm
IM-MOEA [47] 2015 Inverse modeling multi-objective evolutionary algorithm
Surrogate Model Based Multi-objective Algorithms
ParEGO [48] 2006 Efficient global optimization for Pareto optimization
SMS-EGO [49] 2008 S-metric-selection-based efficient global optimization
K-RVEA [50] 2016 Kriging assisted RVEA

The MOPs Included in PlatEMO

Problem Year of Publication Description
MOKP [51] 1999 Multi-objective 0/1 knapsack problem, with the behavior of MOEAs on this problem analyzed in [52]

ZDT1–ZDT6 [53] 2000 Multi-objective test problems
mQAP [54] 2003 Multi-objective quadratic assignment problem
DTLZ1–DTLZ9 [55] 2005 Scalable multi-objective test problems
WFG1–WFG9 [56] 2006 Scalable multi-objective test problems, with the degenerate problem WFG3 analyzed in [57]
MONRP [58] 2007 Multi-objective next release problem
MOTSP [59] 2007 Multi-objective traveling salesperson problem
Pareto-Box [60] 2007 Pareto-Box problem
CF1–CF10 [61] 2008 Constrained multi-objective test problems for the CEC 2009 special session and competition
F1–F10 for RM-MEDA [46] 2008 The test problems designed for RM-MEDA
UF1–UF12 [61] 2008 Unconstrained multi-objective test problems for the CEC 2009 special session and competition
F1–F9 for MOEA/D-DE [5] 2009 The test problems extended from [62] designed for MOEA/D-DE

C1_DTLZ1, C2_DTLZ2, C3_DTLZ4, IDTLZ1, IDTLZ2 [21] 2014 Constrained DTLZ and inverted DTLZ problems
F1–F7 for MOEA/D-M2M [6] 2014 The test problems designed for MOEA/D-M2M
F1–F10 for IM-MOEA [47] 2015 The test problems designed for IM-MOEA
BT1–BT9 [63] 2016 Multi-objective test problems with bias
LSMOP1–LSMOP9 [64] 2016 Large-scale multi-objective test problems

REFERENCES
[1] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan, "A fast and elitist multi-objective genetic algorithm: NSGA-II," IEEE Transactions on Evolutionary Computation, vol. 6, no. 2, pp. 182–197, 2002.
[2] E. Zitzler, M. Laumanns, and L. Thiele, "SPEA2: Improving the strength Pareto evolutionary algorithm for multiobjective optimization," in Proceedings of the Fifth Conference on Evolutionary Methods for Design, Optimization and Control with Applications to Industrial Problems, 2001, pp. 95–100.
[3] D. W. Corne, N. R. Jerram, J. D. Knowles, and M. J. Oates, "PESA-II: Region-based selection in evolutionary multi-objective optimization," in Proceedings of the 2001 Genetic and Evolutionary Computation Conference, 2001, pp. 283–290.
[4] Q. Zhang and H. Li, "MOEA/D: A multi-objective evolutionary algorithm based on decomposition," IEEE Transactions on Evolutionary Computation, vol. 11, no. 6, pp. 712–731, 2007.
[5] H. Li and Q. Zhang, "Multiobjective optimization problems with complicated Pareto sets, MOEA/D and NSGA-II," IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 284–302, 2009.
[6] H.-L. Liu, F. Gu, and Q. Zhang, "Decomposition of a multiobjective optimization problem into a number of simple multiobjective subproblems," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 450–455, 2014.
[7] Y. Yuan, H. Xu, B. Wang, B. Zhang, and X. Yao, "Balancing convergence and diversity in decomposition-based many-objective optimizers," IEEE Transactions on Evolutionary Computation, vol. 20, no. 2, pp. 180–198, 2016.
[8] K. Deb, M. Mohan, and S. Mishra, "Towards a quick computation of well-spread Pareto-optimal solutions," in Proceedings of the International Conference on Evolutionary Multi-Criterion Optimization, 2003, pp. 222–236.
[9] E. Zitzler and S. Künzli, "Indicator-based selection in multiobjective search," in Proceedings of the 8th International Conference on Parallel Problem Solving from Nature, 2004, pp. 832–842.
[10] N. Beume, B. Naujoks, and M. Emmerich, "SMS-EMOA: Multiobjective selection based on dominated hypervolume," European Journal of Operational Research, vol. 181, no. 3, pp. 1653–1669, 2007.
[11] E. J. Hughes, "MSOPS-II: A general-purpose many-objective optimiser," in Proceedings of the 2007 IEEE Congress on Evolutionary Computation, 2007, pp. 3944–3951.
[12] L.-Y. Tseng and C. Chen, "Multiple trajectory search for unconstrained/constrained multi-objective optimization," in Proceedings of the 2009 IEEE Congress on Evolutionary Computation, 2009, pp. 1951–1958.
[13] M. Wagner and F. Neumann, "A fast approximation-guided evolutionary multi-objective algorithm," in Proceedings of the 15th Annual Conference on Genetic and Evolutionary Computation, 2013, pp. 687–694.
[14] B. Chen, W. Zeng, Y. Lin, and D. Zhang, "A new local search-based multiobjective optimization algorithm," IEEE Transactions on Evolutionary Computation, vol. 19, no. 1, pp. 50–73, 2015.
[15] M. Li, S. Yang, and X. Liu, "Pareto or non-Pareto: Bi-criterion evolution in multi-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 5, pp. 645–665, 2015.
[16] Y. Tian, X. Zhang, R. Cheng, and Y. Jin, "A multi-objective evolutionary algorithm based on an enhanced inverted generational distance metric," in Proceedings of the 2016 IEEE Congress on Evolutionary Computation, 2016, pp. 5222–5229.
[17] J. Bader and E. Zitzler, "HypE: An algorithm for fast hypervolume-based many-objective optimization," Evolutionary Computation, vol. 19, no. 1, pp. 45–76, 2011.
[18] R. Wang, R. C. Purshouse, and P. J. Fleming, "Preference-inspired coevolutionary algorithms for many-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 17, no. 4, pp. 474–494, 2013.
[19] S. Yang, M. Li, X. Liu, and J. Zheng, "A grid-based evolutionary algorithm for many-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 17, no. 5, pp. 721–736, 2013.
[20] K. Deb and H. Jain, "An evolutionary many-objective optimization algorithm using reference-point based non-dominated sorting approach, part I: Solving problems with box constraints," IEEE Transactions on Evolutionary Computation, vol. 18, no. 4, pp. 577–601, 2014.
[21] H. Jain and K. Deb, "An evolutionary many-objective optimization algorithm using reference-point based non-dominated sorting approach, part II: Handling constraints and extending to an adaptive approach," IEEE Transactions on Evolutionary Computation, vol. 18, no. 4, pp. 602–622, 2014.
[22] M. Li, S. Yang, and X. Liu, "Shift-based density estimation for Pareto-based algorithms in many-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 348–365, 2014.
[23] M. Li, S. Yang, and X. Liu, "Bi-goal evolution for many-objective optimization problems," Artificial Intelligence, vol. 228, pp. 45–65, 2015.
[24] M. Asafuddoula, T. Ray, and R. Sarker, "A decomposition-based evolutionary algorithm for many objective optimization," IEEE Transactions on Evolutionary Computation, vol. 19, no. 3, pp. 445–460, 2015.
[25] X. Zhang, Y. Tian, and Y. Jin, "A knee point driven evolutionary algorithm for many-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 19, no. 6, pp. 761–776, 2015.
[26] J. Cheng, G. Yen, and G. Zhang, "A many-objective evolutionary algorithm with enhanced mating and environmental selections," IEEE Transactions on Evolutionary Computation, vol. 19, pp. 592–605, 2015.
[27] K. Li, K. Deb, Q. Zhang, and S. Kwong, "Combining dominance and decomposition in evolutionary many-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 19, no. 5, pp. 694–716, 2015.
[28] R. Hernández Gómez and C. A. Coello Coello, "Improved metaheuristic based on the R2 indicator for many-objective optimization," in Proceedings of the 2015 Genetic and Evolutionary Computation Conference, 2015, pp. 679–686.
[29] H. Wang, L. Jiao, and X. Yao, "Two_Arch2: An improved two-archive algorithm for many-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 19, no. 4, pp. 524–541, 2015.
[30] Z. He and G. G. Yen, "Many-objective evolutionary algorithm: Objective space reduction and diversity improvement," IEEE Transactions on Evolutionary Computation, vol. 20, no. 1, pp. 145–160, 2016.
[31] Y. Liu, D. Gong, X. Sun, and Z. Yong, "Many-objective evolutionary optimization based on reference points," Applied Soft Computing, 2016, in press.
[32] R. Cheng, Y. Jin, M. Olhofer, and B. Sendhoff, "A reference vector guided evolutionary algorithm for many-objective optimization," IEEE Transactions on Evolutionary Computation, 2016, in press.
[33] S. Jiang and S. Yang, "A strength Pareto evolutionary algorithm based on reference direction for multi-objective and many-objective optimization," IEEE Transactions on Evolutionary Computation, 2016, in press.
[34] Y. Yuan, H. Xu, B. Wang, and X. Yao, "A new dominance relation-based evolutionary algorithm for many-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 1, pp. 16–37, 2016.
[35] X. Ma, F. Liu, Y. Qi, X. Wang, L. Li, L. Jiao, M. Yin, and M. Gong, "A multiobjective evolutionary algorithm based on decision variable analyses for multiobjective optimization problems with large-scale variables," IEEE Transactions on Evolutionary Computation, vol. 20, no. 2, pp. 275–298, 2016.
[36] X. Zhang, Y. Tian, R. Cheng, and Y. Jin, "A decision variable clustering-based evolutionary algorithm for large-scale many-objective optimization," IEEE Transactions on Evolutionary Computation, 2016, in press.
[37] J. Molina, L. V. Santana, A. G. Hernández-Díaz, C. A. C. Coello, and R. Caballero, "g-dominance: Reference point based dominance for multiobjective metaheuristics," European Journal of Operational Research, vol. 197, no. 2, pp. 685–692, 2009.
[38] L. B. Said, S. Bechikh, and K. Ghédira, "The r-dominance: A new dominance relation for interactive evolutionary multicriteria decision making," IEEE Transactions on Evolutionary Computation, vol. 14, no. 5, pp. 801–818, 2010.
[39] X. Zhang, X. Jiang, and L. Zhang, "A weight vector based multi-objective optimization algorithm with preference," Acta Electronica Sinica, vol. 44, no. 11, pp. 2639–2645, 2016.
[40] S. Kukkonen and J. Lampinen, "GDE3: The third evolution step of generalized differential evolution," in Proceedings of the 2005 IEEE Congress on Evolutionary Computation, vol. 1, 2005, pp. 443–450.
[41] C. A. Coello Coello and M. S. Lechuga, "MOPSO: A proposal for multiple objective particle swarm optimization," in Proceedings of the 2002 IEEE Congress on Evolutionary Computation, vol. 2, 2002, pp. 1051–1056.
[42] A. J. Nebro, J. J. Durillo, J. Garcia-Nieto, C. A. Coello Coello, F. Luna, and E. Alba, "SMPSO: A new PSO-based metaheuristic for multi-objective optimization," in Proceedings of the IEEE Symposium on Computational Intelligence in Multi-Criteria Decision-Making, 2009, pp. 66–73.
[43] S. Zapotecas Martínez and C. A. Coello Coello, "A multi-objective particle swarm optimizer based on decomposition," in Proceedings of the 13th Annual Conference on Genetic and Evolutionary Computation, 2011, pp. 69–76.
[44] J. D. Knowles and D. W. Corne, "M-PAES: A memetic algorithm for multiobjective optimization," in Proceedings of the 2000 IEEE Congress on Evolutionary Computation, 2000, pp. 325–332.
[45] C. Igel, N. Hansen, and S. Roth, "Covariance matrix adaptation for multi-objective optimization," Evolutionary Computation, vol. 15, no. 1, pp. 1–28, 2007.
[46] Q. Zhang, A. Zhou, and Y. Jin, "RM-MEDA: A regularity model-based multiobjective estimation of distribution algorithm," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 41–63, 2008.
[47] R. Cheng, Y. Jin, K. Narukawa, and B. Sendhoff, "A multiobjective evolutionary algorithm using Gaussian process-based inverse modeling," IEEE Transactions on Evolutionary Computation, 2015, in press.
[48] J. Knowles, "ParEGO: A hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems," IEEE Transactions on Evolutionary Computation, vol. 10, no. 1, pp. 50–66, 2006.
[49] W. Ponweiser, T. Wagner, D. Biermann, and M. Vincze, "Multiobjective optimization on a limited budget of evaluations using model-assisted S-metric selection," in Proceedings of the International Conference on Parallel Problem Solving from Nature, 2008, pp. 784–794.
[50] T. Chugh, Y. Jin, K. Miettinen, J. Hakanen, and K. Sindhya, "A surrogate-assisted reference vector guided evolutionary algorithm for computationally expensive many-objective optimization," IEEE Transactions on Evolutionary Computation, 2016, in press.
[51] E. Zitzler and L. Thiele, "Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach," IEEE Transactions on Evolutionary Computation, vol. 3, no. 4, pp. 257–271, 1999.
[52] H. Ishibuchi, N. Akedo, and Y. Nojima, "Behavior of multiobjective evolutionary algorithms on many-objective knapsack problems," IEEE Transactions on Evolutionary Computation, vol. 19, no. 2, pp. 264–283, 2015.
[53] E. Zitzler, K. Deb, and L. Thiele, "Comparison of multiobjective evolutionary algorithms: Empirical results," Evolutionary Computation, vol. 8, no. 2, pp. 173–195, 2000.
[54] J. Knowles and D. Corne, "Instance generators and test suites for the multiobjective quadratic assignment problem," in Proceedings of the International Conference on Evolutionary Multi-Criterion Optimization, 2003, pp. 295–310.
[55] K. Deb, L. Thiele, M. Laumanns, and E. Zitzler, "Scalable test problems for evolutionary multiobjective optimization," 2005.
[56] S. Huband, P. Hingston, L. Barone, and L. While, "A review of multiobjective test problems and a scalable test problem toolkit," IEEE Transactions on Evolutionary Computation, vol. 10, no. 5, pp. 477–506, 2006.
[57] H. Ishibuchi, H. Masuda, and Y. Nojima, "Pareto fronts of many-objective degenerate test problems," IEEE Transactions on Evolutionary Computation, vol. 20, no. 5, pp. 807–813, 2016.
[58] Y. Zhang, M. Harman, and S. A. Mansouri, "The multi-objective next release problem," in Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation, 2007, pp. 1129–1137.
[59] D. W. Corne and J. D. Knowles, "Techniques for highly multiobjective optimisation: Some nondominated points are better than others," in Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation, 2007, pp. 773–780.
[60] M. Köppen and K. Yoshida, "Substitute distance assignments in NSGA-II for handling many-objective optimization problems," in Proceedings of the International Conference on Evolutionary Multi-Criterion Optimization, 2007, pp. 727–741.
[61] Q. Zhang, A. Zhou, S. Zhao, P. N. Suganthan, W. Liu, and S. Tiwari, "Multiobjective optimization test instances for the CEC 2009 special session and competition," University of Essex, Colchester, UK, and Nanyang Technological University, Tech. Rep. CES-487, 2008.
[62] T. Okabe, Y. Jin, M. Olhofer, and B. Sendhoff, "On test functions for evolutionary multi-objective optimization," in Proceedings of the International Conference on Parallel Problem Solving from Nature, 2004, pp. 792–802.
[63] H. Li, Q. Zhang, and J. Deng, "Biased multiobjective optimization and decomposition algorithm," IEEE Transactions on Cybernetics, 2016, in press.
[64] R. Cheng, Y. Jin, M. Olhofer, and B. Sendhoff, "Test problems for large-scale multiobjective and many-objective optimization," IEEE Transactions on Cybernetics, 2016, in press.
PlatEMO without GUI

PlatEMO provides a single interface, main(), to users. If this function is invoked with input parameters, PlatEMO runs without the GUI; if it is invoked without any input parameter, PlatEMO runs with the GUI. When PlatEMO runs without the GUI, users specify the MOEA to be executed, the MOP to be solved, the operator to be used, and all other parameter settings by passing input arguments to main(). Please refer to the user manual for details of all acceptable parameters of main(). After the MOEA terminates, the final population can be displayed in a figure or saved to a .mat file. In the figure, users can select the data to be displayed from the menu, including the Pareto front of the result, the Pareto set of the result, the true Pareto front, and the evolutionary trajectory of any performance indicator value.
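As an illustration, a call of the following form runs an MOEA on an MOP without opening the GUI. This is only a sketch: the parameter names ('-algorithm', '-problem', '-N', '-M', '-evaluation') and the function handles @NSGAII and @DTLZ2 follow the examples in the user manual, which documents the complete list of acceptable parameters and their default values.

    % Sketch: run NSGA-II on DTLZ2 with 3 objectives, a population size of 100,
    % and 10000 function evaluations, without opening the GUI. Parameter names
    % are assumptions based on the user manual of PlatEMO v1.x.
    main('-algorithm',@NSGAII,'-problem',@DTLZ2,'-M',3,'-N',100,'-evaluation',10000);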

Test Module

If main() is invoked without any input parameter, the GUI of PlatEMO is shown, which currently contains two modules, i.e., the test module and the experimental module. The test module is used to analyze the performance of a single MOEA, where one MOEA is executed on one MOP at a time. It provides functions similar to running PlatEMO without the GUI, but users do not need to write any additional command or code.

Experimental Module

The experimental module is designed for statistical experiments, where multiple MOEAs can be executed on a batch of MOPs at the same time. Unlike the test module, which shows a figure, the experimental module shows a table containing all the statistical experimental results, including the mean and standard deviation of each performance indicator value and the result of the Wilcoxon rank-sum test. Afterwards, the table can be saved as an Excel table (.xlsx) or a LaTeX table (.tex).
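As background on the significance results reported in the table, the Wilcoxon rank-sum test checks whether two samples of indicator values (e.g., the IGD values of two MOEAs over 30 runs) differ significantly. The sketch below is plain MATLAB using the Statistics and Machine Learning Toolbox function ranksum on made-up data, not PlatEMO code, purely to illustrate what the test reports.

    % Illustration only (not PlatEMO code): Wilcoxon rank-sum test on two
    % hypothetical samples of IGD values from 30 independent runs each.
    igdA = 0.05 + 0.01*randn(30,1);   % indicator values of algorithm A (made up)
    igdB = 0.06 + 0.01*randn(30,1);   % indicator values of algorithm B (made up)

    % ranksum returns the p-value and, at the default 5% significance level,
    % h = 1 if the two samples differ significantly.
    [p,h] = ranksum(igdA,igdB);
    fprintf('p = %.4f, significant difference: %d\n',p,h);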

Support
The copyright of PlatEMO belongs to the BIMK group. You are free to use PlatEMO for research purposes. All publications that use this platform or any code in it should acknowledge the use of "PlatEMO" and reference "Ye Tian, Ran Cheng, Xingyi Zhang, and Yaochu Jin, PlatEMO: A MATLAB Platform for Evolutionary Multi-Objective Optimization, IEEE Computational Intelligence Magazine, 2017, in press".
  • If you have any question, comment, or suggestion about PlatEMO or the algorithms in PlatEMO, please contact Ye Tian (field910921@gmail.com).
  • If you want to add your MOEA, MOP, operator, or performance indicator to PlatEMO, please send the MATLAB code (compatible with PlatEMO) and the relevant literature to field910921@gmail.com.