2024, issue 4, p. 32-42

Received 30.10.2024; Revised 16.11.2024; Accepted 03.12.2024

Published 18.12.2024; First Online 23.12.2024

https://doi.org/10.34229/2707-451X.24.4.3


 

UDC 519.711

Estimating the Significance of Computer Model Factors Based on a Simple Neural Network

Volodymyr Pepelyaev,   Nataliia Oriekhova *,   Ihor Lukyanov

V.M. Glushkov Institute of Cybernetics of the NAS of Ukraine, Kyiv

* Corresponding author

 

Introduction. Modern computing resources make it possible to build models of virtually any complexity. This tempts developers of computer models of complex systems into excessive detail. Experienced simulation specialists note that, at the preliminary stage of model development, the share of insignificant factors can sometimes reach 80%. Such an increase in dimensionality not only considerably complicates computer experiments, but can also obscure the interactions of the important factors that determine the essence of how a complex system functions. Identifying insignificant factors is therefore no less important for further study of the model, and especially for optimization of the complex system.

The purpose of the work is to develop an algorithm for identifying insignificant factors when the available training data are limited, with the number of samples exceeding the number of factors by only a factor of 2–3. For this purpose, a neural network regression model built with the Keras library was used. Artificially generated datasets were used in experiments to select the network parameters (the number of layers, the number of hidden neurons per layer, and the number of learning epochs).
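
For illustration only, a minimal sketch of such a Keras regression model is given below. The layer width, activation, optimizer, epoch count, and the artificial dataset are assumptions made for the sketch, not the exact configuration reported in the paper.

# Minimal sketch of a small Keras regression model of the kind described above.
# All sizes and hyperparameters here are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_factors = 20               # number of model factors (illustrative)
n_samples = 3 * n_factors    # training set only 2-3 times larger than the factor count

# Artificial dataset: only the first three factors actually influence the response.
rng = np.random.default_rng(0)
X = rng.uniform(size=(n_samples, n_factors))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.05, size=n_samples)

model = keras.Sequential([
    layers.Input(shape=(n_factors,)),
    layers.Dense(16, activation="relu"),   # one small hidden layer
    layers.Dense(1),                       # linear output for regression
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=200, batch_size=16, verbose=0)
print(model.evaluate(X, y, verbose=0))     # mean squared error on the training data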

Results. The resulting neural network model performed effectively on the test datasets. The model was then used to assess the significance of factors in sets of initial populations for a study of a multipopulation genetic algorithm (MGA).

Conclusions. The proposed algorithm based on a simple neural network makes it possible to identify insignificant factors correctly and quickly in a set of initial populations for the MGA study containing 8 to 10 populations (250–300 samples). Because the initial weights of the neural network are chosen randomly, the results of different runs on the same dataset differ slightly. In the general case of assessing the significance of computer model factors, several runs should therefore be made to obtain more reliable results.
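
The text does not specify how factor significance is read off the trained network, so the sketch below uses a permutation-based estimate purely as an assumed stand-in, averaged over several runs with different random seeds as the conclusions recommend; the data and sizes repeat the illustrative setup of the previous sketch.

# Hedged sketch: permutation-based significance averaged over several runs.
# The significance measure itself is an assumption, not the paper's method.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

rng = np.random.default_rng(0)
n_factors, n_samples = 20, 60                 # illustrative sizes, as in the sketch above
X = rng.uniform(size=(n_samples, n_factors))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.05, size=n_samples)

def permutation_significance(model, X, y):
    """Increase in MSE when a factor's column is shuffled (values near zero suggest an insignificant factor)."""
    base = model.evaluate(X, y, verbose=0)
    scores = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        np.random.default_rng(j).shuffle(Xp[:, j])   # destroy the information carried by factor j
        scores.append(model.evaluate(Xp, y, verbose=0) - base)
    return np.array(scores)

def run_once(seed):
    keras.utils.set_random_seed(seed)         # different initial weights for each run
    model = keras.Sequential([layers.Input(shape=(n_factors,)),
                              layers.Dense(16, activation="relu"),
                              layers.Dense(1)])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=200, batch_size=16, verbose=0)
    return permutation_significance(model, X, y)

# Average over several runs, since single runs vary slightly with the random initialization.
significance = np.mean([run_once(s) for s in range(5)], axis=0)
print(significance.round(3))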

 

Keywords: set of initial populations, significance of factors, neural network, learning epochs.

 

Cite as: Pepelyaev V., Oriekhova N., Lukyanov I. Estimating the Significance of Computer Model Factors Based on a Simple Neural Network. Cybernetics and Computer Technologies. 2024. 4. P. 32–42. (in Ukrainian) https://doi.org/10.34229/2707-451X.24.4.3

 

References

1. Horne G.E., Meyer T.E. Data Farming: Discovering Surprise. Proceedings of the Winter Simulation Conference. 2005. P. 1082–1087. https://doi.org/10.1109/WSC.2005.1574362

2. SEED Center for Data Farming. http://harvest.nps.edu/ (accessed: 30.10.2024)

3. Barry Ph., Koehler M. Simulation in context: using Data Farming for decision support. Proceedings of the Winter Simulation Conference. 2004. P. 814–819. https://doi.org/10.1287/ijoc.14.3.192.113

4. Horne G.E., Schwierz K.-P. Data Farming around the world overview. Proceedings of the Winter Simulation Conference. 2008. P. 1442–1447. https://doi.org/10.1109/WSC.2008.4736222

5. Choo C.S., Ng E.C., Ang D., Chua C.L. Data Farming in Singapore: A brief history. Proceedings of the Winter Simulation Conference. 2008. P. 1448–1455. https://doi.org/10.1109/WSC.2008.4736223

6. Fu M. Optimization for Simulation: Theory and Practice. INFORMS Journal on Computing. 2002. 14 (3). P. 192–215. https://doi.org/10.1287/ijoc.14.3.192.113

7. April J., Glover F., Kelly J.P., Laguna M. Practical introduction to simulation optimization. Proceedings of the Winter Simulation Conference. 2003. P. 71–78. https://doi.org/10.1109/WSC.2003.1261410

8. Pepelyaev V.A. Regarding the integration of optimization methods and simulation modeling. Theory of Optimal Solutions. 2003. No. 2. P. 51–61. (in Russian) http://dspace.nbuv.gov.ua/handle/123456789/84855

9. Pepelyaev V.A. Planning optimization-simulation experiments. Cybernetics and Systems Analysis. 2006. 42 (6). P. 866–875. https://doi.org/10.1007/s10559-006-0126-z

10. Lytvynenko F.A., Lukyanov I.O., Krykovlyuk E.A. Peculiarities of implementation of the parallel version of the multipopulation genetic algorithm. Computer Mathematics. 2018. No. 2. P. 21–29. (in Russian) http://dspace.nbuv.gov.ua/handle/123456789/161882

11. Lytvynenko F.A., Lukyanov I.O., Krykovlyuk E.A. On increasing the efficiency of the parallel version of the multipopulation genetic algorithm. Theory of Optimal Solutions. 2019. No. 18. P. 116–122. (in Russian) http://dspace.nbuv.gov.ua/handle/123456789/161683

12. Pepelyaev V.A., Ch'orny Yu.M. On the possibilities of applying genetic algorithms in optimization and simulation experiments. Theory of Optimal Solutions. 2019. No. 18. P. 69–77. http://dspace.nbuv.gov.ua/handle/123456789/161681

13. Lukyanov I.O., Lytvynenko F.A., Koval V.P. On the selection of the size of the initial population for the parallel version of the multipopulation genetic algorithm. IX International School-Seminar "Theory of Decision Making". Uzhhorod, Ukraine. April 15-20, 2019. P. 95–96. (in Russian)

14. Lukyanov I.O., Lytvynenko F.A., Kozlyuk E.M. On the effective use of the initial population of the genetic algorithm. XIX International Scientific and Technical Conference "Problems of Informatics and Modeling". Kharkiv - Odesa, Ukraine. September 11-16, 2019. P. 52–53. (in Russian)

15. Lukyanov I.O. On the effectiveness of a parallel multipopulation genetic algorithm for different numbers of processors. VII International Scientific Conference "Mathematical Modeling, Optimization and Information Technologies". Chisinau - Kyiv - Batumi. November 15-19, 2021. (in Russian)

16. Lukyanov I.O., Lytvynenko F.A. On the selection of the number of processors for a parallel multi-population genetic algorithm. Cybernetics and Computer Technologies. 2022. No. 2. P. 31–37. (in Ukrainian) https://doi.org/10.34229/2707-451X.22.2.3

17. About Keras 3. https://keras.io/about/ (accessed: 30.10.2024)

18. Chollet F. Deep Learning with Python. Simon and Schuster. 2021.

 

 

ISSN 2707-451X (Online)

ISSN 2707-4501 (Print)


 

 


 
