Bayesian Hyperparameter Optimization Analysis for Sustainable Innovation Performance Prediction Model
DOI: https://doi.org/10.29303/emj.v8i2.266

Keywords: Sustainable Innovation Performance, Bayesian Optimization, Predictive Modelling

Abstract
This study examines how well a Gaussian Process Regression (GPR) model interprets the optimization outcomes obtained through Bayesian Optimization (BO) with Keras Tuner, specifically in the context of predicting Sustainable Innovation Performance (SIP). The GPR surrogate model is used to analyse the optimization results and provides insight into the exploration and exploitation strategies applied while searching for the most effective hyperparameters. GPR performance was evaluated with the Mean Absolute Error (MAE), bootstrapped 1000 times to establish a 95% confidence interval (CI). The findings demonstrate that GPR reliably predicts the validation loss produced by BO, with small prediction errors and stable confidence intervals. The results indicate that GPR serves as a dependable statistical method for assessing uncertainty in Bayesian-based optimization, and they offer valuable perspectives on how exploration and exploitation strategies can be used to reach optimal hyperparameter configurations. By strategically balancing exploitation and exploration, Bayesian Optimization can enhance the identification of optimal hyperparameter configurations in sustainable innovation prediction models.
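The abstract describes hyperparameter search with Bayesian Optimization in Keras Tuner. A minimal sketch of such a setup is given below for reference; the network architecture, search ranges, and names (build_model, units, dropout, lr) are illustrative assumptions and do not reflect the authors' actual SIP model or configuration.

```python
# Illustrative sketch (not the study's code): Bayesian Optimization of a small
# regression network with Keras Tuner, minimizing validation loss.
import keras_tuner as kt
import tensorflow as tf


def build_model(hp):
    # Hypothetical search space: layer width, dropout rate, and learning rate.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(hp.Int("units", min_value=16, max_value=256, step=16),
                              activation="relu"),
        tf.keras.layers.Dropout(hp.Float("dropout", min_value=0.0, max_value=0.5, step=0.1)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(
            hp.Float("lr", min_value=1e-4, max_value=1e-2, sampling="log")),
        loss="mse",
    )
    return model


tuner = kt.BayesianOptimization(
    build_model,
    objective="val_loss",      # the quantity the GPR surrogate later predicts
    max_trials=30,             # illustrative budget, not the study's setting
    num_initial_points=5,
    seed=42,
    directory="bo_trials",
    project_name="sip_bo",
)

# The SIP dataset is not available here, so the search call is left commented out:
# tuner.search(X_train, y_train, validation_data=(X_val, y_val), epochs=100)
```

After the search, `tuner.oracle.trials` holds the hyperparameter configurations and their validation losses, which is the history a surrogate analysis would consume.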
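The evaluation step described in the abstract, fitting a GPR surrogate to the optimization history and bootstrapping the MAE 1000 times for a 95% CI, could look roughly like the following sketch. The trial data are synthetic placeholders and the Matern kernel is an assumption; neither necessarily matches the study's data or kernel choice.

```python
# Illustrative sketch: GPR surrogate over (hyperparameters -> validation loss)
# pairs, with a percentile-bootstrap 95% confidence interval for the MAE.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder optimization history: each row is a configuration (learning rate,
# number of units) and y is the validation loss reported by the tuner.
X = rng.uniform(low=[1e-4, 16], high=[1e-2, 256], size=(100, 2))
y = 0.3 + 2.0 * X[:, 0] + 0.001 * X[:, 1] + rng.normal(scale=0.02, size=100)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# GPR surrogate; a Matern kernel is a common choice in Bayesian optimization.
gpr = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True, random_state=0)
gpr.fit(X_train, y_train)
y_pred = gpr.predict(X_test)

# Bootstrap the MAE 1000 times over the held-out trials for a 95% CI.
n_boot, n = 1000, len(y_test)
maes = np.empty(n_boot)
for b in range(n_boot):
    idx = rng.integers(0, n, size=n)  # resample test points with replacement
    maes[b] = mean_absolute_error(y_test[idx], y_pred[idx])

lo, hi = np.percentile(maes, [2.5, 97.5])
print(f"MAE = {mean_absolute_error(y_test, y_pred):.4f}, 95% CI = [{lo:.4f}, {hi:.4f}]")
```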
