Applied Computational Intelligence and Soft Computing (Jan 2024)
Fuzzy Neural Network for Fuzzy Quadratic Programming With Penalty Function and Mean-Variance Markowitz Portfolio Model
Abstract
This research integrates fuzzy neural networks with a penalty function to solve the fuzzy quadratic programming problem arising from the mean-variance Markowitz portfolio model. The fuzzy quadratic programming problem with penalty function comprises lower, central, and upper models, each of which is solved with a fuzzy neural network. The proposed method is applied to six leading stocks on the Pakistan Stock Exchange, using trading data from January 2016 to October 2020, to identify optimal portfolios for prospective investors. Six optimizers are compared: RMSprop, Momentum, Adadelta, Adagrad, Adam, and gradient descent. Across all three phases (lower, central, and upper), the optimizers agree on the optimal investment portfolios and recommend investing in one of two groups. The first group, comprising FFC, ARPL, and UPFL, pursues higher returns at the cost of greater variability and risk; it is a high-risk group. The second group, comprising LUCK, AGTL, and IGIHL, aims to reduce return variability while lowering risk; it is a risk-averse group. All optimizers recommend investing in FFC, ARPL, and UPFL, except Adam and Adadelta, which recommend IGIHL, AGTL, and LUCK. RMSprop, Momentum, Adagrad, and gradient descent increase variability, risk, and returns. Adam proves the best optimizer, followed by RMSprop and then Adagrad. Adam, Adadelta, and RMSprop are sensitive to fuzzy uncertain data, whereas Momentum and gradient descent are unresponsive to it. The percent improvement in the objective is 0.59% and 0.18% for the proposed Adagrad and Adadelta, respectively.
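To make the setting concrete, the following is a minimal sketch of a penalized mean-variance portfolio optimization solved by gradient-based iteration, in the spirit of the approach the abstract describes. It is not the authors' implementation: the return vector, covariance matrix, penalty weight, learning rate, and asset names are illustrative assumptions, and plain gradient descent stands in for the paper's optimizers (Adam, RMSprop, etc.).

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical expected returns and covariance for six assets
# (stand-ins for the six PSX stocks studied in the paper).
mu = np.array([0.012, 0.010, 0.008, 0.015, 0.009, 0.011])
A = rng.normal(size=(6, 6))
Sigma = A @ A.T / 6 + 0.01 * np.eye(6)   # positive-definite covariance

lam = 1.0     # risk-return trade-off weight (assumed)
rho = 100.0   # penalty strength (assumed)

def objective(w):
    # Mean-variance objective plus quadratic penalties for the
    # budget constraint (weights sum to 1) and no short selling.
    budget = (w.sum() - 1.0) ** 2
    shorting = np.sum(np.minimum(w, 0.0) ** 2)
    return w @ Sigma @ w - lam * mu @ w + rho * (budget + shorting)

def gradient(w):
    g = 2.0 * Sigma @ w - lam * mu
    g += rho * 2.0 * (w.sum() - 1.0)        # gradient of budget penalty
    g += rho * 2.0 * np.minimum(w, 0.0)     # gradient of shorting penalty
    return g

# Plain gradient descent on the penalized objective; the paper compares
# RMSprop, Momentum, Adadelta, Adagrad, Adam, and gradient descent here.
w = np.full(6, 1.0 / 6)
lr = 1e-3
for _ in range(20000):
    w -= lr * gradient(w)

print("weights:", np.round(w, 4), "objective:", float(objective(w)))

In the paper, this crisp formulation is replaced by a fuzzy one, yielding the lower, central, and upper models mentioned above, each solved with a fuzzy neural network under the same penalty-function idea.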