Far East Journal of Electronics and Communications

Keyword: double regularization

Publications Tagged with "double regularization"

TRAINING PI-SIGMA NEURAL NETWORK USING DOUBLE REGULARIZATION

Khidir Shaib Mohamed et al.
3/10/2026

Traditional regularization terms such as L1 and L2 are added to the cost function in neural network learning to improve learning ability and produce sparsity in the solution. L2 regularization adds the squared values of the weights to the cost function, whereas L1 regularization adds their absolute values. This study proposes an online gradient method with a novel double regularization (OGDr) for enhancing the learning ability of pi-sigma neural networks (PSNNs). The double regularization method combines the L1 and L2 penalties, a combination frequently used in machine learning frameworks. To evaluate the learning performance of the proposed method, we applied it to the XOR problem, the parity problem, the Gabor function problem, and the sonar benchmark problem, and compared the numerical results with those of OGL1 and OGL2. The results show that OGDr achieves good learning accuracy and, unlike OGL1 and OGL2, the error decreases monotonically and the gradient of the error function approaches zero throughout learning.

Received: September 8, 2022; Accepted: October 29, 2022
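
In the double-regularization setting described in the abstract, the cost can be read as the prediction error plus lambda1 * sum|w| (the L1 term) plus lambda2 * sum w^2 (the L2 term), applied together in each online gradient update. The following is a minimal, illustrative sketch of such an update for a small pi-sigma network on the XOR problem; the network size, learning rate, and penalty coefficients are assumptions chosen for illustration and are not values from the paper.

```python
import numpy as np

# Illustrative sketch (assumed details): an online gradient rule with a
# combined L1 + L2 ("double") penalty for a 2nd-order pi-sigma network on XOR.
# Hyperparameters below are illustrative, not taken from the paper.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# XOR data with a bias input appended to each sample.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
T = np.array([0.0, 1.0, 1.0, 0.0])

n_sigma = 2                                             # number of summing (sigma) units
W = rng.normal(scale=0.5, size=(n_sigma, X.shape[1]))   # weights of the sigma units
eta, lam1, lam2 = 0.1, 1e-4, 1e-4                       # learning rate, L1 and L2 penalty weights

for epoch in range(5000):
    for x, t in zip(X, T):
        h = W @ x               # outputs of the sigma (summing) units
        net = np.prod(h)        # pi (product) unit
        y = sigmoid(net)

        # Squared-error gradient through the sigmoid and the product unit:
        # d net / d W[j] = x * prod_{k != j} h[k]
        dnet = (y - t) * y * (1.0 - y)
        for j in range(n_sigma):
            prod_others = np.prod(np.delete(h, j))
            grad = dnet * prod_others * x
            # Double regularization: L1 subgradient plus L2 gradient.
            grad += lam1 * np.sign(W[j]) + 2.0 * lam2 * W[j]
            W[j] -= eta * grad

for x, t in zip(X, T):
    print(x[:2], float(sigmoid(np.prod(W @ x))), "target", t)
```

The sign(w) term is the subgradient of the L1 penalty and 2 * lambda2 * w is the gradient of the L2 penalty; applying both in every online update is what the abstract refers to as double regularization.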

Keyword Statistics
Total Publications: 1
Years Active: 1
Latest Publication: 2026
Contributing Authors: 5