AASEC 2019 Conference

The Most Optimal Performance of the Levenberg-Marquardt Algorithm Based on the Number of Neurons in the Hidden Layer
Hindayati Mustafidah, Chintia Permata Putri, Harjono, Suwarsito

Universitas Muhammadiyah Purwokerto


Abstract

The training algorithm is the main driver of an artificial neural network. Its performance is influenced by several parameters, including the number of neurons in the input and hidden layers, the maximum number of epochs, and the learning rate (lr). One benchmark for the performance of a training algorithm is the error, or MSE (mean squared error), it produces: the smaller the error, the better the performance. Testing in a previous study found that the training algorithm with the smallest error was Levenberg-Marquardt (LM), with an average MSE of 0.001 using 10 neurons in the hidden layer. In this study, the LM algorithm was tested with varying numbers of neurons in the hidden layer, namely 2, 4, 5, 7, and 9 neurons, using the same parameters as the previous study. The study uses a mixed method: developing computer programs and quantitatively testing the program output data with statistical tests. The results show that the LM algorithm performed best with 9 neurons in the hidden layer at a learning rate of 0.5, with the smallest error of 0.000137501.
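
As an illustration of the kind of experiment described above, the sketch below fits a one-hidden-layer feedforward network with a Levenberg-Marquardt least-squares solver and compares the resulting MSE across the hidden-layer sizes tested (2, 4, 5, 7, and 9 neurons). This is a minimal sketch under stated assumptions, not the study's implementation: the toy data, the network architecture, and the use of SciPy's least_squares(method="lm") in place of a dedicated LM training routine are illustrative choices, and the study's learning-rate parameter is not modeled here.

# Minimal sketch (not the authors' implementation): train a one-hidden-layer
# network with a Levenberg-Marquardt least-squares solver and compare MSE
# across hidden-layer sizes. Data and architecture are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Toy 1-D regression data (placeholder for the study's training set).
X = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sin(np.pi * X[:, 0]) + 0.05 * rng.standard_normal(len(X))

def unpack(params, n_hidden, n_in):
    """Split the flat parameter vector into layer weights and biases."""
    i = 0
    W1 = params[i:i + n_hidden * n_in].reshape(n_hidden, n_in); i += n_hidden * n_in
    b1 = params[i:i + n_hidden]; i += n_hidden
    W2 = params[i:i + n_hidden]; i += n_hidden
    b2 = params[i]
    return W1, b1, W2, b2

def forward(params, X, n_hidden):
    """One hidden layer with tanh activation, linear output."""
    W1, b1, W2, b2 = unpack(params, n_hidden, X.shape[1])
    hidden = np.tanh(X @ W1.T + b1)
    return hidden @ W2 + b2

def train_lm(X, y, n_hidden):
    """Fit the network weights by Levenberg-Marquardt on the residuals."""
    n_params = n_hidden * X.shape[1] + n_hidden + n_hidden + 1
    x0 = 0.1 * rng.standard_normal(n_params)
    # method="lm" is SciPy's Levenberg-Marquardt implementation.
    res = least_squares(lambda p: forward(p, X, n_hidden) - y, x0, method="lm")
    return np.mean(res.fun ** 2)   # MSE at the solution

# Hidden-layer sizes tested in the study.
for n_hidden in (2, 4, 5, 7, 9):
    print(f"hidden neurons = {n_hidden}: MSE = {train_lm(X, y, n_hidden):.6g}")

Running the loop prints one MSE per hidden-layer size, which mirrors how the study compares configurations by the error each one produces; on real data the configuration with the smallest MSE would be taken as the most optimal.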

Keywords: neuron, hidden layer, learning rate, Levenberg Marquardt, error

Topic: Computer Science

Link: https://ifory.id/abstract-plain/8EYKDW9nB7zP

Corresponding Author: Hindayati Mustafidah