Use of backpropagation and differential evolution algorithms to training MLPs

Luiz Carlos Camargo, Hegler Correa Tissot, Aurora Trinidad Ramirez Pozo

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review

1 Citation (Scopus)

Abstract

Artificial Neural Networks (ANNs) are often trained to find a general solution to problems where a pattern needs to be extracted, such as data classification. The feedforward neural network (FFNN) is one of the ANN architectures, and the multilayer perceptron (MLP) is a type of FFNN. Based on gradient descent, backpropagation (BP) is one of the most widely used algorithms for MLP training. Evolutionary algorithms can also be used to train MLPs, including the Differential Evolution (DE) algorithm. In this paper, BP and DE are used to train MLPs and are compared across four different approaches: (a) backpropagation, (b) DE with fixed parameter values, (c) DE with adaptive parameter values, and (d) a hybrid alternative combining the DE and BP algorithms.
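Approach (b) can be sketched as classic DE/rand/1/bin operating on the MLP's flattened weight vector. The network size, hyperparameters (F, CR, population size), and the XOR toy task below are illustrative assumptions, not the authors' exact experimental setup:

```python
# Sketch of DE with fixed parameters training a small MLP.
# All sizes and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Tiny MLP: 2 inputs -> 4 hidden (tanh) -> 1 output (sigmoid)
N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # weights + biases

def forward(w, X):
    """Run the MLP with parameters unpacked from the flat vector w."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def mse(w, X, y):
    return float(np.mean((forward(w, X).ravel() - y) ** 2))

def de_train(X, y, pop_size=30, F=0.8, CR=0.9, generations=300):
    """DE/rand/1/bin over the MLP weight vector with fixed F and CR."""
    pop = rng.uniform(-1, 1, (pop_size, DIM))
    fit = np.array([mse(ind, X, y) for ind in pop])
    for _ in range(generations):
        for i in range(pop_size):
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idx, 3, replace=False)]
            mutant = a + F * (b - c)               # mutation
            cross = rng.random(DIM) < CR           # binomial crossover
            cross[rng.integers(DIM)] = True        # force at least one gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = mse(trial, X, y)
            if f_trial <= fit[i]:                  # greedy selection
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]

# XOR as a toy classification task
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
w_best, err = de_train(X, y)
```

The adaptive variant (c) would replace the fixed F and CR with values updated during the run, and the hybrid (d) would refine the DE result with gradient-based BP steps.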

Original language: English
Title of host publication: Proceedings - 31st International Conference of the Chilean Computer Science Society, SCCC 2012
Place of Publication: Piscataway, NJ
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 78-86
Number of pages: 9
ISBN (Print): 9781479929375
DOIs
Publication status: Published - 2013
Externally published: Yes
Event: 31st International Conference of the Chilean Computer Science Society, SCCC 2012 - Valparaiso, Chile
Duration: 12 Nov 2012 - 16 Nov 2012

Other

Other: 31st International Conference of the Chilean Computer Science Society, SCCC 2012
Country/Territory: Chile
City: Valparaiso
Period: 12/11/12 - 16/11/12

Keywords

  • Artificial Neural Network
  • Backpropagation (BP) algorithm
  • Differential evolution (DE) algorithm
  • Multilayer perceptron
