Results on weight configurations that are not local minima in feed-forward neural networks

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review

Abstract

Local minima in the error surfaces of feed-forward neural networks are significant because they may entrap gradient-based training algorithms. Recent results have identified conditions under which local minima do not occur. The present paper considers three distinct definitions of local minimum, concluding that a new definition, called regional minimum, corresponds most closely to intuition. Using this definition, we analyse weight configurations in which a hidden node is ignored or redundant and show that these are not local minima. The practical implications of this result for gradient-based learning are discussed.
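The abstract's notion of a redundant hidden node can be made concrete with a small numerical sketch. The following Python snippet is an illustration under assumed details, not the paper's construction: the 2-2-1 tanh network, the random data, and the helper name error are all invented here. It duplicates a hidden unit and checks that the sum-of-squares error is unchanged as the two outgoing weights are redistributed along a line, which is why whether such a flat configuration counts as a local minimum hinges on the definition used.

```python
import numpy as np

# Minimal sketch (assumed setup, not the paper's): a 2-2-1 tanh network
# whose two hidden units carry identical incoming weights, so one hidden
# unit is redundant. Redistributing the two outgoing weights while keeping
# their sum fixed leaves the network function, and hence the error,
# unchanged: the configuration lies on a flat line in weight space.

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))      # inputs
t = np.sin(X[:, 0])                # arbitrary targets

W = rng.normal(size=(2, 2))        # hidden-layer weights, one row per unit
W[1] = W[0]                        # duplicate hidden unit -> redundancy

def error(v):
    """Sum-of-squares error as a function of the outgoing weights v."""
    h = np.tanh(X @ W.T)           # hidden activations (identical columns)
    y = h @ v                      # network output
    return 0.5 * np.sum((y - t) ** 2)

v_sum = 1.5                        # fixed total outgoing weight
for a in [0.0, 0.3, 0.75, 1.5]:
    v = np.array([a, v_sum - a])   # redistribute the outgoing weights
    print(f"v = {v}, error = {error(v):.6f}")

# All printed errors coincide: the error is constant along this line, so
# the configuration is not a strict (isolated) minimum, and whether it is
# a "local minimum" at all depends on the definition adopted.
```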
Original language: English
Title of host publication: Proceedings of the Seventh Australian Conference on Neural Networks: ACNN'96, Canberra, 10-12 April 1996
Editors: P. Bartlett, A. Burkitt, R. Williamson
Place of publication: Canberra
Publisher: ANU
Pages: 173-178
Number of pages: 6
ISBN (Print): 0731524292
Publication status: Published - 1996
Event: Australian Conference on Neural Networks (7th: 1996), Canberra, Australia
Duration: 10 Apr 1996 - 12 Apr 1996

Conference

Conference: Australian Conference on Neural Networks (7th: 1996)
Country/Territory: Australia
City: Canberra
Period: 10/04/96 - 12/04/96
