Developing FFNN applications using cross-validated validation training

Jeffrey Yeh, Leonard Hamey, Tas Westcott

Research output: Chapter in Book/Report/Conference proceeding - Conference proceeding contribution

Abstract

In this paper, we present a novel, effective and reliable training technique for feed-forward neural networks (FFNN). We call it cross-validated validation training (CVVT) since it combines statistical cross-validation with the validation training technique used in FFNNs. CVVT improves the generalisation estimation of validation training, enabling reliable comparison and selection of network architectures. Since it utilises validation training, CVVT also preserves the generalisation performance of FFNNs with excess weights. These benefits are demonstrated using statistical analysis of real-life results from a bake inspection system. Contrary to previous work, we found that significant excess weights may actually degrade the generalisation-preserving ability of validation training.
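The abstract's idea can be sketched in code: hold out one fold as the validation set, train with early stopping (validation training), and average the per-fold validation errors into a single generalisation estimate for the architecture. The network, data and hyper-parameters below are illustrative assumptions, not the authors' original bake-inspection setup.

```python
# Minimal sketch of cross-validated validation training (CVVT):
# k-fold cross-validation combined with early-stopping validation training.
# All architecture and hyper-parameter choices here are assumptions for
# illustration only.
import numpy as np

rng = np.random.default_rng(0)

def train_with_validation(X_tr, y_tr, X_val, y_val, hidden=4,
                          lr=0.1, epochs=500, patience=20):
    """Train a one-hidden-layer FFNN by gradient descent, stopping when
    validation error has not improved for `patience` epochs.
    Returns the best validation mean-squared error seen."""
    n_in = X_tr.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1));    b2 = np.zeros(1)
    best, wait = np.inf, 0
    for _ in range(epochs):
        # forward pass: tanh hidden layer, linear output
        H = np.tanh(X_tr @ W1 + b1)
        err = (H @ W2 + b2) - y_tr
        # backward pass (mean-squared-error gradients)
        gW2 = H.T @ err / len(X_tr); gb2 = err.mean(0)
        dH = (err @ W2.T) * (1 - H ** 2)
        gW1 = X_tr.T @ dH / len(X_tr); gb1 = dH.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
        # validation error drives the early-stopping decision
        val_out = np.tanh(X_val @ W1 + b1) @ W2 + b2
        val_mse = float(np.mean((val_out - y_val) ** 2))
        if val_mse < best:
            best, wait = val_mse, 0
        else:
            wait += 1
            if wait >= patience:
                break
    return best

def cvvt_estimate(X, y, k=5, **kw):
    """CVVT: run validation training on each of k folds and average the
    per-fold validation errors into one generalisation estimate."""
    folds = np.array_split(rng.permutation(len(X)), k)
    scores = []
    for i in range(k):
        val_idx = folds[i]
        tr_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        scores.append(train_with_validation(X[tr_idx], y[tr_idx],
                                            X[val_idx], y[val_idx], **kw))
    return float(np.mean(scores))

# Toy regression problem standing in for real inspection data.
X = rng.uniform(-1, 1, (200, 2))
y = np.sin(X @ np.array([[1.5], [-1.0]])) + rng.normal(0, 0.05, (200, 1))
estimate = cvvt_estimate(X, y, k=5)
print(f"CVVT generalisation estimate (MSE): {estimate:.4f}")
```

Because the estimate is averaged over all k folds rather than taken from a single held-out set, it is less sensitive to an unlucky split, which is what makes it usable for comparing candidate architectures.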
Original language: English
Title of host publication: Proceedings of the 2nd IEEE International Conference on Intelligent Processing Systems
Editors: Z. Liu, B. Verma
Place of publication: Gold Coast
Publisher: Griffith University, Australia
Pages: 565-569
Number of pages: 5
ISBN (Print): 0646332295
Publication status: Published - 1998
Event: IEEE International Conference on Intelligent Processing Systems (2nd : 1998) - Gold Coast, Qld, Australia
Duration: 4 Aug 1998 - 7 Aug 1998

Conference

Conference: IEEE International Conference on Intelligent Processing Systems (2nd : 1998)
Country: Australia
City: Gold Coast, Qld
Period: 4/08/98 - 7/08/98

