In this paper, we present a novel, effective, and reliable training technique for feed-forward neural networks (FFNNs). We call it cross-validated validation training (CVVT) since it combines statistical cross-validation with the validation training technique used in FFNNs. CVVT improves the generalisation estimate obtained from validation training, enabling reliable comparison and selection of network architectures. Because it employs validation training, CVVT also preserves the generalisation performance of FFNNs with excess weights. These benefits are demonstrated through statistical analysis of real-life results from a bake inspection system. Contrary to previous work, we found that significant excess weights may actually deteriorate the generalisation-preserving ability of validation training.
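The procedure the abstract describes can be sketched as follows: k-fold cross-validation in which each fold's network is trained with early stopping on an inner validation split, and the averaged held-out-fold error serves as the generalisation estimate for a candidate architecture. This is a minimal illustrative sketch, assuming a toy one-hidden-layer regression network and hypothetical function names (`train_with_validation`, `cvvt_estimate`); it is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_with_validation(Xtr, ytr, Xval, yval, hidden=8, lr=0.05,
                          max_epochs=500, patience=20):
    """Train a one-hidden-layer FFNN by gradient descent, keeping the
    weights that minimise validation error (validation training)."""
    W1 = rng.normal(0, 0.5, (Xtr.shape[1], hidden))
    W2 = rng.normal(0, 0.5, (hidden, 1))
    best = (np.inf, W1.copy(), W2.copy())
    wait = 0
    for _ in range(max_epochs):
        H = np.tanh(Xtr @ W1)            # forward pass
        err = H @ W2 - ytr
        gW2 = H.T @ err / len(Xtr)       # backprop for squared error
        gH = err @ W2.T * (1 - H**2)     # tanh derivative
        gW1 = Xtr.T @ gH / len(Xtr)
        W1 -= lr * gW1
        W2 -= lr * gW2
        val_mse = np.mean((np.tanh(Xval @ W1) @ W2 - yval) ** 2)
        if val_mse < best[0]:
            best, wait = (val_mse, W1.copy(), W2.copy()), 0
        else:
            wait += 1
            if wait >= patience:         # stop once validation error stalls
                break
    return best[1], best[2]

def cvvt_estimate(X, y, k=5):
    """Average held-out-fold error over k folds: a CVVT-style
    generalisation estimate for one architecture."""
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, k)
    errs = []
    for i in range(k):
        test = folds[i]
        rest = np.concatenate(folds[:i] + folds[i + 1:])
        cut = int(0.8 * len(rest))       # inner train/validation split
        W1, W2 = train_with_validation(X[rest[:cut]], y[rest[:cut]],
                                       X[rest[cut:]], y[rest[cut:]])
        errs.append(np.mean((np.tanh(X[test] @ W1) @ W2 - y[test]) ** 2))
    return float(np.mean(errs))

# Synthetic regression task, purely for illustration.
X = rng.normal(size=(200, 3))
y = np.sin(X[:, :1]) + 0.1 * rng.normal(size=(200, 1))
print(round(cvvt_estimate(X, y), 4))
```

Comparing these averaged fold errors across candidate architectures is what makes architecture selection reliable, since each estimate is based on data never seen during either training or early stopping.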
Title of host publication: Proceedings of the 2nd IEEE International Conference on Intelligent Processing Systems
Editors: Z. Liu, B. Verma
Place of publication: Gold Coast
Publisher: Griffith University, Australia
Number of pages: 5
ISBN (Print): 064633229 5
Publication status: Published - 1998
Event: IEEE International Conference on Intelligent Processing Systems (2nd : 1998) - Gold Coast, Qld, Australia
Duration: 4 Aug 1998 → 7 Aug 1998
Yeh, J., Hamey, L., & Westcott, T. (1998). Developing FFNN applications using cross-validated validation training. In Z. Liu, & B. Verma (Eds.), Proceedings of the 2nd IEEE International Conference on Intelligent Processing Systems (pp. 565-569). Gold Coast: Griffith University, Australia.