Abstract
A novel approach to generalisation is presented that can, under certain circumstances, guarantee generalisation to binary-output data for which no targets have been given. The basis of the guarantee is the recognition of a persistent global minimum error solution. An empirical test of whether the guarantee holds is provided, using a technique called target reversal. The technique employs two neural networks whose convergence under opposing targets signals the validity of the guarantee.
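The abstract only sketches the target-reversal procedure, so the following is a minimal illustrative reading rather than the paper's actual method: two copies of a small feed-forward network are trained on the labelled data plus the unlabelled input, one with that input targeted 0 and one targeted 1. The network architecture, loss, learning rate, and convergence tolerance below are all assumptions for the sketch.

```python
import numpy as np

def train_mlp(X, y, hidden=8, lr=1.0, epochs=3000, seed=0):
    """Tiny one-hidden-layer net (tanh hidden units, sigmoid output)
    trained by full-batch gradient descent on the logistic loss.
    Returns the final mean squared output error.  All hyperparameters
    are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 1.0, (X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 1.0, (hidden, 1))
    b2 = np.zeros(1)
    y = y.reshape(-1, 1)
    n = len(X)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)
        out = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))
        d_out = (out - y) / n                      # logistic-loss gradient at the output
        d_h = (d_out @ W2.T) * (1.0 - H ** 2)      # backprop through tanh layer
        W2 -= lr * (H.T @ d_out)
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h)
        b1 -= lr * d_h.sum(axis=0)
    return float(np.mean((out - y) ** 2))

def target_reversal(X_lab, y_lab, x_new, tol=0.05):
    """Hedged sketch of target reversal: train one net with the
    unlabelled input targeted 0 and a second with it targeted 1.
    If exactly one run reaches (near-)zero training error, that
    target is returned as the guaranteed output; otherwise return
    None -- the guarantee does not hold."""
    X = np.vstack([X_lab, x_new.reshape(1, -1)])
    converged = []
    for target, seed in ((0.0, 1), (1.0, 2)):
        y = np.append(y_lab, target)
        converged.append(train_mlp(X, y, seed=seed) < tol)
    if converged[0] != converged[1]:
        return 0 if converged[0] else 1
    return None
```

On very small networks and datasets both opposing targets can often be fitted, in which case the sketch (correctly, per the abstract) reports that no guarantee holds.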
| Original language | English |
|---|---|
| Pages (from-to) | 1035-1048 |
| Number of pages | 14 |
| Journal | Neural Networks |
| Volume | 14 |
| Issue number | 8 |
| DOIs | |
| Publication status | Published - Oct 2001 |
Keywords
- feed-forward neural networks
- binary outputs
- guaranteed generalisation
- global minimum errors
- target reversal
- exhaustive search