
The application of monotonicity constraints to the back propagation neural network training algorithm

Abstract

<p>When statistical data are used in supervised training of a neural network employing the back propagation least mean square algorithm, the behavior of the classification boundary during training is often unpredictable. This research proposes applying monotonicity constraints to the back propagation learning algorithm: when the training sample set is pre-processed by a linear classification function, network performance and efficiency improve in classification applications where the feature vector is related monotonically to the pattern vector. The technique can be applied to any classification problem that possesses monotonicity properties, such as managerial pattern recognition problems.</p>
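The thesis's exact formulation is not reproduced in this record, but one common way to impose monotonicity on a back propagation network is to project the weights onto the non-negative orthant after each gradient step: with monotone activations and non-negative weights, the output is non-decreasing in every input. The sketch below illustrates that idea on a toy monotone classification pattern; the network size, learning rate, and the clipping projection are illustrative assumptions, not the method described in the abstract.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_monotone_net(X, y, hidden=4, lr=0.5, epochs=2000, seed=0):
    """Back propagation (least mean square criterion) with a
    non-negativity projection on the weights after every step.
    Biases stay unconstrained: they shift, but cannot reverse,
    the monotone response."""
    rng = np.random.default_rng(seed)
    W1 = rng.random((X.shape[1], hidden)) * 0.5   # start non-negative
    b1 = np.zeros(hidden)
    W2 = rng.random((hidden, 1)) * 0.5
    b2 = np.zeros(1)
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)                  # forward pass
        out = sigmoid(H @ W2 + b2)
        d_out = (out - y) * out * (1 - out)       # delta rule, sigmoid units
        d_hid = (d_out @ W2.T) * H * (1 - H)
        W2 -= lr * H.T @ d_out / len(X)
        b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * X.T @ d_hid / len(X)
        b1 -= lr * d_hid.mean(axis=0)
        np.clip(W1, 0.0, None, out=W1)            # monotonicity projection
        np.clip(W2, 0.0, None, out=W2)
    return lambda Z: sigmoid(sigmoid(Z @ W1 + b1) @ W2 + b2)

# Toy monotone pattern: class 1 whenever x1 + x2 > 1.
rng = np.random.default_rng(1)
X = rng.random((200, 2))
y = (X.sum(axis=1) > 1.0).astype(float).reshape(-1, 1)
f = train_monotone_net(X, y)
# Increasing either input can never lower the trained network's output.
```

Because the constraint is enforced as a projection rather than built into the error criterion, the classification boundary stays monotone at every step of training, which is the kind of predictable boundary behavior the abstract motivates.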

Description

<p>30, [7] leaves. Includes bibliographical references (leaves 24-27). Probable date of paper: June 1990, based on enumeration of series.</p>
