|Title:||Regularized Radial Basis Function Networks: Theory and Applications to Probability Estimation, Classification, and Time Series Prediction|
|Authors:||Yee, Van Paul|
|Department:||Electrical and Computer Engineering|
|Keywords:||Electrical and Computer Engineering|
|Abstract:||<p>In this thesis, we study both theoretical and practical aspects of the regularized strict interpolation radial basis function network (SIRBFN) estimate, or neural network. From a theoretical perspective, we show that the regularized SIRBFN can be globally mean-square (m.s.) consistent whenever the Nadaraya-Watson regression estimate is and the regularization parameter sequence for the SIRBFN is chosen to be asymptotically optimal in the mean-squared fitting error. Hence we prove the Bayes risk consistency of the approximate Bayes decision rules formed from (m.s.-consistent) regularized SIRBFN posterior probability estimates. Similarly, we prove the m.s. consistency of the regularized SIRBFN predictor for the class of Markovian nonlinear autoregressive time series generated by an i.i.d. noise process. In a one-step-ahead prediction experiment with a phonetically balanced suite of male and female speech waveforms, the proposed predictor offers an average 2.2 dB improvement in prediction SNR over corresponding exponentially weighted RLS predictors. We also show that linearly combining an ensemble of three such proposed predictors via RLS filtering can yield an average 4.2 dB improvement over the standard RLS predictors, and we develop recursive algorithms to update the proposed predictor on-line with reduced computational complexity in certain situations. Two emerging application areas are then considered. The first is the regression-based approach to nonlinear filtering or state estimation, where the proposed network provides performance comparable to a recurrent MLP-based solution. The second is the dynamic reconstruction of chaotic systems from noisy observational data, where the reconstructed system is shown to generate sequences whose estimated long- and short-term dynamical invariants agree closely with those of the original, noise-free system.
Taken together, these theoretical and practical results point to the regularized SIRBFN as a principled design choice for RBF neural networks.</p>|
|Appears in Collections:||Open Access Dissertations and Theses|
Items in MacSphere are protected by copyright, with all rights reserved, unless otherwise indicated.