Please use this identifier to cite or link to this item: http://hdl.handle.net/11375/15360
Title: Training of Neural Networks Using the Smooth Variable Structure Filter with Application to Fault Detection
Authors: Ahmed, Ryan
Advisor: Habibi, S.
Department: Mechanical Engineering
Keywords: Neural Networks; Smooth Variable Structure Filter
Publication Date: Apr-2011
Abstract: Artificial neural networks (ANNs) are an information-processing paradigm inspired by the human brain. ANNs have been used in numerous applications to provide complex nonlinear input-output mappings, and they have the ability to adapt and learn from observed data. The training of neural networks is an important area of research. Training techniques have to provide high accuracy and fast convergence while avoiding premature convergence to local minima. In this thesis, a novel training method is proposed. This method is based on the relatively new Smooth Variable Structure Filter (SVSF) and is formulated for feedforward multilayer perceptron training. The SVSF is a state and parameter estimation method that is based on the sliding mode concept and works in a predictor-corrector fashion. The SVSF applies a discontinuous corrective term to estimate states and parameters. Its advantages include guaranteed stability, robustness, and a fast speed of convergence. The proposed training technique is applied to three real-world benchmark problems and to a fault detection application in a Ford diesel engine. The SVSF-based training technique shows excellent generalization capability and a fast speed of convergence.
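To make the abstract's description of the filter concrete, the following is a minimal, hedged sketch of how an SVSF-style predictor-corrector update could be applied to the weights of a small feedforward network. The network architecture, the toy data, the numerical Jacobian, and the pseudo-inverse mapping from output space to weight space are illustrative assumptions, not the formulation from the thesis, which develops the multilayer perceptron training method in full.

# Illustrative sketch only: weights are treated as states with a stationary
# (random-walk) model, and a smoothed corrective gain built from the a priori
# and a posteriori output errors is mapped into weight space via the Jacobian
# pseudo-inverse. The gain structure follows the general SVSF idea; the exact
# equations and tuning in the thesis may differ.
import numpy as np

rng = np.random.default_rng(0)

def mlp(w, x, n_hidden=5):
    """One-hidden-layer MLP with tanh activation; w is a flat parameter vector."""
    n_in = x.shape[0]
    W1 = w[:n_hidden * n_in].reshape(n_hidden, n_in)
    b1 = w[n_hidden * n_in:n_hidden * n_in + n_hidden]
    W2 = w[n_hidden * n_in + n_hidden:-1].reshape(1, n_hidden)
    b2 = w[-1]
    return (W2 @ np.tanh(W1 @ x + b1) + b2).ravel()

def jacobian(w, x, eps=1e-6):
    """Numerical Jacobian of the network output with respect to the weights."""
    y0 = mlp(w, x)
    J = np.zeros((y0.size, w.size))
    for i in range(w.size):
        wp = w.copy()
        wp[i] += eps
        J[:, i] = (mlp(wp, x) - y0) / eps
    return J

def svsf_train(X, Y, n_weights, psi=0.5, gamma=0.1, epochs=20):
    """Per-sample SVSF-style predictor-corrector update of the weight vector."""
    w = 0.1 * rng.standard_normal(n_weights)
    e_post = np.zeros(1)
    for _ in range(epochs):
        for x, y in zip(X, Y):
            e_prior = y - mlp(w, x)                       # a priori output error
            # corrective gain with saturation inside a smoothing boundary layer psi
            k = (np.abs(e_prior) + gamma * np.abs(e_post)) * np.clip(e_prior / psi, -1.0, 1.0)
            H = jacobian(w, x)                            # output/weight sensitivity
            w = w + np.linalg.pinv(H) @ k                 # corrector step in weight space
            e_post = y - mlp(w, x)                        # a posteriori output error
    return w

# Toy usage: fit y = sin(x) on a handful of samples.
X = [np.array([v]) for v in np.linspace(-2, 2, 40)]
Y = [np.sin(x) for x in X]
n_in, n_hidden = 1, 5
n_weights = n_hidden * n_in + n_hidden + n_hidden + 1
w = svsf_train(X, Y, n_weights)
print("final MSE:", np.mean([(mlp(w, x) - y) ** 2 for x, y in zip(X, Y)]))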
URI: http://hdl.handle.net/11375/15360
Appears in Collections:Open Access Dissertations and Theses

Files in This Item:
File: Ahmed Ryan.pdf
Description: Open Access
Size: 11.18 MB
Format: Adobe PDF


Items in MacSphere are protected by copyright, with all rights reserved, unless otherwise indicated.
