Please use this identifier to cite or link to this item: http://hdl.handle.net/11375/7088
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | Crowe, Cameron M. | en_US
dc.contributor.author | Tong, Hongwei | en_US
dc.date.accessioned | 2014-06-18T16:38:01Z | -
dc.date.available | 2014-06-18T16:38:01Z | -
dc.date.created | 2010-06-29 | en_US
dc.date.issued | 1995-08 | en_US
dc.identifier.other | opendissertations/2383 | en_US
dc.identifier.other | 3347 | en_US
dc.identifier.other | 1375233 | en_US
dc.identifier.uri | http://hdl.handle.net/11375/7088 | -
dc.description.abstract | Measurements such as flow rates from a chemical process are inherently inaccurate. They are contaminated by random errors and possibly by gross errors such as process disturbances, leaks, departure from steady state, and biased instrumentation. These measurements violate conservation laws and other process constraints. The goal of data reconciliation is to resolve the contradictions between the measurements and their constraints, and to process contaminated data into consistent information. Data reconciliation aims at estimating the true values of measured variables, detecting gross errors, and solving for unmeasured variables. This thesis presents a modification of a model of bilinear data reconciliation that can handle any measurement covariance structure, followed by the construction of principal component tests that are sharper in detecting gross errors and have substantially greater power in correctly identifying them than the statistical tests currently used in data reconciliation. Sequential analysis is combined with principal component analysis to provide a procedure for detecting persistent gross errors. The concept of zero accumulation is used to determine the applicability of the established linear/bilinear data reconciliation model and algorithms. A two-stage algorithm is presented to detect zero accumulation in the presence of gross errors. An interesting finding is that the univariate and maximum power tests can be quite poor at detecting gross errors and can lead to confounding in their identification. | en_US
dc.subject | Chemical Engineering | en_US
dc.title | Studies in Data Reconciliation Using Principal Component Analysis | en_US
dc.type | thesis | en_US
dc.contributor.department | Chemical Engineering | en_US
dc.description.degree | Doctor of Philosophy (PhD) | en_US
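
The abstract above centres on two ideas that lend themselves to a short numerical illustration: weighted least-squares reconciliation of flow measurements against linear conservation constraints, and a principal component test on the constraint residuals for detecting gross errors. The sketch below is a minimal illustration of those ideas, not code from the thesis; the flow network, measurement covariance, and simulated bias are assumptions chosen purely for the example.

import numpy as np
from scipy import stats

# Illustrative flow network (not from the thesis): four measured streams
# subject to two mass balances, A @ x_true = 0:
#   x1 - x2 - x3 = 0   and   x3 - x4 = 0
A = np.array([[1.0, -1.0, -1.0,  0.0],
              [0.0,  0.0,  1.0, -1.0]])

# Assumed measurement covariance (need not be diagonal).
Sigma = np.diag([0.5, 0.4, 0.3, 0.3]) ** 2

# Simulated measurements: true values plus random error plus a gross
# error (a +3 bias) on stream 3.
x_true = np.array([100.0, 60.0, 40.0, 40.0])
rng = np.random.default_rng(0)
y = x_true + rng.normal(scale=np.sqrt(np.diag(Sigma))) + np.array([0.0, 0.0, 3.0, 0.0])

# Weighted least-squares reconciliation: adjust the measurements so the
# balances hold exactly, x_hat = y - Sigma A^T (A Sigma A^T)^-1 A y.
r = A @ y                                  # constraint residuals
V = A @ Sigma @ A.T                        # covariance of the residuals
x_hat = y - Sigma @ A.T @ np.linalg.solve(V, r)

# Global chi-square test on the constraint residuals.
gamma = r @ np.linalg.solve(V, r)
print("global test:", gamma, "critical value:", stats.chi2.ppf(0.95, df=A.shape[0]))

# Principal component test: rotate the residuals into uncorrelated,
# unit-variance components; with no gross errors each p_i ~ N(0, 1),
# so an unusually large |p_i| points to a specific error direction.
eigval, W = np.linalg.eigh(V)
p = np.diag(1.0 / np.sqrt(eigval)) @ W.T @ r
print("principal components:", p)
print("flagged components:", np.abs(p) > stats.norm.ppf(0.975))

Under the no-gross-error hypothesis each principal component is approximately standard normal, so the simulated bias on stream 3 typically produces a component well beyond the 95% bound, while the reconciled estimates x_hat still satisfy the balances A @ x_hat = 0 exactly.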
Appears in Collections: Open Access Dissertations and Theses

Files in This Item:
File | Size | Format
fulltext.pdf (Open Access) | 3.22 MB | Adobe PDF


Items in MacSphere are protected by copyright, with all rights reserved, unless otherwise indicated.
