Please use this identifier to cite or link to this item: http://hdl.handle.net/11375/19051
Full metadata record

DC Field                    Value                                               Language
dc.contributor.advisor      Watter, Scott                                       -
dc.contributor.author       Jansen, Peter                                       -
dc.date.accessioned         2016-04-06T19:55:18Z                                -
dc.date.available           2016-04-06T19:55:18Z                                -
dc.date.issued              2010-10                                             -
dc.identifier.uri           http://hdl.handle.net/11375/19051                   -
dc.description.abstract     (full abstract below)                               en_US
dc.language.iso             en_US                                               en_US
dc.subject                  self-organizing, computational, neural, network,    en_US
                            architecture, applications, linguistic, grammar
dc.title                    A Self-Organizing Computational Neural Network      en_US
                            Architecture with Applications to Sensorimotor
                            Grounded Linguistic Grammar Acquisition
dc.type                     Thesis                                              en_US
dc.contributor.department   Psychology                                          en_US
dc.description.degreetype   Thesis                                              en_US
dc.description.degree       Doctor of Philosophy (PhD)                          en_US

Abstract:

Connectionist models of language acquisition typically have difficulty with systematicity, or the ability for the network to generalize its limited experience with language to novel utterances. In this way, connectionist systems learning grammar from a set of example sentences tend to store a set of specific instances, rather than a generalized abstract knowledge of the process of grammatical combination. Further, recent models that do show limited systematicity do so at the expense of simultaneously storing explicit lexical knowledge, and also make use of both developmentally-implausible training data and biologically-implausible learning rules. Consequently, this research program develops a novel unsupervised neural network architecture, and applies this architecture to the problem of systematicity in language models.

In the first of several studies, a connectionist architecture capable of simultaneously storing explicit and separate representations of both conceptual and grammatical information is developed, where this architecture is a hybrid of both a self-organizing map and an intra-layer Hebbian associative network. Over the course of several studies, this architecture's capacity to acquire linguistic grammar is evaluated, where the architecture is progressively refined until it is capable of acquiring a benchmark grammar consisting of several difficult clausal sentence structures - though it must acquire this grammar at the level of grammatical category, rather than the lexical level.

The final study bridges the gap between the lexical and grammatical category levels, and develops an activation function based on a semantic feature co-occurrence metric. In concert with developmentally-plausible sensorimotor grounded conceptual representations, it is shown that a network using this metric is able to undertake a process of semantic bootstrapping, and successfully acquire separate explicit representations at the level of the concept, part-of-speech category, and grammatical sequence. This network demonstrates broadly systematic behaviour on a difficult test of systematicity, and extends its knowledge of grammar to novel sensorimotor-grounded words.
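The abstract above names two technical ideas: a hybrid of a self-organizing map (SOM) with an intra-layer Hebbian associative network, and an activation function based on a semantic feature co-occurrence metric. Below is a minimal, hypothetical Python sketch of how such a combination might look; the class name, map size, learning rates, and the particular co-occurrence measure are illustrative assumptions, not the implementation described in the thesis.

import numpy as np

class HybridSOM:
    """Hypothetical sketch: SOM units linked by an intra-layer Hebbian matrix."""

    def __init__(self, grid=10, dim=20, lr=0.1, sigma=2.0, hebb_lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.grid = grid                                   # units form a grid x grid map
        self.w = rng.random((grid * grid, dim))            # feed-forward SOM weights
        self.hebb = np.zeros((grid * grid, grid * grid))   # intra-layer associative weights
        self.lr, self.sigma, self.hebb_lr = lr, sigma, hebb_lr

    def _coords(self, idx):
        return np.array(divmod(idx, self.grid), dtype=float)

    def bmu(self, x):
        # Best-matching unit for an input (e.g. a sensorimotor feature vector).
        return int(np.argmin(np.linalg.norm(self.w - x, axis=1)))

    def train_item(self, x):
        # Standard SOM update: pull the winner and its neighbours toward x,
        # so similar inputs (e.g. words of one category) cluster on the map.
        b = self.bmu(x)
        bc = self._coords(b)
        for i in range(self.w.shape[0]):
            d = np.linalg.norm(self._coords(i) - bc)
            h = np.exp(-(d ** 2) / (2 * self.sigma ** 2))  # neighbourhood kernel
            self.w[i] += self.lr * h * (x - self.w[i])
        return b

    def train_sequence(self, xs):
        # Hebbian update between successively active units: a rough stand-in
        # for learning grammatical order over the mapped categories.
        prev = None
        for x in xs:
            cur = self.train_item(x)
            if prev is not None:
                self.hebb[prev, cur] += self.hebb_lr      # strengthen prev -> cur link
            prev = cur

    def predict_next_unit(self, x):
        # Most strongly associated successor of the unit activated by x.
        return int(np.argmax(self.hebb[self.bmu(x)]))


def cooccurrence_activation(x, w):
    # One possible reading of an activation based on semantic feature
    # co-occurrence: score each unit by the fraction of its "on" features
    # that the (binary) input also has on, rather than by Euclidean distance.
    unit_on = w > 0.5
    shared = unit_on & (x > 0.5)
    return shared.sum(axis=1) / np.maximum(unit_on.sum(axis=1), 1)

A toy run might call train_sequence([agent_vec, verb_vec, patient_vec]) over many example sentences and then predict_next_unit(agent_vec) to see which map region the learned intra-layer associations point to next; the vectors and grid size are, again, purely illustrative.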
Appears in Collections: Open Access Dissertations and Theses

Files in This Item:
File                             Description    Size      Format
Jansen_Peter_2010Oct_Ph.D..pdf   Open Access    11.35 MB  Adobe PDF


Items in MacSphere are protected by copyright, with all rights reserved, unless otherwise indicated.
