Please use this identifier to cite or link to this item:
http://hdl.handle.net/11375/25992
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Lyons, Jim | - |
dc.contributor.author | Brillantes, Jacqueline | - |
dc.date.accessioned | 2020-10-23T18:01:45Z | - |
dc.date.available | 2020-10-23T18:01:45Z | - |
dc.date.issued | 2020 | - |
dc.identifier.uri | http://hdl.handle.net/11375/25992 | - |
dc.description.abstract | It has been suggested that both visual (e.g., Ward et al., 2002) and auditory (e.g., Sinnett & Kingstone, 2010; Cañal-Bruland, 2018) stimuli can influence anticipatory judgements. The following studies address the nature of the perceptuomotor mechanisms underlying these judgements by contrasting two well-established theoretical frameworks that differentially explain the linkage of perception and action (i.e., a perception-action dissociation model, Milner & Goodale, 2006; a planning-control model, Glover & Dixon, 2001). Specifically, this thesis explores how, when two sources of sensory information (audition and vision) are put into conflict, motor actions may mediate simple perceptual judgements. With reference to the theoretical models noted above, perception-action dissociation theory would predict that incompatible sensory information would have no influence on action. Conversely, the planning-control model would predict incompatible sensory information to significantly influence action. In the three studies comprising this thesis, participants completed two tasks under two distinct response conditions: perception only, and perception combined with a goal-directed aiming action. In Studies 1 and 3, participants (n=16) predicted when a visual stimulus travelling at either a fast (224 mm/s) or slow (113 mm/s) velocity in a straight-line trajectory would enter a specified target zone. In Study 1, this was accomplished by simply pushing a button, whereas in Study 3, participants moved a stylus controlling an on-screen cursor to the predicted interception point. In both experiments, a loud (70 dB), soft (50 dB), or no (0 dB) burst of white noise was presented for 150 ms at the initiation of visual stimulus movement. On each trial, the stimulus would disappear after either 33% or 66% of the distance travelled. Results of Study 1 suggest that when the loud sound accompanied the 33% vision stimulus, overshoot bias was significantly reduced.
Conversely, when the loud sound accompanied the 66% vision stimulus, overshoot bias increased (and vice versa), suggesting that in situations where visual information is less reliable, robust auditory information may serve as a useful substitute. When, however, more vision is available, that same robust auditory information may interfere with perceptual judgements. In Study 2, no differences were observed in participants' accuracy and directional bias when the auditory stimulus originated from a different hemifield than the origin of the visual stimulus. Lastly, in Study 3, participants were less accurate overall in predicting the point of interception. In addition, results from the kinematic markers associated with participants moving the stylus (e.g., movement time, peak velocity and acceleration, etc.) suggest that differing combinations of auditory and visual stimulus information affect how motor actions are generated and executed in these types of multisensory anticipatory interception tasks. | en_US |
dc.language.iso | en | en_US |
dc.title | Exploring Perception and Action Response to Multisensory Incompatibility Effects | en_US |
dc.type | Thesis | en_US |
dc.contributor.department | Kinesiology | en_US |
dc.description.degreetype | Thesis | en_US |
dc.description.degree | Master of Science in Kinesiology | en_US |
Appears in Collections: | Open Access Dissertations and Theses |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Brillantes_Jacqueline_FS_202010_MSc.pdf | | 1.34 MB | Adobe PDF | View/Open |
Items in MacSphere are protected by copyright, with all rights reserved, unless otherwise indicated.