Title: Anatomical Refinement in the Projection from the Anteroventral Cochlear Nucleus to the Lateral Superior Olive
Abstract: In mammals, the basic computations required for azimuthal sound localization are performed by a group of auditory brainstem nuclei known as the superior olivary complex (SOC). The lateral superior olive (LSO), within the SOC, aids in sound localization by computing intensity differences between sounds arriving at the two ears. It does this by comparing excitatory input from the ipsilateral anteroventral cochlear nucleus (AVCN) with inhibitory input from the ipsilateral medial nucleus of the trapezoid body (MNTB), which is driven by the contralateral AVCN. For sounds to be accurately localized, the AVCN-LSO and MNTB-LSO projections must be aligned with each other in a frequency-dependent manner. Rough alignment occurs over the course of development, but a significant amount of circuit refinement is required to achieve adult-like precision. Two types of refinement occur in these pathways: 1) physiological, or functional, refinement; and 2) anatomical refinement. Little is known about the latter in the AVCN-LSO pathway. To study this, I conducted a variety of experiments aimed at anterogradely labeling a small number of cells projecting from the AVCN to the LSO in juvenile rats. I experimented with several approaches in order to develop the technique of ex vivo, sparse axon labeling in this area of the brain. I present the optimal technique, developed after testing various tracers, application methods, and incubation times, among other variables. This optimized technique can now be used in a future experiment to uncover and describe anatomical refinement in the AVCN-LSO pathway of the auditory brainstem.
Appears in Collections: Open Access Dissertations and Theses
Files in This Item:
Molot-Toker_Samuel_A_2015September_MSc.pdf (10.89 MB, Adobe PDF)