**New Addition (Feb 20, 2024): UR_AN_IC_Model_2024a**

A wrapper that generates population responses using the Zilany et al. (2014) and Bruce et al. (2018) AN models, along with code for IC models (from Mao et al., 2013, and Carney & McDonough, 2019).

**Previous code (still available)**

This project holds MATLAB source code and executable versions of UR_EAR_2022a, a GUI interface developed at the University of Rochester (UR), Rochester, NY. The goal of the code is to provide visualizations of population responses of auditory-nerve (AN) and inferior colliculus (IC) model neurons. (EAR = Envisioning Auditory Responses!)

A cloud-based version of the UR_EAR GUI is available as a web app at **https://urhear.urmc.rochester.edu**.

Four downloadable versions are also included here. All versions use the same web-app layout, implemented with MATLAB's Web App Designer, so the downloadable versions look identical to the cloud-based version. The four downloadable versions are:

1. MATLAB source code that can be downloaded and run, along with a manual that provides guidance for users who want to add their own stimuli, etc. This version requires access to MATLAB. Compiled versions of the C code (as well as the source) are included for MATLAB running under Windows, macOS, and GNU/Linux.
2. A "standalone" executable for Windows that can be run without owning MATLAB. Installing it requires downloading the free MATLAB Runtime package from The MathWorks.
3. A "standalone" executable for macOS that can be run without owning MATLAB. Installing it likewise requires the free MATLAB Runtime package.
4. A "standalone" executable for GNU/Linux that can be run without owning MATLAB. Installing it likewise requires the free MATLAB Runtime package.
----------

**The auditory-nerve model options in the GUI include:**

- Zilany, M.S.A., Bruce, I.C., and Carney, L.H. (2014). Updated parameters and expanded simulation options for a model of the auditory periphery. J. Acoust. Soc. Am. 135:283-286. PMCID: PMC3985897
- Bruce, I.C., Erfani, Y., and Zilany, M.S. (2018). A phenomenological model of the synapse between the inner hair cell and auditory nerve: Implications of limited neurotransmitter release sites. Hearing Research, 360:40-54.

**IC models are from:**

- Nelson, P.C., and Carney, L.H. (2004). A phenomenological model of peripheral and central neural responses to amplitude-modulated tones. J. Acoust. Soc. Am. 116:2173-2186. PMCID: PMC1379629
- Carney, L.H., Li, T., and McDonough, J.M. (2015). Speech coding in the brain: Representation of formants by midbrain neurons tuned to sound fluctuations. eNeuro 2(4):1-12. DOI: 10.1523/ENEURO.0004-15.2015. PMCID: PMC4596011
- Carney, L.H., and McDonough, J.M. (2019). Nonlinear auditory models yield new insights into representations of vowels. Atten. Percept. Psychophys. 81(4):1034-1046. DOI: 10.3758/s13414-018-01644-w. PMID: 30565098; PMCID: PMC6581637
- Mao, J., Vosoughi, A., and Carney, L.H. (2013). Predictions of diotic tone-in-noise detection based on a nonlinear optimal combination of energy, envelope, and fine-structure cues. J. Acoust. Soc. Am. 134:396-406. PMCID: PMC3724726

Please cite the appropriate papers if you publish any modeling results using this code.

This work has been supported over the years by NIH grants DC001641 and DC010813. The cloud-computing version, in particular, was supported by a supplement to DC001641.
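To give a flavor of the IC modeling approach cited above: the Nelson & Carney (2004) model produces band-pass tuning to amplitude modulation by combining a fast excitatory input with a delayed, slower inhibitory input (same-frequency inhibition-excitation). The sketch below is a minimal, illustrative NumPy reduction of that idea, not the project's actual MATLAB code; the function name, time constants, delay, and inhibition strength are all illustrative placeholders.

```python
import numpy as np

def alpha_kernel(t, tau):
    """Alpha-function synaptic kernel, (t/tau^2) * exp(-t/tau); unit area as t -> inf."""
    return (t / tau**2) * np.exp(-t / tau)

def sfie_ic_response(an_rate, fs, tau_ex=0.5e-3, tau_inh=2e-3,
                     delay_inh=1e-3, s_inh=0.9):
    """Illustrative SFIE-style IC cell: excitation minus delayed, scaled inhibition.

    an_rate : AN-model instantaneous rate (spikes/s), 1-D array
    fs      : sampling rate of an_rate (Hz)
    """
    # 50-ms kernel support is enough to capture both alpha functions here.
    t = np.arange(0.0, 0.05, 1.0 / fs)
    ex = np.convolve(an_rate, alpha_kernel(t, tau_ex)) / fs
    inh = np.convolve(an_rate, alpha_kernel(t, tau_inh)) / fs
    # Delay the inhibitory input by shifting it in time.
    d = int(round(delay_inh * fs))
    inh_delayed = np.concatenate([np.zeros(d), inh[:len(inh) - d]])
    out = ex[:len(an_rate)] - s_inh * inh_delayed[:len(an_rate)]
    return np.maximum(out, 0.0)  # half-wave rectify: rates are non-negative
```

Because the inhibition is slower and delayed, slow envelope fluctuations are cancelled while fluctuations near the cell's best modulation frequency pass through, yielding the band-pass modulation transfer functions described in the papers above.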