Quantitative models of visual-auditory interactions

J. Hong, T. V. Papathomas, Z. Vidnyanszky

Research output: Contribution to journal › Conference article

Abstract

We developed a neurocomputational model for the integration of visual and auditory stimuli. The model comprises three main stages. The first stage extracts motion from the visual input, with a corresponding stage for auditory processing. The second stage models visual-auditory neural interactions, simulating a neural network involving the superior colliculus. The third stage performs global integration, simulating higher cortical areas. Simulation results with this model agree closely with experimental data.
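The abstract does not give the model's equations, but the three-stage architecture it describes can be sketched in outline. The code below is a minimal illustrative pipeline, not the authors' model: the motion-energy, onset-detection, multiplicative-enhancement, and pooling rules are all assumptions chosen to make the stage structure concrete.

```python
import numpy as np

# Hedged sketch of a three-stage audio-visual integration pipeline.
# The stage structure follows the abstract; every equation, parameter,
# and function name below is an illustrative assumption.

def visual_motion_stage(frames):
    """Stage 1 (visual): crude motion energy via temporal differencing."""
    return np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))

def auditory_stage(envelope):
    """Stage 1 (auditory): onset energy from the amplitude envelope."""
    return np.maximum(np.diff(envelope), 0.0)

def collicular_interaction(v, a, gain=2.0):
    """Stage 2: superlinear enhancement when both modalities are active,
    a common simplification of multisensory superior-colliculus cells."""
    return v + a + gain * v * a

def global_integration(signal, w=None):
    """Stage 3: weighted pooling, standing in for higher cortical areas."""
    w = np.ones_like(signal) if w is None else w
    return float(np.dot(w, signal) / w.sum())

# Toy inputs: 5 video frames (4x4 pixels) and a 5-sample sound envelope,
# with a visual change and a sound onset at the same time step.
frames = np.zeros((5, 4, 4))
frames[2:] = 1.0
envelope = np.array([0.0, 0.0, 1.0, 1.0, 1.0])

v = visual_motion_stage(frames)        # length-4 visual motion signal
a = auditory_stage(envelope)           # length-4 auditory onset signal
combined = collicular_interaction(v, a)
response = global_integration(combined)
```

The multiplicative term in stage 2 produces a superadditive response when the visual and auditory signals coincide, a qualitative property often attributed to superior-colliculus neurons; the actual interaction rule used in the paper may differ.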

Original language: English
Article number: 1.2.2
Pages (from-to): 11-12
Number of pages: 2
Journal: Bioengineering, Proceedings of the Northeast Conference
Publication status: Published - Dec 12 2005
Event: Proceedings of the 2005 IEEE 31st Annual Northeast Bioengineering Conference - Hoboken, NJ, United States
Duration: Apr 2 2005 - Apr 3 2005


ASJC Scopus subject areas

  • Bioengineering
