Enhancement of information transmission efficiency by synaptic failures

Mark S Goldman

Research output: Contribution to journal › Article

38 Scopus citations

Abstract

Many synapses have a high percentage of synaptic transmission failures. I consider the hypothesis that synaptic failures can increase the efficiency of information transmission across the synapse. I use the information transmitted per vesicle release about the presynaptic spike train as a measure of synaptic transmission efficiency and show that this measure can increase with the synaptic failure probability. I analytically calculate the Shannon mutual information transmitted across two model synapses with probabilistic transmission: one with a constant probability of vesicle release and one with vesicle release probabilities governed by the dynamics of synaptic depression. For inputs generated by a non-Poisson process with positive autocorrelations, both synapses can transmit more information per vesicle release than a synapse with perfect transmission, although the information increases are greater for the depressing synapse than for a constant-probability synapse with the same average transmission probability. The enhanced performance of the depressing synapse over the constant-release-probability synapse primarily reflects a decrease in noise entropy rather than an increase in the total transmission entropy. This indicates a limitation of analysis methods, such as decorrelation, that consider only the total response entropy. My results suggest that synaptic transmission failures governed by appropriately tuned synaptic dynamics can increase the information-carrying efficiency of a synapse.
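The abstract's central quantity is the Shannon mutual information between the presynaptic spike train and the vesicle-release train, normalized by the release rate, and decomposed into total response entropy minus noise entropy. As a minimal sketch, and not the paper's calculation, the snippet below computes that decomposition for a single time bin with an assumed spike probability r_spike and a constant release probability p_release. The parameter names and the single-bin simplification are illustrative assumptions; a memoryless bin cannot reproduce the positively autocorrelated inputs for which the paper shows failures increasing efficiency, so the numbers here only illustrate the entropy bookkeeping and the per-release normalization.

```python
import numpy as np

def entropy(probs):
    """Shannon entropy (bits) of a discrete distribution."""
    probs = np.asarray(probs, dtype=float)
    probs = probs[probs > 0]
    return -np.sum(probs * np.log2(probs))

def info_per_release(r_spike, p_release):
    """Toy single-bin model (an assumption, not the paper's model):
    X = 1 (presynaptic spike) with probability r_spike;
    Y = 1 (vesicle release) with probability p_release if X = 1, else Y = 0.
    Returns mutual information I(X; Y) per release, in bits."""
    p_y1 = r_spike * p_release                       # marginal release probability
    total_entropy = entropy([p_y1, 1 - p_y1])        # H(Y): total response entropy
    # Noise entropy H(Y|X): Y is deterministic (no release) when X = 0
    noise_entropy = r_spike * entropy([p_release, 1 - p_release])
    mutual_info = total_entropy - noise_entropy      # I(X; Y) = H(Y) - H(Y|X)
    return mutual_info / p_y1                        # bits per vesicle release

if __name__ == "__main__":
    # Sweep the release probability at a fixed per-bin spike probability
    for p in [1.0, 0.5, 0.25, 0.1]:
        bits = info_per_release(r_spike=0.2, p_release=p)
        print(f"release prob {p:4.2f}: {bits:.3f} bits per release")
```

In this independent-bin toy the bits per release do not rise as p_release falls, which is consistent with the abstract: the efficiency gain from failures requires temporally correlated (non-Poisson) inputs, and for the largest gains, release probabilities governed by synaptic-depression dynamics rather than a constant probability.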

Original language: English (US)
Pages (from-to): 1137-1162
Number of pages: 26
Journal: Neural Computation
Volume: 16
Issue number: 6
DOI:
State: Published - Jun 2004
Externally published: Yes

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Artificial Intelligence
  • Neuroscience (all)