Enhancement of information transmission efficiency by synaptic failures

Mark S Goldman

Research output: Contribution to journal › Article

35 Citations (Scopus)

Abstract

Many synapses have a high percentage of synaptic transmission failures. I consider the hypothesis that synaptic failures can increase the efficiency of information transmission across the synapse. I use the information transmitted per vesicle release about the presynaptic spike train as a measure of synaptic transmission efficiency and show that this measure can increase with the synaptic failure probability. I analytically calculate the Shannon mutual information transmitted across two model synapses with probabilistic transmission: one with a constant probability of vesicle release and one with vesicle release probabilities governed by the dynamics of synaptic depression. For inputs generated by a non-Poisson process with positive autocorrelations, both synapses can transmit more information per vesicle release than a synapse with perfect transmission, although the information increases are greater for the depressing synapse than for a constant-probability synapse with the same average transmission probability. The enhanced performance of the depressing synapse over the constant-release-probability synapse primarily reflects a decrease in noise entropy rather than an increase in the total transmission entropy. This indicates a limitation of analysis methods, such as decorrelation, that consider only the total response entropy. My results suggest that synaptic transmission failures governed by appropriately tuned synaptic dynamics can increase the information-carrying efficiency of a synapse.
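The abstract's "information per vesicle release" measure can be made concrete in the simplest setting: independent binary time bins, a spike probability q per bin, and a constant release probability p per spike. The sketch below (Python; the function names are illustrative, not from the paper) evaluates the per-bin mutual information I(S; R) = H_b(qp) − q·H_b(p) and divides by the release rate qp. Note that in this independent-bin case failures only reduce information per release; the paper's efficiency gains specifically require positively autocorrelated (non-Poisson) inputs, which this sketch deliberately omits.

```python
import math

def h_bits(x):
    """Binary entropy in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def info_per_release(q, p):
    """Mutual information per vesicle release for a constant-probability
    synapse driven by independent binary time bins.

    q: probability of a presynaptic spike in a bin
    p: probability that a spike triggers a vesicle release

    Per bin, I(S; R) = H(R) - H(R|S) = H_b(q*p) - q*H_b(p); dividing by
    the release rate q*p gives bits conveyed per release.
    """
    i_per_bin = h_bits(q * p) - q * h_bits(p)
    return i_per_bin / (q * p)

# Perfect transmission (p = 1): every release signals a spike exactly.
print(info_per_release(0.5, 1.0))   # 2.0 bits per release
# Unreliable transmission with independent bins: the noise entropy term
# q*H_b(p) eats into the per-release information.
print(info_per_release(0.5, 0.5))
```

With q = 0.5 the unreliable synapse (p = 0.5) transmits about 1.25 bits per release versus 2 bits for perfect transmission, illustrating why the paper needs temporal correlations in the input for failures to pay off.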

Original language: English (US)
Pages (from-to): 1137-1162
Number of pages: 26
Journal: Neural Computation
Volume: 16
Issue number: 6
DOIs: https://doi.org/10.1162/089976604773717568
State: Published - Jun 2004
Externally published: Yes

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Artificial Intelligence
  • Neuroscience (all)

Cite this

Goldman, Mark S. Enhancement of information transmission efficiency by synaptic failures. In: Neural Computation, Vol. 16, No. 6, June 2004, pp. 1137-1162.

@article{1b43cb43fd6a4848ba2766b92187ae80,
title = "Enhancement of information transmission efficiency by synaptic failures",
author = "Goldman, {Mark S}",
year = "2004",
month = "6",
doi = "10.1162/089976604773717568",
language = "English (US)",
volume = "16",
pages = "1137--1162",
journal = "Neural Computation",
issn = "0899-7667",
publisher = "MIT Press Journals",
number = "6",
}

TY - JOUR

T1 - Enhancement of information transmission efficiency by synaptic failures

AU - Goldman, Mark S

PY - 2004/6

Y1 - 2004/6


UR - http://www.scopus.com/inward/record.url?scp=2442640054&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=2442640054&partnerID=8YFLogxK

U2 - 10.1162/089976604773717568

DO - 10.1162/089976604773717568

M3 - Article

C2 - 15130245

AN - SCOPUS:2442640054

VL - 16

SP - 1137

EP - 1162

JO - Neural Computation

JF - Neural Computation

SN - 0899-7667

IS - 6

ER -