Partner: Bartosz Paprocki

Kazimierz Wielki University (PL)

Recent publications
1. Paprocki B., Szczepański J., How do the amplitude fluctuations affect the neuronal transmission efficiency, NEUROCOMPUTING, ISSN: 0925-2312, DOI: 10.1016/j.neucom.2012.11.001, Vol.104, pp.50-56, 2013
Abstract:

Tremendous effort has been devoted to understanding the nature of neuronal coding, its high efficiency and the mechanisms governing it. Paprocki and Szczepanski [13] explored the neuron model proposed by Levy and Baxter [12] and analyzed its efficiency with respect to synaptic failure, activation threshold, firing rate and the type of input source. In this paper we study the influence of amplitude fluctuations (damping, uniform and amplifying), another important component of neuronal computation. Efficiency is understood in the sense of mutual information (between input and output signals), which constitutes the fundamental concept of Shannon's communication theory. Using high-quality entropy estimators, we determined the maximal values of mutual information between input and output neuronal signals for a simple neuronal architecture. We observed that this maximal efficiency remains nearly constant, almost regardless of the fluctuation type. We also found that, for a wide range of thresholds, both for damping and amplifying fluctuations, the mutual information behaves in a way opposite to the corresponding correlations between input and output signals. These calculations confirm that neuronal coding is much more subtle than a straightforward intuitive optimization of input–output correlations.
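The mutual information between input and output signals that this abstract refers to can be illustrated with a minimal sketch. The paper relies on high-quality entropy estimators; the plug-in (maximum-likelihood) estimator below is a much simpler stand-in, shown only to make the quantity I(X;Y) = H(X) + H(Y) − H(X,Y) concrete. All function and variable names are illustrative, not taken from the paper.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) = H(X) + H(Y) - H(X,Y), in bits,
    from two equal-length sequences of paired samples."""
    n = len(xs)

    def entropy(counts):
        # Empirical Shannon entropy of a symbol-count table.
        return -sum((c / n) * log2(c / n) for c in counts.values())

    h_x = entropy(Counter(xs))            # H(X)
    h_y = entropy(Counter(ys))            # H(Y)
    h_xy = entropy(Counter(zip(xs, ys)))  # joint entropy H(X,Y)
    return h_x + h_y - h_xy

# A noiseless channel transmits the full source entropy:
x = [0, 1, 0, 1] * 250
print(mutual_information(x, x))  # 1.0 bit for a fair binary source
```

For short recordings this plug-in estimator is biased upward, which is why work like that cited here uses more sophisticated entropy estimators.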

Keywords:

Neural computation, Mutual information, Amplitude fluctuation, Activation threshold, Synaptic failure, Entropy estimation

Affiliations:
Paprocki B.-Kazimierz Wielki University (PL)
Szczepański J.-IPPT PAN
2. Paprocki B., Szczepański J., Transmission efficiency in ring, brain inspired neuronal networks. Information and energetic aspects, Brain Research, ISSN: 0006-8993, DOI: 10.1016/j.brainres.2013.07.024, Vol.1536, pp.135-143, 2013
Abstract:

Organisms often evolve as compromises, and many of these compromises can be expressed in terms of energy efficiency. Thus, many authors analyze the energetic costs of processes during information transmission in the brain. In this paper we study the information transmission rate per energy used in a class of ring, brain-inspired neural networks, which we assume to involve components such as excitatory and inhibitory neurons or long-range connections. In choosing the neuron model we followed the probabilistic approach proposed by Levy and Baxter (2002), which contains all essential qualitative mechanisms participating in the transmission process and provides results consistent with physiologically observed values.

Our research shows that all network components, over a broad range of conditions, significantly improve the information-energetic efficiency. It turned out that inhibitory neurons can improve the information-energetic transmission efficiency by 50%, while long-range connections can improve the efficiency by as much as 70%. We also found that the smallest network is the most effective: we observed that doubling the network size can cause even a threefold decrease of the information-energetic efficiency.
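The information-energetic efficiency studied here is, in essence, a ratio of information transmitted to energy spent. A minimal sketch of that idea, under the loud simplification that metabolic cost is proportional to the firing rate (the paper's actual energy model is more detailed, and these parameter names are assumptions):

```python
def info_energy_efficiency(mutual_info_bits, firing_rate, cost_per_spike=1.0):
    """Transmitted information divided by energy spent, with energy
    modelled (simplistically) as firing rate times a per-spike cost."""
    return mutual_info_bits / (firing_rate * cost_per_spike)

# Spending twice the energy for the same information halves the efficiency:
print(info_energy_efficiency(1.0, 0.25))  # 4.0
print(info_energy_efficiency(1.0, 0.5))   # 2.0
```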

Keywords:

Information transmission efficiency, Mutual information, Brain inspired network, Inhibitory neuron, Long-range connection, Neuronal computation

Affiliations:
Paprocki B.-Kazimierz Wielki University (PL)
Szczepański J.-IPPT PAN
3. Paprocki B., Szczepański J., Kołbuk D., Information transmission efficiency in neuronal communication systems, BMC NEUROSCIENCE, ISSN: 1471-2202, DOI: 10.1186/1471-2202-14-S1-P217, Vol.14(Suppl 1), No.P217, pp.1-2, 2013
Abstract:

The nature and efficiency of brain transmission processes, their high reliability and efficiency, constitute one of the most elusive areas of contemporary science [1]. We study information transmission efficiency by considering neuronal communication as a Shannon-type channel. Thus, using high-quality entropy estimators, we evaluate the mutual information between input and output signals. We assume the neuron model proposed by Levy and Baxter [2], which incorporates all essential qualitative mechanisms participating in the neural transmission process.

Keywords:

transmission efficiency, neuronal communication, Shannon-type channel

Affiliations:
Paprocki B.-Kazimierz Wielki University (PL)
Szczepański J.-IPPT PAN
Kołbuk D.-IPPT PAN
4. Paprocki B., Szczepański J., Efficiency of neural transmission as a function of synaptic noise, threshold, and source characteristics, BIOSYSTEMS, ISSN: 0303-2647, DOI: 10.1016/j.biosystems.2011.03.005, Vol.105, pp.62-72, 2011
Abstract:

There has been growing interest in estimating the information carried by a single neuron, multiple single units, or a population of neurons in response to specific stimuli. In this paper, inspired by the article of Levy and Baxter (2002), we analyze the efficiency of neuronal communication by considering dendrosomatic summation as a Shannon-type channel (1948) and by treating uncertain synaptic transmission as part of the dendrosomatic computation. Specifically, we study the mutual information between input and output signals for different types of neuronal network architectures by applying efficient entropy estimators. We analyze the influence of the following quantities affecting the transmission abilities of neurons: synaptic failure, activation threshold, firing rate and the type of input source. We observed a number of surprising, non-intuitive effects. It turns out that, especially for lower activation thresholds, significant synaptic noise can lead even to a twofold increase of the transmission efficiency. Moreover, the efficiency turns out to be a non-monotonic function of the activation threshold. We find a universal threshold value for which a local maximum of mutual information is achieved for most of the neuronal architectures, regardless of the type of the source (correlated or non-correlated). Additionally, to reach the global maximum, the optimal firing rates must increase with the threshold. This effect is particularly visible for lower firing rates. For higher firing rates the influence of synaptic noise on the transmission efficiency is more advantageous. Noise is an inherent component of communication in biological systems; hence, based on our analysis, we conjecture that the neuronal architecture has been adjusted to make more effective use of this attribute.
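The qualitative mechanisms the abstract lists — synaptic failure, random amplitude fluctuation, dendrosomatic summation and an activation threshold — can be sketched in the spirit of the Levy–Baxter (2002) model. This is a simplified illustration only: the parameter names, the uniform amplitude distribution and the default values are assumptions, not the paper's actual formulation.

```python
import random

def levy_baxter_step(inputs, s=0.7, b=0.5, amplitude=random.random):
    """One step of a Levy-Baxter-style neuron: each input spike survives
    synaptic failure with probability s, is scaled by a random amplitude
    in [0, 1), and the neuron fires iff the dendrosomatic sum exceeds
    the activation threshold b. Returns 1 (spike) or 0 (silence)."""
    total = sum(amplitude()                      # amplitude fluctuation
                for spike in inputs
                if spike and random.random() < s)  # synaptic failure
    return int(total > b)

# Deterministic check: no failures (s=1), unit amplitudes, 3 spikes vs b=2.5.
print(levy_baxter_step([1, 1, 1], s=1.0, b=2.5, amplitude=lambda: 1.0))  # 1
```

Scanning s (synaptic success probability) and b (threshold) while estimating input-output mutual information reproduces the kind of parameter study described above.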

Keywords:

Neuronal computation, Entropy, Mutual Information, Estimators, Neuron, Quantal failure, Activation threshold, Neural network

Affiliations:
Paprocki B.-Kazimierz Wielki University (PL)
Szczepański J.-IPPT PAN

Conference abstracts
1. Paprocki B., Pręgowska A., Szczepański J., Information Processing in Brain-Inspired Networks: Size and Density Effects, SolMech 2016, 40th Solid Mechanics Conference, 2016-08-29/09-02, Warszawa (PL), No.P192, pp.1-2, 2016
2. Szczepański J., Sanchez-Vives M.V., Arnold M.M., Montejo N., Paprocki B., Pręgowska A., Amigó J.M., Wajnryb E., Analyzing Neuroscience Signals using Information Theory and Complexity Shannon Communication Approach, 12th INCF Workshop on Node Communication and Collaborative Neuroinformatics, 2015-04-16/04-17, Warszawa (PL), pp.1-32, 2015
3. Szczepański J., Paprocki B., Transmission efficiency in the brain-like neuronal networks. Information and energetic aspects, 10th International Neural Coding Workshop, 2012-09-02/09-08, Prague (CZ), pp.127-128, 2012
Keywords:

Neuronal Communication, Brain-like Network, Shannon Theory

Affiliations:
Szczepański J.-IPPT PAN
Paprocki B.-Kazimierz Wielki University (PL)
4. Paprocki B., Szczepański J., Effectiveness of information transmission in the brain-like communication models, 10th International Neural Coding Workshop, 2012-09-02/09-08, Prague (CZ), pp.93-94, 2012
Keywords:

Brain-like network, Information transmission, Neuronal computation

Affiliations:
Paprocki B.-Kazimierz Wielki University (PL)
Szczepański J.-IPPT PAN