Prof. Janusz Szczepański, Ph.D., Dr. Habil.

Department of Biosystems and Soft Matter (ZBiMM)
Division of Complex Fluids (PFPZ)
position: professor IPPT
telephone: (+48) 22 826 12 81 ext.: 449
room: 324
e-mail: jszczepa
personal site: http://bluebox.ippt.pan.pl/~jszczepa/

Doctoral thesis
1985: The Liouville equation in an infinite-dimensional separable Hilbert space
supervisor: Prof. Henryk Zorski, Ph.D., Dr. Habil., Eng., IPPT PAN
Habilitation thesis
2007-06-14: Application of dynamical systems in cryptography
Professor
2014-07-28: Title of professor
Supervision of doctoral theses
1. 2015-12-04: Paprocki Bartosz (UKW), Analysis of data transmission efficiency in neural cells and networks using Information Theory methods
2. 2012-05-31: Chmielowiec Andrzej, Efficient methods for generating secure parameters of public-key algorithms

Recent publications
1.Pręgowska A., Szczepański J., Wajnryb E., Temporal code versus rate code for binary Information Sources, NEUROCOMPUTING, ISSN: 0925-2312, DOI: 10.1016/j.neucom.2016.08.034, Vol.216, pp.756-762, 2016
Abstract:

Neuroscientists formulate very different hypotheses about the nature of neural coding. At one extreme, it has been argued that neurons encode information through relatively slow changes in the arrival rates of individual spikes (rate codes) and that the irregularity in the spike trains reflects the noise in the system. At the other extreme, this irregularity is the code itself (temporal codes), so that the precise timing of every spike carries additional information about the input. It is well known that in the estimation of the Shannon Information Transmission Rate, the patterns and temporal structures are taken into account, while the “rate code” is determined simply by the firing rate, i.e. by the spike frequency. In this paper we compare these two types of codes for binary Information Sources, which model encoded spike trains. Assuming that the information transmitted by a neuron is governed by an uncorrelated stochastic process or by a process with a memory, we compare the Information Transmission Rates carried by such spike trains with their firing rates. Here we show that a crucial role in the relation between information transmission and firing rates is played by a factor that we call the “jumping” parameter. This parameter corresponds to the probability of transitions from the no-spike-state to the spike-state and vice versa. For low jumping parameter values, the quotient of information and firing rates is a monotonically decreasing function of the firing rate, and there is therefore a straightforward, one-to-one relation between temporal and rate codes. However, it turns out that for large enough values of the jumping parameter this quotient is a non-monotonic function of the firing rate and it exhibits a global maximum, so that in this case there is an optimal firing rate. Moreover, there is no one-to-one relation between information and firing rates, so the temporal and rate codes differ qualitatively. This leads to the observation that the behavior of the quotient of information and firing rates for large jumping parameter values is especially important in the context of bursting phenomena.

Keywords:

Information Theory, Information Source, Stochastic process, Information transmission rate, Firing rate

Affiliations:
Pręgowska A.-IPPT PAN
Szczepański J.-IPPT PAN
Wajnryb E.-IPPT PAN
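The quotient compared in this paper is easy to compute for a concrete source. Below is a minimal sketch (not the authors' code), assuming a two-state Markov model of the encoded spike train in which a hypothetical parameter j scales both transition probabilities so that the firing rate stays fixed while the jumping behavior varies; the Information Transmission Rate is taken as the entropy rate of the chain. The parameterization via j is an illustrative assumption, not the paper's.

import numpy as np

def h2(p):
    # binary entropy in bits, with h2(0) = h2(1) = 0
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def markov_rates(p01, p10):
    # entropy rate (bits/symbol) and firing rate of a two-state Markov chain;
    # p01 = P(spike | no spike), p10 = P(no spike | spike)
    pi1 = p01 / (p01 + p10)              # stationary probability of the spike state
    return (1 - pi1) * h2(p01) + pi1 * h2(p10), pi1

for j in (0.1, 0.5, 0.9):                # illustrative "jumping" values (assumed parameterization)
    for f in (0.1, 0.3, 0.5, 0.7, 0.9):
        itr, fr = markov_rates(j * f, j * (1 - f))   # by construction the firing rate equals f
        print(f"j={j:.1f}  firing rate={fr:.1f}  ITR={itr:.3f} bits  ITR/rate={itr / fr:.3f}")

This toy parameterization only shows how the quotient of information and firing rates can be evaluated; the regime in which it becomes non-monotonic is analyzed in the paper with the model's exact parameterization.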
2.Pręgowska A., Szczepański J., Wajnryb E., Mutual information against correlations in binary communication channels, BMC NEUROSCIENCE, ISSN: 1471-2202, DOI: 10.1186/s12868-015-0168-0, Vol.16, No.32, pp.1-7, 2015
Abstract:

Background
Explaining how brain processing is so fast remains an open problem (van Hemmen JL, Sejnowski T., 2004). Thus, the analysis of neural transmission processes (Shannon CE, Weaver W., 1963) basically focuses on searching for effective encoding and decoding schemes. According to Shannon's fundamental theorem, mutual information plays a crucial role in characterizing the efficiency of communication channels. It is well known that this efficiency is determined by the channel capacity, which is the maximal mutual information between input and output signals. On the other hand, intuitively speaking, when input and output signals are more correlated, the transmission should be more efficient. A natural question arises about the relation between mutual information and correlation. We analyze the relation between these quantities using the binary representation of signals, which is the most common approach taken in studying neuronal processes of the brain.

Results
We present binary communication channels for which mutual information and correlation coefficients behave differently both quantitatively and qualitatively. Despite this difference in behavior, we show that the noncorrelation of binary signals implies their independence, in contrast to the case for general types of signals.

Conclusions
Our research shows that mutual information cannot be replaced by sheer correlations. Our results indicate that neuronal encoding has a more complicated nature that cannot be captured by straightforward correlations between input and output signals, since the mutual information takes into account the structure and patterns of the signals.

Keywords:

Shannon information, Communication channel, Entropy, Mutual information, Correlation, Neuronal encoding

Affiliations:
Pręgowska A.-IPPT PAN
Szczepański J.-IPPT PAN
Wajnryb E.-IPPT PAN
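The two quantities compared above can be juxtaposed directly. A minimal sketch (illustrative joint distributions, not the paper's examples) that computes mutual information and the Pearson correlation coefficient from a 2x2 joint distribution; it illustrates the binary-case fact stated in the Results, that zero correlation forces independence, and that channels with identical correlation can carry different mutual information.

import numpy as np

def mi_and_corr(P):
    # P[x][y]: joint probabilities of binary X and Y; returns (MI in bits, Pearson r)
    P = np.asarray(P, dtype=float)
    px, py = P.sum(axis=1), P.sum(axis=0)
    mi = sum(P[x, y] * np.log2(P[x, y] / (px[x] * py[y]))
             for x in range(2) for y in range(2) if P[x, y] > 0)
    cov = P[1, 1] - px[1] * py[1]
    r = cov / np.sqrt(px[1] * (1 - px[1]) * py[1] * (1 - py[1]))
    return mi, r

print(mi_and_corr([[0.25, 0.25], [0.25, 0.25]]))      # r = 0 and MI = 0: uncorrelated implies independent
print(mi_and_corr([[0.40, 0.10], [0.10, 0.40]]))      # r = 0.6, one value of MI
print(mi_and_corr([[0.616, 0.084], [0.084, 0.216]]))  # r = 0.6 again, yet a different MI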
3.Arnold M.M., Szczepański J., Montejo N., Amigó J.M., Wajnryb E., Sanchez-Vives M.V., Information content in cortical spike trains during brain state transitions, JOURNAL OF SLEEP RESEARCH, ISSN: 0962-1105, DOI: 10.1111/j.1365-2869.2012.01031.x, Vol.22, pp.13-21, 2013
Abstract:

Even in the absence of external stimuli there is ongoing activity in the cerebral cortex as a result of recurrent connectivity. This paper attempts to characterize one aspect of this ongoing activity by examining how the information content carried by specific neurons varies as a function of brain state. We recorded from rats chronically implanted with tetrodes in the primary visual cortex during awake and sleep periods. Electroencephalogram and spike trains were recorded during 30-min periods, and 2–4 neuronal spike trains were isolated per tetrode off-line. All the activity included in the analysis was spontaneous, being recorded from the visual cortex in the absence of visual stimuli. The brain state was determined through a combination of behavior evaluation, electroencephalogram and electromyogram analysis. Information in the spike trains was determined by using Lempel–Ziv Complexity. Complexity was used to estimate the entropy of neural discharges and thus the information content (Amigó et al., Neural Comput., 2004, 16: 717–736). The information content in spike trains (range 4–70 bits/s) was evaluated during different brain states and particularly during the transition periods. Transitions toward states of deeper sleep coincided with a decrease of information, while transitions to the awake state resulted in an increase in information. Changes in both directions were of the same magnitude, about 30%. Information in spike trains showed a high temporal correlation between neurons, reinforcing the idea of the impact of the brain state on the information content of spike trains.

Keywords:

awake, brain states, entropy, firing rate, information, sleep, spike train

Affiliations:
Arnold M.M.-Universidad Miguel Hernández-CSIC (ES)
Szczepański J.-IPPT PAN
Montejo N.-Universidad Miguel Hernández-CSIC (ES)
Amigó J.M.-Universidad Miguel Hernández-CSIC (ES)
Wajnryb E.-IPPT PAN
Sanchez-Vives M.V.-ICREA-IDIBAPS (ES)
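For readers unfamiliar with the estimator used in this study, the following sketch implements the Lempel–Ziv (1976) complexity with the standard Kaspar–Schuster phrase counting, normalized by log2(n)/n to give an entropy-rate estimate for a binary train. This is a textbook version, not the authors' implementation.

import numpy as np

def lz76(s):
    # number of Lempel-Ziv (1976) phrases in s, Kaspar-Schuster counting scheme
    n, c = len(s), 1
    i, k, l, k_max = 0, 1, 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            k_max = max(k, k_max)
            i += 1
            if i == l:               # the phrase cannot be copied from the past: close it
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

rng = np.random.default_rng(0)
for p in (0.05, 0.2, 0.5):           # spike probability of a Bernoulli spike train
    train = ''.join('1' if u < p else '0' for u in rng.random(20000))
    h_est = lz76(train) * np.log2(len(train)) / len(train)
    h_true = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    print(f"p={p:.2f}  H_est={h_est:.3f}  H_true={h_true:.3f} bits/symbol")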
4.Paprocki B., Szczepański J., How do the amplitude fluctuations affect the neuronal transmission efficiency, NEUROCOMPUTING, ISSN: 0925-2312, DOI: 10.1016/j.neucom.2012.11.001, Vol.104, pp.50-56, 2013
Abstract:

Tremendous effort has been devoted to understanding the nature of neuronal coding, its high efficiency and the mechanisms governing it. Paprocki and Szczepanski [13] explored the model of neuron proposed by Levy and Baxter [12] and analyzed the efficiency with respect to the synaptic failure, activation threshold, firing rate and type of the input source. In this paper we study the influence of amplitude fluctuations (damping, uniform and amplifying), another important component of neuronal computations. The efficiency is understood in the sense of mutual information (between input and output signals), which constitutes the fundamental concept of Shannon communication theory. Using high quality entropy estimators we determined maximal values of the mutual information between input and output neuronal signals for a simple neuronal architecture. We observed that this maximal efficiency remains nearly constant, almost regardless of the fluctuation type. We have also found that for a wide range of thresholds, both for damping and amplifying fluctuations, the mutual information behaves in the opposite way to the corresponding correlations between input and output signals. These calculations confirm that neuronal coding is much more subtle than the straightforward intuitive optimization of input–output correlations.

Keywords:

Neural computation, Mutual information, Amplitude fluctuation, Activation threshold, Synaptic failure, Entropy estimation

Affiliations:
Paprocki B.-Kazimierz Wielki University (PL)
Szczepański J.-IPPT PAN
5.Paprocki B., Szczepański J., Transmission efficiency in ring, brain inspired neuronal networks. Information and energetic aspects, Brain Research, ISSN: 0006-8993, DOI: 10.1016/j.brainres.2013.07.024, Vol.1536, pp.135-143, 2013
Abstract:

Organisms often evolve as compromises, and many of these compromises can be expressed in terms of energy efficiency. Thus, many authors analyze the energetic costs of information transmission processes in the brain. In this paper we study the information transmission rate per energy used in a class of ring, brain inspired neural networks, which we assume to involve components like excitatory and inhibitory neurons or long-range connections. In choosing the neuron model we followed the probabilistic approach proposed by Levy and Baxter (2002), which contains all essential qualitative mechanisms participating in the transmission process and provides results consistent with physiologically observed values.

Our research shows that all network components, in a broad range of conditions, significantly improve the information-energetic efficiency. It turned out that inhibitory neurons can improve the information-energetic transmission efficiency by 50%, while long-range connections can improve the efficiency even by 70%. We also found that the network with the smallest size is the most effective: we observed that a twofold increase in size can cause even a threefold decrease in the information-energetic efficiency.

Keywords:

Information transmission efficiency, Mutual information, Brain inspired network, Inhibitory neuron, Long-range connection, Neuronal computation

Affiliations:
Paprocki B.-Kazimierz Wielki University (PL)
Szczepański J.-IPPT PAN
6.Paprocki B., Szczepański J., Kołbuk D., Information transmission efficiency in neuronal communication systems, BMC NEUROSCIENCE, ISSN: 1471-2202, DOI: 10.1186/1471-2202-14-S1-P217, Vol.14(Suppl 1), No.P217, pp.1-2, 2013
Abstract:

The nature of brain transmission processes, their high reliability and their efficiency remain among the most elusive areas of contemporary science [1]. We study information transmission efficiency by considering neuronal communication as a Shannon-type channel. Thus, using high quality entropy estimators, we evaluate the mutual information between input and output signals. We assume the neuron model proposed by Levy and Baxter [2], which incorporates all essential qualitative mechanisms participating in the neural transmission process.

Keywords:

transmission efficiency, neuronal communication, Shannon-type channel

Affiliations:
Paprocki B.-Kazimierz Wielki University (PL)
Szczepański J.-IPPT PAN
Kołbuk D.-IPPT PAN
7.Szczepański J., Arnold M., Wajnryb E., Amigó J.M., Sanchez-Vives M.V., Mutual information and redundancy in spontaneous communication between cortical neurons, BIOLOGICAL CYBERNETICS, ISSN: 0340-1200, DOI: 10.1007/s00422-011-0425-y, Vol.104, pp.161-174, 2011
Abstract:

An important question in neural information processing is how neurons cooperate to transmit information. To study this question, we resort to the concept of redundancy in the information transmitted by a group of neurons and, at the same time, we introduce a novel concept for measuring cooperation between pairs of neurons called relative mutual information (RMI). Specifically, we studied these two parameters for spike trains generated by neighboring neurons from the primary visual cortex in the awake, freely moving rat. The spike trains studied here were spontaneously generated in the cortical network, in the absence of visual stimulation. Under these conditions, our analysis revealed that while the value of RMI oscillated slightly around an average value, the redundancy exhibited a behavior characterized by a higher variability. We conjecture that this combination of approximately constant RMI and more variable redundancy makes information transmission more resistant to noise disturbances. Furthermore, the redundancy values suggest that neurons can cooperate in a flexible way during information transmission. This mostly occurs via a leading neuron with a higher transmission rate or, less frequently, through the information rate of the whole group being higher than the sum of the individual information rates, in other words in a synergetic manner. The proposed method applies not only to stationary, but also to locally stationary neural signals.

Keywords:

Neurons, Shannon information, Entropy, Mutual information, Redundancy, Visual cortex, Spike trains, Spontaneous activity

Affiliations:
Szczepański J.-IPPT PAN
Arnold M.-Universidad Miguel Hernández-CSIC (ES)
Wajnryb E.-IPPT PAN
Amigó J.M.-Universidad Miguel Hernández-CSIC (ES)
Sanchez-Vives M.V.-ICREA-IDIBAPS (ES)
8.Paprocki B., Szczepański J., Efficiency of neural transmission as a function of synaptic noise, threshold, and source characteristics, BIOSYSTEMS, ISSN: 0303-2647, DOI: 10.1016/j.biosystems.2011.03.005, Vol.105, pp.62-72, 2011
Abstract:

There has been a growing interest in the estimation of information carried by a single neuron and by multiple single units or populations of neurons in response to specific stimuli. In this paper we analyze, inspired by the article of Levy and Baxter (2002), the efficiency of neuronal communication by considering dendrosomatic summation as a Shannon-type channel (Shannon, 1948) and by considering such uncertain synaptic transmission as part of the dendrosomatic computation. Specifically, we study the Mutual Information between input and output signals for different types of neuronal network architectures by applying efficient entropy estimators. We analyze the influence of the following quantities affecting the transmission abilities of neurons: synaptic failure, activation threshold, firing rate and type of the input source. We observed a number of surprising, non-intuitive effects. It turns out that, especially for lower activation thresholds, significant synaptic noise can lead even to a twofold increase in the transmission efficiency. Moreover, the efficiency turns out to be a non-monotonic function of the activation threshold. We find a universal value of the threshold for which a local maximum of Mutual Information is achieved for most of the neuronal architectures, regardless of the type of the source (correlated and non-correlated). Additionally, to reach the global maximum the optimal firing rates must increase with the threshold. This effect is particularly visible for lower firing rates. For higher firing rates the influence of synaptic noise on the transmission efficiency is more advantageous. Noise is an inherent component of communication in biological systems; hence, based on our analysis, we conjecture that the neuronal architecture was adjusted to make more effective use of this attribute.

Keywords:

Neuronal computation, Entropy, Mutual Information, Estimators, Neuron, Quantal failure, Activation threshold, Neural network

Affiliations:
Paprocki B.-Kazimierz Wielki University (PL)
Szczepański J.-IPPT PAN
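The setup described above lends itself to a small Monte-Carlo experiment. The sketch below is written in the spirit of the Levy–Baxter model as summarized in the abstract; the distributions, parameter names and values are illustrative assumptions, and the plug-in estimator is naive, unlike the efficient entropy estimators used in the paper.

import numpy as np
from collections import Counter

rng = np.random.default_rng(1)

def simulate(n_syn=4, f=0.5, success=0.7, threshold=1.5, trials=100000):
    # binary inputs at rate f pass unreliable synapses (failure prob 1 - success),
    # are scaled by random amplitudes and summed; the neuron spikes above threshold
    X = rng.random((trials, n_syn)) < f
    S = rng.random((trials, n_syn)) < success
    Q = rng.random((trials, n_syn))                 # amplitude fluctuations (assumed uniform)
    Y = ((X * S * Q).sum(axis=1) >= threshold).astype(int)
    return X.astype(int), Y

def plugin_mi(X, Y):
    # naive plug-in estimate of I(X;Y) in bits from joint sample counts
    keys = [tuple(x) + (y,) for x, y in zip(X, Y)]
    n = len(keys)
    pj = Counter(keys)
    px = Counter(k[:-1] for k in keys)
    py = Counter(k[-1] for k in keys)
    return sum(c / n * np.log2(c * n / (px[k[:-1]] * py[k[-1]]))
               for k, c in pj.items())

for thr in (0.5, 1.0, 1.5, 2.0):                    # sweep the activation threshold
    X, Y = simulate(threshold=thr)
    print(f"threshold={thr:.1f}  I(X;Y) ~ {plugin_mi(X, Y):.3f} bits")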
9.Szczepański J., On the distribution function of the complexity of finite sequences, INFORMATION SCIENCES, ISSN: 0020-0255, DOI: 10.1016/j.ins.2008.12.019, Vol.179, pp.1217-1220, 2009
Abstract:

Investigations of the complexity of sequences lead to important applications such as effective data compression, testing of randomness, discriminating between information sources and many others. In this paper we establish formulae describing the distribution functions of random variables representing the complexity of finite sequences introduced by Lempel and Ziv in 1976. It is known that this quantity can be used as an estimator of entropy. We show that the distribution functions depend affinely on the probabilities of the so-called “exact” sequences.

Keywords:

Lempel–Ziv complexity, Distribution function, Randomness

Affiliations:
Szczepański J.-IPPT PAN
10.Nagarajan R., Szczepański J., Wajnryb E., Interpreting non-random signatures in biomedical signals with Lempel-Ziv complexity, PHYSICA D-NONLINEAR PHENOMENA, ISSN: 0167-2789, DOI: 10.1016/j.physd.2007.09.007, Vol.237, pp.359-364, 2008
Abstract:

Lempel–Ziv complexity (LZ) [J. Ziv, A. Lempel, On the complexity of finite sequences, IEEE Trans. Inform. Theory 22 (1976) 75–81] and its variants have been used widely to identify non-random patterns in biomedical signals obtained across distinct physiological states. Non-random signatures of the complexity measure can occur under nonlinear deterministic as well as non-deterministic settings. Surrogate data testing has also been encouraged in the past, in conjunction with complexity estimates, to make a finer distinction between various classes of processes. In this brief letter, we make two important observations: (1) non-Gaussian noise at the dynamical level can elude existing surrogate algorithms, namely phase-randomized surrogates (FT), amplitude-adjusted Fourier transform (AAFT) and iterated amplitude-adjusted Fourier transform (IAAFT); thus any inference of nonlinear determinism as an explanation for the non-randomness is incomplete. (2) A decrease in complexity can be observed even across two linear processes with identical auto-correlation functions. The results are illustrated with a second-order auto-regressive process with Gaussian and non-Gaussian innovations. AR(2) processes have been used widely to model several physiological phenomena, hence their choice. The results presented encourage cautious interpretation of non-random signatures in experimental signals using complexity measures.

Keywords:

Lempel–Ziv complexity, Surrogate testing, Auto-regressive process

Affiliations:
Nagarajan R.-other affiliation
Szczepański J.-IPPT PAN
Wajnryb E.-IPPT PAN
11.Amigó J.M., Kocarev L., Szczepański J., On some properties of the discrete Lyapunov exponent, PHYSICS LETTERS A, ISSN: 0375-9601, DOI: 10.1016/j.physleta.2008.07.076, Vol.372, pp.6265-6268, 2008
Abstract:

One of the possible by-products of discrete chaos is the application of its tools, in particular of the discrete Lyapunov exponent, to cryptography. In this Letter we explore this question in a very general setting.

Affiliations:
Amigó J.M.-Universidad Miguel Hernández-CSIC (ES)
Kocarev L.-University “Kiril i Metodij”, Skopje (MK)
Szczepański J.-IPPT PAN
12.Amigó J.M., Kocarev L., Szczepański J., Theory and practice of chaotic cryptography, PHYSICS LETTERS A, ISSN: 0375-9601, DOI: 10.1016/j.physleta.2007.02.021, Vol.366, pp.211-216, 2007
Abstract:

In this Letter we address some basic questions about chaotic cryptography, not least the very definition of chaos in discrete systems. We propose a conceptual framework and illustrate it with different examples from private and public key cryptography. We elaborate also on possible limits of chaotic cryptography.

Affiliations:
Amigó J.M.-Universidad Miguel Hernández-CSIC (ES)
Kocarev L.-University “Kiril i Metodij”, Skopje (MK)
Szczepański J.-IPPT PAN
13.Amigó J.M., Kocarev L., Szczepański J., Discrete Lyapunov exponent and resistance to differential cryptanalysis, IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, ISSN: 1549-7747, DOI: 10.1109/TCSII.2007.901576, Vol.54, No.10, pp.882-886, 2007
Abstract:

In a recent paper, Jakimoski and Subbalakshmi provided a nice connection between the so-called discrete Lyapunov exponent of a permutation F defined on a finite lattice and its maximal differential probability, a parameter that measures the complexity of a differential cryptanalysis attack on the substitution defined by F. In this brief, we take a second look at their result to find some practical shortcomings. We also discuss more general aspects.

Keywords:

Differential cryptanalysis, discrete Lyapunov exponent (DLE), maximum differential probability (DP)

Affiliations:
Amigó J.M.-Universidad Miguel Hernández-CSIC (ES)
Kocarev L.-University “Kiril i Metodij”, Skopje (MK)
Szczepański J.-IPPT PAN
14.Kocarev L., Szczepański J., Amigó J.M., Tomovski I., Discrete chaos - I: theory, IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, ISSN: 1549-8328, DOI: 10.1109/TCSI.2006.874181, Vol.53, No.6, pp.1300-1309, 2006
Abstract:

We propose a definition of the discrete Lyapunov exponent for an arbitrary permutation of a finite lattice. For discrete-time dynamical systems, it measures the local (between neighboring points) average spreading of the system. We justify our definition by proving that, for large classes of chaotic maps, the corresponding discrete Lyapunov exponent approaches the largest Lyapunov exponent of a chaotic map when M → ∞, where M is the cardinality of the discrete phase space. In analogy with continuous systems, we say the system has discrete chaos if its discrete Lyapunov exponent tends to a positive number when M → ∞. We present several examples to illustrate the concepts being introduced.

Keywords:

Chaos, discrete chaos, Lyapunov exponents

Affiliations:
Kocarev L.-University “Kiril i Metodij”, Skopje (MK)
Szczepański J.-IPPT PAN
Amigó J.M.-Universidad Miguel Hernández-CSIC (ES)
Tomovski I.-other affiliation
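A minimal numerical sketch of the idea. The exact definition is given in the paper; here, following the abstract's description, the discrete Lyapunov exponent of a permutation is taken as the average logarithmic distance between the images of neighboring lattice points, with circular distance on Z_M (an assumption made for illustration).

import numpy as np

def dle(F):
    # average log-spreading of neighboring points under the permutation F
    M = len(F)
    idx = np.arange(M)
    d = np.abs(F[(idx + 1) % M] - F[idx])
    d = np.minimum(d, M - d)             # circular distance on the lattice Z_M
    return np.log(np.maximum(d, 1)).mean()

M = 2**12
rng = np.random.default_rng(2)
print("identity     :", dle(np.arange(M)))             # 0: no spreading at all
print("x -> 3x mod M:", dle((3 * np.arange(M)) % M))   # ln 3 under this reading
print("random perm  :", dle(rng.permutation(M)))       # large: maximal scrambling

The second line illustrates the convergence claim: the discretization of the expanding map x -> 3x mod 1 yields, under this illustrative definition, exactly ln 3, the Lyapunov exponent of its continuous counterpart.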
15.Amigó J.M., Kocarev L., Szczepański J., Order patterns and chaos, PHYSICS LETTERS A, ISSN: 0375-9601, DOI: 10.1016/j.physleta.2006.01.093, Vol.355, pp.27-31, 2006
Abstract:

Chaotic maps can mimic random behavior in a quite impressive way. In particular, those possessing a generating partition can produce any symbolic sequence by properly choosing the initial state. We study in this Letter the ability of chaotic maps to generate order patterns and come to the conclusion that their performance in this respect falls short of expectations. This result reveals a basic limitation of deterministic dynamics as compared to random ones. This being the case, we propose a non-statistical test based on ‘forbidden’ order patterns to discriminate chaotic from truly random time series with, in principle, arbitrarily high probability. Some relations with discrete chaos and chaotic cryptography are also discussed.

Keywords:

Chaotic maps, Order patterns, Permutation entropy, Discrete Lyapunov exponent, Chaotic cryptography

Affiliations:
Amigó J.M.-Universidad Miguel Hernández-CSIC (ES)
Kocarev L.-University “Kiril i Metodij”, Skopje (MK)
Szczepański J.-IPPT PAN
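The 'forbidden pattern' test can be demonstrated in a few lines. The sketch below (a standard illustration, not the Letter's code) uses the known fact that the fully chaotic logistic map never produces a strictly decreasing triple, while random data eventually exhibits all six ordinal patterns of length 3.

import itertools
import numpy as np

def pattern_counts(x, L=3):
    # count ordinal (order) patterns of length L, encoded via argsort
    counts = {p: 0 for p in itertools.permutations(range(L))}
    for t in range(len(x) - L + 1):
        counts[tuple(np.argsort(x[t:t + L]))] += 1
    return counts

x = np.empty(100000)
x[0] = 0.123456
for t in range(len(x) - 1):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])      # fully chaotic logistic map

rng = np.random.default_rng(3)
print("logistic:", pattern_counts(x))                     # (2, 1, 0) never appears
print("noise   :", pattern_counts(rng.random(100000)))    # all six patterns appear

Under the argsort encoding used here, (2, 1, 0) is the decreasing triple x_t > x_{t+1} > x_{t+2}. A short calculation shows the three inequalities are incompatible for 4x(1-x): x_{t+2} < x_{t+1} forces x_{t+1} > 3/4, which forces x_t in (1/4, 3/4), while x_{t+1} < x_t forces x_t > 3/4; the pattern is therefore genuinely forbidden rather than merely rare.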
16.Szczepański J., Amigó J.M., Michałek T., Kocarev L., Cryptographically secure substitutions based on the approximation of mixing maps, IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, ISSN: 1549-8328, DOI: 10.1109/TCSI.2004.841602, Vol.52, No.2, pp.443-453, 2005
Abstract:

In this paper, we explore, following Shannon’s suggestion that diffusion should be one of the ingredients of resistant block ciphers, the feasibility of designing cryptographically secure substitutions (think of S-boxes, say) via approximation of mixing maps by periodic transformations. The expectation behind this approach is, of course, that the nice diffusion properties of such maps will be inherited by their approximations, at least if the convergence rate is appropriate and the associated partitions are sufficiently fine. Our results show that this is indeed the case and that, in principle, block ciphers with close-to-optimal immunity to linear and differential cryptanalysis (as measured by the linear and differential approximation probabilities) can be designed along these guidelines. We provide also practical examples and numerical evidence for this approximation philosophy.

Keywords:

Block cipher, differential cryptanalysis, linear cryptanalysis, mixing dynamical system, periodic approximation, S-box

Affiliations:
Szczepański J.-IPPT PAN
Amigó J.M.-Universidad Miguel Hernández-CSIC (ES)
Michałek T.-IPPT PAN
Kocarev L.-University “Kiril i Metodij”, Skopje (MK)
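A toy rendition of this design philosophy (illustrative parameters and map choice, not the paper's construction): discretize a mixing toral automorphism, Arnold's cat map, into an 8-bit substitution and measure the maximal differential probability named in the abstract.

import numpy as np

N = 16                                    # a byte is a pair (x, y) with x, y in Z_16

def sbox_from_cat(rounds=3):
    # iterate the cat map (x, y) -> (x + y, x + 2y) mod N, a mixing automorphism
    # of the torus, on the N x N grid; the discretization is a permutation of bytes
    S = np.empty(N * N, dtype=int)
    for b in range(N * N):
        x, y = b % N, b // N
        for _ in range(rounds):
            x, y = (x + y) % N, (x + 2 * y) % N
        S[b] = x + N * y
    return S

def max_dp(S):
    # maximal differential probability: max over a != 0 and b of
    # P(S(x ^ a) ^ S(x) = b), with XOR differences as in differential cryptanalysis
    n, best = len(S), 0
    for a in range(1, n):
        counts = np.zeros(n, dtype=int)
        for x in range(n):
            counts[S[x ^ a] ^ S[x]] += 1
        best = max(best, counts.max())
    return best / n

S = sbox_from_cat()
assert len(set(S.tolist())) == len(S)     # sanity check: the substitution is bijective
print("max differential probability:", max_dp(S))

Lower is better (for comparison, the AES S-box attains 4/256). The cat map is linear over Z_N, so this toy substitution is not expected to be competitive; the point is the measurement pipeline, into which a finer approximation of a mixing map can be dropped.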
17.Amigó J.M., Szczepański J., Kocarev L., A chaos-based approach to the design of cryptographically secure substitutions, PHYSICS LETTERS A, ISSN: 0375-9601, DOI: 10.1016/j.physleta.2005.05.057, Vol.343, pp.55-60, 2005
Abstract:

We show that chaotic maps may be used for designing so-called substitution boxes for ciphers resistant to linear and differential cryptanalysis, providing an alternative to the algebraic methods. Our approach is based on the approximation of mixing maps by periodic transformations. The expectation behind this is, of course, that the nice chaotic properties of such maps will be inherited by their approximations, at least if the convergence rate is appropriate and the associated partitions are sufficiently fine. We show that this is indeed the case and that, in principle, substitutions with close-to-optimal immunity to linear and differential cryptanalysis can be designed along these guidelines. We provide also practical examples and numerical evidence for this approximation philosophy.

Keywords:

Chaotic maps, Periodic approximations, Bit permutations, Cryptanalysis

Affiliations:
Amigó J.M.-Universidad Miguel Hernández-CSIC (ES)
Szczepański J.-IPPT PAN
Kocarev L.-University “Kiril i Metodij”, Skopje (MK)
18.Amigó J.M., Szczepański J., A Conceptual Guide to Chaos Theory, Prace IPPT - IFTR Reports, ISSN: 2299-3657, No.9, pp.1-43, 1999

Conference abstracts
1.Paprocki B., Pręgowska A., Szczepański J., Information Processing in Brain-Inspired Networks: Size and Density Effects, SolMech 2016, 40th Solid Mechanics Conference, 2016-08-29/09-02, Warszawa (PL), No.P192, pp.1-2, 2016
2.Szczepański J., Sanchez-Vives M.V., Arnold M.M., Montejo N., Paprocki B., Pręgowska A., Amigó J.M., Wajnryb E., Analyzing Neuroscience Signals using Information Theory and Complexity: Shannon Communication Approach, 12th INCF, 12th INCF Workshop on Node Communication and Collaborative Neuroinformatics, 2015-04-16/04-17, Warszawa (PL), pp.1-32, 2015
3.Szczepański J., Paprocki B., Transmission efficiency in the brain-like neuronal networks. Information and energetic aspects, 10th International Neural Coding Workshop, 2012-09-02/09-08, Prague (CZ), pp.127-128, 2012
Keywords:

Neuronal Communication, Brain-like Network, Shannon Theory

Affiliations:
Szczepański J.-IPPT PAN
Paprocki B.-Kazimierz Wielki University (PL)
4.Paprocki B., Szczepański J., Effectiveness of information transmission in the brain-like communication models, 10th International Neural Coding Workshop, 2012-09-02/09-08, Prague (CZ), pp.93-94, 2012
Keywords:

Brain-like network, Information transmission, Neuronal computation

Affiliations:
Paprocki B.-Kazimierz Wielki University (PL)
Szczepański J.-IPPT PAN