Prof. Janusz Szczepański, Ph.D., Dr. Habil. 

Doctoral thesis
1985  The Liouville equation in an infinite-dimensional separable Hilbert space (Równanie Liouville'a w nieskończenie wymiarowej ośrodkowej przestrzeni Hilberta)
Habilitation thesis
2007-06-14  Applications of dynamical systems in cryptography (Zastosowanie układów dynamicznych w kryptografii)
Professor
2014-07-28  Title of professor
Supervision of doctoral theses
1.  2015-12-04  Paprocki Bartosz (UKW)  Analysis of data transmission efficiency in neuronal cells and networks by methods of Information Theory (Analiza wydajności transmisji danych w komórkach i sieciach neuronowych metodami Teorii Informacji)
2.  2012-05-31  Chmielowiec Andrzej  Efficient methods of generating secure parameters for public-key algorithms (Wydajne metody generowania bezpiecznych parametrów algorytmów klucza publicznego)
Recent publications
1.  Pręgowska A., Szczepański J., Wajnryb E., Temporal code versus rate code for binary Information Sources, NEUROCOMPUTING, ISSN: 0925-2312, DOI: 10.1016/j.neucom.2016.08.034, Vol.216, pp.756-762, 2016 Abstract: Neuroscientists formulate very different hypotheses about the nature of neural coding. At one extreme, it has been argued that neurons encode information through relatively slow changes in the arrival rates of individual spikes (rate codes) and that the irregularity in the spike trains reflects the noise in the system. At the other extreme, this irregularity is the code itself (temporal codes), so that the precise timing of every spike carries additional information about the input. It is well known that in the estimation of the Shannon Information Transmission Rate, the patterns and temporal structures are taken into account, while the "rate code" is already determined by the firing rate, i.e. by the spike frequency. In this paper we compare these two types of codes for binary Information Sources, which model encoded spike trains. Assuming that the information transmitted by a neuron is governed by an uncorrelated stochastic process or by a process with memory, we compare the Information Transmission Rates carried by such spike trains with their firing rates. Here we show that a crucial role in the relation between information transmission and firing rates is played by a factor that we call the "jumping" parameter. This parameter corresponds to the probability of transitions from the no-spike state to the spike state and vice versa. For low jumping parameter values, the quotient of information and firing rates is a monotonically decreasing function of the firing rate, and there is therefore a straightforward, one-to-one relation between temporal and rate codes. However, it turns out that for large enough values of the jumping parameter this quotient is a non-monotonic function of the firing rate and it exhibits a global maximum, so that in this case there is an optimal firing rate. Moreover, there is no one-to-one relation between information and firing rates, so the temporal and rate codes differ qualitatively. This leads to the observation that the behavior of the quotient of information and firing rates for a large jumping parameter value is especially important in the context of bursting phenomena. Keywords: Information Theory, Information Source, Stochastic process, Information transmission rate, Firing rate
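The role of the "jumping" parameter described above can be illustrated with a two-state Markov source (a minimal sketch with invented function names; the paper's exact model may differ): the firing rate is the stationary probability of the spike state, and the information rate is the entropy rate of the chain.

```python
import math

def binary_entropy(p):
    """h(p) in bits; h(0) = h(1) = 0 by convention."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def markov_rates(p01, p10):
    """Entropy rate and firing rate of a binary (no-spike/spike) Markov source.

    p01 = P(spike | no spike), p10 = P(no spike | spike): the two
    "jumping" probabilities between the no-spike and spike states.
    """
    firing_rate = p01 / (p01 + p10)  # stationary P(spike)
    entropy_rate = ((1 - firing_rate) * binary_entropy(p01)
                    + firing_rate * binary_entropy(p10))  # bits per symbol
    return entropy_rate, firing_rate
```

Scanning one transition probability while holding the other fixed then lets one plot the quotient of entropy rate to firing rate against the firing rate and look for the non-monotonic regime the abstract describes.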
 
2.  Pręgowska A., Szczepański J., Wajnryb E., Mutual information against correlations in binary communication channels, BMC NEUROSCIENCE, ISSN: 1471-2202, DOI: 10.1186/s12868-015-0168-0, Vol.16, No.32, pp.1-7, 2015 Keywords: Shannon information, Communication channel, Entropy, Mutual information, Correlation, Neuronal encoding
 
3.  Arnold M.M.^{♦}, Szczepański J., Montejo N.^{♦}, Amigó J.M.^{♦}, Wajnryb E., Sanchez-Vives M.V.^{♦}, Information content in cortical spike trains during brain state transitions, JOURNAL OF SLEEP RESEARCH, ISSN: 0962-1105, DOI: 10.1111/j.1365-2869.2012.01031.x, Vol.22, pp.13-21, 2013 Abstract: Even in the absence of external stimuli there is ongoing activity in the cerebral cortex as a result of recurrent connectivity. This paper attempts to characterize one aspect of this ongoing activity by examining how the information content carried by specific neurons varies as a function of brain state. We recorded from rats chronically implanted with tetrodes in the primary visual cortex during awake and sleep periods. Electroencephalogram and spike trains were recorded during 30-min periods, and 2–4 single units were isolated per tetrode offline. All the activity included in the analysis was spontaneous, being recorded from the visual cortex in the absence of visual stimuli. The brain state was determined through a combination of behavior evaluation, electroencephalogram and electromyogram analysis. Information in the spike trains was determined by using Lempel–Ziv Complexity. Complexity was used to estimate the entropy of neural discharges and thus the information content (Amigó et al., Neural Comput., 2004, 16: 717–736). The information content in spike trains (range 4–70 bits/s) was evaluated during different brain states and particularly during the transition periods. Transitions toward states of deeper sleep coincided with a decrease of information, while transitions to the awake state resulted in an increase in information. Changes in both directions were of the same magnitude, about 30%. Information in spike trains showed a high temporal correlation between neurons, reinforcing the idea of the impact of the brain state on the information content of spike trains. Keywords: awake, brain states, entropy, firing rate, information, sleep, spike train
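Lempel–Ziv complexity, used above as an entropy estimator for spike trains, can be computed with the classic Kaspar–Schuster scan. A sketch, assuming a binary sequence given as a string; normalization conventions vary across the papers:

```python
def lz76_complexity(s):
    """Number of phrases in the Lempel-Ziv (1976) parsing of the string s
    (Kaspar-Schuster algorithm).  c * log2(n) / n is a common entropy
    estimate for a binary sequence of length n."""
    n = len(s)
    if n <= 1:
        return n
    i, c, l = 0, 1, 1       # i: match start, c: phrase count, l: phrase start
    k, k_max = 1, 1         # k: current match length, k_max: longest so far
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:   # current phrase runs to the end of s
                c += 1
                break
        else:
            k_max = max(k, k_max)
            i += 1
            if i == l:      # no earlier match anywhere: start a new phrase
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c
```

For example, "0110" parses into the three phrases 0, 1, 10.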
 
4.  Paprocki B.^{♦}, Szczepański J., How do the amplitude fluctuations affect the neuronal transmission efficiency, NEUROCOMPUTING, ISSN: 0925-2312, DOI: 10.1016/j.neucom.2012.11.001, Vol.104, pp.50-56, 2013 Abstract: Tremendous effort has been devoted to understanding the nature of neuronal coding, its high efficiency and the mechanisms governing it. Paprocki and Szczepanski [13] explored the model of neuron proposed by Levy and Baxter [12] and analyzed the efficiency with respect to synaptic failure, activation threshold, firing rate and type of the input source. In this paper we study the influence of amplitude fluctuations (damping, uniform and amplifying), another important component in neuronal computations. Efficiency is understood in the sense of mutual information (between input and output signals), which constitutes the fundamental concept of Shannon communication theory. Using high-quality entropy estimators we determined maximal values of the mutual information between input and output neuronal signals for a simple neuronal architecture. We observed that this maximal efficiency remains nearly constant, almost regardless of the fluctuation type. We have also found that for a wide range of thresholds, both for damping and amplifying fluctuations, the mutual information behaves in an opposite way to the corresponding correlations between input and output signals. These calculations confirm that neuronal coding is much more subtle than the straightforward intuitive optimization of input–output correlations. Keywords: Neural computation, Mutual information, Amplitude fluctuation, Activation threshold, Synaptic failure, Entropy estimation
 
5.  Paprocki B.^{♦}, Szczepański J., Transmission efficiency in ring, brain inspired neuronal networks. Information and energetic aspects, BRAIN RESEARCH, ISSN: 0006-8993, DOI: 10.1016/j.brainres.2013.07.024, Vol.1536, pp.135-143, 2013 Abstract: Organisms often evolve as compromises, and many of these compromises can be expressed in terms of energy efficiency. Thus, many authors analyze the energetic costs of information transmission processes in the brain. In this paper we study the information transmission rate per energy used in a class of ring, brain inspired neural networks, which we assume to involve components like excitatory and inhibitory neurons or long-range connections. In choosing the model of neuron we followed the probabilistic approach proposed by Levy and Baxter (2002), which contains all essential qualitative mechanisms participating in the transmission process and provides results consistent with physiologically observed values. Keywords: Information transmission efficiency, Mutual information, Brain inspired network, Inhibitory neuron, Long-range connection, Neuronal computation
 
6.  Paprocki B.^{♦}, Szczepański J., Kołbuk D., Information transmission efficiency in neuronal communication systems, BMC NEUROSCIENCE, ISSN: 1471-2202, DOI: 10.1186/1471-2202-14-S1-P217, Vol.14 (Suppl 1), No.P217, pp.1-2, 2013 Abstract: The nature of brain transmission processes, their high reliability and efficiency, is one of the most elusive areas of contemporary science [1]. We study information transmission efficiency by considering neuronal communication as a Shannon-type channel. Thus, using high-quality entropy estimators, we evaluate the mutual information between input and output signals. We assume the model of neuron proposed by Levy and Baxter [2], which incorporates all essential qualitative mechanisms participating in the neural transmission process. Keywords: transmission efficiency, neuronal communication, Shannon-type channel
 
7.  Szczepański J., Arnold M.^{♦}, Wajnryb E., Amigó J.M.^{♦}, Sanchez-Vives M.V.^{♦}, Mutual information and redundancy in spontaneous communication between cortical neurons, BIOLOGICAL CYBERNETICS, ISSN: 0340-1200, DOI: 10.1007/s00422-011-0425-y, Vol.104, pp.161-174, 2011 Abstract: An important question in neural information processing is how neurons cooperate to transmit information. To study this question, we resort to the concept of redundancy in the information transmitted by a group of neurons and, at the same time, we introduce a novel concept for measuring cooperation between pairs of neurons called relative mutual information (RMI). Specifically, we studied these two parameters for spike trains generated by neighboring neurons from the primary visual cortex in the awake, freely moving rat. The spike trains studied here were spontaneously generated in the cortical network, in the absence of visual stimulation. Under these conditions, our analysis revealed that while the value of RMI oscillated slightly around an average value, the redundancy exhibited a behavior characterized by a higher variability. We conjecture that this combination of approximately constant RMI and more variable redundancy makes information transmission more resistant to noise disturbances. Furthermore, the redundancy values suggest that neurons can cooperate in a flexible way during information transmission. This mostly occurs via a leading neuron with a higher transmission rate or, less frequently, through the information rate of the whole group being higher than the sum of the individual information rates, in other words in a synergetic manner. The proposed method applies not only to stationary, but also to locally stationary neural signals. Keywords: Neurons, Shannon information, Entropy, Mutual information, Redundancy, Visual cortex, Spike trains, Spontaneous activity
 
8.  Paprocki B.^{♦}, Szczepański J., Efficiency of neural transmission as a function of synaptic noise, threshold, and source characteristics, BIOSYSTEMS, ISSN: 0303-2647, DOI: 10.1016/j.biosystems.2011.03.005, Vol.105, pp.62-72, 2011 Abstract: There has been a growing interest in the estimation of information carried by a single neuron and by multiple single units or populations of neurons in response to specific stimuli. In this paper we analyze, inspired by the article of Levy and Baxter (2002), the efficiency of neuronal communication by considering dendrosomatic summation as a Shannon-type channel (1948) and by considering such uncertain synaptic transmission as part of the dendrosomatic computation. Specifically, we study Mutual Information between input and output signals for different types of neuronal network architectures by applying efficient entropy estimators. We analyze the influence of the following quantities affecting the transmission abilities of neurons: synaptic failure, activation threshold, firing rate and type of the input source. We observed a number of surprising non-intuitive effects. It turns out that, especially for lower activation thresholds, significant synaptic noise can lead even to a twofold increase of the transmission efficiency. Moreover, the efficiency turns out to be a non-monotonic function of the activation threshold. We find a universal value of threshold for which a local maximum of Mutual Information is achieved for most of the neuronal architectures, regardless of the type of the source (correlated and non-correlated). Additionally, to reach the global maximum the optimal firing rates must increase with the threshold. This effect is particularly visible for lower firing rates. For higher firing rates the influence of synaptic noise on the transmission efficiency is more advantageous. Noise is an inherent component of communication in biological systems; hence, based on our analysis, we conjecture that the neuronal architecture was adjusted to make more effective use of this attribute. Keywords: Neuronal computation, Entropy, Mutual Information, Estimators, Neuron, Quantal failure, Activation threshold, Neural network
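The quantity studied in the entries above, the mutual information between input and output signals, can be sketched with a naive plug-in estimator (the papers use more refined entropy estimators; this version only illustrates what is being computed):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired samples."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    mi = 0.0
    for (x, y), c in joint.items():
        # p(x,y) * log2( p(x,y) / (p(x) * p(y)) ), with p(x,y) = c/n
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi
```

For perfectly coupled binary signals the estimate is 1 bit; for independent ones it is 0. Plug-in estimates are biased upward for small samples, which is why dedicated estimators matter in this line of work.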
 
9.  Szczepański J., On the distribution function of the complexity of finite sequences, INFORMATION SCIENCES, ISSN: 0020-0255, DOI: 10.1016/j.ins.2008.12.019, Vol.179, pp.1217-1220, 2009 Abstract: Investigations of the complexity of sequences lead to important applications such as effective data compression, testing of randomness, discriminating between information sources and many others. In this paper we establish formulae describing the distribution functions of random variables representing the complexity of finite sequences introduced by Lempel and Ziv in 1976. It is known that this quantity can be used as an estimator of entropy. We show that the distribution functions depend affinely on the probabilities of the so-called "exact" sequences. Keywords: Lempel–Ziv complexity, Distribution function, Randomness
 
10.  Nagarajan R.^{♦}, Szczepański J., Wajnryb E., Interpreting non-random signatures in biomedical signals with Lempel-Ziv complexity, PHYSICA D-NONLINEAR PHENOMENA, ISSN: 0167-2789, DOI: 10.1016/j.physd.2007.09.007, Vol.237, pp.359-364, 2008 Abstract: Lempel–Ziv complexity (LZ) [J. Ziv, A. Lempel, On the complexity of finite sequences, IEEE Trans. Inform. Theory 22 (1976) 75–81] and its variants have been used widely to identify non-random patterns in biomedical signals obtained across distinct physiological states. Non-random signatures of the complexity measure can occur under nonlinear deterministic as well as non-deterministic settings. Surrogate data testing has also been encouraged in the past in conjunction with complexity estimates to make a finer distinction between various classes of processes. In this brief letter, we make two important observations: (1) Non-Gaussian noise at the dynamical level can elude existing surrogate algorithms, namely phase-randomized surrogates (FT), amplitude-adjusted Fourier transform (AAFT) and iterated amplitude-adjusted Fourier transform (IAAFT). Thus any inference of nonlinear determinism as an explanation for the non-randomness is incomplete. (2) A decrease in complexity can be observed even across two linear processes with identical autocorrelation functions. The results are illustrated with a second-order autoregressive process with Gaussian and non-Gaussian innovations. AR(2) processes have been used widely to model several physiological phenomena, hence their choice. The results presented encourage cautious interpretation of non-random signatures in experimental signals using complexity measures. Keywords: Lempel–Ziv complexity, Surrogate testing, Autoregressive process
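The phase-randomized (FT) surrogate mentioned in the abstract keeps the amplitude spectrum of the signal, hence all linear correlations, while randomizing the Fourier phases. A minimal sketch, with function name and defaults of my own choosing:

```python
import numpy as np

def ft_surrogate(x, seed=None):
    """Phase-randomized (FT) surrogate of a 1-D signal: preserves the
    amplitude spectrum (hence the autocorrelation) and randomizes the
    Fourier phases, destroying any nonlinear structure."""
    rng = np.random.default_rng(seed)
    n = len(x)
    spectrum = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=spectrum.shape)
    phases[0] = 0.0                      # keep the mean (DC bin stays real)
    if n % 2 == 0:
        phases[-1] = 0.0                 # Nyquist bin must stay real
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n)
```

Comparing a complexity estimate on the original signal against its surrogates is the testing scheme the abstract cautions about: a non-Gaussian linear process can already produce a significant difference.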
 
11.  Amigó J.M.^{♦}, Kocarev L.^{♦}, Szczepański J., On some properties of the discrete Lyapunov exponent, PHYSICS LETTERS A, ISSN: 0375-9601, DOI: 10.1016/j.physleta.2008.07.076, Vol.372, pp.6265-6268, 2008 Abstract: One of the possible byproducts of discrete chaos is the application of its tools, in particular of the discrete Lyapunov exponent, to cryptography. In this Letter we explore this question in a very general setting.
 
12.  Amigó J.M.^{♦}, Kocarev L.^{♦}, Szczepański J., Theory and practice of chaotic cryptography, PHYSICS LETTERS A, ISSN: 0375-9601, DOI: 10.1016/j.physleta.2007.02.021, Vol.366, pp.211-216, 2007 Abstract: In this Letter we address some basic questions about chaotic cryptography, not least the very definition of chaos in discrete systems. We propose a conceptual framework and illustrate it with different examples from private and public key cryptography. We elaborate also on possible limits of chaotic cryptography.
 
13.  Amigó J.M.^{♦}, Kocarev L.^{♦}, Szczepański J., Discrete Lyapunov exponent and resistance to differential cryptanalysis, IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, ISSN: 1549-7747, DOI: 10.1109/TCSII.2007.901576, Vol.54, No.10, pp.882-886, 2007 Abstract: In a recent paper, Jakimoski and Subbalakshmi provided a nice connection between the so-called discrete Lyapunov exponent of a permutation F defined on a finite lattice and its maximal differential probability, a parameter that measures the complexity of a differential cryptanalysis attack on the substitution defined by F. In this brief, we take a second look at their result to find some practical shortcomings. We also discuss more general aspects. Keywords: Differential cryptanalysis, discrete Lyapunov exponent (DLE), maximum differential probability (DP)
 
14.  Kocarev L.^{♦}, Szczepański J., Amigó J.M.^{♦}, Tomovski I.^{♦}, Discrete chaos - I: theory, IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, ISSN: 1549-8328, DOI: 10.1109/TCSI.2006.874181, Vol.53, No.6, pp.1300-1309, 2006 Abstract: We propose a definition of the discrete Lyapunov exponent for an arbitrary permutation of a finite lattice. For discrete-time dynamical systems, it measures the local (between neighboring points) average spreading of the system. We justify our definition by proving that, for large classes of chaotic maps, the corresponding discrete Lyapunov exponent approaches the largest Lyapunov exponent of a chaotic map when M → ∞, where M is the cardinality of the discrete phase space. In analogy with continuous systems, we say the system has discrete chaos if its discrete Lyapunov exponent tends to a positive number when M → ∞. We present several examples to illustrate the concepts being introduced. Keywords: Chaos, discrete chaos, Lyapunov exponents
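The "local average spreading" idea behind the discrete Lyapunov exponent can be sketched in a few lines. Conventions for the lattice distance and for the normalization differ between papers, so treat this as an illustrative form rather than the definitive definition:

```python
import math

def discrete_lyapunov(perm):
    """One common form of the discrete Lyapunov exponent of a permutation
    of {0, ..., M-1}: the average over x of the log of the (circular)
    distance between the images of the neighboring points x and x+1."""
    M = len(perm)
    total = 0.0
    for x in range(M):
        d = abs(perm[(x + 1) % M] - perm[x])
        d = min(d, M - d)            # distance on the circular lattice
        total += math.log(d)
    return total / M
```

The identity permutation maps neighbors to neighbors, so its exponent is zero (no spreading), while a permutation that scatters neighboring points far apart yields a positive value.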
 
15.  Amigó J.M.^{♦}, Kocarev L.^{♦}, Szczepański J., Order patterns and chaos, PHYSICS LETTERS A, ISSN: 0375-9601, DOI: 10.1016/j.physleta.2006.01.093, Vol.355, pp.27-31, 2006 Abstract: Chaotic maps can mimic random behavior in a quite impressive way. In particular, those possessing a generating partition can produce any symbolic sequence by properly choosing the initial state. We study in this Letter the ability of chaotic maps to generate order patterns and come to the conclusion that their performance in this respect falls short of expectations. This result reveals some basic limitation of deterministic dynamics as compared to random ones. This being the case, we propose a non-statistical test based on 'forbidden' order patterns to discriminate chaotic from truly random time series with, in principle, arbitrarily high probability. Some relations with discrete chaos and chaotic cryptography are also discussed. Keywords: Chaotic maps, Order patterns, Permutation entropy, Discrete Lyapunov exponent, Chaotic cryptography
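The "forbidden" order-pattern test can be illustrated on the fully chaotic logistic map, where the strictly decreasing pattern of length 3 never occurs, while a truly random sequence eventually realizes every pattern. The map, seed and orbit length here are my choices for illustration:

```python
from itertools import permutations

def order_pattern(window):
    """Indices of the window sorted by value: its order pattern."""
    return tuple(sorted(range(len(window)), key=lambda i: window[i]))

def visible_patterns(series, length):
    """Set of order patterns of the given length occurring in the series."""
    return {order_pattern(series[i:i + length])
            for i in range(len(series) - length + 1)}

# Orbit of the fully chaotic logistic map x -> 4x(1-x).
x, orbit = 0.3, []
for _ in range(10000):
    orbit.append(x)
    x = 4.0 * x * (1.0 - x)

missing = set(permutations(range(3))) - visible_patterns(orbit, 3)
# The strictly decreasing pattern (2, 1, 0) is forbidden: two consecutive
# decreases would need x1 > 3/4 with x0 > x1, but 4*x0*(1-x0) > 3/4
# forces x0 < 3/4, a contradiction.
```

Observing a pattern that the candidate map forbids rules the map out deterministically, which is what makes the test non-statistical.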
 
16.  Szczepański J., Amigó J.M.^{♦}, Michałek T., Kocarev L.^{♦}, Cryptographically secure substitutions based on the approximation of mixing maps, IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, ISSN: 1549-8328, DOI: 10.1109/TCSI.2004.841602, Vol.52, No.2, pp.443-453, 2005 Abstract: In this paper, we explore, following Shannon's suggestion that diffusion should be one of the ingredients of resistant block ciphers, the feasibility of designing cryptographically secure substitutions (think of S-boxes, say) via approximation of mixing maps by periodic transformations. The expectation behind this approach is, of course, that the nice diffusion properties of such maps will be inherited by their approximations, at least if the convergence rate is appropriate and the associated partitions are sufficiently fine. Our results show that this is indeed the case and that, in principle, block ciphers with close-to-optimal immunity to linear and differential cryptanalysis (as measured by the linear and differential approximation probabilities) can be designed along these guidelines. We provide also practical examples and numerical evidence for this approximation philosophy. Keywords: Block cipher, differential cryptanalysis, linear cryptanalysis, mixing dynamical system, periodic approximation, S-box
 
17.  Amigó J.M.^{♦}, Szczepański J., Kocarev L.^{♦}, A chaos-based approach to the design of cryptographically secure substitutions, PHYSICS LETTERS A, ISSN: 0375-9601, DOI: 10.1016/j.physleta.2005.05.057, Vol.343, pp.55-60, 2005 Abstract: We show that chaotic maps may be used for designing so-called substitution boxes for ciphers resistant to linear and differential cryptanalysis, providing an alternative to the algebraic methods. Our approach is based on the approximation of mixing maps by periodic transformations. The expectation behind this is, of course, that the nice chaotic properties of such maps will be inherited by their approximations, at least if the convergence rate is appropriate and the associated partitions are sufficiently fine. We show that this is indeed the case and that, in principle, substitutions with close-to-optimal immunity to linear and differential cryptanalysis can be designed along these guidelines. We provide also practical examples and numerical evidence for this approximation philosophy. Keywords: Chaotic maps, Periodic approximations, Bit permutations, Cryptanalysis
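The differential approximation probability used in the two entries above to measure an S-box's immunity to differential cryptanalysis can be computed exhaustively for small tables. A sketch, with variable names of my own:

```python
def max_differential_probability(sbox):
    """Maximum over nonzero input XOR-differences a and all output
    differences b of P_x[ S(x ^ a) ^ S(x) == b ].  Smaller values mean
    better resistance to differential cryptanalysis."""
    n = len(sbox)                    # table size, 2**bits
    worst = 0
    for a in range(1, n):            # every nonzero input difference
        counts = [0] * n
        for x in range(n):
            counts[sbox[x ^ a] ^ sbox[x]] += 1
        worst = max(worst, max(counts))
    return worst / n
```

A linear map like the identity has the worst possible value 1.0 (every input difference produces a single output difference), which is exactly the structure differential cryptanalysis exploits; for a bijective table the solutions come in pairs (x, x ^ a), so the value can never drop below 2/n.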
 
18.  Amigó J.M.^{♦}, Szczepański J., A Conceptual Guide to Chaos Theory, Prace IPPT - IFTR Reports, ISSN: 2299-3657, No.9, pp.143, 1999
Conference abstracts
1.  Paprocki B.^{♦}, Pręgowska A., Szczepański J., Information Processing in Brain-Inspired Networks: Size and Density Effects, SolMech 2016, 40th Solid Mechanics Conference, 2016-08-29/09-02, Warszawa (PL), No.P192, pp.1-2, 2016
2.  Szczepański J., Sanchez-Vives M.V.^{♦}, Arnold M.M.^{♦}, Montejo N.^{♦}, Paprocki B.^{♦}, Pręgowska A., Amigó J.M.^{♦}, Wajnryb E., Analyzing Neuroscience Signals using Information Theory and Complexity: Shannon Communication Approach, 12th INCF Workshop on Node Communication and Collaborative Neuroinformatics, 2015-04-16/17, Warszawa (PL), pp.132, 2015
3.  Szczepański J., Paprocki B.^{♦}, Transmission efficiency in the brain-like neuronal networks. Information and energetic aspects, 10th International Neural Coding Workshop, 2012-09-02/08, Prague (CZ), pp.127-128, 2012 Keywords: Neuronal Communication, Brain-like Network, Shannon Theory
 
4.  Paprocki B.^{♦}, Szczepański J., Effectiveness of information transmission in the brain-like communication models, 10th International Neural Coding Workshop, 2012-09-02/08, Prague (CZ), pp.93-94, 2012 Keywords: Brain-like network, Information transmission, Neuronal computation
