Article title

Optimizing information processing in brain-inspired neural networks

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
The way brain networks maintain high transmission efficiency is believed to be fundamental to understanding brain activity. Brains consisting of more cells render information transmission more reliable and robust to noise. On the other hand, processing information in larger networks requires additional energy. Recent studies suggest that it is complexity, connectivity, and functional diversity, rather than sheer size and number of neurons, that could favour the evolution of memory, learning, and higher cognition. In this paper, we use Shannon information theory to address transmission efficiency quantitatively. We describe neural networks as communication channels and measure information as the mutual information between stimuli and network responses. We employ a probabilistic neuron model based on the approach proposed by Levy and Baxter, which comprises the essential qualitative mechanisms of information transfer. We review and discuss our previous quantitative results on brain-inspired networks, addressing their qualitative consequences in the context of the broader literature. It is shown that mutual information is often maximized in a very noisy environment, e.g. one in which only about one-third of all input spikes are allowed to pass through noisy synapses and further into the network. Moreover, we show that inhibitory connections, as well as properly placed long-range connections, often significantly improve transmission efficiency. A deep understanding of brain processes in terms of advanced mathematical science plays an important role in explaining the nature of brain efficiency. Our results confirm that basic brain components that appeared during evolution arose to optimise transmission performance.
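As a toy illustration of the channel view sketched in the abstract (not the authors' full model), the sketch below passes a binary stimulus train through a single synapse with quantal failures, in the spirit of Levy and Baxter's probabilistic neuron, and estimates the mutual information between stimulus and response from the empirical joint distribution. All parameter values are illustrative assumptions; a lone synapse without thresholds or amplitude fluctuations cannot reproduce the noise-benefit effects the paper reports, it only shows how the quantity itself is estimated.

```python
import math
import random
from collections import Counter

def mutual_information(pairs):
    """Empirical mutual information (in bits) between paired samples."""
    n = len(pairs)
    joint = Counter(pairs)                      # counts of (x, y) pairs
    px = Counter(x for x, _ in pairs)           # marginal counts of x
    py = Counter(y for _, y in pairs)           # marginal counts of y
    # I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )
    return sum(
        (c / n) * math.log2(c * n / (px[x] * py[y]))
        for (x, y), c in joint.items()
    )

def noisy_synapse(spike, s, rng):
    """A spike survives the synapse with success probability s
    (quantal synaptic failure, as in the Levy-Baxter picture)."""
    return 1 if spike == 1 and rng.random() < s else 0

rng = random.Random(42)
stimuli = [rng.randint(0, 1) for _ in range(100_000)]  # Bernoulli(0.5) source

mi_by_s = {}
for s in (1.0, 1 / 3):  # noiseless synapse vs. one-third success probability
    responses = [noisy_synapse(x, s, rng) for x in stimuli]
    mi_by_s[s] = mutual_information(list(zip(stimuli, responses)))
    print(f"success probability s = {s:.2f}: I(X;Y) = {mi_by_s[s]:.3f} bits")
```

With s = 1 the response copies the stimulus and the estimate approaches the source entropy of 1 bit; with s = 1/3 it drops to roughly 0.2 bits, since here noise only destroys information. The maxima at high noise reported in the paper emerge only once thresholds, amplitude fluctuations, and network structure are included.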
Year
Pages
225–233
Physical description
Bibliography: 76 items, figures
Authors
author
  • Institute of Mechanics and Applied Computer Science, Kazimierz Wielki University, Kopernika 1, 85-074 Bydgoszcz, Poland
author
  • Institute of Fundamental Technological Research, Polish Academy of Sciences, Pawinskiego 5B, 02-106 Warsaw, Poland
Bibliography
  • [1] F. Walter, F. Rohrbein, and A. Knoll, “Computation by time”, Neural Process. Lett. 44, 103–124 (2016).
  • [2] C. Shannon, “A mathematical theory of communication”, Bell Syst. Tech. J. 27, 379–423 (1948).
  • [3] M.D. McDonnell, N.G. Stocks, C.E.M. Pearce, and D. Abbott, “Quantization in the presence of large amplitude threshold noise”, Fluct. Noise Lett. 5(3), L457–L468 (2005).
  • [4] S. Ghosh-Dastidar and H. Adeli, “Spiking Neural Network”, Int. J. Neural Syst. 19, 295–308 (2009).
  • [5] M. Grochowski, A. Kwasigroch, and A. Mikołajczyk, “Selected technical issues of deep neural networks for image classification purposes”, Bull. Pol. Ac.: Tech. 67(9), 363–376 (2019).
  • [6] F. Horn and K.R. Muller, “Predicting pairwise relations with neural similarity encoders”, Bull. Pol. Ac.: Tech. 66(6), 821–830 (2018).
  • [7] W. Gerstner and R. Naud, “How Good Are Neuron Models?”, Science 326(5951), 379–380 (2009).
  • [8] A. Hodgkin and A. Huxley, “A quantitative description of membrane current and its application to conduction and excitation in nerve”, J. Physiol. 117, 500–544 (1952).
  • [9] B. Paprocki and J. Szczepanski, “Efficiency of neural transmission as a function of synaptic noise, threshold, and source characteristics”, BioSystems 105(1), 62–72 (2011).
  • [10] B. Paprocki and J. Szczepanski, “How do the amplitude fluctuations affect the neuronal transmission efficiency”, Neurocomputing 104, 50–56 (2013).
  • [11] B. Paprocki and J. Szczepanski, “Transmission efficiency in ring, brain inspired neuronal networks. Information and energetic aspects”, Brain Res. 1536, 135–143 (2013).
  • [12] E. Izhikevich, Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting, Computational Neuroscience, MIT Press, Cambridge, MA (2007).
  • [13] P. Dayan and L. Abbott, Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems, Computational Neuroscience, MIT Press, Cambridge, MA (2005).
  • [14] G. Corrado and K. Doya, “Understanding neural coding through the model-based analysis of decision making”, J. Neurosci. 27(31), 8178–8180 (2007).
  • [15] D. Chicharro and S. Panzeri, “Algorithms of causal inference for the analysis of effective connectivity among brain regions”, Front. Neuroinform. 8, 64(1–17) (2014).
  • [16] V. Lebedev and V. Lempitsky, “Speeding-up convolutional neural networks: A survey”, Bull. Pol. Ac.: Tech. 66(6), 803–810 (2018).
  • [17] B. Stasiak, P. Tarasiuk, I. Michalska, and A. Tomczyk, “Application of convolutional neural networks with anatomical knowledge for brain MRI analysis in MS patients”, Bull. Pol. Ac.: Tech. 66(6), 857–868 (2018).
  • [18] C.G. Antonopoulos, E. Bianco-Martinez, and M.S. Baptista, “Evaluating performance of neural codes in model neural communication networks”, Neural Netw. 109, 90–102 (2019).
  • [19] A. Pregowska, A. Casti, E. Kaplan, E. Wajnryb, and J. Szczepanski, “Information processing in the LGN: a comparison of neural codes and cell types”, Biol. Cybern. 113, 453–464 (2019).
  • [20] J. Dyhrfjeld-Johnsen, V. Santhakumar, R. Morgan, R. Huerta, L. Tsimring, and I. Soltesz, “Topological determinants of epileptogenesis in large-scale structural and functional models of the dentate gyrus derived from experimental data”, J. Neurophysiol. 97(2), 1566–1587 (2007).
  • [21] M.S.A. Ferraz, H.L.C. Melo-Silva, and A.H. Kihara, “Optimizing information processing in neuronal networks beyond critical states”, PLoS One 12(9), e0184367 (2017).
  • [22] B. Sengupta, M. Stemmler, and K. Friston, “Information and efficiency in the nervous system – a synthesis”, PLoS Comput. Biol. 9(7), e1003157 (2013).
  • [23] D. Poli, V.P. Pastore, S. Martinoia, and P. Massobrio, “From functional to structural connectivity using partial correlation in neuronal assemblies”, J. Neural Eng. 13, 026023 (2016).
  • [24] H.J. Chen, J.A. Wolf, and D.H. Smith, “Multichannel activity propagation across an engineered axon network”, J. Neural Eng. 14(2), 026016 (2017).
  • [25] P.E. Crago and N.S. Makowski, “Alteration of neural action potential patterns by axonal stimulation: the importance of stimulus location”, J. Neural Eng. 11(5), 056016 (2014).
  • [26] A. Moujahid, A. D’Anjou, and F. Torrealdea, “Energy and information in Hodgkin-Huxley neurons”, Phys. Rev. E 83(3), 031912 (2011).
  • [27] Y. Yu, J. Karbowski, R. Sachdev, and J. Feng, “Effect of temperature and glia in brain size enlargement and origin of allometric body-brain size scaling in vertebrates”, BMC Evol. Biol. 14(178), 1–14 (2014).
  • [28] K. Kurek, B. Swiderski, S. Osowski, M. Kruk, and W. Barhoumi, “Deep learning versus classical neural approach to mammogram recognition”, Bull. Pol. Ac.: Tech. 66(6), 833–840 (2018).
  • [29] T. Poggio and Q. Liao, “Theory I: Deep networks and the curse of dimensionality”, Bull. Pol. Ac.: Tech. 66(6), 761–773 (2018).
  • [30] T. Poggio and Q. Liao, “Theory II: Deep learning and optimization”, Bull. Pol. Ac.: Tech. 66(6), 775–787 (2018).
  • [31] W. Levy and R. Baxter, “Energy-efficient neuronal computation via quantal synaptic failures”, J. Neurosci. 22(11), 4746–4755 (2002).
  • [32] F. Rieke, D. Warland, R. de Ruyter van Steveninck, and W. Bialek, Spikes: Exploring the Neural Code, MIT Press, Cambridge, MA (1999).
  • [33] C. Jankowski, D. Reda, M. Mankowski, and G. Borowik, “Discretization of data using Boolean transformations and information theory based evaluation criteria”, Bull. Pol. Ac.: Tech. 63(4), 923–932 (2015).
  • [34] M. Kaiser, “Brain architecture: a design for natural computation”, Philos. T. R. Soc. Series A 365(1861), 3033–3045 (2007).
  • [35] S. Panzeri, S.B. Schultz, A. Treves, and E.T. Rolls, “Correlation and the encoding of information in the nervous system”, Proc. R. Soc. B 266(1423), 1001–1012 (1999).
  • [36] R. Haslinger, K. Klinkner, and C. Shalizi, “The Computational Structure of Spike Trains”, Neural Comput. 22(1), 121–157 (2010).
  • [37] N. Gordon, T. Shackleton, A. Palmer, and I. Nelken, “Responses of neurons in the inferior colliculus to binaural disparities: insights from the use of Fisher information and mutual information”, J. Neurosci. Methods 169(2), 391–404 (2008).
  • [38] S. Strong, R. Koberle, R. de Ruyter van Steveninck, and W. Bialek, “Entropy and Information in Neural Spike Trains”, Phys. Rev. Lett. 80(1), 197–200 (1998).
  • [39] J. Amigo, J. Szczepanski, E. Wajnryb, and M.V. Sanchez-Vives, “Estimating the entropy rate of spike trains via Lempel-Ziv complexity”, Neural Comput. 16(4), 717–736 (2004).
  • [40] L. Paninski, J.W. Pillow, and E.P. Simoncelli, “Maximum likelihood estimation of a stochastic integrate-and-fire neural encoding model”, Neural Comput. 16(12), 2533–2561 (2004).
  • [41] S. Panzeri, R. Senatore, M.A. Montemurro, and R.S. Petersen, “Correcting for the sampling bias problem in spike train information measures”, J. Neurophysiol. 98(3), 1064–1072 (2007).
  • [42] S. Blonski, A. Pregowska, T. Michalek, and J. Szczepanski, “The use of Lempel-Ziv complexity to analyze turbulence and flow randomness based on velocity fluctuations”, Bull. Pol. Ac.: Tech. 67(5), 957–962 (2019).
  • [43] A. Hasenstaub, S. Otte, E. Callaway, and T. Sejnowski, “Metabolic cost as a unifying principle governing neuronal biophysics”, Proc. Natl. Acad. Sci. U.S.A. 107(27), 12329–12334 (2010).
  • [44] L. Kostal, P. Lansky, and M.D. McDonnell, “Metabolic cost of neuronal information in an empirical stimulus-response model”, Biol. Cybern. 107(3), 355–365 (2013).
  • [45] A. Pregowska, E. Kaplan, and J. Szczepanski, “How Far can Neural Correlations Reduce Uncertainty? Comparison of Information Transmission Rates for Markov and Bernoulli Processes”, Int. J. Neural Syst. 29, 1950003 (2019).
  • [46] A. Pregowska, J. Szczepanski, and E. Wajnryb, “Mutual information against correlations in binary communication channels”, BMC Neurosci. 16(32), 1–7 (2015).
  • [47] A. Pregowska, J. Szczepanski, and E. Wajnryb, “Temporal code versus rate code for binary Information Sources”, Neurocomputing 216, 756–762 (2016).
  • [48] S. Laughlin, R. de Ruyter van Steveninck, and J. Anderson, “The metabolic cost of neural information”, Nat. Neurosci. 1(1), 36–41 (1998).
  • [49] J.E. Niven, J.C. Anderson, and S.B. Laughlin, “Fly photoreceptors demonstrate energy-information trade-offs in neural coding”, PLoS Biol. 5(4), 828–840 (2007).
  • [50] E. Bullmore and O. Sporns, “The economy of brain network organization”, Nat. Rev. Neurosci. 13(5), 336–349 (2012).
  • [51] F. Torrealdea, C. Sarasola, A. D’Anjou, A. Moujahid, and N. de Mendizabal, “Energy efficiency of information transmission by electrically coupled neurons”, Biosystems 97(1), 60–71 (2009).
  • [52] M.H.I. Shovon, N. Nandagopal, R. Vijayalakshmi, J.T. Du, and B. Cocks, “Directed connectivity analysis of functional brain networks during cognitive activity using transfer entropy”, Neural Process. Lett. 45, 807–824 (2017).
  • [53] B. Doiron, A. Litwin-Kumar, R. Rosenbaum, G.K. Ocker, and K. Josic, “The mechanics of state-dependent neural correlations”, Nat. Neurosci. 19(3), 383–393 (2016).
  • [54] J. Zhang, D. Zhou, D. Cai, and A. Rangan, “A coarse-grained framework for spiking neuronal networks: between homogeneity and synchrony”, J. Comput. Neurosci. 37(1), 81–104 (2014).
  • [55] H. Toutounji and G. Pipa, “Spatiotemporal computations of an excitable and plastic brain: neuronal plasticity leads to noise-robust and noise-constructive computations”, PLoS Comput. Biol. 10(3), e1003512 (2014).
  • [56] S.A. Oprisan and C.V. Buhusi, “Why noise is useful in functional and neural mechanisms of interval timing?”, BMC Neurosci. 14(84), 1–12 (2013).
  • [57] I. Kanitscheider, R. Coen-Cagli, and A. Pouget, “Origin of information-limiting noise correlations”, Proc. Natl. Acad. Sci. U.S.A. 112(50), e6973–e6982 (2015).
  • [58] A. Neishabouri and A.A. Faisal, “Axonal noise as a source of synaptic variability”, PLoS Comput. Biol. 10(5), e1003615 (2014).
  • [59] A. Malyshev, T. Tchumatchenko, S. Volgushev, and M. Volgushev, “Energy efficient encoding by shifting spikes in neocortical neurons”, Eur. J. Neurosci. 38(8), 3181–3188 (2013).
  • [60] C. Gentile and D.P. Helmbold, “Improved lower bounds for learning from noisy examples: An information-theoretic approach”, Inform. Comput. 166(2), 133–155 (2001).
  • [61] S. Folias and G. Ermentrout, “New Patterns of Activity in a Pair of Interacting Excitatory-Inhibitory Neural Fields”, Phys. Rev. Lett. 107(22), 228103 (2011).
  • [62] Q. Wang, H. Zhang, M. Perc, and G. Chen, “Multiple ring coherence resonances in excitatory and inhibitory coupled neurons”, Commun. Nonlinear Sci. 17(10), 3979–3988 (2012).
  • [63] A. Moujahid, A. D’Anjou, and M. Grana, “Energy demands of diverse spiking cells from the neocortex, hippocampus, and thalamus”, Front. in Comput. Neurosci. 8(41), 1–12 (2014).
  • [64] A. Dorrn, K. Yuan, A. Barker, C. Schreiner, and R. Froemke, “Developmental sensory experience balances cortical excitation and inhibition”, Nature 465(7300), 932–936 (2010).
  • [65] A. King, “Auditory Neuroscience: Balancing Excitation and Inhibition during Development”, Curr. Biol. 20(18), R808–R810 (2010).
  • [66] D. Battaglia, N. Brunel, and D. Hansel, “Temporal Decorrelation of Collective Oscillations in Neural Networks with Local Inhibition and Long-Range Excitation”, Phys. Rev. Lett. 99(23), 238106 (2007).
  • [67] N. Dehghani, A. Peyrache, B. Telenczuk, M. Le Van Quyen, E. Halgren, S. S. Cash, N. G. Hatsopoulos, and A. Destexhe, “Dynamic balance of excitation and inhibition in human and monkey neocortex”, Sci. Rep. 6, 23176 (2016).
  • [68] S. Deneve and C.K. Machens, “Efficient codes and balanced networks”, Nat. Neurosci. 19(3), 375–382 (2016).
  • [69] F. Lombardi, H.J. Herrmann, and L. de Arcangelis, “Balance of excitation and inhibition determines 1/f power spectrum in neuronal networks”, Chaos 27(4), 047402 (2017).
  • [70] J.L. Chen, F.F. Voigt, M. Javadzadeh, R. Krueppel, and F. Helmchen, “Long-range population dynamics of anatomically defined neocortical networks”, eLife 5, e14679 (2016).
  • [71] J. Steinbeck, P. Koch, A. Derouiche, and O. Brustle, “Human embryonic stem cell-derived neurons establish region-specific, long-range projections in the adult brain”, Cell. Mol. Life Sci. 69(3), 461–470 (2012).
  • [72] P. Stratton and J. Wiles, “Self-sustained non-periodic activity in networks of spiking neurons: The contribution of local and long-range connections and dynamic synapses”, Neuroimage 52(3), 1070–1079 (2010).
  • [73] A.S. Etemea, C.B. Tabi, and A. Mohamadou, “Long-range patterns in Hindmarsh-Rose networks”, Commun. Nonlinear Sci. 43, 211–219 (2017).
  • [74] V. Senn, S.B. Wolff, C. Herry, F. Grenier, I. Ehrlich, J. Grundemann, P.J. Fadok, C. Muller, J.J. Letzkus, and A. Luthi, “Long-range connectivity defines behavioral specificity of amygdala neurons”, Neuron 81(2), 428–437 (2014).
  • [75] S. Zhang, M. Xu, W.C. Chang, C. Ma, J.P. Hoang Do, D. Jeong, T. Lei, J.L. Fan, and Y. Dan, “Organization of long-range inputs and outputs of frontal cortex for top-down control”, Nat. Neurosci. 19(12), 1733–1742 (2016).
  • [76] J. Niven and S. Laughlin, “Energy limitation as a selective pressure on the evolution of sensory systems”, J. Exp. Biol. 211, 1792–1804 (2008).
Notes
Record created with funding from the Polish Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the "Społeczna odpowiedzialność nauki" (Social Responsibility of Science) programme, module: popularisation of science and promotion of sport (2020).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-a8450bfa-5ac6-4b33-9b1c-d066f4b446e6