The way brain networks maintain high transmission efficiency is believed to be fundamental to understanding brain activity. Brains consisting of more cells make information transmission more reliable and robust to noise. On the other hand, processing information in larger networks requires additional energy. Recent studies suggest that it is complexity, connectivity, and functional diversity, rather than mere size and number of neurons, that could favour the evolution of memory, learning, and higher cognition. In this paper, we use Shannon information theory to address transmission efficiency quantitatively. We describe neural networks as communication channels, and then we measure information as the mutual information between stimuli and network responses. We employ a probabilistic neuron model based on the approach proposed by Levy and Baxter, which comprises the essential qualitative mechanisms of information transfer. In this paper, we review and discuss our previous quantitative results regarding brain-inspired networks, addressing their qualitative consequences in the context of the broader literature. It is shown that mutual information is often maximised in a very noisy environment, e.g. one in which only one-third of all input spikes are allowed to pass through noisy synapses and further into the network. Moreover, we show that inhibitory connections, as well as suitably positioned long-range connections, often significantly improve transmission efficiency. A deep understanding of brain processes in terms of advanced mathematics plays an important role in explaining the nature of brain efficiency. Our results confirm that the basic brain components that appear during evolution arise to optimise transmission performance.
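To make the abstract's measurement concrete, the sketch below shows how mutual information between a binary stimulus and a neuron's spiking response can be estimated from simulated trials. It is a minimal illustration only: the function names, the uniform quantal-amplitude distribution, the threshold value, and the input firing rates are illustrative assumptions, not the parameters used in the paper; only the overall scheme (Bernoulli synaptic success with probability s, as in the Levy–Baxter model, and plug-in estimation of I(X;Y)) follows the text.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        px = np.mean(x == xv)
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            py = np.mean(y == yv)
            if pxy > 0:
                mi += pxy * np.log2(pxy / (px * py))
    return mi

def noisy_neuron(inputs, s=1/3, threshold=1.0, rng=None):
    """Levy-Baxter-style response (simplified, assumed parameters):
    each input spike survives its synapse with probability s, is scaled by a
    random amplitude in [0, 1], and the neuron fires if the summed synaptic
    input reaches the threshold."""
    rng = np.random.default_rng() if rng is None else rng
    passed = inputs * (rng.random(inputs.shape) < s)       # synaptic failures
    amplitudes = rng.random(inputs.shape)                   # quantal amplitude Q
    potential = (passed * amplitudes).sum(axis=-1)
    return (potential >= threshold).astype(int)

# Toy experiment: binary stimulus X drives the firing rate of 5 input fibres,
# and we estimate how much information the output spike train retains.
rng = np.random.default_rng(0)
n_trials, n_inputs = 20000, 5
stimulus = rng.integers(0, 2, n_trials)                     # stimulus X
rates = np.where(stimulus == 1, 0.8, 0.2)                   # input spike prob.
spikes = (rng.random((n_trials, n_inputs)) < rates[:, None]).astype(float)
response = noisy_neuron(spikes, s=1/3, rng=rng)             # response Y
print(f"I(X;Y) ~ {mutual_information(stimulus, response):.3f} bits")
```

Sweeping the synaptic success probability s in such a simulation is one way to see the effect summarised above, namely that the estimated mutual information can peak at surprisingly low values of s rather than at s = 1.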