


2024 | Vol. 34, no. 3 | 467-483
Article title

A spiking neural network based on thalamo-cortical neurons for self-learning agent applications

Content
Title variants
Publication languages
EN
Abstracts
EN
The paper proposes a non-iterative training algorithm for a power-efficient SNN classifier for applications in self-learning systems. The approach uses preprocessing mechanisms for signals from sensory neurons typical of the thalamus in the diencephalon. The algorithm concept is based on a cusp catastrophe model and on training by routing. The algorithm guarantees zero dispersion of connection weight values across the entire network, which is particularly important in the case of hardware implementation based on programmable logic devices. Thanks to non-iterative mechanisms inspired by training methods for associative memories, the approach makes it possible to estimate the capacity of the network and the required hardware resources. The trained network shows resistance to the phenomenon of catastrophic forgetting. The low complexity of the algorithm makes in-situ hardware training possible without using power-hungry accelerators. The paper compares the complexity of a hardware implementation of the algorithm with that of the classic STDP and conversion methods. The basic application of the algorithm is an autonomous agent equipped with a vision system and based on a classic FPGA device.
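For background on the abstract's key ingredient, the cusp catastrophe has a standard normal form in catastrophe theory; the sketch below uses the conventional symbols x (state variable) and a, b (control parameters), which are not necessarily the notation used in the paper itself:

\[
V(x; a, b) = \tfrac{1}{4}x^{4} + \tfrac{1}{2}a x^{2} + b x,
\qquad
\frac{\partial V}{\partial x} = x^{3} + a x + b = 0 .
\]

The equilibrium equation has three real solutions when 4a^3 + 27b^2 < 0 (a bistable regime with two stable states) and a single solution otherwise; crossing the cusp curve 4a^3 + 27b^2 = 0 therefore produces an abrupt jump of the state, which is the kind of switching behaviour a cusp-based spiking neuron can exploit (cf. reference [25] below).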
Publisher

Year
Pages
467-483
Physical description
Bibliography: 65 items, figures, tables, charts.
Authors
  • Faculty of Computing and Telecommunications, Poznań University of Technology, Piotrowo 3A, 61-138 Poznań, Poland
  • Faculty of Computing and Telecommunications, Poznań University of Technology, Piotrowo 3A, 61-138 Poznań, Poland, szymon.szczesny@put.poznan.pl
  • Faculty of Computing and Telecommunications, Poznań University of Technology, Piotrowo 3A, 61-138 Poznań, Poland
  • Department of Electrical and Computer Engineering, NOVA University of Lisbon, Quinta da Torre, 2829-516 Caparica, Portugal
  • Faculty of Computing and Telecommunications, Poznań University of Technology, Piotrowo 3A, 61-138 Poznań, Poland
Bibliography
  • [1] Abusnaina, A. and Abdullah, R. (2014). Spiking neuron models: A review, International Journal of Digital Content Technology and its Applications 8(3): 14-21.
  • [2] Shalumov, A., Halaly, R. and Ezra Tsur, E. (2021). LiDAR-driven spiking neural network for collision avoidance in autonomous driving, Bioinspiration & Biomimetics 16(6): 066016.
  • [3] Allred, J. and Roy, K. (2020). Controlled forgetting: Targeted stimulation and dopaminergic plasticity modulation for unsupervised lifelong learning in spiking neural networks, Frontiers in Neuroscience 14: 1-16, Article no. 7.
  • [4] Bartłomiejczyk, P., Trujillo, F.L. and Signerska-Rynkowska, J. (2023). Spike patterns and chaos in a map-based neuron model, International Journal of Applied Mathematics and Computer Science 33(3): 395-408, DOI: 10.34768/amcs-2023-0028.
  • [5] Cech, J., Hanis, T., Kononisky, A., Rurtle, T., Svancar, J. and Twardzik, T. (2021). Self-supervised learning of camera-based drivable surface roughness, 2021 IEEE Intelligent Vehicles Symposium (IV), Nagoya, Japan, pp. 1319-1325.
  • [6] Chen, D.-G., Chen, X. and Zhang, K. (2016). An exploratory statistical cusp catastrophe model, 2016 IEEE International Conference on Data Science and Advanced Analytics (DSAA), Montreal, Canada, pp. 100-109.
  • [7] Chen, D.-G., Lin, F., Chen, X., Tang, W. and Kitzman, H. (2014). Cusp catastrophe model: A nonlinear model for health outcomes in nursing research, Nursing Research 63(3): 211-220.
  • [8] Chen, X., Stanton, B., Chen, D.-G. and Li, X. (2013). Intention to use condom, cusp modeling, and evaluation of an HIV prevention intervention trial, Nonlinear Dynamics, Psychology, and Life Sciences 17(3): 385-403.
  • [9] Cheng, H.-P., Wen, W., Wu, C., Li, S., Li, H.H. and Chen, Y. (2017). Understanding the design of IBM neurosynaptic system and its tradeoffs: A user perspective, Design, Automation & Test in Europe Conference & Exhibition (DATE 2017), Lausanne, Switzerland, pp. 139-144.
  • [10] Chu, L., Raghavendra, R., Srivatsa, M., Preece, A. and Harborne, D. (2019). Feature importance identification through bottleneck reconstruction, 2019 IEEE International Conference on Cognitive Computing (ICCC), Milan, Italy, pp. 64-66.
  • [11] Davies, M., Srinivasa, N., Lin, T.-H., Chinya, G., Cao, Y., Choday, S.H., Dimou, G., Joshi, P., Imam, N., Jain, S., Liao, Y., Lin, C.-K., Lines, A., Liu, R., Mathaikutty, D., McCoy, S., Paul, A., Tse, J., Venkataramanan, G., Weng, Y.-H., Wild, A., Yang, Y. and Wang, H. (2018). Loihi: A neuromorphic manycore processor with on-chip learning, IEEE Micro 38(1): 82-99.
  • [12] Daw, R. and He, Z. (2020). Deep neural network in cusp catastrophe model, arXiv: 2004.02359, DOI: 10.48550/arXiv.2004.02359.
  • [13] de Beurs, D., Bockting, C., Kerkhof, A., Scheepers, F., O’Connor, R. and van de Leemput, I. (2020). A network perspective on suicidal behavior: Understanding suicidality as a complex system, Suicide and Life-Threatening Behavior 51(1): 115-126.
  • [14] Deng, J., Dong, W., Socher, R., Li, L.-J., Li, K. and Fei-Fei, L. (2009). ImageNet: A large-scale hierarchical image database, 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, USA, pp. 248-255.
  • [15] Diehl, P. and Cook, M. (2014). Efficient implementation of STDP rules on SpiNNaker neuromorphic hardware, Proceedings of the International Joint Conference on Neural Networks, Beijing, China, pp. 4288-4295.
  • [16] Encke, J. and Hemmert, W. (2018). Extraction of inter-aural time differences using a spiking neuron network model of the medial superior olive, Frontiers in Neuroscience 12: 1-12, Article no. 140.
  • [17] Fang, W., Yu, Z., Chen, Y., Masquelier, T., Huang, T. and Tian, Y. (2021). Incorporating learnable membrane time constant to enhance learning of spiking neural networks, 2021 IEEE/CVF International Conference on Computer Vision (ICCV), Montreal, Canada, pp. 2641-2651.
  • [18] Gallego, G., Delbrück, T., Orchard, G., Bartolozzi, C., Taba, B., Censi, A., Leutenegger, S., Davison, A.J., Conradt, J., Daniilidis, K. and Scaramuzza, D. (2022). Event-based vision: A survey, IEEE Transactions on Pattern Analysis and Machine Intelligence 44(1): 154-180.
  • [19] Griffin, G., Holub, A. and Perona, P. (2007). Caltech-256 object category dataset, CalTech Report, California Institute of Technology, Pasadena, DOI: 10.22002/D1.20087.
  • [20] Guastello, S., Aruka, Y., Doyle, M. and Smerz, K. (2008). Cross-cultural generalizability of a cusp catastrophe model for binge drinking among college students, Nonlinear Dynamics, Psychology, and Life Sciences 12(4): 397-407.
  • [21] Halassa, M.M. and Acsády, L. (2016). Thalamic inhibition: Diverse sources, diverse scales, Trends in Neurosciences 39(10): 680-693.
  • [22] Hazan, H., Saunders, D., Sanghavi, D. T., Siegelmann, H. and Kozma, R. (2018). Unsupervised learning with self-organizing spiking neural networks, 2018 International Joint Conference on Neural Networks (IJCNN), Rio de Janeiro, Brazil, pp. 1-6.
  • [23] He, K., Zhang, X., Ren, S. and Sun, J. (2015). Deep residual learning for image recognition, CoRR: abs/1512.03385.
  • [24] Hua, Y., Loomba, S., Pawlak, V., Voit, K.-M., Laserstein, P., Boergens, K.M., Wallace, D.J., Kerr, J.N. and Helmstaedter, M. (2022). Connectomic analysis of thalamus-driven disinhibition in cortical layer 4, Cell Reports 41(2): 111476.
  • [25] Huderek, D., Szczęsny, S. and Rato, R. (2019). Spiking neural network based on cusp catastrophe theory, Foundations of Computing and Decision Sciences 44(3): 273-284.
  • [26] Izhikevich, E. (2004). Which model to use for cortical spiking neurons?, IEEE Transactions on Neural Networks 15(5): 1063-1070.
  • [27] Kozdon, K. and Bentley, P. (2017). Wide learning: Using an ensemble of biologically-plausible spiking neural networks for unsupervised parallel classification of spatio-temporal patterns, 2017 IEEE Symposium Series on Computational Intelligence (SSCI), Honolulu, USA, pp. 1-8.
  • [28] Leong, M., Prasad, D., Lee, Y. and Lin, F. (2020). Semi-CNN architecture for effective spatio-temporal learning in action recognition, Applied Sciences 10(2): 557.
  • [29] Li, G., Deng, L., Chua, Y., Li, P., Neftci, E.O. and Li, H. (2020). Editorial: Spiking neural network learning, benchmarking, programming and executing, Frontiers in Neuroscience 14: 1-4, Article no. 276.
  • [30] Li, H., Liu, H., Ji, X., Li, G. and Shi, L. (2017). CIFAR10-DVS: An event-stream dataset for object classification, Frontiers in Neuroscience 11: 1-10, Article no. 309.
  • [31] Liang, Q., Shenoy, P. and Irwin, D. (2020). AI on the edge: Characterizing AI-based IoT applications using specialized edge architectures, 2020 IEEE International Symposium on Workload Characterization (IISWC), Beijing, China, pp. 145-156.
  • [32] Lim, Y. and Golden, J.A. (2007). Patterning the developing diencephalon, Brain Research Reviews 53(1): 17-26.
  • [33] Markram, H., Gerstner, W. and Sjöström, P.J. (2012). Spike-timing-dependent plasticity: A comprehensive overview, Frontiers in Synaptic Neuroscience 4: 1-3, Article no. 2.
  • [34] Mayr, C., Hoeppner, S. and Furber, S. (2019). SpiNNaker 2: A 10 million core processor system for brain simulation and machine learning, arXiv: 1911.02385, DOI: 10.48550/arXiv.1911.02385.
  • [35] Meftah, B., Lezoray, O., Lecluse, M. and Benyettou, A. (2010). Cell microscopic segmentation with spiking neuron networks, in K. Diamantaras et al. (Eds), Artificial Neural Networks-ICANN 2010, Springer, Berlin/Heidelberg, pp. 117-126.
  • [36] Na, B., Mok, J., Park, S., Lee, D., Choe, H. and Yoon, S. (2022). AutoSNN: Towards energy-efficient spiking neural networks, 39th International Conference on Machine Learning, Baltimore, USA.
  • [37] Nazari, S., Amiri, M., Faez, K. and Van Hulle, M.M. (2020). Information transmitted from bioinspired neuron-astrocyte network improves cortical spiking network’s pattern recognition performance, IEEE Transactions on Neural Networks and Learning Systems 31(2): 464-474.
  • [38] Neculae, G. (2020). Ensemble Learning for Spiking Neural Networks, PhD thesis, University of Manchester, Manchester.
  • [39] Neftci, E.O., Mostafa, H. and Zenke, F. (2019). Surrogate gradient learning in spiking neural networks: Bringing the power of gradient-based optimization to spiking neural networks, IEEE Signal Processing Magazine 36(6): 51-63.
  • [40] Oster, S., Deiner, M., Birgbauer, E. and Sretavan, D. (2004). Ganglion cell axon pathfinding in the retina and optic nerve, Seminars in Cell & Developmental Biology 15(1): 125-136.
  • [41] Pereira-Pires, J.E., Ferreira, J. and Rato, R. (2019). Spike based computing: A novel hardware to compute and control with spikes in space, ESA contract nº 4000117067/16/NL/MH/gm, Final Report, European Space Agency, Paris, DOI: 10.13140/RG.2.2.22848.97286.
  • [42] Pietrzak, P., Szczęsny, S., Huderek, D. and Przyborowski, L. (2023). Overview of spiking neural network learning approaches and their computational complexities, Sensors 23(6): 3037.
  • [43] Pisarev, A., Busygin, A., Udovichenko, S.Y. and Maevsky, O. (2020). A biomorphic neuroprocessor based on a composite memristor-diode crossbar, Microelectronics Journal 102: 104827, DOI: 10.1016/j.mejo.2020.104827.
  • [44] Ponghiran, W., Srinivasan, G. and Roy, K. (2019). Reinforcement learning with low-complexity liquid state machines, Frontiers in Neuroscience 13: 1-14, Article no. 883.
  • [45] Ponulak, F. (2008). Analysis of the ReSuMe learning process for spiking neural networks, International Journal of Applied Mathematics and Computer Science 18(2): 117-127, DOI: 10.2478/v10006-008-0011-1.
  • [46] Quevedo, A., Mørch, C., Andersen, O. and Coghill, R. (2017). Lateral inhibition during nociceptive processing, PAIN 158(6): 1046-1052.
  • [47] Raha, A., Kim, S.K., Mathaikutty, D.A., Venkataramanan, G., Mohapatra, D., Sung, R., Brick, C. and Chinya, G.N. (2021). Design considerations for edge neural network accelerators: An industry perspective, 2021 34th International Conference on VLSI Design/2021 20th International Conference on Embedded Systems (VLSID), Guwahati, India, pp. 328-333.
  • [48] Rajbahadur, G.K., Wang, S., Oliva, G.A., Kamei, Y. and Hassan, A.E. (2022). The impact of feature importance methods on the interpretation of defect classifiers, IEEE Transactions on Software Engineering 48(7): 2245-2261.
  • [49] Halaly, R. and Ezra Tsur, E. (2023). Autonomous driving controllers with neuromorphic spiking neural networks, Frontiers in Neurorobotics 17: 1234962, DOI: 10.3389/fnbot.2023.1234962.
  • [50] Rowcliffe, P., Feng, J. and Buxton, H. (2006). Spiking perceptrons, IEEE Transactions on Neural Networks 17(3): 803-807.
  • [51] Salt, L., Howard, D., Indiveri, G. and Sandamirskaya, Y. (2020). Parameter optimization and learning in a spiking neural network for UAV obstacle avoidance targeting neuromorphic processors, IEEE Transactions on Neural Networks and Learning Systems 31(9): 3305-3318.
  • [52] Shrestha, S.B. and Orchard, G. (2018). SLAYER: Spike layer error reassignment in time, 32nd Conference on Neural Information Processing Systems (NeurIPS 2018), Montreal, Canada.
  • [53] Stagsted, R., Vitale, A., Binz, J., Renner, A., Larsen, L.B. and Sandamirskaya, Y. (2020). Towards neuromorphic control: A spiking neural network based PID controller for UAV, Robotics: Science and Systems 2020, Corvalis, USA, pp. 1-8, DOI: 10.5167/uzh-200415.
  • [54] Szczęsny, S. (2017). 0.3 V 2.5 nW per channel current-mode CMOS perceptron for biomedical signal processing in amperometry, IEEE Sensors Journal 17(17): 5399-5409.
  • [55] Szczęsny, S., Huderek, D. and Przyborowski, L. (2021). Spiking neural network with linear computational complexity for waveform analysis in amperometry, Sensors 21(9): 3276.
  • [56] Tazerart, S., Mitchell, D.E., Miranda-Rottmann, S. and Araya, R. (2019). A spike-timing-dependent plasticity rule for single, clustered and distributed dendritic spines, bioRxiv: 397323, https://www.biorxiv.org/content/early/2019/01/27/397323.
  • [57] Torrico, T. and Munakomi, S. (2020). Neuroanatomy, Thalamus, National Library of Medicine, Bethesda, https://www.ncbi.nlm.nih.gov/books/NBK542184/.
  • [58] Viale, A., Marchisio, A., Martina, M., Masera, G. and Shafique, M. (2021). CARSNN: An efficient spiking neural network for event-based autonomous cars on the Loihi neuromorphic research processor, 2021 International Joint Conference on Neural Networks (IJCNN), Shenzhen, China, pp. 1-10.
  • [59] Vosahlik, D., Cech, J., Hanis, T., Konopisky, A., Rurtle, T., Svancar, J. and Twardzik, T. (2021). Self-supervised learning of camera-based drivable surface friction, 2021 IEEE International Intelligent Transportation Systems Conference (ITSC), Indianapolis, USA, pp. 2773-2780.
  • [60] Wu, D., Yi, X. and Huang, X. (2022). A little energy goes a long way: Build an energy-efficient, accurate spiking neural network from convolutional neural network, Frontiers in Neuroscience 16: 1-11, Article no. 759900.
  • [61] Wu, Y., Deng, L., Li, G., Zhu, J. and Shi, L. (2018a). Direct training for spiking neural networks: Faster, larger, better, 33rd AAAI Conference on Artificial Intelligence (AAAI-19), Honolulu, USA, pp. 1311-1318.
  • [62] Wu, Y., Deng, L., Li, G., Zhu, J. and Shi, L. (2018b). Spatio-temporal backpropagation for training high-performance spiking neural networks, Frontiers in Neuroscience 12: 1-12, Article no. 331.
  • [63] Xuelei, C. (2023). Autonomous driving using spiking neural networks on dynamic vision sensor data: A case study of traffic light change detection, arXiv: 2311.09225, DOI: 10.48550/arXiv.2311.09225.
  • [64] Zhao, J., Fang, J., Ye, Z. and Zhang, L. (2021). Large scale autonomous driving scenarios clustering with self-supervised feature extraction, 2021 IEEE Intelligent Vehicles Symposium (IV), Nagoya, Japan, pp. 473-480.
  • [65] Zhou, J., Dai, J. and Weng, S. (2022). Effect of adjacent lateral inhibition on light and electric-stimulated synaptic transistors, IEEE Electron Device Letters 43(4): 573-575.
Document type
Bibliography
Identifiers
YADDA identifier
bwmeta1.element.baztech-9d85a964-2aed-4ae6-8044-0291adeb6dc4