Article title

Distributed and Adaptive Edge-based AI Models for Sensor Networks (DAISeN)

Conference
17th Conference on Computer Science and Intelligence Systems
Languages of publication
EN
Abstracts
EN
This position paper describes the aims and preliminary results of the Distributed and Adaptive Edge-based AI Models for Sensor Networks (DAISeN) project. The project's ambition is to address today's edge AI challenges by developing advanced AI techniques that model knowledge from the sensor network and its environment, supporting the deployment of sustainable AI applications. We present one of the use cases considered in DAISeN and review the state of the art in three research domains that relate to this use case and fall directly within the project's scope. We additionally outline the main challenges identified in each domain. The Global Navigation Satellite Systems (GNSS) activation model developed to address the use-case challenges is also briefly introduced. Finally, we outline the research studies planned for the remaining period of the project.
Pages
71-78
Physical description
Bibliography: 85 items, tables.
Authors
  • Computer Science Department, Blekinge Institute of Technology SE-371 79 Karlskrona, Sweden
  • Computer Science Department, Blekinge Institute of Technology SE-371 79 Karlskrona, Sweden
  • Computer Science Department, Blekinge Institute of Technology SE-371 79 Karlskrona, Sweden
  • Computer Science Department, Blekinge Institute of Technology SE-371 79 Karlskrona, Sweden
  • Computer Science Department, Blekinge Institute of Technology SE-371 79 Karlskrona, Sweden
author
  • Sony Europe BV R&D Center Europe Lund, Sweden
author
  • Sony Europe BV R&D Center Europe Lund, Sweden
  • Sony Europe BV R&D Center Europe Lund, Sweden
  • Sony Europe BV R&D Center Europe Lund, Sweden
Bibliography
  • 1. J. A. Manrique et al., “Contrasting internet of things and wireless sensor network from a conceptual overview,” in 2016 IEEE Int. Conference on Internet of Things and IEEE Green Computing and Communications and IEEE Cyber, Physical and Social Computing and IEEE Smart Data, pp. 252-257. [Online]. Available: https://doi.org/10.1109/iThings-GreenCom-CPSCom-SmartData.2016.66
  • 2. R. Arshad et al., “Green iot: An investigation on energy saving practices for 2020 and beyond,” IEEE Access, vol. 5, pp. 15667-15681, 2017. [Online]. Available: https://doi.org/10.1109/ACCESS.2017.2686092
  • 3. S. Abghari, V. Boeva, E. Casalicchio, and P. Exner, “An inductive system health monitoring approach for gnss activation,” in Artificial Intelligence Applications and Innovations. Springer Nature Switzerland, 2022. [Online]. Available: https://doi.org/10.1007/978-3-031-08337-2_36
  • 4. V. M. Devagiri, V. Boeva, and E. Tsiporkova, “Split-merge evolutionary clustering for multi-view streaming data,” Procedia Computer Science, vol. 176, pp. 460-469, 2020. [Online]. Available: https://doi.org/10.1016/j.procs.2020.08.048
  • 5. C. Åleskog, V. M. Devagiri, and V. Boeva, A Graph-Based Multi-view Clustering Approach for Continuous Pattern Mining. Cham: Springer International Publishing, 2022, pp. 201-237. [Online]. Available: https://doi.org/10.1007/978-3-030-95239-6_8
  • 6. V. Boeva et al., “Bipartite split-merge evolutionary clustering,” in Int. conference on agents and AI. Springer, 2019, pp. 204-223. [Online]. Available: https://doi.org/10.1007/978-3-030-37494-5_11
  • 7. E. Lughofer, “A dynamic split-and-merge approach for evolving cluster models,” Evolving systems, vol. 3, no. 3, pp. 135-151, 2012. [Online]. Available: https://doi.org/10.1007/s12530-012-9046-5
  • 8. C. Nordahl, V. Boeva, H. Grahn, and M. P. Netz, “Evolvecluster: an evolutionary clustering algorithm for streaming data,” Evolving Systems. [Online]. Available: https://doi.org/10.1007/s12530-021-09408-y
  • 9. V. M. Devagiri, V. Boeva, and S. Abghari, “A multi-view clustering approach for analysis of streaming data,” in AI Applications and Innovations, I. Maglogiannis, J. Macintyre, and L. Iliadis, Eds. Springer International Publishing, 2021, pp. 169-183. [Online]. Available: https://doi.org/10.1007/978-3-030-79150-6_14
  • 10. T. E. Bogale et al., “Machine intelligence techniques for next-generation context-aware wireless networks,” Int. Telecommunication Union Journal, 2018.
  • 11. Y. Zhu et al., “A fast indoor/outdoor transition detection algorithm based on machine learning,” Sensors, vol. 19, no. 4, p. 786, 2019. [Online]. Available: https://doi.org/10.3390/s19040786
  • 12. P. Bhargava et al., “Senseme: a system for continuous, on-device, and multi-dimensional context and activity recognition,” in Proceedings of the 11th Int. Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services, 2014, pp. 40-49. [Online]. Available: http://dx.doi.org/10.4108/icst.mobiquitous.2014.257654
  • 13. R. Sung et al., “Sound based indoor and outdoor environment detection for seamless positioning handover,” ICT Express, vol. 1, no. 3, pp. 106-109, 2015. [Online]. Available: https://doi.org/10.1016/j.icte.2016.02.001
  • 14. O. Canovas et al., “Wifiboost: A terminal-based method for detection of indoor/outdoor places,” in Proceedings of the 11th Int. Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services, 2014, pp. 352-353. [Online]. Available: https://doi.org/10.4108/icst.mobiquitous.2014.258063
  • 15. I. Ashraf et al., “Magio: Magnetic field strength based indoor- outdoor detection with a commercial smartphone,” Micromachines, vol. 9, no. 10, 2018. [Online]. Available: https://doi.org/10.3390/mi9100534
  • 16. W. Wang et al., “Indoor-outdoor detection using a smart phone sensor,” Sensors, vol. 16, no. 10, p. 1563, 2016. [Online]. Available: https://doi.org/10.3390/s16101563
  • 17. V. Radu et al., “A semi-supervised learning approach for robust indoor-outdoor detection with smartphones,” in Proceedings of the 12th ACM Conf. on Embedded Network Sensor Systems, 2014, p. 280-294. [Online]. Available: https://doi.org/10.1145/2668332.2668347
  • 18. T. Anagnostopoulos et al., “Environmental exposure assessment using indoor/outdoor detection on smartphones,” Personal and Ubiquitous Computing, vol. 21, no. 4, pp. 761-773, 2017. [Online]. Available: https://doi.org/10.1007/s00779-017-1028-y
  • 19. R. P. Souza et al., “A big data-driven hybrid solution to the indoor-outdoor detection problem,” Big Data Research, vol. 24, p. 100194, 2021. [Online]. Available: https://doi.org/10.1016/j.bdr.2021.100194
  • 20. M. D. Lange et al., “A continual learning survey: Defying forgetting in classification tasks,” IEEE transactions on pattern analysis and machine intelligence, vol. PP, 2021. [Online]. Available: https://doi.org/10.1109/TPAMI.2021.3057446
  • 21. R. M. French, “Catastrophic forgetting in connectionist networks,” Trends in cognitive sciences, vol. 3, no. 4, pp. 128-135, 1999. [Online]. Available: https://doi.org/10.1016/S1364-6613(99)01294-2
  • 22. A. Gepperth and B. Hammer, “Incremental learning algorithms and applications,” in European symposium on artificial neural networks (ESANN), 2016. [Online]. Available: https://hal.archives-ouvertes.fr/hal-01418129
  • 23. P. Sprechmann et al., “Memory-based parameter adaptation,” in Int. Conference on Learning Representations, 2018. [Online]. Available: https://openreview.net/forum?id=rkfOvGbCW
  • 24. V. Moens and A. Zénon, “Learning and forgetting using reinforced bayesian change detection,” PLoS computational biology, vol. 15, no. 4, p. e1006713, 2019. [Online]. Available: https://doi.org/10.1371/journal.pcbi.1006713
  • 25. Y. Sun et al., “Planning to be surprised: Optimal bayesian exploration in dynamic environments,” in Int. conf. on AGI. Springer, 2011, pp. 41-51. [Online]. Available: https://doi.org/10.1007/978-3-642-22887-2_5
  • 26. M. De Lange and T. Tuytelaars, “Continual prototype evolution: Learning online from non-stationary data streams,” in Proc. of the IEEE/CVF Int. Conf. on Comp. Vision, 2021, pp. 8250-8259. [Online]. Available: https://doi.org/10.1109/iccv48922.2021.00814
  • 27. S.-A. Rebuffi et al., “icarl: Incremental classifier and representation learning,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2017, pp. 2001-2010. [Online]. Available: https://doi.org/10.1109/CVPR.2017.587
  • 28. D. Isele and A. Cosgun, “Selective experience replay for lifelong learning,” in Proc. of the AAAI Conference on AI, vol. 32, no. 1, 2018.
  • 29. D. Rolnick et al., “Experience replay for continual learning,” Advances in Neural Information Processing Systems, vol. 32, 2019.
  • 30. H. Shin et al., “Continual learning with deep generative replay,” Advances in neural information processing systems, vol. 30, 2017. [Online]. Available: https://dl.acm.org/doi/10.5555/3294996.3295059
  • 31. F. Lavda et al., “Continual classification learning using generative models,” Proceedings of the 32nd Conference on Neural Information Processing Systems (NeurIPS) 2018, 2018.
  • 32. J. Ramapuram et al., “Lifelong generative modeling,” Neurocomputing, vol. 404, pp. 381-400, 2020. [Online]. Available: https://doi.org/10.1016/j.neucom.2020.02.115
  • 33. C. Atkinson et al., “Pseudo-rehearsal: Achieving deep reinforcement learning without catastrophic forgetting,” Neurocomputing, vol. 428, pp. 291-307, 2021. [Online]. Available: https://doi.org/10.1016/j.neucom.2020.11.050
  • 34. D. Lopez-Paz and M. Ranzato, “Gradient episodic memory for continual learning,” Advances in neural information processing systems, vol. 30, 2017. [Online]. Available: https://dl.acm.org/doi/10.5555/3295222.3295393
  • 35. A. Chaudhry et al., “Riemannian walk for incremental learning: Understanding forgetting and intransigence,” in Proceedings of the European Conference on Computer Vision (ECCV), 2018, pp. 532-547. [Online]. Available: https://doi.org/10.1007/978-3-030-01252-6_33
  • 36. Z. Li and D. Hoiem, “Learning without forgetting,” IEEE transactions on pattern analysis and machine intelligence, vol. 40, no. 12, pp. 2935-2947, 2017. [Online]. Available: https://doi.org/10.1109/TPAMI.2017.2773081
  • 37. H. Jung et al., “Less-forgetting learning in deep neural networks,” Proceedings of the AAAI Conference on Artificial Intelligence, no. 1, 2018. [Online]. Available: https://doi.org/10.1609/aaai.v32i1.11769
  • 38. A. Rannen et al., “Encoder based lifelong learning,” in Proceedings of the IEEE International Conference on Computer Vision, 2017, pp. 1320-1328. [Online]. Available: https://doi.org/10.1109/ICCV.2017.148
  • 39. J. Zhang et al., “Class-incremental learning via deep model consolidation,” in Proc. of the IEEE/CVF WACV, 2020, pp. 1131-1140. [Online]. Available: https://doi.org/10.1109/WACV45572.2020.9093365
  • 40. J. Kirkpatrick et al., “Overcoming catastrophic forgetting in neural networks,” Proceedings of the national academy of sciences, vol. 114, no. 13, pp. 3521-3526, 2017. [Online]. Available: https://doi.org/10.1073/pnas.1611835114
  • 41. S.-W. Lee et al., “Overcoming catastrophic forgetting by incremental moment matching,” Advances in neural information processing systems, vol. 30, 2017. [Online]. Available: https://dl.acm.org/doi/10.5555/3294996.3295218
  • 42. F. Zenke, B. Poole, and S. Ganguli, “Continual learning through synaptic intelligence,” in International Conference on Machine Learning. PMLR, 2017, pp. 3987-3995.
  • 43. X. Liu et al., “Rotate your networks: Better weight consolidation and less catastrophic forgetting,” in 2018 24th ICPR. IEEE, 2018, pp. 2262-2268. [Online]. Available: https://doi.org/10.1109/ICPR.2018.8545895
  • 44. R. Aljundi et al., “Memory aware synapses: Learning what (not) to forget,” in Proceedings of the European Conference on Computer Vision (ECCV), 2018, pp. 139-154.
  • 45. A. Chaudhry et al., “Efficient lifelong learning with a-GEM,” in Int. Conf. on Learning Representations, 2019.
  • 46. A. Mallya and S. Lazebnik, “Packnet: Adding multiple tasks to a single network by iterative pruning,” in Proc. of IEEE CVPR, 2018, pp. 7765-7773. [Online]. Available: https://doi.org/10.1109/CVPR.2018.00810
  • 47. A. Mallya et al., “Piggyback: Adapting a single network to multiple tasks by learning to mask weights,” in Proceedings of the European Conference on Computer Vision (ECCV), 2018, pp. 67-82.
  • 48. J. Xu and Z. Zhu, “Reinforced continual learning,” Advances in Neural Information Processing Systems, vol. 31, 2018. [Online]. Available: https://dl.acm.org/doi/10.5555/3326943.3327027
  • 49. R. Aljundi et al., “Expert gate: Lifelong learning with a network of experts,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2017, pp. 3366-3375. [Online]. Available: https://doi.org/10.1109/CVPR.2017.753
  • 50. A. Rosenfeld and J. K. Tsotsos, “Incremental learning through deep adaptation,” IEEE transactions on pattern analysis and machine intelligence, vol. 42, no. 3, pp. 651-663, 2018. [Online]. Available: https://doi.org/10.1109/TPAMI.2018.2884462
  • 51. Z. Chen and B. Liu, “Lifelong machine learning,” Synthesis Lectures on AI and ML, vol. 12, no. 3, pp. 1-207, 2018.
  • 52. G. De Francisci Morales et al., “Iot big data stream mining,” in Proceedings of the 22nd ACM SIGKDD int. conference on knowledge discovery and data mining, 2016, pp. 2119-2120. [Online]. Available: https://doi.org/10.1145/2939672.2945385
  • 53. H. M. Gomes et al., “Machine learning for streaming data: state of the art, challenges, and opportunities,” ACM SIGKDD Explorations Newsletter, vol. 21, no. 2, pp. 6-22, 2019. [Online]. Available: https://doi.org/10.1145/3373464.3373470
  • 54. M. Masana et al., “Class-incremental learning: survey and performance evaluation on image classification,” arXiv preprint https://arxiv.org/abs/2010.15277, 2020.
  • 55. M. Caron, Bojanowski et al., “Deep clustering for unsupervised learning of visual features,” in Proceedings of the European conference on computer vision (ECCV), 2018, pp. 132-149. [Online]. Available: https://doi.org/10.1007/978-3-030-01264-9_9
  • 56. X. Zhan et al., “Online deep clustering for unsupervised representation learning,” in Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, 2020, pp. 6688-6697. [Online]. Available: https://doi.org/10.1109/cvpr42600.2020.00672
  • 57. J. He and F. Zhu, “Unsupervised continual learning via pseudo labels,” arXiv preprint https://arxiv.org/abs/2104.07164, 2021.
  • 58. A. Bouchachia, “Evolving clustering: An asset for evolving systems,” IEEE SMC Newsletter, vol. 36, pp. 1-6, 2011.
  • 59. M. Zopf et al., “Sequential clustering and contextual importance measures for incremental update summarization,” in Proceedings of COLING 2016, the 26th Int. Conference on Computational Linguistics: Technical Papers, 2016, pp. 1071-1082.
  • 60. M. Wang et al., “A novel split-merge-evolve k clustering algorithm,” in 2018 IEEE 4th Int. Conference on Big Data Computing Service and Applications (BigDataService). IEEE, 2018, pp. 229-236. [Online]. Available: https://doi.org/10.1109/BigDataService.2018.00041
  • 61. T. Mitchell et al., “Never-ending learning,” Communications of the ACM, vol. 61, no. 5, pp. 103-115, 2018. [Online]. Available: https://doi.org/10.1145/3191513
  • 62. I. Stoica et al., “A berkeley view of systems challenges for ai,” arXiv preprint https://arxiv.org/abs/1712.05855, 2017.
  • 63. A. Tegen et al., “Towards a taxonomy of interactive continual and multimodal learning for the internet of things,” in Adjunct Proceedings of the 2019 ACM Int. Joint Conference on Pervasive and Ubiquitous Computing and Int. Symposium on Wearable Computers, 2019, pp. 524-528. [Online]. Available: https://doi.org/10.1145/3341162.3345603
  • 64. J. Konečný et al., “Federated learning: Strategies for improving communication efficiency,” 2018.
  • 65. B. McMahan et al., “Communication-efficient learning of deep networks from decentralized data,” in AI and statistics. PMLR, 2017, pp. 1273-1282.
  • 66. T. Li et al., “Federated learning: Challenges, methods, and future directions,” IEEE Signal Processing Magazine, vol. 37, pp. 50-60, 2020. [Online]. Available: https://doi.org/10.1109/MSP.2020.2975749
  • 67. W.-T. Chang and R. Tandon, “Communication efficient federated learning over multiple access channels,” arXiv preprint https://arxiv.org/abs/2001.08737, 2020.
  • 68. E. Diao et al., “Heterofl: Computation and communication efficient federated learning for heterogeneous clients,” in Int. Conference on Learning Representations, 2021.
  • 69. A. Reisizadeh et al., “Fedpaq: A communication-efficient federated learning method with periodic averaging and quantization,” in Int. Conference on AI and Statistics. PMLR, 2020, pp. 2021-2031.
  • 70. Y. Chen et al., “Communication-efficient federated deep learning with layerwise asynchronous model update and temporally weighted aggregation,” IEEE transactions on neural networks and learning systems, vol. 31, no. 10, pp. 4229-4238, 2019. [Online]. Available: https://doi.org/10.1109/TNNLS.2019.2953131
  • 71. S. Caldas et al., “Expanding the reach of federated learning by reducing client resource requirements,” 2019. [Online]. Available: https://openreview.net/forum?id=SJlpM3RqKQ
  • 72. Y. Lin et al., “Deep gradient compression: Reducing the communication bandwidth for distributed training,” Proceedings of the 31st Conference on Neural Information Processing Systems (NIPS 2017), 2017.
  • 73. F. Sattler et al., “Robust and communication-efficient federated learning from non-iid data,” IEEE transactions on neural networks and learning systems, vol. 31, no. 9, pp. 3400-3413, 2019. [Online]. Available: https://doi.org/10.1109/TNNLS.2019.2944481
  • 74. X. Wu et al., “Fedmed: A federated learning framework for language modeling,” Sensors, vol. 20, no. 14, p. 4048, 2020. [Online]. Available: https://doi.org/10.3390/s20144048
  • 75. A. Malekijoo et al., “Fedzip: A compression framework for communication-efficient federated learning,” arXiv preprint https://arxiv.org/abs/2102.01593, 2021.
  • 76. D. Rothchild et al., “Fetchsgd: Communication-efficient federated learning with sketching,” in Int. Conference on Machine Learning. PMLR, 2020, pp. 8253-8265.
  • 77. J. Xu et al., “Ternary compression for communication-efficient federated learning,” IEEE Transactions on Neural Networks and Learning Systems, 2020. [Online]. Available: https://doi.org/10.1109/TNNLS.2020.3041185
  • 78. M. Asad et al., “Ceep-fl: A comprehensive approach for communication efficiency and enhanced privacy in federated learning,” Applied Soft Computing, vol. 104, p. 107235, 2021. [Online]. Available: https://doi.org/10.1016/j.asoc.2021.107235
  • 79. W. Luping et al., “Cmfl: Mitigating communication overhead for federated learning,” in IEEE 39th international conference on distributed computing systems (ICDCS), 2019, pp. 954-964. [Online]. Available: https://doi.org/10.1109/ICDCS.2019.00099
  • 80. T. Nishio and R. Yonetani, “Client selection for federated learning with heterogeneous resources in mobile edge,” in IEEE ICC. IEEE, 2019, pp. 1-7. [Online]. Available: https://doi.org/10.1109/ICC.2019.8761315
  • 81. S. Park et al., “Fedpso: federated learning using particle swarm optimization to reduce communication costs,” Sensors, vol. 21, no. 2, p. 600, 2021. [Online]. Available: https://doi.org/10.3390/s21020600
  • 82. Q. Xia et al., “A survey of federated learning for edge computing: Research problems and solutions,” High-Confidence Computing, vol. 1, no. 1, p. 100008, 2021. [Online]. Available: https://doi.org/10.1016/j.hcc.2021.100008
  • 83. A. A. Al-Saedi, V. Boeva, and E. Casalicchio, “Reducing communication overhead of federated learning through clustering analysis,” in 2021 IEEE Symposium on Computers and Communications (ISCC). IEEE, 2021, pp. 1-7. [Online]. Available: https://doi.org/10.1109/ISCC53001.2021.9631391
  • 84. A. A. Al-Saedi, E. Casalicchio, and V. Boeva, “An energy-aware multicriteria federated learning model for edge computing,” in 2021 8th Int. Conf. on Future IoT and Cloud (FiCloud). IEEE, 2021, pp. 134-143. [Online]. Available: https://doi.org/10.1109/FiCloud49777.2021.00027
  • 85. D. L. Iverson, “Inductive system health monitoring,” in IC-AI, 2004, pp. 605-611.
Notes
Record compiled with funds from the Ministry of Education and Science (MEiN), agreement no. SONP/SP/546092/2022, under the programme "Social Responsibility of Science" - module: Popularisation of science and promotion of sport (2022-2023).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-4183894c-2742-4ad1-9133-a290d53ab5e8