Article title

A model of continual and deep learning for aspect based sentiment analysis

Content
Identifiers
Title variants
Languages of publication
EN
Abstracts
EN
Sentiment analysis is a useful tool in several social and business contexts. Aspect sentiment classification is a subtask of sentiment analysis that gives information about features or aspects of people, entities, products, or services mentioned in reviews. Different deep learning models proposed to solve aspect sentiment classification focus on a specific domain such as restaurant, hotel, or laptop reviews. However, there are few proposals for creating a single model with high performance across multiple domains. The continual learning approach with neural networks has been used to solve aspect classification in multiple domains, but avoiding low aspect classification performance is challenging because the network weights may shift as the model is trained on different domains or datasets. In this paper, a novel aspect sentiment classification approach is proposed. Our approach combines a transformer deep learning technique with a continual learning algorithm across different domains. The input layer uses the pretrained Bidirectional Encoder Representations from Transformers (BERT) model. The experiments show the efficacy of our proposal with a 78% macro-F1 score. Our results improve on other state-of-the-art approaches.
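The following is a minimal illustrative sketch, not the authors' implementation, of the kind of pipeline the abstract describes: a pretrained BERT encoder used as the input layer for aspect sentiment classification, here formulated as sentence-pair classification over a review sentence and an aspect term, plus an EWC-style penalty as one common way to limit weight shift across domains in continual learning. It assumes the HuggingFace transformers and PyTorch libraries; the three-way label set, the sentence-pair formulation, and the specific regularizer are assumptions, since the abstract only states that BERT and a continual learning algorithm are combined.

    # Minimal sketch, not the paper's code: BERT as input layer for aspect
    # sentiment classification, with an EWC-style penalty as one possible
    # continual learning regularizer (the paper's exact algorithm may differ).
    import torch
    from transformers import BertTokenizer, BertForSequenceClassification

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=3  # assumed labels: negative/neutral/positive
    )

    def classify_aspect(sentence: str, aspect: str) -> int:
        # Encode (review sentence, aspect term) as a BERT sentence pair.
        inputs = tokenizer(sentence, aspect, return_tensors="pt", truncation=True)
        with torch.no_grad():
            logits = model(**inputs).logits
        return int(logits.argmax(dim=-1))  # predicted polarity index

    def ewc_penalty(model, old_params, fisher, lam=0.4):
        # Penalize moving parameters that were important for earlier domains,
        # one standard way to reduce weight shift between tasks.
        loss = torch.zeros(())
        for name, p in model.named_parameters():
            if name in fisher:
                loss = loss + (fisher[name] * (p - old_params[name]) ** 2).sum()
        return lam * loss

    # Untrained head: outputs are meaningful only after fine-tuning on review data.
    print(classify_aspect("The food was great but the service was slow.", "service"))

In a continual setting, such a model would be fine-tuned sequentially on each review domain, with the penalty added to the classification loss of every domain after the first.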
Authors
  • Faculty of Engineering in Telecommunications, Informatics and Biomedical, Universidad de Oriente, Ave. Patricio Lumumba s/n, Santiago de Cuba, Cuba, www: https://www.linkedin.com/in/~dionis-lopez-ramos
  • Center for Neuroscience Studies and Image and Signal Processing, Faculty of Engineering in Telecommunications, Informatics and Biomedical, Universidad de Oriente, Ave. Patricio Lumumba s/n, Santiago de Cuba, Cuba
Bibliography
  • [1] R. Aljundi, F. Babiloni, M. Elhoseiny, M. Rohrbach, and T. Tuytelaars. “Memory aware synapses: Learning what (not) to forget”, Proceedings of the European Conference on Computer Vision (ECCV), 2018, pp. 139–154.
  • [2] M. Biesialska, K. Biesialska, and M. R. Costa-jussà. “Continual lifelong learning in natural language processing: A survey”, Proceedings of the 28th International Conference on Computational Linguistics, 2020, pp. 6523–6541.
  • [3] A. Chaudhry, P. K. Dokania, T. Ajanthan, and P. H. Torr. “Riemannian walk for incremental learning: Understanding forgetting and intransigence”, Proceedings of the European Conference on Computer Vision (ECCV), 2018, pp. 532–547.
  • [4] T. Chen, S. Kornblith, M. Norouzi, and G. Hinton. “A simple framework for contrastive learning of visual representations”, International Conference on Machine Learning, 2020, pp. 1597–1607.
  • [5] Z. Chen, and B. Liu. “Lifelong machine learning”, Synthesis Lectures on Artificial Intelligence and Machine Learning, vol. 12, no. 3, 2018, pp. 1–207.
  • [6] M. Delange, R. Aljundi, M. Masana, S. Parisot, X. Jia, A. Leonardis, G. Slabaugh, and T. Tuytelaars. “A continual learning survey: Defying forgetting in classification tasks”, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021.
  • [7] J. Devlin, M.‐W. Chang, K. Lee, and K. Toutanova. “Bert: Pre‐training of Deep Bidirectional Transformers for Language Understanding”, arXiv preprint arXiv:1810.04805, 2018.
  • [8] H. H. Do, P. Prasad, A. Maag, and A. Alsadoon. “Deep learning for aspect-based sentiment analysis: a comparative review”, Expert Systems with Applications, vol. 118, 2019, pp. 272–299.
  • [9] R. M. French. “Catastrophic forgetting in connectionist networks”, Trends in Cognitive Sciences, vol. 3, no. 4, 1999, pp. 128–135.
  • [10] M. Hoang and A. Bihorac. “Aspect-based sentiment analysis using the pre-trained language model BERT”, 2019.
  • [11] M. Huang, Y. Wang, X. Zhu, and L. Zhao. “Attention-based LSTM for aspect-level sentiment classification”, Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, Austin, Texas, USA, 2016, pp. 606–615.
  • [12] Z. Ke, B. Liu, H. Wang, and L. Shu. “Continual learning with knowledge transfer for sentiment classification”, Proceedings of European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, vol. 3, 2020, pp. 683–698.
  • [13] Z. Ke, B. Liu, H. Xu, and L. Shu. “CLASSIC: Continual and contrastive learning of aspect sentiment classification tasks”, Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, 2021, pp. 6871–6883.
  • [14] S.‐W. Lee, J.‐H. Kim, J. Jun, J.‐W. Ha, and B.‐T. Zhang. “Overcoming catastrophic forgetting by incremental moment matching”, Advances in neural information processing systems, vol. 30, 2017.
  • [15] B. Liu, Sentiment analysis: Mining opinions, sentiments, and emotions, Cambridge University Press, 2020.
  • [16] V. Lomonaco. Continual learning with deep architectures. PhD thesis, University of Bologna, Italy, 2019.
  • [17] D. López and L. Arco. “Multi‐domain aspect extraction based on deep and lifelong learning”, Iberoamerican Congress on Pattern Recognition, 2019, pp. 556–565.
  • [18] D. Lopez‐Paz. “Gradient episodic memory for continual learning”, Advances in Neural Information Processing Systems, 2017, pp. 6467–6476.
  • [19] D. López Ramos and L. Arco García. “Aprendizaje profundo para la extracción de aspectos en opiniones textuales”, Revista Cubana de Ciencias Informáticas, vol. 13, no. 2, 2019, pp. 105–145.
  • [20] A. Mallya, and S. Lazebnik. “Packnet: Adding multiple tasks to a single network by iterative pruning”, Proceedings of the IEEE conference on Computer Vision and Pattern Recognition, 2018, pp. 7765–7773.
  • [21] D. Maltoni, and V. Lomonaco. “Continuous learning in single-incremental-task scenarios”, Neural Networks, vol. 116, 2019, pp. 56–73.
  • [22] M. McCloskey and N. J. Cohen. “Catastrophic interference in connectionist networks: The sequential learning problem”, Psychology of learning and motivation, vol. 24, 1989, pp. 109–165.
  • [23] A. Nazir, Y. Rao, L. Wu, and L. Sun. “Issues and challenges of aspect-based sentiment analysis: a comprehensive survey”, IEEE Transactions on Affective Computing, vol. 13, no. 2, 2020.
  • [24] G. I. Parisi, R. Kemker, J. L. Part, C. Kanan, and S. Wermter. “Continual lifelong learning with neural networks: a review”, Neural Networks, vol. 113, 2019, pp. 54–71.
  • [25] G. I. Parisi and V. Lomonaco. “Online continual learning on sequences”. Recent Trends in Learning From Data, pp. 197–221. New York: Springer, 2020.
  • [26] M. Pontiki, D. Galanis, J. Pavlopoulos, H. Papageorgiou, I. Androutsopoulos, and S. Manandhar. “SemEval-2014 Task 4: aspect based sentiment analysis”, Proceedings of the 8th International Workshop on Semantic Evaluation (SemEval 2014), 2014, pp. 27–35.
  • [27] Y. Ren, Y. Zhang, M. Zhang, and D. Ji. “Improving Twitter sentiment classification using topic-enriched multi-prototype word embeddings”, Thirtieth AAAI Conference on Artificial Intelligence, 2016.
  • [28] A. Rietzler, S. Stabinger, P. Opitz, and S. Engl. “Adapt or get left behind: domain adaptation through BERT language model finetuning for aspect-target sentiment classification”, arXiv preprint arXiv:1908.11860, 2019.
  • [29] J. Serra, D. Suris, M. Miron, and A. Karatzoglou. “Overcoming catastrophic forgetting with hard attention to the task”, International Conference on Machine Learning, 2018, pp. 4548–4557.
  • [30] R. Singh, and S. Singh. “Text similarity measures in news articles by vector space model using nlp”, The Institution of Engineers (India): Series B, vol. 102, no. 2, 2021, pp. 329–338.
  • [31] Y. Song, J. Wang, T. Jiang, Z. Liu, and Y. Rao. “Attentional encoder network for targeted sentiment classification”, arXiv preprint arXiv:1902.09314, 2019.
  • [32] F. Tang, L. Fu, B. Yao, and W. Xu. “Aspect based fine‐grained sentiment analysis for online reviews”, Information Sciences, vol. 488, 2019, pp. 190–204.
  • [33] E. Terra, A. Mohammed, and H. Hefny. “An approach for textual based clustering using word embedding”. Machine Learning and Big Data Analytics Paradigms: Analysis, Applications and Challenges, pp. 261–280. Springer, 2021.
  • [34] G. M. Van de Ven, and A. S. Tolias. “Three scenarios for continual learning”, NeurIPS Continual Learning Workshop, vol. 1, no. 9, 2018.
  • [35] S. Wang, G. Lv, S. Mazumder, G. Fei, and B. Liu. “Lifelong learning memory networks for aspect sentiment classification”, 2018 IEEE International Conference on Big Data (Big Data), 2018, pp. 861–870.
  • [36] F. Wu, X.-Y. Jing, Z. Wu, Y. Ji, X. Dong, X. Luo, Q. Huang, and R. Wang, “Modality-specific and shared generative adversarial network for cross-modal retrieval”, Pattern Recognition, vol. 104, 2020, 107335.
  • [37] B. Zeng, H. Yang, R. Xu, W. Zhou, and X. Han. “LCF: a local context focus mechanism for aspect‐based sentiment classification”, Applied Sciences, vol. 9, no. 16, 2019, 3389.
  • [38] F. Zenke, B. Poole, and S. Ganguli. “Continual learning through synaptic intelligence”, International Conference on Machine Learning, 2017, pp. 3987–3995.
  • [39] J. Zhou, J. X. Huang, Q. Chen, Q. V. Hu, T. Wang, and L. He. “Deep Learning for aspect-level sentiment classification: survey, vision and challenges”, IEEE Access, vol. 7, 2019, pp. 78454–78483.
  • [40] K. M. Zorn, D. H. Foil, T. R. Lane, D. P. Russo, W. Hillwalker, D. J. Feifarek, F. Jones, W. D. Klaren, A. M. Brinkman, and S. Ekins. “Machine learning models for estrogen receptor bioactivity and endocrine disruption prediction”, Environmental Science & Technology, vol. 54, no. 19, 2020, pp. 12202–12213.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-80fdb2da-a89b-416e-a4e5-083c2aac6453