

Article title

Towards a new deep learning algorithm based on GRU and CNN: NGRU

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
This paper describes our new deep learning system based on a comparison between GRU and CNN. We start with a first system that uses a Convolutional Neural Network (CNN), which we compare with a second system that uses a Gated Recurrent Unit (GRU). Through this comparison we propose a new system that builds on the strengths of the two previous systems; it therefore adopts the hyper-parameter choices recommended by the authors of both systems. Finally, we propose a method for applying this new system to datasets in different languages (used especially on social networks).
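The abstract names the GRU as one of the two building blocks being compared. As a point of reference for the gating mechanism involved, the following is a minimal NumPy sketch of a single GRU update step (standard formulation with update and reset gates); the toy dimensions and random parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One GRU update: gates decide how much of the old state to keep."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate state
    return (1.0 - z) * h + z * h_tilde              # interpolate old/new

# Toy dimensions: input size 4, hidden size 3 (illustrative only).
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
params = (rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h)), np.zeros(d_h),
          rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h)), np.zeros(d_h),
          rng.standard_normal((d_h, d_in)), rng.standard_normal((d_h, d_h)), np.zeros(d_h))

h = np.zeros(d_h)
for x in rng.standard_normal((5, d_in)):  # run over a 5-step toy sequence
    h = gru_step(x, h, params)
print(h.shape)  # (3,)
```

Because each step interpolates between the previous state and a tanh-bounded candidate, the hidden state stays in (-1, 1); this gating is what lets the GRU carry information across long token sequences.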
Authors
  • LaRI Laboratory, Faculty of Sciences, Ibn Tofail University, Kénitra, Morocco
  • LaRI Laboratory, Faculty of Sciences, Ibn Tofail University, Kénitra, Morocco
Bibliography
  •  [1] Y. Kim, “Convolutional Neural Networks for Sentence Classification”, http://arxiv.org/abs/1408.5882. Accessed on: 2021-02-03.
  •  [2] N. Kalchbrenner, E. Grefenstette and P. Blunsom, “A Convolutional Neural Network for Modelling Sentences”. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics, 2014, 655–665, DOI: 10.3115/v1/P14-1062.
  •  [3] D. Bahdanau, K. Cho and Y. Bengio, “Neural Machine Translation by Jointly Learning to Align and Translate”, http://arxiv.org/abs/1409.0473. Accessed on: 2021-02-03.
  •  [4] A. Severyn and A. Moschitti, “UNITN: Training Deep Convolutional Neural Network for Twitter Sentiment Classification”. In: Proceedings of the 9th International Workshop on Semantic Evaluation (SemEval 2015), 2015, 464–469, DOI: 10.18653/v1/S15-2079.
  •  [5] R. Collobert and J. Weston, “A unified architecture for natural language processing: deep neural networks with multitask learning”. In: Proceedings of the 25th international conference on Machine learning, 2008, 160–167, DOI: 10.1145/1390156.1390177.
  •  [6] T. Mikolov, I. Sutskever, K. Chen, G. S. Corrado and J. Dean, “Distributed Representations of Words and Phrases and their Compositionality”, Advances in Neural Information Processing Systems, vol. 26, 2013, 3111–3119.
  •  [7] M. Nabil, A. Atyia and M. Aly, “CUFE at SemEval-2016 Task 4: A Gated Recurrent Model for Sentiment Classification”. In: Proceedings of the 10th International Workshop on Semantic Evaluation (SemEval-2016), 2016, 52–57, DOI: 10.18653/v1/S16-1005.
  •  [8] S. Lai, L. Xu, K. Liu and J. Zhao, “Recurrent convolutional neural networks for text classification”. In: Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence, 2015, 2267–2273.
  •  [9] M. D. Zeiler, “ADADELTA: An Adaptive Learning Rate Method”, http://arxiv.org/abs/1212.5701. Accessed on: 2021-02-03.
  • [10] D. P. Kingma and J. Ba, “Adam: A Method for Stochastic Optimization”, http://arxiv.org/abs/1412.6980. Accessed on: 2021-02-03.
  • [11] S. Ruder, “An overview of gradient descent optimization algorithms”, http://arxiv.org/abs/1609.04747. Accessed on: 2021-02-03.
Notes
Record compiled with MNiSW funds, agreement No. 461252, under the "Social Responsibility of Science" programme - module: Popularisation of science and promotion of sport (2021).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-482507c3-be7d-4ae5-a081-58d21a3adbb2