Article title

Review of Current Text Representation Techniques for Semantic Relationship Extraction

Title variants
PL
Przegląd metod reprezentacji tekstu w kontekście wyznaczania relacji semantycznych
Publication languages
EN
Abstracts
EN
The article provides a review of the currently most popular text processing techniques, sketches their evolution, and compares sequence and dependency models for detecting semantic relationships between words.
PL
The article contains a review of the most popular text representation methods, namely sequence and graph models, in the context of detecting semantic relations between words.
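
As a brief illustration of the dependency models the abstract contrasts with sequence models, the minimal sketch below (in Python, assuming spaCy [24] with the en_core_web_sm pipeline installed; the example sentence and the shortest-dependency-path idea come from [6]) walks the dependency tree to recover the path between two words:

    import spacy

    # Load a small English pipeline (assumed installed via:
    #   python -m spacy download en_core_web_sm)
    nlp = spacy.load("en_core_web_sm")

    def shortest_dependency_path(a, b):
        """Shortest path between two tokens along the dependency tree,
        the structure used as a relation-extraction kernel in [6]."""
        a_chain = [a] + list(a.ancestors)   # a, its head, grandhead, ...
        b_chain = [b] + list(b.ancestors)
        a_ids = {t.i for t in a_chain}
        # Lowest common ancestor: first token on b's chain also on a's chain.
        lca = next(t for t in b_chain if t.i in a_ids)
        up = a_chain[:[t.i for t in a_chain].index(lca.i) + 1]   # a .. lca
        down = b_chain[:[t.i for t in b_chain].index(lca.i)]     # b .. below lca
        return up + list(reversed(down))

    # Example sentence used in [6]:
    doc = nlp("Protesters seized several pumping stations.")
    path = shortest_dependency_path(doc[0], doc[4])  # "Protesters", "stations"
    print(" -> ".join(t.text for t in path))
    # Protesters -> seized -> stations

Note how the path skips the linearly intervening words ("several pumping"); exposing the relation through the tree rather than through word order is the property that [6] argues makes dependency structures attractive for relation extraction.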
Authors
  • Military University of Technology, Faculty of Cybernetics, Institute of Computer and Information Systems, Kaliskiego St. 2, 00-908 Warsaw, Poland
Bibliography
  • [1] Kulmizev A., de Lhoneux M., Gontrum J., Fano E., Nivre J., “Deep Contextualized Word Embeddings in Transition-Based and Graph-Based Dependency Parsing – A Tale of Two Parsers Revisited”, in: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, 2755–2768, Hong Kong, China, November 3–7, 2019.
  • [2] Nivre J., “Incrementality in deterministic dependency parsing”, in: Proceedings of the Workshop on Incremental Parsing: Bringing Engineering and Cognition Together, 50–57, July 2004.
  • [3] Nivre J., “An efficient algorithm for projective dependency parsing”, in: Proceedings of the Eighth International Conference on Parsing Technologies, 149–160, Nancy, France, 2003.
  • [4] Jurafsky D., Martin J.H., “Speech and Language Processing”, https://web.stanford.edu/~jurafsky/slp3/15.pdf
  • [5] Franciscus N., Ren X., Stantic B., “Dependency graph for short text extraction and summarization”, Journal of Information and Telecommunication, Vol. 3, No. 4, 413–429 (2019).
  • [6] Bunescu R., Mooney R., “A Shortest Path Dependency Kernel for Relation Extraction”, in: Proceedings of Human Language Technology Conference and Conference on Empirical Methods in Natural Language Processing, 724–731, Vancouver, British Columbia, Canada, 2005.
  • [7] Guo Z., Zhang Y., Lu W., “Attention Guided Graph Convolutional Networks for Relation Extraction”, in: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 241–251, Florence, Italy, 2019.
  • [8] Zhang Y., Qi P., Manning C.D., “Graph Convolution over Pruned Dependency Trees Improves Relation Extraction”, in: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2205–2215, Brussels, Belgium, 2018.
  • [9] Kipf T., Welling M., “Semi-Supervised Classification with Graph Convolutional Networks”, in: Conference paper at ICLR 2017, April 24–26, 2017, Toulon, France.
  • [10] Veličković P., et al., “Graph Attention Networks”, in: Conference paper at ICLR 2018, April 30 – May 3, 2018, Vancouver, BC, Canada.
  • [11] Hamilton W.L., Ying R., Leskovec J., “Inductive representation learning on large graphs”, in: Proceedings of the 31st International Conference on Neural Information Processing Systems, 1025–1035, Curran Associates Inc., NY, USA, 2017.
  • [12] Bengio Y., Ducharme R., Vincent P., Jauvin C., “A Neural Probabilistic Language Model”, Journal of Machine Learning Research, Vol. 3, 1137–1155 (2003).
  • [13] Camacho-Collados J., Pilehvar M.T., “On the Role of Text Preprocessing in Neural Network Architectures: An Evaluation Study on Text Categorization and Sentiment Analysis”, in: Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP, 40–46, Brussels, Belgium, 2018.
  • [14] Soares L.B., FitzGerald N., Ling J., Kwiatkowski T., “Matching the Blanks: Distributional Similarity for Relation Learning”, in: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2895–2905, Association for Computational Linguistics, Florence, Italy, 2019.
  • [15] Mikolov T., et al., “Distributed Representations of Words and Phrases and their Compositionality”, in: Proceedings of the 26th International Conference on Neural Information Processing Systems, Vol. 2, 3111–3119, Curran Associates Inc., NY, USA, 2013.
  • [16] Mikolov T., Chen K., Corrado G., Dean J., “Efficient Estimation of Word Representations in Vector Space”, in: Proceedings of Workshop at ICLR, 2013.
  • [17] Brown T.B., et al., “Language Models are Few-Shot Learners”, arXiv preprint arXiv:2005.14165. 2020.
  • [18] Devlin J., Chang M.-W., Lee K., Toutanova K., “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding”, in: Proceedings of NAACL-HLT 2019, Minneapolis, Minnesota, June 2 – June 7, 2019, 4171–4186, Association for Computational Linguistics, 2019.
  • [19] Radford A., et al., “Language Models are Unsupervised Multitask Learners”, https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf.
  • [20] Peters M.E., et al., “Deep contextualized word representations”, in: Conference paper at ICLR 2018.
  • [21] Řehůřek R., “Gensim: Topic modelling for humans”, RARE Technologies Ltd., 2009, www.radimrehurek.com/gensim/
  • [22] Vaswani A., et al., “Attention Is All You Need”, in: Proceedings of NIPS 2017, Long Beach, CA, USA, 2017.
  • [23] Bahdanau D., Cho K., Bengio Y., “Neural Machine Translation by Jointly Learning to Align and Translate”, in: Conference paper at ICLR 2015.
  • [24] spaCy, “Industrial-Strength Natural Language Processing”, www.spacy.io
  • [25] Hochreiter S., Schmidhuber J., “Long Short-Term Memory”, Neural Computation, Vol. 9, Issue 8, 1735–1780 (1997).
  • [26] Zelenko D., Aone C., Richardella A., “Kernel Methods for Relation Extraction”, Journal of Machine Learning Research, Vol. 3, 1083–1106 (2003).
  • [27] Collins M., Duffy N., “Convolution Kernels for Natural Language”, in: Advances in Neural Information Processing Systems 14 (NIPS 2001), 625–632, MIT Press, Cambridge, MA, USA, 2002.
  • [28] Cortes C., Vapnik V., “Support-Vector Networks”, Machine Learning, Vol. 20, 273–297 (1995).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-53a2b755-ae2c-40ab-962f-f8d5943326dc