
SCIENCE CHINA Information Sciences, Volume 62, Issue 11: 219102(2019) https://doi.org/10.1007/s11432-018-9528-8

Implicit discourse relation detection using concatenated word embeddings and a gated relevance network

Article history
  • Received: Mar 3, 2018
  • Accepted: Jul 20, 2018
  • Published: Sep 18, 2019

Abstract

There is no abstract available for this article.


Acknowledgment

This work was partially supported by the National Natural Science Foundation of China (Grant Nos. 61532011, 61473092, 61472088) and the Science and Technology Commission of Shanghai Municipality (Grant Nos. 16JC1420401, 17JC1420200).


References

[1] Pitler E, Louis A, Nenkova A. Automatic sense prediction for implicit discourse relations in text. In: Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP, Suntec, 2009. 683--691

[2] Rutherford A, Xue N. Discovering implicit discourse relations through Brown cluster pair representation and coreference patterns. In: Proceedings of the 14th Conference of the European Chapter of the Association for Computational Linguistics, 2014. 645--654

[3] Liu Y, Li S J, Zhang X D, et al. Implicit discourse relation classification via multi-task neural networks. 2016. ArXiv preprint

[4] Zhang B, Xiong D, Su J, et al. Variational neural discourse relation recognizer. 2016. ArXiv preprint

[5] Qin L, Zhang Z, Zhao H. Implicit discourse relation recognition with context-aware character-enhanced embeddings. In: Proceedings of COLING, Osaka, 2016. 1914--1924

[6] Kim Y, Jernite Y, Sontag D, et al. Character-aware neural language models. In: Proceedings of the 30th AAAI Conference on Artificial Intelligence, 2016. 2741--2749

[7] Hochreiter S, Schmidhuber J. Long short-term memory. Neural Comput, 1997, 9: 1735--1780

[8] Sutskever I, Tenenbaum J B, Salakhutdinov R R. Modelling relational data using Bayesian clustered tensor factorization. In: Advances in Neural Information Processing Systems, 2009. 1821--1828

[9] Chen J, Zhang Q, Liu P, et al. Implicit discourse relation detection via a deep architecture with gated relevance network. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, 2016. 1726--1735

  • Table 1   Experimental results for the PDTB dataset

    Method                            Comp.   Cont.   Expa.   Temp.
    Pitler et al. (2009) [1]          21.96   47.13   76.42   16.76
    Rutherford and Xue (2014) [2]     39.70   54.42   80.44   28.69
    Liu and Li (2016) [3]             36.70   54.76   –       31.32
    Zhang et al. (2016) [4]           35.88   50.56   –       29.54
    Qin et al. (2016) [5]             38.67   54.91   80.66   32.76
    Char. CNN                         32.76   49.53   76.80   22.12
    Char. CNN+Bi-LSTM                 33.63   50.42   77.99   22.58
    Char. CNN+Bi-LSTM+GRN             34.14   51.44   78.31   23.22
    Word. LSTM                        35.48   52.11   77.36   27.62
    Word. Bi-LSTM                     37.35   52.27   78.33   29.36
    Word. Bi-LSTM+GRN                 40.17   54.76   80.62   31.32
    Con. LSTM                         36.86   52.56   78.28   28.89
    Con. Bi-LSTM                      38.11   53.38   79.22   31.37
    Con. LSTM+GRN                     38.32   53.52   78.34   29.02
    Con. Bi-LSTM+GRN                  41.02   54.94   80.78   31.76

    Comp. = Comparison; Cont. = Contingency; Expa. = Expansion; Temp. = Temporal. Char., Word., and Con. denote character-level, word-level, and concatenated embeddings, respectively; "–" marks a score not reported in the cited work.
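The GRN rows in Table 1 refer to the gated relevance network of Chen et al. [9], which scores the interaction between two encoded arguments by mixing a bilinear tensor term with a single-layer nonlinearity through a learned gate. The sketch below is a minimal NumPy illustration of that scoring step only, assuming the formulation in [9]; all dimensions (d, k) and weights are placeholder assumptions, not trained parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def gated_relevance(h1, h2, M, W, b, Wg, bg):
    """Relevance vector between two hidden states h1, h2 (each dim d).

    Mixes a bilinear tensor term and a single-layer term via a sigmoid
    gate, following the GRN idea in [9]. Weights here are random
    placeholders for illustration only.
    """
    k = M.shape[0]
    cat = np.concatenate([h1, h2])                            # [h1; h2], dim 2d
    bilinear = np.array([h1 @ M[r] @ h2 for r in range(k)])   # k tensor slices
    single = np.tanh(W @ cat + b)                             # single-layer term
    g = 1.0 / (1.0 + np.exp(-(Wg @ cat + bg)))                # gate in (0, 1)
    return g * bilinear + (1.0 - g) * single

d, k = 8, 4                                # hidden size and slice count (assumed)
h1, h2 = rng.standard_normal(d), rng.standard_normal(d)
M = rng.standard_normal((k, d, d)) * 0.1   # bilinear tensor
W, b = rng.standard_normal((k, 2 * d)) * 0.1, np.zeros(k)
Wg, bg = rng.standard_normal((k, 2 * d)) * 0.1, np.zeros(k)

score = gated_relevance(h1, h2, M, W, b, Wg, bg)
print(score.shape)  # prints (4,)
```

In the full models of Table 1, such pairwise scores over the Bi-LSTM hidden states would feed a pooling layer and classifier; that surrounding architecture is omitted here.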

Copyright 2020 Science China Press Co., Ltd.