
SCIENCE CHINA Information Sciences, Volume 62, Issue 12: 229103(2019) https://doi.org/10.1007/s11432-018-9479-3

Reformulating natural language queries using sequence-to-sequence models

  • Received: March 12, 2018
  • Accepted: June 5, 2019
  • Published: November 6, 2019



Acknowledgment

This work was partially supported by the National Key Research and Development Plan (Grant No. 2017YFB1002104), the National Natural Science Foundation of China (Grant Nos. 61532011, 61751201, 61473092, 61472088), and the Science and Technology Commission of Shanghai Municipality (STCSM) (Grant Nos. 16JC1420401, 17JC1420200). The authors would like to thank the anonymous reviewers for their helpful comments.


References

[1] Riezler S, Liu Y. Query rewriting using monolingual statistical machine translation. Comput Linguistics, 2010, 36: 569-582

[2] Jones R, Rey B, Madani O, et al. Generating query substitutions. In: Proceedings of the 15th International Conference on World Wide Web (WWW), Edinburgh, 2006. 387-396

[3] Gao J F, He X D, Xie S S, et al. Learning lexicon models from search logs for query expansion. In: Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning (EMNLP-CoNLL), Jeju Island, 2012. 666-676

[4] Song H J, Kim A, Park S B. Translation of natural language query into keyword query using an RNN encoder-decoder. In: Proceedings of the International ACM SIGIR Conference, 2017. 965-968

[5] Bahdanau D, Cho K, Bengio Y. Neural machine translation by jointly learning to align and translate. ArXiv preprint, 2015

[6] Luong M T, Pham H, Manning C D. Effective approaches to attention-based neural machine translation. ArXiv preprint, 2015

[7] Gu J, Lu Z, Li H, et al. Incorporating copying mechanism in sequence-to-sequence learning. ArXiv preprint, 2016

[8] Riezler S, Liu Y, Vasserman A. Translating queries into snippets for improved query expansion. In: Proceedings of the International Conference on Computational Linguistics (COLING), Manchester, 2008. 737-744

[9] Rush A M, Chopra S, Weston J. A neural attention model for abstractive sentence summarization. ArXiv preprint, 2015

  • Table 1   Performance of different methods

    Models               H@5    H@10   P@3    P@5    P@10
    Raw query            25.9   29.3   11.8    8.3    6.8
    Attention seq2seq    19.1   21.7    7.9    5.7    4.7
    Sequence labeling    22.2   25.1    9.6    6.9    5.7
    Pointer network      28.3   31.2   14.1   10.6    8.5
    ATS2S + SL           29.8   34.2   13.2   10.0    8.2
    ATS2S + PN           28.7   32.0   13.3    9.9    8.0
    The proposed model   37.1   44.1   16.3   11.5    9.6
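The attention-based seq2seq baselines compared in Table 1 weight each encoder hidden state against the current decoder state before generating a rewritten query token, in the style of Bahdanau et al. [5] and Luong et al. [6]. A minimal sketch of the dot-product (Luong-style) attention step, with all shapes and names illustrative rather than taken from the paper:

```python
import numpy as np

def dot_product_attention(encoder_states, decoder_state):
    """Score each source position against the current decoder state,
    normalize with a softmax, and return the weighted context vector
    plus the attention weights.

    encoder_states: (T, d) array, one hidden state per source token.
    decoder_state:  (d,) array, the current decoder hidden state.
    """
    scores = encoder_states @ decoder_state            # (T,) dot-product scores
    scores = scores - scores.max()                     # shift for numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()    # softmax over source positions
    context = weights @ encoder_states                 # (d,) attention context
    return context, weights

# Toy example: 4 source tokens, hidden size 3 (values are arbitrary).
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
s = rng.normal(size=3)
context, weights = dot_product_attention(H, s)
```

In a full model the context vector would be concatenated with the decoder state to predict the next token of the reformulated query; the pointer-network rows of Table 1 instead reuse these weights to copy tokens directly from the input query.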

Copyright 2020 Science China Press Co., Ltd.