Using Semantic Constraints for Question Answering

Abstract:

In this paper, we discuss a technique based on semantic constraints for improving the performance and portability of a reformulation-based question answering system. First, we present a method for automatically acquiring semantics-based reformulations; the goal is to generate patterns from related articles based on lexical, syntactic and semantic constraints. We then adopt a method to evaluate and re-rank candidate answers that satisfy these constraints. Evaluation on questions from the TREC QA tracks of 2003 and 2004 shows that the automatically acquired semantic patterns allow us to avoid the manual work of formulating semantically equivalent reformulations while still achieving acceptable performance.
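The full method is only described in the paper itself; as a rough, hypothetical sketch of the answer re-ranking step summarized above, the Python snippet below scores candidate answers by how many of a reformulation pattern's lexical, syntactic, or semantic constraints they satisfy. All names, the constraint predicates, and the scoring scheme are illustrative assumptions, not the authors' implementation.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Candidate:
    text: str               # candidate answer string
    retrieval_score: float  # score assigned by the underlying retrieval step

# A constraint here is any predicate over a candidate: a lexical check (e.g. the
# answer contains a four-digit year), a syntactic check, or a semantic check
# (e.g. the answer carries the expected entity type). These are placeholders.
Constraint = Callable[[Candidate], bool]

def rerank(candidates: List[Candidate],
           constraints: List[Constraint],
           bonus: float = 1.0) -> List[Candidate]:
    """Order candidates by retrieval score plus a bonus per satisfied constraint."""
    def score(c: Candidate) -> float:
        return c.retrieval_score + bonus * sum(1 for ok in constraints if ok(c))
    return sorted(candidates, key=score, reverse=True)

# Toy usage for the question "When was the telephone patented?"
constraints = [
    lambda c: any(t.isdigit() and len(t) == 4 for t in c.text.split()),  # lexical: a year
    lambda c: "patent" in c.text.lower(),                                # lexical: keyword
]
candidates = [
    Candidate("The telephone is a communication device", 0.6),
    Candidate("Bell patented the telephone in 1876", 0.4),
]
print([c.text for c in rerank(candidates, constraints)])

In this toy run the second candidate satisfies both constraints and is promoted above the higher-scoring but constraint-violating first candidate, which is the general effect the semantic re-ranking aims for.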

Info:

Periodical: Advanced Materials Research (Volumes 655-657)

Pages: 1750-1756

Online since: January 2013

Copyright: © 2013 Trans Tech Publications Ltd. All Rights Reserved
