Auxiliary Problems in Convex Alternating Structure Optimization Algorithm

Abstract:

Principles of Relevancy, Orthogonality, and Domain Adaptation have been proposed for selecting auxiliary problems (APs) in the Alternating Structure Optimization (ASO) algorithm. The Convex Alternating Structure Optimization (cASO) algorithm is an improvement on ASO, but its core still lies in constructing good APs. To validate the effectiveness of the above principles in the cASO algorithm, many types of APs were constructed, taking Chinese syntactic chunking as a case study. Experimental results and analyses both demonstrate that these principles still hold in the cASO algorithm.
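
For orientation, the sketch below gives the joint objective that ASO minimizes over the target problem and the APs, together with the convex feasible set that cASO optimizes over instead. Both are written from the standard ASO/cASO formulations in the literature rather than taken from this paper, so the notation (m problems, per-problem predictors w_l and v_l, shared structure Theta with h orthonormal rows, relaxation variable M) is an assumption here.

% ASO (standard formulation, assumed): jointly fit the target problem and the
% auxiliary problems while learning a shared low-dimensional structure Theta.
\min_{\{\mathbf{w}_\ell,\mathbf{v}_\ell\},\,\Theta}\;
  \sum_{\ell=1}^{m}\left(\frac{1}{n_\ell}\sum_{i=1}^{n_\ell}
    L\!\left(\mathbf{w}_\ell^{\top}\mathbf{x}_i^{\ell}
      + \mathbf{v}_\ell^{\top}\Theta\,\mathbf{x}_i^{\ell},\; y_i^{\ell}\right)
    + \lambda_\ell\,\lVert\mathbf{w}_\ell\rVert_2^{2}\right)
  \qquad\text{s.t.}\;\; \Theta\Theta^{\top}=I_{h}

% cASO (assumed, following the usual convex relaxation): the non-convex
% orthonormality constraint is replaced by optimization over the convex hull
% of the matrices Theta^T Theta, namely
\mathcal{M}=\left\{\,M \;:\; \operatorname{tr}(M)=h,\;\; 0\preceq M\preceq I\,\right\}

On this reading, AP quality matters to cASO for the same reason it matters to ASO: the shared structure (Theta or M) that the target problem borrows is still estimated from the APs' predictors.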

Info:

Pages: 2349-2357

Online since: December 2012

Copyright: © 2013 Trans Tech Publications Ltd. All Rights Reserved
