Quality Assessment of the Retrieval System Based on Web-Related Degree

Abstract:

To further improve the quality and efficiency of Web search, this paper proposes a method for assessing retrieval system quality based on Web-related degree. The research proceeds in three steps: first, the categories of queries are analyzed and a query set is constructed; next, the differences in retrieval quality among different search engines are studied through an assessment experiment; finally, the factors that influence the quality assessment of Web retrieval systems are analyzed from four separate aspects. The results show that the proposed quality assessment method improves the accuracy and discriminative power of the corresponding assessment indicators, with an accuracy ratio above 90%, verifying the validity of the method.
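
The abstract describes the evaluation only at a high level. As a rough illustration of the kind of computation involved, a minimal sketch follows: it scores search engines over a categorized query set using a precision-style accuracy indicator. The engines, queries, relevance judgments, and function names are hypothetical placeholders; the paper's actual indicators and its Web-related-degree computation are not given in this preview.

```python
# Minimal sketch of assessing search-engine retrieval quality over a query set.
# All engines, queries, and relevance judgments below are hypothetical.

def precision_at_k(retrieved: list[str], relevant: set[str], k: int = 10) -> float:
    """Fraction of the top-k retrieved documents that are judged relevant."""
    top_k = retrieved[:k]
    if not top_k:
        return 0.0
    return sum(doc in relevant for doc in top_k) / len(top_k)

def mean_precision(results: dict[str, list[str]],
                   judgments: dict[str, set[str]], k: int = 10) -> float:
    """Average precision@k over every query in the query set."""
    scores = [precision_at_k(results.get(q, []), rel, k)
              for q, rel in judgments.items()]
    return sum(scores) / len(scores)

# Hypothetical query set spanning two query categories, with judged relevant docs.
judgments = {
    "navigational: python.org": {"d1", "d4"},
    "informational: tf-idf weighting": {"d2", "d3", "d7"},
}

# Hypothetical ranked result lists from two engines being compared.
engine_results = {
    "engine_A": {
        "navigational: python.org": ["d1", "d4", "d9"],
        "informational: tf-idf weighting": ["d2", "d7", "d5"],
    },
    "engine_B": {
        "navigational: python.org": ["d9", "d8", "d1"],
        "informational: tf-idf weighting": ["d5", "d6", "d2"],
    },
}

for engine, results in engine_results.items():
    print(f"{engine}: mean precision@10 = {mean_precision(results, judgments):.2f}")
```

Averaging such per-query scores across categories is one common way to compare engines; the reported accuracy ratio above 90% refers to the authors' own indicators, not to this sketch.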

Info:

Periodical: Advanced Materials Research (Volumes 915-916)

Pages: 1357-1360

Online since: April 2014

Copyright: © 2014 Trans Tech Publications Ltd. All Rights Reserved

References:

[1] T. Saracevic. Evaluation of evaluation in information retrieval. In: Proceedings of the 18th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Seattle, WA, USA (1995). DOI: 10.1145/215206.215351

[2] J. Zobel. How reliable are the results of large-scale information retrieval experiments? In: Proceedings of the 21st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '98), Melbourne, Australia (1998). DOI: 10.1145/290941.291014

[3] E. M. Voorhees, C. Buckley. The effect of topic set size on retrieval experiment error. In: Proceedings of the 25th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Tampere, Finland (2002). DOI: 10.1145/564376.564432

[4] E. M. Voorhees. Evaluation by highly relevant documents. In: Proceedings of the 24th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, New Orleans, LA, USA (2001). DOI: 10.1145/383952.383963

[5] D. Hawking, N. Craswell, et al. Measuring search engine quality. Information Retrieval 4: 33-59 (2001).

[6] A. Singhal, M. Kaszkiel. A case study in web search using TREC algorithms. In: Proceedings of the Tenth International Conference on World Wide Web, Hong Kong: ACM Press, pp. 708-716 (2001). DOI: 10.1145/371920.372186

[7] B. Travis, A. Broder. The need behind the query: Web search vs. classic information retrieval. In: Proceedings of the Sixth Search Engine Conference, Boston, MA, USA (2001).
