Building a Distributed Testing Execution System Based on TTCN-3

Abstract:

In this paper, a distributed testing execution system is designed that provides mechanisms for node communication, test script deployment, test scheduling, executor driving, and test result collection in a distributed environment. A workload model is established with which testers can describe performance testing requirements. A performance testing framework is presented that uses virtual users to simulate real-world user behavior and thereby generate workload on the system under test (SUT); the execution of the virtual users is controlled through the TTCN-3 standard interfaces. After a performance test has been executed, a test report is generated by extracting the test logs. A method for generating performance test cases by reusing functional test scripts is also studied. By executing performance tests against an online bookstore, this paper demonstrates the feasibility of reusing TTCN-3 functional test scripts and the capability of the distributed performance testing system that has been built.

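The virtual-user mechanism described above maps naturally onto TTCN-3 parallel test components, which the core language can create, start, and synchronize natively. The following is a minimal sketch of that idea, assuming a message-based HTTP-style interface to the SUT; the module, port, component, and function names (HttpPort, VirtualUser, userSession, and so on) are illustrative assumptions and are not taken from the paper.

    module Ttcn3PerfSketch {

        // Number of simulated concurrent users (illustrative constant).
        const integer NUM_USERS := 50;

        // Simplified message port: requests and responses as plain text.
        type port HttpPort message {
            out charstring;
            in  charstring;
        }

        // One virtual user: a parallel test component with its own port and timer.
        type component VirtualUser {
            port HttpPort http;
            timer t_reply := 5.0;  // per-request response timeout in seconds
        }

        type component MtcType {}

        // Test system interface: one port per virtual user.
        type component SystemType {
            port HttpPort http[NUM_USERS];
        }

        // Behaviour of a single user session: send a request, await any reply.
        function userSession() runs on VirtualUser {
            http.send("GET /books HTTP/1.1");
            t_reply.start;
            alt {
                [] http.receive(charstring:?) { t_reply.stop; setverdict(pass); }
                [] t_reply.timeout            { setverdict(fail); }
            }
        }

        // Spawn NUM_USERS virtual users in parallel to generate workload on the SUT.
        testcase tc_bookstoreLoad() runs on MtcType system SystemType {
            var VirtualUser vu;
            for (var integer i := 0; i < NUM_USERS; i := i + 1) {
                vu := VirtualUser.create;
                map(vu:http, system:http[i]);
                vu.start(userSession());
            }
            all component.done;  // wait until every virtual user has finished
        }

        control {
            execute(tc_bookstoreLoad());
        }
    }

Because userSession is an ordinary TTCN-3 behaviour function, a functional test script written for a single user can in principle be wrapped this way and reused as the body of each virtual user, which is the spirit of the script-reuse method the abstract describes; in the paper's distributed setting, the runtime would additionally distribute these components across test nodes.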

Info:

Pages: 2772-2778

Online since: May 2014

Copyright: © 2014 Trans Tech Publications Ltd. All Rights Reserved
