The Application of Random Forest in Finance


Abstract:

The variation of futures prices is affected by many factors, which makes predicting the price trend a challenge. In this paper, we apply the random forest (RF) technique to predict the type of the next day's K-line (candlestick). First, we empirically select nine technical indicators as input variables, or features. These features, together with the associated class label, i.e. the type of K-line, are then used to construct an RF classifier. The experimental results demonstrate that RF is effective and can be used for trend prediction of futures prices.
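The setup described in the abstract can be sketched as a standard supervised-classification pipeline. The preview does not name the nine indicators, the futures contract, or the K-line label encoding, so the sketch below uses random placeholder data and a hypothetical three-class label (yin / doji / yang); it only illustrates the shape of the approach, not the paper's actual experiment.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_days, n_features = 500, 9          # nine empirically chosen indicators

# Placeholder feature matrix: one row of nine indicator values per day
X = rng.normal(size=(n_days, n_features))
# Hypothetical label encoding: 0 = yin (down), 1 = doji (flat), 2 = yang (up)
y = rng.integers(0, 3, size=n_days)

# Fit the RF classifier on history, then predict the next day's K-line type
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[:-1], y[:-1])
pred = clf.predict(X[-1:])
print(pred.shape)  # a single predicted class for the next day
```

In practice the features would be computed from the price series (e.g. moving averages, momentum, RSI), and the label for day *t* would be derived from day *t*+1's open/close relationship.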


Pages: 947-951

Online since: March 2015


© 2015 Trans Tech Publications Ltd. All Rights Reserved

