[1]
Y. H. Chen, A. Abraham: Feature selection and intrusion detection using hybrid flexible neural tree. Lecture Notes in Computer Science, Vol. 3498 (2005), pp. 439-444.
DOI: 10.1007/11427469_71
[2]
Y. H. Chen, B. Yang, J. Dong and A. Abraham: Time-series forecasting using flexible neural tree model. Information Sciences, Vol. 174 (2005), pp. 219-235.
DOI: 10.1016/j.ins.2004.10.005
[3]
Y. H. Chen, B. Yang and J. Dong: Evolving flexible neural networks using ant programming and PSO algorithm. In: Proceedings of the International Symposium on Neural Networks, Lecture Notes in Computer Science, Vol. 2 (2004), pp. 211-216.
DOI: 10.1007/978-3-540-28647-9_36
[4]
Y. H. Chen, B. Yang and J. Dong: Nonlinear system modeling via optimal design of neural trees. International Journal of Neural Systems, Vol. 14 (2004), pp. 125-137.
DOI: 10.1142/s0129065704001905
[5]
Y. H. Chen, B. Yang and A. Abraham: Flexible neural trees ensemble for stock index modeling. Neurocomputing, Vol. 70 (2007), pp. 697-703.
DOI: 10.1016/j.neucom.2006.10.005
[6]
Y. H. Chen, L. Peng and A. Abraham: Exchange rate forecasting using flexible neural trees. In: Proceedings of the International Symposium on Neural Networks, Vol. 3973 (2006), pp. 518-523.
DOI: 10.1007/11760191_76
[7]
Y. H. Chen, A. Abraham and B. Yang: Hybrid flexible neural-tree-based intrusion detection systems. International Journal of Intelligent Systems, Vol. 22 (2007), pp. 337-352.
DOI: 10.1002/int.20203
[8]
X. Yao, Y. Liu: A new evolutionary system for evolving artificial neural networks. IEEE Transactions on Neural Networks, Vol. 8 (1997), pp. 694-713.
DOI: 10.1109/72.572107
[9]
X. Yao: Evolving artificial neural networks. Proceedings of the IEEE, Vol. 87 (1999), pp. 1423-1447.
[10]
P. J. Angeline, G. M. Saunders and J. B. Pollack: An evolutionary algorithm that constructs recurrent neural networks. IEEE Transactions on Neural Networks, Vol. 5 (1994), pp. 54-65.
DOI: 10.1109/72.265960
[11]
B. T. Zhang, H. Muhlenbein: Genetic programming of minimal neural nets using Occam's razor. In: Proceedings of the Fifth International Conference on Genetic Algorithms, S. Forrest (Ed.), Morgan Kaufmann (1993), pp. 342-349.
[12]
B. T. Zhang, H. Muhlenbein: Evolving optimal neural networks using genetic algorithms with Occam's razor. Complex Systems (1993).
[13]
B. T. Zhang, H. Muhlenbein: Synthesis of sigma-pi neural networks by the breeder genetic programming. In: Proceedings of the IEEE International Conference on Evolutionary Computation, Vol. 1 (1994), pp. 318-324.
DOI: 10.1109/icec.1994.349933
[14]
B. T. Zhang, P. Ohm and H. Muhlenbein: Evolutionary induction of sparse neural trees. Evolutionary Computation, Vol. 5 (1997), pp. 213-236.
DOI: 10.1162/evco.1997.5.2.213
[15]
M. Mitchell, J. H. Holland: When will a genetic algorithm outperform hill climbing? In: Proceedings of the 5th International Conference on Genetic Algorithms, Morgan Kaufmann Publishers, San Francisco (1993), pp. 647-656.
[16]
H. Muhlenbein: How genetic algorithms really work: Mutation and hill-climbing. In: Parallel Problem Solving from Nature, PPSN II, North-Holland (1992), pp. 15-25.
[17]
I. Rojas, H. Pomares, J. L. Bernier et al.: Time series analysis using normalized PG-RBF network with regression weights. Neurocomputing, Vol. 42 (2002), pp. 267-285.
DOI: 10.1016/s0925-2312(01)00338-1