Flexible Beta Basis Function Neural Tree model:
Artificial Neural Networks (ANNs) form a growing interdisciplinary field that treats systems as adaptive, distributed, and mostly nonlinear, three properties found in real applications. Several types of networks have emerged in the literature, covering both multi-hidden-layer and single-hidden-layer architectures. An ANN mimics the learning behavior of biological systems by updating its parameters (including interconnection weights and, in certain cases, transfer function parameters).
The know-how accumulated through advanced work on ANN design and learning has shown that a network's reliability depends on an appropriate structure, the connection pattern between nodes, the chosen transfer function, and the learning algorithm. Many efforts in the literature address these issues using evolutionary computation; in such cases the model is called an Evolving Artificial Neural Network (EANN).
In order to promote EANN research, the Tunisian Research Groups in Intelligent Machines of the University of Sfax (REGIM-Lab of Sfax) provide the Flexible Beta Basis Function Neural Tree system (FBBFNT) free of charge, mainly to neural network researchers, so as to broaden the body of research enhancing the EANN process. FBBFNT is available as Matlab code containing the structure and learning process of this model as well as its experiments on prediction and identification benchmarks. This model is used in [1, 2].
FBBFNT is a multi-hidden-layer network based on the Beta function. It relies on a tree-based encoding method instead of the matrix-based encoding used for the single-hidden-layer Beta basis function network, since the tree-based encoding is more flexible and yields a more adjustable and modifiable architecture.
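As a rough illustration of the tree-based encoding, the sketch below (in Python, not the released Matlab code) builds a small neural tree whose internal nodes pass a weighted sum of their children through a Beta basis function. The parameterization (support endpoints x0, x1 and form parameters p, q, with the center fixed by them) follows the commonly used Beta basis function definition; all class names, parameter names, and default values here are illustrative assumptions, not part of the FBBFNT release.

```python
def beta_fn(x, x0=0.0, x1=1.0, p=2.0, q=2.0):
    """Beta basis function with support (x0, x1); zero outside.
    The center c is determined by the form parameters p and q.
    Default values are illustrative, not taken from the released code."""
    if not (x0 < x < x1):
        return 0.0
    c = (p * x1 + q * x0) / (p + q)  # center of the Beta function (peak value 1)
    return ((x - x0) / (c - x0)) ** p * ((x1 - x) / (x1 - c)) ** q

class InputNode:
    """Leaf of the tree: selects one input variable."""
    def __init__(self, index):
        self.index = index

    def evaluate(self, inputs):
        return inputs[self.index]

class BetaNode:
    """Internal node: weighted sum of its children passed through a Beta function.
    The number of children is free, which is what makes the tree encoding
    more adjustable than a fixed weight matrix."""
    def __init__(self, children, weights, x0=-2.0, x1=2.0, p=2.0, q=2.0):
        self.children, self.weights = children, weights
        self.x0, self.x1, self.p, self.q = x0, x1, p, q

    def evaluate(self, inputs):
        net = sum(w * child.evaluate(inputs)
                  for w, child in zip(self.weights, self.children))
        return beta_fn(net, self.x0, self.x1, self.p, self.q)

# A tiny tree: one Beta node over two input leaves.
tree = BetaNode([InputNode(0), InputNode(1)], weights=[0.6, 0.4])
y = tree.evaluate([0.5, 1.0])
```

Evolving such a model then amounts to varying both the tree topology (which nodes exist and how they connect) and the continuous parameters (weights and Beta parameters), which is the flexibility the tree encoding provides.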
Release of the FBBFNT code
The FBBFNT code can be downloaded from a Google Drive link provided after the signed release agreement has been scanned and returned.
The researcher(s) agree to the following restrictions and requirements on the FBBFNT model:
- Redistribution: Without prior written approval from the code administrator, the FBBFNT code, in whole or in part, will not be further distributed, published, copied, or disseminated in any way or form whatsoever, whether for profit or not. This includes further distributing, copying or disseminating to a different facility or organizational unit in the requesting university, organization, or company.
- Modification and Commercial Use: Without prior written approval from the code administrator, the FBBFNT model, in whole or in part, will not be modified and will not be used for commercial use.
- Acknowledgment of the FBBFNT code: In all documents and papers that use the FBBFNT model, this system should be acknowledged as: “Portions of the research in this paper use the FBBFNT model achieved by the Research Groups in Intelligent Machines, University of Sfax, Tunisia” and a citation to “FBBFNT model, http://www.regim.org/publications/codes/FBBFNT/” should be added to the references.
- Citation/Reference: All documents and papers that report on evolving neural network research using the FBBFNT system will acknowledge the use of the code by including an appropriate citation to the following:
[1] S. Bouaziz, H. Dhahri, A.M. Alimi, ‘Evolving Flexible Beta Operator Neural Trees (FBONT) for Time Series Forecasting’, in T. Hung et al. (Eds.): Proceedings of the 19th International Conference on Neural Information Processing (ICONIP 2012), Part III, Lecture Notes in Computer Science, vol. 7665, Springer-Verlag, pp. 17-24, Doha, Qatar, 2012. http://link.springer.com/chapter/10.1007/978-3-642-34487-9_3
[2] S. Bouaziz, H. Dhahri, A.M. Alimi, and A. Abraham, ‘A Hybrid Learning Algorithm for Evolving Flexible Beta Basis Function Neural Tree Model’, Neurocomputing, vol. 117, pp. 107-117, 2013. http://www.sciencedirect.com/science/article/pii/S0925231213001975
- Publication to REGIM of Sfax: A copy of all reports and papers for public or general release that use the FBBFNT code should be forwarded to the code administrator immediately upon release or publication.
- Indemnification: The researcher agrees to indemnify, defend, and hold harmless the REGIM-Lab of the University of Sfax in Tunisia and its Board of Trustees, officers, employees and agents, individually and collectively, from any and all losses, expenses, damages, demands and/or claims based upon any such injury or damage (real or alleged), and shall pay all damages, claims, judgments or expenses resulting from the researcher’s use of the FBBFNT code.
To download the FBBFNT code, please fill in the PDF release agreement <link>, sign it, and send it to the code administrator <email@example.com>.