
SCIENTIA SINICA Informationis, Volume 47, Issue 11: 1445-1463 (2017). https://doi.org/10.1360/N112017-00066

Dynamic full Bayesian ensemble classifiers for small time series

  • Received May 23, 2017
  • Accepted Jun 30, 2017
  • Published Nov 15, 2017

Abstract

Improving the reliability of classifiers for small time series with continuous attributes is an important and challenging task. Small time series carry limited information and their records are temporally dependent, which makes it difficult to optimize the fit between the classifier and the data and renders many mature techniques for non-temporal data impractical. We use a dynamic full Bayesian classifier to increase the amount of information the attributes provide about the class and to fuse temporal and non-temporal information. By combining conditional joint density estimation of the attributes based on a multivariate Gaussian kernel function with a diagonal smoothing-parameter matrix, interval division of smoothing-parameter values, a temporally progressive classification-accuracy criterion, construction of a smoothing-parameter configuration tree, and classifier selection and averaging, we establish a dynamic full Bayesian ensemble classifier for small time series. Experiments on small time series from macroeconomic analysis show that the optimized dynamic full Bayesian ensemble classifiers achieve very good classification accuracy.
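For intuition, the following is a minimal sketch, not the authors' implementation, of the core building block described above: class-conditional densities estimated with a multivariate Gaussian kernel and a diagonal smoothing-parameter (bandwidth) matrix, combined with Bayes' rule, and posterior averaging over several candidate bandwidth configurations as a stand-in for the ensemble step. The names gaussian_kde_diag, KdeBayesClassifier, and ensemble_predict are illustrative; the temporal dependency model, the progressive accuracy criterion, and the configuration tree from the paper are omitted.

```python
import numpy as np

def gaussian_kde_diag(x, samples, h):
    """Multivariate Gaussian kernel density estimate at point x,
    using a diagonal smoothing-parameter matrix diag(h)."""
    # samples: (n, d) training points of one class; h: (d,) per-attribute bandwidths
    diff = (x - samples) / h                              # standardized differences
    norm = np.prod(h) * (2.0 * np.pi) ** (len(h) / 2.0)   # kernel normalization
    return np.mean(np.exp(-0.5 * np.sum(diff ** 2, axis=1))) / norm

class KdeBayesClassifier:
    """Bayes classifier whose class-conditional joint densities are
    Gaussian kernel estimates with a diagonal bandwidth matrix (assumed form)."""
    def __init__(self, h):
        self.h = np.asarray(h, dtype=float)

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_ = {c: np.mean(y == c) for c in self.classes_}
        self.samples_ = {c: X[y == c] for c in self.classes_}
        return self

    def predict_proba_one(self, x):
        # posterior proportional to prior times kernel density estimate
        scores = np.array([
            self.priors_[c] * gaussian_kde_diag(x, self.samples_[c], self.h)
            for c in self.classes_
        ])
        return scores / scores.sum()

def ensemble_predict(classifiers, weights, x):
    """Average posteriors of classifiers built from different bandwidth
    configurations, then pick the most probable class."""
    probs = np.array([clf.predict_proba_one(x) for clf in classifiers])
    avg = np.average(probs, axis=0, weights=weights)
    return classifiers[0].classes_[np.argmax(avg)]
```

In this reading, the paper's interval division of smoothing-parameter values would supply the candidate h vectors, and the selection and weighting of the averaged classifiers would be driven by the temporally progressive accuracy criterion rather than fixed weights.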


Funded by

National Natural Science Foundation of China (61272209)

Natural Science Foundation of Shanghai (15ZR1429700)


