[1] Mitchell, T.M. (1997) Machine learning. McGraw-Hill, New York.
[2] Jordan, M.I. (1995) Why the logistic function? A tutorial discussion on probabilities and neural networks. MIT Computational Cognitive Science Report 9503, Massachusetts.
[3] Aha, D., Kibler, D. and Albert, M. (1991) Instance-based learning algorithms. Machine Learning, 6(1), 37-66.
[4] Quinlan, J.R. (1993) C4.5: Programs for machine learning. Morgan Kaufmann, San Mateo, California.
[5] Lisboa, P.J.G., Ifeachor, E.C. and Szczepaniak, P.S. (2000) Artificial neural networks in biomedicine. Springer-Verlag, London.
[6] Cooper, G.F. and Herskovits, E. (1992) A Bayesian method for the induction of probabilistic networks from data. Machine Learning, 9(4), 309-347.
[7] Lucas, P. (2001) Bayesian networks in medicine: A model-based approach to medical decision making. Proceedings of the EUNITE Workshop on Intelligent Systems in Patient Care, Vienna, 73-97.
[8] Antal, P., Verrelst, H., Timmerman, D., Van Huffel, S., De Moor, B., Vergote, I. and Moreau, Y. (2000) Bayesian networks in ovarian cancer diagnosis: Potentials and limitations. 13th IEEE Symposium on Computer-Based Medical Systems, Texas Medical Center, Houston, 103-108.
[9] van Gerven, M., Jurgelenaite, R., Taal, R., Heskes, T. and Lucas, P. (2007) Predicting carcinoid heart disease with the noisy-threshold classifier. Artificial Intelligence in Medicine, 40(1), 45-55.
[10] Pearl, J. (1988) Probabilistic reasoning in intelligent systems: Networks of plausible inference. 2nd Edition, Morgan Kaufmann, San Francisco.
[11] Spirtes, P., Glymour, C. and Scheines, R. (2000) Causation, prediction, and search. 2nd Edition, MIT Press, Cambridge.
[12] Pradhan, M., Henrion, M., Provan, G., Favero, B.D. and Huang, K. (1996) The sensitivity of belief networks to imprecise probabilities: An experimental investigation. Artificial Intelligence, 85(1-2), 363-397.
[13] Cheng, J., et al. (2002) Learning Bayesian networks from data: An information theory based approach. Artificial Intelligence, 137(1-2), 43-90.
[14] Cooper, G. and Herskovits, E. (1992) A Bayesian method for the induction of probabilistic networks from data. Machine Learning, 9(4), 309-347.
[15] Heckerman, D. (1998) A tutorial on learning with Bayesian networks. In: Learning in Graphical Models, Kluwer Academic Publishers, Dordrecht, 301-354.
[16] Chickering, D. (1996) Learning Bayesian networks is NP-complete. In: Fisher, D. and Lenz, H., Eds., Learning from Data: Artificial Intelligence and Statistics, 4(1), 121-130.
[17] Friedman, N., Nachman, I. and Peer, D. (1999) Learning Bayesian network structure from massive datasets: The ‘sparse candidate’ algorithm. Uncertainty in Artificial Intelligence, 15(3), 206-215.
[18] Tsamardinos, I., Brown, L. and Aliferis, C. (2006) The max-min hill-climbing Bayesian network structure learning algorithm. Machine Learning, 65(1), 31-78.
[19] Meinshausen, N. and Buhlmann, P. (2006) High-dimensional graphs and variable selection with the lasso. The Annals of Statistics, 34(3), 1436-1462.
[20] Heckerman, D., Geiger, D. and Chickering, D. (1995) Learning Bayesian networks: The combination of knowledge and statistical data. Machine Learning, 20(3), 197-243.
[21] Tsamardinos, I. and Aliferis, C. (2003) Towards principled feature selection: Relevancy, filters and wrappers. The 9th International Workshop on Artificial Intelligence and Statistics, Florida, 334-342.
[22] Guyon, I. and Elisseeff, A. (2003) An introduction to variable and feature selection. Journal of Machine Learning Research, 3(7-8), 1157-1182.
[23] Dash, M. and Liu, H. (2003) Consistency-based search in feature selection. Artificial Intelligence, 151(1-2), 155-176.
[24] Hall, M. (1999) Correlation-based feature selection for machine learning. Ph.D. Dissertation, University of Waikato, New Zealand.
[25] Kononenko, I. (1994) Estimating attributes: Analysis and extension of RELIEF. European Conference on Machine Learning, Catania, 171-182.
[26] Jakulin, A. and Bratko, I. (2003) Analyzing attribute dependencies. PKDD, Ljubljana.
[27] Jakulin, A. and Bratko, I. (2004) Testing the significance of attribute interactions. International Conference on Machine Learning, 20, 69-73.
[28] Zhao, Z. and Liu, H. (2007) Searching for interacting features. Proceedings of the International Joint Conference on Artificial Intelligence, Nagoya.
[29] Blake, C., Keogh, E. and Merz, C.J. (1998) UCI repository of machine learning databases. University of California, Irvine. http://www.ics.uci.edu/~mlearn/MLRepository.html
[30] Aha, D.W., Kibler, D. and Albert, M.K. (1991) Instance-based learning algorithms. Machine Learning, 6(1), 37-66.
[31] Witten, I.H. and Frank, E. (2005) Data mining: Practical machine learning tools and techniques with Java implementations. 2nd Edition, Morgan Kaufmann Publishers, California.
[32] Bouckaert, R.R. (2008) Bayesian network classifiers in Weka for version 3-5-7. Artificial Intelligence Tools, 11(3), 369-387.
[33] Cortes, C. and Mohri, M. (2003) AUC optimization vs. error rate minimization. Advances in Neural Information Processing Systems, 11(10), 356-360.
[34] Chobanian, A.V., Bakris, G.L., Black, H.R., Cushman, W.C., Green, L.A. and Izzo, J.L. (2003) The seventh report of the joint national committee on prevention, detection, evaluation, and treatment of high blood pressure: The JNC 7 report. Journal of the American Medical Association, 289(19), 2560-2572.
[35] (2001-2002) Cardiovascular diseases – Prevention and control. WHO CVD Strategy Conference, Nottingham.
[36] Kearney, P.M., Whelton, M., Reynolds, K., et al. (2005) Global burden of hypertension: Analysis of worldwide data. Lancet, 365(9455), 217-223.