
Feature Selection for Time Series Modeling

PP. 152-164    DOI: 10.4236/jilsa.2013.53017

ABSTRACT

In machine learning, selecting useful features and rejecting redundant ones is a prerequisite for better modeling and prediction. In this paper, we first study representative feature selection methods based on correlation analysis and demonstrate that, although they work well for static systems, they do not work well for time series. A theoretical analysis of linear time series is then carried out to show why they fail. Based on these observations, we propose a new correlation-based feature selection method. Our main idea is that features highly correlated with the progressive response but weakly correlated with other features should be selected, and among groups of selected features with similar residuals, the group with fewer features should be preferred. For linear and nonlinear time series, the proposed method yields high accuracy in both feature selection and feature rejection.
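As a rough illustration of the selection principle stated in the abstract (not the algorithm proposed in the paper), a minimal correlation-based filter can rank features by their correlation with the response and discard candidates that are strongly correlated with features already chosen. The function name, the greedy strategy, and the two thresholds below are illustrative assumptions:

```python
import numpy as np

def select_features(X, y, relevance_threshold=0.3, redundancy_threshold=0.8):
    """Greedy correlation-based selection (illustrative sketch).

    Rank features by absolute Pearson correlation with the response,
    drop those below `relevance_threshold`, and keep a candidate only
    if it is weakly correlated (below `redundancy_threshold`) with
    every feature already selected.
    """
    n_features = X.shape[1]
    # correlation of each feature with the response
    relevance = np.array([
        abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_features)
    ])
    selected = []
    for j in np.argsort(-relevance):  # most relevant first
        if relevance[j] < relevance_threshold:
            break  # remaining candidates are even less relevant
        redundant = any(
            abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > redundancy_threshold
            for k in selected
        )
        if not redundant:
            selected.append(j)
    return selected
```

On synthetic data with one informative feature, one near-duplicate of it, and one noise feature, this filter keeps a single feature from the duplicate pair and rejects the noise, which is the behavior the abstract asks of a time-series feature selector (high accuracy in both selection and rejection).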

Conflicts of Interest

The authors declare no conflicts of interest.

Cite this paper

Q. Wang, X. Li and Q. Qin, "Feature Selection for Time Series Modeling," Journal of Intelligent Learning Systems and Applications, Vol. 5, No. 3, 2013, pp. 152-164. doi: 10.4236/jilsa.2013.53017.



Copyright © 2019 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.