[1] T. Schott, “Global megatrends and their effects on the production of the future,” in Conference Proceedings, 5th International Conference on Industrial Informatics, Vienna, Austria, Vol. 1, p. 18, July 23-27, 2007, http://www.indin2007.org/keynote_schott.php.
[2] Wikipedia, “Ambient intelligence,” http://en.wikipedia.org/wiki/Ambient_intelligence.
[3] X. Cai, N. Zhang, G. Venayagamoorthy, and D. C. Wunsch II, “Time series prediction with recurrent neural networks using a hybrid PSO-EA algorithm,” in Proceedings of IEEE International Joint Conference on Neural Networks, Vol. 2, pp. 1647–1652, 2004.
[4] H-G. Zimmermann, R. Grothmann, A. Schäfer, and C. Tietz, “Dynamical consistent recurrent neural networks,” in Proceedings of IEEE International Joint Conference on Neural Networks IJCNN ’05, Vol. 3, pp. 1537–1541, 2005.
[5] J. L. Elman, “Finding structure in time,” Cognitive Science, Vol. 14, No. 2, pp. 179–211, 1990.
[6] A. Waibel, T. Hanazawa, G. Hinton, and K. Shikano, “Phoneme recognition using time-delay neural networks,” IEEE Transactions on Acoustics, Speech and Signal Processing, Vol. 37, No. 3, pp. 328–339, 1989.
[7] P. Mastorocostas and J. Theocharis, “On a stable learning algorithm for block-diagonal recurrent neural networks, part 1: The RENNCOM algorithm,” in Proceedings of IEEE International Joint Conference on Neural Networks, Vol. 2, pp. 815–820, 2004.
[8] H. Jäger, “The ‘echo state’ approach to analysing and training recurrent neural networks,” German National Research Center for Information Technology, Technical Report GMD 148, 2001.
[9] H. Gao, R. Sollacher, and H-P. Kriegel, “Spiral recurrent neural network for online learning,” in Proceedings of the 15th European Symposium on Artificial Neural Networks, ESANN ’2007, Bruges, Belgium, April 25-27, 2007, M. Verleysen, Ed., pp. 483–488, 2007.
[10] R. J. Williams and D. Zipser, “A learning algorithm for continually running fully recurrent neural networks,” Neural Computation, Vol. 1, No. 2, pp. 270–280, 1989.
[11] R. Sollacher and H. Gao, “Efficient online learning with spiral recurrent neural networks,” in Proceedings of World Conference on Computational Intelligence WCCI ’08, 2008.
[12] H. Gao and R. Sollacher, “Conditional prediction of time series using spiral recurrent neural network,” in Proceedings of the European Symposium on Artificial Neural Networks - Advances in Computational Intelligence and Learning, Bruges, Belgium, April 23-25, 2008.
[13] J. Polastre, R. Szewczyk, and D. Culler, “Telos: Enabling ultra-low power wireless research,” in Proceedings of the Fourth International Conference on Information Processing in Sensor Networks: Special Track on Platform Tools and Design Methods for Network Embedded Sensors, pp. 364–369, 2005.
[14] D. Gay, P. Levis, R. von Behren, M. Welsh, E. Brewer, and D. Culler, “The nesC language: A holistic approach to networked embedded systems,” ACM SIGPLAN Notices, Vol. 38, No. 5, pp. 1–11, 2003, http://www.tinyos.net/.
[15] R. J. McEliece, E. C. Posner, E. R. Rodemich, and S. S. Venkatesh, “The capacity of the Hopfield associative memory,” IEEE Transactions on Information Theory, Vol. 33, No. 4, pp. 461–482, 1987.
[16] F. Lewis, Optimal Estimation: With an Introduction to Stochastic Control Theory, Wiley-Interscience, ISBN 0-471-83741-5, 1986.
[17] R. Kalman, “A new approach to linear filtering and prediction problems,” Transactions of the ASME - Journal of Basic Engineering, Vol. 82, pp. 35–45, 1960.
[18] G. Welch and G. Bishop, “An introduction to the Kalman filter,” University of North Carolina at Chapel Hill, Department of Computer Science, Technical Report TR 95-041, 2002.