Towards Real-World Applications of Online Learning Spiral Recurrent Neural Networks
Rudolf SOLLACHER, Huaien GAO
DOI: 10.4236/jilsa.2009.11001


Distributed intelligent systems like self-organizing wireless sensor and actuator networks are expected to operate largely autonomously, even under changing environmental conditions. This requires robust and efficient self-learning capabilities that can be implemented on embedded systems with limited memory and computational power. We present a new solution called Spiral Recurrent Neural Networks (SpiralRNN) with an online learning scheme based on an extended Kalman filter and gradients as in Real Time Recurrent Learning. We illustrate its stability and performance using artificial and real-life time series and compare its prediction performance to other approaches. SpiralRNNs perform very stably and show an accuracy which is superior or similar to other state-of-the-art approaches. In a memory capacity evaluation, the number of simultaneously memorized and accurately retrievable trajectories of fixed length was counted. This capacity turned out to be a linear function of the size of the recurrent hidden layer, with a memory-to-size ratio of 0.64 for shorter trajectories and 0.31 for longer trajectories. Finally, we describe two potential applications in building automation and logistics and report on an implementation of online learning SpiralRNNs on a wireless sensor platform under the TinyOS embedded operating system.
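The training scheme described in the abstract — online weight estimation with an extended Kalman filter driven by Real Time Recurrent Learning (RTRL) gradients — can be illustrated with a minimal sketch. This is not the authors' SpiralRNN architecture; it uses a generic small recurrent layer with a scalar output, and all names (`theta`, `Pcov`, the hidden size `n`) are illustrative assumptions. The weights are treated as the EKF state, the RTRL recursion carries the sensitivity of the hidden state to the weights, and each new observation yields one Kalman measurement update:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                                   # hidden units (small, for illustration)
P_wh, P_wx, P_wo = n * n, n, n          # parameter counts per block
P = P_wh + P_wx + P_wo                  # total number of trainable parameters

theta = rng.normal(scale=0.1, size=P)   # flat weight vector = EKF state
Pcov = np.eye(P)                        # EKF error covariance of the weights
Q, R = 1e-4 * np.eye(P), 0.1            # assumed process / observation noise

def unpack(theta):
    Wh = theta[:P_wh].reshape(n, n)     # recurrent weights
    Wx = theta[P_wh:P_wh + P_wx]        # input weights
    wo = theta[P_wh + P_wx:]            # output weights
    return Wh, Wx, wo

h = np.zeros(n)                         # hidden state
J = np.zeros((n, P))                    # RTRL sensitivity dh/dtheta

xs = np.sin(0.2 * np.arange(400))       # toy time series: predict next value
errors = []
for t in range(len(xs) - 1):
    x, target = xs[t], xs[t + 1]
    Wh, Wx, wo = unpack(theta)

    # Forward pass of the recurrent unit.
    a = Wh @ h + Wx * x
    h_new = np.tanh(a)

    # RTRL: propagate dh/dtheta forward through the recurrence.
    da = np.zeros((n, P))
    for i in range(n):
        da[i, i * n:(i + 1) * n] = h    # d a_i / d Wh[i, :]
        da[i, P_wh + i] = x             # d a_i / d Wx[i]
    J = (1.0 - h_new ** 2)[:, None] * (da + Wh @ J)

    # Scalar prediction and its gradient w.r.t. all weights.
    y = wo @ h_new
    H = wo @ J                          # dy/dtheta via the hidden state
    H[P_wh + P_wx:] += h_new            # plus the direct dy/dwo term

    # EKF measurement update on the weight vector.
    Pcov = Pcov + Q
    S = H @ Pcov @ H + R
    K = (Pcov @ H) / S
    theta = theta + K * (target - y)
    Pcov = Pcov - np.outer(K, H @ Pcov)

    h = h_new
    errors.append((target - y) ** 2)
```

After a few hundred steps the one-step prediction error drops well below its initial level. The design choice mirrored here is the one the abstract relies on: because the EKF uses second-order (covariance) information, it adapts the weights from a single pass over the stream, which is what makes the approach viable on memory-constrained embedded hardware.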

Share and Cite:

R. SOLLACHER and H. GAO, "Towards Real-World Applications of Online Learning Spiral Recurrent Neural Networks," Journal of Intelligent Learning Systems and Applications, Vol. 1 No. 1, 2009, pp. 1-27. doi: 10.4236/jilsa.2009.11001.

Conflicts of Interest

The authors declare no conflicts of interest.



Copyright © 2023 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.