Recurrent Neural Networks (RNN)

RNNs are universal and general adaptive architectures that benefit from their inherent: a) feedback, to capture long-term temporal correlations; b) nonlinearity, to deal with non-Gaussianity and nonlinear signal-generating mechanisms; c) massive interconnection, for a high degree of generalisation; d) adaptive mode of operation, for use in nonstationary environments.

Some of the wind data used in the simulations in the work below can be downloaded from here. These are 2D (complex-valued) data for three wind regimes of 'low', 'medium', and 'high' dynamics.

Below is some of our work on real-, complex-, and quaternion-valued RNNs.

Legend: MATLAB code, PDF files, Supplements and data.

Quaternion Valued RNNs:

  1. Y. Xia, M. Xiang, Z. Li, and D. P. Mandic, "Echo state networks for multidimensional data: Exploiting noncircularity and widely linear models", in D. Comminiello and J. Principe (editors), Adaptive Learning Methods for Nonlinear System Modelling, pp. 267-288, Elsevier, 2018. [pdf]
  2. B. Che Ujang, C. Cheong-Took, and D. P. Mandic, "Quaternion Valued Nonlinear Adaptive Filtering," IEEE Transactions on Neural Networks, vol. 22, no. 8, pp. 1193-1206, 2011. [pdf]
  3. S. Javidi, C. Cheong-Took, and D. P. Mandic, "A Fast Independent Component Analysis Algorithm for Quaternion Signals," IEEE Transactions on Neural Networks, vol. 22, no. 12, pp. 1967-1978, 2011. [pdf]
  4. C. Cheong-Took and D. P. Mandic, "Augmented Second-order Statistics of Quaternion Random Signals," Signal Processing, vol. 91, pp. 214-224, 2011. [pdf]
    Addendum to the paper and some clarifications
  5. C. Cheong-Took, D. P. Mandic, and F. Zhang, "On Unitary Diagonalisation of a Special Class of Quaternion Matrices," Applied Mathematics Letters, vol. 24, pp. 1806-1809, 2011. [pdf]
  6. S. Javidi, C. Cheong Took, C. Jahanchahi, N. Le Bihan, and D. P. Mandic, "Blind Extraction of Improper Quaternion Sources," in Proceedings of ICASSP'11, pp. 3708-3711, 2011. [pdf]
  7. C. Cheong-Took, G. Strbac, K. Aihara, and D. P. Mandic, "Quaternion-valued short term forecasting of three-dimensional wind and atmospheric parameters," Renewable Energy, vol. 36, pp. 1754-1760, 2011. [pdf]
  8. C. Cheong-Took and D. P. Mandic, "A Quaternion Widely Linear Adaptive Filter," IEEE Transactions on Signal Processing, vol. 58, no. 8, pp. 4427-4431, 2010. [pdf]
  9. B. Che Ujang, C. Cheong-Took, and D. P. Mandic, "Split-Quaternion Nonlinear Adaptive Filtering," Neural Networks, vol. 23, no. 3, pp. 426-434, 2010. [pdf]
  10. C. Cheong-Took and D. P. Mandic, "Fusion of Heterogeneous Data Sources: A Quaternionic Approach," in Proceedings of MLSP'08, pp. 456-461, 2008. [pdf]
Complex Valued RNNs:
  1. Our book, published April 2009 by Wiley: Complex Valued Nonlinear Adaptive Filters: Noncircularity, Widely Linear and Neural Models. Read more and download the supplementary MATLAB toolbox.
  2. We often use the term 'augmented' for 'widely linear', e.g. augmented CLMS (ACLMS) instead of widely linear CLMS, referring to the use of augmented statistics. This also makes the naming of the algorithms more elegant, e.g. 'augmented nonlinear gradient descent' instead of 'widely linear nonlinear gradient descent'.

  3. Y. Xia, B. Jelfs, M. M. Van Hulle, J. Principe, and D. P. Mandic, "An Augmented Echo State Network for Adaptive Filtering of Complex Noncircular Signals", IEEE Transactions on Neural Networks, vol. 22, no. 1, pp. 74-83, 2011. [pdf]
  4. B. Che Ujang, C. Cheong-Took, and D. P. Mandic, "Identification of improper processes by variable taplength complex-valued adaptive filters", in Proceedings of IJCNN, pp. 2265-2270, 2010. [pdf]
  5. D. P. Mandic, S. Javidi, S. L. Goh, A. Kuh and K. Aihara, "Complex Valued Prediction of Wind Profile Using Augmented Complex Statistics", Renewable Energy, vol. 34, no. 1, pp. 196-201, 2009. [pdf]
  6. D. P. Mandic, P. Vayanos, M. Chen, and S. L. Goh, "Online Detection of the Modality of Complex Valued Real World Signals," International Journal of Neural Systems, vol. 18, no. 2, pp. 67-74, 2008. [pdf]
  7. S. L. Goh and D. P. Mandic, "Stochastic Gradient Adaptive Complex-Valued Nonlinear Neural Adaptive Filters with a Gradient Adaptive Step Size," IEEE Transactions on Neural Networks, vol. 18, no. 5, pp. 1511-1516, 2007. [pdf]
  8. D. P. Mandic, S. L. Goh, and K. Aihara, "Sequential Data Fusion via Vector Spaces: Fusion of Heterogeneous Data in the Complex Domain," International Journal of VLSI Signal Processing Systems, vol. 48, no. 1, pp. 99-108, 2007. [pdf]
  9. S. L. Goh and D. P. Mandic, "An Augmented Extended Kalman Filter Algorithm for Complex-Valued Recurrent Neural Networks," Neural Computation, vol. 19, no. 4, pp. 1039-1055, 2007. [pdf]
  10. S. L. Goh and D. P. Mandic, "An Augmented CRTRL For Complex-Valued Recurrent Neural Networks", Neural Networks, vol. 20, no. 10, pp. 1061-1066, 2007. [pdf]
  11. S. L. Goh, M. Chen, D. H. Popovic, K. Aihara, D. Obradovic and D. P. Mandic, "Complex-Valued Forecasting of Wind Profile," Renewable Energy, vol. 31, pp. 1733-1750, 2006. [pdf]
  12. S. L. Goh and D. P. Mandic, "Nonlinear Adaptive Prediction of Complex Valued Signals by Complex-Valued PRNN", IEEE Transactions on Signal Processing, vol. 53, no. 5, pp. 1827-1836, 2005. [pdf]
  13. S. L. Goh and D. P. Mandic, "A General Complex RTRL Algorithm for Nonlinear Adaptive Filters", Neural Computation, vol. 16, no. 12, pp. 2699-2713, 2004. [pdf]
  14. T. Gautama, D. P. Mandic, and M. Van Hulle, "A non-parametric test for detecting the complex-valued nature of time series," International Journal of Knowledge-Based Intelligent Engineering Systems, vol. 8, no. 2, pp. 99-106, 2004. [pdf]
  15. A. I. Hanna and D. P. Mandic, "A General Fully Adaptive Normalised Gradient Descent Learning Algorithm For Complex-Valued Nonlinear Adaptive Filters", IEEE Transactions on Signal Processing, vol. 51, no. 10, pp. 2540-2549, 2003. [pdf]
  16. A. I. Hanna and D. P. Mandic, "Complex-Valued Nonlinear Neural Adaptive Filters with Trainable Amplitude of Activation Functions," Neural Networks, vol. 16, no. 2, pp. 155-159, 2003. [pdf]
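
The 'augmented' (widely linear) idea noted in item 2 above can be sketched in a few lines: the filter output uses both the input vector and its complex conjugate, so it can model improper (noncircular) signals that the standard, strictly linear CLMS cannot. The sketch below is a minimal illustration of an ACLMS-style update on a synthetic widely linear system; the filter length, step size, and coefficients are hypothetical choices for illustration, not settings from any of the papers listed here.

```python
import numpy as np

rng = np.random.default_rng(0)
N, L, mu = 10_000, 4, 0.02

# Improper (noncircular) input: real and imaginary parts with unequal powers
x = rng.standard_normal(N) + 1j * 0.2 * rng.standard_normal(N)

# Hypothetical widely linear teacher system: d = h^T x + g^T x*
h_true = np.array([0.5 - 0.2j, 0.3 + 0.1j, -0.1j, 0.05])
g_true = np.array([0.2 + 0.3j, -0.1, 0.05j, 0.0])

h = np.zeros(L, complex)   # standard (strictly linear) weights
g = np.zeros(L, complex)   # conjugate (augmented) weights
errs = []
for k in range(L, N):
    xk = x[k - L:k][::-1]                   # tap-delay regressor
    d = h_true @ xk + g_true @ xk.conj()    # desired signal
    y = h @ xk + g @ xk.conj()              # widely linear (augmented) output
    e = d - y
    h += mu * e * xk.conj()                 # CLMS-style update
    g += mu * e * xk                        # extra update for the conjugate part
    errs.append(abs(e) ** 2)

print(f"final MSE: {np.mean(errs[-500:]):.2e}")
```

For a proper (circular) input driving a strictly linear system, the conjugate weights g converge to zero and the filter reduces to standard CLMS; the benefit of the augmented model appears precisely for noncircular signals such as the wind data above.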
Real Valued RNNs:
  1. D. P. Mandic and J. A. Chambers, "Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures, and Stability," Wiley, 2001.
  2. M. Chen, T. Gautama, and D. P. Mandic, "An Assessment of Qualitative Performance of Machine Learning Architectures: Modular Feedback Networks," IEEE Transactions on Neural Networks, vol. 19, no. 1, pp. 183-189, 2008. [pdf] [MATLAB code] [Zipped content]
  3. M. Pedzisz and D. P. Mandic, "Homomorphic Neural Network for Modelling and Prediction", Neural Computation, vol. 20, no. 4, pp. 1042-1064, 2008. [pdf]
  4. D. P. Mandic, P. Vayanos, C. Boukis, B. Jelfs, S. L. Goh, T. Gautama, and T. Rutkowski, "Collaborative Adaptive Learning Using Hybrid Filters", in Proceedings of ICASSP 2007, vol. III, pp. 921-924, 2007. [pdf]
  5. A. I. Hanna and D. P. Mandic, "On An Improved Approach to Nonlinear System Identification Using Neural Networks," Journal of the Franklin Institute, vol. 340, pp. 363-370, 2003. [pdf]
  6. S. L. Goh and D. P. Mandic, "Recurrent neural networks with trainable amplitude of activation functions," Neural Networks, vol. 16, pp. 1095-1100, 2003. [pdf]
  7. D. P. Mandic, "Data-Reusing Recurrent Neural Adaptive Filters," Neural Computation, vol. 14, no. 11, pp. 2693-2708, 2002. [pdf]
  8. D. P. Mandic, A. I. Hanna, and M. Razaz, "A Normalised Nonlinear Gradient Descent Algorithm with a Gradient Adaptive Step Size," IEEE Signal Processing Letters, vol. 8, no. 11, pp. 295-297, 2001. [pdf]
  9. D. P. Mandic and J. A. Chambers, "Towards an Optimal Learning Rate for Backpropagation," Neural Processing Letters, vol. 11, no. 1, pp. 1-5, 2000. [pdf]
  10. D. P. Mandic and J. A. Chambers, "A Normalised Real Time Recurrent Learning Algorithm," Signal Processing, vol. 80, no. 9, pp. 1909-1916, 2000. [pdf]
  11. D. P. Mandic and J. A. Chambers, "On the Choice of Parameters of the Cost Function in Nested Modular RNNs," IEEE Transactions on Neural Networks, vol. 11, no. 2, pp. 315-322, 2000. [pdf]
  12. D. P. Mandic and J. A. Chambers, "Relationships Between the A Priori and A Posteriori Errors in Nonlinear Adaptive Filters," Neural Computation, vol. 12, no. 6, pp. 1285-1292, 2000. [pdf]
  13. D. P. Mandic, J. A. Chambers, and M. Bozic, "On Global Asymptotic Stability of Fully Connected Recurrent Neural Networks," in Proceedings of ICASSP'00, pp. 3406-3409, 2000. [pdf]
  14. D. P. Mandic, "The NNGD Algorithm for Neural Adaptive Filters," Electronics Letters, vol. 36, no. 9, pp. 845-846, 2000. [pdf]
  15. D. P. Mandic and J. A. Chambers, "Toward an Optimal PRNN Based NARMA Predictor," IEEE Transactions on Neural Networks, vol. 10, no. 6, pp. 1435-1442, 1999. [pdf]
  16. D. P. Mandic and J. A. Chambers, "Relationship Between the Slope of the Activation Function and the Learning Rate for the RNN," Neural Computation, vol. 11, no. 5, pp. 1069-1077, 1999. [pdf]
  17. D. P. Mandic and J. A. Chambers, "Exploiting Inherent Relationships in RNN Architectures," Neural Networks, vol. 12, no. 10, pp. 1341-1345, 1999. [pdf]